On-site
Senior
Full Time
Posted January 14, 2026
Tech Stack
amazon-web-services
microsoft-azure
azure-databricks
databricks
kubernetes
snowflake
pyspark
hadoop
apache-hadoop
python
scala
java
airflow
apache-airflow
docker
jenkins
kafka
Job Description
**Role**: Sr. Data Engineer
Required Technical Skill Set: Sr. Data Engineer
Desired Experience Range: 7 - 12 yrs
Notice Period: Immediate to 90 days only
Location of Requirement: **Bangalore**
We are currently planning to conduct a virtual interview.
**Job Description:**
**Primary Skill:**
Data Engineer, AWS, Azure, Databricks, Kubernetes or Snowflake, PySpark
**Responsibilities and Duties of the Role:**
- Work with and mentor all members of the Data Reliability Engineering team to manage production incidents, triage and root-cause issues, and ensure timely resolution via Tier 1 and Tier 2 support. Offer recommendations to engineering partners via RCAs to prevent future incidents.
- Assist in designing and developing a platform to support incident observability and automation. This team will be required to build high-quality data models and products that monitor and report on data pipeline health and data quality.
- Collaborate with engineering teams to improve, maintain, performance-tune, and respond to incidents on our big data pipeline infrastructure.
- Build out observability and intelligent monitoring of data pipelines and infrastructure to achieve early and automated anomaly detection and alerting. Present your research and insights to all levels of the company, clearly and concisely.
**Required Education, Experience/Skills/Training:**
Basic Qualifications
- 5+ years of experience working on mission-critical data pipelines and ETL systems.
- 5+ years of hands-on experience with big data technologies, systems, and tools such as AWS, Hadoop, Hive, and Snowflake
- Detailed problem-solving approach, coupled with a strong sense of ownership and drive
- A bias to action and a passion for delivering high-quality data solutions
- Expertise with common data engineering languages such as Python, Scala, Java, and SQL, and a proven ability to learn new programming languages
- Experience with workflow orchestration tools such as Airflow
- Deep understanding of end-to-end pipeline design and implementation
- Attention to detail and quality, with excellent problem-solving and interpersonal skills
**Preferred Qualifications**
- Advanced degrees are a plus.
- Strong data visualization skills to convey information and results clearly
- Ability to work independently and drive your own projects.
- Exceptional interpersonal and communication skills.
- Impactful presentation skills in front of a large and diverse audience.
- Experience with DevOps tools such as Docker, Kubernetes, Jenkins, etc.
- Innate curiosity about consumer behavior and technology
- Experience with event messaging frameworks like Apache Kafka
- Being a fan of movies and television is a strong plus.