Hybrid
Mid Level
Full Time
Posted January 13, 2026
Tech Stack
spark
scala
java
python
docker
kubernetes
airflow
cassandra
kafka
gitlab
jenkins
avature
Job Description
**Introduction**
At IBM, work is more than a job - it's a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you've never thought possible. Are you ready to lead in this new era of technology and solve some of the world's most challenging problems? If so, let's talk.
**Your Role And Responsibilities**
Your responsibilities will include:
- Writing effective and scalable data processing code using Spark with Scala/Java/Python (an illustrative sketch follows this list)
- Designing and implementing robust solutions
- Debugging applications to ensure low latency and high availability (Docker / Kubernetes)
- Interfacing with existing APIs and building REST APIs
- Implementing and refactoring ETL processes (Airflow)
- Implementing security and data protection (HVault)
- Working with various data storage solutions (COS API, Cassandra)
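Purely as an illustration of the kind of Spark work described above (not part of the posting), here is a minimal PySpark batch job. The bucket paths, column names, and application name are hypothetical placeholders.

```python
# Minimal PySpark batch job: read raw events, aggregate them, and write curated output.
# All paths and column names below are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def main():
    spark = (
        SparkSession.builder
        .appName("events-daily-aggregation")  # hypothetical job name
        .getOrCreate()
    )

    # Read raw JSON events from object storage (placeholder bucket path).
    events = spark.read.json("s3a://example-bucket/raw/events/")

    # Keep only valid records and aggregate per user and day.
    daily = (
        events
        .filter(F.col("event_type").isNotNull())
        .withColumn("event_date", F.to_date("event_ts"))
        .groupBy("user_id", "event_date")
        .agg(F.count("*").alias("event_count"))
    )

    # Write partitioned Parquet output; a Cassandra sink via the
    # spark-cassandra-connector would be configured in a similar way.
    daily.write.mode("overwrite").partitionBy("event_date").parquet(
        "s3a://example-bucket/curated/daily_events/"  # placeholder path
    )

    spark.stop()


if __name__ == "__main__":
    main()
```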
**Preferred Education**
Bachelor's Degree
**Required Technical And Professional Expertise**
- Proficiency in data processing with Apache Spark (Scala/Java/Python)
- Work experience with Docker / Kubernetes
- Experience in implementing and refactoring ETL processes (Airflow / Python); a minimal DAG sketch follows this list
- Experience in implementing security and data protection
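As a rough sketch of the Airflow/Python ETL experience mentioned above (again, illustrative only), a minimal Airflow 2.4+ DAG with extract/transform/load steps might look like this; the DAG id, schedule, and task callables are hypothetical.

```python
# Minimal Airflow DAG sketch: a daily ETL pipeline with three sequential tasks.
# Assumes Airflow 2.4+ (uses the `schedule` argument); all names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw data from a source system (placeholder).
    print("extracting raw data")


def transform(**context):
    # Clean and reshape the extracted data (placeholder).
    print("transforming data")


def load(**context):
    # Write the transformed data to its target store (placeholder).
    print("loading curated data")


with DAG(
    dag_id="daily_events_etl",        # hypothetical DAG id
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```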
**Preferred Technical And Professional Experience**
Nice to have: Kafka, GitLab, Jenkins