Remote
Mid Level
Full Time
Posted January 02, 2026
Tech Stack
python
pandas
numpy
pyspark
postgresql
mysql
mongodb
cassandra
snowflake
spark
hadoop
amazon-web-services
microsoft-azure
google-cloud-platform
airflow
luigi
kafka
dbt
docker
kubernetes
Job Description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Apetan Consulting, is seeking candidates for the following position. Apply via Dice today!
**Key Responsibilities**
- Design, build, and maintain ETL/ELT data pipelines
- Develop Python-based data processing applications
- Work with structured and unstructured data at scale
- Integrate data from multiple sources (APIs, databases, files, streams)
- Optimize data workflows for performance and reliability
- Ensure data quality, validation, and monitoring
- Collaborate with data scientists, analysts, and backend teams
- Manage and maintain data warehouses/lakes
- Implement logging, error handling, and automation
- Follow best practices for security and compliance
**Required Skills**
**Programming**
- Strong Python (Pandas, NumPy, PySpark)
- Writing clean, modular, and testable code
**Databases & Storage**
- SQL (PostgreSQL, MySQL, SQL Server)
- NoSQL (MongoDB; Cassandra optional)
- Data Warehouses (Snowflake, Redshift, BigQuery)
**Big Data & Processing**
- Apache Spark, Hadoop (preferred)
- Batch and streaming data processing
**Cloud Platforms**
- AWS / Azure / Google Cloud Platform
- S3, Lambda, Glue, Dataflow, BigQuery, etc.
**Data Engineering Tools**
- Airflow, Prefect, Luigi (orchestration)
- Kafka / Pub/Sub (streaming; optional)
- dbt (data transformation)
**DevOps & Other**
- Git, CI/CD
- Docker, Kubernetes (nice to have)
- Linux basics