KubeCraftJobs

DevOps & Cloud Job Board

Data Platform Engineer

Apple

Herzliya, Tel Aviv District

On-site
Mid Level
Posted January 06, 2026

Tech Stack

coreos python postgresql trino pandas numpy kafka spark databricks airflow dagster docker kubernetes tableau grafana


Job Description

We are seeking an experienced Data Platform Engineer to join our Storage Analytics team. You will design and build data solutions that provide critical insights into storage performance and usage across Apple's entire device ecosystem. Working with large-scale telemetry data from millions of Apple devices worldwide, you'll support multiple CoreOS Storage teams, including Software Update, Backup/Restore/Migration, Storage Attribution, and other storage domains.

Responsibilities

Design, build, and maintain scalable data processing infrastructure to handle large-scale telemetry from Apple's global device fleet
Develop highly scalable data pipelines to ingest and process storage performance metrics, usage patterns, and system telemetry, with actionable alerting and anomaly detection
Build and maintain robust data platforms and ETL frameworks that enable CoreOS Storage teams to access, process, and derive insights from storage telemetry data
Engineer automated data delivery systems and APIs that serve processed storage metrics to engineering teams across different storage domains

Preferred Qualifications

Master's or PhD in Computer Science or a related field
Deep expertise in data principles, data architecture, and data modeling
Strong problem-solving skills and meticulous attention to detail, with the ability to tackle loosely defined problems

Minimum Qualifications

Bachelor's degree in Computer Science or a related technical field
4+ years of professional experience in data modeling, pipeline development, and software engineering
Programming languages: excellent Python skills with strong computer science foundations (data structures, low-level parallelization)
Database management: strong SQL skills and hands-on experience with relational databases and query engines (PostgreSQL, Impala, Trino)
Experience with data analysis tools and libraries (Pandas/Polars, NumPy, dbt)
Experience with big data technologies (Kafka, Spark, Databricks, S3)
Experience with Apache Airflow, Dagster, or similar data orchestration frameworks for workflow orchestration, scheduling, and monitoring
Experience with containerization and orchestration (Docker, Kubernetes)
Visualization & reporting: strong proficiency in creating and maintaining Tableau/Grafana dashboards and workflows