On-site
Mid Level
Full Time
Posted January 08, 2026
Tech Stack
snowflake
amazon-web-services
informatica
airflow
gitlab
github
github-actions
microsoft-azure
azure-devops
jenkins
terraform
python
google-cloud-platform
docker
kubernetes
microsoft-graph
Job Description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Digitive LLC, is seeking the following. Apply via Dice today!
Job Title:
**DevOps Engineer with Snowflake and IICS Exp.**
Location: Boston, MA (Local Only)
Contract: 11+ Months
W2 Only
**Notes:**
Need local MA candidates only with strong Snowflake and IICS experience
**Job Description**
Client is seeking an experienced DevOps Engineer to support our cloud data warehouse modernization initiative, migrating from a SQL Server/AWS-based system to a Snowflake-based data platform. The DevOps Engineer will develop, maintain, and optimize the data pipelines and integration processes that support analytics, reporting, and business operations, and will design and implement CI/CD pipelines, automate data pipeline deployments, and ensure operational reliability across the Snowflake, Informatica, and Apache Airflow environments.
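To illustrate the kind of deployment automation this role covers, here is a minimal, hedged sketch (standard library only; the environment names, database names, and `promote_sql` helper are hypothetical, not part of the client's actual tooling) of a code-promotion step that rewrites environment-qualified Snowflake object references when a SQL script moves from dev to prod:

```python
import re

# Hypothetical mapping of deployment environment -> Snowflake database name.
ENV_DB = {"dev": "ANALYTICS_DEV", "test": "ANALYTICS_TEST", "prod": "ANALYTICS_PROD"}

def promote_sql(sql: str, source_env: str, target_env: str) -> str:
    """Rewrite fully qualified object names (DB.SCHEMA.OBJECT) for the target environment."""
    src_db, tgt_db = ENV_DB[source_env], ENV_DB[target_env]
    # Replace only whole-word database prefixes so unrelated identifiers are untouched.
    return re.sub(rf"\b{src_db}\b", tgt_db, sql)

script = (
    "CREATE OR REPLACE VIEW ANALYTICS_DEV.SALES.V_ORDERS "
    "AS SELECT * FROM ANALYTICS_DEV.RAW.ORDERS;"
)
print(promote_sql(script, "dev", "prod"))
```

In practice a step like this would run inside a CI/CD job (GitLab, GitHub Actions, Azure DevOps, or Jenkins, per the requirements below) rather than by hand.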
**Required Education:**
Bachelor's degree, or equivalent experience, in Computer Science, Information Systems, Data Engineering, Health Informatics, or a related field
**Required Skills, Experience, Qualifications & Abilities:**
3 to 7+ years in DevOps, Cloud Engineering, or Data Platform Engineering roles
Snowflake (roles, warehouses, performance tuning, cost control)
Apache Airflow (DAG orchestration, monitoring, deployments)
Informatica (IICS pipeline deployment automation preferred)
Strong CI/CD skills using GitLab, GitHub Actions, Azure DevOps, Jenkins, or similar
Proficiency with Terraform, Python, and Shell scripting
Deep understanding of cloud platforms: AWS, Azure, or Google Cloud Platform
Experience with containerization (Docker, Kubernetes), especially for Airflow
Strong knowledge of networking concepts and security controls
Effective communication with technical and non-technical stakeholders
Ability to troubleshoot complex distributed data workloads
Strong documentation and cross-team collaboration skills
Proactive and committed to process improvement and automation
Detail-oriented, with a focus on data accuracy and process improvement
**Preferred Skills, Experience, Qualifications & Abilities:**
Experience migrating from SQL Server or other legacy DW platforms
Knowledge of FinOps practices for Snowflake usage optimization
Background in healthcare, finance, or regulated industries a plus
**Job duties and Responsibilities:**
Build and maintain CI/CD (continuous integration/continuous delivery) pipelines for Snowflake, Informatica (IICS), and Airflow DAG (directed acyclic graph) deployments
Implement automated code promotion between development, test, and production environments
Integrate testing, linting, and security scanning into deployment processes
Develop infrastructure as code (IaC) using Terraform or similar tools to manage Snowflake objects, networking, and cloud resources
Manage configuration and environment consistency across multi-region/multi-cloud setups
Maintain secure connectivity between cloud and on-prem systems (VPNs, private links, firewalls)
Implement logging and alerting for Airflow DAGs, Informatica workflows, and Snowflake performance
Develop proactive monitoring dashboards for job failures, data quality triggers, and warehouse usage
Optimize pipeline performance, concurrency, and cost governance in Snowflake
Own deployment frameworks for ETL/ELT code, SQL scripts, and metadata updates
Support user access provisioning & RBAC alignment across Snowflake, Informatica, and Airflow
Troubleshoot platform and orchestration issues, lead incident response during outages
Enforce DevSecOps practices including encryption, secrets management, and key rotation
Implement audit, logging, compliance, and backup/restore strategies aligned with governance requirements
Participate in testing, deployment, and release management for new data workflows and enhancements.
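The monitoring and alerting duties above can be sketched in miniature. The example below (standard library only; the run-record shape, `failing_dags` helper, and threshold are hypothetical illustrations, not the client's actual monitoring stack) flags DAGs whose failed-run count meets an alert threshold, the kind of check a proactive monitoring dashboard would surface:

```python
from collections import Counter

def failing_dags(runs, threshold=3):
    """Return the DAG ids whose failed-run count meets or exceeds the alert threshold."""
    fails = Counter(r["dag_id"] for r in runs if r["state"] == "failed")
    return sorted(dag_id for dag_id, n in fails.items() if n >= threshold)

# Hypothetical job-run records as they might be pulled from an orchestrator's metadata store.
runs = [
    {"dag_id": "load_orders", "state": "failed"},
    {"dag_id": "load_orders", "state": "failed"},
    {"dag_id": "load_orders", "state": "failed"},
    {"dag_id": "dim_customer", "state": "success"},
]
print(failing_dags(runs))  # -> ['load_orders']
```

A production version would query Airflow's metadata database or REST API and route alerts to an on-call channel; this sketch only shows the thresholding logic.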