We are seeking a Senior Data DevOps Engineer to support and optimize our data platform within Azure environments.
You will play a key role in deploying and maintaining scalable, secure data infrastructures while collaborating with various teams to enhance data workflows and CI/CD pipelines. Your expertise with Databricks and Azure tools will be essential to ensure smooth operations and continuous improvement of our data systems. Join us to contribute your skills in a dynamic environment focused on advanced data solutions.
Responsibilities
- Deploy and configure data platforms in Databricks according to the approved architecture and solution designs, ensuring production readiness and scalability
- Collaborate with cross-functional teams including data engineering, machine learning, platform, and quality assurance to design and maintain CI/CD pipelines and supporting tools for data workflows
- Diagnose and resolve issues across build and deployment pipelines, data workflows, and production workloads, including performing root cause analysis and implementing preventive solutions
- Develop and standardize configuration management practices covering infrastructure configuration, environment parameters, secrets management, and cluster policies to maintain consistency
- Produce and update technical documentation such as deployment guides, operational runbooks, pipeline logic, and platform configurations
- Implement governance and security best practices for data platform environments
- Support automation efforts that streamline environment deployment and pipeline operations
- Participate in continuous improvement initiatives for data platform infrastructure and workflows
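To give a flavor of the configuration-management and secrets-hygiene work described above, here is a minimal sketch of an environment-config check a DataOps team might run in CI. The required keys, the `@AzureKeyVault:` reference convention, and the secret-hint list are illustrative assumptions for this example, not part of this role's actual tooling:

```python
import json

# Illustrative assumptions: the keys every environment config must declare,
# and a hypothetical convention for referencing secrets from a key vault
# instead of embedding them in plaintext.
REQUIRED_KEYS = {"workspace_url", "cluster_policy_id", "environment"}
SECRET_KEY_HINTS = ("password", "secret", "token")
VAULT_PREFIX = "@AzureKeyVault:"

def validate_env_config(raw_json: str) -> list[str]:
    """Return a list of problems found in one environment's config document."""
    try:
        config = json.loads(raw_json)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    problems = []
    missing = REQUIRED_KEYS - config.keys()
    if missing:
        problems.append("missing required keys: " + ", ".join(sorted(missing)))
    for key, value in config.items():
        # Flag keys that look secret-bearing but hold a raw value rather
        # than a vault reference.
        looks_secret = any(hint in key.lower() for hint in SECRET_KEY_HINTS)
        if looks_secret and isinstance(value, str) and not value.startswith(VAULT_PREFIX):
            problems.append(f"{key!r} looks like a plaintext secret; use a vault reference")
    return problems
```

A check like this would typically run as an early CI stage, so a promotion between environments fails fast on a malformed or insecure config rather than at deploy time.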
Requirements
- 3+ years of experience in roles such as Build Engineer, DevOps Engineer, or Platform Engineer supporting delivery and operations
- Proficiency in Databricks, Azure Data Factory, Azure DevOps, and general Microsoft Azure environment management
- Knowledge of DataOps practices, including automated deployment of data pipelines, environment promotion, and governance
- Experience in troubleshooting and resolving pipeline and production data workflow issues
- Strong communication and collaboration capabilities to work effectively with engineering, data, and operations teams
- English language proficiency at the B2 level or higher for participation in technical discussions and documentation preparation
- Ability to produce and maintain clear technical documentation and runbooks
Nice to have
- Experience with MLOps including model deployment, monitoring, and lifecycle automation for machine learning workloads