




Summary: Seeking a Lead Data DevOps (Azure) to design, improve, and manage scalable data infrastructure and workflows within the Azure cloud ecosystem.

Highlights:

1. Lead data infrastructure design and improvement on Azure
2. Oversee scalable data solutions and efficient pipelines
3. Collaborate across teams for enhanced data system performance

We are seeking a skilled **Lead Data DevOps (Azure)** to join our team and take ownership of designing and improving data infrastructure and workflows. In this role, you will oversee the implementation of scalable data solutions within the Azure cloud ecosystem. You will work closely with data engineering and cross-functional teams to ensure efficient pipelines, system reliability, and enhanced performance.

**Responsibilities**

* Design, implement, and manage data infrastructure using Azure services such as Data Lake (ADLSv2), Databricks, Synapse, and Data Factory
* Collaborate with data engineering teams to develop and refine workflows and pipelines
* Automate data operations with Python to boost reliability and efficiency
* Configure and manage CI/CD pipelines using tools such as Jenkins, GitHub Actions, or other platforms
* Work with cross-functional teams to enhance the scalability, reliability, and overall performance of data systems
* Set up, configure, and maintain data tools such as Apache Spark and Apache Kafka across cloud and on-premises environments
* Monitor data systems to detect and resolve performance and scalability issues
* Troubleshoot and resolve complex technical problems involving data platforms and pipelines

**Requirements**

* At least 5 years of experience in Data Engineering or related roles
* A minimum of one year of experience leading and managing development teams
* Advanced knowledge of Python programming and batch processing workflows
* Strong SQL expertise for managing and querying large datasets
* Extensive experience working with Azure cloud services for data infrastructure management
* Hands-on experience with Infrastructure as Code tools like Ansible, Terraform, or CloudFormation
* Proficiency in configuring and managing CI/CD pipelines using tools like Jenkins or GitHub Actions
* Practical experience with data tools such as Spark, Airflow, or R for data processing and workflow automation
* Advanced understanding of Linux operating systems, including scripting and system administration
* Strong knowledge of network protocols and mechanisms such as TCP, UDP, ICMP, DHCP, DNS, and NAT
* Fluent English communication skills, both written and spoken, at a B2+ level or higher

**Nice to have**

* Familiarity with additional cloud platforms such as AWS or GCP
* Experience with container orchestration tools like Kubernetes for managing workflows
* Knowledge of monitoring and observability tools such as Prometheus, Grafana, or Azure Monitor
* Exposure to Big Data technologies and advanced analytics processes
* Experience implementing data governance and security measures within cloud environments


