Azure DevOps Engineer – 5 to 8 years of experience
Job ID: Dev-ETP-Pun-870
Location: India
We are seeking an experienced DevOps Engineer with 5 to 8 years of hands-on experience building and managing infrastructure solutions. The ideal candidate will have expertise in Azure DevOps, build and release pipelines, and infrastructure as code (IaC) with Terraform, along with strong exposure to supporting Data Engineering pipelines. The role involves automating and optimizing operational environments, supporting data workflows, and improving system performance.
Skills and Qualifications:
- 5-8 years of experience as a DevOps Engineer, preferably in environments with a strong focus on Azure and Data Engineering.
- Proficiency with Azure DevOps, including pipelines, repositories, boards, and artifacts, particularly for data engineering workflows.
- In-depth knowledge and experience with Terraform for infrastructure provisioning.
- Strong experience supporting Data Engineering workloads such as ETL/ELT pipelines, data lakes, data warehouses, and big data platforms.
- Hands-on experience with containerization (Docker, Kubernetes) for managing data workflows.
- Expertise in creating automated build and release pipelines for data engineering systems.
- Experience with monitoring and log management for both application and data environments, using tools such as Prometheus, Grafana, or similar (a minimal sketch follows this list).
- Understanding of DevOps processes, including continuous integration, continuous delivery (CI/CD), and DataOps practices for operational metrics.
- Experience in data security best practices and compliance in cloud environments.
- Strong problem-solving skills and ability to troubleshoot complex infrastructure and data pipeline issues.
- Excellent communication and collaboration skills, with a focus on automation and continuous improvement.
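For illustration only, the sketch below shows one way a data pipeline step might expose operational metrics to Prometheus, as referenced in the monitoring requirement above. The metric names, port, and batch logic are assumptions for the example, not part of the role's requirements; it assumes the prometheus_client Python library is available.

```python
# Minimal sketch: exposing ETL job metrics for Prometheus to scrape.
# Metric names and the scrape port (8000) are illustrative assumptions.
import random
import time

from prometheus_client import Counter, Gauge, start_http_server

ROWS_PROCESSED = Counter(
    "pipeline_rows_processed_total", "Rows processed by the ETL job"
)
LAST_RUN_DURATION = Gauge(
    "pipeline_last_run_duration_seconds", "Duration of the most recent ETL run"
)

def run_etl_batch() -> int:
    """Placeholder for a real extract/transform/load step."""
    time.sleep(0.1)                    # simulate work
    return random.randint(100, 1000)   # simulated row count

if __name__ == "__main__":
    start_http_server(8000)            # expose /metrics for Prometheus
    while True:
        start = time.time()
        rows = run_etl_batch()
        ROWS_PROCESSED.inc(rows)
        LAST_RUN_DURATION.set(time.time() - start)
        time.sleep(5)                  # simulated batch interval
```

Metrics exposed this way can then be scraped by Prometheus and visualized in Grafana dashboards alongside application and infrastructure metrics.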
Key Responsibilities:
- Design, implement, and manage CI/CD pipelines using Azure DevOps for various applications.
- Develop and maintain infrastructure provisioning scripts and configurations using Terraform and other IaC tools.
- Build and optimize data pipeline automation, enabling seamless data integration and transformation across various systems.
- Create and manage cloud environments for big data platforms, ensuring scalability, high availability, and cost-efficiency.
- Collaborate with Data Engineering teams to integrate DevOps practices, ensuring efficient data pipeline deployment and management.
- Monitor and optimize build and release processes, ensuring minimal downtime and robust testing environments (see the sketch after this list).
- Implement data security and compliance measures within DevOps practices to safeguard sensitive data.
- Manage activity feeds, logs, and alerting for operational environments, including data processing pipelines.
- Work closely with cross-functional teams to define and implement DevOps and DataOps processes and practices.
- Identify and resolve issues related to infrastructure, data pipeline deployments, and systems at scale.
- Perform regular system monitoring, verifying the integrity and availability of all hardware, server resources, systems, and data pipelines.
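As a hedged illustration of the build-monitoring responsibility above, the sketch below polls recent pipeline runs through the Azure DevOps Builds REST API and flags failures. The organization, project, and environment-variable names are assumptions for the example; any real alerting hook would depend on the team's tooling.

```python
# Minimal sketch: list recent Azure DevOps builds and flag failed runs.
# Org, project, and env var names are illustrative assumptions.
import os

import requests

ORG = os.environ.get("AZDO_ORG", "my-org")              # hypothetical org
PROJECT = os.environ.get("AZDO_PROJECT", "my-project")  # hypothetical project
PAT = os.environ["AZDO_PAT"]                            # personal access token

def list_recent_builds(top: int = 20) -> list[dict]:
    """Return the most recent builds for the project via the REST API."""
    url = f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/build/builds"
    resp = requests.get(
        url,
        params={"api-version": "7.0", "$top": top},
        auth=("", PAT),  # PAT is sent as the basic-auth password
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["value"]

if __name__ == "__main__":
    for build in list_recent_builds():
        name = build["definition"]["name"]
        result = build.get("result", build.get("status", "unknown"))
        print(f"{name:40} {build['buildNumber']:>15} {result}")
        if result == "failed":
            # Hook point for alerting (webhook, ticket, etc.), omitted here.
            print(f"  -> investigate failed run of '{name}'")
```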