Senior Data DevOps - MLOps Expertise
Gurgaon, India
Description
EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.
We are seeking a skilled and driven Senior Data DevOps Engineer with strong MLOps expertise to join our team.
The ideal candidate will have a deep understanding of data engineering, data pipeline automation, and the operationalization of machine learning models. The role requires a collaborative professional capable of designing, deploying, and managing scalable data and ML pipelines that align with business goals.
Responsibilities
- Develop, deploy, and manage CI/CD pipelines for data integration and machine learning model deployment
- Implement and maintain infrastructure for data processing and model training using cloud-based tools and services
- Automate data validation, transformation, and workflow orchestration processes
- Collaborate with data scientists, software engineers, and product teams to ensure seamless integration of ML models into production
- Optimize model serving and monitoring to enhance performance and reliability
- Maintain data versioning, lineage tracking, and reproducibility of ML experiments (see the illustrative sketch after this list)
- Proactively identify opportunities to improve deployment processes, scalability, and infrastructure resilience
- Ensure robust security measures are in place to protect data integrity and compliance with regulations
- Troubleshoot and resolve issues across the data and ML pipeline lifecycle
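To make these responsibilities concrete, the following minimal Python sketch shows experiment tracking and model registration with MLflow. It is purely illustrative: the tracking URI, experiment name, and registered model name are hypothetical placeholders, and it assumes an MLflow tracking server with a model registry backend plus scikit-learn installed.

# Illustrative sketch only: the tracking URI, experiment name, and model name
# below are hypothetical; assumes an MLflow tracking server with a registry.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

mlflow.set_tracking_uri("http://mlflow.example.internal:5000")  # hypothetical server
mlflow.set_experiment("demo-classifier")                        # hypothetical experiment

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=200, random_state=42)
    model.fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))

    # Log parameters and metrics so the experiment stays reproducible
    mlflow.log_param("n_estimators", 200)
    mlflow.log_metric("accuracy", accuracy)

    # Log and register the model so a CI/CD job can promote a specific version
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="demo-classifier",  # hypothetical registry name
    )

Registering each run this way gives a downstream CI/CD pipeline a versioned artifact to validate and promote to serving, rather than an ad hoc file handoff.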
Requirements
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field
- 4+ years of experience in Data DevOps, MLOps, or related roles
- Strong proficiency in cloud platforms such as Azure, AWS, or GCP
- Experience with Infrastructure as Code (IaC) tools like Terraform, CloudFormation, or Ansible
- Expertise in containerization and orchestration technologies (e.g., Docker, Kubernetes)
- Hands-on experience with data processing frameworks (e.g., Apache Spark, Databricks)
- Proficiency in programming languages such as Python, with knowledge of data manipulation and ML libraries (e.g., Pandas, TensorFlow, PyTorch)
- Familiarity with CI/CD tools (e.g., Jenkins, GitLab CI/CD, GitHub Actions)
- Experience with version control tools (e.g., Git) and MLOps platforms (e.g., MLflow, Kubeflow)
- Strong understanding of monitoring, logging, and alerting systems (e.g., Prometheus, Grafana; see the sketch after this list)
- Excellent problem-solving skills and the ability to work independently and as part of a team
- Strong communication and documentation skills
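As a small illustration of the monitoring expectations above, the sketch below exposes basic model-serving metrics with the Python prometheus_client library. The metric names, port, and simulated prediction function are hypothetical, and it assumes a Prometheus server is configured to scrape the endpoint; it is a sketch, not a definitive setup.

# Illustrative sketch only: metric names, port, and the fake model are hypothetical.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

PREDICTIONS = Counter("model_predictions_total", "Number of predictions served")
LATENCY = Histogram("model_prediction_latency_seconds", "Prediction latency in seconds")

def predict(features):
    # Stand-in for a real model call; simulates latency and records metrics
    with LATENCY.time():
        time.sleep(random.uniform(0.01, 0.05))
        PREDICTIONS.inc()
        return sum(features)  # placeholder "prediction"

if __name__ == "__main__":
    start_http_server(8000)  # exposes /metrics for Prometheus to scrape
    while True:
        predict([random.random() for _ in range(5)])

Counters and histograms like these are what Grafana dashboards and alert rules on throughput and latency are typically built on.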
Nice to have
- Experience with DataOps concepts and tools (e.g., Airflow, dbt; see the sketch after this list)
- Knowledge of data governance and tools like Collibra
- Familiarity with Big Data technologies (e.g., Hadoop, Hive)
- Certifications in cloud platforms or data engineering
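For orchestration tools such as Airflow, a minimal daily validate-then-transform DAG might look like the sketch below. The DAG id, schedule, and task bodies are hypothetical placeholders, and it assumes Apache Airflow 2.4+; it only illustrates the idea of chaining validation before transformation.

# Illustrative sketch only: DAG id, schedule, and task logic are hypothetical;
# assumes Apache Airflow 2.4+ (for the "schedule" argument).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def validate_raw_data(**context):
    # Placeholder for real validation logic (schema checks, null counts, etc.)
    print("validating raw data for", context["ds"])

def transform_data(**context):
    # Placeholder for a real transformation step (e.g., a Spark or dbt job)
    print("transforming data for", context["ds"])

with DAG(
    dag_id="daily_data_quality",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    validate = PythonOperator(task_id="validate", python_callable=validate_raw_data)
    transform = PythonOperator(task_id="transform", python_callable=transform_data)

    validate >> transform  # transform runs only after validation succeeds

Making the dependency explicit ensures transformations only run on validated data, which is the essence of the automated data validation and workflow orchestration mentioned above.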
We offer
- Opportunity to work on technical challenges that may have an impact across geographies
- Vast opportunities for self-development: online university, global knowledge sharing, and learning through external certifications
- Opportunity to share your ideas on international platforms
- Sponsored Tech Talks & Hackathons
- Unlimited access to LinkedIn Learning
- Possibility to relocate to any EPAM office for short- and long-term projects
- Focused individual development
- Benefit package:
  - Health benefits
  - Retirement benefits
  - Paid time off
  - Flexible benefits
- Forums to explore passions beyond work (CSR, photography, painting, sports, etc.)