Senior Data DevOps Engineer (Remote)

Senior Data DevOps Engineer Description

We're seeking a remote Senior Data DevOps Engineer to join our dynamic team for a new project focused on developing and managing data infrastructure in the cloud, primarily using AWS, Azure, or GCP.

In this role, you will design, deploy, and manage data systems; develop automation scripts and workflows for infrastructure provisioning, deployment, and monitoring; and optimize the performance, scalability, and reliability of data platforms and systems.

You will work closely with the data engineering team to ensure efficient data pipelines and processes, automating data workflows using Python. You will also be responsible for setting up and maintaining continuous integration and delivery (CI/CD) pipelines using tools such as Jenkins, GitHub Actions, or similar cloud-based CI/CD tools.
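
To give a concrete flavor of the workflow automation described above, here is a minimal sketch of a Python data pipeline defined with Apache Airflow, one of the orchestrators named in this posting. It assumes a recent Airflow 2.x release (the schedule parameter replaced schedule_interval in 2.4), and the DAG id, schedule, and extract/load callables are illustrative placeholders, not project specifics:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract() -> None:
        # Placeholder: pull raw records from a source system.
        print("extracting raw data")


    def load() -> None:
        # Placeholder: write the records to the warehouse.
        print("loading data into the warehouse")


    # A daily two-step pipeline; dag_id and schedule are illustrative.
    with DAG(
        dag_id="example_daily_ingest",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> load_task  # extract must finish before load runs

In practice, a DAG like this would be deployed through the CI/CD pipeline mentioned above rather than edited by hand on the scheduler host.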


Responsibilities

  • Design, deploy, and manage data infrastructure in the cloud, primarily using AWS, Azure, or GCP
  • Develop and implement automation scripts and workflows for infrastructure provisioning, deployment, and monitoring using tools like Terraform or similar Infrastructure as Code (IaC) tools (see the sketch after this list)
  • Work closely with the data engineering team to ensure efficient data pipelines and processes, automating data workflows using Python
  • Set up and maintain continuous integration and delivery (CI/CD) pipelines using tools such as Jenkins, GitHub Actions, or similar cloud-based CI/CD tools
  • Collaborate with cross-functional teams to optimize the performance, scalability, and reliability of data platforms and systems
  • Install, configure, and maintain data tools such as Apache Spark, Apache Kafka, ELK Stack, Apache NiFi, Apache Airflow, or similar tools in both on-premises and cloud environments
  • Monitor and troubleshoot data systems, proactively identifying and resolving performance, scalability, and reliability issues
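
As referenced in the provisioning item above, here is a minimal Infrastructure as Code sketch kept in Python, using Pulumi as one of the "similar IaC tools" the posting allows (Terraform itself is configured in HCL rather than Python). The bucket resource, tags, and exported output are hypothetical examples, not part of the actual project:

    import pulumi
    import pulumi_aws as aws

    # Hypothetical landing-zone bucket for raw pipeline data.
    raw_bucket = aws.s3.Bucket(
        "raw-data",
        tags={"team": "data-platform", "env": "dev"},
    )

    # Expose the generated bucket name so pipelines and other stacks can reference it.
    pulumi.export("raw_bucket_name", raw_bucket.id)

Running pulumi preview and then pulumi up shows and applies the declared state, the same plan/apply discipline Terraform users will recognize.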

Requirements

  • Minimum of 3 years of experience in data infrastructure management and DevOps
  • Strong proficiency in Python and experience with Bash scripting
  • Professional mastery of the Linux operating system
  • Strong knowledge of Cloud technologies (AWS, GCP or Azure)
  • Solid understanding of network protocols and mechanisms such as TCP, UDP, ICMP, DHCP, DNS, and NAT
  • Hands-on experience using or setting up data tools such as Spark, Airflow, R
  • Proficiency with SQL
  • Experience with Infrastructure as Code (IaC) tools 
  • Proficiency with setting up and managing CI/CD pipelines using tools like Jenkins, Bamboo, TeamCity, GitLab CI, GitHub Actions, or similar cloud-based CI/CD tools
  • Experience installing and configuring data tools such as Apache Spark, Apache Kafka, ELK Stack, Apache NiFi, Apache Airflow, or similar tools
  • Good verbal and written communication skills in English at a B2+ level

Nice to have

  • Expertise in AWS CloudFormation
  • Knowledge of Terraform and Ansible
  • Azure DevOps skills
