EPAM

Databricks Engineer (relocation to Cyprus), Romania

Description

We are looking for a Databricks Engineer with expertise in cloud infrastructure, big data processing, data integration and analytics, particularly with Azure.

In this role, you will play a crucial part in advancing our client's Data-Driven Company initiative within the insurance sector, designing and building robust data architecture to support various aspects of the client's business.

Join us at our Cyprus office, which offers a flexible hybrid work setup. If you're ready to leverage your skills and perspective to make a significant impact, apply now and help us transform our data capabilities in the finance and insurance industries.

Responsibilities

  • Design, develop and maintain scalable data pipelines and architectures
  • Collaborate with Machine Learning engineers to integrate AI-driven insights and recommendations
  • Develop and optimize data models and ETL processes using Databricks and other technologies
  • Implement data quality checks and monitoring to ensure high data integrity
  • Stay updated with emerging trends and technologies in data engineering and propose adoption of new tools where beneficial
  • Troubleshoot and resolve data-related issues in a timely manner
  • Participate in code reviews and maintain high standards of code quality
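
The data quality checks mentioned above can be pictured as a simple validation gate between pipeline stages. The sketch below is a minimal, stdlib-only illustration of the idea; the function names and thresholds are hypothetical, and on Databricks this kind of logic would more typically be expressed as a PySpark job or a Delta Live Tables expectation:

```python
# Illustrative data quality gate for a batch of records.
# All names (null_rate, quality_gate, the sample columns) are
# hypothetical; this is a sketch of the concept, not a Databricks API.

def null_rate(rows, column):
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def quality_gate(rows, required_columns, max_null_rate=0.01):
    """Return a list of failed checks; an empty list means the batch passes."""
    failures = []
    for col in required_columns:
        rate = null_rate(rows, col)
        if rate > max_null_rate:
            failures.append(
                f"{col}: null rate {rate:.2%} exceeds {max_null_rate:.2%}"
            )
    return failures

# Example batch with deliberate gaps in both columns.
batch = [
    {"policy_id": 1, "premium": 120.0},
    {"policy_id": 2, "premium": None},
    {"policy_id": None, "premium": 95.5},
]
print(quality_gate(batch, ["policy_id", "premium"]))
```

In a real pipeline, a non-empty failure list would typically halt the downstream write or route the batch to a quarantine table rather than letting bad data propagate.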

Requirements

  • Experience as a Data Engineer in a project employing the Databricks platform
  • Hands-on experience with Databricks (Delta Lake, workflows, Delta Live Tables, deployment and versioning)
  • Solid understanding of data architectures, data modeling skills; experience in designing and building ETL pipelines with Databricks using external orchestrators like Airflow
  • Expertise in Python, PySpark and SQL
  • Proficiency in cloud-native technologies and software engineering best practices (containers, unit tests, linting and code style checks)
  • Engineering experience with either AWS, Azure or both
  • Experience with big data and performance optimization of data-intensive applications
  • Proactivity and client-facing experience
  • Ability to deal with ambiguity and work independently without constant direction
  • Desire to work in a transparent and fast-moving startup environment
  • Fluent English communication skills at a B2+ level

Nice to have

  • Experience setting up or maintaining CI/CD pipelines on Azure DevOps
  • Understanding of Data Observability and Data Quality Monitoring; experience integrating data quality checks in data pipelines
  • Knowledge of Ingestion Pipelines, PostgreSQL and Terraform

We offer

  • Private healthcare insurance
  • Regular performance assessments
  • Family friendly initiatives
  • Corporate Programs including Employee Referral Program with rewards
  • Learning and development opportunities including in-house training and coaching, professional certifications, over 22,000 courses on LinkedIn Learning Solutions and much more
  *All benefits and perks are subject to certain eligibility requirements
