Data Engineer
A global consulting company with an exclusive focus on the healthcare and life sciences industries is looking for a Data Engineer.
Responsibilities:
- Build and manage ETL/ELT pipelines using tools like Databricks, dbt, PySpark, and SQL.
- Contribute to scalable data platforms across cloud environments (Azure, AWS, GCP).
- Implement and maintain CI/CD workflows using tools such as GitHub Actions and Azure DevOps.
- Apply DataOps principles: pipeline versioning, testing, lineage, deployment automation, and monitoring.
- Integrate automated data quality checks, profiling, and validation into pipelines.
- Ensure strong data observability via logging, metrics, and alerting tools.
- Collaborate on infrastructure as code for data environments using Terraform or similar tools.
- Connect and orchestrate ingestion from APIs, relational databases, and file systems.
- Work in agile teams, contributing to standups, retrospectives, and continuous improvement.
Requirements — Must have:
We believe diverse perspectives and backgrounds lead to better ideas. Even if you don’t meet every requirement, we’d still love to hear from you.
- Experience with cloud-native data engineering using tools such as Databricks, dbt, PySpark, and SQL.
- Comfort working with at least one major cloud platform (Azure, AWS, GCP) — and openness to others.
- Hands-on experience with CI/CD automation, especially with GitHub Actions or Azure Pipelines.
- Strong Python programming skills for transformation, scripting, and automation.
- Working knowledge of data quality, validation frameworks, and test-driven data development.
- Familiarity with observability practices including metrics, logging, and data lineage tools.
- Understanding of DataOps concepts, including reproducibility, automation, and collaboration.
- Team-first mindset and experience in agile environments (Scrum or Kanban).
- Professional working proficiency in English (our internal and client-facing working language).
Requirements — Nice to have:
- Experience with Snowflake or similar cloud data warehouses.
- Knowledge of data lineage tools and frameworks.
- Infrastructure automation using Terraform, Bash, or PowerShell.
- Exposure to data modeling techniques like Data Vault or dimensional modeling.
- Familiarity with data testing tools.
- Understanding of GxP or other healthcare data regulations.
- Experience with non-relational data systems (e.g., MongoDB, CosmosDB).
Languages
- English B2
We Offer:
— Relocation to Batumi, Georgia, with an excellent compensation package
— A modern office in a convenient location
— Full medical insurance for the employee and their family
— A range of training programs to support your personal and professional development
— An employee-centric culture shaped by employee feedback: your voice is heard and your perspective encouraged
— Career development opportunities and the chance to help shape the company’s future
— Work in a fast-growing, international company
— A friendly atmosphere and a supportive management team
Full-time
Office, hybrid
Contacts: davidovicholga0102@gmail.com
No resume? Download our questionnaire!