Databricks Engineer - Insurance

We are seeking a Data Engineer (Databricks) to support the growth of a global technology provider in the Insurtech space. The role focuses on designing and delivering ETL pipelines and scalable data solutions on the Azure ecosystem, with an emphasis on enabling advanced analytics and data-driven decision-making. As a key member of a high-performing data engineering team, you will contribute to large-scale transformation initiatives in P&C Insurance by developing robust data models, optimising data flows, and ensuring the accuracy and accessibility of critical information. The position requires close collaboration with both technical and business stakeholders, and is an excellent opportunity to join a team that invests in your growth through comprehensive training and certification programmes, with real scope to showcase your talents.

Key Responsibilities

  • Design and develop ETL pipelines using Azure Data Factory for data ingestion and transformation
  • Work with Azure Data Lake Storage and Azure Synapse Analytics (formerly SQL Data Warehouse) to manage large volumes of data
  • Develop data transformation logic in SQL, Python, and PySpark
  • Collaborate with cross-functional teams to translate business requirements into data solutions
  • Create mapping documents and transformation rules, and ensure quality delivery
  • Contribute to DevOps processes, CI/CD deployments, and agile delivery models
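To give a flavour of the transformation logic described above, here is a minimal, hypothetical sketch in plain Python (so it runs without a Spark cluster); in a real pipeline the same cleanse-and-aggregate pattern would typically be written against the PySpark DataFrame API. All record fields, column names, and rules below are invented for illustration and are not taken from the employer's actual pipelines:

```python
from datetime import datetime

# Hypothetical raw policy records, as they might arrive from an ingestion feed.
raw_policies = [
    {"policy_id": "P001", "line": "Motor", "premium": "1200.50", "start": "2024-01-15"},
    {"policy_id": "P002", "line": "Property", "premium": "980.00", "start": "2024-02-01"},
    {"policy_id": "P003", "line": "Motor", "premium": "", "start": "2024-03-10"},
]

def transform(records):
    """Filter out incomplete rows, cast types, and aggregate premium by line of business."""
    totals = {}
    for rec in records:
        if not rec["premium"]:  # drop rows with a missing premium
            continue
        premium = float(rec["premium"])            # cast string to numeric
        datetime.strptime(rec["start"], "%Y-%m-%d")  # validate the date format
        totals[rec["line"]] = totals.get(rec["line"], 0.0) + premium
    return totals

print(transform(raw_policies))  # {'Motor': 1200.5, 'Property': 980.0}
```

The same filter/cast/group-by shape maps directly onto `DataFrame.filter`, `withColumn`, and `groupBy().agg()` calls in PySpark.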

Requirements

  • 8+ years' experience in data engineering, ETL development and big data solutions
  • Recent experience within Insurance Technology essential
  • Solid expertise with Azure Databricks, Data Factory, ADLS, Synapse, and Azure SQL
  • Strong skills in SQL, Python, and PySpark
  • Solid understanding of DevOps, CI/CD, and Agile methodologies
  • Excellent communication and stakeholder management skills

Company: Calibre Candidates
Location: London, UK
Employment Type: Part-time
Posted