Senior Data Engineer
Azure, Databricks, PySpark, Azure Data Factory
London - Hybrid (2 days in the office)
£75,000 to £85,000 + bonus and benefits
We are working with a leading Lloyd's and London Market insurer who are continuing to invest heavily in their Azure-based data platform. This role sits within a specialist data engineering team responsible for building and evolving a modern cloud data architecture.
You will work across a Databricks-led Azure data stack, contributing to both the design and development of scalable data pipelines and helping shape how the platform evolves. The team operate in a collaborative environment with strong technical capability, where engineers are encouraged to take ownership and influence solutions.
This role is suited to a hands-on Senior Data Engineer who is comfortable working across both development and design, with strong experience in Databricks and modern data engineering practices.
Responsibilities
- Design and develop scalable data solutions using Azure technologies including Databricks, Azure Data Factory, ADLS and Synapse
- Build and optimise data pipelines using PySpark and SQL within a lakehouse architecture
- Work with Delta Lake and contribute to the implementation of structured data layers (eg bronze, silver, gold)
- Contribute to solution design, translating business requirements into technical data models and pipelines
- Collaborate with engineers, architects and product owners to deliver new data capabilities
- Implement and support CI/CD pipelines and DevOps practices across multiple environments
- Monitor and optimise performance of data pipelines, identifying improvements in scalability and efficiency
- Support data quality, validation and governance practices across the data platform
Requirements
- Strong experience as a Data Engineer within an Azure environment
- Hands-on experience with Databricks, including PySpark and Spark SQL
- Experience working with Azure Data Factory and Azure Data Lake (ADLS Gen2)
- Good understanding of Delta Lake and modern lakehouse architectures
- Strong SQL skills, including complex transformations and analytical queries
- Experience building and maintaining ETL or ELT pipelines at scale
- Exposure to CI/CD and DevOps tooling (eg Azure DevOps, GitHub Actions)
- Experience working in Agile teams with strong stakeholder engagement
Nice to have
- Experience with data modelling (dimensional/semantic models)
- Exposure to data governance practices or tooling such as Unity Catalog
- Experience with streaming or near real-time data pipelines
- Experience within Financial Services or the London Market
Why apply
- Opportunity to work on a modern Azure and Databricks data platform
- Strong technical team with real ownership and influence
- Exposure to both engineering and solution design
- Clear opportunity to further develop within a cloud-first environment