Banbury, Oxfordshire, United Kingdom Hybrid / WFH Options
Cornwallis Elt Ltd
Senior Databricks Engineer
Location: Banbury or Manchester (Hybrid - 2/3 days in office)
Salary: £60,000 - £75,000 per annum
Contract: Permanent, Full-Time
We're working with a well-established financial services organisation that's growing its data …
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
including Azure DevOps or GitHub
- Considerable experience designing and building operationally efficient pipelines, utilising core Cloud components such as Azure Data Factory, BigQuery, Airflow, Google Cloud Composer and PySpark
- Proven experience in modelling data through a medallion-based architecture, with curated dimensional models in the gold layer built for analytical use
- Strong understanding and/or use of …
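The medallion architecture mentioned above layers data as raw "bronze", cleaned "silver", and curated "gold". A stdlib-only Python sketch of the idea follows; the record shapes, field names, and cleaning rules are invented for illustration and are not any employer's actual pipeline:

```python
# Toy medallion-style flow: bronze holds raw ingested records, silver holds
# validated/typed records, gold holds a curated analytical rollup.
# All names and records here are hypothetical examples.

bronze = [  # raw events, possibly malformed
    {"order_id": "1", "amount": "10.50", "country": "GB"},
    {"order_id": "2", "amount": "bad",   "country": "GB"},
    {"order_id": "3", "amount": "4.25",  "country": "FR"},
]

def to_silver(rows):
    """Clean and type-cast raw rows, dropping ones that fail validation."""
    out = []
    for r in rows:
        try:
            out.append({"order_id": int(r["order_id"]),
                        "amount": float(r["amount"]),
                        "country": r["country"]})
        except (KeyError, ValueError):
            continue  # skip/quarantine malformed records
    return out

def to_gold(rows):
    """Aggregate silver rows into an analysis-ready summary per country."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'GB': 10.5, 'FR': 4.25}
```

In a real Databricks or BigQuery stack the same layering is expressed with tables and Spark/SQL transformations rather than Python dicts, but the contract is the same: each layer only reads from the one below it.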
London, South East England, United Kingdom Hybrid / WFH Options
Client Server
are an experienced Data Engineer within financial services environments
- You have expertise with GCP including BigQuery, Pub/Sub, Cloud Composer and IAM
- You have strong Python, SQL and PySpark skills
- You have experience with real-time data streaming using Kafka or Spark
- You have a good knowledge of Data Lakes, Data Warehousing, Data Modelling
- You're familiar with …
London, South East, England, United Kingdom Hybrid / WFH Options
Method Resourcing
collaboration, learning, and innovation
What we're looking for:
- Hands-on experience with the Azure Data Engineering stack (ADF, Databricks, Synapse, Data Lake)
- Strong skills in SQL and Python (PySpark experience is a bonus)
- Experience building and optimising ETL/ELT pipelines
- A background in Financial Services is a plus, but not essential
- A curious mindset and the ability …
leadership and upskilling responsibilities.
Key Responsibilities:
- Build and maintain Databricks Delta Live Tables (DLT) pipelines across Bronze → Silver → Gold layers, ensuring quality, scalability, and reliability.
- Develop and optimise Spark (PySpark) jobs for large-scale distributed processing.
- Design and implement streaming data pipelines with Kafka/MSK, applying best practices for late event handling and throughput.
- Use Terraform and CI …
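"Late event handling" in streaming pipelines is usually done with a watermark: the pipeline tracks the maximum event time seen, subtracts an allowed lateness, and drops or sidelines events older than that cutoff. A stdlib-only toy of the mechanism (not actual Spark or Kafka code; the threshold and timestamps are invented):

```python
# Toy watermark: events older than (max event time seen - allowed lateness)
# are rejected as "too late". Values below are hypothetical.

class Watermark:
    def __init__(self, allowed_lateness):
        self.allowed_lateness = allowed_lateness  # seconds of lateness tolerated
        self.max_event_time = 0

    def accept(self, event_time):
        """Return True if the event is on time, False if it is too late."""
        self.max_event_time = max(self.max_event_time, event_time)
        return event_time >= self.max_event_time - self.allowed_lateness

wm = Watermark(allowed_lateness=10)
results = [wm.accept(t) for t in [100, 105, 96, 85]]
print(results)  # [True, True, True, False] -- 85 is >10s behind the max (105)
```

In Spark Structured Streaming the equivalent knob is `withWatermark` on the event-time column; the trade-off is the same: a larger lateness window keeps more state but tolerates slower events.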
and machine learning use cases. Support the migration of legacy reporting tools into Databricks and modern BI solutions.
Key Skills & Experience
Essential:
- Strong hands-on experience with Databricks (SQL, PySpark, Delta Lake).
- Solid knowledge of BI and data visualisation tools (e.g., Power BI, Tableau, Qlik).
- Strong SQL and data modelling skills.
- Experience working with large, complex financial …
data platforms, and integrations, while ensuring solutions meet regulatory standards and align with architectural best practices.
Key Responsibilities:
- Build and optimise scalable data pipelines using Databricks and Apache Spark (PySpark).
- Ensure performance, scalability, and compliance (GxP and other standards).
- Collaborate on requirements, design, and backlog refinement.
- Promote engineering best practices including CI/CD, code reviews, and …
London, South East England, United Kingdom Hybrid / WFH Options
Peaple Talent
having delivered in Microsoft Azure
- Strong experience designing and delivering data solutions in Databricks
- Proficient with SQL and Python
- Experience using Big Data technologies such as Apache Spark or PySpark
- Great communication skills, engaging effectively with senior stakeholders
Nice to haves:
- Azure Data Engineering certifications
- Databricks certifications
What's in it for you:
📍Location: London (Hybrid)
💻Remote working: Occasional …
data-driven culture, all within a collaborative environment that values innovation and ownership.
Tech you’ll be working with:
- Azure Data Lake
- Azure Synapse
- Databricks
- Data Factory
- Python, SQL, PySpark
- Terraform, GitHub Actions, CI/CD pipelines
You’ll thrive here if you:
- Have strong experience building and leading Azure-based data platforms
- Enjoy mentoring and guiding other engineers …
London, South East England, United Kingdom Hybrid / WFH Options
8Bit - Games Industry Recruitment
improve AI models over time.
REQUIREMENTS
- 2 years of proven experience in data engineering for ML/AI, with strong proficiency in Python, SQL, and distributed data processing (e.g., PySpark).
- Hands-on experience with cloud data platforms (GCP, AWS, or Azure), orchestration frameworks (e.g., Airflow), and ELT/ETL tools.
- Familiarity with 2D and 3D data formats (e.g. …
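At their core, orchestration frameworks like Airflow run pipeline tasks in dependency order (a DAG). The scheduling idea reduces to a topological sort, which Python's standard library can demonstrate directly; the task names below are invented for illustration:

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Hypothetical ML-data pipeline: one extract feeds two transforms,
# which both feed a final load step.
deps = {
    "transform_images": {"extract"},
    "transform_meshes": {"extract"},
    "load_features":    {"transform_images", "transform_meshes"},
}

order = list(TopologicalSorter(deps).static_order())
print(order)  # 'extract' runs first, 'load_features' runs last
```

Real orchestrators add scheduling, retries, and backfills on top, but the dependency-ordering contract is the same.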
Studio and data transformation logic
- Azure Fabric, Azure Data Factory, Synapse, Data Lakes and Lakehouse/Warehouse technologies
- ETL/ELT orchestration for structured and unstructured data
- Proficiency in: PySpark, T-SQL, Notebooks and advanced data manipulation
- Performance monitoring and orchestration of Fabric solutions
- Power BI semantic models and Fabric data modelling
- DevOps deployment using ARM/Bicep templates …