London, England, United Kingdom Hybrid / WFH Options
Axis Capital
Senior Data Engineer - (Azure/Databricks) Axis Capital London, United Kingdom Apply now Posted 15 days ago Permanent Competitive This is your opportunity to join AXIS Capital - a trusted global provider of specialty lines insurance and reinsurance. We stand apart for our outstanding client service, intelligent risk … family or parental status, or any other characteristic protected by law. Accommodation is available upon request for candidates taking part in the selection process. Senior Data Engineer (Azure/Databricks) Job Family Grouping: Chief Underwriting Officer Job Family: Data & Analytics Location: London How does this role contribute to our collective success? The Data & Analytics department transforms raw data into actionable … MDW) technologies, big data, and Lakehouse architectures to ensure our data solutions are secure, efficient, and optimized. Key Responsibilities: Design and implement data solutions using Azure services, including Azure Databricks, ADF, and Data Lake Storage. Develop and maintain ETL/ELT pipelines to process structured and unstructured data from multiple sources. Automate loads using Databricks Workflows and Jobs. Develop, test More ❯
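The ETL/ELT pipeline duties described above follow a common extract-transform-load shape. A minimal, framework-agnostic sketch in plain Python (the function and field names, such as `policy_id`, are illustrative stand-ins, not taken from the posting):

```python
from typing import Iterable

def extract(rows: Iterable[dict]) -> list[dict]:
    """Pull raw records from a source (here: an in-memory stand-in)."""
    return list(rows)

def transform(rows: list[dict]) -> list[dict]:
    """Normalise field names and drop records missing a key identifier."""
    cleaned = []
    for row in rows:
        if not row.get("policy_id"):
            continue  # mixed structured/unstructured feeds often carry partial records
        cleaned.append({
            "policy_id": row["policy_id"],
            "premium": float(row.get("premium", 0.0)),
        })
    return cleaned

def load(rows: list[dict], sink: list) -> int:
    """Append transformed records to the target (stand-in for lake storage)."""
    sink.extend(rows)
    return len(rows)

raw = [{"policy_id": "P1", "premium": "100.5"}, {"premium": "7"}]
sink: list[dict] = []
loaded = load(transform(extract(raw)), sink)
```

In a real Databricks pipeline the same three stages would typically be Spark reads, DataFrame transformations, and Delta writes scheduled by Workflows.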
to ongoing initiatives. This can include contributing to knowledge-sharing activities and data services. Essential technical experience you will demonstrate Strong experience designing and delivering data solutions in the Databricks Data Intelligence Platform, either on Azure or AWS. Good working knowledge of Databricks components: Delta Lake, Unity Catalog, MLflow, etc. Expertise in SQL, Python, and Spark (Scala or Python). More ❯
Ibstock, England, United Kingdom Hybrid / WFH Options
Ibstock Plc
new, innovative and sustainable products and solutions, both for today and for a new era of building. To support our progress, we are currently recruiting for a Data Engineer (Databricks & BI) to come and join our team at Ibstock Head Office, LE67 6HS. (Applicants must have Right to work in the UK - We are unable to provide sponsorship for this … role) Job Purpose: The Data Engineer/BI Developer will play a critical role in developing and refining our modern data lakehouse platform, with a primary focus on Databricks for data engineering, transformation, and analytics. The role will involve designing, developing, and maintaining scalable data solutions to support business intelligence and reporting needs. This platform integrates Databricks, Power BI, and … JDE systems to provide near real-time insights for decision-making. Key Accountabilities: Support the ongoing design, development, and expansion of a near real-time data platform integrating AWS, Databricks and on-prem JDE systems. Develop ETL processes to integrate and transform data from multiple sources into a centralised data platform. Develop and optimise complex queries and data pipelines. Optimise More ❯
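Near real-time integration of the kind described above usually comes down to key-based upserts: each micro-batch from the source system is merged into the central store, updating existing keys and inserting new ones. A small illustrative sketch (the `id`/`qty` fields are hypothetical; in Databricks this would be a Delta `MERGE INTO`):

```python
def upsert(target: dict, batch: list[dict], key: str = "id") -> dict:
    """Merge a micro-batch into a keyed store: existing keys are updated,
    new keys are inserted (Delta MERGE-style semantics)."""
    for row in batch:
        # Merge new fields over any existing record for this key.
        target[row[key]] = {**target.get(row[key], {}), **row}
    return target

store = {"A1": {"id": "A1", "qty": 5}}
upsert(store, [{"id": "A1", "qty": 7}, {"id": "B2", "qty": 1}])
```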
London, England, United Kingdom Hybrid / WFH Options
Locus Robotics
for deploying and scaling data systems. Highly desired experience with Azure, particularly Lakehouse and Eventhouse architectures. Experience with relevant infrastructure and tools including NATS, Power BI, Apache Spark/Databricks, and PySpark. Hands-on experience with data warehousing methodologies and optimization libraries (e.g., OR-Tools). Experience with log analysis, forensic debugging, and system performance tuning. Exposure to cloud-based More ❯
Gloucester, England, United Kingdom Hybrid / WFH Options
Benefact Group
and optimization of data pipelines and resources. Knowledge, skills and experience Essential: Cloud Platforms: Experience with Azure, AWS, or Google Cloud for data engineering. Cloud Data Tools: Expertise in Databricks, Snowflake, BigQuery, or Synapse Analytics. Programming & Scripting: Strong knowledge of Python, SQL, Spark, or similar. Data Modelling & Warehousing: Experience with cloud-based data architecture. CI/CD & DevOps: Knowledge of More ❯
Gloucester, England, United Kingdom Hybrid / WFH Options
Benefact Group plc
data pipelines and cloud resources. Knowledge, skills and experience Essential: Cloud Platforms: Demonstrable experience with Azure, AWS, or Google Cloud for data engineering. Cloud-Based Data Tools: Expertise in Databricks, Snowflake, BigQuery, or Synapse Analytics for scalable data solutions. Programming & Scripting: Strong knowledge of Python, SQL, Spark, or equivalent cloud technologies. Data Modelling & Warehousing: Experience with modern cloud-based data More ❯
London, England, United Kingdom Hybrid / WFH Options
DATAPAO
Senior Data Engineer (Databricks) role at DATAPAO. At DATAPAO, data … fuels collaboration, and growth knows no bounds. We are a leading Data Engineering and Data Science consulting company, recognized for our innovation and rapid growth. We have been named Databricks EMEA Emerging Business Partner of the Year and have achieved a second consecutive appearance on the Financial Times FT1000 list. We are currently looking for a Senior Data Engineer … of our most complex projects - individually or by leading small delivery teams. Our projects are fast-paced, typically 2 to 4 months long, and primarily use Apache Spark/Databricks on AWS/Azure. You will manage customer relationships either alone or with a Project Manager, and support our pre-sales, mentoring, and hiring efforts. What does it take to More ❯
London, England, United Kingdom Hybrid / WFH Options
Datapao
biggest multinational companies where years-long behemoth projects are the norm, our projects are fast-paced, typically 2 to 4 months long. Most are delivered using Apache Spark/Databricks on AWS/Azure and require you to directly manage the customer relationship alone or in collaboration with a Project Manager. Additionally, at this seniority level, we expect you to … Technical Expertise You (ideally) have 5+ years of experience in Data Engineering , with a focus on cloud platforms (AWS, Azure, GCP); You have a proven track record working with Databricks (PySpark, SQL, Delta Lake, Unity Catalog); You have extensive experience in ETL/ELT development and data pipeline orchestration (e.g., Databricks Workflows, DLT, Airflow, ADF, Glue, and Step Functions); You … and optimize data like a pro; You know your way around CI/CD pipelines and Infrastructure as Code (Terraform, CloudFormation, or Bicep); You have hands-on experience integrating Databricks with BI tools (Power BI, Tableau, Looker). Consulting & Client-Facing Skills Ideally, you bring a proven history in consulting , from scoping to gathering requirements, designing solutions, and communicating effectively More ❯
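The orchestration tools listed above (Databricks Workflows, Airflow, ADF, Glue, Step Functions) all model a pipeline as a dependency graph of tasks and run them in an order that respects those dependencies. The core idea can be sketched with the standard library alone (task names here are invented for illustration):

```python
from graphlib import TopologicalSorter

# Task dependency graph: each task maps to the set of tasks it depends on.
dag = {
    "ingest": set(),
    "clean": {"ingest"},
    "model": {"clean"},
    "publish": {"model", "clean"},
}

def run_order(graph: dict[str, set[str]]) -> list[str]:
    """Return an execution order that respects every dependency edge."""
    return list(TopologicalSorter(graph).static_order())

order = run_order(dag)
```

Real orchestrators add scheduling, retries, and parallel execution of independent tasks on top of this same topological-ordering core.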
enterprise-scale data platforms for data ingestion, data processing, data transformation, business intelligence, AI, and advanced analytics. Proven hands-on capability with relevant technology: Azure Platform, Azure Data Services, Databricks, Power BI, SQL DW, Snowflake, Big Query, and Advanced Analytics. Proven ability to understand low-level data engineering solutions and languages (Spark, MPP, Python, Delta, Parquet). Experience with Azure More ❯
engineers and scientists working across Africa, Europe, the UK and the US. ABOUT THE ROLE Sand Technologies focuses on cutting-edge cloud-based data projects, leveraging tools such as Databricks, DBT, Docker, Python, SQL, and PySpark to name a few. We work across a variety of data architectures such as Data Mesh, lakehouse, data vault and data warehouses. Our data … RESPONSIBILITIES Data Pipeline Development: Lead the design, implementation, and maintenance of scalable data pipelines for ingesting, processing, and transforming large volumes of data from various sources using tools such as Databricks, Python, and PySpark. Data Architecture: Architect scalable and efficient data solutions using the appropriate architecture design, opting for modern architectures where possible. Data Modeling: Design and optimize data models and … or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks, Spark, Kafka, and Flink. Experience in using modern data architectures, such as lakehouse. Experience with CI/CD pipelines, version control systems like Git, and containerization (e.g., Docker). Experience More ❯
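The dimensional modelling mentioned above means splitting flat operational records into fact and dimension tables. A toy sketch of the idea in plain Python (the `customer`/`amount` schema is invented for illustration):

```python
def to_star_schema(rows: list[dict]):
    """Split flat records into a customer dimension table and a fact table
    that references dimension rows by surrogate key."""
    dim_customer: dict[str, int] = {}
    facts = []
    for row in rows:
        name = row["customer"]
        # Assign a surrogate key the first time each customer is seen.
        key = dim_customer.setdefault(name, len(dim_customer) + 1)
        facts.append({"customer_key": key, "amount": row["amount"]})
    dims = [{"customer_key": k, "customer": n} for n, k in dim_customer.items()]
    return dims, facts

rows = [
    {"customer": "Acme", "amount": 10},
    {"customer": "Acme", "amount": 20},
    {"customer": "Globex", "amount": 5},
]
dims, facts = to_star_schema(rows)
```

In a warehouse the same split yields narrow, deduplicated dimensions and a tall fact table that joins back on the surrogate key.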
London, England, United Kingdom Hybrid / WFH Options
the LEGO Group
business needs & improve overall data availability for the business. Partner with E2E LEGO.com Operations digital product teams to ensure high-quality data is collected and published to LEGO Nexus (Databricks) to a standard fit for purpose for downstream delivery of data products. Enable LEGO Retail specific data understanding and champion data literacy via guidelines, training, drop-in sessions, documentation, and … a curious, solution-driven mindset. Fluent in English and open to light travel (up to 10 days/year). Nice to have Experience with CI/CD pipelines, Databricks, and Databricks Asset Bundles is beneficial. You’re comfortable working in cross-cultural environments and collaborating with global teams across time zones. A good understanding of consumer retail or direct More ❯
MySQL) Understanding of ETL/ELT principles, data architecture, and data warehouse concepts Familiarity with APIs, RESTful services, and JSON/XML data handling Experience with Azure Data Factory, Databricks, or AWS Glue Familiarity with CI/CD, version control (Git), and DevOps practices Knowledge of cloud platforms (Azure, AWS, or GCP) Basic understanding of data governance, security, and GDPR More ❯
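The JSON data handling mentioned above often starts with flattening nested API payloads into tabular rows before loading. A minimal sketch (the payload shape is invented for illustration):

```python
import json

def flatten(obj: dict, prefix: str = "") -> dict:
    """Recursively flatten nested JSON objects into dotted column names,
    a common first step when landing API payloads in a warehouse table."""
    out = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            out.update(flatten(value, name + "."))
        else:
            out[name] = value
    return out

payload = json.loads('{"order": {"id": 7, "ship": {"city": "Leeds"}}, "total": 9.5}')
row = flatten(payload)
```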
Wilmslow, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
for building scalable, reliable data pipelines, managing data infrastructure, and supporting data products across various cloud environments, primarily Azure. Key Responsibilities: Develop end-to-end data pipelines using Python, Databricks, PySpark, and SQL. Integrate data from various sources including APIs, Excel, CSV, JSON, and databases. Manage data lakes, warehouses, and lakehouses within Azure cloud environments. Apply data modelling techniques such More ❯
London, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Leading Energy Company - London (Tech Stack: Data Engineer, Databricks, Python, PySpark, Power BI, AWS QuickSight, AWS, TSQL, ETL, Agile Methodologies) Company Overview: Join a dynamic team, a leading player in the energy sector, committed to innovation and sustainable solutions. More ❯
to collaborate with cross-functional teams effectively. Inquisitive with an eagerness to learn. Preferred Qualifications Experience with additional Azure services like Azure Data Lake, Azure SQL Database, or Azure Databricks. Familiarity with Infrastructure-as-Code (IaC) practices. Strong knowledge of data governance and security best practices in the cloud. Previous experience in a DevOps environment with CI/CD More ❯
Boomi, MuleSoft), particularly in mid-market enterprise integration scenarios. Deep experience with the design, development, implementation and support of cloud-native data platforms such as Snowflake, Azure SQL Database, Databricks, Microsoft Fabric or Azure Synapse Analytics. Demonstrated success implementing data governance programs with tools like Collibra, Alation, Microsoft Purview, or Informatica, including projects around lineage, cataloging, and quality rules. Strong More ❯
Techyard is supporting a growing Databricks-partnered consultancy in securing a Databricks Champion Solution Architect to lead the design and delivery of advanced data and AI solutions on the Databricks Lakehouse Platform. This strategic, high-impact role is ideal for a seasoned professional who can operate as both a hands-on architect and a trusted advisor—bridging business vision with … the consultancy’s data engineering capabilities as they scale their team and client base. Key Responsibilities: Architect and implement end-to-end, scalable data and AI solutions using the Databricks Lakehouse (Delta Lake, Unity Catalog, MLflow). Design and lead the development of modular, high-performance data pipelines using Apache Spark and PySpark. Champion the adoption of Lakehouse architecture (bronze … Collaborate with stakeholders, analysts, and data scientists to translate business needs into clean, production-ready datasets. Promote CI/CD, DevOps, and data reliability engineering (DRE) best practices across Databricks environments. Integrate with cloud-native services and orchestrate workflows using tools such as dbt, Airflow, and Databricks Workflows. Drive performance tuning, cost optimisation, and monitoring across data workloads. Mentor engineering More ❯
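The bronze/silver/gold Lakehouse (medallion) layering referenced above progressively refines data: bronze holds raw records as landed, silver holds validated and typed records, and gold holds business-ready aggregates. A toy sketch of the flow (stage logic and the `region`/`sales` fields are illustrative, not from any client engagement):

```python
def to_silver(bronze: list[dict]) -> list[dict]:
    """Validate and type raw bronze rows, discarding unparseable ones."""
    silver = []
    for row in bronze:
        try:
            silver.append({"region": row["region"], "sales": float(row["sales"])})
        except (KeyError, ValueError):
            continue  # quarantine/skip malformed records
    return silver

def to_gold(silver: list[dict]) -> dict[str, float]:
    """Aggregate silver rows into a business-ready summary: sales per region."""
    gold: dict[str, float] = {}
    for row in silver:
        gold[row["region"]] = gold.get(row["region"], 0.0) + row["sales"]
    return gold

bronze = [{"region": "EU", "sales": "3"}, {"region": "EU", "sales": "x"},
          {"region": "US", "sales": "2.5"}]
gold = to_gold(to_silver(bronze))
```

On Databricks each stage would typically be a Delta table, with Spark jobs promoting data between layers.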
Newcastle upon Tyne, England, United Kingdom Hybrid / WFH Options
Somerset Bridge
sector, shaping data pipelines and platforms that power smarter decisions, better pricing, and sharper customer insights. The Data Engineer will design, build, and optimise scalable data pipelines within Azure Databricks, ensuring high-quality, reliable data is available to support pricing, underwriting, claims, and operational decision-making. This role is critical in modernising SBG’s cloud-based data infrastructure, ensuring compliance … regulations, and enabling AI-driven analytics and automation. By leveraging Azure-native services, such as Azure Data Factory (ADF) for orchestration, Delta Lake for ACID-compliant data storage, and Databricks Structured Streaming for real-time data processing, the Data Engineer will help unlock insights, enhance pricing accuracy, and drive innovation. The role also includes optimising Databricks query performance, implementing robust … security controls (RBAC, Unity Catalog), and ensuring enterprise-wide data reliability. Working closely with Data Architects, Pricing Teams, Data Analysts, and IT, this role will ensure our Azure Databricks data ecosystem is scalable, efficient, and aligned with business objectives. Additionally, the Data Engineer will contribute to cost optimisation, governance, and automation within Azure’s modern data platform. Key Responsibilities Data More ❯
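The real-time processing mentioned above (Databricks Structured Streaming) typically aggregates events into fixed event-time windows. The bucketing idea can be sketched in plain Python (the timestamp field and 60-second window are illustrative assumptions):

```python
def window_counts(events: list[dict], window_s: int = 60) -> dict[int, int]:
    """Bucket events into fixed event-time windows by their timestamp and
    count events per window, mirroring a streaming window() aggregation."""
    counts: dict[int, int] = {}
    for event in events:
        # Align each event's timestamp to the start of its window.
        start = (event["ts"] // window_s) * window_s
        counts[start] = counts.get(start, 0) + 1
    return counts

events = [{"ts": 5}, {"ts": 59}, {"ts": 61}, {"ts": 130}]
counts = window_counts(events)
```

Structured Streaming layers watermarking and incremental state on top of this, so late events update the correct window rather than being recounted.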
London, England, United Kingdom Hybrid / WFH Options
Aimpoint Digital
individuals who want to drive value, work in a fast-paced environment, and solve real business problems. You are a coder who writes efficient and optimized code leveraging key Databricks features. You are a problem-solver who can deliver simple, elegant solutions as well as cutting-edge solutions that, regardless of complexity, your clients can understand, implement, and maintain. You … Strong written and verbal communication skills required Ability to manage an individual workstream independently 3+ years of experience developing and deploying ML models in any platform (Azure, AWS, GCP, Databricks etc.) Ability to apply data science methodologies and principles to real life projects Expertise in software engineering concepts and best practices Self-starter with excellent communication skills, able to work … independently, and lead projects, initiatives, and/or people Willingness to travel. Want to stand out? Consulting Experience Databricks Machine Learning Associate or Machine Learning Professional Certification. Familiarity with traditional machine learning tools such as Python, SKLearn, XGBoost, SparkML, etc. Experience with deep learning frameworks like TensorFlow or PyTorch. Knowledge of ML model deployment options (e.g., Azure Functions, FastAPI, Kubernetes More ❯