London, England, United Kingdom Hybrid / WFH Options
Rein-Ton
… for our AI projects. Your work will directly influence our AI initiatives, contributing to innovative solutions for complex problems. Key Responsibilities: Design, develop, and maintain scalable data pipelines using Databricks and Azure services. Perform feature engineering, report preparation, and ML tasks. Work with data scientists, ML engineers, and collaborators to deliver high-quality data solutions. Collaborate with the Data … experience as a Data Engineer with a strong background in data pipelines. Proficiency in Python, Java, or Scala, and big data technologies (e.g., Hadoop, Spark, Kafka). Experience with Databricks, Azure AI Services, and cloud platforms (AWS, Google Cloud, Azure). Solid understanding of SQL and NoSQL databases. Strong problem-solving skills and the ability to work in a fast-paced …
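For orientation, here is a minimal sketch of the kind of Databricks/PySpark pipeline with a feature-engineering step that this role describes. The storage path, table, and column names are illustrative assumptions, not taken from the advertiser.

```python
# Illustrative only: a minimal PySpark pipeline of the kind this role describes.
# All paths, table names, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

# Ingest raw events from a (hypothetical) Azure Data Lake container
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/events/")

# Basic feature engineering: aggregate per customer
features = (
    raw.filter(F.col("event_type").isNotNull())
       .groupBy("customer_id")
       .agg(
           F.count("*").alias("event_count"),
           F.avg("amount").alias("avg_amount"),
           F.max("event_ts").alias("last_seen"),
       )
)

# Persist as a Delta table for downstream ML and reporting
features.write.format("delta").mode("overwrite").saveAsTable("analytics.customer_features")
```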
… architecture principles in a Data & Analytics context. Azure architecture, infrastructure services and networking. Experience with Azure analytical components and databases such as Azure Data Factory, Azure Synapse Analytics, Microsoft Fabric, Databricks, Power BI and SQL Server. Familiar with concepts and components such as VPNs, Private Endpoints, virtual networks, firewalls and multi-region networks. Aware of best practices in data modeling, data warehousing, data …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Net Talent
… engineering or a related field, with a focus on building scalable data systems and platforms. Strong expertise with modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT pipelines. Proficiency in SQL and at least …
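As a rough illustration of the streaming ETL/ELT stack listed above (Spark plus Kafka feeding a lakehouse), a minimal Structured Streaming sketch; the broker, topic, schema, and paths are invented for the example, and the Kafka connector is assumed to be available on the cluster (it is bundled on Databricks).

```python
# Illustrative sketch of a streaming ELT flow (Kafka -> Delta); all names are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.getOrCreate()

schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("ts", TimestampType()),
])

stream = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "orders")
         .load()
         # Kafka delivers bytes; decode the value and parse the JSON payload
         .select(F.from_json(F.col("value").cast("string"), schema).alias("o"))
         .select("o.*")
)

(stream.writeStream
       .format("delta")
       .option("checkpointLocation", "/tmp/checkpoints/orders")
       .outputMode("append")
       .start("/tmp/tables/orders"))
```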
London, England, United Kingdom Hybrid / WFH Options
Howden Group Holdings
… quality assurance, test automation, or data validation. Experience in testing data pipelines, ETL/ELT workflows, and big data environments. Familiarity with Azure data platforms, such as Databricks, Azure Data Factory, Synapse Analytics, or ADLS. Proficiency in SQL and scripting languages (e.g., Python, Scala) for data validation and test automation. Experience with test automation frameworks (e.g., Great …
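A hedged example of the kind of pipeline testing this role involves: simple data-quality checks written with pytest and PySpark. The columns and rules are made up, and this is not necessarily the framework the (truncated) listing goes on to name.

```python
# Minimal data-quality tests; table columns and rules are hypothetical.
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    # Local session for tests; on Databricks the cluster session would be used instead
    return SparkSession.builder.master("local[2]").appName("dq-tests").getOrCreate()

def test_no_null_policy_ids(spark):
    df = spark.createDataFrame([("P1", 100.0), ("P2", 250.0)], ["policy_id", "premium"])
    assert df.filter(df.policy_id.isNull()).count() == 0

def test_premiums_are_positive(spark):
    df = spark.createDataFrame([("P1", 100.0), ("P2", 250.0)], ["policy_id", "premium"])
    assert df.filter(df.premium <= 0).count() == 0
```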
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
… and derivatives. Technical Skills: Proficiency in Python and SQL for data manipulation and model development. Experience working with large-scale datasets and utilizing big data technologies (e.g., Azure, Databricks, Spark). Familiarity with data visualization tools such as Power BI or matplotlib. Strong knowledge of predictive modeling, machine learning techniques, and statistical analysis. Soft Skills: Analytical mindset with a passion …
London, England, United Kingdom Hybrid / WFH Options
Amber Labs
… design sessions with stakeholders, ensuring solutions align with NHS requirements and best practices. Develop end-to-end data solutions leveraging Azure services, including Azure Synapse Analytics, Azure Data Factory, Databricks, and Azure SQL. Define data models, integration patterns, and governance frameworks to ensure efficient data management, interoperability, and compliance. Drive cloud migration strategies for NHS data systems, modernizing legacy environments …
London, England, United Kingdom Hybrid / WFH Options
Novo Nordisk
Other skills we are searching for: Programming Skills: Proficiency in Python, data analytics, deep learning (Scikit-learn, Pandas, PyTorch, Jupyter, pipelines), and practical knowledge of data tools like Databricks, Ray, vector databases, Kubernetes, and workflow scheduling tools such as Apache Airflow, Dagster, and Astronomer. GPU Computing: Familiarity with GPU computing, both on-premises and on cloud platforms, and experience …
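To illustrate the workflow-scheduling tools mentioned (Airflow, Dagster, Astronomer), a toy Apache Airflow DAG in the Airflow 2.x style; the task bodies and schedule are placeholders rather than anything the advertiser runs.

```python
# Toy Airflow 2.x DAG: two placeholder tasks run in sequence on a daily schedule.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data")

def train():
    print("fit model")

with DAG(
    dag_id="example_training_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    train_task = PythonOperator(task_id="train", python_callable=train)
    extract_task >> train_task
```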
Hook, Hampshire, United Kingdom Hybrid / WFH Options
Elanco Tiergesundheit AG
… Management, Regression Testing, Data Obfuscation, etc. Knowledge of Azure Data Factory/GCP Cloud Data Fusion, Microsoft Azure Machine Learning or GCP Cloud ML Engine, Azure Data Lake, Azure Databricks or GCP Cloud Dataproc. Experience scaling an "API ecosystem" and designing and implementing "API-first" integration patterns. Experience working with authentication and authorization protocols/patterns. Experience with AI security, model …
City of London, London, United Kingdom Hybrid / WFH Options
DATAHEAD
Data Engineer – Python | Databricks | PySpark. Company: Fortune 500 Financial Services firm. Location: Hybrid - London. Type: Permanent. Salary: £90k + 20% bonus + exceptional benefits. Exclusively via DATAHEAD. Are you a detail-oriented Python Developer who thrives in complex data environments? Do you have hands-on experience with Databricks, PySpark, and cloud-native data engineering? We’re hiring a Data Engineer … first, engineering-led environment where you’ll play a key role in developing high-quality, scalable data products. What You’ll Do: Build and maintain scalable Python applications using Databricks and PySpark. Design and optimise robust data pipelines and processing frameworks. Write clean, modular, and testable code aligned with SOLID principles. Contribute to CI/CD pipelines, automated testing frameworks … Help shape the data architecture that underpins machine learning and AI models. What You’ll Bring: Proven experience developing in Python for data-intensive applications. Hands-on expertise with Databricks and PySpark. Strong grasp of cloud data platforms and modern engineering practices. Familiarity with CI/CD, version control (e.g. Git), and automated testing. Ability to work effectively in cross …
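A small sketch, not the firm's actual code, of the "clean, modular, testable" PySpark style this listing asks for: the transformation is a pure function with no hidden I/O, so it can be unit-tested without touching Databricks. The function and column names are hypothetical.

```python
# Pure, testable transformation: takes a DataFrame in, returns a new one, no side effects.
from pyspark.sql import DataFrame, SparkSession, functions as F

def add_trade_notional(trades: DataFrame) -> DataFrame:
    """Add a notional column (quantity * price); easy to unit test in isolation."""
    return trades.withColumn("notional", F.col("quantity") * F.col("price"))

if __name__ == "__main__":
    spark = SparkSession.builder.getOrCreate()
    demo = spark.createDataFrame([("T1", 100, 9.5)], ["trade_id", "quantity", "price"])
    add_trade_notional(demo).show()
```

Keeping I/O at the edges and transformations as pure functions is one common reading of "SOLID-aligned" pipeline code, since each piece can be swapped or tested independently.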
South East, England, United Kingdom Hybrid / WFH Options
Data Science Talent
Data Engineer (Databricks & Azure) - Clean Energy. Location: South East England (Hybrid - 1 day onsite per week). Salary: £60k - £70k + benefits package. 18 months. That’s all the time it took for the client’s Databricks platform to evolve into a key driver of innovative green technologies. Now, they’re looking for someone to take it even further. Imagine joining … the Role? You’ll join a highly skilled data team, part of a broader department focused on modelling and digitalisation. This team develops and maintains a cutting-edge Azure Databricks Data Lakehouse platform to support all core business functions. Your primary goal will be building and maintaining robust, secure data pipelines and models that deliver trusted datasets to internal and … external stakeholders, enabling data-driven decisions across the organisation. As a Data Engineer, you will maintain, monitor, and enhance the Databricks platform that powers the client’s data services. You’ll work on building robust pipelines using Azure Data Lake and Python while collaborating closely with data scientists, simulation engineers, and the wider business. Reporting to the Head of Data …
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Data Intellect Limited
Databricks Solutions Architect at Data Intellect. Company Description: Every question matters, every pathway, direction, thought process, and conversation. Because every challenge makes us stronger, more knowledgeable, more determined, and more valuable to our clients. We’re not big on egos and we’re not for the faint-hearted. We stand for camaraderie, collaboration … and change. Welcome to Data Intellect. Challenge Accepted. Job Description: As a Databricks Solutions Architect at Data Intellect, you will work on a variety of impactful customer technical projects, including designing and building reference architectures and productionising customer use cases within Capital Markets. What you will be doing: Define end-to-end data architectures, ensuring best practices for data … governance, access control, and cost optimisation using Unity Catalog and Delta Lake. Provide guidance on Databricks best practices for query performance tuning, storage optimisation, and efficient compute resource allocation. Guide strategic customers in adopting Databricks Lakehouse as a unified data platform for structured and unstructured market data. Enhance and grow your knowledge among subject matter experts in our learning and …
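For context on the Unity Catalog and Delta Lake duties described, a Databricks-specific sketch assuming a workspace with Unity Catalog enabled; the catalog, schema, table, and group names are invented, and OPTIMIZE/ZORDER is shown purely as an example of the performance-tuning guidance mentioned.

```python
from pyspark.sql import SparkSession

# On a Databricks cluster `spark` already exists; this keeps the sketch self-contained.
spark = SparkSession.builder.getOrCreate()

# Three-level Unity Catalog namespace: catalog.schema.table (names are hypothetical)
spark.sql("CREATE CATALOG IF NOT EXISTS markets")
spark.sql("CREATE SCHEMA IF NOT EXISTS markets.tick_data")
spark.sql("""
    CREATE TABLE IF NOT EXISTS markets.tick_data.quotes (
        symbol STRING, ts TIMESTAMP, bid DOUBLE, ask DOUBLE
    ) USING DELTA
""")

# Governance: grant read access to an (assumed) analyst group via Unity Catalog
spark.sql("GRANT SELECT ON TABLE markets.tick_data.quotes TO `analysts`")

# Performance: compact small files and co-locate rows on common filter columns
spark.sql("OPTIMIZE markets.tick_data.quotes ZORDER BY (symbol, ts)")
```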
City of London, London, United Kingdom Hybrid / WFH Options
KDR Talent Solutions
… a skilled and motivated Lead Data Engineer with Insurance experience to play a key role in the creation of a brand-new data platform within the Azure ecosystem and Databricks. This is an exciting opportunity to be at the forefront of data innovation, working within a newly formed Data & Analytics team in a long-standing London Market Insurer. … a Lead Data Engineer, you'll work closely with both technical and business stakeholders, leveraging your expertise to design, develop, and optimize a high-performance data platform built on Databricks. This platform will be built to scale, incorporating the latest advancements in data intelligence while supporting strategic business objectives. Key Responsibilities: 🔹 Build & Develop – Design and maintain a robust Databricks … solutions. 🔹 Futureproofing – Drive the evolution of the data platform, ensuring adaptability for new data sources, analytical models, and emerging technologies. What You’ll Bring: ✅ Extensive hands-on experience with Databricks and Microsoft Azure data tools (must-have: Azure Data Factory, Azure Synapse, or Azure SQL). ✅ Dimensional modelling expertise for analytics use cases. ✅ Strong ETL/ELT development skills. ✅ Python …
City of London, London, United Kingdom Hybrid / WFH Options
Noir
Data Engineer – Investment Banking – London/Hybrid (Data Engineer, SQL Data Engineer, Java, Python, Spark, Scala, SQL, Snowflake, OO programming, Databricks, Data Fabric, design patterns, SOLID principles, ETL, unit testing, NUnit, MSTest, JUnit, Microservices Architecture, Continuous Integration, Azure DevOps, AWS, Jenkins, Agile, Data Engineer, SQL Data Engineer) We have several fantastic new roles, including a Data Engineer position to … teams to join their new and growing IT team! They are looking for an experienced Lead Data Engineer with expert-level Java/Python, as well as Snowflake/Databricks, to join an exceptional core engineering team and deliver features across their Data Engineering platform. We are seeking a Data Engineer who has advanced working knowledge of Snowflake/Databricks …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
MAG (Airports Group)
… models, exploring customer behaviours, and supporting personalisation strategies - with opportunities to work on NLP projects too. You'll also take ownership of projects, support our data science tooling (including Databricks and AWS), and collaborate closely with experts in Data Engineering, BI, Analytics, and Data Governance to solve problems and create scalable solutions that make a tangible difference. What's in … and continuously develop your skills in a collaborative, hybrid working environment. About you. Role Responsibilities: Design, build, and maintain scalable machine learning pipelines using Python and PySpark. Work within Databricks to develop, schedule, and monitor data workflows, utilising Databricks Asset Bundles. Collaborate with data analysts, engineers, and other scientists to deliver clean, reliable, and well-documented datasets. Develop and maintain … skills with a problem-solving mindset. Strong analytical and communication skills, with the ability to tailor complex insights for both technical and non-technical audiences. Hands-on experience with Databricks for deploying, monitoring, and maintaining machine learning pipelines. Experience working with AWS data services and architectures. Good understanding of code versioning and CI/CD tools and practices. Familiarity with …
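As a flavour of the "machine learning pipelines using Python and PySpark" responsibility, a minimal pyspark.ml pipeline; the feature columns and the binary label are invented for the example, and scheduling via Databricks Asset Bundles is not shown.

```python
# Tiny pyspark.ml pipeline: assemble two features and fit a logistic regression.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.getOrCreate()

# Hypothetical customer data: visit count, spend, and a binary "converted" label
df = spark.createDataFrame(
    [(3, 120.0, 1.0), (1, 15.0, 0.0), (7, 340.0, 1.0), (0, 0.0, 0.0)],
    ["visits", "spend", "converted"],
)

assembler = VectorAssembler(inputCols=["visits", "spend"], outputCol="features")
lr = LogisticRegression(featuresCol="features", labelCol="converted")

model = Pipeline(stages=[assembler, lr]).fit(df)
model.transform(df).select("visits", "spend", "prediction").show()
```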
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
ASDA
… for someone who's technically capable but commercially curious - who wants to see their work create clear, measurable value. You'll be working in a hybrid cloud environment (Azure, Databricks), applying your skills to real challenges in areas like customer behaviour, operations, and digital journeys. You'll learn from experienced colleagues, develop your craft, and help embed analytics into everyday … science solutions that drive outcomes - whether it's increasing efficiency, reducing cost, or improving customer experience. Build & Apply Models: Support the development of predictive and optimisation models using Python, Databricks, and Azure. Help ensure outputs are robust, interpretable, and actionable. Enable Data-Driven Decisions: Develop dashboards and visual narratives using Power BI that translate data into insight business users can … technical audiences. Self-starter who thrives in fast-moving environments with a strong sense of ownership. A numerate degree (e.g. Maths, Stats, Engineering, Computer Science). Desirable: Experience using Databricks or working in a cloud-based environment like Azure. Exposure to MLOps, version control, or productionising models. Experience working with Jira and Confluence in an Agile environment is advantageous. Streamlit …
Manchester, England, United Kingdom Hybrid / WFH Options
MAG (Airports Group)
… and this role will have a big say in what we build next. You’ll be responsible for designing and building robust, scalable data pipelines using PySpark, SQL and Databricks — enabling our analytics, BI and data science colleagues to unlock real value across the business. This is a brilliant opportunity for someone who’s passionate about data quality, modern engineering … in this role: 2–5 years in data engineering or a related field. Strong PySpark and advanced SQL skills. Practical experience building and maintaining ETL/ELT pipelines in Databricks. Familiarity with CI/CD pipelines and version control practices. Nice to have: Experience using Databricks Asset Bundles (DAB). Working knowledge of GCP and/or Azure in multi-cloud …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Snap Analytics
… a thought leader by authoring blogs, presenting at webinars, and engaging in external speaking opportunities. Technical Delivery Excellence: You'll design and optimise cloud-based data architectures (e.g. Snowflake, Databricks, Redshift, Google BigQuery, Azure Synapse) to support advanced analytics. You'll build and automate scalable, secure, and high-performance data pipelines handling diverse data sources. You'll work closely with … designing data architectures on platforms like AWS, Azure, or GCP. Technical Skills: Extensive experience with ETL/ELT tools (e.g. Matillion, Informatica, Talend) and cloud data platforms (e.g., Snowflake, Databricks, BigQuery). Expertise in data modelling techniques (e.g., star schema, snowflake schema) and optimising models for analytics and reporting. Familiarity with version control, CI/CD pipelines, and containerisation tools …
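A brief sketch of the dimensional-modelling point above: a hypothetical date dimension joined to a fact table in a star-schema query, expressed in Spark SQL simply because it runs locally; any of the named warehouses would express the same join.

```python
# Star-schema join: a dimension table keyed by date_key joined to a fact table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.createDataFrame(
    [(1, "2024-01-01"), (2, "2024-01-02")], ["date_key", "calendar_date"]
).createOrReplaceTempView("dim_date")

spark.createDataFrame(
    [(1, "UK", 120.0), (2, "UK", 80.0)], ["date_key", "market", "revenue"]
).createOrReplaceTempView("fact_sales")

spark.sql("""
    SELECT d.calendar_date, f.market, SUM(f.revenue) AS revenue
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY d.calendar_date, f.market
    ORDER BY d.calendar_date
""").show()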
Haywards Heath, Sussex, United Kingdom Hybrid / WFH Options
First Central Services
… and version management of large numbers of data science models (Azure DevOps). You'll support the implementation of Machine Learning Ops on cloud (Azure and Azure ML; experience with Databricks is advantageous). You'll protect against model degradation and operational performance issues through the development and continual automated monitoring of model execution and model quality. You'll manage automatic model … and integration. Basic understanding of networking concepts within Azure. Familiarity with Docker and Kubernetes is advantageous. Experience within the financial/insurance services industry is advantageous. Experience with Azure ML and Databricks is advantageous. Skills & Qualifications: Strong understanding of Microsoft Azure (Azure ML, Azure Stream Analytics, Cognitive Services, Event Hubs, Synapse, and Data Factory). Fluency in common data science coding capabilities such …
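A hedged sketch of the "continual automated monitoring of model execution and model quality" described: recompute a live metric and compare it with a recorded baseline, flagging degradation. The metric, thresholds, and use of scikit-learn are assumptions for illustration, not the advertiser's actual tooling.

```python
# Toy model-quality check: compare live AUC against a stored baseline and flag drift.
from sklearn.metrics import roc_auc_score

BASELINE_AUC = 0.82        # AUC recorded at deployment time (hypothetical)
MAX_DEGRADATION = 0.05     # alert if live AUC drops by more than this

def check_model_quality(y_true, y_scores) -> bool:
    """Return True if the model still meets the quality bar."""
    live_auc = roc_auc_score(y_true, y_scores)
    degraded = live_auc < BASELINE_AUC - MAX_DEGRADATION
    if degraded:
        print(f"ALERT: AUC fell to {live_auc:.3f}; trigger retraining or review")
    return not degraded

# Example run with dummy labels and scores
print(check_model_quality([1, 0, 1, 1, 0], [0.9, 0.2, 0.7, 0.8, 0.4]))
```

In practice a check like this would run on a schedule (e.g. an Azure DevOps pipeline or a scheduled job) against recent scored data, with the result pushed to monitoring or alerting.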