for you. The successful Senior AWS Data Engineer candidate will have the chance to make a significant impact in designing the platform and working on cutting-edge technologies like Databricks and Snowflake in the heart of a leading global Investment Bank’s front office. This is a rare greenfield role that offers the opportunity to solve the ultimate data pipeline challenge … and gaining an overview of many different sectors. What We’re Looking For 10+ years’ hands-on experience in AWS data engineering technologies, including Glue, PySpark, Athena, Iceberg, Databricks, Lake Formation, and other standard data engineering tools. Strong experience engineering in a front-office/capital markets environment. Previous experience in implementing best practices for data engineering, including data More ❯
of DevOps/DevSecOps and/or platform engineering and infrastructure-as-code (e.g., Terraform or Ansible) Experience with the Elastic Stack (Search, Kibana), Prometheus, and Grafana Familiarity with Databricks, MLflow/Model Registry, feature stores, and data governance Experience with containerized microservices architectures and API design and their cloud deployment (containers, Kubernetes, Helm) Knowledge of security, networking, virtualization, high … and handing them over to operations Technological vision: • Working with new technologies and services alongside established industry standards Technologies: AI API AWS Ansible Architect Azure CI/CD Cloud Databricks DevSecOps DevOps Grafana Helm Support Kibana Kubernetes Machine Learning Prometheus Python Security Terraform Web microservices More: AI Architect Germany - Remote (Within Location) • Full-Time • Engineering Join Nordcloud and be part More ❯
Reigate, England, United Kingdom Hybrid/Remote Options
esure Group
data products that enable self-service analytics, advanced modelling, and AI-driven decision-making across our insurance business. What you’ll do: Design and manage scalable cloud data platforms (Databricks on AWS) across development, staging, and production environments, ensuring reliable performance and cost efficiency. Integrate and model data from diverse sources – including warehouses, APIs, marketing platforms, and operational systems – using … and Delta Live Tables Strong background in building high-performance, scalable data models that support self-service BI and regulatory reporting requirements Direct exposure to cloud-native data infrastructures (Databricks, Snowflake) especially in AWS environments is a plus Experience in building and maintaining batch and streaming data pipelines using Kafka, Airflow, or Spark Familiarity with governance frameworks, access controls (RBAC More ❯
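The pipeline stack named in the listing above (Kafka, Airflow, Spark) is concrete enough to illustrate. Below is a minimal sketch of a Spark Structured Streaming job that reads events from Kafka and lands them in a Delta table; it assumes a Databricks/Spark 3.x runtime with the Kafka connector available, and the broker address, topic, checkpoint path, and target table names are hypothetical.

```python
# Minimal streaming-ingestion sketch; requires the Spark Kafka connector and Delta Lake
# (both bundled in the Databricks runtime). All names below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a `spark` session already exists; getOrCreate() is harmless there.
spark = SparkSession.builder.appName("policy-events-stream").getOrCreate()

# Read raw events from Kafka (broker address and topic are placeholders).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "policy-events")
    .load()
)

# Kafka delivers key/value as binary; cast the payload and add ingest metadata.
parsed = events.select(
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("event_ts"),
).withColumn("ingested_at", F.current_timestamp())

# Write to a Delta table with checkpointing so the stream can recover on restart.
query = (
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/policy_events")
    .outputMode("append")
    .toTable("bronze.policy_events")  # hypothetical target table
)
query.awaitTermination()
```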
City Of Bristol, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
Data Engineer Salary: Up to £75,000 I am working with a forward-thinking organisation that is modernising its data platform to support scalable analytics and business intelligence across the Group. With a strong focus on Microsoft technologies and cloud More ❯
Reading, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
Data Engineer Salary: Up to £75,000 I am working with a forward-thinking organisation that is modernising its data platform to support scalable analytics and business intelligence across the Group. With a strong focus on Microsoft technologies and cloud More ❯
Oxford, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
Data Engineer Salary: Up to £75,000 I am working with a forward-thinking organisation that is modernising its data platform to support scalable analytics and business intelligence across the Group. With a strong focus on Microsoft technologies and cloud More ❯
Leicester, Leicestershire, England, United Kingdom
Robert Walters
Data Engineer, Leicester, £65,000 to £75,000, Permanent. I am currently looking for a Data Engineer to join a forward-thinking organisation, where you will play a pivotal role in building a modern, enterprise-scale data platform. Data Engineer - What will More ❯
Job Details Domain: Finance and Banking Job Description: We are seeking a talented ETL Developer for one of our clients. We are seeking a skilled Python, SQL, and Databricks developer with 4 to 8 years of experience to join our dynamic team. The successful candidate will play a crucial role in developing, maintaining, and optimizing data-driven applications and solutions. This position requires a deep understanding of Python programming, SQL databases, and the Databricks platform. Banking experience preferred. Responsibilities: Software Development: Design, develop, test, and deploy high-performance and scalable data solutions using Python. Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications. Implement efficient and maintainable code using best practices and coding standards. SQL database management: Utilize expertise in SQL to design, optimize, and maintain relational databases. Write complex SQL queries for data retrieval, manipulation, and analysis. Perform database performance tuning and optimization. Databricks platform: Work with the Databricks platform for big data processing and analytics; develop and maintain ETL processes using Databricks notebooks. We are an equal opportunity employer. All aspects of employment More ❯
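Since the ETL Developer role above centres on Python, SQL, and Databricks notebooks, here is an illustrative sketch of a notebook-style ETL step that mixes Spark SQL with PySpark. The table names, columns, and business logic are hypothetical, not taken from the advert.

```python
# Illustrative notebook-style ETL step: extract with Spark SQL, transform with PySpark,
# load into a curated Delta table. All table and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("transactions-etl").getOrCreate()

# Extract: pull the last week of source rows with Spark SQL.
raw = spark.sql("""
    SELECT account_id, txn_amount, txn_currency, txn_date
    FROM raw.transactions
    WHERE txn_date >= date_sub(current_date(), 7)
""")

# Transform: standardise currency codes and aggregate per account and day.
daily = (
    raw.withColumn("txn_currency", F.upper(F.col("txn_currency")))
    .groupBy("account_id", "txn_date", "txn_currency")
    .agg(
        F.sum("txn_amount").alias("total_amount"),
        F.count("*").alias("txn_count"),
    )
)

# Load: write the aggregates to a curated Delta table.
daily.write.format("delta").mode("overwrite").saveAsTable("curated.daily_account_totals")
```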
long term project that is looking for a Data Engineering Lead to head up a team of three, driving the design and optimisation of a cutting-edge Azure/Databricks data platform. This hybrid role is perfect for an innovative and adaptable professional, ready to make an impact by turning data into real business value. You would play your part in improving the company's performance and scalability. Skills and responsibilities needed: - Databricks, SQL and Azure Data Factory. - Expert in MS Business Intelligence tools and platforms. - Proven experience in data modelling and database management. - Ability to lead and mentor a team of data engineers. - Familiarity with data warehousing and ETL processes. Looking for someone to start as soon as possible for More ❯
Glasgow, Scotland, United Kingdom Hybrid/Remote Options
Square One Resources
as Airflow and AWS Step Functions. Analyse current processes and propose end-to-end technical improvements. Engage with stakeholders to translate business needs into technical designs. Support migrations into Databricks and modern data platforms. Work closely with data scientists to build and deploy machine learning models. Required Technical Skills Strong data engineering background with hands-on AWS experience across: S3 … KMS AWS CloudFormation (mandatory) UI development experience (mandatory) Strong SQL, Python, and PySpark Experience with GitLab and unit testing Knowledge of modern data engineering patterns and best practices Desirable (Databricks Track) Apache Spark Databricks (Delta Lake, Unity Catalog, MLflow) Experience with Databricks migration or development AI/ML understanding (nice to have More ❯
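As a companion to the orchestration tooling the listing above mentions (Airflow alongside AWS Step Functions), the sketch below shows a minimal Apache Airflow DAG wiring an extract task ahead of a Spark transform. It assumes Airflow 2.4+ (for the `schedule` argument); the task names and callables are placeholders, not anything from the advert.

```python
# Minimal Airflow DAG sketch: extract from S3, then run a Spark transform.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_from_s3(**context):
    # Placeholder: pull source files from S3 (e.g. via boto3) into a staging area.
    print("extracting raw files")


def run_spark_transform(**context):
    # Placeholder: submit a PySpark job or a Databricks run for the transform step.
    print("running transform")


with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_from_s3)
    transform = PythonOperator(task_id="transform", python_callable=run_spark_transform)

    extract >> transform  # run the transform only after extraction succeeds
```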
City of London, London, United Kingdom Hybrid/Remote Options
Primus
Lead Data Engineer (Databricks Specialist) Contract: 3–6 months Day Rate: £600–£700 (Outside IR35) Start Date: January 2026 Location: UK (Remote/Hybrid options) We are representing a Databricks delivery partner that helps organisations transform the way they use data, building scalable, modern data platforms and accelerating AI adoption through Databricks. We’re looking for a Lead Data Engineer … who is exceptional with Databricks and ready to play a key role in designing, building, and delivering cutting-edge data solutions for enterprise clients. This is a hands-on role for a seasoned engineer who can operate across architecture, delivery, and consulting. Key responsibilities include: Design and build end-to-end data solutions on Databricks, using Spark, Python, SQL, and … mentoring others, reviewing code, and driving best practices Collaborate with data scientists, architects, and business teams to deliver production-grade outcomes Essential skills needed: Deep hands-on experience with Databricks (SQL, PySpark, Delta Lake, Unity Catalog, Workflows) Strong proficiency in Python and Spark Solid understanding of CI/CD pipelines, DevOps, and Infrastructure as Code Proven track record designing and More ❯
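To ground the Databricks skill set the role above lists (SQL, PySpark, Delta Lake, Unity Catalog), here is an illustrative Delta Lake upsert written with the PySpark Delta API. It assumes the Delta Lake libraries bundled in the Databricks runtime; the three-level table name, staging table, and join key are assumptions made for the example.

```python
# Illustrative Delta Lake MERGE (upsert) in PySpark. All names are placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("customer-upsert").getOrCreate()

# Incoming batch of changed records (hypothetical staging table).
updates = spark.table("staging.customer_updates")

# Target Delta table, addressed via a Unity-Catalog-style three-level name.
target = DeltaTable.forName(spark, "main.crm.customers")

# Upsert: update matching customer rows, insert the rest.
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```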
Senior Machine Learning/AI Engineer Position Overview: We are seeking a Senior Machine Learning/AI Engineer with expertise in Databricks, MLOps/LLMOps, and cloud-native architecture. The candidate must have recent experience implementing data science solutions in Databricks and be comfortable deploying web applications via containerized workflows (Docker, Kubernetes). This role involves building scalable AI/ML systems, deploying LLMs, and operationalizing models in production. Key Responsibilities: Design, develop, and deploy ML, Deep Learning, and LLM solutions. Implement scalable ML and data pipelines in Databricks (PySpark, Delta Lake, MLflow). Build automated MLOps pipelines with model tracking, CI/CD, and registry. Deploy and operationalize LLMs, including fine-tuning, prompt optimization, and monitoring. Architect secure … Jenkins). Mentor engineers, enforce best practices, and lead design/architecture reviews. Required Skills & Experience: 5+ years in ML/AI solution development. Recent hands-on experience with Databricks, PySpark, Delta Lake, MLflow. Experience with LLMs (Hugging Face, LangChain, Azure OpenAI). Strong MLOps, CI/CD, and model monitoring experience. Proficiency in Python, PyTorch/TensorFlow, FastAPI More ❯
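The MLOps responsibilities described above (model tracking and a model registry) can be illustrated with a short MLflow sketch. It assumes MLflow 2.x with scikit-learn installed; the training data and the registered model name are synthetic placeholders rather than anything from the advert.

```python
# Minimal MLflow sketch: track a training run and register the resulting model.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data standing in for a real feature set.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)

    # Log parameters, metrics, and the fitted model to the tracking server.
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="demo_classifier",  # creates or bumps a registry version
    )
```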
City of London, London, United Kingdom Hybrid/Remote Options
Oliver Bernard
Senior Python Developer - Python, SQL, Databricks, AWS/Azure OB have partnered with a Global Insurance Leader, who are expanding their Python Engineering team with a Senior Python Developer skilled in Python, SQL, Databricks and Cloud Infrastructure. In this role, you will play a key part in designing and building new capabilities, such as high-performance APIs and high-frequency … event streams, as well as making improvements to their current platform. Senior Python Developer - Python, SQL, Databricks, AWS/Azure Key skills and experience: Python FastAPI SQL Databricks AWS/Azure APIs Hybrid working with 3 days a week required in a Central London office Pays up to £90k + benefits To be considered, you must be UK based … Senior Python Developer - Python, SQL, Databricks, AWS/Azure More ❯
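As a small illustration of the FastAPI work the role above describes, here is a minimal sketch of a read endpoint with an in-memory stand-in for the real SQL/Databricks data layer; the resource names and schema are hypothetical. Run it locally with `uvicorn policy_api:app --reload`, assuming the file is saved as policy_api.py.

```python
# Minimal FastAPI sketch: one typed read endpoint over a stubbed data layer.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="policy-api")


class Policy(BaseModel):
    policy_id: str
    holder: str
    premium: float


# In-memory stand-in for a real SQL/Databricks-backed query layer.
_POLICIES = {"P001": Policy(policy_id="P001", holder="A. Example", premium=420.0)}


@app.get("/policies/{policy_id}", response_model=Policy)
async def get_policy(policy_id: str) -> Policy:
    policy = _POLICIES.get(policy_id)
    if policy is None:
        raise HTTPException(status_code=404, detail="policy not found")
    return policy
```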
Role Title: Sr. Databricks Engineer Location: Glasgow Duration: 31/12/2026 Days on site: 2-3 MUST BE PAYE THROUGH UMBRELLA Role Description: We are currently migrating our data pipelines from AWS to Databricks, and are seeking a Senior Databricks Engineer to lead and contribute to this transformation. This is a hands-on engineering role focused on designing … building, and optimizing scalable data solutions using the Databricks platform. Key Responsibilities: • Lead the migration of existing AWS-based data pipelines to Databricks. • Design and implement scalable data engineering solutions using Apache Spark on Databricks. • Collaborate with cross-functional teams to understand data requirements and translate them into efficient pipelines. • Optimize performance and cost-efficiency of Databricks workloads. • Develop and … maintain CI/CD workflows for Databricks using GitLab or similar tools. • Ensure data quality and reliability through robust unit testing and validation frameworks. • Implement best practices for data governance, security, and access control within Databricks. • Provide technical mentorship and guidance to junior engineers. Must-Have Skills: • Strong hands-on experience with Databricks and Apache Spark (preferably PySpark). • Proven More ❯
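The migration role above calls out unit testing for data reliability; the sketch below shows one common pattern: a pytest fixture providing a local SparkSession used to test a small PySpark transformation. The transform and column names are invented for illustration and are not part of the advertised codebase.

```python
# Unit-test sketch for a PySpark transformation using pytest and a local SparkSession.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def add_total_column(df):
    """Example transform under test: total = quantity * unit_price."""
    return df.withColumn("total", F.col("quantity") * F.col("unit_price"))


@pytest.fixture(scope="module")
def spark():
    # Small local session so the tests run without a cluster.
    session = SparkSession.builder.master("local[1]").appName("tests").getOrCreate()
    yield session
    session.stop()


def test_add_total_column(spark):
    df = spark.createDataFrame([(2, 5.0), (3, 1.5)], ["quantity", "unit_price"])
    result = add_total_column(df).collect()
    assert [row["total"] for row in result] == [10.0, 4.5]
```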
Senior Data Engineer, Databricks, £60,000 - £70,000 + benefits. Strong performant SQL and Databricks required. Home based with one day a month at the office in Nottingham. Strong commercial knowledge of Databricks and performant SQL is required for this role. Should also have knowledge of testing, agile environments and ideally finance related projects click apply for full job details More ❯
SC Cleared) Location: London, UK Job Type: Contract Essential Skills & Experience: · 10+ years of experience in data engineering, with at least 3+ years of hands-on experience with Azure Databricks. · Good proficiency in Python and Spark (PySpark) or Scala. · Deep understanding of data warehousing principles, data modelling techniques, and data integration patterns. · Extensive experience with Azure data services, including … Azure Blob Storage, and Azure SQL Database. · Experience working with large datasets and complex data pipelines. · Experience with data architecture design and data pipeline optimization. · Proven expertise with Databricks, including hands-on implementation experience and certifications. · Experience with SQL and NoSQL databases. · Experience with data quality and data governance processes. · Experience with version control systems (e.g., Git). · Experience More ❯
our data flows are scalable, secure, and high-performing. What You’ll Do Architect, build, and enhance integrations across ERP, CRM, POS, e-commerce, and analytics using Microsoft Azure, Databricks, and related tools Design, develop, and maintain secure ETL/ELT pipelines and real-time or near real-time dataflows Act as the subject matter expert on integration best practices … practices What You’ll Bring 10+ years of experience in enterprise integration, data architecture, or data engineering, ideally within retail, beauty, wellness, or consumer goods Proven expertise in Azure, Databricks, ETL, and API or middleware integrations Strong knowledge of SAP, CRM, Shopify POS, and e-commerce platforms Deep understanding of data governance, security, and cloud or SaaS environments Excellent communication … with the ability to influence across teams Strong documentation, solution review, and architectural advisory experience Degree in Computer Science or equivalent certifications Tools and Technologies Microsoft Azure Integration Services Databricks SAP CRM platforms Shopify POS E-commerce platforms API Gateways and management tools ETL and ELT tools Git, CI/CD, and DevOps toolchain What Success Looks Like Reliable, secure More ❯
AI transformation. This role will initially focus on managing Azure-related tickets and incidents, and will gradually evolve into more DevOps automation and AI/GenAI engineering work (Azure ML, Databricks, Azure AI Foundry). Key Responsibilities Focus on Azure services, networking, App Services, Key Vault, AKS, resource provisioning, and pipelines. Maintain and improve CI/CD pipelines (Azure DevOps … infrastructure provisioning where possible. Collaborate with cloud, data, and AI teams to stabilise and enhance Azure operations. Support the integration and deployment of AI/GenAI workloads (Azure ML, Databricks, Azure AI Foundry, Azure OpenAI). Ensure environment reliability, monitoring, and adherence to cloud governance standards. Skills & Experience Prior experience in financial services or large enterprise environments. Solid knowledge of More ❯
Azure-based analytics tools to generate meaningful insights while ensuring robust governance and data management practices. Key Responsibilities: Design and develop scalable data models and pipelines using Azure Analytics, Databricks, and Synapse . Apply AI frameworks across the Azure ecosystem to extract insights and drive business value. Develop and optimise Power BI dashboards and reporting solutions to provide actionable insights. … experience as a Data Scientist with a background in Data Engineering . Strong database development and management skills (IB/NCL, DBaaS). Hands-on expertise with Azure Analytics, Databricks, Synapse, and Power BI . Demonstrated experience in AI/ML frameworks applied within the Azure ecosystem. Strong understanding of data automation, ingestion, and governance . Experience working with or More ❯
ll be at the heart of our data transformation journey. 🎯 What We’re Looking For: Proven experience in data engineering and solution design Strong skills in Azure Data Factory, Databricks, Unity Catalogue, and SQL Server Practical knowledge of data modelling and ETL development Hands-on coding and development with a focus on quality Excellent communication and stakeholder engagement skills A … data models and ETL pipelines Develop and maintain applications and automation tools Champion best practices in data quality, lineage, and governance Act as SME in Azure Data Factory, Azure Databricks, Unity Catalogue, and SQL Server Support agile delivery, code control, and QA standards Monitor and improve existing data sets and processes Enable colleagues through testing, training, and knowledge sharing 🎁 What More ❯