to prototype development and product improvement. End-to-end implementation of data science pipelines. Experience and Qualifications Knowledge/Skills: Expertise in tools such as SQL, Azure Data Factory, Azure Databricks, Azure Synapse, R, Python and Power BI. Exceptional ability to communicate strategic and technical concepts to a diverse audience and translate business needs into technical requirements. Experience of liaising effectively
practices. Collaborate with cross-functional teams to translate business needs into technical solutions. Core Skills Cloud & Platforms: Azure, AWS, SAP Data Engineering: ELT, Data Modelling, Integration, Processing Tech Stack: Databricks (PySpark, Unity Catalog, DLT, Streaming), ADF, SQL, Python, Qlik DevOps: GitHub Actions, Azure DevOps, CI/CD pipelines Please click here to find out more about our Key Information Documents.
for front office reporting. What you'll need to succeed Strong business intelligence/data engineering experience with strong Power BI expertise. Data engineering experience building production pipelines with Databricks. Business experience working around the Front Office/Risk is a must! Understanding of regulatory reporting processes. Financial product knowledge, e.g. equities, fixed income, derivatives. Experience on Azure cloud platforms.
Employment Type: Contract
Rate: £750 - £800 per day (Inside IR35)
Data Modelling Strong understanding of data integration, data quality, and data governance. Extensive experience in working with big data technology tools and platforms such as Microsoft Azure Data Factory, Databricks, Unity Catalog, PySpark, Power BI, Synapse, SQL Server, Cosmos DB, Python. Understanding and application of cloud architectures and microservices in big data solutions. Understanding of the commodities industry. Rate/Duration
wide data strategy aligned with business objectives, focusing on governance, scalability, and innovation Lead the evolution of a modern data platform using cloud-native technologies (e.g. Snowflake, Azure, GCP, Databricks) Manage and mentor cross-functional teams across data engineering, analytics, governance, and data science Promote a data-as-a-product culture, enabling self-serve capabilities and empowering business users Introduce
ROLE ORGANISATION BASED IN BRISTOL 3-MONTH CONTRACT IMMEDIATE START SKILLS Extensive experience in Azure Data Factory, SQL and Python. Strong understanding of ETL processes. Prior working experience with Databricks and Unity Catalog. RESPONSIBILITIES Taking Proof of Concept into production. Fixing issues that may arise from the Proof of Concept production process. The candidate must be a self-starter
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Hays
Employment Type: Contract, Work From Home
Rate: £550 - £600 per day (Inside IR35)
with Celery for distributed task execution and background job processing, particularly in data pipeline or microservices environments. Hands-on experience with Azure cloud services, especially Azure Data Factory, Azure Databricks, Azure Storage, and Azure Synapse. Proficiency in designing and deploying CI/CD pipelines using Azure DevOps (YAML pipelines, release management, artifact handling).
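The Celery pattern the listing describes decouples job producers from background workers via a queue. A minimal stand-in sketch, using only the standard library (Celery itself needs a message broker such as Redis; the queue and worker names here are illustrative, not the Celery API):

```python
import queue
import threading

def worker(tasks: "queue.Queue", results: list) -> None:
    """Consume (callable, argument) tasks until a None sentinel arrives."""
    while True:
        task = tasks.get()
        if task is None:
            break  # shutdown signal
        fn, arg = task
        results.append(fn(arg))
        tasks.task_done()

tasks: "queue.Queue" = queue.Queue()
results: list = []
t = threading.Thread(target=worker, args=(tasks, results))
t.start()

# Enqueue "jobs" as a task queue would; the worker runs them in the background.
for n in (1, 2, 3):
    tasks.put((lambda x: x * x, n))
tasks.put(None)  # sentinel: tell the worker to stop
t.join()
# results == [1, 4, 9]
```

In Celery the queue would live in the broker and workers in separate processes, but the producer/consumer shape is the same.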
experience who have the following skills: Good data analytics experience Strong hands-on experience with SQL Knowledge of ETL processes Data quality audits Data processing and data manipulation experience Databricks experience (Azure or AWS) is a plus but not essential Medallion Architecture knowledge is a plus but not essential Experience working on government/public sector clients is a massive
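The Medallion Architecture mentioned above layers data as bronze (raw), silver (cleaned and typed), and gold (business aggregates). A hedged sketch using plain Python lists in place of Delta tables; the field names are illustrative assumptions, not a real schema:

```python
# Bronze: raw ingested records, warts and all.
bronze = [
    {"id": "1", "amount": "10.5", "region": "uk"},
    {"id": "2", "amount": "bad", "region": "uk"},  # invalid amount
    {"id": "3", "amount": "4.5", "region": "de"},
]

def to_silver(rows):
    """Clean and type-cast; drop records that fail validation."""
    out = []
    for r in rows:
        try:
            out.append({"id": int(r["id"]),
                        "amount": float(r["amount"]),
                        "region": r["region"].upper()})
        except ValueError:
            continue  # reject/quarantine invalid rows
    return out

def to_gold(rows):
    """Aggregate to a business-facing summary: total amount per region."""
    totals: dict = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)   # 2 valid rows survive
gold = to_gold(silver)       # {"UK": 10.5, "DE": 4.5}
```

In Databricks each layer would be a Delta table and the functions would be Spark transformations, but the layering principle is the same.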
focused on building scalable data models, automated pipelines, and enabling self-serve analytics across the organisation. Key Responsibilities Build and maintain scalable data pipelines using Azure Data Factory and Databricks Develop transformation workflows in dbt, SQL, and Python Design dimensional models and semantic layers to support analytics use cases Implement automated data quality checks, monitoring, and alerting systems Create reusable … We're Looking For 5+ years' experience in analytics engineering or closely related roles Strong proficiency in SQL, Python, and dbt (or similar transformation tools) Hands-on experience with Azure Databricks, ADF, and cloud-based data platforms Solid understanding of dimensional modelling, lakehouse/warehouse design, and scalable architecture Familiarity with Git, CI/CD, and modern software engineering workflows Confident
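The automated data quality checks described above typically assert properties like non-nullness and uniqueness on key columns. A minimal sketch in plain Python (in Databricks or dbt these would usually be DLT expectations or dbt tests instead; the column names are made up for illustration):

```python
def check_not_null(rows, column):
    """True if every row has a non-null value in `column`."""
    return all(r.get(column) is not None for r in rows)

def check_unique(rows, column):
    """True if no two rows share a value in `column`."""
    vals = [r[column] for r in rows]
    return len(vals) == len(set(vals))

rows = [
    {"order_id": 1, "total": 9.99},
    {"order_id": 2, "total": None},  # quality issue: null total
]

# A check report a monitoring/alerting job could act on.
report = {
    "order_id_not_null": check_not_null(rows, "order_id"),  # True
    "order_id_unique": check_unique(rows, "order_id"),      # True
    "total_not_null": check_not_null(rows, "total"),        # False
}
```

An alerting system would then page or fail the pipeline on any `False` entry in the report.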
in Office) I am currently on the lookout for a Contract AWS Data Engineer with a scale-up that has a number of greenfield projects coming up. Tech Stack: AWS Databricks Lakehouse PySpark SQL ClickHouse/MySQL/DynamoDB If you are interested in this position, please click apply with an updated copy of your CV and I will call you
London, South East, England, United Kingdom Hybrid / WFH Options
Involved Solutions
London. Key Responsibilities - Azure Data Engineer: Design, build and maintain scalable and secure data pipelines on the Azure platform. Develop and deploy data ingestion processes using Azure Data Factory, Databricks (PySpark), and Azure Synapse Analytics. Optimise ETL/ELT processes to improve performance, reliability and efficiency. Integrate multiple data sources including Azure Data Lake (Gen2), SQL-based systems and APIs. … incl. GDPR and ISO standards). Required Skills & Experience - Azure Data Engineer: Proven commercial experience as a Data Engineer delivering enterprise-scale solutions in Azure Azure Data Factory Azure Databricks (PySpark) Azure Synapse Analytics Azure Data Lake Storage (Gen2) SQL & Python Understanding of CI/CD in a data environment, ideally with tools like Azure DevOps. Experience working within consultancy
Ahead of Trends: Keep abreast of industry trends and emerging technologies to build future-focused data architecture roadmaps. Desirable Skills: Experience with enterprise data modelling. Hands-on experience with Databricks, Snowflake, or similar platforms. Familiarity with Collibra. Why Join Us? Innovative Environment: Be part of a forward-thinking organisation that is modernising its data estate and enhancing data services.
Stratford-Upon-Avon, Warwickshire, West Midlands, United Kingdom
Zensar Technologies
What's this role about? As a Senior Data Consultant, you will be responsible for providing advisory and thought leadership on the migration to Azure/Databricks Data Cloud or implementing data solutions using Data Cloud services, including integration with existing data and analytics platforms and tools. You will contribute to pre-sales and design, implement scalable data architectures, and … Studio, Collibra, Precisely, OneTrust Experience in MDM/DQM/Data Governance technologies like Collibra, Ataccama, Alation, Reltio Experience with modern data, analytics and business intelligence tools - Promethium, Databricks, Spark, Fabric, SQL, Python, and Power BI is a must Business Analysis and requirement gathering: End-to-end experience of gathering and documenting requirements to deliver Solution Designs that meet business
Manchester, North West, United Kingdom Hybrid / WFH Options
IO Associates
Data Engineer - Outside IR35 - £500 daily rate - Manchester - 6 Month Role Overview: We're seeking an experienced Contract Data Engineer with a strong background in Databricks and cloud data platforms to support a high-impact data transformation programme. This hybrid role requires 2 days per week onsite in our Manchester office (non-negotiable), with the remainder remote. You'll join … paced, delivery-focused data team responsible for building and optimising scalable, production-grade data pipelines and infrastructure. Key Responsibilities: Design and implement robust, scalable ETL/ELT pipelines using Databricks and Apache Spark Ingest, transform, and manage large volumes of data from diverse sources Collaborate with analysts, data scientists, and business stakeholders to deliver clean, accessible datasets Ensure high performance … best practices Work with cloud-native tools and services (preferably Azure) Required Skills & Experience: Proven experience as a Data Engineer on cloud-based projects Strong hands-on skills with Databricks, Apache Spark, and Python or Scala Proficient in SQL and working with large-scale data environments Experience with Delta Lake, Azure Data Lake, or similar technologies Familiarity with version control
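A recurring building block in the ingestion work described above is the incremental (watermark-based) load: copy only rows newer than the last seen timestamp. A hedged sketch with plain lists standing in for the external source and Delta target; names like `updated_at` are illustrative assumptions:

```python
source = [
    {"id": 1, "updated_at": "2024-01-01"},
    {"id": 2, "updated_at": "2024-02-01"},
    {"id": 3, "updated_at": "2024-03-01"},
]

def incremental_load(source, target, watermark):
    """Append rows newer than `watermark`; return the advanced watermark.

    ISO-8601 date strings compare correctly lexicographically, so plain
    string comparison is safe here.
    """
    new_rows = [r for r in source if r["updated_at"] > watermark]
    target.extend(new_rows)
    return max((r["updated_at"] for r in new_rows), default=watermark)

target: list = []
wm = incremental_load(source, target, watermark="2024-01-15")
# target now holds ids 2 and 3; wm == "2024-03-01"
```

The persisted watermark is what makes the pipeline restartable: re-running with the new watermark loads nothing until fresh rows appear.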
Databricks Trainer – Contract Vacancy Rate: £300 - £350 p/d IR35 Status: Outside Working Location: Remote My client has a requirement for a Databricks Trainer on a contract basis. The client is a training provider and consultancy. You will be supporting on a 4-week Databricks programme which is due to start either at the end of August or in the middle of September. All the training materials are already created. Trainers must have commercial experience working with Databricks or experience training Databricks. If you're suitably skilled and available to deliver, please apply now. As an industry-leading, nationwide Marketing, Digital, Analytics, IT and Design recruitment agency, we are continually receiving new assignments to work on, so keep a close eye on our
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Unolabs
Job Title: Platform Engineer - Databricks Modernisation (DPS to CDP Migration) Role: 6 Months Rolling Contract Job Type: Hybrid/Remote Project Overview: The current system runs on a legacy Private Virtual Cloud (PVC) Databricks deployment, which is being phased out by late 2025. This project involves migrating pipelines and platform infrastructure running on PVC to the Databricks Enterprise Edition (E2 … to ensure smooth migration of pipelines. Key Responsibilities: Infrastructure Migration & Automation Split and migrate the codebase into sub-repos tailored for CDP deployment. Refactor Terraform modules to deploy infrastructure for Databricks clusters and supporting services on AWS EC2-based Databricks E2. Manage infrastructure-as-code to provision resources such as AWS IAM roles, S3 bucket permissions, and Jenkins agents. Ensure the … mitigations. Modify and maintain Jenkins pipelines to deploy to both environments, ensuring consistent test coverage across shared and core repos. Dockerisation & Service Management Build and publish Docker images for: Databricks compute environments. Supporting microservices such as Broker, Scheduler, and Status Monitor. Push and manage Docker images in AWS ECR (Elastic Container Registry) and integrate them with GitLab CI/CD
contributed to the delivery of complex business cloud solutions. The ideal candidate will have a strong background in Machine Learning engineering and be an expert in operationalising models in the Databricks MLflow environment (chosen MLOps platform). Responsibilities: Collaborate with Data Scientists and operationalise the model with auditing enabled; ensure the run can be reproduced if needed. Implement Databricks best practices
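The reproducibility requirement above boils down to pinning randomness and recording enough run metadata to re-execute a training run. A pure-Python sketch of that principle; the bookkeeping here is what an MLOps tracking server does for you, and none of it is the real MLflow API:

```python
import hashlib
import json
import random

def run_experiment(seed: int, params: dict) -> dict:
    """Run a (stand-in) training job and return an auditable run record."""
    random.seed(seed)                  # pin randomness for reproducibility
    score = round(random.random(), 6)  # stand-in for actual model training
    record = {"seed": seed, "params": params, "score": score}
    # Fingerprint the record so an auditor can verify a re-run matches.
    record["fingerprint"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return record

a = run_experiment(42, {"lr": 0.01})
b = run_experiment(42, {"lr": 0.01})  # same seed + params -> identical record
```

With the seed, parameters, and fingerprint logged, "the run can be reproduced if needed" becomes a mechanical check rather than a hope.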
Nottingham, Nottinghamshire, East Midlands, United Kingdom Hybrid / WFH Options
Rullion Limited
Join our client in embarking on an ambitious data transformation journey using Databricks, guided by best-practice data governance and architectural principles. To support this, we are recruiting talented data engineers. As a major UK energy provider, our client is committed to 100% renewable energy and sustainability, focusing on delivering exceptional customer experiences. It is initially a 3-month … week being based in their Nottingham office; this is negotiable. It is a full-time role, 37 hours per week. Accountabilities: * Develop and maintain scalable, efficient data pipelines within Databricks, continuously evolving them as requirements and technologies change. * Build and manage an enterprise data model within Databricks. * Integrate new data sources into the platform using batch and streaming processes, adhering … and Skills: * Extensive experience with Python preferred, including advanced concepts like decorators, protocols, functools, context managers, and comprehensions. * Strong understanding of SQL, database design, and data architecture. * Experience with Databricks and/or Spark. * Knowledgeable in data governance, data cataloguing, data quality principles, and related tools. * Skilled in data extraction, joining, and aggregation tasks, especially with big data and real-time data.
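Two of the advanced Python concepts the listing names, decorators and context managers, can be shown in a few lines using only the standard library:

```python
import functools
import time
from contextlib import contextmanager

# Decorator: functools.lru_cache memoises a pure function.
@functools.lru_cache(maxsize=None)
def fib(n: int) -> int:
    """Memoised Fibonacci; without the cache this is exponential-time."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

# Context manager: wrap a block with setup/teardown (here, timing).
@contextmanager
def timer(label: str):
    start = time.perf_counter()
    try:
        yield
    finally:
        print(f"{label}: {time.perf_counter() - start:.4f}s")

with timer("fib(30)"):
    result = fib(30)  # 832040, computed in linear time thanks to the cache
```

The same two patterns show up constantly in pipeline code: caching expensive lookups, and timing or transactionally wrapping pipeline stages.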
Nottingham, Nottinghamshire, England, United Kingdom
E.ON
We're undertaking a fast-paced data transformation into Databricks at E.ON Next using best-practice data governance and architectural principles, and we are growing our data engineering capability within the Data Team. As part of our journey we're looking for a data architect to help bring our vision to life: to design and review data models, iterate on … practice. A strong attention to detail and a curiosity about the data you will be working with. A strong understanding of Linux-based tooling and concepts. Strong experience with Databricks and/or Spark. Experienced with data governance, data cataloguing, data quality principles, and associated tools. Understanding of data extraction, joining, and aggregation tasks, especially on big and real-time data.