Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Unolabs
Job Title: Platform Engineer - Databricks Modernisation (DPS to CDP Migration)
Role: 6-Month Rolling Contract
Job Type: Hybrid/Remote
Project Overview: The current system runs on a legacy Private Virtual Cloud (PVC) Databricks deployment, which is being phased out by late 2025. This project involves migrating pipelines and platform infrastructure running on PVC to the Databricks Enterprise Edition (E2 … to ensure smooth migration of pipelines.
Key Responsibilities:
Infrastructure Migration & Automation: Split and migrate the codebase into sub-repos tailored for CDP deployment. Refactor Terraform modules to deploy infrastructure for Databricks clusters and supporting services on AWS EC2-based Databricks E2. Manage infrastructure-as-code to provision resources such as AWS IAM roles, S3 bucket permissions, and Jenkins agents. Ensure the … mitigations. Modify and maintain Jenkins pipelines to deploy to both environments, ensuring consistent test coverage across shared and core repos.
Dockerization & Service Management: Build and publish Docker images for Databricks compute environments and supporting microservices such as Broker, Scheduler, and Status Monitor. Push and manage Docker images in AWS ECR (Elastic Container Registry) and integrate them with GitLab CI/CD …
the data engineering team to build the tools and processes to efficiently ingest data into the environment, as well as support the wider data team with their transition to Databricks and the Azure platform. Furthermore, you will also introduce AI models into production with automated monitoring and alerting in place. You will - Build and develop reusable pipelines Work with the … team to improve operating efficiencies Deploy production AI models Work with the team to support ETL processes What you'll need to succeed Seasoned knowledge of the Azure Databricks platform and associated functionalities Strong Python programming knowledge, ideally PySpark A logical and analytical approach to problem-solving Awareness of the modern data stack and associated methodologies What you'll get …
Employment Type: Contract
Rate: £500 - £650 per day
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Hays
JOB DETAILS - £500-£550 PER DAY - OUTSIDE IR35 - REMOTE ROLE - 3-MONTH CONTRACT WITH POTENTIAL FOR EXTENSION - NPPV3 AND SC CLEARANCE REQUIRED SKILLS - Extensive experience in Azure Data Factory, Databricks and Synapse. - Knowledge of Oracle. - Understanding of security protocols, dealing with policing data and clearance requirements. RESPONSIBILITIES - Strong collaboration skills with other teams and colleagues within the organisation. - Ability to …
Bracknell, Berkshire, South East, United Kingdom Hybrid / WFH Options
Halian Technology Limited
deliver end-to-end data architecture solutions to support business intelligence, reporting, and regulatory needs Lead the integration and optimisation of large-scale data platforms using Azure Synapse and Databricks Build and maintain robust data pipelines using Python (PySpark) and SQL Collaborate with data engineers, analysts, and stakeholders to ensure data quality, governance, and security Ensure all solutions adhere to … financial regulations and internal compliance standards Key Skills & Experience: Proven experience as a Data Architect within the financial services sector Hands-on expertise with Azure Synapse Analytics and Databricks Strong programming and data engineering skills in Python (PySpark) and SQL Solid understanding of financial data and regulatory compliance requirements Excellent stakeholder communication and documentation skills
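For context, a "pipeline step in Python and SQL" of the kind the role above describes can be sketched with Python's built-in sqlite3 standing in for Databricks SQL; the schema and all table/column names here are invented for illustration, not taken from the posting.

```python
import sqlite3

# Minimal sketch: land raw rows, then transform them with SQL into a curated table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades_raw (trade_id TEXT, notional REAL, desk TEXT)")
conn.executemany(
    "INSERT INTO trades_raw VALUES (?, ?, ?)",
    [("T1", 100.0, "rates"), ("T2", 250.0, "fx"), ("T3", 50.0, "rates")],
)

# Transformation step: aggregate per desk into a curated output table.
conn.execute("""
    CREATE TABLE trades_by_desk AS
    SELECT desk, COUNT(*) AS n_trades, SUM(notional) AS total_notional
    FROM trades_raw
    GROUP BY desk
""")
rows = conn.execute(
    "SELECT desk, n_trades, total_notional FROM trades_by_desk ORDER BY desk"
).fetchall()
```

In a Databricks environment the same raw-to-curated step would typically run as PySpark or Spark SQL over Delta tables; the shape of the transformation is the same.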
experience who have the following skills: Good data analytics experience Strong hands-on experience with SQL Knowledge of ETL processes Data quality audits Data processing and data manipulation experience Databricks experience (Azure or AWS) is a plus but not essential Medallion Architecture knowledge is a plus but not essential Experience working on government/public sector clients is a massive plus
actionable insights. Develop and maintain robust Power BI dashboards and reports, ensuring accuracy and usability for clinical and operational teams. Design and build data pipelines and transformation logic using Databricks and SQL Mesh. Work with stakeholders across the Trust to gather requirements, interpret business needs, and deliver tailored BI solutions. Successful candidates must have experience with SQL Server, Power BI, and Databricks.
Nottingham, Nottinghamshire, England, United Kingdom
E.ON
We're undertaking a fast-paced data transformation into Databricks at E.ON Next using best-practice data governance and architectural principles, and we are growing our data engineering capability within the Data Team. As part of our journey we're looking for a data architect to help bring our vision to life to design and review data models, iterate on … practice. A strong attention to detail and a curiosity about the data you will be working with. A strong understanding of Linux-based tooling and concepts. Strong experience with Databricks and/or Spark. Experienced with data governance, data cataloguing, data quality principles, and associated tools. Understanding of data extraction, joining, and aggregation tasks, especially on big and real-time …
Azure Architect - Databricks, Cloud, Design, Hybrid. A logistics business that has recently undertaken a significant transformation programme is looking to invest in its growing technology function and bring in, on a contract basis, a solid and reliable Azure Architect. Our client's working arrangements require three to four days per week in the office. This is aligned with their emphasis … complex data concepts to non-technical stakeholders. Expertise in designing and documenting data architectures (e.g., data warehouses, lakehouses, master/reference data models). Hands-on experience with Azure Databricks, including: Workspace and cluster configuration. Delta Lake table design and optimisation. Integration with Unity Catalog for metadata management. Proficiency with Unity Catalog, including: Setting up data lineage and governance policies. … tools (e.g., Erwin, PowerDesigner, SQL DBML). Familiarity with data governance frameworks and tools. Understanding of cloud security and compliance (e.g., GDPR, ISO 27001) in Azure environments. Azure Architect - Databricks, Cloud, Design, Hybrid. McGregor Boyall is an equal opportunity employer and does not discriminate on any grounds.
new platform team! What you'll need to succeed Great experience as a Data Engineer in Financial Services or Insurance - Strong experience working in regulated environments. Vast expertise with Databricks - possess Databricks certification. Strong experience with Python. Expertise with modern data pipelines and running them in production. Strong stakeholder management experience - you will help stakeholders use the platform.
NHS Arden and Greater East Midlands Commissioning Support Unit
development of pipelines to feed dashboards. In particular, to be proficient in coding using RStudio and PySpark, able to use GitHub, and able to develop dashboards using Tableau, Databricks, and/or FDP. The post-holder should have strong requirements-gathering skills to understand and design new reporting and dashboards. They should have strong management skills required to lead …
join a skilled and collaborative analytics team of eight, working closely with senior leadership. The team has already laid the foundations for a modern data platform using Azure and Databricks and is now focused on building out scalable ETL processes, integrating AI tools, and delivering bespoke analytics solutions across the organisation. THE ROLE As a Data Engineer, you'll play … a pivotal role in designing and implementing robust data pipelines, supporting the migration from legacy Azure systems to Databricks, and working closely with stakeholders to deliver tailored data solutions. This role combines hands-on development with collaborative architecture design, and offers the opportunity to contribute to AI readiness within a fast-paced business. KEY RESPONSIBILITIES Develop and maintain ETL pipelines … loads Connect and integrate diverse data sources across cloud platforms Collaborate with analytics and design teams to create bespoke, scalable data solutions Support data migration efforts from Azure to Databricks Use Terraform to manage and deploy cloud infrastructure Build robust data workflows in Python (e.g., pandas, PySpark) Ensure the platform is scalable, efficient, and ready for future AI use cases
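The "ETL pipelines … loads" responsibility above usually means incremental (merge/upsert) loads. A minimal plain-Python sketch of the idea follows; in Databricks this would typically be a Delta Lake MERGE or a PySpark job, and every field name here is invented for the example.

```python
from datetime import datetime

def incremental_load(target, updates, key="id", ts="updated_at"):
    """Merge a batch of updates into a target table (list of dicts),
    keeping the newest record per key - a minimal incremental load."""
    merged = {row[key]: row for row in target}
    for row in updates:
        current = merged.get(row[key])
        # Last-write-wins on the timestamp column; new keys are inserted.
        if current is None or row[ts] >= current[ts]:
            merged[row[key]] = row
    return sorted(merged.values(), key=lambda r: r[key])

target = [
    {"id": 1, "name": "alice", "updated_at": datetime(2025, 1, 1)},
    {"id": 2, "name": "bob", "updated_at": datetime(2025, 1, 1)},
]
updates = [
    {"id": 2, "name": "bobby", "updated_at": datetime(2025, 2, 1)},  # newer: replaces
    {"id": 3, "name": "carol", "updated_at": datetime(2025, 2, 1)},  # new key: inserted
]
result = incremental_load(target, updates)
```

The same shape scales up unchanged: the key/timestamp comparison is exactly what a Delta `MERGE INTO ... WHEN MATCHED` clause expresses declaratively.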
new platform team! What you'll need to succeed Great experience as a Data Engineer in Financial Services or Insurance - Strong experience working in regulated environments. Vast expertise with Databricks - possess Databricks certifications (desirable) Strong experience with Python. Expertise with modern data pipelines and running them in production. Strong stakeholder management experience! What you'll get in return Flexible working …
ll need to succeed Previous experience working as a DevOps Engineer, preferably within the Financial Services industry Needs to have experience with the following: Azure, Terraform, GitLab, Datadog, Databricks What you'll get in return An exciting opportunity to join an international organisation working with a major financial services organisation. Furthermore, a competitive day rate for this role will …
will involve delivering high-quality insights into customer behaviour, marketing performance and loyalty schemes to drive strategic decisions and business growth. Insights Analyst, key skills: Python and SQL knowledge Databricks or BigQuery - highly desirable Stakeholder management Previously worked within customer behaviour, marketing schemes and loyalty schemes Retail experience We are committed to fostering a diverse and inclusive recruitment process.
I am currently working with a London-based Financial Services client who is actively seeking 2 Senior Python Developers for a long-term project within a market-leading front office team. What you'll need to succeed Extensive Python development …
We are looking for candidates who have experience in delivering training and documentation on the following technologies: Data storage and serialisation; Databricks developer; Apache Spark; Airflow; DBT basics. You must have experience delivering training on these technologies. The work is a 4-week delivered training course; some is hands-on, some is self-led, and some is instructor-led. Apply if this is …
Databricks Trainer - Contract, 4-week course. Class of 10. Databricks focus. Must have experience training with the following: Data storage and serialisation; Databricks Developer; Apache Spark; Airflow; DBT basics. £750 per day, Outside IR35, fully remote. Start date: end of August/1st September 2025. Content to be delivered over 4 weeks, consisting of taking some instructor responsibility and self-led …
London, South East, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
next-generation data platform. Working as part of a growing data team, you will play a critical role in designing and deploying scalable data pipelines and solutions using Azure Databricks and related technologies. This is an opportunity to contribute to a cloud-first, modern data strategy within a collaborative and forward-thinking environment. Key Responsibilities: Design and develop end-to-end data pipelines (batch and streaming) using Azure Databricks, Spark, and Delta Lake. Implement the Medallion Architecture and ensure consistency across raw, enriched, and curated data layers. Build and optimise ETL/ELT processes using Azure Data Factory and PySpark. Enforce data governance through Azure Purview and Unity Catalog. Apply DevOps and CI/CD practices using Git and Azure … analysts and business stakeholders to ensure data quality and usability. Contribute to performance optimisation and cost efficiency across data solutions. Required Skills & Experience: Proven hands-on experience with Azure Databricks, Data Factory, Delta Lake, and Synapse. Strong proficiency in Python, PySpark, and advanced SQL. Understanding of Lakehouse architecture and medallion data patterns. Familiarity with data governance, lineage, and access control …
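The Medallion Architecture mentioned above layers data as bronze (raw), silver (cleaned/conformed), and gold (business aggregates). A rough sketch of the pattern with plain Python structures in place of Delta Lake tables; all record fields are invented for the example.

```python
# Bronze: raw ingested events, kept as-is (may contain duplicates and bad rows).
bronze = [
    {"order_id": "A1", "amount": "10.50", "country": "UK"},
    {"order_id": "A1", "amount": "10.50", "country": "UK"},   # duplicate
    {"order_id": "A2", "amount": "bad",   "country": "UK"},   # unparseable amount
    {"order_id": "A3", "amount": "4.00",  "country": "DE"},
]

def to_silver(rows):
    """Silver: cleaned, typed, and deduplicated records."""
    seen, silver = set(), []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # real pipelines would quarantine bad records; omitted here
        if r["order_id"] in seen:
            continue  # drop exact-key duplicates
        seen.add(r["order_id"])
        silver.append({"order_id": r["order_id"], "amount": amount,
                       "country": r["country"]})
    return silver

def to_gold(rows):
    """Gold: business-level aggregate (revenue per country)."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
```

In Databricks each layer would be a Delta table (with the silver step typically a streaming or MERGE job), but the raw-to-clean-to-aggregate flow is the same.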
for every Technology and Data initiative in the backlog, working with the stakeholders as appropriate Has worked with Agile and Waterfall methodologies. Experience in a Data Platform built with ADF, Databricks, Power BI, Azure D, Waterfall, Agile, RAID, Azure DevOps, Data Factory. Details: Inside IR35 2-3 days in the London office 3 Months with potential extension Please send me your CV.
reference data) Understanding of regulatory reporting processes Proven ability to work directly with demanding front office stakeholders Experience with real-time data feeds and low-latency requirements Preferred Skills Databricks experience Capital markets knowledge (equities, fixed income, derivatives) Experience with financial data vendors (Bloomberg, Reuters, MarkIt) Cloud platforms (Azure preferred) and orchestration tools Understanding of risk metrics and P&L
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Proven experience in a Data Engineer role with a successful track record Proficiency in SQL and experience with database administration Mastery of tools such as Power BI, Data Factory, Databricks, SQL Server, and Oracle Familiarity with acting as a data engineer within cloud environments like Azure Strong analytical, problem-solving, and attention-to-detail skills Effective communication and teamwork skills
Data Architect/Senior Data Architect £600 - £650 P.D - Outside IR35 6-Month contract 1 day a week onsite in central London Overview: We're working with a Data Architect to provide expert-level consulting services over a 6-month contract …
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
in a regulated environment e.g. FCA, PRA and wider regulatory, compliance environment for global insurance businesses (e.g. GDPR) Knowledge of Business Intelligence and Data platform technologies (e.g. Microsoft Fabric, Databricks etc.). Good knowledge of technology governance standards Good knowledge of technology best practice including SDLC Working knowledge of Enterprise and Solution Architecture frameworks and methodologies (e.g. TOGAF) Professional Qualifications …
Oversee the development of dashboards, KPIs, and operational reports that support real-time and batch analytics across front-to-back processes. Ensure integration with enterprise data platforms (e.g. Databricks) and data warehouses. Monitor system performance, identify bottlenecks, and optimise data processing workflows for improved efficiency and scalability. Team Leadership & Governance Manage and mentor a team of BI developers, delivery …