Data Architect/Senior Data Architect £600 - £650 per day - Outside IR35 6-month contract 1 day a week onsite in central London Overview: We're looking for a Data Architect to provide expert-level consulting services over a 6-month …
London, South East, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
next-generation data platform. Working as part of a growing data team, you will play a critical role in designing and deploying scalable data pipelines and solutions using Azure Databricks and related technologies. This is an opportunity to contribute to a cloud-first, modern data strategy within a collaborative and forward-thinking environment. Key Responsibilities: Design and develop end-to-end data pipelines (batch and streaming) using Azure Databricks, Spark, and Delta Lake. Implement the Medallion Architecture and ensure consistency across raw, enriched, and curated data layers. Build and optimise ETL/ELT processes using Azure Data Factory and PySpark. Enforce data governance through Azure Purview and Unity Catalog. Apply DevOps and CI/CD practices using Git and Azure … analysts and business stakeholders to ensure data quality and usability. Contribute to performance optimisation and cost efficiency across data solutions. Required Skills & Experience: Proven hands-on experience with Azure Databricks, Data Factory, Delta Lake, and Synapse. Strong proficiency in Python, PySpark, and advanced SQL. Understanding of Lakehouse architecture and medallion data patterns. Familiarity with data governance, lineage, and access control …
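For readers unfamiliar with the Medallion Architecture named above: the idea is a layered flow from raw (bronze) through cleansed (silver) to curated (gold) Delta tables. The sketch below is a minimal, hypothetical illustration in PySpark, not code from the role; the paths, table names, and cleansing rules are invented assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # pre-created on Databricks

# Bronze: land raw events untouched, preserving source fidelity.
raw = spark.read.json("/mnt/landing/events/")  # hypothetical landing path
raw.write.format("delta").mode("append").saveAsTable("bronze.events")

# Silver: standardise and enrich - deduplicate, fix types, drop bad rows.
silver = (
    spark.table("bronze.events")
    .dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("event_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.events")

# Gold: curated, business-level aggregates for reporting.
gold = (
    spark.table("silver.events")
    .groupBy(F.to_date("event_ts").alias("event_date"))
    .agg(F.count("*").alias("event_count"))
)
gold.write.format("delta").mode("overwrite").saveAsTable("gold.daily_event_counts")
```

Each layer is written as a Delta table, so downstream consumers get ACID guarantees and time travel regardless of which layer they read from.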
reference data) Understanding of regulatory reporting processes Proven ability to work directly with demanding front office stakeholders Experience with real-time data feeds and low-latency requirements Preferred Skills Databricks experience Capital markets knowledge (equities, fixed income, derivatives) Experience with financial data vendors (Bloomberg, Reuters, Markit) Cloud platforms (Azure preferred) and orchestration tools Understanding of risk metrics and P&L …
… Oversee the development of dashboards, KPIs, and operational reports that support real-time and batch analytics across front-to-back processes. Ensure integration with enterprise data platforms (e.g. Databricks) and data warehouses. Monitor system performance, identify bottlenecks, and optimise data processing workflows for improved efficiency and scalability. Team Leadership & Governance Manage and mentor a team of BI developers, delivery …
DATA ENGINEER - DBT/AIRFLOW/DATABRICKS 4-MONTH CONTRACT £450-550 PER DAY OUTSIDE IR35 This is an exciting opportunity for a Data Engineer to join a leading media organisation working at the forefront of data innovation. You'll play a key role in designing and building the data infrastructure that supports cutting-edge machine learning and LLM initiatives. … workflows with DBT and Airflow Collaborating with AI/ML teams to support data readiness for experimentation and inference Writing clean, modular SQL and Python code for use in Databricks Contributing to architectural decisions around pipeline scalability and performance Supporting the integration of diverse data sources into the platform Ensuring data quality, observability, and cost-efficiency KEY SKILLS AND REQUIREMENTS … Strong experience with DBT, Airflow, and Databricks Advanced SQL and solid Python scripting skills Solid understanding of modern data engineering best practices Ability to work independently and communicate with technical and non-technical stakeholders Experience in fast-paced, data-driven environments DESIRABLE SKILLS Exposure to LLM workflows or vector databases Experience in the media, content, or publishing industries Familiarity with …
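The DBT-plus-Airflow workflow this advert describes typically takes the shape of a DAG that runs dbt build stages in order. A minimal sketch, assuming Airflow 2.4+ and a dbt project already installed on the worker; the DAG id, project path, and schedule are illustrative assumptions.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily pipeline: build dbt models, then run their tests.
with DAG(
    dag_id="daily_dbt_build",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    dbt_run >> dbt_test  # tests only fire after a successful build
```

Keeping the run and test steps as separate tasks means a failed test shows up in the Airflow UI without masking whether the build itself succeeded.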
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Proven experience in a Data Engineer role with a successful track record Proficiency in SQL and experience with database administration Mastery of tools such as Power BI, Data Factory, Databricks, SQL Server, and Oracle Familiarity with working as a data engineer in cloud environments such as Azure. Strong analytical, problem-solving, and attention-to-detail skills Effective communication and teamwork skills …
Data Pipeline Development: Design and implement end-to-end data pipelines in Azure Databricks, handling ingestion from various data sources, performing complex transformations, and publishing data to Azure Data Lake or other storage services. Write efficient and standardized Spark SQL and PySpark code for data transformations, ensuring data integrity and accuracy across the pipeline. Automate pipeline orchestration using Databricks Workflows … and unstructured data from various sources (APIs, databases, file systems). Implement data transformation logic using Spark, ensuring data is cleaned, transformed, and enriched according to business requirements. Leverage Databricks features such as Delta Lake to manage and track changes to data, enabling better versioning and performance for incremental data loads. Data Publishing & Integration: Publish clean, transformed data to Azure … access control, and metadata management across data assets. Ensure data security best practices, such as encryption at rest and in transit, and role-based access control (RBAC) within Azure Databricks and Azure services. Performance Tuning & Optimization: Optimize Spark jobs for performance by tuning configurations, partitioning data, and caching intermediate results to minimize processing time and resource consumption. Continuously monitor and …
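The "versioning and performance for incremental data loads" point usually resolves to a Delta Lake MERGE: upsert the latest batch of changes rather than rewriting the whole table. A hedged sketch; the table names and join key below are invented for illustration.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical incremental batch of changed customer rows.
updates = spark.read.format("delta").load("/mnt/staging/customer_changes")

target = DeltaTable.forName(spark, "silver.customers")

# Upsert: update rows whose key already exists, insert the rest.
(
    target.alias("t")
    .merge(updates.alias("u"), "t.customer_id = u.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)

# Delta's transaction log keeps prior versions queryable for audit/rollback.
spark.sql("DESCRIBE HISTORY silver.customers").show()
```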
join a skilled and collaborative analytics team of eight, working closely with senior leadership. The team has already laid the foundations for a modern data platform using Azure and Databricks and is now focused on building out scalable ETL processes, integrating AI tools, and delivering bespoke analytics solutions across the organisation. THE ROLE As a Data Engineer, you'll play … a pivotal role in designing and implementing robust data pipelines, supporting the migration from legacy Azure systems to Databricks, and working closely with stakeholders to deliver tailored data solutions. This role combines hands-on development with collaborative architecture design, and offers the opportunity to contribute to AI readiness within a fast-paced business. KEY RESPONSIBILITIES Develop and maintain ETL pipelines … loads Connect and integrate diverse data sources across cloud platforms Collaborate with analytics and design teams to create bespoke, scalable data solutions Support data migration efforts from Azure to Databricks Use Terraform to manage and deploy cloud infrastructure Build robust data workflows in Python (e.g., pandas, PySpark) Ensure the platform is scalable, efficient, and ready for future AI use cases …
Role with 2 days per week onsite in Central London. Skillset required: * Data Pipeline Expertise: Extensive experience in designing and implementing scalable ETL/ELT data pipelines in Azure Databricks, transforming raw data into usable datasets for analysis. * Azure Databricks Proficiency: Strong knowledge of Spark (SQL, PySpark) for data transformation and processing within Databricks, along with experience building workflows and … automation using Databricks Workflows. * Azure Data Services: Hands-on experience with Azure services like Azure Data Lake, Azure Blob Storage, and Azure Synapse for data storage, processing, and publication. * Data Governance & Security: Familiarity with managing data governance and security using Databricks Unity Catalog, ensuring data is appropriately organized, secured, and accessible to authorized users. * Optimization & Performance Tuning: Proven experience in …
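Unity Catalog governance of the kind listed here is expressed as SQL grants against the catalog.schema.table namespace. A short sketch runnable from a Databricks notebook on a Unity Catalog-enabled workspace; the catalog, schema, table, and group names are made up for illustration.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # pre-created in Databricks notebooks

spark.sql("CREATE CATALOG IF NOT EXISTS analytics")
spark.sql("CREATE SCHEMA IF NOT EXISTS analytics.curated")

# Grant read access on a curated table to an account-level group.
spark.sql("GRANT USE CATALOG ON CATALOG analytics TO `data-analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA analytics.curated TO `data-analysts`")
spark.sql("GRANT SELECT ON TABLE analytics.curated.sales TO `data-analysts`")

# Audit what has been granted on the table.
spark.sql("SHOW GRANTS ON TABLE analytics.curated.sales").show()
```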
Bracknell, Berkshire, South East, United Kingdom Hybrid / WFH Options
Halian Technology Limited
deliver end-to-end data architecture solutions to support business intelligence, reporting, and regulatory needs Lead the integration and optimisation of large-scale data platforms using Azure Synapse and Databricks Build and maintain robust data pipelines using Python (PySpark) and SQL Collaborate with data engineers, analysts, and stakeholders to ensure data quality, governance, and security Ensure all solutions adhere to … financial regulations and internal compliance standards Key Skills & Experience: Proven experience as a Data Architect within the financial services sector Hands-on expertise with Azure Synapse Analytics and Databricks Strong programming and data engineering skills in Python (PySpark) and SQL Solid understanding of financial data and regulatory compliance requirements Excellent stakeholder communication and documentation skills …
London, South East, England, United Kingdom Hybrid / WFH Options
Hays Specialist Recruitment Limited
both remotely and onsite. The contract has potential for extension or transition onto other exciting projects. Key Responsibilities:* Design, build, and maintain scalable data pipelines using Azure Data Factory, Databricks, and Azure Synapse.* Collaborate with cross-functional teams to ensure data quality and integration.* Support data ingestion, transformation, and storage in cloud environments.* Troubleshoot and optimise data workflows for performance … Maintain compliance with security protocols and clearance requirements. Essential Skills & Experience:* Must hold NPPV3 + SC clearance (this is a mandatory requirement).* Proven expertise in Azure Data Factory, Databricks, and Azure Synapse Analytics.* Strong experience in building and managing cloud-based data solutions.* Solid understanding of data modelling, ETL/ELT processes, and data warehousing.* Excellent communication skills and …
Nottingham, Nottinghamshire, United Kingdom Hybrid / WFH Options
Rullion - Eon
Join our client in embarking on an ambitious data transformation journey using Databricks, guided by best practice data governance and architectural principles. To support this, we are recruiting for talented data engineers. As a major UK energy provider, our client is committed to 100% renewable energy and sustainability, focusing on delivering exceptional customer experiences. It is initially a 3-month … week being based in their Nottingham office; this is negotiable. It is a full-time role, 37 hours per week. Accountabilities: * Develop and maintain scalable, efficient data pipelines within Databricks, continuously evolving them as requirements and technologies change. * Build and manage an enterprise data model within Databricks. * Integrate new data sources into the platform using batch and streaming processes, adhering … and Skills: * Extensive experience of Python preferred, including advanced concepts like decorators, protocols, functools, context managers, and comprehensions. * Strong understanding of SQL, database design, and data architecture. * Experience with Databricks and/or Spark. * Knowledgeable in data governance, data cataloguing, data quality principles, and related tools. * Skilled in data extraction, joining, and aggregation tasks, especially with big data and real …
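The advanced Python this advert calls out (decorators, functools, context managers) typically shows up as small pipeline utilities rather than application code. A self-contained illustration of the idiom, not code from the project:

```python
import functools
import logging
import time
from contextlib import contextmanager

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")


def timed_step(func):
    """Decorator: log how long a pipeline step takes."""
    @functools.wraps(func)  # preserve the wrapped function's name/docstring
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        log.info("%s finished in %.2fs", func.__name__, time.perf_counter() - start)
        return result
    return wrapper


@contextmanager
def stage(name):
    """Context manager: bracket a logical stage with start/finish logs."""
    log.info("starting %s", name)
    try:
        yield
    finally:
        log.info("finished %s", name)


@timed_step
def ingest(batch):
    # Placeholder transformation standing in for real ingestion logic.
    return [x * x for x in batch]


with stage("nightly-load"):
    print(ingest([1, 2, 3]))
```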
Tech stack: Python (pandas, NumPy, scikit-learn, PyTorch/TensorFlow); SQL (Redshift, Snowflake or similar); AWS SageMaker to Azure ML migration, with Docker, Git, Terraform, Airflow/ADF. Optional extras: Spark, Databricks, Kubernetes. What you'll bring 3-5+ years building optimisation or recommendation systems at scale. Strong grasp of mathematical optimisation (e.g., linear/integer programming, meta-heuristics) as well …
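On the mathematical optimisation side, a toy linear programme shows the shape of the work. The objective and constraints below are invented purely for illustration, using SciPy's linprog (which minimises, hence the negated objective):

```python
import numpy as np
from scipy.optimize import linprog

# Toy allocation: maximise 3x + 2y subject to capacity and budget limits.
c = np.array([-3.0, -2.0])            # negate to maximise via a minimiser
A_ub = np.array([[1.0, 1.0],          # x + y <= 4  (total capacity)
                 [2.0, 1.0]])         # 2x + y <= 5 (budget)
b_ub = np.array([4.0, 5.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)  # optimal allocation (1, 3) with objective value 9
```

Integer constraints or meta-heuristics would swap the solver, but the modelling step - objective, constraints, bounds - stays the same.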
and Excellent. What you'll need to succeed Strong proficiency in SQL, Excel, Power BI, and DAX is essential. Experience with cloud-based data platforms such as Snowflake or Databricks is preferred. Expertise in T-SQL to write complex queries and stored procedures. Understanding of database optimisation, data mining, auditing, and segmentation. Skilled in data visualisation and statistical techniques such …
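The T-SQL expertise asked for above often means window functions and procedural SQL run against SQL Server. A hedged sketch of exercising such a pattern from Python via pyodbc; the connection string, table, and columns are hypothetical.

```python
import pyodbc

# Hypothetical connection; adjust driver/server/database for your environment.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver;DATABASE=sales;Trusted_Connection=yes;"
)

# Typical T-SQL pattern: rank customers by spend within each region.
query = """
SELECT region,
       customer_id,
       total_spend,
       ROW_NUMBER() OVER (PARTITION BY region ORDER BY total_spend DESC) AS spend_rank
FROM dbo.customer_totals;
"""

for row in conn.cursor().execute(query):
    print(row.region, row.customer_id, row.spend_rank)
```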
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Unolabs
Job Title: Platform Engineer - Databricks Modernisation (DPS to CDP Migration) Role: 6-Month Rolling Contract Job Type: Hybrid/Remote Project Overview: The current system runs on a legacy Private Virtual Cloud (PVC) Databricks deployment, which is being phased out by late 2025. This project involves migrating pipelines and platform infrastructure running on PVC to the Databricks Enterprise Edition (E2 … to ensure smooth migration of pipelines. Key Responsibilities: Infrastructure Migration & Automation Split and migrate the codebase into sub-repos tailored for CDP deployment. Refactor Terraform modules to deploy infrastructure for Databricks clusters and supporting services on AWS EC2-based Databricks E2. Manage infrastructure-as-code to provision resources such as AWS IAM roles, S3 bucket permissions, and Jenkins agents. Ensure the … mitigations. Modify and maintain Jenkins pipelines to deploy to both environments, ensuring consistent test coverage across shared and core repos. Dockerization & Service Management Build and publish Docker images for: Databricks compute environments. Supporting microservices such as Broker, Scheduler, and Status Monitor. Push and manage Docker images in AWS ECR (Elastic Container Registry) and integrate them with GitLab CI/CD …
the data engineering team to build the tools and processes to efficiently ingest data into the environment, as well as support the wider data team with their transition to Databricks and the Azure platform. Furthermore, you will also introduce AI models into production with automated monitoring and alerting in place. You will - Build and develop reusable pipelines Work with the … team to improve operating efficiencies Deploy production AI models Work with the team to support ETL processes What you'll need to succeed Seasoned knowledge of the Azure Databricks platform and associated functionalities Strong Python programming knowledge, ideally PySpark A logical and analytical approach to problem-solving Awareness of the modern data stack and associated methodologies What you'll get …
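Putting "AI models into production with automated monitoring and alerting" on Databricks usually starts with MLflow tracking, which the platform bundles. A minimal sketch under that assumption; the dataset, model, and metric names are placeholders, not details from the role.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="rf-baseline"):
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)

    # Log the metric a downstream monitoring job could alert on.
    mlflow.log_metric("test_accuracy", model.score(X_test, y_test))

    # Persist the trained model so it can be served or registered later.
    mlflow.sklearn.log_model(model, artifact_path="model")
```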
Employment Type: Contract
Rate: £500 - £650 per day
for every Technology and Data initiative in the backlog, working with stakeholders as appropriate Has worked with Agile and Waterfall methodologies. Experience of a Data Platform built with ADF, Databricks, Power BI, Azure D, Waterfall, Agile, RAID, Azure DevOps, Data Factory Details: Inside IR35 2-3 days in the London office 3 Months with potential extension Please send me your …
Nottingham, Nottinghamshire, England, United Kingdom
E.ON
We're undertaking a fast-paced data transformation into Databricks at E.ON Next, using best practice data governance and architectural principles, and we are growing our data engineering capability within the Data Team. As part of our journey we're looking for a data architect to help bring our vision to life, to design and review data models, iterate on … practice. A strong attention to detail and a curiosity about the data you will be working with. A strong understanding of Linux-based tooling and concepts. Strong experience with Databricks and/or Spark. Experienced with data governance, data cataloguing, data quality principles, and associated tools. Understanding of data extraction, joining, and aggregation tasks, especially on big and real-time …
database skills Experience with geospatial data and FME Solid track record in technical delivery and team leadership AWS/FME certifications preferred Bonus: Experience with ETL tools (Glue, ADF, Databricks) If you are interested in spearheading data transformation in a leading company, please apply now. Alternatively, feel free to reach me directly at h.barmi@ioassociates.co.uk.
wide data strategy aligned with business objectives, focusing on governance, scalability, and innovation Lead the evolution of a modern data platform using cloud-native technologies (e.g. Snowflake, Azure, GCP, Databricks) Manage and mentor cross-functional teams across data engineering, analytics, governance, and data science Promote a data-as-a-product culture, enabling self-serve capabilities and empowering business users Introduce …
new platform team! What you'll need to succeed Great experience as a Data Engineer in Financial Services or Insurance - Strong experience working in regulated environments. Vast expertise with Databricks - possess Databricks certifications (desirable) Strong experience with Python. Expertise with modern data pipelines and running them in production. Strong stakeholder management experience! What you'll get in return Flexible working …
new platform team! What you'll need to succeed Great experience as a Data Engineer in Financial Services or Insurance - Strong experience working in regulated environments. Vast expertise with Databricks - possess Databricks certification. Strong experience with Python. Expertise with modern data pipelines and running them in production. Strong stakeholder management experience - you will help stakeholders use the platform. What you'll …
Employment Type: Contract
Rate: £600 - £650 per day (Inside IR35)