- … with data visualisation platforms like Tableau or Power BI
- Exposure to machine learning or data science application development
- Knowledge of modern data platform technologies such as Quantexa, Palantir, or Databricks
- Proficiency with open-source data tools and frameworks
- Experience with test automation within CI/CD pipelines …
… to architectural decisions, and support the migration of reporting services to Azure.
Key Responsibilities:
- Design, build, and maintain ETL pipelines using Azure Data Factory, Azure Data Lake, Synapse, and Databricks (see the sketch below).
- Design and build a greenfield Azure data platform to support business-critical data needs.
- Collaborate with stakeholders across the organisation to gather and define data requirements.
- Assist in the …
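As an illustration of the kind of pipeline work this listing describes - not the employer's actual implementation - a minimal Databricks ETL step might look like the sketch below. The storage paths, schema, and column names are assumptions.

```python
# Minimal sketch of a Databricks ETL step: read raw CSV from the data lake,
# apply light cleansing, and write the result as a Delta table.
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

raw = (
    spark.read
    .option("header", True)
    .csv("abfss://raw@examplelake.dfs.core.windows.net/sales/")  # assumed ADLS path
)

cleaned = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date"))
    .filter(F.col("amount").isNotNull())
)

(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .save("abfss://curated@examplelake.dfs.core.windows.net/sales/")  # assumed curated path
)
```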
… of these Senior Data Engineer roles you must be able to demonstrate the following experience:
- Experience in prominent languages such as Python, Scala, Spark, and SQL.
- Good experience in using Databricks.
- Experience working with database technologies from an application programming perspective - Oracle, MySQL, MongoDB, etc.
- Experience with the design, build, and maintenance of data pipelines and infrastructure.
- Understanding of …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Anson Mccade
… large-scale platforms and diverse stakeholder groups
- Strong data modelling skills across OLTP and OLAP systems, with experience building data marts and warehouses
- Proficiency with cloud-native tools (e.g. Snowflake, Databricks, Azure Data Services)
- High-level SQL skills and experience with tools like SSMS, SSIS, or SSRS
- Strong understanding of data governance, RBAC, and information security practices
- Able to evaluate …
… Asda's data strategy.
What You'll Bring:
- Proven experience in data analysis, with the ability to interpret complex information and deliver meaningful insights.
- Advanced SQL knowledge.
- Experience with Databricks is a plus, though not essential.
- Familiarity with relational databases and data structures.
- Proficiency in Excel and other Microsoft Office tools; experience with BI platforms such as Power BI is …
Data Engineer (Databricks) - Leeds (Data Engineer, Python, PySpark, SQL, Big Data, Databricks, R, Machine Learning, AI, Agile, Scrum, TDD, BDD, CI/CD, SOLID principles, GitHub, Azure DevOps, Jenkins, Terraform, AWS CDK, AWS CloudFormation, Azure, Data Engineer)
Our client is a global innovator and world leader with one of the most recognisable names within technology. They are looking for Data Engineers with significant Databricks experience to join an exceptional Agile engineering team. We are seeking a Data Engineer with strong Python, PySpark, and SQL experience, who possesses a clear understanding of Databricks and a passion for Data Science (R, Machine Learning, and AI). Database experience with SQL and NoSQL technologies (Aurora, MS SQL Server, MySQL) is expected, as …
Salary: £40k - £50k + Pension + Benefits. To apply for this position please send your CV to Nathan Warner at Noir Consulting.
(Data Engineer, Python, PySpark, SQL, Big Data, Databricks, R, Machine Learning, AI, Agile, Scrum, TDD, BDD, CI/CD, SOLID principles, GitHub, Azure DevOps, Jenkins, Terraform, AWS CDK, AWS CloudFormation, Azure, Data Engineer) NOIRUKTECHREC NOIRUKREC …
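The listing above pairs Python/PySpark with TDD, BDD, and CI/CD. Purely as a hedged sketch - the function, columns, and values are hypothetical, not taken from the role - a PySpark transformation might be unit-tested on a CI agent like this:

```python
# Hypothetical example: unit-testing a PySpark transformation with pytest.
# Assumes pyspark and pytest are installed; names are illustrative only.
import pytest
from pyspark.sql import SparkSession, functions as F


def add_total_price(df):
    """Derive a total_price column from quantity and unit_price."""
    return df.withColumn("total_price", F.col("quantity") * F.col("unit_price"))


@pytest.fixture(scope="module")
def spark():
    # local[2] keeps the test runnable on a CI agent without a cluster
    session = SparkSession.builder.master("local[2]").appName("tests").getOrCreate()
    yield session
    session.stop()


def test_add_total_price(spark):
    df = spark.createDataFrame([(2, 5.0), (3, 1.5)], ["quantity", "unit_price"])
    result = add_total_price(df).collect()
    assert [row.total_price for row in result] == [10.0, 4.5]
```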
Lead Data Engineer (Databricks) - Leeds (Lead Data Engineer, Team Lead, Technical Lead, Senior Data Engineer, Data Engineer, Python, PySpark, SQL, Big Data, Databricks, R, Machine Learning, AI, Agile, Scrum, TDD, BDD, CI/CD, SOLID principles, GitHub, Azure DevOps, Jenkins, Terraform, AWS CDK, AWS CloudFormation, Azure, Lead Data Engineer, Team Lead, Technical Lead, Senior Data Engineer, Data Engineer)
Our client is a global innovator and world leader with one of the most recognisable names within technology. They are looking for a Lead Data Engineer with significant Databricks experience as well as leadership responsibility to run an exceptional Agile engineering team and provide technical leadership through coaching and mentorship. We are seeking a Lead Data Engineer capable of leading client delivery … data solutions and developing complex enterprise data ETL and ML pipelines and projections. The successful candidate will have strong Python, PySpark, and SQL experience, a clear understanding of Databricks, and a passion for Data Science (R, Machine Learning, and AI). Database experience with SQL and NoSQL technologies (Aurora, MS SQL Server, MySQL) is expected, as well …
… you'll also play a key part in mentoring junior engineers and shaping the long-term DevOps roadmap.
- Architect and build scalable Azure infrastructure and deployment pipelines
- Lead integration of Databricks, Azure services, and CI/CD workflows
- Automate infrastructure using Terraform (IaC)
- Mentor junior DevOps engineers
- Establish and govern DevOps best practices, automation, and security standards
- Guide adoption of Microsoft …
… architecture in a complex organisation
- Expertise in Kimball and Data Vault 2.0 methodologies
- Strong grasp of data modelling, metadata, and governance
- Hands-on experience with modern data platforms (Databricks, Delta Lake, Unity Catalog, Azure)
- Ability to define and drive architecture principles, patterns, and best practices
- Excellent communication and stakeholder management skills
- Retail industry experience is a bonus but not …
… decision making for Cox Automotive. You'll collaborate with a talented team, using open-source tools such as R, Python, and Spark, data visualisation tools like Power BI, and the Databricks data platform.
Key Responsibilities:
- Develop and implement analytics strategies that provide actionable insights for our business and clients.
- Apply the scientific method to create robust, reproducible solutions.
- Collaborate with stakeholders … seamlessly with team members and external clients.
- Proficiency in R or Python.
- Solid understanding of SQL; experience working with Spark (Java, Python, or Scala variants) and cloud platforms like Databricks is a plus.
- Strong statistical knowledge, including hypothesis testing, confidence intervals, and A/B testing.
- Ability to understand and communicate the commercial impact of data activities.
Why Join Us …
Data Engineer (Databricks) - Leeds. Our client is a global innovator and world leader with a highly recognisable name within technology. They are looking for Data Engineers with significant Databricks experience to join an exceptional Agile engineering team. We are seeking a Data Engineer with strong Python, PySpark, and SQL experience, a clear understanding of Databricks, and a passion for Data …
… Engineering, Data Science, Analytics, and DevOps teams to align operational strategies with technical and business requirements. Optimise operational performance and cost management for services including Azure Data Factory, Azure Databricks, Delta Lake, and Azure Data Lake Storage. Serve as the domain expert in DataOps by providing strategic guidance, mentoring colleagues, and driving continuous process improvements.
What you will need:
- Demonstrable … with agile teams and driving automation of data workflows within the Microsoft Azure ecosystem.
- Hands-on expertise with the Azure data platform, including components such as Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Delta Lake, Azure SQL, Purview, and APIM.
- Proficiency in developing CI/CD data pipelines and strong programming skills in Python, SQL, Bash, and PySpark for …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Unolabs
Job Title: Platform Engineer - Databricks Modernisation (DPS to CDP Migration)
Role: 6-Month Rolling Contract
Job Type: Hybrid/Remote
Project Overview: The current system runs on a legacy Private Virtual Cloud (PVC) Databricks deployment, which is being phased out by late 2025. This project involves migrating pipelines and platform infrastructure running on PVC to the Databricks Enterprise Edition (E2) … to ensure smooth migration of pipelines.
Key Responsibilities:
Infrastructure Migration & Automation
- Split and migrate the codebase into sub-repos tailored for CDP deployment.
- Refactor Terraform modules to deploy infrastructure for Databricks clusters and supporting services on AWS EC2-based Databricks E2.
- Manage infrastructure-as-code to provision resources such as AWS IAM roles, S3 bucket permissions, and Jenkins agents.
- Ensure the … mitigations.
- Modify and maintain Jenkins pipelines to deploy to both environments, ensuring consistent test coverage across shared and core repos.
Dockerization & Service Management
- Build and publish Docker images for Databricks compute environments and supporting microservices such as Broker, Scheduler, and Status Monitor.
- Push and manage Docker images in AWS ECR (Elastic Container Registry) and integrate them with GitLab CI/CD …
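The final responsibility - publishing Docker images to AWS ECR - might look broadly like the following sketch. This is an assumption-laden illustration, not the project's actual pipeline; the region, repository name, and tag are placeholders.

```python
# Illustrative sketch: tag and push a locally built Docker image to AWS ECR.
# Assumes Docker and AWS credentials are configured; names are hypothetical.
import base64
import subprocess

import boto3

REGION = "eu-west-1"              # assumed region
REPO_NAME = "databricks-compute"  # hypothetical repository name
IMAGE_TAG = "latest"

ecr = boto3.client("ecr", region_name=REGION)

# Create the repository if it does not already exist
try:
    ecr.create_repository(repositoryName=REPO_NAME)
except ecr.exceptions.RepositoryAlreadyExistsException:
    pass

# Retrieve a temporary auth token and log Docker in to the registry
auth = ecr.get_authorization_token()["authorizationData"][0]
username, password = base64.b64decode(auth["authorizationToken"]).decode().split(":")
registry = auth["proxyEndpoint"].replace("https://", "")
subprocess.run(["docker", "login", "-u", username, "-p", password, registry], check=True)

# Tag the locally built image and push it to ECR
remote = f"{registry}/{REPO_NAME}:{IMAGE_TAG}"
subprocess.run(["docker", "tag", f"{REPO_NAME}:{IMAGE_TAG}", remote], check=True)
subprocess.run(["docker", "push", remote], check=True)
```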
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
… native data solutions? We're recruiting on behalf of a dynamic organisation in Leeds that's investing in its data platform - and they're looking for someone with strong Databricks expertise to join their team.
About the role:
- Designing and developing robust data pipelines using Azure Data Factory, Databricks, and Synapse Analytics.
- Working with Delta Lake and Azure Data Lake … Azure-based architecture
- Ensuring best practices in data governance, security, and performance tuning
Requirements:
- Proven experience with Azure Data Services (ADF, Synapse, Data Lake)
- Strong hands-on experience with Databricks (including PySpark or SQL)
- Solid SQL skills and understanding of data modelling and ETL/ELT processes
- Familiarity with Delta Lake and lakehouse architecture (see the upsert sketch below)
- A proactive, collaborative approach to problem … innovation
Benefits:
- Competitive salary up to £65,000
- Flexible hybrid working (2 days in the Leeds office)
- Generous holiday allowance and pension scheme
- Ongoing training and certification support (Azure, Databricks, etc.)
- A supportive, inclusive team culture with real opportunities for growth
Please Note: This is a permanent role for UK residents only. This role does not offer sponsorship. You must …
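For context on the Delta Lake and lakehouse work referenced above, an incremental upsert (MERGE) into a Delta table often looks roughly like this sketch; the table paths and join key are assumptions rather than details from the role.

```python
# Illustrative Delta Lake upsert (MERGE) on Databricks; paths and keys are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

updates = spark.read.format("delta").load("/mnt/staging/customers")  # assumed staging path
target = DeltaTable.forPath(spark, "/mnt/curated/customers")         # assumed curated path

(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")  # assumed business key
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```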
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
PEXA Group Limited
… customer-facing reports. You will optimise the transformation pipeline from start to finish, guaranteeing that datasets are robust, tested, secure, and business-ready. Our data platform is built using Databricks, with data pipelines written in PySpark and orchestrated using Airflow. You will be expected to challenge and improve current transformations, ensuring they meet our performance, scalability, and data governance needs.
- … end-to-end data quality, from raw ingested data to business-ready datasets
- Optimise PySpark-based data transformation logic for performance and reliability (see the sketch below)
- Build scalable and maintainable pipelines in Databricks and Airflow
- Implement and uphold GDPR-compliant processes around PII data
- Collaborate with stakeholders to define what "business-ready" means, and confidently sign off datasets as fit for consumption
- Put …
- Help shape our approach to reliable data delivery for internal and external customers
Skills & Experience Required:
- Extensive hands-on experience with PySpark, including performance optimisation
- Deep working knowledge of Databricks (development, architecture, and operations)
- Proven experience working with Airflow for orchestration
- Proven track record in managing and securing PII data, with GDPR compliance in mind
- Experience in data governance processes …
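Purely as an illustration of the PySpark tuning and PII handling this listing describes (table locations, column names, and join sizes are assumptions):

```python
# Illustrative PySpark tuning and PII handling; tables and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.getOrCreate()

events = spark.read.format("delta").load("/mnt/raw/events")        # large fact table (assumed)
postcodes = spark.read.format("delta").load("/mnt/ref/postcodes")  # small lookup table (assumed)

enriched = (
    events
    # Broadcasting the small lookup avoids an expensive shuffle join
    .join(broadcast(postcodes), on="postcode", how="left")
    # Pseudonymise a PII column before it reaches business-ready datasets
    .withColumn("email_hash", F.sha2(F.col("email"), 256))
    .drop("email")
)

# Partition the output by date so downstream reads can prune files
(
    enriched.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .save("/mnt/curated/events")
)
```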
… technical background but has recently worked in a managerial role focused on mentoring, coaching, reviewing code, and standard setting. The role will focus on the development of the client's Databricks platform (AWS is preferred, but Azure or GCP experience is also considered), utilising Python and SQL, and will contribute to CI/CD pipelines, strategy development, cost optimisation, and data governance frameworks.
Job duties:
- … engineers, helping to mentor and coach the team.
- Manage the adoption of automated CI/CD pipelines.
- Implement a new delivery roadmap.
- Contribute to the development of a new Databricks system in AWS (AWS experience is preferred, but they are open to managers with Azure experience).
- Cost optimisation.
- Establish data governance frameworks for secure handling of delivery information.
Requirements … Manager:
- 6+ years' experience in a hands-on data engineering role, with over a year's recent experience in a managerial role, coaching similar-sized teams.
- Deep knowledge of the Databricks platform.
- Hands-on Python development experience.
- SQL optimisation.
- Experience with large-scale data pipeline optimisation.
- Experience with streaming and batch Spark workloads.
- Strong people management skills.
Role: Data Engineering Manager …
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
Corecom Consulting
… and managing modern data pipelines and cloud-based data solutions.
What you'll be doing:
- Designing and maintaining scalable data pipelines
- Building and optimising data warehousing solutions
- Working with Databricks and modern cloud platforms (Azure or AWS)
- Collaborating with cross-functional teams to deliver high-impact data products
- Leading best practices in data engineering and pipeline architecture
What we're … looking for:
- Proven experience in data engineering at a senior level
- Strong hands-on knowledge of Databricks
- Experience with Azure or AWS cloud platforms
- Expertise in data warehousing and ETL processes
- Ability to work both independently and as part of a collaborative team
Why join us?
- £60,000 salary
- Flexible hybrid working (2 days a week in our Leeds office …
… you'll work closely with DevOps, Product, and business partners to unlock the value of data through modern, secure and governed pipelines.
- Lead hands-on development work using Azure, Databricks, and Terraform (IaC)
- Support the migration from legacy platforms to a more modern cloud stack
- Act as a mentor and technical guide to the wider engineering team
- Implement data governance … close to the ML lifecycle
YOUR SKILLS AND EXPERIENCE: A successful Principal Data Engineer will have the following skills and experience:
- Strong hands-on experience with Azure
- Proficient in Databricks
- Skilled in Terraform and Infrastructure as Code practices
- Exposure to containers/Kubernetes
- Awareness of MLOps concepts
THE BENEFITS: You will receive a salary, dependent on experience. Salary is …
Leeds, West Yorkshire, Yorkshire and the Humber, United Kingdom Hybrid / WFH Options
VIQU IT Recruitment
… to shape the future of logistics. You must have management/leadership experience to be considered for this role.
What You'll Do:
- Lead the technical direction of our Databricks Common Data Platform
- Drive hands-on development with high-quality Python and SQL code
- Mentor and coach a distributed team of data engineers
- Champion CI/CD and automated testing …
- Work closely with the Head of Data Engineering to align with strategic goals
Must-Have Experience:
- 8+ years in data engineering, including leadership experience
- Expert-level knowledge of the Databricks platform
- Advanced Python development for logistics and delivery optimisation
- Mastery of SQL for large-scale, nationwide operations
- Strong understanding of data modelling for parcel tracking systems
- Experience with both streaming …
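The truncated final requirement refers to streaming and batch Spark workloads. A minimal Structured Streaming sketch is shown below; the use of Databricks Auto Loader, the paths, and the checkpoint location are assumptions, not details from the role.

```python
# Minimal Spark Structured Streaming sketch: incrementally ingest parcel-scan
# events and append them to a Delta table. All paths are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

scans = (
    spark.readStream
    .format("cloudFiles")                 # Databricks Auto Loader (assumed available)
    .option("cloudFiles.format", "json")
    .load("/mnt/landing/parcel-scans/")
)

(
    scans.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/parcel-scans/")
    .outputMode("append")
    .trigger(availableNow=True)           # process the backlog incrementally, then stop
    .start("/mnt/curated/parcel_scans")
)
```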
… reporting that underpins our financial and strategic decisions. In this role, you'll be responsible for supporting the delivery and continuous improvement of business-critical reports using Power BI, Databricks, and SAP Business Objects. You'll play a key part in ensuring our reporting of business performance is accurate and insightful to support decision-making. This is a great time … grow with us.
Interested? Take a look below to understand what you'll be doing as a BI Analyst:
- Deliver a suite of Finance BI reports, using Power BI, Databricks, and Business Objects to support performance tracking and business decision-making (a query sketch follows below).
- Ensure reporting accuracy and consistency across the BI suite by identifying errors and proposing improvements.
- Support the transition from … legacy systems such as Alteryx and Business Objects to Databricks and Power BI as part of our data modernisation strategy.
- Respond to internal stakeholder queries via the Finance BI mailbox, escalating or resolving as appropriate.
- Assist on new and ongoing business-wide data projects as directed by the BI Manager or Senior Analyst.
- Provide cover for BI team members during …
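As a hedged sketch of how a reporting dataset might be pulled from Databricks for checking before it reaches Power BI - the connection details, schema, and query are placeholders, not this employer's actual setup:

```python
# Illustrative query against a Databricks SQL warehouse using the
# databricks-sql-connector package; connection details are placeholders.
from databricks import sql

with sql.connect(
    server_hostname="adb-<workspace-id>.azuredatabricks.net",  # placeholder
    http_path="/sql/1.0/warehouses/<warehouse-id>",            # placeholder
    access_token="<personal-access-token>",                    # placeholder
) as conn:
    with conn.cursor() as cursor:
        cursor.execute(
            "SELECT reporting_month, SUM(revenue) AS revenue "
            "FROM finance.monthly_revenue GROUP BY reporting_month"  # hypothetical table
        )
        for row in cursor.fetchall():
            print(row[0], row[1])
```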
Leeds, England, United Kingdom Hybrid / WFH Options
Harnham
… modelling to solve problems around store planning, network optimisation, and targeted customer engagement
- Identify whitespace opportunities based on demographics, mobility, and purchasing behaviour
Advanced Geospatial Analytics
- Apply tools like Databricks, ArcGIS, SQL, and Python to build models, test hypotheses, and scale geospatial insight (see the sketch below)
- Integrate datasets like HST (High Street Trading), customer segmentation, and footfall into spatial analysis workflows
- Conduct segmentation … delivery optimisation, or customer segmentation
- Confident communicator with the ability to translate data into strategic impact
Desirable:
- Experience in retail, logistics, transportation, or infrastructure industries
- Familiarity with working in Databricks or similar cloud environments
- Understanding of demographic and economic datasets relevant to location modelling
- Experience in line management or mentoring
Benefits:
- £65,000 base salary
- £5,700 car allowance
- 13.5 … tooling
How to Apply: Interested? Send your CV to Mohammed Buhariwala at Harnham via the Apply link on this page.
Keywords: Geospatial Analytics, Location Planning, Spatial Insight, GIS, ArcGIS, Databricks, Python, SQL, Retail Strategy, Network Optimisation, Data Science, Hybrid Working, Leeds …
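To illustrate the geospatial tooling named above, a simple catchment-style calculation in PySpark might look like the sketch below; the candidate site coordinates, dataset, and 5 km threshold are assumptions.

```python
# Illustrative geospatial sketch: great-circle (haversine) distance from each
# customer to a candidate store site, used to flag an assumed 5 km catchment.
# Dataset, coordinates, and threshold are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

SITE_LAT, SITE_LON = 53.7997, -1.5492   # assumed candidate site (central Leeds)
EARTH_RADIUS_KM = 6371.0

customers = spark.read.format("delta").load("/mnt/curated/customers")  # assumed table with lat/lon

# Haversine formula: a = sin^2(dlat/2) + cos(lat1) * cos(lat2) * sin^2(dlon/2)
dlat = F.radians(F.col("lat") - F.lit(SITE_LAT))
dlon = F.radians(F.col("lon") - F.lit(SITE_LON))
a = (
    F.sin(dlat / 2) ** 2
    + F.cos(F.radians(F.lit(SITE_LAT))) * F.cos(F.radians(F.col("lat"))) * F.sin(dlon / 2) ** 2
)
distance_km = F.lit(2 * EARTH_RADIUS_KM) * F.asin(F.sqrt(a))

catchment = (
    customers
    .withColumn("distance_km", distance_km)
    .withColumn("in_catchment", F.col("distance_km") <= 5.0)
)
catchment.groupBy("in_catchment").count().show()
```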
… ensuring we accurately generate and monitor high-value invoices, identify revenue risks, and provide data-driven insights that influence decisions at the highest level. You'll work across Oracle, Databricks, Excel, and other platforms to produce critical reports and spot commercial opportunities that protect and grow our revenue streams. If you thrive in a fast-paced, data-rich environment - we …
- … in a dynamic, fast-paced environment.
- Knowledge of the parcel delivery industry (desirable but not essential).
- Advanced Excel skills are essential.
- Experience with SQL and databases is desirable (Databricks, Oracle).
- Familiarity with SAP Business Objects or VBA is an advantage.
At Evri, we know we only grow if our people do too. That's why we're committed …