Data Engineer to play a key role in the creation of a brand-new data platform within the Azure ecosystem, including Azure Data Factory (ADF), Synapse, PySpark/Databricks and Snowflake. You will be a data ingestion and ETL pipeline guru, tackling complex problems at source in order to retrieve the data and ensure it can flow upstream to …
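For a sense of the day-to-day work this kind of role describes, a minimal PySpark batch-ingestion sketch might look like the following. It assumes a Databricks (or otherwise Delta-enabled) Spark environment; the landing path, column names and target table are hypothetical placeholders, not details taken from the advert.

```python
# Minimal batch-ingestion sketch: read raw CSV files from a landing zone,
# apply light cleaning, and persist the result as a Delta table for
# downstream consumers (Synapse, Snowflake, BI tools, etc.).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("/mnt/landing/orders/*.csv")      # hypothetical landing path
)

cleaned = (
    raw.dropDuplicates(["order_id"])        # de-duplicate on the business key
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .filter(F.col("order_id").isNotNull())
)

(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("bronze.orders")           # hypothetical target table
)
```

In practice a pipeline like this would typically be orchestrated by Azure Data Factory or a Databricks workflow, with the curated tables exposed onwards to Synapse or Snowflake.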
Leeds, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
a range of data science tools, especially Python, R and SQL. Experience of cloud computing (Azure, AWS or GCP). Substantial experience working in cloud-based tools such as Databricks for Machine Learning, Azure Machine Learning and Azure AI Foundry, as well as experience helping others to use them. If this role sounds like the challenge you are seeking, get …
digital transformation. The ideal candidate has a technical background with recent managerial experience focused on mentoring, coaching, code review, and standard setting. The role involves developing the client’s Databricks platform, preferably on AWS (Azure or GCP experience also acceptable), using Python and SQL, and contributing to CI/CD pipelines, strategy development, cost optimization, and data governance frameworks. Responsibilities: Manage and mentor a team of eight engineers. Oversee the adoption of automated CI/CD pipelines. Implement a new delivery roadmap. Develop a new Databricks system on AWS (experience on Azure is also suitable). Optimize costs and establish data governance frameworks for secure data handling. Requirements: At least 6 years of hands-on experience as a data engineer, with over a year in a managerial role coaching similar-sized teams. Deep expertise in the Databricks platform. Experience with large-scale data pipeline optimization. Knowledge of Streaming and Batch Spark workloads. Location: Leeds (on-site one day per week). How to Apply: To discuss this opportunity further, please APPLY NOW for a no-obligation chat with your VIQU Consultant.
digital transformation. The ideal candidate has a technical background with recent managerial experience focused on mentoring, coaching, code review, and standard setting. The role involves developing the client's Databricks platform (AWS preferred, open to Azure or GCP), utilizing Python and SQL, contributing to CI/CD pipelines, strategy development, cost optimization, and data governance frameworks. Responsibilities: Manage and mentor a team of eight data engineers. Oversee the adoption of automated CI/CD pipelines. Implement a new delivery roadmap. Contribute to the development of a Databricks system in AWS (or Azure/GCP). Optimize costs associated with data infrastructure. Establish data governance frameworks for secure data handling. Requirements: 6+ years of hands-on experience as a data engineer, with at least 1 year in a managerial role coaching similar-sized teams. Deep knowledge of the Databricks platform. Experience with large-scale data pipeline optimization. Experience with Streaming and Batch Spark workloads. To discuss this opportunity further, please APPLY NOW for a no-obligation chat with your VIQU Consultant. You can also contact Jack McManus at [email protected].
technical background but has recently worked in a managerial role focused on mentoring, coaching, reviewing code, and standard setting. The role will focus on the development of the client's Databricks platform (AWS is preferred, but Azure/GCP experience is also acceptable), utilising Python and SQL, and contributing to CI/CD pipelines, strategy development, cost optimisation and data governance frameworks. Responsibilities: Lead a team of eight engineers, helping to mentor and coach the team. Manage the adoption of automated CI/CD pipelines. Contribute to the development of a new Databricks system in AWS (AWS experience is preferred, but they are open to managers with Azure experience). Cost optimisation. Establish data governance frameworks for secure handling of delivery information. Requirements: 6+ years' experience in a hands-on data engineer role, with over a year's recent experience in a managerial role coaching similar-sized teams. Deep knowledge of the Databricks platform. Hands-on Python development experience. SQL optimisation. Experience with large-scale data pipeline optimisation. Experience with Streaming and Batch Spark workloads. Strong people management skills. Role: Lead Data Engineer …
transformation. The ideal candidate will have a technical background with recent managerial experience involving mentoring, coaching, code review, and standard setting. The role focuses on developing the client's Databricks platform (AWS experience preferred, but open to Azure or GCP), utilizing Python and SQL, contributing to CI/CD pipelines, strategy development, cost optimisation, and data governance frameworks. Key Responsibilities: Manage and mentor a team of eight data engineers. Oversee the adoption and integration of automated CI/CD pipelines. Implement a new delivery roadmap for data projects. Lead the development of a new Databricks system on AWS (or Azure/GCP as applicable). Drive cost optimisation initiatives. Establish and enforce data governance frameworks to ensure secure data handling. Requirements: At least 6 years' experience as a hands-on data engineer. At least a year's recent experience in a managerial role leading similar-sized teams. Deep expertise in the Databricks platform. Proficiency in Python development. Strong SQL optimisation skills. Experience with large-scale data pipeline optimisation. Knowledge of Streaming and Batch Spark workloads. Excellent people management skills. Application Process: To discuss this opportunity further …
Leeds, England, United Kingdom Hybrid / WFH Options
Harnham
modelling to solve problems around store planning, network optimisation, and targeted customer engagement. Identify whitespace opportunities based on demographics, mobility, and purchasing behaviour. Advanced Geospatial Analytics: apply tools like Databricks, ArcGIS, SQL, and Python to build models, test hypotheses, and scale geospatial insight. Integrate datasets like HST (High Street Trading), customer segmentation, and footfall into spatial analysis workflows. Conduct segmentation … planning, delivery optimisation, or customer segmentation. Confident communicator with the ability to translate data into strategic impact. Experience in retail, logistics, transportation, or infrastructure industries. Familiarity with working in Databricks or similar cloud environments. Understanding of demographic and economic datasets relevant to location modelling. Experience in line management or mentoring. 13.5% bonus. Hybrid working from Leeds (3 days per week) … tooling. How to Apply: Interested? Send your CV to Mohammed Buhariwala at Harnham via the Apply link on this page. Keywords: Geospatial Analytics, Location Planning, Spatial Insight, GIS, ArcGIS, Databricks, Python, SQL, Retail Strategy, Network Optimisation, Data Science, Hybrid Working, Leeds. Seniority level: Associate. Employment type: Full-time. Job function: Analyst. Industries: Retail.
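To make the geospatial side of this concrete, a small illustrative sketch of a catchment analysis is shown below. It assumes the geopandas library and a projected (metric) coordinate system; the file names, the store_id column, and the 2 km radius are hypothetical examples rather than details from the role.

```python
# Illustrative catchment analysis: count customers within 2 km of each store.
import geopandas as gpd

# Reproject to British National Grid so buffers are in metres.
stores = gpd.read_file("stores.geojson").to_crs(epsg=27700)
customers = gpd.read_file("customers.geojson").to_crs(epsg=27700)

# Build 2 km catchment polygons around each store location.
catchments = stores.copy()
catchments["geometry"] = stores.geometry.buffer(2000)

# Point-in-polygon join: which customers fall inside which catchment.
joined = gpd.sjoin(customers, catchments, how="inner", predicate="within")

# Customers per store, ready to feed into location-planning models.
counts = joined.groupby("store_id").size().rename("customers_in_catchment")
print(counts.sort_values(ascending=False).head())
```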
to link data across multiple systems. Proficiency in data analysis tools and software (e.g., SQL, Python). Experience with data transformation and visualization tools (e.g., DOMO). Familiarity with Snowflake and Databricks for data warehousing and analytics. Strong problem-solving skills and attention to detail. Knowledge of data governance and data quality best practices. Information About Your Location: This role offers a …
Principal Data and AI Engineer, Leeds, West Yorkshire, United Kingdom. Job category: Other. EU work permit required: Yes. …
Leeds, England, United Kingdom Hybrid / WFH Options
BJSS
successful applicant with great freedom to develop and evolve our data platform accelerators. We welcome candidates from diverse backgrounds and skills, with the following requirements: Strong experience working with Databricks. Experience in an AWS or Azure environment. Comfortable with Infrastructure as Code. Ability to automate repetitive tasks and workloads. Enthusiastic about working with and evangelizing data solutions. Enjoys keeping up …
The primary focus is on building and maintaining the infrastructure to support the full data science lifecycle, from data ingestion to model deployment, monitoring, and upgrades, within Azure and Databricks environments. The engineer will work closely with data scientists in a collaborative, cross-functional setting, helping transition models from research into production. Key Responsibilities: Own and develop deployment frameworks for … Work with cross-functional teams to ensure smooth productionization of models. Write clean, production-ready Python code. Apply software engineering best practices, CI/CD, TDD. Required Skills: Proficiency in Python, Databricks, and Azure. Experience with deployment tools (e.g., AKS, managed endpoints). Strong software engineering background (CI/CD, VCS, TDD). Ability to integrate ML into business workflows. Background in …
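As a rough illustration of the research-to-production hand-off described above, the sketch below trains a toy model and registers it with MLflow so a downstream CI/CD or serving pipeline can pick it up. It assumes an MLflow tracking server such as the one built into Databricks; the experiment path, metric and registered model name are hypothetical.

```python
# Train a toy classifier, log its accuracy, and register the model so a
# release pipeline has a stable name and version to deploy.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("/Shared/churn-model")        # hypothetical experiment path

with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)

    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_metric("accuracy", acc)

    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="churn_classifier",    # hypothetical registry name
    )
```

From there, a release pipeline would typically promote the registered version through staging to production and roll it out to Databricks model serving, AKS, or an Azure ML managed endpoint.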
logistics company and are looking for a Data Engineering Product Owner to join their Data Engineering Team! You'll need: - 8 years of experience in Data Engineering - Expertise in Databricks, Spark and Python - Strong SQL skills - Hands-on experience with AWS - Familiarity with test-driven development, CI/CD, and delivery principles. What you'll be doing: - Domain Expertise …
Leeds, England, United Kingdom Hybrid / WFH Options
Context Recruitment
Candidate: Deep knowledge and significant experience in delivering Big Data architecture and design solutions, specifically Azure. Extensive experience with Azure data services (e.g., Azure SQL, Azure Data Lake, Azure Databricks). Strong understanding of data architecture principles and best practices. Proficient in data integration and ETL processes. Excellent problem-solving skills and ability to work independently. Strong communication and collaboration …
Leeds, England, United Kingdom Hybrid / WFH Options
PCR RECRUITMENT LIMITED
A/B tests and other statistical modelling techniques. Proficiency in Python and/or R. Excellent proficiency in SQL and experience working with data warehousing technologies (e.g. Snowflake, Databricks). Ability to use data visualisation tools such as Looker/Tableau. Ability to convert technical outputs into succinct and compelling presentations with clear narrative and actionable recommendations. Strong clarity …
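For context on the A/B-testing element mentioned above, a minimal two-proportion z-test is sketched below. It uses statsmodels, and the conversion numbers are entirely made up for the example.

```python
# Compare conversion rates of two variants with a two-proportion z-test.
from statsmodels.stats.proportion import proportions_ztest

conversions = [420, 465]       # variant A, variant B (made-up counts)
visitors = [10_000, 10_050]

z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")

# A small p-value suggests the difference in conversion rate is unlikely to be
# chance alone; in practice you would also check effect size, test duration,
# and consistency across customer segments before acting on the result.
```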
Leeds, England, United Kingdom Hybrid / WFH Options
Halton Housing
suites, and third-party data partnerships. Align roadmaps with Cox Automotive’s pan-European growth goals and client needs. Data Platform: Oversee the architecture of our data lakehouse (Snowflake, Databricks) and real-time pipelines (Kafka, AWS Kinesis). Salesforce Integrations: Ensure seamless workflows across Financial Services Cloud, Marketing Cloud, and CPQ. Decision Engine: Prioritise AI/ML use cases (e.g. …)
to expand your cloud/platform engineering capabilities. Experience working with Big Data. Experience of data storage technologies: Delta Lake, Iceberg, Hudi. Proven knowledge and understanding of Apache Spark, Databricks or Hadoop. Ability to take business requirements and translate these into tech specifications. Competence in evaluating and selecting development tools and technologies. Sound like the role you have been looking for? …
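As an aside on why table formats such as Delta Lake come up here, the sketch below shows two of the features they add on top of plain object storage: MERGE-style upserts and time travel. It assumes a Databricks or otherwise Delta-enabled Spark session; the table path, columns and values are hypothetical.

```python
# Upsert an incremental batch into a Delta table, then read an older version.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical incremental batch of status updates.
updates = spark.createDataFrame(
    [("P-100", "DELIVERED"), ("P-200", "IN_TRANSIT")],
    ["parcel_id", "status"],
)

# ACID upsert (MERGE) into an existing Delta table.
target = DeltaTable.forPath(spark, "/mnt/silver/parcel_status")
(
    target.alias("t")
    .merge(updates.alias("u"), "t.parcel_id = u.parcel_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)

# Time travel: re-read the table as it looked at an earlier version.
previous = (
    spark.read.format("delta")
    .option("versionAsOf", 0)
    .load("/mnt/silver/parcel_status")
)
print(previous.count())
```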
Leeds, England, United Kingdom Hybrid / WFH Options
Oakland Everything Data
of expertise: Data Governance/Data Management; Analytics & Insight; AI; Azure expertise (although we are a tech-agnostic consultancy, this role is initially working with Azure-based clients); Snowflake; Databricks. We operate a hybrid working arrangement from our Head Office in Leeds, designed to balance your commitments and preferences with our client’s needs. Whilst we support flexible working, we …
Leeds, West Yorkshire, Yorkshire and the Humber, United Kingdom Hybrid / WFH Options
VIQU Limited
to combine technical leadership with hands-on development, working with cutting-edge technologies to shape the future of logistics. What You'll Do: Lead the technical direction of our Databricks Common Data Platform. Drive hands-on development with high-quality Python and SQL code. Mentor and coach a distributed team of data engineers. Champion CI/CD and automated testing. … Work closely with the Head of Data Engineering to align with strategic goals. Must-Have Experience: 8+ years in data engineering, including leadership experience. Expert-level knowledge of the Databricks platform. Advanced Python development for logistics and delivery optimisation. Mastery of SQL for large-scale, nationwide operations. Strong understanding of data modelling for parcel tracking systems. Experience with both streaming …
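Since roles like this call for both streaming and batch Spark workloads, here is a hedged sketch of the streaming half: Databricks Auto Loader incrementally ingesting event files into a Delta table. The source path, schema and checkpoint locations, and the table name are hypothetical, and the `cloudFiles` source is specific to Databricks.

```python
# Incrementally ingest new parcel-event files and append them to a Delta table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

events = (
    spark.readStream
    .format("cloudFiles")                          # Databricks Auto Loader
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/_schemas/parcel_events")
    .load("/mnt/landing/parcel_events/")
)

query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/_checkpoints/parcel_events")
    .trigger(availableNow=True)                    # run as an incremental batch job
    .toTable("bronze.parcel_events")
)
query.awaitTermination()
```

Switching the trigger to a processing-time interval would run the same pipeline continuously rather than as a scheduled incremental job.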
closely with both business and technical teams – translating complex requirements clearly to both. Experience working with AWS (preferred) or Azure cloud environments. Hands-on experience or strong understanding of Databricks (or other Spark-based platforms). Familiarity with modern data modelling approaches and pipeline design. Confident leading conversations across technical teams and stakeholders – strong communicator and collaborator. Knowledge of data …
Leeds, West Yorkshire, Yorkshire and the Humber, United Kingdom
Peregrine
Join us as a Lead Automation Engineer for our client. At Peregrine, we’re always seeking Specialist Talent who have the ideal mix of skills, experience, and attitude to place with our vast array of clients. From Business Analysts in …
Leeds, England, United Kingdom Hybrid / WFH Options
Arsenal FC
This role involves shaping data flow across our ecosystem, primarily using Azure and Databricks, integrated with platforms like Tableau, Cortex, and MS Dynamics. You will leverage GraphQL for real-time data access, ensuring data is well-structured, secure, and accessible for actionable insights. This … driven by courage, community support, and integrity. Your day-to-day: Manage data architecture strategy and design - Oversee the implementation of scalable data solutions within our Azure environment, including Databricks and GraphQL for real-time data access. Collaborate across teams - Act as the main liaison between business, product owners, and technical teams to ensure cohesive, value-driven solutions. Lead and …
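As a simple illustration of the GraphQL access pattern mentioned above, the snippet below shows how a Python consumer might query such an endpoint. The URL, query fields, variables and token are hypothetical and do not reflect the client's actual API.

```python
# Query a (hypothetical) GraphQL endpoint for real-time data from Python.
import requests

GRAPHQL_URL = "https://data-platform.example.com/graphql"   # hypothetical endpoint

query = """
query FixtureAttendance($fixtureId: ID!) {
  fixture(id: $fixtureId) {
    kickoff
    attendance
  }
}
"""

response = requests.post(
    GRAPHQL_URL,
    json={"query": query, "variables": {"fixtureId": "2024-08-17-HOME"}},
    headers={"Authorization": "Bearer <token>"},             # hypothetical auth
    timeout=10,
)
response.raise_for_status()
print(response.json()["data"]["fixture"])
```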
architecture in a complex organisation. Expertise in Kimball and Data Vault 2.0 methodologies. Strong grasp of data modelling, metadata, and governance. Hands-on experience with modern data platforms (Databricks, Delta Lake, Unity Catalog, Azure). Ability to define and drive architecture principles, patterns, and best practices. Excellent communication and stakeholder management skills. Retail industry experience is a bonus but not …
stakeholders and business users. Key Skills: Proven background in platform selection, configuration, and onboarding (ideally across AWS, cloud, or on-prem solutions). Hands-on familiarity with tools like Databricks, Python, and SaaS analytics environments. Experience working within a CTO or principal engineering team to translate complex technical concepts into language understood by functional users. Strong stakeholder management - able to …
source of truth). Location: Remote. Duration: 6 months. Start Date: ASAP. Rate: £400 per day (inside IR35). We’re looking for an experienced Project Manager to lead a high-impact Databricks migration project, consolidating data from multiple platforms into a single source of truth. This is a critical programme for our business, and we need someone who can hit the ground running. What you’ll be doing: Leading the end-to-end migration of a number of data platforms to our new Databricks platform, working with multiple data teams and stakeholders. Managing timelines, risks, and dependencies across different data sources and platforms. Ensuring clear communication and alignment across tech, data, and business teams. Delivering a robust, scalable single data environment that underpins strategic decision-making. We’re looking for: Proven experience delivering data migration or modernisation projects, ideally involving Databricks or similar cloud-based platforms. Strong project management skills – you’re comfortable owning delivery and keeping complex programmes on track. Ability to engage technical and non-technical stakeholders with confidence. A hands-on, delivery-focused approach – this role is about getting …