Leeds, West Yorkshire, Yorkshire and the Humber, United Kingdom Hybrid / WFH Options
KPMG UK
Ability to use a wide variety of open-source technologies Knowledge and experience using at least one Data Platform Technology such as Quantexa, Palantir or Databricks Knowledge of test automation frameworks and ability to automate testing within the pipeline To discuss this or wider Technology roles with our recruitment team, all…
senior capacity Demonstrable experience designing and managing metadata-driven frameworks using Azure Data Factory and Azure Databricks. Competence and hands-on experience in managing Databricks environments and developing lakehouse architectures with a focus on automation, performance tuning, cost optimisation, and system reliability. Proven proficiency in programming languages such as Python…
we'll help you get there. What's the Opportunity? We're recruiting for entry-level roles in: Junior Data Engineering - work with Azure, Databricks, Python, and Power BI Junior DevOps Engineering - get hands-on with CI/CD, cloud platforms, automation tools, and modern engineering practices How It Works…
tools like GitHub Actions to automate deployment workflows. Working knowledge of many Azure Data Services and associated applications, including (but not limited to) Azure Databricks, Azure Data Factory, Azure Monitor, Azure Data Lake Storage, Azure Identity Management. Demonstrable experience with monitoring, alerting, and performance tuning in cloud environments. Ability to…
Leeds, West Yorkshire, Yorkshire and the Humber, United Kingdom Hybrid / WFH Options
Datatech Analytics
tools, especially Python, R, and SQL Experience of cloud computing (Azure, AWS or GCP) Substantial experience working in cloud-based tools like Databricks for Machine Learning, Azure Machine Learning and Azure AI Foundry as well as experience helping others to use them. If this role sounds like the…
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
Context Recruitment
experience in delivering Big Data architecture and design solutions, specifically Azure Extensive experience with Azure data services (e.g., Azure SQL, Azure Data Lake, Azure Databricks). Strong understanding of data architecture principles and best practices. Proficient in data integration and ETL processes. Excellent problem-solving skills and ability to work…
Leeds, West Yorkshire, Yorkshire and the Humber Hybrid / WFH Options
Viqu
roadmaps - Drive innovation in our data engineering space Must-Have Experience: - 8+ years in data engineering with team leadership experience - Expert-level knowledge of Databricks platform for logistics applications - Advanced Python development skills - Deep understanding of SQL optimization for large-scale operations - Proven experience with AWS technologies (EC2, Lambda, Fargate…)
transactional data (handling terabytes of data) is optimized for business performance. Data Architecture Expertise: Lead the architectural design of data warehouses, data lakes, and Databricks, ensuring seamless integration for AI-driven forecasting and operational analytics. GenAI Implementation: Leverage Generative AI to streamline data gathering, analysis, and presentation for business operations.
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
VIQU Limited
gather requirements, challenge assumptions, and turn needs into solutions. Background in solution design and a solid understanding of data platform strategy. Proven experience with Databricks (Azure or AWS), SQL, and Python Understanding of cloud technologies (either AWS or Azure) Experience in data modelling, pipeline optimisation, and delivering scalable solutions. Comfortable…
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
Asda Stores Ltd
deliver end-to-end data science and analytics projects aligned with business priorities. Model Development : Build, test, and deploy predictive and optimisation models using Databricks, Azure, and Python, incorporating best practices in MLOps and governance. Insight Generation : Translate complex datasets into accessible and actionable insights using Power BI and other
… Need Essential Skills & Experience: Proven experience in data science, advanced analytics, or data engineering, with a track record of delivering measurable outcomes. Proficiency in Databricks, Azure, Python, SQL, and data visualisation using open-source libraries. Experience with modern MLOps practices for deploying and maintaining models in production. Excellent communication skills
… a forward-thinking data team using modern tools in a cloud-native environment Flexible hybrid working with a supportive, inclusive culture Tools & tech: Azure, Databricks, Power BI, Python - continuously evolving Attractive benefits package: Competitive salary 7% Stakeholder Pension Plan 15% Asda Colleague Discount Free parking at Asda House, Leeds Clear…
other engineers Their Technology Stack: Microsoft Azure Docker, Kubernetes and Terraform TypeScript, Node.js C# - .NET Core Golang Python and R with Jupyter and Azure Databricks Postgres (and Timescale), Redis, document and column-based storage engines RabbitMQ and Kafka-style commit logs Dapr React, Redux, React-Router, Styled-Components, Express, tRPC…
Employment Type: Permanent
Salary: £75000 - £100000/annum Up to £100k basic + excellent benefits
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
Asda Stores Ltd
capable but commercially curious - who wants to see their work create clear, measurable value. You'll be working in a hybrid cloud environment (Azure, Databricks), applying your skills to real challenges in areas like customer behaviour, operations, and digital journeys. You'll learn from experienced colleagues, develop your craft, and
… whether it's increasing efficiency, reducing cost, or improving customer experience. Build & Apply Models : Support the development of predictive and optimisation models using Python, Databricks, and Azure. Help ensure outputs are robust, interpretable, and actionable. Enable Data-Driven Decisions : Develop dashboards and visual narratives using Power BI that translate data
… thrives in fast-moving environments with a strong sense of ownership. A numerate degree (e.g. Maths, Stats, Engineering, Computer Science). Desirable: Experience using Databricks or working in a cloud-based environment like Azure. Exposure to MLOps, version control, or productionising models. Experience working with Jira and Confluence in an…
DevOps teams to align operational strategies with technical and business requirements. Optimise operational performance and cost management for services including Azure Data Factory, Azure Databricks, Delta Lake, and Azure Data Lake Storage. Serve as the domain expert in DataOps by providing strategic guidance, mentoring colleagues, and driving continuous process improvements.
… and driving automation of data workflows within the Microsoft Azure ecosystem. Hands-on expertise with Azure Data Platform components including Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Delta Lake, Azure SQL, Purview and APIM. Proficiency in developing CI/CD data pipelines and strong programming skills in Python
… Demonstrated expertise in cost optimisation and performance tuning within an Azure-based data infrastructure. Relevant certifications such as Azure Data Engineer Associate, Azure DevOps, Databricks Data Engineer Professional, or equivalent credentials. Ability to build and manage the DataOps processes in a newly formed team which is planned to continue to…
Role requirements: The role will include: Manage and maintain Azure Data Services infrastructure, which will work alongside many other cloud-based components, such as Databricks, Purview and Cognitive/Machine Learning Services. Define SLAs and operational processes for data-related infrastructure. Support live initiatives and ensure system reliability and availability.
… tools like GitHub Actions to automate deployment workflows Robust knowledge of many Azure Data Services and associated applications, including (but not limited to) Azure Databricks, Azure Data Factory, Azure Monitor, Azure Data Lake Storage, Azure Identity Management. Demonstrable experience with monitoring, alerting, and performance tuning in complex cloud environments. Ability…
Leeds, West Yorkshire, Yorkshire and the Humber, United Kingdom
Wipro
ideas and connect the dots to build a better and a bold new future. Job Description: Responsibilities will include designing, implementing, and maintaining the Databricks platform, and providing operational support. Operational support responsibilities include platform set-up and configuration, workspace administration, resource monitoring, providing technical support to data engineering, Data
… and resolving issues. The position will also involve the management of security and changes. The position will work closely with the Team Lead, other Databricks Administrators, System Administrators, and Data Engineers/Scientists/Architects/Modelers/Analysts. Responsibilities: Responsible for the administration, configuration, and optimization of the Databricks
… and data engineering activities within the organization. Collaborate with the data engineering team to ingest, transform, and orchestrate data. Manage privileges over the entire Databricks account, as well as at the workspace level, Unity Catalog level and SQL warehouse level. Create workspaces, configure cloud resources, view usage data, and manage…
including creating demos and proofs of concept. Ensure performance, security, and scalability of data solutions. Key Requirements of the Data Solutions Lead: Expert in Databricks (experience with Databricks on AWS or Azure). Strong Python and SQL skills for logistics and delivery optimisation. Background in solution design and business engagement.
worked in a managerial role focused on mentoring, coaching, reviewing code, and standard setting. The role will focus on the development of the client's Databricks platform (AWS is preferred but open to Azure; GCP experience also considered), utilising Python and SQL, contributing to CI/CD pipelines, strategy development, cost optimisation and
… coach the team. Manage the adoption of automated CI/CD pipelines. Implement a new delivery roadmap. Contribute to the development of a new Databricks system in AWS (AWS experience is preferred but they are open to managers with Azure experience). Cost optimisation. Establish data governance frameworks for secure
… a hands-on data engineering role, with over a year's recent experience in a managerial role, coaching similar-sized teams. Deep knowledge of the Databricks platform. Hands-on Python development experience. SQL optimisation. Experience with large-scale data pipeline optimisation. Experience with Streaming and Batch Spark workloads. Strong people management…
building and maintaining the infrastructure to support the full data science lifecycle from data ingestion to model deployment, monitoring, and upgrades within Azure and Databricks environments. The engineer will work closely with data scientists in a collaborative, cross-functional setting, helping transition models from research into production. Key Responsibilities: Own
… smooth productionization of models. Write clean, production-ready Python code. Apply software engineering best practices, CI/CD, TDD. Required Skills: Proficiency in Python, Databricks, and Azure. Experience with deployment tools (e.g., AKS, managed endpoints). Strong software engineering background (CI/CD, VCS, TDD). Ability to integrate ML…
months Start Date: ASAP Rate: £400 per day (inside IR35) We’re looking for an experienced Project Manager to lead a high-impact Databricks migration project, consolidating data from multiple platforms into a single source of truth. This is a critical programme for our business, and we need someone who
… hit the ground running. What you’ll be doing: Leading the end-to-end migration of a number of data platforms to our new Databricks platform, working with multiple data teams and stakeholders. Managing timelines, risks, and dependencies across different data sources and platforms. Ensuring clear communication and alignment across
… robust, scalable single data environment that underpins strategic decision-making. We’re looking for: Proven experience delivering data migration or modernisation projects, ideally involving Databricks or similar cloud-based platforms. Strong project management skills – you’re comfortable owning delivery and keeping complex programmes on track. Ability to engage technical and…