Solihull, West Midlands, United Kingdom
Pontoon
public sector setting. Strong programming skills: Python, R, SQL, and ideally knowledge of Scala, Java, or C++. Deep familiarity with the Microsoft technology stack - including Azure Data Factory, Synapse, Databricks, Power BI, Azure ML. Sound understanding of machine learning methods (KNN, Naïve Bayes, SVM, Decision Forests). Solid statistical and mathematical foundations (regression, distributions, linear algebra, multivariable calculus). Excellent …
Rochdale, Greater Manchester, North West, United Kingdom Hybrid / WFH Options
Footasylum Ltd
share best practice. Within the teams we recognise individual skillsets and encourage knowledge sharing sessions and self-development. About You Experience with finance/financial systems and concepts Azure Databricks Azure Data Factory Excellent SQL skills Good Python/Spark/PySpark skills Experience of Kimball Methodology and star schemas (dimensional model). Experience of working with enterprise data warehouse …
and Finance. Develop target-state architectures and data strategies in line with business needs. Create and manage conceptual, logical, and physical data models. Design and implement Lakehouse Architectures using Databricks and other modern data platforms. Ensure robust data governance, integration, and quality across systems. Collaborate with IT and business stakeholders to deliver scalable, cloud-based solutions. What We're Looking …
and hands-on experience with Spark. Experience building, maintaining, and debugging DBT pipelines. Strong proficiency in developing, monitoring, and debugging ETL jobs. Deep understanding of SQL and experience with Databricks, Snowflake, BigQuery, Azure, Hadoop, or CDP environments. Hands-on technical support experience, including escalation management and adherence to SLAs. Familiarity with CI/CD technologies and version control systems like …
Do Develop and maintain BI solutions from end-to-end, collaborating with scientists and analysts. Build automated data models and pipelines for large, complex datasets, with a focus on Databricks in Azure. Drive data standardization and best practices to support a strong data culture. What You'll Bring Proven expertise in Power BI, DAX, Power Query (M), Power Apps, and …
Cambridge, Cambridgeshire, East Anglia, United Kingdom
InfinityQuest Ltd
Job Description Agent Design & Delivery: Architect and implement production-ready AI agents that integrate with enterprise platforms (SAP, Salesforce, ServiceNow, CI/CD, Databricks), with human-in-the-loop controls, observability, and auditability. Hybrid Skillset: Strong grounding in traditional machine learning and data workflows (forecasting, scoring models, risk detection, NLP, etc.) alongside Gen AI/LLM-based architectures. Able to …
growth & improvement. Experience of using digital analytics tools such as Google Analytics (or equivalent). A minimum of intermediate SQL ability with experience of using tools such as Google BigQuery, Databricks (or equivalent). Use of visualisation tools (Looker Studio/Power BI/Tableau). Strong stakeholder engagement and ability to work collaboratively across multiple teams. Proactive nature, experience being …
DATABRICKS ENGINEER 6-MONTH CONTRACT £550-£600 PER DAY This role offers a great opportunity for an Azure Databricks Engineer to join a renewable energy firm based in London. You'll play a hands-on role in developing and optimising modern data lakehouse solutions on Azure, while supporting critical analytics and data delivery systems. The environment encourages technical ownership, collaboration
… You'll be responsible for data pipeline development, resource deployment, and ongoing optimisation of cloud-native systems. Your responsibilities will include: Designing and implementing scalable data lakehouse architectures using Databricks on Azure. Building efficient ETL/ELT pipelines for structured and unstructured data. Working with stakeholders to ensure high-quality, accessible data delivery. Optimising SQL workloads and data flows for
… Automating infrastructure deployment using Terraform and maintaining CI/CD practices. Supporting secure and performant data access via cloud-based networking. KEY SKILLS AND REQUIREMENTS Strong experience with Azure Databricks in production environments. Background with Azure Data Factory, Azure Functions, and Synapse Analytics. Proficient in Python and advanced SQL, including query tuning and optimisation. Hands-on experience with big data …
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
AMS CWS
the Project Manager include: Experience in large-scale cloud migration programmes. Proven experience delivering data migration or reporting projects within the financial services domain. Knowledge of Power BI, Azure Databricks, and Microsoft Fabric. Excellent stakeholder and third-party management experience. Familiarity with data governance practices and tools. Why Tesco Insurance and Money Services? Seeing your impact all around you: there …
Stevenage, Hertfordshire, South East, United Kingdom
Queen Square Recruitment Limited
building and managing Docker containers. Strong Linux Infrastructure knowledge. Solid background in Azure Infrastructure engineering. Nice to Have: Hands-on experience with Domino, Azure DevOps, Python, GitHub, Databricks. Familiarity with Agile Scrum methodologies. Why Join? This is an exciting opportunity to lead technical innovation within HPC, working on impactful projects at scale. You'll have the chance …
Principal Data Engineer - Azure Databricks (Unity Catalog) - Contract Location: Bristol - Hybrid - 2 days a week in the office Contract Length: 12 Months Day Rate: Negotiable Job Ref: J12998 A data for good organisation that is in the early stages of building a modern Analytics and Data Engineering function is looking to bring in a Principal Data Engineer to support and
… responsible for designing and implementing scalable, reusable data pipelines to support a range of analytics and AI-driven projects. The organisation is currently transitioning to Azure as well as Databricks, and the Principal Data Engineer will play a crucial role in shaping that platform, helping the data team upskill, and embedding best practices across the function. They will also lead on
… organisation, support the development of conceptual and dimensional data models, and contribute to fostering a culture of innovation and evidence-led decision-making. Experience Required: Essential experience with Azure Databricks - including Unity Catalog, Python (ideally PySpark), and SQL. Practical knowledge of modern ELT workflows, with a focus on the Extract and Load stages. Experience working across both technical and non-technical …
for creating, maintaining, and enhancing data models while supporting data architects with innovative solutions. To be successful, you will have: Data Modelling (Kimball, Data Vault, Canonical); SQL; Unity Catalog; Databricks/Spark; Event Hubs; Python (applied to data validation/modelling). The role is operating Outside IR35 paying £600-£700 per day and requires you to be onsite in Edinburgh …
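The "Python (applied to data validation/modelling)" skill in the listing above might, in practice, look like the minimal sketch below. Every field name and rule here is hypothetical, chosen only to illustrate the pattern of rule-based record validation:

```python
# Minimal data-validation sketch (hypothetical schema and rules).
# Each check appends an error message; a batch is split into valid
# rows and rejected (row, errors) pairs for downstream handling.

def validate_record(record):
    """Validate one event record against a hypothetical schema."""
    errors = []
    if not isinstance(record.get("event_id"), str) or not record["event_id"]:
        errors.append("event_id must be a non-empty string")
    if not isinstance(record.get("amount"), (int, float)) or record["amount"] < 0:
        errors.append("amount must be a non-negative number")
    if record.get("currency") not in {"GBP", "EUR", "USD"}:
        errors.append("currency must be one of GBP/EUR/USD")
    return errors

def validate_batch(records):
    """Split a batch into valid rows and (row, errors) rejects."""
    valid, rejects = [], []
    for r in records:
        errs = validate_record(r)
        if errs:
            rejects.append((r, errs))
        else:
            valid.append(r)
    return valid, rejects
```

In a Databricks/Spark setting the same rules would typically be expressed as DataFrame filters or via a library such as Great Expectations; the plain-Python version just shows the shape of the logic.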
Analytics teams, you'll help evolve our data warehouse and implement best-in-class engineering practices across Azure. Key Responsibilities Build and enhance ETL/ELT pipelines in Azure Databricks Develop facts and dimensions for financial reporting Collaborate with cross-functional teams to deliver robust data solutions Optimize data workflows for performance and cost-efficiency Implement governance and security using
… Unity Catalog Drive automation and CI/CD practices across the data platform Explore new technologies to improve data ingestion and self-service Essential Skills Azure Databricks: Expert in Spark (SQL, PySpark), Databricks Workflows Data Pipeline Design: Proven experience in scalable ETL/ELT development Azure Services: Data Lake, Blob Storage, Synapse Data Governance: Unity Catalog, access control, metadata management …
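The "facts and dimensions" work described above amounts to splitting transactional rows into a dimension table with surrogate keys and a fact table that references them. Here is a minimal plain-Python sketch of that split; on Databricks this would normally be PySpark over Delta tables, and all column names are hypothetical:

```python
def build_star_schema(transactions):
    """Derive a customer dimension and a sales fact table from raw rows.

    Assigns integer surrogate keys to distinct customers, then rewrites
    each transaction to reference the dimension by key (hypothetical columns).
    """
    dim_lookup = {}   # natural key -> surrogate key
    dim_rows = []
    fact_rows = []
    for t in transactions:
        nk = t["customer_id"]
        if nk not in dim_lookup:
            sk = len(dim_lookup) + 1
            dim_lookup[nk] = sk
            dim_rows.append({"customer_sk": sk,
                             "customer_id": nk,
                             "customer_name": t["customer_name"]})
        fact_rows.append({"customer_sk": dim_lookup[nk],
                          "order_date": t["order_date"],
                          "amount": t["amount"]})
    return dim_rows, fact_rows
```

The design point is the Kimball one mentioned elsewhere on this page: descriptive attributes live once in the dimension, while the fact table stays narrow and references them by key.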
silver layers). What you'll need to succeed Proficiency in the design and implementation of modern data architectures - Microsoft Fabric/Data Factory and modern data warehouse technologies - Databricks a must! Experience in data management disciplines, including data integration, modelling, optimisation, data quality and Master Data Management. Experience with database technologies such as RDBMS (SQL Server, Oracle) or NoSQL …
Northampton, Northamptonshire, United Kingdom Hybrid / WFH Options
Experis
are seeking a highly skilled and communicative Technical Data Engineer to join our team. The ideal candidate will have hands-on experience with modern data platforms and tools including Databricks, DBT, and Snowflake. You will play a key role in designing, developing, and optimizing data pipelines and analytics solutions that drive business insights and decision-making. Key Responsibilities: Design, build
… and maintain scalable data pipelines using Databricks and DBT. Develop and optimize data models and transformations in Snowflake. Collaborate with cross-functional teams to understand data requirements and deliver robust solutions. Ensure data quality, integrity, and governance across platforms. Troubleshoot and resolve data-related issues in a timely manner. Document processes, workflows, and technical specifications clearly and effectively. Required Skills
… Experience: Proven hands-on experience with: Databricks (Spark, Delta Lake, notebooks) DBT (data modeling, transformations, testing) Snowflake (SQL, performance tuning, data warehousing) Strong understanding of data engineering principles and best practices. Excellent communication skills with the ability to explain technical concepts to non-technical stakeholders. Experience working in agile environments and collaborating with data analysts, scientists, and business teams. All …
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
to a modern Azure-based architecture, with a focus on performance, scalability, and reliability. Responsibilities Design and implement robust data migration pipelines using Azure Data Factory, Synapse Analytics, and Databricks Develop scalable ETL processes using PySpark and Python Collaborate with stakeholders to understand legacy data structures and ensure accurate mapping and transformation Ensure data quality, governance, and performance throughout the
… migration lifecycle Document technical processes and support knowledge transfer to internal teams Required Skills Strong hands-on experience with Azure Data Factory, Synapse, Databricks, PySpark, Python, and SQL Proven track record in delivering data migration projects within Azure environments Ability to work independently and communicate effectively with technical and non-technical stakeholders Previous experience in consultancy or client-facing roles …
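The migration duties above (mapping legacy structures, transforming, enforcing data quality) follow a familiar extract/transform/load shape. A plain-Python sketch under a hypothetical legacy-to-target column mapping; a real implementation would typically run as PySpark orchestrated by Azure Data Factory:

```python
# Hypothetical legacy -> target column mapping for one migration step.
COLUMN_MAP = {"CUST_NO": "customer_id", "CUST_NM": "customer_name", "BAL_AMT": "balance"}

def transform(legacy_rows):
    """Rename mapped legacy columns, drop unmapped ones, coerce types."""
    out = []
    for row in legacy_rows:
        mapped = {COLUMN_MAP[k]: v for k, v in row.items() if k in COLUMN_MAP}
        mapped["balance"] = float(mapped["balance"])  # legacy system stores text
        out.append(mapped)
    return out

def quality_check(rows):
    """Fail fast if required fields are missing - a stand-in for real DQ rules."""
    required = {"customer_id", "customer_name", "balance"}
    bad = [r for r in rows if not required <= r.keys()]
    if bad:
        raise ValueError(f"{len(bad)} rows failed quality checks")
    return rows
```

Keeping the mapping as data (rather than hard-coded renames) makes the legacy-structure analysis reviewable by stakeholders, which is the collaboration point the listing emphasises.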
business value. RESPONSIBILITIES Develop, test, and deploy custom web applications and dashboards, connecting to multiple enterprise data sources (e.g., Microsoft Fabric Data Lake, Power BI, Qlik Sense, SQL Server, Airtable, Databricks) Build advanced data visualisations using D3.js and other JavaScript libraries to create highly interactive, responsive interfaces Integrate applications with APIs to pull, push, and process data in real-time Work
… managing applications in Azure and/or AWS environments Data Platform Familiarity: Knowledge of connecting to data platforms such as Microsoft Fabric, Power BI, Qlik Sense, SQL Server, Airtable, and Databricks Performance Mindset: Commitment to building scalable, secure, and optimised web applications QUALIFICATIONS Proven experience developing and deploying web applications, ideally in a consulting or enterprise environment Proficiency in HTML, CSS
… integration (REST, GraphQL) Familiarity with database technologies and query languages (SQL, NoSQL) Experience working with data sources such as Microsoft Fabric’s Data Lake, Power BI, Qlik Sense, Airtable, and Databricks Good grasp of DevOps practices, including CI/CD pipelines and code management workflows Cloud deployment experience with Azure and/or AWS Some background in UI/UX design …
We're undertaking a fast-paced data transformation into Databricks at E.ON Next using best-practice data governance and architectural principles, and we are growing our data engineering capability within the Data Team. As part of our journey we're looking for a data architect to help bring our vision to life to design and review data models, iterate on
… practice. A strong attention to detail and a curiosity about the data you will be working with. A strong understanding of Linux-based tooling and concepts. Strong experience with Databricks and/or Spark. Experienced with data governance, data cataloguing, data quality principles, and associated tools. Understanding of data extraction, joining, and aggregation tasks, especially on big and real-time …
Senior Data Engineer ESM Contract: 6 months Rate: £500-£550 per day (inside IR35) Location: Glasgow (3 days onsite per week) Start date: ASAP Our client, a large global consultancy, is seeking a Senior Data Engineer to join an Enterprise …
Welwyn Garden City, Hertfordshire, South East, United Kingdom
La Fosse
Build automated discovery workflows to enable self-service analytics. Champion adoption of governed data assets across business and technical teams. Technical Stewardship Configure cross-platform metadata synchronisation between Alation, Databricks (Unity Catalog), and downstream systems. Implement schema registries, data contracts, and API-first data governance. Ensure compliance with GDPR, CCPA, SOX, and industry regulations. What We're Looking For Technical
… master data management. Strong track record of curating and synchronising metadata across distributed platforms. Platform & Tool Expertise Alation power user: Expert in Stewardship Workbench, OCF connectors, and governance workflows. Databricks ecosystem: Unity Catalog administration, Delta Lake governance, lakehouse patterns, Genie AI/BI spaces. Hands-on with data quality tools such as Great Expectations, Ataccama, Monte Carlo (or similar). …
stakeholder leadership, ideal for someone who can drive clarity and structure across divisional boundaries. Key Responsibilities Lead the implementation and integration of Microsoft Purview across Azure Data Factory and Databricks Unity Catalog Define and promote data products within business domains Educate data owners and business users on cataloging and governance best practices Navigate domains with unclear data ownership, escalating where
… business alignment Collaborate across divisional teams with strong stakeholder management Experience Required 10+ years in data governance, metadata management, and cataloging Strong technical understanding of Microsoft Purview, Azure, and Databricks Familiarity with tools like Collibra, Alation, or similar is a plus Background in data engineering, architecture, or technical project management Fluent in English; Finnish or Swedish is a major advantage …