Manchester, Lancashire, United Kingdom Hybrid / WFH Options
MAG (Airports Group)
models, exploring customer behaviours, and supporting personalisation strategies - with opportunities to work on NLP projects too. You'll also take ownership of projects, support our data science tooling (including Databricks and AWS), and collaborate closely with experts in Data Engineering, BI, Analytics, and Data Governance to solve problems and create scalable solutions that make a tangible difference. What's in … and continuously develop your skills in a collaborative, hybrid working environment. About you Role Responsibilities: Design, build, and maintain scalable machine learning pipelines using Python and PySpark. Work within Databricks to develop, schedule, and monitor data workflows, utilising Databricks Asset Bundles. Collaborate with data analysts, engineers, and other scientists to deliver clean, reliable, and well-documented datasets. Develop and maintain … skills with a problem-solving mindset. Strong analytical and communication skills, with the ability to tailor complex insights for both technical and non-technical audiences. Hands-on experience with Databricks for deploying, monitoring, and maintaining machine learning pipelines. Experience working with AWS data services and architectures. Good understanding of code versioning and CI/CD tools and practices. Familiarity with …
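At its core, a machine learning pipeline of the kind this role describes is an ordered chain of data transformations. A minimal plain-Python sketch follows; the stage names, fields, and the trivial "model" are hypothetical stand-ins, and in Databricks each stage would typically be a PySpark or MLflow step:

```python
# Minimal pipeline runner: each stage is a function from records to
# records, executed in order. All names and fields are illustrative.

def clean(records):
    # Drop rows with missing spend values
    return [r for r in records if r.get("spend") is not None]

def add_features(records):
    # Derive a simple boolean feature from spend
    return [{**r, "high_value": r["spend"] > 100} for r in records]

def score(records):
    # Stand-in for a real model: segment customers for personalisation
    return [{**r, "segment": "priority" if r["high_value"] else "standard"}
            for r in records]

def run_pipeline(records, stages=(clean, add_features, score)):
    for stage in stages:
        records = stage(records)
    return records

out = run_pipeline([{"id": 1, "spend": 150.0}, {"id": 2, "spend": None}])
```

Keeping each stage a pure function makes the pipeline easy to schedule, monitor, and test stage-by-stage, which is the property workflow tools build on.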
role Support, improve and deliver high quality BI reporting and data transformation solutions as part of the Central BI Team using tools such as Azure Data Factory, Microsoft Fabric, Databricks and Power BI. Actively participate within the central Data & BI Team to help develop other team members and support the delivery of the team objectives. Perform operational support activities to … ensure the Central BI solution meets the business SLAs, including occasional OOH activities. Key responsibilities Data & Business Intelligence Team Enhance existing Data Lakehouse with new Databricks pipelines to ingest new data sources. Build out data transformations to support new Facts and Dimensions within existing Data Lakehouse. Conduct data analysis and then design and deliver the subsequent data extraction and … undertake any other reasonable duties compatible with your experience and competencies. This description may be varied from time to time to reflect changing business requirements. Preferred Skills and Experience Databricks Azure Data Factory Data Lakehouse Medallion architecture Microsoft Azure T-SQL Development (MS SQL Server 2005 onwards) Python, PySpark Experience of the following systems would also be advantageous: Azure DevOps …
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
ASDA
for someone who's technically capable but commercially curious - who wants to see their work create clear, measurable value. You'll be working in a hybrid cloud environment (Azure, Databricks), applying your skills to real challenges in areas like customer behaviour, operations, and digital journeys. You'll learn from experienced colleagues, develop your craft, and help embed analytics into everyday … science solutions that drive outcomes - whether it's increasing efficiency, reducing cost, or improving customer experience. Build & Apply Models: Support the development of predictive and optimisation models using Python, Databricks, and Azure. Help ensure outputs are robust, interpretable, and actionable. Enable Data-Driven Decisions: Develop dashboards and visual narratives using Power BI that translate data into insight business users can … technical audiences. Self-starter who thrives in fast-moving environments with a strong sense of ownership. A numerate degree (e.g. Maths, Stats, Engineering, Computer Science). Desirable: Experience using Databricks or working in a cloud-based environment like Azure. Exposure to MLOps, version control, or productionising models. Experience working with Jira and Confluence in an Agile environment is advantageous. Streamlit …
understand differences, ensuring accuracy, and improving data validation processes. Handling data quality and migration projects to enhance system performance and integrity. Supporting the deployment of machine learning models using Databricks and PySpark. Managing and optimising cross-functional ETL processes across 80 databases daily. Working within a secure private cloud environment that includes Azure, SQL 2016, and SQL Server. About You … optimisation, indexing strategies, and troubleshooting complex data environments. A background in managing and working within complex data infrastructures, ensuring reliability and efficiency. Proficiency in cloud-based data tools, including Databricks, Data Factory, and Fabric, to streamline data engineering processes. Advanced skills in SQL, Python, and MongoDB, enabling efficient querying, scripting, and automation. This role is ideal for someone passionate about …
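Data validation of the kind this listing describes usually boils down to running a set of declarative rules over incoming rows and separating passes from failures. A minimal sketch, with hypothetical field names and rules:

```python
# Rule-based row validation: each rule maps a field name to a predicate.
# Field names and thresholds are illustrative assumptions.

RULES = {
    "order_id": lambda v: isinstance(v, str) and v != "",
    "quantity": lambda v: isinstance(v, int) and v > 0,
    "price":    lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(rows):
    """Split rows into (valid, failures); each failure records the row
    index and the list of rules it broke."""
    valid, failures = [], []
    for i, row in enumerate(rows):
        broken = [f for f, rule in RULES.items() if not rule(row.get(f))]
        if broken:
            failures.append((i, broken))
        else:
            valid.append(row)
    return valid, failures

rows = [
    {"order_id": "A1", "quantity": 2, "price": 9.99},
    {"order_id": "", "quantity": 0, "price": -1},
]
ok, bad = validate(rows)
```

Recording which rule each row broke, rather than just rejecting it, is what makes a validation report actionable during migrations.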
Exposure across the full SDLC, including testing and deployment. Expertise in Microsoft Azure is mandatory, including components like Azure Data Factory, Azure Data Lake Storage, Azure SQL, Azure Databricks, HD Insights, ML Service etc. Good knowledge of Python and Spark is required. Experience in ETL & ELT. Good understanding of one scripting language. Good understanding of how to enable analytics … a diverse and challenging set of customers to success. Good understanding of the CPG (Consumer Packaged Goods) domain is preferred. Skills: Data Ops, ML Ops, deep expertise in Azure Databricks, ETL frameworks. Expertise in Microsoft Azure is mandatory, including components like Azure Data Factory, Azure Data Lake Storage, Azure SQL, Azure Databricks, HD Insights, ML Service etc. Fractal provides equal …
to enact step-change operational efficiency and maximize business value by confidently utilizing trustworthy data. What are we looking for? Great experience as a Data Engineer Experience with Spark, Databricks, or similar data processing tools. Proficiency in working with the cloud environment and various software, including SQL Server, Hadoop, and NoSQL databases. Proficiency in Python (or similar), SQL and … Spark. Proven ability to develop data pipelines (ETL/ELT). Strong inclination to learn and adapt to new technologies and languages. Strong understanding and experience in working with Databricks Delta Lake. Proficiency in Microsoft Azure cloud technologies. What will be your key responsibilities? Collaborate in hands-on development … relevant technologies to create and maintain data assets and reports for business insights. Assist in engineering and managing data models and pipelines within a cloud environment, utilizing technologies like Databricks, Spark, Delta Lake, and SQL. Contribute to the maintenance and enhancement of our progressive tech stack, which includes Python, PySpark, Logic Apps, Azure Functions, ADLS, Django, and ReactJs. Support the …
data solutions using Microsoft Fabric and Azure services. Architect modern, scalable, cloud-native data platforms that make the most of tools like Azure Synapse Analytics, Azure Data Factory, Azure Databricks, Power BI and others. Oversee implementation and delivery, ensuring performance, integration, and resilience are built in from the start. Develop robust data models and pipelines, supporting both real-time and … processes. Skills: 7–10 years of hands-on experience in data architecture and engineering, focused on Microsoft Azure. In-depth technical expertise in Microsoft Fabric, Synapse Analytics, Data Factory, Databricks and Power BI. Proven ability to design and implement data solutions in complex environments. Strong understanding of data modelling, ETL/ELT frameworks, and query optimisation (especially SQL). Experience …
Manchester, England, United Kingdom Hybrid / WFH Options
MAG
and this role will have a big say in what we build next. You'll be responsible for designing and building robust, scalable data pipelines using PySpark, SQL and Databricks - enabling our analytics, BI and data science colleagues to unlock real value across the business. This is a brilliant opportunity for someone who's passionate about data quality, modern engineering … role: Essential experience: 2-5 years in data engineering or a related field Strong PySpark and advanced SQL skills Practical experience building and maintaining ETL/ELT pipelines in Databricks Familiarity with CI/CD pipelines and version control practices Nice to have: Experience using Databricks Asset Bundles (DAB) Working knowledge of GCP and/or Azure in multi-cloud …
hybrid role based in the London area. As Data Analyst, you will be responsible for delivering data-driven insights across a wide range of datasets. Using tools such as Databricks, Tableau, Looker Studio, Amplitude, and DBT, you will extract, transform, and analyse data to support key business functions. You will collaborate closely with stakeholders to understand their data needs, generate … with forecasting, target setting, and promotional analysis. Data Governance & Optimisation Conduct QA checks for data accuracy. Improve data architecture and integrate new data sources. Apply advanced analytics using SQL, Databricks, and other tools. What are we looking for? 2+ years’ experience in a similar role. Strong analytical mindset with a passion for uncovering insights. Skilled in SQL and data visualisation …
this role, you’ll be instrumental in supporting the Head of Data in building and deploying a fit-for-purpose data quality management capability underpinned by a modern data stack (Azure Databricks, ADF and Power BI), ensuring that data is reliable and trustworthy, then extracting insights from it to improve operations and optimise resources. Drive requirements for data quality measurement Build reports … dashboards for data quality and other business problems as per business priorities using Power BI and Databricks dashboards Create and maintain the data quality tracker to document rule planning and implementation. Deliver continuous improvements of the data quality solution based upon feedback. Investigation and analysis of data issues related to quality, lineage, controls, and authoritative source identification. Execution of data cleansing … tasks to prepare data for analysis. Documentation of data quality findings and recommendations for improvement. Work with Data Architecture & Engineering to design and build a data quality solution utilising the Azure Databricks stack. Take ownership of design and work with data architecture and engineering to build data pipelines to automate data movement and processing. Manage and mitigate risks through assessment, in …
data requirements. Attention to detail and a strong commitment to data integrity. Excellent problem-solving skills and ability to work in a fast-paced environment. Preferred Qualifications Experience with Databricks and Databricks Unity Catalog. Familiarity with dbt and Airflow. Experience with data quality frameworks. Understanding of ML requirements and experience working with ML teams. Experience in robotics or a related …
Accountabilities: Define the enterprise data architecture vision, target state, and guiding principles, aligned with business priorities and regulatory frameworks. Lead architecture for enterprise data platforms such as Azure Synapse, Databricks, Power BI, and Informatica. Establish enterprise-wide standards for master data, metadata, lineage, and data stewardship. Collaborate with business and domain architects to identify and support key data domains. Provide … Experience Essential: Significant experience in enterprise architecture with a strong focus on data, information, or analytics. Proven hands-on expertise with data platforms such as Azure Data Lake, Synapse, Databricks, Power BI, etc. Deep knowledge of data governance, MDM, metadata management, and data quality frameworks. Understanding of data protection and privacy regulations (e.g., GDPR, CCPA). Track record of developing …
skills with both clients and technical staff • Desired Skills • Proficiency in Python and Scala • Experience using Spark and Hive • Experience with Qlik or other data visualization administration • Experience completing Databricks development and/or administrative tasks • Familiarity with some of these tools: DB2, Oracle, SAP, Postgres, Elastic Search, Glacier, Cassandra, DynamoDB, Hadoop, Splunk, SAP HANA, Databricks • Experience working with federal …
oriented languages (e.g., Python, PySpark) and frameworks. Expertise in relational and dimensional modeling, including big data technologies. Proficiency in Microsoft Azure components like Azure Data Factory, Data Lake, SQL, Databricks, HD Insights, ML Service. Good knowledge of Python and Spark. Experience in ETL & ELT processes. Understanding of cloud analytics and ML Ops. Azure Infrastructure and Azure DevOps experience is a … plus. Ability to work with global teams and contribute to architecture discussions. Skills: Data Ops, ML Ops Deep expertise in Azure Databricks, ETL frameworks Fractal is an equal opportunity employer committed to diversity and inclusion. We prohibit discrimination and harassment of any kind. If you enjoy growth and working with enthusiastic professionals, join us! Not the right fit now? Express …
and data structure design across Royal London's Enterprise Data Platform (EDP), which is being built out on Azure Databricks. The EDP is a Data Lakehouse built using the Databricks Medallion architecture and consists of a relational core 'silver' layer, with a variety of dimensional, de-normalised and relational structured data products being exposed for consumption through the 'gold' layer. … tools (Idera ER/Studio or similar). Expert-level SQL skills required. Data engineering/data pipeline experience, with hands-on experience of integration tools such as Azure Databricks Notebooks, Azure Data Factory or PySpark. Python extremely beneficial. About Royal London We're the UK's largest mutual life, pensions, and investment company, offering protection, long-term savings and …
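The Medallion (bronze/silver/gold) flow behind a Lakehouse like this can be sketched minimally in plain Python; the table shapes and quality rules below are hypothetical, and in Databricks each step would typically be a PySpark transformation over Delta tables:

```python
# Minimal Medallion-style flow: raw (bronze) -> cleaned, relational
# (silver) -> de-normalised, consumption-ready (gold). Illustrative only.

bronze = [  # raw ingested records, possibly dirty
    {"policy_id": "P1", "holder": "  Ada ", "premium": "120.50"},
    {"policy_id": "P1", "holder": "  Ada ", "premium": "120.50"},  # duplicate
    {"policy_id": "P2", "holder": "Grace", "premium": None},       # incomplete
]

def to_silver(rows):
    """Standardise types, trim strings, drop duplicates and incomplete rows."""
    seen, silver = set(), []
    for r in rows:
        if r["premium"] is None:
            continue  # quality rule: premium must be present
        if r["policy_id"] in seen:
            continue  # quality rule: one row per policy
        seen.add(r["policy_id"])
        silver.append({
            "policy_id": r["policy_id"],
            "holder": r["holder"].strip(),
            "premium": float(r["premium"]),
        })
    return silver

def to_gold(silver_rows):
    """De-normalised aggregate exposed for consumption (e.g. BI reporting)."""
    return {
        "policy_count": len(silver_rows),
        "total_premium": sum(r["premium"] for r in silver_rows),
    }

silver = to_silver(bronze)
gold = to_gold(silver)
```

The point of the layering is that quality rules are enforced once, on the way into silver, so every gold product downstream inherits clean inputs.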
of data science platforms and tools, as well as experience designing and implementing machine learning pipelines. WHAT YOU WILL BE DOING Design, implement, and maintain machine learning pipelines in Databricks and AWS that enable real-time data-driven decision-making Work closely with the Director of Data Science and cross-functional teams to identify data requirements, define data and ML … similar technical field/experience Hands-on data science expertise with code-based model development e.g. R, Python Strong knowledge of deploying end-to-end machine learning models in Databricks utilizing PySpark, MLflow and workflows Strong knowledge of data platforms and tools, including Hadoop, Spark, SQL, and NoSQL databases Communicate algorithmic solutions in a clear, understandable way. Leverage data visualization …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Snap Analytics
a thought leader by authoring blogs, presenting at webinars, and engaging in external speaking opportunities. Technical Delivery Excellence You'll design and optimise cloud-based data architectures (e.g. Snowflake, Databricks, Redshift, Google BigQuery, Azure Synapse) to support advanced analytics. You'll build and automate scalable, secure, and high-performance data pipelines handling diverse data sources. You'll work closely with … designing data architectures on platforms like AWS, Azure, or GCP. Technical Skills Extensive experience with ETL/ELT tools (e.g. Matillion, Informatica, Talend) and cloud data platforms (e.g., Snowflake, Databricks, BigQuery). Expertise in data modelling techniques (e.g., star schema, snowflake schema) and optimising models for analytics and reporting. Familiarity with version control, CI/CD pipelines, and containerisation tools …
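The star-schema modelling mentioned above centres one fact table on several dimension tables joined by surrogate keys. A minimal plain-Python sketch with hypothetical table and column names (in a warehouse these would be SQL tables):

```python
# Minimal star schema: a fact table of sales keyed to two dimension
# tables. All table and column names are illustrative.

dim_date = {1: {"date": "2024-01-01"}, 2: {"date": "2024-01-02"}}
dim_product = {10: {"name": "Widget"}, 11: {"name": "Gadget"}}

fact_sales = [  # one row per sale, referencing dimensions by key
    {"date_key": 1, "product_key": 10, "units": 3, "revenue": 30.0},
    {"date_key": 1, "product_key": 11, "units": 1, "revenue": 25.0},
    {"date_key": 2, "product_key": 10, "units": 2, "revenue": 20.0},
]

def revenue_by_product(facts, products):
    """Join fact rows to the product dimension and aggregate revenue."""
    totals = {}
    for row in facts:
        name = products[row["product_key"]]["name"]
        totals[name] = totals.get(name, 0.0) + row["revenue"]
    return totals
```

Keeping measures in the fact table and descriptive attributes in dimensions is what makes these models fast to aggregate and easy to report over.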
Haywards Heath, Sussex, United Kingdom Hybrid / WFH Options
First Central Services
and version management of large numbers of data science models (Azure DevOps). You'll support the implementation of Machine Learning Ops on the cloud (Azure & Azure ML; experience with Databricks is advantageous). You'll protect against model degradation and operational performance issues through the development and continual automated monitoring of model execution and model quality. You'll manage automatic model … and integration Basic understanding of networking concepts within Azure Familiarity with Docker and Kubernetes is advantageous Experience within financial/insurance services industry is advantageous Experience with AzureML and Databricks is advantageous Skills & Qualifications Strong understanding of Microsoft Azure (Azure ML, Azure Stream Analytics, Cognitive Services, Event Hubs, Synapse, and Data Factory) Fluency in common data science coding capabilities such …
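Protecting against model degradation, as described above, usually means comparing a model's recent quality against a baseline on every monitoring run and flagging drops beyond a tolerance. A minimal sketch; the metric, baseline, and tolerance are hypothetical choices:

```python
# Minimal automated model-quality monitoring: compare current accuracy
# against a baseline and flag degradation. Threshold is illustrative.

def accuracy(preds, actuals):
    # Fraction of predictions that match the observed outcomes
    return sum(p == a for p, a in zip(preds, actuals)) / len(actuals)

def check_degradation(baseline_acc, preds, actuals, tolerance=0.05):
    """Return (current_accuracy, degraded?) for one monitoring run."""
    current = accuracy(preds, actuals)
    return current, (baseline_acc - current) > tolerance

# One monitoring run: recent predictions vs observed outcomes
cur, degraded = check_degradation(0.90, [1, 0, 1, 1], [1, 1, 1, 0])
```

In production this check would run on a schedule, with a degradation flag triggering an alert or an automated retraining job rather than a return value.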