the support team to maintain, enhance, and ensure the reliability of our BI systems hosted on Microsoft Azure. Optimize and manage Azure data services, including Azure Data Factory, Azure Databricks, and Azure SQL Database, as well as Power BI for data analysis and visualization. Monitor and troubleshoot data pipelines to ensure seamless and efficient operations. Stay updated with advancements in … fast-paced, dynamic environment. Be open-minded, motivated, and self-organized. Nice to have: hands-on experience with a cloud platform, preferably Microsoft Azure, particularly Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Synapse Analytics, and Azure Data Lake Storage. Familiarity with programming languages such as Python, Scala, Java, or C#. Bachelor's or Master's degree in computer science …
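The pipeline monitoring this posting mentions is typically scripted against the Data Factory management API. Below is a minimal sketch using the azure-mgmt-datafactory SDK; the subscription, resource group, factory, and pipeline names are hypothetical placeholders.

```python
# Trigger an ADF pipeline run and poll it to completion.
# All resource names below are hypothetical placeholders.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "bi-platform-rg"      # hypothetical
FACTORY_NAME = "bi-data-factory"       # hypothetical
PIPELINE_NAME = "daily_sales_load"     # hypothetical

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
run = client.pipelines.create_run(RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME)

# Poll until the run leaves the in-progress states.
while True:
    pipeline_run = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id)
    if pipeline_run.status not in ("Queued", "InProgress"):
        break
    time.sleep(30)
print(f"Run {run.run_id} finished with status: {pipeline_run.status}")
```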
data products within a high-performance cloud platform
• Collaborate with cross-functional squads to solve real-world data challenges
• Design, build, and optimise scalable data pipelines using Python, Spark & Databricks
• Work on orchestration, monitoring, and performance optimisation
• Create frameworks and processes for high-quality, scalable data workflows
• Help shape and promote software engineering best practices
• Support ML/AI workflows …
Tech stack & must-have experience:
• Strong Python skills & deep knowledge of the Python data ecosystem
• Solid SQL skills and experience with data modelling best practices
• Hands-on experience with Databricks or Snowflake, ideally on AWS (open to Azure)
• Strong knowledge of Spark or PySpark
• Experience with CI/CD, Git, Jenkins (or similar tools)
• Proven ability to think about scalability …
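For context, a minimal sketch of the kind of PySpark pipeline this posting describes: read raw events, clean and aggregate them, and write a Delta table. The storage paths, schema, and table names are hypothetical.

```python
# Minimal PySpark batch pipeline: raw events -> cleaned daily aggregates.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_events_pipeline").getOrCreate()

raw = (
    spark.read.json("s3://data-lake/raw/events/")  # hypothetical bucket
    .where(F.col("event_ts").isNotNull())          # drop malformed rows
    .dropDuplicates(["event_id"])
)

daily = (
    raw.withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "event_type")
       .agg(F.count("*").alias("event_count"))
)

# Delta format keeps the output ACID-compliant and queryable from Databricks SQL.
daily.write.format("delta").mode("overwrite").partitionBy("event_date") \
     .saveAsTable("analytics.daily_event_counts")
```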
Burton-on-Trent, Staffordshire, England, United Kingdom Hybrid / WFH Options
Crimson
processing and cost-efficient solutions. A strong background in Azure data pipeline development is key for this position. Key Skills & Responsibilities: Build and manage pipelines using Azure Data Factory, Databricks, CI/CD, and Terraform. Optimise ETL processes for performance and cost-efficiency. Design scalable data models aligned with business needs. Deliver Azure data solutions for efficient data storage and … and reliability. Maintain technical documentation and lead knowledge-sharing initiatives. Deploy advanced analytics and machine learning solutions using Azure. Stay current with Azure technologies and identify areas for enhancement. Key technologies: Databricks (Unity Catalog, DLT), Data Factory, Synapse, Data Lake, Stream Analytics, Event Hubs. Strong knowledge of Python, Scala, C#, .NET. Experience with advanced SQL, T-SQL, relational databases. Azure DevOps, Terraform …
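Delta Live Tables (the DLT named above) declares a pipeline as decorated Python functions with built-in data-quality expectations. A minimal sketch, assuming a hypothetical landing path in the data lake:

```python
# Minimal DLT sketch: a bronze table ingesting raw files via Auto Loader and a
# silver table with a data-quality expectation. The source path is hypothetical.
# Note: in a DLT pipeline, `spark` is provided implicitly by the runtime.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders landed from the data lake")
def orders_bronze():
    return (
        spark.readStream.format("cloudFiles")          # Databricks Auto Loader
        .option("cloudFiles.format", "json")
        .load("abfss://landing@datalake.dfs.core.windows.net/orders/")
    )

@dlt.table(comment="Validated orders")
@dlt.expect_or_drop("valid_amount", "amount > 0")      # drop rows failing the check
def orders_silver():
    return dlt.read_stream("orders_bronze").withColumn(
        "ingested_at", F.current_timestamp()
    )
```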
infrastructure, improve data quality, and enable data-driven decision-making across the organization. Core Duties and Responsibilities: Design, build, and maintain large-scale data pipelines using Microsoft Fabric and Databricks. Develop and implement data architectures that meet business requirements and ensure data quality, security, and compliance. Collaborate with wider Product & Engineering teams to integrate data pipelines with machine learning models … and cloud computing. Skills, Capabilities and Attributes. Essential: Good experience in data engineering, with a focus on cloud-based data pipelines and architectures. Strong expertise in Microsoft Fabric and Databricks, including data pipeline development, data warehousing, and data lake management. Proficiency in Python, SQL, Scala, or Java. Experience with data processing frameworks such as Apache Spark, Apache Beam, or Azure … with Azure Synapse Analytics, Azure Data Lake Storage, or other Azure data services. Experience with agile development methodologies and version control systems such as Git. Certification in Microsoft Azure, Databricks, or other relevant technologies. What We Offer: Save For Your Future - Equiniti Pension Plan; Equiniti matches your pension contributions up to 10%. All Employee Long Term Incentive Plan (LTIP) gives …
Manchester Area, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
days in office, 3 from home), pension contribution, great opportunities for career progression, and many more. Role & Responsibilities: Design and deliver solutions using MS Fabric, ADF, ADL, Synapse, Databricks, SQL, and Python. Work closely with a variety of clients to gather requirements and deliver solutions. Be willing to engage and assist in pre-sales activities, bids, proposals, etc. Use key … in building out, developing, and training the data engineering function. What do I need to apply? Strong MS data engineering expertise (Fabric, ADF, ADL, Synapse, SQL). Expert use of Databricks. Strong Python experience. Consultancy experience. Leadership experience. My client is looking to book in first-stage interviews for next week and slots are already filling up fast. I have limited …
to hear from you. Key Responsibilities: Define and evolve the enterprise-wide data and analytics architecture, strategy, and roadmap. Lead the development of a modern data platform using Azure, Databricks, Power BI, and SAP data sources. Build robust data pipelines and integration models leveraging Azure Synapse, Data Factory, and automation tools. Ensure data governance, security, compliance, and quality across the … requirements into scalable, value-focused solutions. What You'll Bring: Proven success designing and delivering modern, cloud-based data platforms (Azure/AWS/GCP). Deep knowledge of Databricks, Power BI, and enterprise integration tools. A solid understanding of TOGAF or equivalent enterprise architecture frameworks. Hands-on experience in data warehousing, data lakes, data modelling, and ETL processes. Excellent …
the adoption and maintenance of cloud-first data solutions. Innovation and Future-Proofing: Stay up to date with modern tools and methodologies, including but not limited to Microsoft Fabric, Databricks, and Lakehouse architectures. Ensure solutions are scalable, maintainable, and aligned with evolving best practices. Contribute to shaping the future state of the organisation’s data estate. Qualifications & Experience: Bachelor’s … dashboard development and DAX formulas. Experience with Power Automate for data-driven workflows. Understanding of ETL concepts and processes. Exposure to modern data platforms such as Azure Data Lake, Databricks, or Microsoft Fabric is a bonus. Analytical Skills: Ability to understand complex data structures and derive actionable insights. Strong problem-solving ability with attention to detail. Curious and data-driven …
data ingestion into our Azure Data Lake, designing and developing sophisticated data models for analytics, to deploying Azure Logic Apps and Function Apps for process automation. With tools like Databricks, Data Factory, Azure SQL Server, and Python you will help shape a data-driven platform and develop data pipelines that convert raw data into actionable insights in line with our … supporting a collaborative and productive work environment. Develop and implement data ingestion solutions on the Azure platform, ensuring scalability, reliability, and performance, leveraging technologies such as Azure Data Factory, Databricks, Delta Lake, and Python. Design and maintain robust ETL/ELT processes to integrate data from various sources into Azure data services. Help to implement DataOps practices to streamline and … drive continuous improvement within the team. About you: You will fit right into this role if you can demonstrate advanced knowledge of cloud services, preferably the broader Fabric or Databricks suite of products, or Azure Data Lake, Azure Data Factory, Azure Logic Apps, Azure Function Apps, alongside a skillset incorporating data modelling techniques and business data requirement comprehension, documentation and …
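The Function App automation mentioned here usually amounts to reacting to files landing in the lake. A minimal sketch using the Azure Functions Python v2 programming model; the container and connection-setting names are hypothetical.

```python
# Blob-triggered Azure Function (Python v2 model): fires when a new file lands.
# Container path and connection app-setting name are hypothetical placeholders.
import logging
import azure.functions as func

app = func.FunctionApp()

@app.blob_trigger(arg_name="blob",
                  path="landing/{name}",           # hypothetical container
                  connection="DataLakeConnection")  # assumed app setting
def on_new_file(blob: func.InputStream):
    # In a real pipeline this might validate the file and then kick off an
    # Azure Data Factory pipeline or a Databricks job.
    logging.info("New file landed: %s (%d bytes)", blob.name, blob.length)
```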
Leeds, England, United Kingdom Hybrid / WFH Options
Asda
You will shape and deliver advanced analytics and visual insights that influence decisions at every level of the organisation. You’ll be working in a hybrid cloud environment (Azure, Databricks), applying your skills to real challenges in areas like customer behaviour, operations, and digital journeys. You will also mentor and manage a team of talented data professionals, helping to grow … a fast-paced, dynamic, and sometimes ambiguous work environment. Proven experience in data science, advanced analytics, or data engineering, with a track record of delivering measurable outcomes. Proficiency in Databricks, Azure, SQL, and data visualisation using open-source libraries. Excellent communication skills — able to simplify technical concepts for varied audiences. Strong problem-solving skills and a focus on value creation … forward-thinking data team using modern tools in a cloud-native environment. We offer flexible hybrid working with a supportive, inclusive culture. We use leading tools & tech: Azure, Databricks, Power BI, Python — and are continuously evolving. We offer an attractive benefits package: competitive salary, car allowance, and performance-related bonus; competitive Stakeholder Pension Plan; 15% Asda colleague discount; free …
Guildford, England, United Kingdom Hybrid / WFH Options
Person Centred Software Ltd
Do: Design and implement benchmarking methodologies using data from hundreds of care homes across clinical and operational metrics. Build scalable data science workflows using Azure Data Factory, Synapse Analytics, Databricks, and other Microsoft Azure services. Lead end-to-end development of predictive models (e.g., fall prediction, infection prevention) using Azure Machine Learning. Collaborate with developers and product managers to embed … ML) and SQL (for data manipulation). Deep expertise in supervised and unsupervised ML techniques, benchmarking design, and statistical analysis. Strong experience with the Microsoft Azure stack: AML, Data Factory, Synapse, Databricks, Blob Storage/Data Lake. Familiarity with Power BI for report automation and data-insight delivery. Knowledge of model explainability tools (e.g., SHAP, LIME) and responsible AI practices. Git proficiency …
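A minimal sketch of the predictive-modelling-plus-explainability workflow this posting describes: train a classifier and explain it with SHAP. The feature names and data extract are hypothetical; real work would run inside Azure ML pipelines with proper validation.

```python
# Hypothetical fall-risk classifier with SHAP explanations.
import pandas as pd
import shap
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

df = pd.read_parquet("resident_features.parquet")   # hypothetical extract
X = df[["age", "mobility_score", "medication_count", "prior_falls"]]
y = df["fell_within_30_days"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# SHAP values show which features drive each individual risk score,
# supporting the responsible-AI/explainability requirement above.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)
```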
Wilmslow, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
pipelines and infrastructure, who can implement processes on our modern tech stack with robust, pragmatic solutions. Responsibilities: Develop and maintain ETL/ELT data pipelines using AWS data services, Databricks, and dbt. Manage and optimize data storage solutions such as Amazon S3, Redshift, RDS, and DynamoDB. Implement and manage infrastructure-as-code (IaC) using tools like Terraform or AWS CloudFormation. … tools like Terraform or CloudFormation. Experience with workflow orchestration tools (e.g., Airflow, Dagster). Good understanding of cloud providers – AWS, Microsoft Azure, Google Cloud. Familiarity with dbt, Delta Lake, Databricks. Experience working in Agile environments with tools like Jira and Git. About Us: We are Citation. We are far from your average service provider. Our colleagues bring their brilliant selves …
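A minimal Airflow sketch of the orchestration pattern this posting describes: an extract step followed by a dbt run. The bucket, DAG id, and dbt project path are hypothetical; `schedule=` assumes Airflow 2.4+.

```python
# Hypothetical daily ELT DAG: extract to S3, then run dbt.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def extract_to_s3():
    # Placeholder for the real extract logic (API pull, DB dump, etc.).
    print("extracting source data to s3://citation-data-lake/raw/ ...")

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",      # Airflow 2.4+ parameter name
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/project && dbt run --profiles-dir .",
    )
    extract >> dbt_run
```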
London, England, United Kingdom Hybrid / WFH Options
LEGO Gruppe
business needs & improve overall data availability for the business. Partner with E2E LEGO.com Operations digital product teams to ensure high-quality data is collected and published to LEGO Nexus (Databricks) to a standard fit for purpose for downstream delivery of data products. Enable LEGO Retail-specific data understanding and champion data literacy via guidelines, training, drop-in sessions, documentation, and … a curious, solution-driven mindset. Fluent in English and open to light travel (up to 10 days/year). Nice to have: experience with CI/CD pipelines, Databricks, and Databricks Asset Bundles. You’re comfortable working in cross-cultural environments and collaborating with global teams across time zones. A good understanding of consumer retail or direct …
vision and architectural direction across entire programmes and client estates. Key Responsibilities. Senior Data Architects: Lead the design and delivery of cloud-native data solutions using modern platforms (e.g. Databricks, Snowflake, Kafka, Confluent). Architect data lakes, lakehouses, streaming pipelines, and event-driven architectures. Oversee engineering teams and collaborate with analysts and QA functions. Translate complex requirements into scalable, robust data … Core Experience: Proven track record in data architecture, either from a delivery or enterprise strategy perspective. Deep experience with cloud platforms (Azure, AWS, GCP) and modern data ecosystems (Spark, Databricks, Kafka, Snowflake). Strong understanding of Data Mesh, Data Fabric, and data-product-led approaches. Data modelling expertise (relational, dimensional) and familiarity with tools like Erwin, Sparx, Archi. Experience with ETL …
Accountabilities: Define the enterprise data architecture vision, target state, and guiding principles, aligned with business priorities and regulatory frameworks. Lead architecture for enterprise data platforms such as Azure Synapse, Databricks, Power BI, and Informatica. Establish enterprise-wide standards for master data, metadata, lineage, and data stewardship. Collaborate with business and domain architects to identify and support key data domains. Provide … Skills & Experience: Significant experience in enterprise architecture with a strong focus on data, information, or analytics. Proven hands-on expertise with data platforms such as Azure Data Lake, Synapse, Databricks, Power BI, etc. Deep knowledge of data governance, MDM, metadata management, and data quality frameworks. Understanding of data protection and privacy regulations (e.g., GDPR, CCPA). Track record of developing …
suitable for junior candidates). Language Skills: fluent in German and English. Azure Expertise: deep hands-on knowledge of Azure data and AI services - including Azure Machine Learning, Azure Databricks, Azure Data Lake/Synapse, Azure Cognitive Services (Text, Vision, Speech), Azure OpenAI and Azure AI Foundry. Ability to architect solutions that leverage these services cohesively. Architectural Skills: strong skills … expertise in Azure and AI. Examples include Microsoft Certified: Azure Solutions Architect Expert and Azure AI Engineer Associate (AI-102). Certification in machine learning frameworks or platforms (e.g. Databricks Certified Generative AI Engineer Associate) is also valued. Devoteam supports continuous learning and certification attainment. Education: a Bachelor's or Master's degree in Computer Science, Data Science, or related …
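Calling an Azure OpenAI deployment from Python is a common building block in the architectures this posting describes. A minimal sketch with the openai SDK (v1.x); the endpoint, API version, and deployment name are hypothetical placeholders.

```python
# Minimal Azure OpenAI chat call; resource names are hypothetical.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://my-openai-resource.openai.azure.com",  # hypothetical
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="gpt-4o-deployment",  # the *deployment* name, not the model family
    messages=[
        {"role": "system", "content": "You summarise support tickets."},
        {"role": "user", "content": "Summarise: the BI dashboard fails to refresh..."},
    ],
)
print(response.choices[0].message.content)
```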
closely with business and technical stakeholders to deliver robust, scalable, and secure data platforms. Key Responsibilities: Design and implement modern data architectures using Azure Synapse, Data Lake, Data Factory, Databricks, and Microsoft Fabric. Lead the development of integrated data solutions supporting BI, advanced analytics, and real-time data processing. Define data governance, security, and compliance standards across the data …
Employment Type: Permanent
Salary: £90000 - £110000/annum Plus bonus and package
City of London, London, United Kingdom Hybrid / WFH Options
Primus
and brownfield projects. Comfortable working independently in a remote setup. Strong communication and stakeholder engagement skills. Nice to Have: Experience with other Azure services such as Azure Data Lake, Databricks, or Power BI. Familiarity with DevOps/DataOps processes and tools. Microsoft Certified: Fabric Analytics Engineer Associate. Start Date: ASAP or July start acceptable. Apply now to join a forward …
London, England, United Kingdom Hybrid / WFH Options
Foreign, Commonwealth and Development Office
the FCDO as part of our collaborative and technical consultancy service. Engineer data pipelines in Azure, primarily using Azure Data Factory and also, to develop team capability, Databricks and other tools within the Azure Cloud environment. Deliver data products to a range of business-domain stakeholders and to teams across the department. Implement data design through good Data …