Kirkby on Bain, England, United Kingdom Hybrid / WFH Options
ANGLIAN WATER
delivery pipelines if the solution is to adopt modern DevOps processes. What does it take to be an Enterprise Data Engineer? Previous strong experience in data engineering, ideally using Databricks, Azure Data Factory, Spark, Python, SQL and Power BI; strong data engineering experience (at least 3-5 years); dimensional data modelling; experience in delivering end-to-end BI solutions from requirements, design to …
reference data) Understanding of regulatory reporting processes; Proven ability to work directly with demanding front office stakeholders; Experience with real-time data feeds and low-latency requirements. Preferred Skills: Databricks experience; Capital markets knowledge (equities, fixed income, derivatives); Experience with financial data vendors (Bloomberg, Reuters, Markit); Cloud platforms (Azure preferred) and orchestration tools; Understanding of risk metrics and P&L …
Big Data technologies (Spark, Impala, Hive, Redshift, Kafka, etc.); Experience in data quality testing, adept at writing test cases and scripts, presenting and resolving data issues; Experience with Databricks, Snowflake and Iceberg is required. Preferred qualifications, capabilities, and skills: Experience in application and data design disciplines with an emphasis on real-time processing and delivery (e.g. Kafka) is preferable; Understanding …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
VIQU Limited
team based in Leeds, working mostly remotely with just one day on-site per week. You'll lead the design and delivery of scalable, cloud-based data solutions using Databricks, Python, and SQL, while mentoring a team and driving engineering best practices. About You: You might currently be a Senior Data Engineer ready to grow your leadership skills. You're … passionate about building robust, efficient data pipelines and shaping cloud data architecture in an agile environment. Key Responsibilities: Lead development of data pipelines and solutions using Databricks, Python, and SQL; Design and maintain data models supporting analytics and business intelligence; Build and optimise ELT/ETL processes on AWS or Azure; Collaborate closely with analysts, architects, and stakeholders to deliver … as code; Mentor and support your team, taking ownership of technical delivery and decisions; Drive continuous improvements in platform performance, cost, and reliability. Key Requirements: Hands-on experience with Databricks or similar data engineering platforms; Strong Python and SQL skills in data engineering contexts; Expertise in data modelling and building analytics-ready datasets; Experience with AWS or Azure cloud data …
implementation of customers' modern data platforms. The ideal candidate will have extensive experience migrating traditional data warehouse technologies, including Teradata, Oracle, BW and Hadoop, to modern cloud data platforms like Databricks, Snowflake, Redshift, BigQuery, or Microsoft Fabric. You will be responsible for leading data platform migrations and the design and development of scalable data solutions that support our organization's strategic … and big data platforms. Establish best practices and standards for data modeling, integration, and management. Platform Design and Implementation: Architect, design, and implement data warehouse solutions using platforms like Databricks, Redshift, BigQuery, Synapse, and Snowflake. Develop scalable big data solutions using cloud data technologies and services. Ensure the data architecture supports data quality, security, and governance requirements. Technology Leadership: Evaluate … data architecture, data warehousing, and big data solutions. 5+ years of experience working in the cloud (AWS, GCP or Azure). 5+ years of experience working in modern cloud data platforms (Databricks, Redshift, BigQuery, Synapse, SAP Datasphere, and Snowflake). 5+ years of experience designing cloud infrastructure on AWS, GCP or Azure. Extensive experience with data warehouse platforms such as Teradata, Oracle, SAP …
varied end points to move data at speed and at scale. The right candidate will have a wealth of knowledge in the data world with a strong focus on Databricks, and will be keen to expand upon their existing knowledge, learning new technologies along the way as well as supporting both future and legacy technologies and processes. You will be … data problems and challenges every day. Key Responsibilities: Design, Build, and Optimise Real-Time Data Pipelines: Develop and maintain robust and scalable stream and micro-batch data pipelines using Databricks, Spark (PySpark/SQL), and Delta Live Tables. Implement Change Data Capture (CDC): Implement efficient CDC mechanisms to capture and process data changes from various source systems in near real … and schema evolution, to ensure data quality and reliability. Champion Data Governance with Unity Catalog: Implement and manage data governance policies, data lineage, and fine-grained access control using Databricks Unity Catalog. Enable Secure Data Sharing with Delta Sharing: Design and implement secure and governed data sharing solutions to distribute data to both internal and external consumers without data replication.
data products within a high-performance cloud platform • Collaborate with cross-functional squads to solve real-world data challenges • Design, build, and optimise scalable data pipelines using Python, Spark & Databricks • Work on orchestration, monitoring, and performance optimisation • Create frameworks and processes for high-quality, scalable data workflows • Help shape and promote software engineering best practices • Support ML/AI workflows … Tech stack & must-have experience: • Strong Python skills & deep knowledge of the Python data ecosystem • Solid SQL skills and experience with data modelling best practices • Hands-on experience with Databricks or Snowflake, ideally on AWS (open to Azure) • Strong knowledge of Spark or PySpark • Experience with CI/CD, Git, Jenkins (or similar tools) • Proven ability to think about scalability …
Burton-on-Trent, Staffordshire, England, United Kingdom Hybrid / WFH Options
Crimson
processing and cost-efficient solutions. A strong background in Azure Data Pipeline development is key for this position. Key Skills & Responsibilities: Build and manage pipelines using Azure Data Factory, Databricks, CI/CD, and Terraform; Optimisation of ETL processes for performance and cost-efficiency; Design scalable data models aligned with business needs; Azure data solutions for efficient data storage and … and reliability; Maintain technical documentation and lead knowledge-sharing initiatives; Deploy advanced analytics and machine learning solutions using Azure; Stay current with Azure technologies and identify areas for enhancement; Databricks (Unity Catalog, DLT), Data Factory, Synapse, Data Lake, Stream Analytics, Event Hubs; Strong knowledge of Python, Scala, C#, .NET; Experience with advanced SQL, T-SQL, relational databases; Azure DevOps, Terraform …
infrastructure, improve data quality, and enable data-driven decision-making across the organization. Core Duties and Responsibilities: Design, build, and maintain large-scale data pipelines using Microsoft Fabric and Databricks; Develop and implement data architectures that meet business requirements and ensure data quality, security, and compliance; Collaborate with wider Product & Engineering teams to integrate data pipelines with machine learning models … and cloud computing. Skills, Capabilities and Attributes (Essential): Good experience in data engineering, with a focus on cloud-based data pipelines and architectures; Strong expertise in Microsoft Fabric and Databricks, including data pipeline development, data warehousing, and data lake management; Proficiency in Python, SQL, Scala, or Java; Experience with data processing frameworks such as Apache Spark, Apache Beam, or Azure … with Azure Synapse Analytics, Azure Data Lake Storage, or other Azure data services; Experience with agile development methodologies and version control systems such as Git; Certification in Microsoft Azure, Databricks, or other relevant technologies. What We Offer: Save For Your Future - Equiniti Pension Plan; Equiniti matches your pension contributions up to 10%; All Employee Long Term Incentive Plan (LTIP) - Gives …
Manchester Area, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
days in office, 3 from home); Pension contribution; Great opportunities for career progression; And many more. Role & Responsibilities: Design and deliver solutions using MS Fabric, ADF, ADL, Synapse, Databricks, SQL, and Python; Work closely with a variety of clients to gather requirements and deliver solutions; Be willing to engage and assist in pre-sales activities, bids, proposals etc.; Use key … in building out, developing, and training the data engineering function. What do I need to apply? Strong MS data engineering expertise (Fabric, ADF, ADL, Synapse, SQL); Expert use of Databricks; Strong Python experience; Consultancy experience; Leadership experience. My client is looking to book in first-stage interviews for next week and slots are already filling up fast. I have limited …
to hear from you. Key Responsibilities: Define and evolve the enterprise-wide data and analytics architecture, strategy, and roadmap; Lead the development of a modern data platform using Azure, Databricks, Power BI, and SAP data sources; Build robust data pipelines and integration models leveraging Azure Synapse, Data Factory, and automation tools; Ensure data governance, security, compliance, and quality across the … requirements into scalable, value-focused solutions. What You'll Bring: Proven success designing and delivering modern, cloud-based data platforms (Azure/AWS/GCP); Deep knowledge of Databricks, Power BI, and enterprise integration tools; A developed understanding of TOGAF or equivalent enterprise architecture frameworks; Hands-on experience in data warehousing, data lakes, data modelling, and ETL processes; Excellent …
the adoption and maintenance of cloud-first data solutions. Innovation and Future-Proofing: Stay up to date with modern tools and methodologies, including but not limited to Microsoft Fabric, Databricks, and Lakehouse architectures; Ensure solutions are scalable, maintainable, and aligned with evolving best practices; Contribute to shaping the future state of the organisation's data estate. Qualifications & Experience: Bachelor's … dashboard development and DAX formulas; Experience with Power Automate for data-driven workflows; Understanding of ETL concepts and processes; Exposure to modern data platforms such as Azure Data Lake, Databricks, or Microsoft Fabric is a bonus. Analytical Skills: Ability to understand complex data structures and derive actionable insights; Strong problem-solving ability with attention to detail; Curious and data-driven …
pipelines and infrastructure, who can implement processes on our modern tech stack with robust, pragmatic solutions. Responsibilities: Develop and maintain ETL/ELT data pipelines using AWS services, Databricks and dbt; Manage and optimize data storage solutions such as Amazon S3, Redshift, RDS, and DynamoDB; Implement and manage infrastructure-as-code (IaC) using tools like Terraform or AWS CloudFormation … tools like Terraform or CloudFormation; Experience with workflow orchestration tools (e.g., Airflow, Dagster); Good understanding of cloud providers – AWS, Microsoft Azure, Google Cloud; Familiarity with dbt, Delta Lake, Databricks; Experience working in Agile environments with tools like Jira and Git. About Us: We are Citation. We are far from your average service provider. Our colleagues bring their brilliant selves …
closely with business and technical stakeholders to deliver robust, scalable, and secure data platforms. Key Responsibilities: Design and implement modern data architectures using Azure Synapse, Data Lake, Data Factory, Databricks, and Microsoft Fabric; Lead the development of integrated data solutions supporting BI, advanced analytics, and real-time data processing; Define data governance, security, and compliance standards across the data …
Employment Type: Permanent
Salary: £90,000 - £110,000 per annum, plus bonus and package
City of London, London, United Kingdom Hybrid / WFH Options
Primus
and brownfield projects. Comfortable working independently in a remote setup. Strong communication and stakeholder engagement skills. Nice to Have: Experience with other Azure services such as Azure Data Lake, Databricks, or Power BI; Familiarity with DevOps/DataOps processes and tools; Microsoft Certified: Fabric Analytics Engineer Associate. Start Date: ASAP or July start acceptable. Apply now to join a forward …
drive technical innovation across client projects; Document processes and contribute to internal knowledge repositories and best practice libraries. Key Skills & Experience: Strong hands-on experience with Azure tooling, including Databricks, Data Factory, Data Lake, and Synapse (or similar data warehouse tools); Azure Analysis Services or comparable BI tooling; Solid programming capability in SQL, Python, Spark, and ideally DAX; Familiarity with …