London, South East, England, United Kingdom Hybrid/Remote Options
Crimson
a strong understanding of data engineering principles, big data technologies, cloud computing (specifically Azure), and experience working with large datasets. Key skills and responsibilities: Design, build, and maintain scalable ETL pipelines for ingesting, transforming, and loading data from APIs, databases, and financial data sources into Azure Databricks. Optimize pipelines for performance, reliability, and cost, incorporating data quality checks. Develop complex …
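The listing above asks for scalable ETL pipelines with built-in data quality checks. A minimal sketch of that shape in plain Python follows; the record fields, quality rules, and in-memory "warehouse" are hypothetical stand-ins for an API extract and a Databricks target.

```python
# Minimal sketch of an extract-transform-load pipeline with a data quality
# gate. Field names and the list-based warehouse are illustrative only.

def extract():
    # In practice this would call an API or query a database.
    return [
        {"trade_id": 1, "symbol": "ABC", "price": "101.5"},
        {"trade_id": 2, "symbol": "XYZ", "price": "99.0"},
        {"trade_id": 3, "symbol": "", "price": "not-a-number"},  # bad record
    ]

def transform(records):
    cleaned = []
    for r in records:
        try:
            cleaned.append({"trade_id": r["trade_id"],
                            "symbol": r["symbol"],
                            "price": float(r["price"])})
        except ValueError:
            continue  # a real pipeline would quarantine unparseable rows
    return cleaned

def quality_check(records):
    # Reject the batch if any required field is empty or any price is negative.
    return all(r["symbol"] and r["price"] >= 0 for r in records)

def load(records, target):
    target.extend(records)

warehouse = []
rows = transform(extract())
if quality_check(rows):
    load(rows, warehouse)

print(len(warehouse))  # 2 valid rows loaded; the bad record was dropped
```

The quality gate sits between transform and load so a failing batch never reaches the target, which is the usual place such checks are enforced.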
for: We're seeking someone with significant experience in data reporting and analytics, excellent SQL and Power BI skills, and a strong understanding of relational databases, data modelling, and ETL processes. You'll need a Level 3 qualification in a relevant subject (or equivalent experience), along with a keen eye for accuracy, strong problem-solving abilities, and confidence communicating technical …
City of London, London, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
the forefront of the data engineering field. You'll work with:
- Azure Data Services: Data Factory, Data Lake, SQL
- Databricks: Spark, Delta Lake
- Power BI: advanced dashboards and analytics
- ETL & data modelling: T-SQL, metadata-driven pipelines
Design and implement scalable Azure-based data solutions. Build and optimise data pipelines for integration and transformation. Develop Power BI dashboards for global …
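"Metadata-driven pipelines", mentioned in the listing above, means each pipeline step is described by configuration rather than hard-coded, so adding a source is a metadata change, not new code. A minimal sketch, with hypothetical table and column names:

```python
# Metadata-driven pipeline sketch: the loop is generic; all source/target
# specifics live in the metadata records. Names are illustrative only.

PIPELINE_METADATA = [
    {"source": "sales_raw", "target": "sales_clean", "columns": ["id", "amount"]},
    {"source": "customers_raw", "target": "customers_clean", "columns": ["id", "name"]},
]

TABLES = {
    "sales_raw": [{"id": 1, "amount": 10, "junk": "x"}],
    "customers_raw": [{"id": 7, "name": "Acme", "junk": "y"}],
}

def run_pipeline(metadata, tables):
    for step in metadata:
        src = tables[step["source"]]
        # Project only the configured columns into the target table.
        tables[step["target"]] = [
            {c: row[c] for c in step["columns"]} for row in src
        ]
    return tables

result = run_pipeline(PIPELINE_METADATA, TABLES)
print(result["sales_clean"])  # [{'id': 1, 'amount': 10}]
```

In Azure Data Factory or Databricks the metadata would typically live in a control table or config files rather than a Python list, but the driving pattern is the same.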
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
can be at the forefront of your field. What you'll work with:
- Azure Data Services: Data Factory, Data Lake, SQL
- Databricks: Spark, Delta Lake
- Power BI: advanced dashboards
- ETL & data modelling: T-SQL, metadata-driven pipelines
What you'll do: Design and implement scalable Azure-based data solutions. Build and optimise data pipelines for integration and transformation. Develop Power …
Ability to work independently and thrive in a team environment. Eagerness to learn and develop technical skills; a degree or relevant experience is required. Knowledge of data warehousing and ETL processes. Familiarity with Power Apps, Azure Data Factory, and Power Automate is advantageous. Someone with a can-do attitude who pays attention to detail and can spot irregularities. Office Angels …
metadata management, and data governance. Experience with modern data platforms such as Azure Data Lake, Databricks, Power BI, and SAP BTP. Solid grasp of enterprise integration patterns (APIs, streaming, ETL/ELT, event-driven architectures). Ability to translate complex data concepts into clear, value-focused business outcomes. Excellent stakeholder management and communication skills across technical and non-technical audiences.
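Of the integration patterns the listing names, event-driven architecture is the one most often sketched in code: producers publish events to a topic and decoupled consumers react. A minimal in-process illustration, with hypothetical topic and handler names (in Azure this role is usually played by Event Hubs or Service Bus):

```python
# Event-driven integration sketch: publishers and subscribers are decoupled
# through a topic on a bus. Topic names and payloads are illustrative only.

from collections import defaultdict

class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Every subscriber to the topic receives the event independently;
        # the publisher knows nothing about who is listening.
        for handler in self._subscribers[topic]:
            handler(event)

received = []
bus = EventBus()
bus.subscribe("orders.created", lambda e: received.append(e["order_id"]))
bus.publish("orders.created", {"order_id": 42})
print(received)  # [42]
```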
Crewe, Cheshire, North West, United Kingdom Hybrid/Remote Options
Langley James Limited
leading data visualisation platform. Harness a modern data platform to seamlessly integrate and manage data from diverse sources. Maintain and enhance sophisticated financial models and reporting systems. Implement robust Extract, Transform, and Load (ETL) processes to ensure data flows smoothly and accurately. Champion data accuracy, consistency, and integrity across all BI deliverables. Contribute to the establishment and enforcement of data …
an initial 7-month contract with an immediate start and strong potential for extension on a greenfield project. Key Responsibilities: Design, build, and optimise data pipelines using ELT/ETL best practices. Migrate and transform data into a Databricks Lakehouse architecture. Ensure data quality, reliability, and scalability for analytics and reporting. Collaborate with stakeholders to deliver robust solutions aligned with …
your chance to work on exciting projects, design robust data architectures, and help organisations unlock the full potential of their data. Key Responsibilities: Build and optimise data pipelines and ETL workflows using Microsoft Fabric and Azure Synapse or Databricks. Implement scalable solutions for data ingestion, storage, and transformation. Develop clean, reusable Python code for data engineering tasks. Research and integrate …
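"Clean, reusable Python code for data engineering tasks" usually means small, composable transformation functions that any pipeline step can chain. A minimal sketch of that style; the function and field names are illustrative, not from any specific codebase:

```python
# Composable transformation steps: each function takes and yields records,
# and compose() chains them into one reusable pipeline.

from typing import Callable, Iterable

Record = dict

def compose(*steps: Callable[[Iterable[Record]], Iterable[Record]]):
    """Chain transformation steps into a single reusable pipeline."""
    def pipeline(records):
        for step in steps:
            records = step(records)
        return list(records)
    return pipeline

def drop_nulls(records):
    # Filter out records with any missing value.
    return (r for r in records if all(v is not None for v in r.values()))

def normalise_keys(records):
    # Conform column names to lower case.
    return ({k.lower(): v for k, v in r.items()} for r in records)

clean = compose(drop_nulls, normalise_keys)
print(clean([{"ID": 1, "Name": "a"}, {"ID": 2, "Name": None}]))
# [{'id': 1, 'name': 'a'}]
```

Because each step is a pure generator transformation, the same functions can be reused across pipelines and unit-tested in isolation.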
caching. Run performance tests and ensure workloads are cost-efficient and reliable. Apply data quality, lineage, and access controls using Unity Catalog and governance best practices. Develop reusable PySpark ETL and data transformation code. Manage Delta Lake tables with ACID transactions, schema evolution, and time-travel capabilities. Integrate Databricks with Azure ADLS, Key Vault, Azure Functions, and related services. Work …
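The Delta Lake behaviours named above (atomic versioned commits enabling time travel, and additive schema evolution) can be modelled conceptually with a plain Python class. This is only a sketch of the idea; the real feature is provided by Delta tables and their transaction log, not by code like this.

```python
# Conceptual model of versioned table writes: each commit is atomic and
# produces a new readable version, analogous to a Delta transaction-log
# entry. This illustrates the concept only; it is not Delta Lake itself.

import copy

class VersionedTable:
    def __init__(self):
        self._versions = [[]]  # version 0 is the empty table

    def commit(self, rows):
        # An all-or-nothing write: readers see either the old version
        # or the new one, never a partial state.
        snapshot = copy.deepcopy(self._versions[-1])
        snapshot.extend(rows)
        self._versions.append(snapshot)

    def read(self, version=None):
        # Time travel: read the latest version or any historical one.
        return self._versions[-1 if version is None else version]

t = VersionedTable()
t.commit([{"id": 1}])
t.commit([{"id": 2, "region": "EU"}])  # new column: additive schema evolution

print(len(t.read()))           # 2 rows at the latest version
print(len(t.read(version=1)))  # 1 row when reading "as of" version 1
```

In Databricks the equivalent read would be `SELECT ... VERSION AS OF 1` or the DataFrame `versionAsOf` option against a Delta table.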
London, South East, England, United Kingdom Hybrid/Remote Options
eTech Partners
ecosystem. Writing advanced SQL and optimising complex queries. Building and orchestrating data pipelines across modern cloud platforms. Applying expert knowledge of data integration tools and methods. Designing ELT/ETL workflows using medallion architecture and lakehouse principles. Developing and maintaining data warehouses and data marts. Skills Needed: 2+ years' experience in data engineering, with at least 1 year on Microsoft …
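The medallion architecture the listing refers to layers a lakehouse into bronze (raw landing), silver (cleaned and conformed), and gold (business-level aggregates). A minimal sketch of that flow; the layer contents are hypothetical, and in Fabric or Databricks each layer would be a set of lakehouse tables rather than Python lists:

```python
# Medallion-architecture sketch: data quality tightens at each layer,
# and the gold layer serves reporting. Field names are illustrative.

def to_bronze(raw_rows):
    # Bronze: land data as-is, no filtering, so nothing is lost.
    return list(raw_rows)

def to_silver(bronze):
    # Silver: validate and conform types, dropping rows that fail.
    out = []
    for r in bronze:
        try:
            out.append({"region": r["region"], "amount": float(r["amount"])})
        except (KeyError, ValueError):
            continue
    return out

def to_gold(silver):
    # Gold: business-level aggregate, e.g. revenue per region.
    totals = {}
    for r in silver:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

bronze = to_bronze([
    {"region": "UK", "amount": "100"},
    {"region": "UK", "amount": "50"},
    {"region": "DE", "amount": "oops"},  # rejected at the silver layer
])
gold = to_gold(to_silver(bronze))
print(gold)  # {'UK': 150.0}
```

Keeping the raw bronze copy means the silver and gold layers can always be rebuilt when validation rules or aggregations change.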