Senior Data Engineer – SC Cleared We are seeking a hands-on Senior Data Engineer with deep expertise in building and managing streaming and batch data pipelines. The ideal candidate will have strong experience working with large-scale data systems operating on cloud-based platforms such as AWS, Databricks or Snowflake. … This role also involves close collaboration with hyperscalers and data platform vendors to evaluate and document Proofs of Concept (PoCs) for modern data platforms, while effectively engaging with senior stakeholders across the organisation. Key Responsibilities: Design, develop, and maintain streaming and batch data pipelines using modern data engineering tools and frameworks. Work … with large volumes of structured and unstructured data, ensuring high performance and scalability. Collaborate with cloud providers and data platform vendors (e.g., AWS, Microsoft Azure, Databricks, IBM, Snowflake) to conduct PoCs for data platform solutions. Evaluate PoC outcomes and provide comprehensive documentation including architecture, performance benchmarks, and recommendations. Engage with key stakeholders including Heads …
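To illustrate the kind of streaming and batch pipeline work this listing describes, below is a minimal PySpark sketch for a Databricks-style environment. The paths, schema location, and table names are hypothetical, and the Auto Loader (cloudFiles) source assumes a Databricks runtime; this is a sketch of the pattern, not a reference implementation.

```python
# Minimal sketch: one streaming ingest and one batch aggregation on Databricks.
# Paths, schema location, and table names are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("trades-pipeline").getOrCreate()

# Streaming ingest: pick up newly arriving JSON files and append them to a Delta table.
raw_stream = (
    spark.readStream.format("cloudFiles")          # Databricks Auto Loader
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/schemas/trades")
    .load("/mnt/landing/trades/")
)

(
    raw_stream.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/trades_bronze")
    .trigger(availableNow=True)                    # process available files, then stop
    .toTable("bronze.trades")
)

# Batch layer: daily aggregation over the same table for reporting.
daily = (
    spark.read.table("bronze.trades")
    .groupBy(F.to_date("trade_ts").alias("trade_date"), "commodity")
    .agg(F.sum("volume").alias("total_volume"))
)
daily.write.mode("overwrite").saveAsTable("silver.daily_trade_volume")
```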
London (City of London), South East England, United Kingdom
Sanderson
/Engineer Location: 3 Days p/w Central London Rate: £450 Outside IR35 Duration: 6 Months + Extensions This role is responsible for migrating and developing modern data and analytics solutions using Microsoft Fabric and Azure technologies. The position supports enterprise-wide BI initiatives, delivering data models, pipelines, dashboards and reporting assets that enable data-driven decision-making. Key Responsibilities Lead and implement data migration into Microsoft Fabric for analytics and reporting use cases Design and build data models from multiple sources to generate actionable insights Maintain and optimise data warehouse platforms; identify and resolve issues Develop data pipelines using Fabric Pipelines, Azure Data … Lakehouse and Warehouse environments Implement CI/CD processes using Azure DevOps for secure, reliable deployment Technical Skills: Strong expertise in: Power BI and paginated reporting SQL and data architecture Dimensional modelling (star schema, snowflake, denormalised structures, SCD handling) DAX, Visual Studio and data transformation logic Azure Fabric, Azure Data Factory, Synapse, Data …
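The dimensional modelling and SCD skills listed above can be illustrated with a short sketch of a Type 2 slowly changing dimension update on a Delta/Fabric Lakehouse table. Table and column names are hypothetical, and the snippet assumes the delta-spark package is configured for the session.

```python
# Sketch of a Type 2 slowly changing dimension update on a Delta/Fabric Lakehouse table.
# Table and column names are illustrative; assumes delta-spark is available.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

updates = spark.read.table("staging.customer")          # latest source extract
dim = DeltaTable.forName(spark, "gold.dim_customer")    # existing dimension

# Step 1: close out current rows whose tracked attributes have changed.
(
    dim.alias("d")
    .merge(updates.alias("u"), "d.customer_id = u.customer_id AND d.is_current = true")
    .whenMatchedUpdate(
        condition="d.segment <> u.segment OR d.region <> u.region",
        set={"is_current": F.lit(False), "valid_to": F.current_timestamp()},
    )
    .execute()
)

# Step 2: append new versions (and brand-new customers) as current rows.
new_rows = (
    updates.join(
        spark.read.table("gold.dim_customer").filter("is_current = true"),
        "customer_id",
        "left_anti",                                    # rows with no open current record
    )
    .withColumn("valid_from", F.current_timestamp())
    .withColumn("valid_to", F.lit(None).cast("timestamp"))
    .withColumn("is_current", F.lit(True))
)
new_rows.write.format("delta").mode("append").saveAsTable("gold.dim_customer")
```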
City of London, London, United Kingdom Hybrid / WFH Options
Robertson Sumner
Contract Role: Snowflake Data Architect/Consultant Location: London or Manchester (Hybrid – 2–3 days onsite per week) Duration: 6 months initial (possible extensions) Engagement: Outside IR35 Overview We’re seeking a highly capable Snowflake Data Architect/Consultant to help design, optimise, and embed modern cloud data architectures. The role will blend strategic … input, hands-on delivery, and technical leadership across data modelling, integration, and platform enablement. Key Responsibilities Platform Architecture Design and implement scalable data environments on Snowflake, ensuring performance, security, and reusability. Establish integration patterns between Snowflake and wider ecosystems including ETL/ELT tools, BI platforms, and cloud services (AWS, Azure, GCP). Delivery & Optimisation Lead … and analytics teams to embed Snowflake best practices. Create documentation and reusable assets that support long-term client self-sufficiency. Strategic Advisory Contribute to the definition of enterprise data roadmaps, covering topics such as data mesh, data sharing, and AI/ML enablement. Collaborate with platform partners and stakeholders to align delivery with …
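One common integration pattern mentioned above, loading files from cloud object storage into Snowflake, can be sketched with the official snowflake-connector-python package. The account, stage, and table names are placeholders, and a real deployment would use key-pair authentication or a secrets manager rather than an inline password.

```python
# Sketch: loading files from an external stage into Snowflake with the Python connector.
# Account, warehouse, stage, and table names are illustrative placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",          # in practice, use key-pair auth or a secrets manager
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # External stage pointing at cloud storage (S3/Azure/GCS), created once by an admin.
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @RAW.ORDERS_STAGE
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    print(cur.fetchall())     # per-file load results returned by COPY INTO
finally:
    conn.close()
```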
City of London, London, United Kingdom Hybrid / WFH Options
Ashdown Group
Data Delivery and Integration Manager - Full time permanent role - Retail industry - Remote/home-based - Salary up to £75,000 plus bonus, private healthcare and more! A large and growing retail business is looking to expand their IT function with the addition of a Data Integration Manager. This role will focus on end-to-end data engineering and integration solutions for a company-wide modernisation programme. Duties will include: - Managing the processes for data engineering and data integration in order to align to business needs - Leading a team across engineering and integration - Work to ensure the data infrastructure of the business effectively supports analytics, AI, machine learning (ML) … and reporting - Designing data pipelines, ETL/ELT processes and architecture - Work with internal stakeholders and multiple third parties to seamlessly integrate systems, platforms and applications To be considered suitable for this Data Engineering Manager role you will need to have experience across the following: - Team leadership and guidance - Data engineering and aligning of …
London (City of London), South East England, United Kingdom
Mercuria
The Role This is a fantastic opportunity to join one of the largest integrated energy and commodity trading companies in the world. We are looking for a Senior Data Engineer with strong technical expertise in Databricks, data engineering, and cloud-native analytics platforms. You will contribute to the development and expansion of our global analytics … platform—supporting Front Office Trading across commodities—by building scalable, secure, and efficient data solutions. You will work alongside data scientists, ML engineers, and business stakeholders to understand requirements, design and build robust data pipelines, and deliver end-to-end analytics and ML/AI capabilities. Key Responsibilities Design, build, and maintain scalable data … Apache Spark, PySpark, and Databricks—including experience with Delta Lake, Unity Catalog, MLflow, and Databricks Workflows. Deep proficiency in Python and SQL, with proven experience building modular, testable, reusable pipeline components. Strong experience with AWS cloud services including S3, Lambda, Glue, API Gateway, IAM, EC2, EKS, and integration of AWS-native components with Databricks. Advanced skills in Infrastructure as Code …
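As a sketch of the "modular, testable, reusable pipeline components" the listing asks for, the snippet below shows a pure PySpark transformation with a small local test. Column names and the threshold are hypothetical.

```python
# Sketch: a modular, unit-testable PySpark transformation of the kind this role describes.
# Column names and the size threshold are illustrative; the function is pure, so it can
# be tested locally without any cluster or I/O dependencies.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F


def add_notional(trades: DataFrame) -> DataFrame:
    """Derive notional value and flag unusually large trades."""
    return (
        trades.withColumn("notional", F.col("price") * F.col("volume"))
        .withColumn("is_large", F.col("notional") > F.lit(1_000_000))
    )


def test_add_notional() -> None:
    spark = SparkSession.builder.master("local[1]").appName("test").getOrCreate()
    df = spark.createDataFrame(
        [("WTI", 80.0, 20_000.0), ("BRENT", 85.0, 100.0)],
        ["commodity", "price", "volume"],
    )
    result = {r["commodity"]: r["is_large"] for r in add_notional(df).collect()}
    assert result == {"WTI": True, "BRENT": False}
```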
City of London, London, United Kingdom Hybrid / WFH Options
Acquired Talent Ltd
Data Engineer/PostgreSQL/SQL/Data Pipelines/Apache Superset/PowerBI/Tableau/Terraform Data Engineer (Outside IR35 Contract role) Determination: Outside IR35 Day Rate: Up to £575 per day Location: Hybrid Zone 1 Duration: 3 months (initial) Job Title: Data Engineer About the role: We're on … the lookout for an experienced Data Engineer for a SaaS scale-up in the tech-for-good space. You'll be involved in the full end-to-end process, building data pipelines and dashboards. Data Engineer/PostgreSQL/SQL/Data Pipelines/Apache Superset/PowerBI/Tableau/Terraform … Requirements: 5+ years' experience with PostgreSQL, SQL & Terraform Demonstrable experience with building data pipelines from scratch 3+ years' dashboarding/building dashboards (Apache Superset/PowerBI/Tableau) Experience with building report templates. If you are an experienced data engineer with experience building data pipelines, please apply, or send your CV directly to callum@acquiredtalent.co.uk
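A minimal sketch of the end-to-end pipeline work described above: extracting a CSV and upserting it into PostgreSQL so a Superset or Power BI dashboard can read it. Connection details, the file name, and the table (including the reporting schema) are placeholders.

```python
# Sketch: a small extract-and-load step into PostgreSQL feeding a dashboard.
# Connection details, file name, schema, and table name are illustrative placeholders.
import csv
import psycopg2
from psycopg2.extras import execute_values

conn = psycopg2.connect(host="localhost", dbname="analytics", user="etl", password="***")

with conn, conn.cursor() as cur:
    cur.execute("""
        CREATE TABLE IF NOT EXISTS reporting.daily_signups (
            signup_date date PRIMARY KEY,
            signups     integer NOT NULL
        )
    """)
    with open("daily_signups.csv", newline="") as f:
        rows = [(r["signup_date"], int(r["signups"])) for r in csv.DictReader(f)]
    # Upsert so the pipeline can be re-run safely for late-arriving files.
    execute_values(
        cur,
        """
        INSERT INTO reporting.daily_signups (signup_date, signups)
        VALUES %s
        ON CONFLICT (signup_date) DO UPDATE SET signups = EXCLUDED.signups
        """,
        rows,
    )
conn.close()
```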
N5, Highbury East, Greater London, Highbury, United Kingdom
Retelligence
Lead Data Engineer Salary/Rate: £100,000 - £120,000 per annum + Bonus Location: North London Company: Retelligence About Retelligence Retelligence is partnering with a high-growth, forward-thinking organization that specializes in digital innovation and marketing across international markets. The company is on an exciting journey, rapidly scaling its capabilities and leveraging advanced technology to deliver … cutting-edge solutions. Join a dynamic team within a business that values innovation, supports professional development, and offers exceptional career progression. The Role Retelligence is seeking a Lead Data Engineer to take a hands-on role in designing and delivering robust, real-time data pipelines and infrastructure in a Google Cloud Platform (GCP) environment. The company … Build and optimize data models for querying and analytics use cases. Develop fault-tolerant, highly available data ingestion and processing pipelines. Continuously monitor and improve pipeline performance for low-latency and high-throughput operations. Ensure data quality, integrity, and security across all systems. Implement effective monitoring, logging, and alerting mechanisms. Collaborate with product …
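For the real-time ingestion work on GCP described above, a minimal sketch using the google-cloud-pubsub client is shown below. The project and subscription names are placeholders, and the downstream write (for example a BigQuery insert) is left as a comment.

```python
# Sketch: a streaming ingest worker on GCP using Pub/Sub.
# Project and subscription names are illustrative placeholders.
import json
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

PROJECT_ID = "example-project"
SUBSCRIPTION = "clickstream-sub"

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION)


def handle(message: pubsub_v1.subscriber.message.Message) -> None:
    event = json.loads(message.data.decode("utf-8"))
    # Downstream write (e.g. a BigQuery streaming insert) would go here.
    print(event.get("event_type"))
    message.ack()            # ack only after successful processing


streaming_pull_future = subscriber.subscribe(subscription_path, callback=handle)

with subscriber:
    try:
        # Block for a fixed window here; in production this runs indefinitely
        # with monitoring and alerting around it.
        streaming_pull_future.result(timeout=60)
    except TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()
```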
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Robert Half Technology is assisting a cutting-edge AI organisation to recruit a Data Engineer on a contract basis - remote working - UK based. We are looking for a Junior Data Engineer who's excited to help build the data foundations that power cutting-edge AI solutions. You'll join a high-impact team working at the intersection of … data, analytics, and machine learning - designing pipelines and infrastructure that make innovation possible at scale. Role Design, build, and maintain scalable data pipelines that fuel AI and analytics initiatives. Partner closely with data scientists, analysts, and engineers to deliver clean, structured, and reliable data. Develop robust data transformations in Python and SQL … Engineer will ideally have 2-5 years of experience in data engineering or a similar role. Strong Python skills for data processing and pipeline development. Proven experience writing complex and efficient SQL. Hands-on Snowflake experience (data modelling, performance tuning, pipelines). Familiarity with orchestration tools (e.g., Airflow, dbt) is a …
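The orchestration familiarity mentioned above (Airflow, dbt) can be illustrated with a minimal DAG that runs an ingest step followed by a dbt build. The DAG id, schedule, and dbt project path are placeholders, and the schedule argument assumes Airflow 2.4 or later.

```python
# Sketch: a minimal Airflow DAG orchestrating an ingest step and a dbt run.
# DAG id, schedule, and the dbt project path are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_to_snowflake() -> None:
    # Placeholder for the ingest step (e.g. COPY INTO via the Snowflake connector).
    print("loading raw files into Snowflake")


with DAG(
    dag_id="daily_analytics",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",      # every day at 06:00
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=extract_to_snowflake)
    transform = BashOperator(task_id="dbt_run", bash_command="dbt run --project-dir /opt/dbt")

    ingest >> transform
```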
Use modern cloud-native tools and CI/CD platforms to maintain secure, observable, and high-performing environments. Relevant Skills: Essential Skills: Proficiency in Python or Scala for data … processing Strong SQL skills for querying and managing relational databases Experience with AWS services including S3, Glue, Redshift, Lambda, and Athena Knowledge of ETL processes and data pipeline development Understanding of data modelling and data warehousing concepts Familiarity with version control systems, particularly Git Desirable Skills: Experience with infrastructure as code tools such … as Terraform or CloudFormation Exposure to Apache Spark for distributed data processing Familiarity with workflow orchestration tools such as Airflow or AWS Step Functions Understanding of containerisation using Docker Experience with CI/CD pipelines and automated deployment processes Reasonable Adjustments: Respect and equality are core values to us. We are proud of the diverse and inclusive community …
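As a sketch of one of the AWS patterns listed above, the snippet below runs an Athena query over S3 data with boto3 and polls for completion. The database, query, result bucket, and region are placeholders.

```python
# Sketch: running an Athena query over S3 data with boto3.
# Database, table, result bucket, and region are illustrative placeholders.
import time

import boto3

athena = boto3.client("athena", region_name="eu-west-2")

query_id = athena.start_query_execution(
    QueryString="SELECT event_date, COUNT(*) AS events FROM raw.clickstream GROUP BY event_date",
    QueryExecutionContext={"Database": "raw"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)["QueryExecutionId"]

# Poll until the query finishes; production code would add backoff and alerting.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    print(f"fetched {len(rows) - 1} result rows")   # first row holds column headers
```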
City of London, London, United Kingdom Hybrid / WFH Options
Futuria
About the Role As a Data Engineer at Futuria, you’ll play a central role in designing, building, and maintaining the scalable data infrastructure that powers our AI models and intelligent applications. You’ll collaborate with AI engineers, data scientists, and product teams to ensure the seamless flow of high-quality, reliable data that drives performance and insight across the platform. Key Responsibilities Design and implement scalable, secure, and reliable data pipelines to support procedural workflow orchestration and AI agent workflows Develop and manage robust data ingestion processes (batch and streaming) from diverse sources Collaborate with AI engineers and product teams to define data requirements … and integrate pipelines seamlessly with models and applications. Build and maintain ETL/ELT processes that ensure data integrity, consistency, and accuracy across systems. Optimize data infrastructure for performance, cost efficiency, and scalability in cloud environments. Develop and manage graph-based data systems (e.g. Kuzu, Neo4j, Apache AGE) to model and query complex relationships …
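The graph-based data systems mentioned above can be illustrated with a short sketch using the Neo4j Python driver (other engines such as Kuzu or Apache AGE have different APIs). Connection details, labels, and properties are placeholders, and execute_write assumes a 5.x driver.

```python
# Sketch: writing and querying relationship data in a graph store with the Neo4j Python driver.
# Connection details, node labels, and properties are illustrative placeholders.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "***"))


def link_document_to_topic(tx, doc_id: str, topic: str) -> None:
    # MERGE makes the write idempotent, so the ingest can safely be re-run.
    tx.run(
        """
        MERGE (d:Document {id: $doc_id})
        MERGE (t:Topic {name: $topic})
        MERGE (d)-[:MENTIONS]->(t)
        """,
        doc_id=doc_id,
        topic=topic,
    )


with driver.session() as session:
    session.execute_write(link_document_to_topic, "doc-42", "renewables")
    records = session.run(
        "MATCH (:Document {id: $doc_id})-[:MENTIONS]->(t:Topic) RETURN t.name AS name",
        doc_id="doc-42",
    )
    print([r["name"] for r in records])

driver.close()
```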