London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
from the warehouse. Maintaining and improving the existing Python-based dynamic pricing model and web scraping tools. What we're looking for: Experience across a modern data stack for ETL/ELT processes and data warehousing. Strong SQL and Python skills, with an understanding of Kimball-style data modelling. Experience with DBT, Dagster, or Airflow for transformation and orchestration. …
and shape the direction of the platform as it evolves, pushing the boundaries of what's possible with data and AI. What You'll Do Design & build high-performance ETL/ELT pipelines in modern cloud environments (including Azure, AWS, GCP, Snowflake or Databricks). Lead CI/CD automation, environment versioning, and production deployments for data products. Integrate AI …
City of London, London, United Kingdom Hybrid/Remote Options
Oscar Associates (UK) Limited
This role has a direct opportunity to grow into a Head of Data and AI position. Key Responsibilities Data Engineering & Architecture Lead the development and maintenance of data pipelines, ETL processes, and warehouse architecture (GCP, Azure). Ensure high-quality, scalable, and secure data infrastructure that supports campaign reporting and advanced analytics. Design and support the delivery of AI and …
London, South East, England, United Kingdom Hybrid/Remote Options
Akkodis
WFH. Duration: 3 months rolling contract. Type of contract: Freelance, Inside IR35. Level: Mid-Senior. Duties and Tasks: Develop and optimize data pipelines using Databricks and Spark. Design and implement data models and ETL processes in Snowflake. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements. Ensure data quality, integrity, and security across platforms. Monitor and troubleshoot data workflows and performance issues. Requirements: Proven …
City of London, London, United Kingdom Hybrid/Remote Options
Focused Futures Consultancy LTD
and lakehouses using Databricks, with additional exposure to Snowflake, Azure Synapse, or Fabric being a bonus. Data Modelling – Build and optimise enterprise-grade data models across varied data layers. ETL/ELT Engineering – Use tooling such as Databricks, SSIS, ADF, Informatica, IBM DataStage to drive efficient data ingestion and transformation. Data Governance – Implement governance and MDM using tools like Unity …
London, South East, England, United Kingdom Hybrid/Remote Options
CV TECHNICAL LTD
scientists, analysts, and software engineers to ensure the company's data strategy underpins their innovative financial products. Key Responsibilities: Lead the design, development, and optimisation of data pipelines and ETL processes. Architect scalable data solutions to support analytics, machine learning, and real-time financial applications. Drive best practices for data engineering, ensuring high levels of data quality, governance, and security. …
City of London, London, United Kingdom Hybrid/Remote Options
Tata Consultancy Services
Apache Spark Proven experience in Snowflake data engineering, including: Snowflake SQL, Snowpipe, Streams & Tasks, and performance optimization Integration with AWS services and orchestration tools Expertise in data integration patterns, ETL/ELT, and data pipeline orchestration Experience with data quality frameworks, metadata management, and data lineage Hands-on experience with machine learning pipelines and generative AI engineering Familiarity with DevOps …
City of London, London, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions
internal best practices What You'll Bring: Strong experience in data engineering with Microsoft Fabric Solid understanding of DataOps, CI/CD, and automation Hands-on experience with Jira, ETL/ELT, and data modelling Familiarity with Power BI, DAX, or Azure DevOps Excellent communication and stakeholder engagement skills Consulting or client-facing experience is a plus 🌱 Career Progression: Clear …
City of London, London, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions
Data Factory, Lakehouse, Power BI) Strong proficiency in SQL, DAX, and Power Query (M) Experience with Azure Data Services (Synapse, Data Lake, Azure SQL) Solid understanding of data modelling, ETL processes, and BI architecture Familiarity with CI/CD pipelines, DevOps, and version control (Git) Excellent communication and stakeholder management skills Ability to work independently and lead technical delivery Desirable …
design, build, and optimise scalable data pipelines and lakehouse architectures on Azure, enabling advanced analytics and data-driven decision making across the business. Key Responsibilities Design, develop, and maintain ETL/ELT pipelines using Azure Databricks, PySpark, and Delta Lake. Build and optimise data lakehouse architectures on Azure Data Lake Storage (ADLS). Develop high-performance data solutions using Azure …
London, South East, England, United Kingdom Hybrid/Remote Options
recruitment22
data modellers, and reporting teams to ensure the curated data supports deeper insights into corporate performance Optimise data pipelines for scalability, reliability, and maintainability using best practices (e.g., modular ETL design, version control, CI/CD) Strong understanding of Microsoft Fabric architecture and components Expertise in Microsoft Fabric Data Engineering Fabric Dataflows/Azure Data Factory Experience with Azure Synapse …
London, South East, England, United Kingdom Hybrid/Remote Options
Asset Resourcing Limited
Extensive experience in designing cloud data platforms using Azure, AWS, or exceptional on-premise design expertise. At least 5 years in data engineering or business intelligence roles. Proficiency in ETL and data pipeline design, with a technology-agnostic approach. A solid understanding of data warehouse and data lake principles. Expert SQL skills and demonstrable data modelling capabilities. About the Company …
City of London, London, United Kingdom Hybrid/Remote Options
Asset Resourcing
Extensive experience in designing cloud data platforms using Azure, AWS, or exceptional on-premise design expertise. - At least 5 years in data engineering or business intelligence roles. - Proficiency in ETL and data pipeline design, with a technology-agnostic approach. - A solid understanding of data warehouse and data lake principles. - Expert SQL skills and demonstrable data modelling capabilities. About the Company …
City of London, London, United Kingdom Hybrid/Remote Options
Billigence
to data engineering teams, driving innovation and best practices in data cloud implementations Design, develop, and implement scalable data solutions using modern cloud data platforms Architect and deliver robust ETL/ELT pipelines and data integration solutions for enterprise clients Drive technical excellence across projects, establishing coding standards, best practices, and quality assurance processes Collaborate with cross-functional teams including … data platforms: Snowflake, Databricks, AWS Redshift, Microsoft Fabric, or similar Understanding of data modelling principles, dimensional modelling, and database design Proficiency in SQL and query optimization Comprehensive knowledge of ETL/ELT processes and data pipeline architecture Excellent communication skills with the ability to collaborate across cross-functional teams Experience managing client relationships at various levels Strong problem-solving abilities …
City of London, London, United Kingdom Hybrid/Remote Options
Harrington Starr
transforming raw data into actionable intelligence, working closely with data scientists, quants, and business stakeholders to shape cutting-edge betting products. Key Responsibilities Build and optimise data pipelines and ETL workflows in AWS using Python and SQL. Partner with analysts and quants to deliver reliable datasets for predictive modelling and pricing. Design and maintain data models supporting trading, risk, and …
City of London, London, United Kingdom Hybrid/Remote Options
Recann
critical business functions. What you'll be doing Building and maintaining scalable data pipelines using Azure Data Factory, Azure Data Fabric, and Azure Synapse Analytics. Developing robust ELT/ETL processes to integrate data from multiple business systems. Ensuring data consistency, security, and compliance (including GDPR). Supporting analytics/reporting teams with clean, structured datasets. Collaborating with IT, Finance …