scientists, analysts, and software engineers to ensure the company's data strategy underpins their innovative financial products. Key Responsibilities: Lead the design, development, and optimisation of data pipelines and ETL processes. Architect scalable data solutions to support analytics, machine learning, and real-time financial applications. Drive best practices for data engineering, ensuring high levels of data quality, governance, and security. …
London, South East, England, United Kingdom Hybrid/Remote Options
CV TECHNICAL LTD
City of London, London, United Kingdom Hybrid/Remote Options
Tata Consultancy Services
Apache Spark Proven experience in Snowflake data engineering, including: Snowflake SQL, Snowpipe, Streams & Tasks, and performance optimization Integration with AWS services and orchestration tools Expertise in data integration patterns, ETL/ELT, and data pipeline orchestration Experience with data quality frameworks, metadata management, and data lineage Hands-on experience with machine learning pipelines and generative AI engineering Familiarity with DevOps … (see sketch below)
City of London, London, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions
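The Streams & Tasks requirement in the listing above maps to a common change-data-capture pattern in Snowflake. Here is a minimal sketch using the snowflake-connector-python driver; the connection parameters and every object name (RAW_ORDERS, ORDERS_STREAM, LOAD_ORDERS, ETL_WH) are hypothetical placeholders, not details from the listing.

```python
# Minimal sketch: change-data-capture with a Snowflake stream drained by a
# scheduled task. All connection parameters and object names are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()

# A stream records inserts/updates/deletes on the source table since last read.
cur.execute("CREATE OR REPLACE STREAM ORDERS_STREAM ON TABLE RAW_ORDERS")

# A task drains the stream on a schedule, keeping the curated table current.
cur.execute("""
    CREATE OR REPLACE TASK LOAD_ORDERS
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
    AS
      INSERT INTO CURATED_ORDERS
      SELECT order_id, amount, updated_at
      FROM ORDERS_STREAM
      WHERE METADATA$ACTION = 'INSERT'
""")

cur.execute("ALTER TASK LOAD_ORDERS RESUME")  # tasks are created suspended
```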
internal best practices What You’ll Bring: Strong experience in data engineering with Microsoft Fabric Solid understanding of DataOps, CI/CD, and automation Hands-on experience with Jira, ETL/ELT, and data modelling Familiarity with Power BI, DAX, or Azure DevOps Excellent communication and stakeholder engagement skills Consulting or client-facing experience is a plus 🌱 Career Progression: Clear …
/recovery, and maintenance planning. Ability to independently gather and analyse requirements and translate them into technical database solutions. Solid understanding of relational database concepts and normalisation. Experience with ETL processes, data integration, and reporting tools (e.g., SSIS, SSRS, Power BI). Excellent problem-solving skills and attention to detail. Strong communication and interpersonal skills; ability to work effectively with …
Dimensional modelling (star schema, snowflake, denormalised structures, SCD handling) DAX, Visual Studio and data transformation logic Azure Fabric, Azure Data Factory, Synapse, Data Lakes and Lakehouse/Warehouse technologies ETL/ELT orchestration for structured and unstructured data Proficiency in: PySpark, T-SQL, Notebooks and advanced data manipulation Performance monitoring and orchestration of Fabric solutions Power BI semantic models and … (see sketch below)
City of London, London, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions
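"SCD handling" in a PySpark/Fabric context is typically implemented as a Delta Lake merge. Below is a minimal Type 2 sketch, assuming a hypothetical dim_customer Delta table with customer_id as the business key; paths and column names are illustrative, not from the listing.

```python
# Minimal SCD Type 2 sketch with Delta Lake: close out changed current rows,
# insert rows for brand-new keys. Paths, keys, and columns are illustrative.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
updates = spark.read.parquet("/staging/customers")  # hypothetical incoming batch

dim = DeltaTable.forPath(spark, "/warehouse/dim_customer")
(dim.alias("t")
    .merge(updates.alias("s"),
           "t.customer_id = s.customer_id AND t.is_current = true")
    .whenMatchedUpdate(
        condition="t.address <> s.address",       # a tracked attribute changed
        set={"is_current": "false", "end_date": "current_date()"})
    .whenNotMatchedInsert(values={
        "customer_id": "s.customer_id",
        "address": "s.address",
        "is_current": "true",
        "start_date": "current_date()",
        "end_date": "null"})
    .execute())
```

A complete Type 2 load also re-inserts the new version of each changed row (commonly by unioning a staged copy of the changed keys into the source before merging); the sketch shows only the close-out and new-key paths.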
Data Factory, Lakehouse, Power BI) Strong proficiency in SQL, DAX, and Power Query (M) Experience with Azure Data Services (Synapse, Data Lake, Azure SQL) Solid understanding of data modelling, ETL processes, and BI architecture Familiarity with CI/CD pipelines, DevOps, and version control (Git) Excellent communication and stakeholder management skills Ability to work independently and lead technical delivery Desirable …
design, build, and optimise scalable data pipelines and lakehouse architectures on Azure, enabling advanced analytics and data-driven decision making across the business. Key Responsibilities Design, develop, and maintain ETL/ELT pipelines using Azure Databricks, PySpark, and Delta Lake. Build and optimise data lakehouse architectures on Azure Data Lake Storage (ADLS). Develop high-performance data solutions using Azure … (see sketch below)
London, South East, England, United Kingdom Hybrid/Remote Options
recruitment22
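The Databricks/PySpark/Delta Lake/ADLS stack in the listing above usually boils down to a bronze-to-silver pipeline like the following sketch; the abfss:// paths, storage account, and column names are illustrative assumptions.

```python
# Minimal bronze-to-silver sketch on an Azure lakehouse: read raw ADLS data,
# apply basic cleansing, write a partitioned Delta table. Paths are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

raw = (spark.read
       .option("header", "true")
       .csv("abfss://bronze@mylake.dfs.core.windows.net/trades/"))  # hypothetical

clean = (raw
         .withColumn("trade_date", F.to_date("trade_date"))
         .withColumn("amount", F.col("amount").cast("double"))
         .dropDuplicates(["trade_id"])
         .filter(F.col("amount").isNotNull()))

(clean.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("trade_date")
      .save("abfss://silver@mylake.dfs.core.windows.net/trades/"))
```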
data modellers, and reporting teams to ensure the curated data supports deeper insights into corporate performance Optimise data pipelines for scalability, reliability, and maintainability using best practices (e.g., modular ETL design, version control, CI/CD) Strong understanding of Microsoft Fabric architecture and components Expertise in Microsoft Fabric Data Engineering Fabric Dataflows/Azure Data Factory Experience with Azure Synapse …
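"Modular ETL design, version control, CI/CD" in practice means transforms written as small pure functions that CI can unit-test against fixture data. A generic sketch of that shape follows; all function and column names are illustrative.

```python
# Modular ETL sketch: each transform is a small, pure, unit-testable function,
# so CI can validate logic without touching production data. Names are illustrative.
import pandas as pd

def standardise_dates(df: pd.DataFrame, col: str = "posted_at") -> pd.DataFrame:
    out = df.copy()
    out[col] = pd.to_datetime(out[col], errors="coerce")
    return out

def drop_invalid(df: pd.DataFrame) -> pd.DataFrame:
    return df.dropna(subset=["id", "posted_at"])

def run_pipeline(df: pd.DataFrame) -> pd.DataFrame:
    # Composition point: CI runs each step against small fixture frames.
    return drop_invalid(standardise_dates(df))
```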
Extensive experience in designing cloud data platforms using Azure, AWS, or exceptional on-premise design expertise. At least 5 years in data engineering or business intelligence roles. Proficiency in ETL and data pipeline design, with a technology-agnostic approach. A solid understanding of data warehouse and data lake principles. Expert SQL skills and demonstrable data modelling capabilities. About the Company …
London, South East, England, United Kingdom Hybrid/Remote Options
Asset Resourcing Limited
City of London, London, United Kingdom Hybrid/Remote Options
Asset Resourcing
background in hands-on development of platforms/dashboards inc SQL, ADF, PowerBI etc Management and leadership of multi-disciplinary teams within data – Data Science, Data Analysis, Engineering/ETL, Data Visualisation Experience of developing and, critically, delivering AI/NLP/ML and predictive analytics capability within commercial environments. Experience of ensuring that all data-related activities comply with …
technology domains including CMS, CRM, Martech platforms, data pipelines, analytics, and cloud services. Exposure to CDPs (e.g., Bloomreach, Segment, BlueConic, Tealium) and data integration pipelines. Understanding of data modelling, ETL processes, and basic analytics or BI tools like Power BI or Tableau. Experience in the sports, entertainment, or fan engagement domain is a strong plus. What can we offer you …
to data engineering teams, driving innovation and best practices in data cloud implementations Design, develop, and implement scalable data solutions using modern cloud data platforms Architect and deliver robust ETL/ELT pipelines and data integration solutions for enterprise clients Drive technical excellence across projects, establishing coding standards, best practices, and quality assurance processes Collaborate with cross-functional teams including … data platforms: Snowflake, Databricks, AWS Redshift, Microsoft Fabric, or similar Understanding of data modelling principles, dimensional modelling, and database design Proficiency in SQL and query optimization Comprehensive knowledge of ETL/ELT processes and data pipeline architecture Excellent communication skills with the ability to collaborate across cross-functional teams Experience managing client relationships at various levels Strong problem-solving abilities …
City of London, London, United Kingdom Hybrid/Remote Options
Billigence
and business intelligence (BI) systems document source-to-target mappings re-engineer manual data flows to enable scaling and repeatable use support the build of data streaming systems write ETL (extract, transform, load) scripts and code to ensure the ETL process performs optimally develop business intelligence reports that can be reused build accessible data for analysis Skills needed for this …
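The "document source-to-target mappings" and "write ETL scripts" duties above pair naturally: keeping the mapping as data makes it both the documentation and the driver of the load. A sketch with hypothetical file and column names:

```python
# Sketch: a source-to-target mapping kept as data, so it doubles as documentation
# and drives the ETL. File and column names are illustrative placeholders.
import csv

MAPPING = {                    # source column -> (target column, transform)
    "cust_nm": ("customer_name", str.strip),
    "ord_amt": ("order_amount",  float),
    "ord_dt":  ("order_date",    str),      # parsed downstream
}

def transform_row(row: dict) -> dict:
    return {target: fn(row[src]) for src, (target, fn) in MAPPING.items()}

with open("source.csv", newline="") as f:   # hypothetical extract
    rows = [transform_row(r) for r in csv.DictReader(f)]
```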
technical solutions. Maintain clear documentation and contribute to internal best practices. Requirements Strong hands-on experience with PySpark (RDDs, DataFrames, Spark SQL). Proven ability to build and optimise ETL pipelines and dataflows. Familiar with Microsoft Fabric or similar lakehouse/data platform environments. Experience with Git, CI/CD pipelines, and automated deployment. Knowledge of market data, transactional systems …
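For the PySpark requirement above, the same transformation is routinely expressed through either the DataFrame API or Spark SQL; a minimal sketch of both idioms, with illustrative paths and columns:

```python
# Sketch: one aggregation written two ways — DataFrame API and Spark SQL,
# the two idioms the role above names. Table and column names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
orders = spark.read.parquet("/data/orders")  # hypothetical

# DataFrame API
daily = (orders.groupBy("order_date")
               .agg(F.sum("amount").alias("total")))

# Equivalent Spark SQL
orders.createOrReplaceTempView("orders")
daily_sql = spark.sql(
    "SELECT order_date, SUM(amount) AS total FROM orders GROUP BY order_date")
```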
in Python and SQL Experience with modern data tools (dbt, Airflow, Prefect, Dagster, etc.) Knowledge of cloud platforms like AWS , GCP , or Azure An understanding of data modelling and ETL best practices Curiosity, creativity, and a mindset that thrives in fast-moving environments Why You’ll Love It Work on meaningful data challenges that directly impact their products Small, high … (see sketch below)
City of London, London, United Kingdom Hybrid/Remote Options
Harrington Starr
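Of the orchestrators the listing above names, Airflow's TaskFlow API gives the shortest illustration of an extract-transform-load DAG. A minimal sketch — the DAG id, schedule, and payload are illustrative, and the syntax assumes Airflow 2.4+:

```python
# Minimal Airflow (TaskFlow API) sketch of an extract-transform-load DAG.
# IDs, schedule, and payloads are illustrative placeholders.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl():
    @task
    def extract() -> list[dict]:
        return [{"id": 1, "value": 10}]           # stand-in for a real source

    @task
    def transform(rows: list[dict]) -> list[dict]:
        return [{**r, "value": r["value"] * 2} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        print(f"loading {len(rows)} rows")        # stand-in for a warehouse write

    load(transform(extract()))

example_etl()  # instantiating the DAG registers it with the scheduler
```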
transforming raw data into actionable intelligence, working closely with data scientists, quants, and business stakeholders to shape cutting-edge betting products. Key Responsibilities Build and optimise data pipelines and ETL workflows in AWS using Python and SQL. Partner with analysts and quants to deliver reliable datasets for predictive modelling and pricing. Design and maintain data models supporting trading, risk, and …
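The AWS + Python + SQL pipeline work described above often starts with something like the following sketch: pull a raw object from S3 with boto3, derive a column, and stage a Parquet copy for SQL engines to query. The bucket names, keys, and the implied-probability column are hypothetical, not from the listing.

```python
# Sketch: raw-to-curated hop on AWS — fetch a CSV from S3, derive a feature,
# write Parquet back for SQL consumers. Buckets/keys are illustrative.
import io
import boto3
import pandas as pd

s3 = boto3.client("s3")
obj = s3.get_object(Bucket="raw-feeds", Key="odds/2024-06-01.csv")  # hypothetical
df = pd.read_csv(io.BytesIO(obj["Body"].read()))

df["implied_prob"] = 1 / df["decimal_odds"]   # simple derived column

buf = io.BytesIO()
df.to_parquet(buf, index=False)               # requires pyarrow or fastparquet
s3.put_object(Bucket="curated-feeds",
              Key="odds/2024-06-01.parquet",
              Body=buf.getvalue())
```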