City of London, London, United Kingdom Hybrid / WFH Options
Focused Futures Consultancy LTD
architectures that drive measurable transformation aligned to banking sector priorities. Client Advisory & Engagement – Act as a trusted advisor, leading workshops and influencing senior stakeholders on strategy, roadmaps, and innovation. ETL/ELT & Data Engineering – Provide oversight on robust data pipeline and integration designs using SSIS, ADF, Informatica, or IBM DataStage. Data Governance & Quality – Define governance, metadata, and quality frameworks using …
best practices. Participate in Agile delivery using Azure DevOps for backlog management, sprint planning, and CI/CD. Technical Skills – Azure Data Factory: Expert in building, automating, and optimising ETL pipelines. Azure Synapse Analytics: Strong experience with dedicated SQL pools, data warehousing concepts, and performance tuning. Power BI: Advanced experience managing enterprise models, datasets, and governance processes. SQL: Expert-level …
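To illustrate the kind of Azure Data Factory automation this listing describes, a pipeline run can be triggered and monitored from Python with the azure-mgmt-datafactory SDK. This is a minimal sketch, assuming an existing factory and pipeline; the subscription, resource group, factory, and pipeline names are placeholders, not values from the advert.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholder identifiers - substitute real values for your environment.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
FACTORY_NAME = "<data-factory-name>"
PIPELINE_NAME = "<etl-pipeline-name>"

# Authenticate and create the management client.
adf_client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Trigger a pipeline run (parameters can be passed to parameterised pipelines).
run = adf_client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME, parameters={}
)

# Check the run status; production code would poll or rely on monitoring/alerts.
status = adf_client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
print(f"Pipeline run {run.run_id}: {status}")
```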
solutions within Microsoft Fabric (including Data Factory, Synapse, and OneLake). Advanced proficiency in Power BI, including DAX, Power Query (M), and data modelling. Deep understanding of data warehousing, ETL, and data lakehouse concepts. Strong working knowledge of Databricks, including Delta Lake and notebooks. Strong interpersonal skills with the ability to influence and communicate complex data topics clearly. Excellent analytical …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
environments and enjoys working on varied, impactful projects. Key Responsibilities: Design, build, and maintain scalable data pipelines using Azure Data Factory, Databricks, and SQL-based solutions. Develop and optimise ETL/ELT workflows to support analytics, reporting, and machine learning use cases. Work closely with clients to understand data requirements and translate them into robust technical solutions. Implement best practices …
City of London, London, United Kingdom Hybrid / WFH Options
Datatech Analytics
and shape the direction of the platform as it evolves, pushing the boundaries of what’s possible with data and AI. What You’ll Do: Design & build high-performance ETL/ELT pipelines in modern cloud environments (including Azure, AWS, GCP, Snowflake or Databricks). Lead CI/CD automation, environment versioning, and production deployments for data products. Integrate AI …
City of London, London, United Kingdom Hybrid / WFH Options
Oscar Associates (UK) Limited
This role has a direct opportunity to grow into a Head of Data and AI position. Key Responsibilities: Data Engineering & Architecture – Lead the development and maintenance of data pipelines, ETL processes, and warehouse architecture (GCP, Azure). Ensure high-quality, scalable, and secure data infrastructure that supports campaign reporting and advanced analytics. Design and support the delivery of AI and …
work with cutting-edge technologies like Python, PySpark, AWS EMR, and Snowflake, and collaborate across teams to ensure data is clean, reliable, and actionable. Responsibilities: - Build and maintain scalable ETL pipelines using Python and PySpark to support data ingestion, transformation, and integration - Develop and optimize distributed data workflows on AWS EMR for high-performance processing of large datasets - Design, implement …
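As a rough sketch of the PySpark work this listing describes (not the employer's actual code), an EMR step for such a pipeline might read raw files from S3, clean them, and write curated Parquet. The bucket paths and column names below are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical S3 locations for the raw and curated zones.
RAW_PATH = "s3://example-bucket/raw/orders/"
CURATED_PATH = "s3://example-bucket/curated/orders/"

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read the raw CSV files landed by upstream systems.
raw = spark.read.option("header", True).csv(RAW_PATH)

# Transform: cast types, drop duplicates, and keep only valid rows.
clean = (
    raw.withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
       .filter(F.col("amount").isNotNull())
)

# Load: write partitioned Parquet for downstream analytics and ML.
clean.write.mode("overwrite").partitionBy("order_date").parquet(CURATED_PATH)
```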
compliance requirements. Team Management: Recruit, mentor, and develop a high-performing data team, promoting a culture of continuous learning and professional growth. Management of a hybrid team, comprising internal ETL specialists and third-party resources, to ensure the Colt DCS Data Platform. Data Governance: Establish and maintain data governance frameworks, policies, and standards to ensure data quality, security, and compliance …
suited to someone who's confident communicating with data, product, and engineering teams, not a "heads-down coder" type. Top 4 Core Skills: Python - workflow automation, data processing, and ETL/ELT development. Snowflake - scalable data architecture, performance optimisation, and governance. SQL - expert-level query writing and optimisation for analytics and transformations. dbt (Data Build Tool) - modular data modelling, testing …
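Purely as an illustrative sketch of combining Python workflow automation with dbt (not the client's actual setup), recent dbt-core releases (1.5+) expose a programmatic runner that lets a Python script drive model builds and tests; the "staging" selector below is hypothetical.

```python
# Assumes dbt-core >= 1.5, which provides the programmatic dbtRunner API,
# and that the script runs inside an existing dbt project.
from dbt.cli.main import dbtRunner

runner = dbtRunner()

# Build a (hypothetical) set of staging models, then run their tests.
build_result = runner.invoke(["run", "--select", "staging"])
test_result = runner.invoke(["test", "--select", "staging"])

if not (build_result.success and test_result.success):
    raise RuntimeError("dbt run or test failed - see logs for details")
```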
Java, Scala, R, or .NET/C#. Proficiency in SQL and database design. Solid understanding of data models, data mining, analytics, and segmentation techniques. Experience with ETL pipelines and data transformation tools. Desirable/Nice to Have: Hands-on experience using Azure OpenAI, Google Colab, or other GenAI platforms. Familiarity with Microsoft Azure SQL Server, Data Warehousing …
City of London, London, United Kingdom Hybrid / WFH Options
Tata Consultancy Services
Apache Spark. Proven experience in Snowflake data engineering, including: Snowflake SQL, Snowpipe, Streams & Tasks, and performance optimization; integration with AWS services and orchestration tools. Expertise in data integration patterns, ETL/ELT, and data pipeline orchestration. Experience with data quality frameworks, metadata management, and data lineage. Hands-on experience with machine learning pipelines and generative AI engineering. Familiarity with DevOps …
City of London, London, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
internal best practices. What You’ll Bring: Strong experience in data engineering with Microsoft Fabric; solid understanding of DataOps, CI/CD, and automation; hands-on experience with Jira, ETL/ELT, and data modelling; familiarity with Power BI, DAX, or Azure DevOps; excellent communication and stakeholder engagement skills. Consulting or client-facing experience is a plus. 🌱 Career Progression: Clear …
City of London, London, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
Data Factory, Lakehouse, Power BI); strong proficiency in SQL, DAX, and Power Query (M); experience with Azure Data Services (Synapse, Data Lake, Azure SQL); solid understanding of data modelling, ETL processes, and BI architecture; familiarity with CI/CD pipelines, DevOps, and version control (Git); excellent communication and stakeholder management skills; ability to work independently and lead technical delivery. Desirable …
City of London, London, United Kingdom Hybrid / WFH Options
Asset Resourcing
Extensive experience in designing cloud data platforms using Azure, AWS, or exceptional on-premise design expertise. - At least 5 years in data engineering or business intelligence roles. - Proficiency in ETL and data pipeline design, with a technology-agnostic approach. - A solid understanding of data warehouse and data lake principles. - Expert SQL skills and demonstrable data modelling capabilities. About the Company …
Salesforce Administrator) are a plus. Preferred Skills: Strong knowledge of integration patterns and authentication protocols. Knowledge of DevOps tools. Familiarity with the finance industry is a plus. Experience with ETL tools and data visualization platforms (e.g., Tableau, Power BI). Knowledge of programming languages (e.g., Python, Apex) for data manipulation and automation. Familiarity with cloud computing concepts and technologies.
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
for scalable, reliable, and governed data solutions. Strong leadership and mentorship capabilities, guiding teams through complex deliveries, fostering collaboration, and ensuring adoption of best practices. Skilled in orchestrating complex ETL workflows, integrating hybrid cloud environments, and delivering high-quality data for advanced analytics and reporting. Experience with Power BI and building dynamic dashboards to uncover actionable insights. Excellent communication and …
City of London, London, United Kingdom Hybrid / WFH Options
Billigence
to data engineering teams, driving innovation and best practices in data cloud implementations. Design, develop, and implement scalable data solutions using modern cloud data platforms. Architect and deliver robust ETL/ELT pipelines and data integration solutions for enterprise clients. Drive technical excellence across projects, establishing coding standards, best practices, and quality assurance processes. Collaborate with cross-functional teams including … data platforms: Snowflake, Databricks, AWS Redshift, Microsoft Fabric, or similar. Understanding of data modelling principles, dimensional modelling, and database design. Proficiency in SQL and query optimization. Comprehensive knowledge of ETL/ELT processes and data pipeline architecture. Excellent communication skills with the ability to collaborate across cross-functional teams. Experience managing client relationships at various levels. Strong problem-solving abilities …
and business intelligence (BI) systems; document source-to-target mappings; re-engineer manual data flows to enable scaling and repeatable use; support the build of data streaming systems; write ETL (extract, transform, load) scripts and code to ensure the ETL process performs optimally; develop business intelligence reports that can be reused; build accessible data for analysis. Skills needed for this …
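As an illustration of the source-to-target mapping and ETL scripting duties listed above, the sketch below uses pandas with SQLite standing in for the target warehouse; the file name, columns, mapping, and table are all hypothetical.

```python
import sqlite3

import pandas as pd

# Hypothetical documented source-to-target mapping: source column -> warehouse column.
COLUMN_MAP = {"CustID": "customer_id", "OrderTotal": "order_total", "OrderDate": "order_date"}

def extract(path: str) -> pd.DataFrame:
    """Read the raw extract produced by the source system."""
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Apply the mapping, normalise types, and drop unusable rows."""
    df = df.rename(columns=COLUMN_MAP)
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    return df.dropna(subset=["customer_id", "order_date"])

def load(df: pd.DataFrame, db_path: str) -> None:
    """Append the cleaned data to the reporting table."""
    with sqlite3.connect(db_path) as conn:
        df.to_sql("orders", conn, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract("orders_export.csv")), "warehouse.db")
```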
City of London, London, United Kingdom Hybrid / WFH Options
Harrington Starr
transforming raw data into actionable intelligence, working closely with data scientists, quants, and business stakeholders to shape cutting-edge betting products. Key Responsibilities: Build and optimise data pipelines and ETL workflows in AWS using Python and SQL. Partner with analysts and quants to deliver reliable datasets for predictive modelling and pricing. Design and maintain data models supporting trading, risk, and …
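To give a flavour of the AWS/Python pipeline work this role describes (a hedged sketch, not the company's code), a small batch step might pull a raw odds feed from S3, derive a field for pricing models, and write curated Parquet back; the bucket, object keys, and columns are invented.

```python
import io

import boto3
import pandas as pd

# Hypothetical bucket and object keys - a real pipeline would take these from config.
BUCKET = "example-betting-data"
RAW_KEY = "raw/odds/2024-06-01.csv"
CURATED_KEY = "curated/odds/2024-06-01.parquet"

s3 = boto3.client("s3")

# Extract: pull the raw odds feed from S3.
obj = s3.get_object(Bucket=BUCKET, Key=RAW_KEY)
odds = pd.read_csv(io.BytesIO(obj["Body"].read()))

# Transform: derive the implied probability used by pricing and risk models.
odds["implied_prob"] = 1.0 / odds["decimal_odds"]

# Load: write curated Parquet back to S3 for analysts and quants (needs pyarrow).
buffer = io.BytesIO()
odds.to_parquet(buffer, index=False)
s3.put_object(Bucket=BUCKET, Key=CURATED_KEY, Body=buffer.getvalue())
```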
City of London, London, United Kingdom Hybrid / WFH Options
Recann
critical business functions. What you’ll be doing: Building and maintaining scalable data pipelines using Azure Data Factory, Azure Data Fabric, and Azure Synapse Analytics. Developing robust ELT/ETL processes to integrate data from multiple business systems. Ensuring data consistency, security, and compliance (including GDPR). Supporting analytics/reporting teams with clean, structured datasets. Collaborating with IT, Finance …
portfolio managers, quants, and analysts to design and deliver scalable, cloud-based data solutions that power trading and investment strategies. Key Responsibilities: Design, build, and optimise data pipelines and ETL workflows using AWS, Python, and SQL. Develop and maintain data models, ensuring accuracy and reliability of trading and market data. Deliver Power BI dashboards and reports to provide real-time …
years of experience in data engineering or a similar role; strong SQL skills and proficiency in at least one programming language (ideally Python); understanding of data warehousing concepts and ETL/ELT patterns; experience with version control (Git), testing, and code review practices; familiarity with cloud-based data environments (e.g. AWS, GCP, or Azure); exposure to modern data tools such …
City of London, London, United Kingdom Hybrid / WFH Options
Omnis Partners
or GCP). Snowflake Expertise: Strong practical experience with Snowflake components (Snowpipe, Snowpark, Tasks, Dynamic Tables, UDFs). SnowPro certification preferred. Technical Proficiency: Advanced Python and SQL skills for ETL/ELT workflows and data transformations. Cloud & Automation: Familiarity with CI/CD pipelines and automated testing for data workflows. Governance & Security: Understanding of data governance frameworks and security best …
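As a hedged sketch of the Snowpark-based ELT this listing refers to (the table and column names are hypothetical, and credentials would normally come from a secrets manager rather than literals), a transformation can be expressed in Python and pushed down to Snowflake:

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

# Placeholder connection parameters.
session = Session.builder.configs({
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}).create()

# ELT step: aggregate a hypothetical RAW_ORDERS table and materialise the result
# as a curated table; the work executes inside Snowflake, not on the client.
orders = session.table("RAW_ORDERS")
daily_totals = (
    orders.filter(col("STATUS") == "COMPLETE")
          .group_by(col("ORDER_DATE"))
          .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"))
)
daily_totals.write.save_as_table("CURATED_DAILY_TOTALS", mode="overwrite")

session.close()
```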
JavaScript is a plus but not required. Experience implementing development best practices, including automated testing and CI/CD deployment. Responsibilities: Build and maintain reliable data pipelines and ETL processes for data ingestion and transformation. Support the development and maintenance of data models and data warehouses used for reporting and analytics. Collaborate with senior engineers, analysts, and product teams …
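The automated-testing practice mentioned in this listing can be illustrated with a small pytest-style unit test around a pipeline transformation; the function, columns, and data below are hypothetical, not from the advertiser.

```python
import pandas as pd

def deduplicate_events(df: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical pipeline step: keep the latest record per event_id."""
    return (
        df.sort_values("updated_at")
          .drop_duplicates(subset="event_id", keep="last")
          .reset_index(drop=True)
    )

def test_deduplicate_keeps_latest_record():
    # Run with `pytest`: two records share event_id 1, the later one should win.
    df = pd.DataFrame({
        "event_id": [1, 1, 2],
        "updated_at": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-01"]),
        "value": [10, 20, 30],
    })
    result = deduplicate_events(df)
    assert len(result) == 2
    assert result.loc[result["event_id"] == 1, "value"].item() == 20
```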