City of London, London, United Kingdom Hybrid / WFH Options
I3 Resourcing Limited
Skillset: Delivery experience; building solutions in Snowflake; insurance experience advantageous but not necessary. MUST HAVE SNOWFLAKE, AWS, SNOWPRO CORE. Key Responsibilities: Lead the design and implementation of Snowflake [and Redshift] based data warehousing solutions within an AWS environment Mentor team members through code reviews and pair programming Build and support new AWS-native cloud data warehouse solutions Develop … experience as a data engineer with a strong focus on Snowflake and AWS services in large-scale enterprise environments Extensive experience in AWS services, e.g. EC2, S3, RDS, DynamoDB, Redshift, Lambda, API Gateway Strong SQL skills for complex data queries and transformations Python programming for data processing and analysis is a plus Strong acumen for application health through performance …
focus on building scalable data solutions. Experience with data pipeline orchestration tools such as Dagster or similar. Familiarity with cloud platforms (e.g., AWS) and their data services (e.g., S3, Redshift, Snowflake). Understanding of data warehousing concepts and experience with modern warehousing solutions. Experience with GitHub Actions (or similar) and implementing CI/CD pipelines for data workflows and …
and transformation workflows Model and maintain curated data layers to support reporting, analytics, and decision-making Ensure high availability, scalability, and performance of data warehouse systems (cloud-based, e.g., Redshift) Develop & Manage Data Products: Collaborate with business and domain experts to define and deliver high-value, reusable data products Implement best practices around versioning, SLAs, data contracts, and quality … For 3+ years of experience as a Data Engineer, with a strong focus on data warehousing and data modeling Hands-on experience with cloud-native data tech (preferably AWS: Redshift, Glue, S3, Lambda, IAM, Terraform, GitHub, CI/CD) Proficiency in SQL and Python for data processing and automation Experience working with data modeling tools and practices (e.g., dimensional …
Sub; Fluent Python and SQL skills with real-life project experience; Experience with orchestration tools such as Airflow and dbt; Experience with one of the major analytical DWHs is a plus: BigQuery, Redshift, Snowflake, Databricks, Synapse; Experience with the following technologies is a bonus: AWS (and Data-related proprietary technologies), Azure (and Data-related proprietary …
NumPy, Pandas, SQLAlchemy) and expert-level SQL across multiple database platforms Hands-on experience with modern data stack tools including dbt, Airflow, and cloud data warehouses (Snowflake, BigQuery, Redshift) Strong understanding of data modelling, schema design, and building maintainable ELT/ETL pipelines Experience with cloud platforms (AWS, Azure, GCP) and infrastructure-as-code practices Familiarity with data …
reliable, scalable, and well-tested solutions to automate data ingestion, transformation, and orchestration across systems. Own data operations infrastructure: Manage and optimise key data infrastructure components within AWS, including Amazon Redshift, Apache Airflow for workflow orchestration, and other analytical tools. You will be responsible for ensuring the performance, reliability, and scalability of these systems to meet the growing …
City of London, London, United Kingdom Hybrid / WFH Options
Client Server
pipelines with technologies such as Kafka You have experience of Big Data Analytics platform integration with AWS You have a strong understanding of RDBMS, Data Warehousing, Data Modelling, SQL (Redshift, PostgreSQL) and NoSQL databases You have a good appreciation of software engineering best practices, DevOps, CI/CD, IaC You have excellent communication skills What's in it for …
both at the Board/Executive level and at the business unit level. Key Responsibilities Design, develop, and maintain scalable ETL pipelines using technologies like dbt, Airbyte, Cube, DuckDB, Redshift, and Superset Work closely with stakeholders across the company to gather data requirements and set up dashboards Promote a data-driven culture at Notabene and train and upskill power-users across …
of data architecture principles and how these can be practically applied. Experience with Python or other scripting languages Good working knowledge of the AWS data management stack (RDS, S3, Redshift, Athena, Glue, QuickSight) or Google data management stack (Cloud Storage, Airflow, BigQuery, Dataplex, Looker) About Our Process We can be flexible with the structure of our interview process …
City of London, London, United Kingdom Hybrid / WFH Options
Anson McCade
AWS expertise and a consulting mindset as an experienced Principal Data Engineer. Key Responsibilities • Lead and deliver enterprise-grade data engineering solutions using core AWS services (S3, Glue, Lambda, Redshift, Matillion, etc.). • Define architecture, mentor large technical teams, and engage directly with senior client stakeholders. • Own technical responses for RFI/RFP processes, partnering with senior leadership and … engagements. • Strong stakeholder engagement at CxO or Director level. • Deep experience in cloud data lake architectures, ETL/ELT patterns, and metadata/data quality management. • Expertise in Matillion, Redshift, Glue, Lambda, DynamoDB, and data pipeline automation. • Familiarity with data visualisation platforms such as Quicksight, Tableau, or Looker. • Knowledge of CI/CD processes and infrastructure-as-code. • Eligible …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
decisions and mentor junior engineers Collaborate across engineering, data science, and product teams to deliver business impact Skills & Experience: Expert in SQL, dbt, and cloud data warehouses (e.g., BigQuery, Redshift) Strong experience with Airflow, Python, and multi-cloud environments (AWS/GCP) Proven background in designing and scaling analytics solutions in agile environments Proven experience as an Analytics Engineer …
City of London, London, United Kingdom Hybrid / WFH Options
Anson McCade
It suits someone who thrives in greenfield environments, enjoys client engagement, and values clean, scalable, well-documented engineering. Key Responsibilities: Design and build robust data pipelines using AWS (S3, Redshift, Glue, Lambda, Step Functions, DynamoDB). Deliver ETL/ELT solutions with Matillion and related tooling. Work closely with client teams to define requirements and hand over production-ready … solutions. Own infrastructure and deployment via CI/CD and IaC best practices. Contribute to technical strategy and mentor junior engineers. Requirements: Strong hands-on AWS experience – S3, Redshift, Glue essential. Proven experience building ETL/ELT pipelines in cloud environments. Proficient in working with structured/unstructured data (JSON, XML, CSV, Parquet). Skilled in working with relational …
drills for stream and batch environments. Architecture & Automation: Collaborate with data engineering and product teams to architect scalable, fault-tolerant pipelines using AWS services (e.g., Step Functions, EMR, Lambda, Redshift) integrated with Apache Flink and Kafka. Troubleshoot and maintain Python-based applications. Harden CI/CD for data jobs: implement automated testing of data schemas, versioned Flink jobs, and …
in the team and contribute to deep technical discussions Nice to Have Experience with operating machine learning models (e.g., MLflow) Experience with Data Lakes, Lakehouses, and Warehouses (e.g., Delta Lake, Redshift) DevOps skills, including Terraform and general CI/CD experience Previously worked in agile environments Experience with expert systems Perks & Benefits Comprehensive benefits package Fitness reimbursement Veeva Work-Anywhere …
its data-related services. Strong SQL and PySpark skills, with a focus on writing efficient, readable, modular code. Experience of development on modern cloud data platforms (e.g. Databricks, Snowflake, Redshift). Familiarity with Data Lakehouse principles, standards, and best practices. Understanding of event-driven architecture and data streaming technologies. Familiarity with Agile delivery methodologies, Azure DevOps and CI/ …
and Python skills for building and optimising data pipelines Experience working with cloud platforms (e.g., AWS, GCP, or Azure) Familiarity with modern data stack tools (e.g., dbt, Airflow, Snowflake, Redshift, or BigQuery) Understanding of data modelling and warehousing principles Experience working with large datasets and distributed systems What's in it for you? Up to £70k, hybrid working …
Job Title: Data Engineering Manager Job Type: Full Time, Permanent Location: London, Hybrid Role Purpose At Travelex we are developing modern data technology and data products. Data is central to the way we define and sell our foreign currency exchange …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
or Power BI Demonstrable experience designing and analysing A/B tests Strong storytelling and stakeholder-management skills Full UK work authorization Desirable Familiarity with cloud data platforms (Snowflake, Redshift, BigQuery) and ETL tools (dbt, Airflow) Experience with loyalty-programme analytics or CRM platforms Knowledge of machine-learning frameworks (scikit-learn, TensorFlow) for customer scoring Technical Toolbox Data & modeling … SQL, Python/R, pandas, scikit-learn Dashboarding: Tableau or Power BI ETL & warehousing: dbt, Airflow, Snowflake/Redshift/BigQuery Experimentation: A/B testing platforms (Optimizely, VWO) Desired Skills and Experience 8+ years in retail/FMCG customer insights and analytics Built customer segmentation, CLV, and propensity models in Python/R Designed and analysed A/ …
London, South East, England, United Kingdom Hybrid / WFH Options
Career Moves Group
to empower the team and partner teams. Help develop, upskill and empower team members through training and efficient knowledge management. Base qualifications: Experience in analysing and interpreting data with Redshift, Oracle, NoSQL, etc. Experience with data visualization using Tableau, Quicksight, or similar tools Experience with data modelling, warehousing and building ETL pipelines Experience in Statistical Analysis packages such as …
City of London, London, United Kingdom Hybrid / WFH Options
Fynity
Azure SQL, Tableau/Power BI. A knowledge of data modelling and of general IT architecture and systems integration is also required. Other technologies such as Azure Data Factory, Redshift, Informatica, Qlik or similar are also useful, and you will be tech-curious, keen and open-minded to learning new skills. Experience in Oracle OBIEE and/or Oracle …