focus on building scalable data solutions. Experience with data pipeline orchestration tools such as Dagster or similar. Familiarity with cloud platforms (e.g., AWS) and their data services (e.g., S3, Redshift, Snowflake). Understanding of data warehousing concepts and experience with modern warehousing solutions. Experience with GitHub Actions (or similar) and implementing CI/CD pipelines for data workflows …
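Several of these listings ask for pipeline orchestration experience (Dagster, Airflow and similar). At their core, such tools execute tasks in dependency order; a minimal stdlib-only sketch of that idea, with a hypothetical four-task pipeline (the task names are illustrative, not from any listing):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the tasks it depends on,
# the same shape an orchestrator's DAG takes.
pipeline = {
    "extract_orders": [],
    "extract_customers": [],
    "transform_join": ["extract_orders", "extract_customers"],
    "load_warehouse": ["transform_join"],
}

def run_order(dag):
    """Return one valid execution order for the task graph."""
    return list(TopologicalSorter(dag).static_order())

print(run_order(pipeline))
```

Real orchestrators add scheduling, retries, and observability on top, but the dependency-ordering core is the same.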
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Snap Analytics
thought leader by authoring blogs, presenting at webinars, and engaging in external speaking opportunities. Technical Delivery Excellence You'll design and optimise cloud-based data architectures (e.g. Snowflake, Databricks, Redshift, Google BigQuery, Azure Synapse) to support advanced analytics. You'll build and automate scalable, secure, and high-performance data pipelines handling diverse data sources. You'll work closely with …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Fruition Group
engineers and support their growth. Implement best practices for data security and compliance. Collaborate with stakeholders and external partners. Skills & Experience: Strong experience with AWS data technologies (e.g., S3, Redshift, Lambda). Proficient in Python, Apache Spark, and SQL. Experience in data warehouse design and data migration projects. Cloud data platform development and deployment. Expertise across data warehouse and …
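The extract/transform/load pattern these AWS-focused roles describe can be sketched in plain Python. The data and threshold below are hypothetical; in a real pipeline, extract would read from S3 and load would COPY into Redshift rather than append to a list:

```python
import csv
import io

# Hypothetical raw order data, standing in for a CSV pulled from S3.
RAW = "order_id,amount\n1,19.99\n2,5.00\n3,42.50\n"

def extract(text):
    """Parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows, min_amount=10.0):
    """Cast types and keep orders at or above the threshold."""
    return [
        {"order_id": int(r["order_id"]), "amount": float(r["amount"])}
        for r in rows
        if float(r["amount"]) >= min_amount
    ]

def load(rows, target):
    """Append transformed rows to the target store; return row count."""
    target.extend(rows)
    return len(rows)

warehouse = []  # in-memory stand-in for a warehouse table
loaded = load(transform(extract(RAW)), warehouse)
print(loaded)  # 2 rows pass the filter
```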
may either leverage third-party tools such as Fivetran, Airbyte or Stitch, or build custom pipelines. We use the main data warehouses for dbt modelling and have extensive experience with Redshift, BigQuery and Snowflake. Recently we've been rolling out a serverless implementation of dbt and progressing work on an internal product to build modular data platforms. When initially working with …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Dept Holding B.V
cloud platforms (GCP, AWS, Azure) and their data-specific services Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, dbt) Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.) Strong understanding of data modeling, data governance, and data quality principles Excellent communication skills with the ability to translate complex technical concepts for business stakeholders Strategic thinking with …
London, South East, England, United Kingdom Hybrid / WFH Options
WüNDER TALENT
input into technical decisions, peer reviews and solution design. Requirements Proven experience as a Data Engineer in cloud-first environments. Strong commercial knowledge of AWS services (e.g. S3, Glue, Redshift). Advanced PySpark and Databricks experience (Delta Lake, Unity Catalog, Databricks Jobs etc). Proficient in SQL (T-SQL/SparkSQL) and Python for data transformation and scripting. Hands-on …
Cambridge, Cambridgeshire, England, United Kingdom Hybrid / WFH Options
Oscar Technology
for clients. Present findings and recommendations to senior client stakeholders. Tech Stack You'll Work With BI Tools: Power BI, Tableau, Looker, Google Data Studio Data Warehousing: Snowflake, BigQuery, Redshift, SQL Server Languages: SQL, DAX, Python (Pandas, NumPy, Scikit-learn - desirable) CRM Systems: Salesforce, HubSpot, Dynamics 365, Zoho ETL Tools: dbt, Fivetran, Talend, Alteryx AI/ML Frameworks (Desirable …
Oxford, Oxfordshire, England, United Kingdom Hybrid / WFH Options
Oscar Technology
for clients. Present findings and recommendations to senior client stakeholders. Tech Stack You'll Work With BI Tools: Power BI, Tableau, Looker, Google Data Studio Data Warehousing: Snowflake, BigQuery, Redshift, SQL Server Languages: SQL, DAX, Python (Pandas, NumPy, Scikit-learn - desirable) CRM Systems: Salesforce, HubSpot, Dynamics 365, Zoho ETL Tools: dbt, Fivetran, Talend, Alteryx AI/ML Frameworks (Desirable …
Bristol, Avon, England, United Kingdom Hybrid / WFH Options
Oscar Technology
for clients. Present findings and recommendations to senior client stakeholders. Tech Stack You'll Work With BI Tools: Power BI, Tableau, Looker, Google Data Studio Data Warehousing: Snowflake, BigQuery, Redshift, SQL Server Languages: SQL, DAX, Python (Pandas, NumPy, Scikit-learn - desirable) CRM Systems: Salesforce, HubSpot, Dynamics 365, Zoho ETL Tools: dbt, Fivetran, Talend, Alteryx AI/ML Frameworks (Desirable …
Milton Keynes, Buckinghamshire, England, United Kingdom Hybrid / WFH Options
Oscar Technology
for clients. Present findings and recommendations to senior client stakeholders. Tech Stack You'll Work With BI Tools: Power BI, Tableau, Looker, Google Data Studio Data Warehousing: Snowflake, BigQuery, Redshift, SQL Server Languages: SQL, DAX, Python (Pandas, NumPy, Scikit-learn - desirable) CRM Systems: Salesforce, HubSpot, Dynamics 365, Zoho ETL Tools: dbt, Fivetran, Talend, Alteryx AI/ML Frameworks (Desirable …
NumPy, Pandas, SQLAlchemy) and expert-level SQL across multiple database platforms Hands-on experience with modern data stack tools including dbt, Airflow, and cloud data warehouses (Snowflake, BigQuery, Redshift) Strong understanding of data modelling, schema design, and building maintainable ELT/ETL pipelines Experience with cloud platforms (AWS, Azure, GCP) and infrastructure-as-code practices Familiarity with data …
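ELT, as opposed to ETL, loads raw data first and then transforms it inside the warehouse with SQL, which is the dbt approach several of these roles name. A minimal sketch using the stdlib `sqlite3` module as a stand-in for Snowflake/BigQuery/Redshift; the table names and rows are made up:

```python
import sqlite3

# In-memory SQLite standing in for a cloud warehouse.
conn = sqlite3.connect(":memory:")

# "Load" step: raw rows land in the warehouse untransformed.
conn.execute("CREATE TABLE raw_events (user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [(1, 10.0), (1, 5.0), (2, 7.5)],
)

# "Transform" step: a derived table built in SQL, as a dbt model would be.
conn.execute(
    """CREATE TABLE user_totals AS
       SELECT user_id, SUM(amount) AS total, COUNT(*) AS n_events
       FROM raw_events
       GROUP BY user_id"""
)

totals = dict(conn.execute("SELECT user_id, total FROM user_totals"))
print(totals)  # {1: 15.0, 2: 7.5}
```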
reliable, scalable, and well-tested solutions to automate data ingestion, transformation, and orchestration across systems. Own data operations infrastructure: Manage and optimise key data infrastructure components within AWS, including Amazon Redshift, Apache Airflow for workflow orchestration, and other analytical tools. You will be responsible for ensuring the performance, reliability, and scalability of these systems to meet the growing …
Birmingham, West Midlands, England, United Kingdom Hybrid / WFH Options
Client Server Ltd
pipelines with technologies such as Kafka. You have experience of Big Data Analytics platform integration with AWS. You have a strong understanding of RDBMS, Data Warehousing, Data Modelling, SQL (Redshift, PostgreSQL) and NoSQL databases. You have a good appreciation of software engineering best practices, DevOps, CI/CD, IaC. You have excellent communication skills. What's in it for …
Kafka, Spark Streaming, Kinesis) Familiarity with schema design and semi-structured data formats Exposure to containerisation, graph databases, or machine learning concepts Proficiency with cloud-native data tools (BigQuery, Redshift, Snowflake) Enthusiasm for learning and experimenting with new technologies Why Join Capco Deliver high-impact technology solutions for Tier 1 financial institutions Work in a collaborative, flat, and entrepreneurial …
both at the Board/Executive level and at the business unit level. Key Responsibilities Design, develop, and maintain scalable ETL pipelines using technologies like dbt, Airbyte, Cube, DuckDB, Redshift, and Superset Work closely with stakeholders across the company to gather data requirements and set up dashboards Promote a data-driven culture at Notabene and train and upskill power-users across …
of data architecture principles and how these can be practically applied. Experience with Python or other scripting languages. Good working knowledge of the AWS data management stack (RDS, S3, Redshift, Athena, Glue, QuickSight) or Google data management stack (Cloud Storage, Airflow, BigQuery, Dataplex, Looker) About Our Process We can be flexible with the structure of our interview process …
Wilmslow, England, United Kingdom Hybrid / WFH Options
The Citation Group
stack with robust, pragmatic solutions. Responsibilities: Develop and maintain ETL/ELT data pipelines using AWS services, Databricks and dbt. Manage and optimize data storage solutions such as Amazon S3, Redshift, RDS, and DynamoDB. Implement and manage infrastructure-as-code (IaC) using tools like Terraform or AWS CloudFormation. Monitor and optimize the performance, cost, and scalability of …
Cardiff, South Glamorgan, Wales, United Kingdom Hybrid / WFH Options
Octad Recruitment Consultants (Octad Ltd )
IaaS/PaaS), including Infra as Code. Strong SQL skills and proficiency in Python or PySpark. Built or maintained data lakes/warehouses using Synapse, Fabric, Databricks, Snowflake, or Redshift. Experience hardening cloud environments (NSGs, identity, Defender). Demonstrated automation of backups, CI/CD deployments, or DR workflows. Nice-to-Haves: Experience with Azure OpenAI, vector databases …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Parking Network BV
You're an experienced data engineering professional with a proven ability to lead and inspire teams. You bring deep technical skills in Python, SQL, and AWS services such as EC2, Redshift, Lambda and Kinesis, alongside strong stakeholder management and commercial awareness. You'll also bring: Proven experience designing and implementing data pipelines, ETL processes, and warehousing in cloud environments. The …
/Data Engineering/BI Engineering experience Understanding of data warehousing, data modelling concepts and structuring new data tables Knowledge of cloud-based MPP data warehousing (e.g. Snowflake, BigQuery, Redshift) Nice to have Experience developing in a BI tool (Looker or similar) Good practical understanding of version control SQL ETL/ELT knowledge, experience with DAGs to manage script …
e.g. scikit-learn, pandas, NumPy, SciPy, etc) Experience with ML frameworks such as TensorFlow, PyTorch, XGBoost, LightGBM, or similar Strong SQL skills and experience with data warehousing solutions (Snowflake, BigQuery, Redshift) Experience with cloud platforms (AWS, Azure, GCP) and their ML and AI services (SageMaker, Azure ML, Vertex AI) Knowledge of MLOps tools including Docker, MLflow, Kubeflow, or similar platforms …
City of London, London, United Kingdom Hybrid / WFH Options
I3 Resourcing Limited
Skillset: Delivery experience building solutions in Snowflake; implementing data warehousing solutions using Snowflake and AWS. Hands-on experience with AWS services such as Glue (Spark), Lambda, Step Functions, ECS, Redshift, and SageMaker. Enthusiasm for cross-functional work and adaptability beyond traditional data engineering. Examples like building APIs, integrating with microservices, or contributing to backend systems - not just data pipelines … AWS CDK, CloudFormation, Terraform. MUST HAVE SNOWFLAKE, AWS. Key Responsibilities: Design and implement scalable, secure, and cost-efficient data solutions on AWS, leveraging services such as Glue, Lambda, S3, Redshift, and Step Functions. Lead the development of robust data pipelines and analytics platforms, ensuring high availability, performance, and maintainability. Demonstrate proficiency in software engineering principles, contributing to the development … strong hands-on programming skills and software engineering fundamentals, with experience building scalable solutions in cloud environments (AWS preferred). Extensive experience in AWS services, e.g. EC2, S3, RDS, DynamoDB, Redshift, Lambda, API Gateway. Solid foundation in software engineering principles, including version control (Git), testing, CI/CD, modular design, and clean code practices. Experience developing reusable components and APIs …
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
data storage and retrieval for performance Work with batch and real-time processing frameworks Implement and manage ETL processes Use tools like Python, SQL, Spark, Airflow, Kafka, dbt, Snowflake, Redshift, BigQuery Requirements Bachelor's degree in Computer Science, Engineering, or a related field 1+ year in a Data Engineer or similar role Proficiency in SQL and Python (or another scripting …
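Batch processing, as contrasted with the real-time frameworks named above, usually means grouping a stream of records into fixed-size chunks before loading them into a warehouse. A small stdlib sketch of that micro-batching pattern (batch size and data are illustrative):

```python
from itertools import islice

def batched(stream, size):
    """Yield fixed-size lists from any iterable; the last batch may be short.

    This is the core pattern behind bulk loads into warehouses like
    Snowflake or Redshift; Spark jobs and Kafka consumers apply the
    same idea at scale.
    """
    it = iter(stream)
    while chunk := list(islice(it, size)):
        yield chunk

events = range(7)  # stand-in for a stream of records
batches = list(batched(events, 3))
print(batches)  # [[0, 1, 2], [3, 4, 5], [6]]
```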