Cambridge, England, United Kingdom Hybrid / WFH Options
Axiom Software Solutions Limited
a week) Duration: Long Term B2B Contract. Job Description: The ideal candidate will have a minimum of 5 years of experience working with Snowflake, dbt, Python, and AWS to deliver … ETL/ELT pipelines using various resources. Proficiency in Snowflake data warehouse architecture. Design, build, and optimize ETL/ELT pipelines using dbt (Data Build Tool) and Snowflake. Experience with dbt for data transformation and modeling. Implement data transformation workflows using dbt (Core/Cloud). Strong … Python to create automation scripts and optimize data processing tasks. Proficiency in SQL performance tuning and query optimization techniques in Snowflake. Troubleshoot and optimize dbt models and Snowflake performance. Knowledge of CI/CD and version control (Git) tools. Experience with orchestration tools such as Airflow. Strong analytical and problem …
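As a rough illustration of the dbt-plus-Airflow orchestration work listings like this describe, here is a minimal sketch of an Airflow 2.x DAG that runs and then tests a dbt project. The DAG id, schedule, and project paths are hypothetical, and the example assumes dbt is installed on the Airflow worker.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical paths; in practice these point at the team's dbt project and profiles.
DBT_ARGS = "--project-dir /opt/dbt/analytics --profiles-dir /opt/dbt"

with DAG(
    dag_id="dbt_daily_build",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Build the warehouse models, then run the project's data tests.
    dbt_run = BashOperator(task_id="dbt_run", bash_command=f"dbt run {DBT_ARGS}")
    dbt_test = BashOperator(task_id="dbt_test", bash_command=f"dbt test {DBT_ARGS}")

    dbt_run >> dbt_test
```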
London, England, United Kingdom Hybrid / WFH Options
TieTalent
control, CI/CD, and Infrastructure-as-Code (Terraform or similar). Background in fintech or payments. Knowledge of streaming frameworks (Kafka, Kinesis). Experience with dbt and data quality frameworks (e.g., Great Expectations). What's on Offer: Competitive base salary + bonus. Flexible hybrid working model (Liverpool Street office). Private healthcare …
engineering, or similar roles. Hands-on expertise with Python (NumPy/pandas) and SQL. Proven experience designing and building robust ETL/ELT pipelines (dbt, Airflow). Strong knowledge of data pipelining, schema design, and cloud platforms (e.g., Snowflake, AWS). Excellent communication skills and the ability to translate technical …
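For the Python (NumPy/pandas) transformation work this listing asks for, a minimal sketch of a single ETL step might look like the following; the orders dataset and its column names are hypothetical.

```python
import numpy as np
import pandas as pd


def transform_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean and enrich a raw orders extract before loading it to the warehouse."""
    df = raw.copy()
    # Normalise column names and parse timestamps defensively.
    df.columns = [c.strip().lower() for c in df.columns]
    df["order_ts"] = pd.to_datetime(df["order_ts"], utc=True, errors="coerce")
    # Drop rows that fail basic validity checks.
    df = df.dropna(subset=["order_id", "order_ts"])
    # Vectorised arithmetic with NumPy for a derived measure.
    df["net_amount"] = np.round(df["gross_amount"] - df["discount"], 2)
    return df
```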
City of London, London, United Kingdom Hybrid / WFH Options
83data
CD, and Infrastructure-as-Code (Terraform or similar). Desirable: Background in fintech or payments. Knowledge of streaming frameworks (Kafka, Kinesis). Experience with dbt and data quality frameworks (e.g., Great Expectations). What's on Offer: Competitive base salary + bonus. Flexible hybrid working model (Liverpool Street office). Private …
City of London, London, United Kingdom Hybrid / WFH Options
Hartree Partners
PyTorch, or similar). Experience validating models with historical data and communicating results to non-specialists. Exposure to real-time data engineering (Kafka, Airflow, dbt). Track record of turning research code into production services (CI/CD, containers, etc.). Strong SQL and data-management skills; experience querying large analytical databases (Snowflake …
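As a sketch of the "strong SQL against large analytical databases" requirement, the snippet below queries Snowflake from Python with the snowflake-connector-python package and pulls the result into pandas. The account, credentials, warehouse, and table names are hypothetical, and fetch_pandas_all requires the connector's pandas extras.

```python
import snowflake.connector

# Hypothetical connection details; real credentials would come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account",
    user="analytics_ro",
    password="***",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="MARTS",
)

query = """
    SELECT trade_date, instrument, SUM(notional) AS total_notional
    FROM fct_trades
    WHERE trade_date >= DATEADD(day, -30, CURRENT_DATE)
    GROUP BY trade_date, instrument
"""

try:
    cur = conn.cursor()
    cur.execute(query)
    df = cur.fetch_pandas_all()  # needs snowflake-connector-python[pandas]
    print(df.head())
finally:
    conn.close()
```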
London, England, United Kingdom Hybrid / WFH Options
Starling Bank
either AWS or GCP. Experience with Terraform to define and manage cloud infrastructure through code. Desirables: Experience in SQL-based transformation workflows, particularly using dbt in BigQuery. Experience with containerisation technologies (Docker, Kubernetes). Familiarity with streaming data ingestion technologies (Kafka, Debezium). Exposure to data management and Linux administration. Interview process …
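To illustrate the kind of SQL-based transformation in BigQuery this listing mentions, here is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical, and the example assumes application-default credentials are available.

```python
from google.cloud import bigquery

# Hypothetical GCP project; authentication uses application-default credentials.
client = bigquery.Client(project="my-gcp-project")

sql = """
CREATE OR REPLACE TABLE analytics.daily_active_users AS
SELECT DATE(event_ts) AS activity_date,
       COUNT(DISTINCT user_id) AS dau
FROM raw.events
WHERE event_name = 'session_start'
GROUP BY activity_date
"""

job = client.query(sql)  # submits the transformation job
job.result()             # blocks until it finishes
print(f"Processed {job.total_bytes_processed} bytes")
```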
of experience in a Data Engineering role. Strong programming skills in Python, Scala, or Java. Solid experience with ETL tools (e.g., Apache Airflow, dbt, Talend). Proficiency with SQL and relational/non-relational databases (e.g., PostgreSQL, BigQuery, Snowflake, MongoDB). Experience working with cloud environments and data services …
City of London, London, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
risk modelling, or algorithmic trading is a plus. Familiarity with cloud platforms (AWS, GCP, or Azure) and modern data stack tools (e.g., Apache Airflow, dbt, Snowflake). Excellent communication and stakeholder management skills. Must be available to work onsite in London 3 days per week. What's on Offer: Competitive …
Bristol, England, United Kingdom Hybrid / WFH Options
Duel
with Snowflake. You understand event-driven architectures and real-time data processing. You have experience implementing and maintaining scalable data pipelines using tools like dbt, Apache Airflow, or similar. You are happy working with both structured and semi-structured data. You are comfortable working with data engineering and scripting …
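For the event-driven, real-time processing side of this role, a minimal consumer sketch using the kafka-python package might look like the following; the topic name, brokers, consumer group, and message fields are all hypothetical.

```python
import json

from kafka import KafkaConsumer  # kafka-python package

# Hypothetical topic and brokers; values are deserialised from JSON as they arrive.
consumer = KafkaConsumer(
    "order-events",
    bootstrap_servers=["localhost:9092"],
    group_id="analytics-ingest",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # A real pipeline would land this in a staging table or hand it to a stream processor.
    print(event.get("order_id"), event.get("status"))
```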
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Enso Recruitment
pipelines and infrastructure to ensure efficient data flow across the business. Key Responsibilities: Develop, support and optimise robust data solutions using tools like Snowflake, dbt, Fivetran, and Azure Cloud services. Collaborate with cross-functional teams to translate business needs into actionable data architecture. Design and manage data pipelines and integration …
London, England, United Kingdom Hybrid / WFH Options
Lloyds Banking Group
Agile environment. Deep technical expertise in software and data engineering, programming languages (Python, Java, etc.). Understanding of orchestration (Composer, DAGs), data processing (Dataflow, dbt), and database capabilities (e.g. BigQuery, Cloud SQL, Bigtable). Knowledge of container technologies (Docker, Kubernetes), IaC (Terraform) and experience with cloud platforms such as GCP. Detailed …
London, England, United Kingdom Hybrid / WFH Options
Deel
a Data Engineer on the Data Platform team, you will: Design, implement, and manage scalable and efficient data pipelines using tools like Snowflake, Airflow, dbt, and Fivetran. Experience with similar technologies is also valued, as we continuously evolve our stack. Collaborate with cross-functional teams to understand data requirements and …
City of London, London, United Kingdom Hybrid / WFH Options
Block MB
data best practices across teams. Champion data quality, governance, and documentation. Key Requirements: Strong experience with Python, SQL, and modern ETL tools (e.g., Airflow, dbt). Solid grasp of cloud platforms (AWS/GCP/Azure) and data warehouses (e.g., BigQuery, Snowflake). Familiarity with streaming technologies (Kafka, Kinesis, etc.). Passion for …
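As a small illustration of the data quality checks this listing emphasises (the sort of assertions a framework such as Great Expectations formalises), here is a plain pandas sketch; the batch file and column names are hypothetical.

```python
import pandas as pd


def check_quality(df: pd.DataFrame) -> list[str]:
    """Return human-readable failures; an empty list means the batch passes."""
    failures = []
    if df["customer_id"].isna().any():
        failures.append("customer_id contains nulls")
    if df["order_id"].duplicated().any():
        failures.append("order_id is not unique")
    if (df["amount"] < 0).any():
        failures.append("amount contains negative values")
    return failures


batch = pd.read_parquet("orders_batch.parquet")  # hypothetical batch extract
problems = check_quality(batch)
if problems:
    raise ValueError("Data quality checks failed: " + "; ".join(problems))
```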
Cardiff, Wales, United Kingdom Hybrid / WFH Options
JR United Kingdom
efficient code and comfortable undertaking system optimisation and performance tuning tasks. Comfortable working with relational databases such as Oracle, PostgreSQL, MySQL. Has exposure to dbt and data quality test frameworks. Has awareness of Infrastructure as Code such as Terraform and Ansible. BENEFITS: Company laptop supplied. Bonus scheme. Cycle to work …
cloud-based database services (Snowflake). Knowledge of data warehousing, orchestration and pipeline technologies (Apache Airflow/Kafka, Azure Data Factory, etc.). Experience with dbt for modelling. Server administration and networking fundamentals.