help them further grow their already exciting business. Within this role, you will be responsible for maintaining, supporting and expanding existing data pipelines using DBT, Snowflake and S3. You will also be tasked with implementing standardised data ingress/egress pipelines, coupled with onboarding new, disparate data sets sourced from …
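To make that stack concrete: below is a minimal sketch of the kind of standardised S3-to-Snowflake ingress step this listing describes, using the snowflake-connector-python library. The account, credentials, stage and table names are illustrative assumptions, not details from the posting.

```python
# Hypothetical standardised ingress step: copy staged S3 files into a raw
# Snowflake table. All identifiers below are assumptions for illustration.
import snowflake.connector

def load_s3_batch(table: str, s3_prefix: str) -> None:
    conn = snowflake.connector.connect(
        account="my_account",      # assumed account identifier
        user="etl_user",           # assumed service user
        password="...",            # in practice, pulled from a secret store
        warehouse="LOAD_WH",
        database="RAW",
        schema="LANDING",
    )
    try:
        cur = conn.cursor()
        # Assumes an external stage @S3_STAGE already points at the bucket.
        cur.execute(
            f"COPY INTO {table} FROM @S3_STAGE/{s3_prefix} "
            "FILE_FORMAT = (TYPE = 'JSON') ON_ERROR = 'ABORT_STATEMENT'"
        )
    finally:
        conn.close()
```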
variety of databases Working knowledge of one or more of the cloud platforms (AWS, Azure, GCP) Experience building ETL/ELT pipelines, specifically using DBT for structured and semi-structured datasets Any orchestration tooling such as Airflow, Dagster, Azure Data Factory, Fivetran etc. Nice to have: Software …
Snowflake, or BigQuery. Strong command of SQL and programming languages like Python, Scala, or Java. Familiarity with ETL/ELT tools (e.g., Airflow, Fivetran, dbt) and cloud data stacks (AWS/GCP/Azure). A deep understanding of data modelling, access controls, and infrastructure performance tuning. A background in …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Duel
with Snowflake. You understand event-driven architectures and real-time data processing. You have experience implementing and maintaining scalable data pipelines using tools like dbt, Apache Airflow, or similar. You have no issue working with either structured or semi-structured data. You are comfortable working with data engineering and scripting …
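As a rough illustration of the dbt-plus-Airflow pattern this listing mentions, here is a minimal DAG sketch, assuming Airflow 2.x, the dbt CLI installed on the worker, and an invented project path:

```python
# Minimal sketch: schedule a daily dbt build and test with Airflow 2.x.
# The project path and schedule are assumptions for illustration.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_build",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/project",  # assumed path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/project",
    )
    dbt_run >> dbt_test  # run dbt tests only after the build completes
```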
Belfast, Antrim, United Kingdom Hybrid / WFH Options
Enso Recruitment
pipelines and infrastructure to ensure efficient data flow across the business. Key Responsibilities Develop, support and optimise robust data solutions using tools like Snowflake, dbt, Fivetran, and Azure Cloud services Collaborate with cross-functional teams to translate business needs into actionable data architecture Design and manage data pipelines and integration …
languages commonly used for data work (e.g., Python, Java, Scala) Deep understanding of ETL/ELT tools and workflow orchestration platforms (e.g., Airflow, Fivetran, dbt) Proficiency with SQL and solid grounding in data modeling concepts Familiarity with cloud services and architectures (AWS, GCP, or Azure) Proven experience managing or mentoring …
London, South East England, United Kingdom Hybrid / WFH Options
Block MB
data best practices across teams Champion data quality, governance, and documentation Key Requirements: Strong experience with Python, SQL, and modern ETL tools (e.g., Airflow, dbt) Solid grasp of cloud platforms (AWS/GCP/Azure) and data warehouses (e.g., BigQuery, Snowflake) Familiarity with streaming technologies (Kafka, Kinesis, etc.) Passion for …
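For the streaming technologies named above, a minimal sketch of a Kafka consumer using the kafka-python client; the topic name and broker address are assumptions:

```python
# Hypothetical Kafka consumer: reads JSON events from an assumed topic.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",                                # assumed topic name
    bootstrap_servers=["localhost:9092"],    # assumed broker address
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",            # replay from the start on first run
)

for message in consumer:
    # message.value is already a decoded dict thanks to the deserializer.
    print(message.value)
```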
efficient code and comfortable undertaking system optimisation and performance tuning tasks Comfortable working with relational databases such as Oracle, PostgreSQL, MySQL Has exposure to DBT and data quality test frameworks Has awareness of Infrastructure as Code such as Terraform and Ansible BENEFITS Competitive Salary. Company Laptop supplied. Bonus Scheme. …
Cambridge, Cambridgeshire, UK Hybrid / WFH Options
Intellect Group
Nice to Have Experience working within a consultancy or client-facing environment Familiarity with tools and frameworks such as: Databricks PySpark Pandas Airflow or dbt Experience deploying solutions using cloud-native services (e.g., BigQuery, AWS Glue, S3, Lambda) What’s On Offer Fully remote working with the flexibility to work …
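As a sketch of the PySpark side of that toolset, a small clean-and-aggregate job; the S3 paths and column names are invented for illustration:

```python
# Hypothetical PySpark step: clean raw orders and aggregate daily totals.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-orders").getOrCreate()

raw = spark.read.parquet("s3://example-bucket/raw/orders/")  # assumed path

daily = (
    raw.dropna(subset=["order_id"])                    # drop malformed rows
       .withColumn("order_date", F.to_date("created_at"))
       .groupBy("order_date")
       .agg(F.sum("amount").alias("total_amount"))
)

daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_orders/")
```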
Nottinghamshire, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (Python, Data Modelling, ETL/ELT, Apache Airflow, DBT, AWS) Enterprise-scale tech firm Up to £70,000 plus benefits - FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly …
London, South East England, United Kingdom Hybrid / WFH Options
Winston Fox
the Data space. This role will also allow the successful individual to cross-train into modern Data Engineering tools and technologies such as Airflow, dbt and Snowflake, as well as further develop their skills in Python, SQL and Market Data platforms. The firm works on a hybrid working schedule (three days per week …
Salisbury, England, United Kingdom Hybrid / WFH Options
Ascentia Partners
practices. Nice-to-Have Skills Exposure to AWS Redshift, Glue, or Snowflake. Familiarity with BigQuery and Google Analytics APIs. Proficiency in Python, PySpark, or dbt for data transformations. Background in insurance, especially in pricing analytics or actuarial data.
a leader within the business over time. The current tech stack is varied, made up of TypeScript, Python, PostgreSQL, Redis and DBT on AWS. You’ll be encouraged to take ownership of products, and you’ll be given this autonomy from the co-founders. The products handle …
Edinburgh, Central Scotland, United Kingdom Hybrid / WFH Options
Net Talent
a related field, with a focus on building scalable data systems and platforms. Strong expertise with modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT …
with Python for data engineering tasks. Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker). Exposure to specific orchestration tools (e.g., Airflow, dbt). Experience working in Agile/Scrum development methodologies. Experience with Big Data Technologies & Frameworks Join Us! This role can be based in either of …
timelines and quality standards are met. Required Skills & Experience: 5+ years' experience as a Data Analyst Strong skills in Python, SQL and tools like dbt, Snowflake, AWS S3 and SQL Server. Solid understanding of financial instruments such as Equities, Futures, Forwards, CDS, IRS and ETFs with deep knowledge in at …
Role Job Description: AWS Services: Glue, Lambda, IAM, Service Catalog, CloudFormation, Lake Formation, SNS, SQS, EventBridge Language & Scripting: Python and Spark ETL: DBT Good to Have: Airflow, Snowflake, Big Data (Hadoop), and Teradata Responsibilities: Serve as the primary point of contact for all AWS-related data initiatives and …
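A rough sketch of what a Glue PySpark job in that stack might look like; the bucket names and filter predicate are assumptions, not details from the role:

```python
# Hypothetical AWS Glue job: read raw JSON from S3, drop records missing an
# id, and write curated Parquet. Paths and field names are assumptions.
import sys

from awsglue.context import GlueContext
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])  # standard Glue job argument
glue_context = GlueContext(SparkContext.getOrCreate())

frame = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-raw-bucket/events/"]},
    format="json",
)

# Illustrative predicate: keep only records carrying an event_id field.
cleaned = frame.filter(f=lambda rec: rec["event_id"] is not None)

glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/events/"},
    format="parquet",
)
```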
effectively across teams. Bonus Points For: Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker). Experience with specific orchestration tools (e.g., Airflow, dbt). Experience working in Agile/Scrum development methodologies. Experience with Big Data Technologies & Frameworks Join Us! This role can be based in either of …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Hargreaves Lansdown Asset Management Limited
low-latency data pipeline with the following skills. Data Engineering Skills: modelling; orchestration using Apache Airflow; cloud-native streaming pipelines using Flink, Beam etc.; DBT; Snowflake. Infrastructure Skills: Terraform. DevOps Skills: experienced in developing CI/CD pipelines. Integration Skills: REST and Graph APIs. (Desirable) Serverless API development (e.g. Lambda …
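As an illustration of the cloud-native streaming requirement, a minimal streaming pipeline in Apache Beam's Python SDK; the Pub/Sub topics and window size are assumptions:

```python
# Hypothetical streaming sketch with Apache Beam: windowed event counts
# between two assumed Pub/Sub topics.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
from apache_beam.transforms import window

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(topic="projects/demo/topics/events")
        | "Decode" >> beam.Map(lambda b: b.decode("utf-8"))
        | "Window" >> beam.WindowInto(window.FixedWindows(60))  # 60s windows
        | "Count" >> beam.combiners.Count.PerElement()
        | "Format" >> beam.Map(lambda kv: f"{kv[0]},{kv[1]}".encode("utf-8"))
        | "Write" >> beam.io.WriteToPubSub(topic="projects/demo/topics/counts")
    )
```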
orchestration and modelling. Lead the team in building and maintaining robust data pipelines, data models, and infrastructure using tools such as Airflow, AWS Redshift, DBT and Looker. Ensure the team follows agile methodologies to improve delivery cadence and responsiveness. Contribute to hands-on coding, particularly in areas requiring architectural input, prototyping …