Data/Analytics Engineer. You’ll transform raw data into meaningful, high-quality datasets that power applications across the company. You’ll build scalable dbt models on top of Databricks and PostgreSQL, partnering with data and business stakeholders to define metrics, track performance, and ensure data quality. This role would … years professional experience as a Data Engineer or Analytics Engineer.
• Strong proficiency in SQL, with proven experience writing complex, performant queries
• Experience working with dbt in production
• Experience working with Databricks and/or PostgreSQL
• Solid understanding of data testing, observability, and data quality assurance
• Familiarity with Git and modern …
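To make the "data testing, observability, and data quality assurance" requirement concrete, here is a minimal sketch of standalone checks run against a PostgreSQL warehouse. The table names, columns and the WAREHOUSE_DSN environment variable are invented for illustration; in a dbt project these rules would more typically be declared as schema tests.

```python
"""Minimal data quality checks against a PostgreSQL warehouse (illustrative)."""
import os

import psycopg2  # pip install psycopg2-binary

# Hypothetical checks: (table, column, rule)
CHECKS = [
    ("analytics.fct_orders", "order_id", "not_null"),
    ("analytics.fct_orders", "order_id", "unique"),
    ("analytics.dim_customers", "customer_id", "not_null"),
]

QUERIES = {
    "not_null": "SELECT COUNT(*) FROM {table} WHERE {column} IS NULL",
    "unique": (
        "SELECT COUNT(*) FROM (SELECT {column} FROM {table} "
        "GROUP BY {column} HAVING COUNT(*) > 1) d"
    ),
}

# Connection string is an assumption, e.g. postgresql://user:pass@host/db
conn = psycopg2.connect(os.environ["WAREHOUSE_DSN"])
with conn, conn.cursor() as cur:
    for table, column, rule in CHECKS:
        cur.execute(QUERIES[rule].format(table=table, column=column))
        failures = cur.fetchone()[0]
        status = "PASS" if failures == 0 else f"FAIL ({failures} offending rows)"
        print(f"{rule:<9} {table}.{column}: {status}")
```

Run as a scheduled job or CI step, a script like this gives a cheap safety net alongside dbt's built-in tests.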
move the needle, you'll love it here.
What You'll Do
• Design models that hold up under pressure: Own and develop analytics-ready dbt models that transform raw data into clean, documented, and trusted sources of truth.
• Get the right data flowing: Use Fivetran and custom pipelines to ingest … environments where you've had to balance speed, quality, and scale.
• Proven ability to write clean, efficient SQL and Python, and to build robust dbt models that support scalable data workflows in production.
• Comfortable working across modern data stacks, including ELT tools, cloud warehouses, and BI platforms - with the ability … well.
Current Stack
We work with a modern data stack, but we're open to evolving as we grow. Currently, that includes: Fivetran, BigQuery, dbt, Lightdash, Hex, Heap.
Benefits
28 days holiday per annum + Bank holidays, with the option to roll up to 5 days per annum. Employee Share …
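As a sketch of what "getting the right data flowing" with Fivetran and BigQuery can look like in practice, the snippet below checks how recently each raw table was synced. The dataset and table names are assumptions, and it relies on the _fivetran_synced metadata column that Fivetran typically adds to the tables it loads; adjust if your connectors differ.

```python
"""Illustrative freshness check on Fivetran-loaded BigQuery tables."""
import datetime as dt

from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # uses application default credentials

# Hypothetical raw tables landed by Fivetran.
TABLES = ["raw_shop.orders", "raw_shop.customers"]

for table in TABLES:
    query = f"SELECT MAX(_fivetran_synced) AS last_synced FROM `{table}`"
    rows = list(client.query(query).result())
    last_synced = rows[0].last_synced
    age = dt.datetime.now(dt.timezone.utc) - last_synced
    status = "fresh" if age < dt.timedelta(hours=24) else "STALE"
    print(f"{table}: last synced {last_synced} ({status})")
```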
ecosystem: We primarily work in python, go and SQL. Our code (tracked via github) deploys to GCP (Bigquery, Airflow/Composer, GKE, Cloud Run), dbt Cloud and Azure via terraform. We recognise this is a broad list; if you're not deeply familiar with some of these tools, we'd … otherwise), particularly for applications in support of customer personalisation
• Surfacing analytical products into data visualisation platforms (we use PowerBI) via semantic models (ideally within dbt)
• Improving engineering excellence (enforcement via CI/CD pipelines, KEDBs, SOPs, best practices)
Why apply? We believe that diversity fuels innovation. At Pets at Home …
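The "enforcement via CI/CD pipelines" point can be illustrated with a small guard script that reads dbt's compiled manifest and fails the build when a model ships without a description or tests. This is only a sketch: the target/manifest.json path and the exact policy are assumptions, and many teams enforce the same rules with dbt packages instead.

```python
"""CI guard: fail the pipeline if any dbt model lacks a description or tests."""
import json
import sys
from collections import defaultdict
from pathlib import Path

MANIFEST = Path("target/manifest.json")  # produced by `dbt compile` / `dbt build`


def main() -> int:
    manifest = json.loads(MANIFEST.read_text())
    nodes = manifest["nodes"]

    # Count the tests attached to each model via depends_on.
    tests_per_model = defaultdict(int)
    for node in nodes.values():
        if node["resource_type"] == "test":
            for parent in node["depends_on"]["nodes"]:
                tests_per_model[parent] += 1

    failures = []
    for unique_id, node in nodes.items():
        if node["resource_type"] != "model":
            continue
        if not node.get("description"):
            failures.append(f"{unique_id}: missing description")
        if tests_per_model[unique_id] == 0:
            failures.append(f"{unique_id}: no tests defined")

    for line in failures:
        print(line)
    return 1 if failures else 0


if __name__ == "__main__":
    sys.exit(main())
```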
SQL, Python and Airflow; Experience in Kubernetes, Docker, Django, Spark and related monitoring tools for DevOps a big plus (e.g. Grafana, Prometheus); Experience with dbt for pipeline modeling also beneficial; Skilled at shaping needs into a solid set of requirements and designing scalable solutions to meet them; Able to quickly … for a more effective platform; Open to traveling to Octopus offices across Europe and the US.
Our Data Stack:
• SQL-based pipelines built with dbt on Databricks
• Analysis via Python Jupyter notebooks
• Pyspark in Databricks workflows for heavy lifting
• Streamlit and Python for dashboarding
• Airflow DAGs with Python for ETL
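As an illustration of the "Airflow DAGs with Python for ETL" item, here is an Airflow 2.x-style DAG sketch that chains a raw ingest, a dbt build and a validation step. The DAG name, schedule, script paths and dbt directories are all hypothetical.

```python
"""Illustrative Airflow DAG: ingest -> dbt build -> validate."""
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def validate_load(**context) -> None:
    # Placeholder for a post-load check (row counts, freshness, etc.).
    print("load validated for", context["ds"])


with DAG(
    dag_id="daily_customer_etl",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["etl", "dbt"],
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw",
        bash_command="python /opt/pipelines/ingest_raw.py",  # hypothetical script
    )
    transform = BashOperator(
        task_id="dbt_build",
        bash_command="dbt build --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )
    check = PythonOperator(task_id="validate_load", python_callable=validate_load)

    ingest >> transform >> check
```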
that clients are invoiced correctly. We are looking for someone who ideally has experience of working in a Data Team and is proficient with DBT, BigQuery and Looker.
Key Responsibilities:
• Building new processes to map Data to billable events and KPIs, working with Data engineers to ensure this.
• Data Mapping … managing the definition and Business logic for KPIs for the company, across Product team, Operations team, Commercial and Customer Success teams, using DBT and LookML.
• Will be in charge of dashboarding and will be the owner of Looker, the company's BI tool.
• Billing infrastructure & System, ensuring all clients …
at the Board/Executive level and at the business unit level.
Key Responsibilities
• Design, develop, and maintain scalable ETL pipelines using technologies like dbt, Airbyte, Cube, DuckDB, Redshift, and Superset
• Work closely with stakeholders across the company to gather data requirements and set up dashboards
• Promote a data-driven culture … scalability
Must Haves
• 8+ years working in data engineering
• Proven experience in building and maintaining ETL pipelines and data infrastructure
• Strong experience working with dbt core/cloud
• Business savvy and capable of interfacing with finance, revenue and ops leaders to build our business intelligence
• Expertise in data governance, best …
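To ground the DuckDB part of that stack, here is a small ELT-style sketch: raw CSV exports are staged into a local DuckDB file and rolled up into a daily metric table. File, table and column names are invented for illustration.

```python
"""Small ELT-style sketch with DuckDB (illustrative names)."""
import duckdb

con = duckdb.connect("analytics.duckdb")  # local analytical database file

# Load raw event exports (hypothetical CSVs) into a staging table.
con.execute(
    """
    CREATE OR REPLACE TABLE stg_events AS
    SELECT * FROM read_csv_auto('raw/events_*.csv')
    """
)

# Build a simple daily metrics table from staging.
con.execute(
    """
    CREATE OR REPLACE TABLE daily_active_users AS
    SELECT CAST(event_ts AS DATE) AS event_date,
           COUNT(DISTINCT user_id) AS active_users
    FROM stg_events
    GROUP BY 1
    ORDER BY 1
    """
)

print(con.execute("SELECT * FROM daily_active_users LIMIT 5").fetchall())
```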
London, South East England, United Kingdom (Hybrid / WFH Options)
Intellect Group
Role: Data Engineer. Location: London, UK (Hybrid). Salary: £40k–£55k (DOE). Are you passionate about building robust data infrastructure and enabling powerful analytics? Join a fast-paced, collaborative environment where you’ll help shape the future of data-driven decision …
About Airwallex
Airwallex is the only unified payments and financial platform for global businesses. Powered by our unique combination of proprietary infrastructure and software, we empower over 150,000 businesses worldwide - including Brex, Rippling, Navan, Qantas, SHEIN and many more …
We're looking for a Lead Data Engineer to join SN Data - Data Competence Centre within Springer Nature Operations. Springer Nature is a leading publisher of scientific books, journals and magazines with over 3000 journal titles and one of the …
London, South East England, United Kingdom (Hybrid / WFH Options)
Omaze UK
next level. As Senior Analytics Engineer, you will have sole ownership of analytics engineering at Omaze. You will use industry standard tools and platforms (dbt, Snowflake, ThoughtSpot) to amplify the effectiveness and impact of our (growing) analytics team. You will provide clean, tested, well-documented models, and work with our … data engineer to give the team access to new data sources.
🔧 What You’ll Be Doing
• Fully own our dbt project, building and maintaining data models in our Snowflake data warehouse, blending and modelling data from multiple sources
• Work with analysts and engineers to collaboratively design and build new data … experience at a high growth company, ideally with hands-on experience of product/BI analytics
• A mastery of Postgres SQL
• Extensive knowledge of dbt, and experience using its non-standard functionality that can elevate the performance and efficacy of dbt projects
• Excellent communication skills, and experience working with cross …
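As a sketch of the kind of warehouse-level sanity checking that sits alongside a dbt project in Snowflake, the script below reconciles row counts between (hypothetical) source tables and the models built from them. Credentials handling, warehouse/database names and the 1% tolerance are all assumptions.

```python
"""Illustrative post-build reconciliation check against Snowflake."""
import os

import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",  # hypothetical warehouse
    database="ANALYTICS",      # hypothetical database
)

CHECKS = {
    # model -> (hypothetical) source table it should reconcile against
    "MARTS.FCT_ORDERS": "RAW.SHOP.ORDERS",
    "MARTS.DIM_CUSTOMERS": "RAW.SHOP.CUSTOMERS",
}

cur = conn.cursor()
try:
    for model, source in CHECKS.items():
        model_rows = int(cur.execute(f"SELECT COUNT(*) FROM {model}").fetchone()[0])
        source_rows = int(cur.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0])
        status = "OK" if model_rows >= source_rows * 0.99 else "INVESTIGATE"
        print(f"{model}: {model_rows} rows vs {source} {source_rows} rows -> {status}")
finally:
    cur.close()
    conn.close()
```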
London, England, United Kingdom Hybrid / WFH Options
Harnham
pipelines for summarising customer conversations and surfacing insights in CRM
• Building and deploying production-ready Python code
• Developing ETL workflows and model pipelines using DBT and Airflow
• Supporting internal reps with smart tools directly embedded in their workflows
Required Experience:
• Strong hands-on Python (production code, not just prototyping)
• Experience … with DBT, Airflow, and general ETL/data engineering
• Exposure to RAG systems, NLP, and GenAI tooling
• Ability to build ML solutions that scale across large user bases (100k+)
• Demonstrated commercial impact (e.g., reducing churn, increasing MRR)
• Experience in fast-paced SaaS/GTM environments
Contract Details: Length: 12 months … Rate: £500-£800/day (depending on experience). Location: Fully remote. Start Date: ASAP. If interested, please send your CV.
Desired Skills and Experience: DBT, Apache Airflow, CI/CD, RAG (Retrieval-Augmented Generation), ETL Pipelines, SQL
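To illustrate the retrieval half of the RAG work described above, here is a toy example that ranks customer-conversation snippets against a query. It uses TF-IDF purely as a stand-in for an embedding model and vector store; the conversations and query are invented.

```python
"""Toy retrieval step for a RAG-style pipeline over customer conversations."""
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

conversations = [
    "Customer asked about upgrading to the annual plan before renewal.",
    "User reported the export to CSV feature failing on large workspaces.",
    "Prospect wanted a comparison of pricing tiers for a 200-seat rollout.",
]

query = "Which customers are discussing pricing or upgrades?"

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(conversations)
query_vec = vectorizer.transform([query])

# Rank conversations by similarity to the query; in production these scores
# would come from a vector store over LLM embeddings instead.
scores = cosine_similarity(query_vec, doc_matrix)[0]
for score, text in sorted(zip(scores, conversations), reverse=True):
    print(f"{score:.2f}  {text}")
```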
The role leverages a modern tech stack including SQL, Python, Airflow, Kubernetes, and various other cutting-edge technologies. You'll work with tools like dbt on Databricks, PySpark, Streamlit, and Django, ensuring robust data infrastructure that powers business-critical operations. What makes this role particularly exciting is the combination of … members
Requirements For Data Engineer
• Strong aptitude with SQL, Python and Airflow
• Experience in Kubernetes, Docker, Django, Spark and related monitoring tools
• Experience with dbt for pipeline modelling
• Ability to shape needs into requirements and design scalable solutions
• Quick understanding of new domain areas and data visualization
• Team player with …
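For the PySpark "heavy lifting" side of this stack, a minimal aggregation job might look like the sketch below. The input path, column names and output location are assumptions; on Databricks the write step would usually target a Delta table.

```python
"""Minimal PySpark aggregation sketch (illustrative paths and columns)."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_rollup").getOrCreate()

# Hypothetical raw orders dataset landed as parquet.
orders = spark.read.parquet("/mnt/raw/orders/")

daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "country")
    .agg(
        F.countDistinct("customer_id").alias("customers"),
        F.sum("order_value").alias("revenue"),
    )
)

# Write the curated rollup back out for downstream dbt models / dashboards.
daily.write.mode("overwrite").parquet("/mnt/curated/daily_orders/")
spark.stop()
```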
to define success metrics, measure impact, and learn from outcomes
• High-level understanding of a modern data team's stack - we use Fivetran, Snowflake, dbt, Mixpanel and Omni
• An interest in infrastructure and observability - especially in complex, event-driven systems (we run on AWS)
• Familiarity with MongoDB or experience working …
…/CD we automate deployment onto Serverless architecture. Our platform data is stored in MongoDB and we have extensive analytics tooling using Fivetran, Snowflake, DBT, Mixpanel and Omni to enable data-driven decisions across the business. We have robust monitoring, logging and reporting using AWS Cloudwatch and Sentry, and collaborate …
will you be doing?
• Designing cloud-based data platform infrastructure with AWS, Azure or GCP experience
• Working with modern data platform tools such as DBT, Airflow, Snowflake
• Work with CI/CD Azure Pipelines to automate and deploy
• Monitor & look after the data – make sure it’s stable, secure, reliable … and performant
Tech Stack
• Experience with Terraform
• Cloud Database knowledge (Snowflake, Redshift etc)
• Docker or any Linux tooling
• Experience with modern data tooling platforms, DBT, Airflow etc
Benefits Package …
Platform Delivery Manager - Technology Delivery - London
(Key skills: Platform Delivery Manager, Program Management, Technical Delivery, Cloud Infrastructure, Data Platforms, DBT, Power BI, DevOps, Front End, Agile, Stakeholder Management, Governance, Vendor Management, Platform Delivery)
Our client is a global digital solutions firm known for driving innovation across enterprise platforms in financial … to-end technical delivery, with a strong understanding of front-end frameworks (React, Angular), cloud infrastructure (AWS, Azure, GCP), and data engineering platforms including DBT and Power BI. Experience managing hybrid teams and vendor engagements is critical, as is a deep understanding of Agile delivery, governance, and stakeholder communication at …