meaningful outcomes. Technical expertise: Advanced Python skills for data engineering tasks. Experience building and maintaining web scraping pipelines. Strong SQL skills, with expertise in performance tuning. Strong proficiency with dbt for data transformations. Hands-on experience with Apache Airflow or Prefect. Proficiency with GitHub, GitHub Actions, and CI/CD pipelines. Nice to have: Experience with GCP (BigQuery, Dataflow, Composer, …)
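The web-scraping pipeline work mentioned above can be sketched with only the Python standard library; everything here (the HTML snippet, the `price` class, the values) is a hypothetical illustration, not part of the original listing:

```python
from html.parser import HTMLParser

# A minimal scrape-and-extract step over a static HTML snippet.
# A real pipeline would fetch pages over HTTP and load rows into a warehouse.
class PriceParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs
        if tag == "span" and ("class", "price") in attrs:
            self.in_price = True

    def handle_data(self, data):
        if self.in_price:
            self.prices.append(float(data.strip().lstrip("£")))
            self.in_price = False

html = '<div><span class="price">£12.50</span><span class="price">£3.99</span></div>'
parser = PriceParser()
parser.feed(html)
print(parser.prices)  # → [12.5, 3.99]
```

`html.parser` stands in here for heavier tooling (BeautifulSoup, Scrapy); the pattern of parse, normalise, emit rows is the same either way.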
to know it today but to learn from the team as it makes up a large part of the stack). Good knowledge of Data Engineering tooling such as dbt or Spark. CDC tools like Debezium are a bonus. Build data systems with a software and infrastructure engineer mindset, including tested, scalable, resilient, fault-tolerant, observable and "as code" practices. …
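The "tested, resilient, fault-tolerant" practices this listing asks for can be illustrated with a minimal retry wrapper; this is a plain-Python sketch, and the function names, attempt counts, and delays are assumptions rather than anything from the original ad:

```python
import time
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def with_retries(fn, attempts=3, base_delay=0.01):
    """Wrap fn so transient failures are retried with exponential backoff."""
    def wrapper(*args, **kwargs):
        for attempt in range(1, attempts + 1):
            try:
                return fn(*args, **kwargs)
            except Exception as exc:
                log.warning("attempt %d/%d failed: %s", attempt, attempts, exc)
                if attempt == attempts:
                    raise  # out of retries: surface the failure
                time.sleep(base_delay * 2 ** (attempt - 1))
    return wrapper

# A flaky step that succeeds on the third call, simulating a transient fault.
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "loaded"

resilient_load = with_retries(flaky_load)
print(resilient_load())  # → loaded
```

The same idea is usually delegated to the orchestrator in practice (Airflow and Prefect both expose retry and backoff settings per task), with logging feeding the observability side.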
and scientists working across Africa, Europe, the UK and the US. ABOUT THE ROLE Sand Technologies focuses on cutting-edge cloud-based data projects, leveraging tools such as Databricks, DBT, Docker, Python, SQL, and PySpark to name a few. We work across a variety of data architectures such as Data Mesh, lakehouse, data vault and data warehouses. Our data engineers …
Python: schema design, transformations, query optimisation, automation, testing. Track record of building ETL/ELT pipelines into modern warehouses (BigQuery, Snowflake, Redshift). Familiar with tools like Dagster, Airflow, Prefect, dbt, Dataform, SQLMesh. Cloud experience (we're on GCP) + containerisation (Docker, Kubernetes). Strong sense of ownership over data standards, security, and roadmap. A collaborator at heart - working with analysts …
modelling techniques (star schema, data vault, dimensional modelling). Proficiency in Excel-based data workflows for various Agile Retail projects. Hands-on experience with data pipeline orchestration tools (Airflow, dbt, Prefect, or similar). Benefits: Unlimited holiday Annual Wellbeing Allowance Flexible work culture Monthly socials and events Complimentary snack bar Employer pension contribution If you're a data enthusiast ready …
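Dimensional modelling of the kind listed above (star schema) amounts to splitting flat records into dimension and fact tables joined by surrogate keys; the order data and key scheme below are invented purely for illustration:

```python
# Hypothetical flat order records, as they might land from an extract.
orders = [
    {"order_id": 1, "customer": "Acme", "country": "UK", "amount": 120.0},
    {"order_id": 2, "customer": "Acme", "country": "UK", "amount": 80.0},
    {"order_id": 3, "customer": "Birch", "country": "DE", "amount": 200.0},
]

# Dimension: one row per distinct customer, keyed by a surrogate integer.
dim_customer = {}
for row in orders:
    key = (row["customer"], row["country"])
    if key not in dim_customer:
        dim_customer[key] = {"customer_key": len(dim_customer) + 1,
                             "customer": row["customer"],
                             "country": row["country"]}

# Fact: one row per order, referencing the dimension by surrogate key only.
fact_orders = [
    {"order_id": row["order_id"],
     "customer_key": dim_customer[(row["customer"], row["country"])]["customer_key"],
     "amount": row["amount"]}
    for row in orders
]

print(len(dim_customer), len(fact_orders))  # → 2 3
```

In a warehouse this split is what lets the fact table stay narrow while descriptive attributes live once in the dimension; data vault modelling pursues the same separation with hubs, links, and satellites.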
London, South East, England, United Kingdom Hybrid / WFH Options
Office Angels
engineering disciplines : Cloud Engineering Data Engineering (not building pipelines but designing and building the framework) DevOps MLOps/LLMOps Often work with the following technologies : Azure, AWS, GCP Airflow, dbt, Databricks, Snowflake, etc. GitHub, Azure DevOps and related developer tooling and CI/CD platforms, Terraform or other Infra-as-Code MLflow, AzureML or similar for MLOps; LangSmith, Langfuse and …
Stack Cloud: Azure, sometimes GCP & AWS Data Platform: Databricks, Snowflake, BigQuery Data Engineering tools: Pyspark, Polars, DuckDB, Malloy, SQL Infrastructure-as-code: Terraform, Pulumi Data Management and Orchestration: Airflow, dbt Databases and Data Warehouses: SQL Server, PostgreSQL, MongoDB, Qdrant, Pinecone GenAI: OpenAI APIs, HuggingFace, LangChain, Talk-to-data Monitoring: Datadog About You We are looking for someone who can wear …
use LLM frameworks : LangChain, LlamaIndex (or similar) Cloud & Dev : Azure/AWS/GCP, Docker, REST APIs, GitHub Actions/CI Data & MLOps : BigQuery/Snowflake, MLflow/DVC, dbt/Airflow (nice to have) Front ends (for internal tools) : Streamlit/Gradio/basic React Must-have experience 7+ years in Data Science/ML, including hands-on delivery …
team members to deliver outcomes. Highly motivated, with strong problem-solving skills and business acumen. Desirable Skills: Experience with data visualization tools (e.g., Looker, Tableau, Power BI). Familiarity with dbt, Airflow, or other modern data tools. Exposure to marketing or e-commerce industries. Additional Information: Publicis Groupe operates a hybrid working pattern with full-time employees being office-based three …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
in data engineering, warehousing, or platform leadership. Proven track record of delivering large-scale data warehouse migrations (BigQuery experience strongly preferred). Hands-on expertise with SQL, Python, Airflow, DBT/Dataform, Terraform , and modern data architecture. Strong leadership and stakeholder management skills. Experience driving complex data projects in agile, cross-functional teams. Nice-to-haves: Background in insurance or …
Tech Stack Cloud Data Warehouse - Snowflake AWS Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda Data Governance & Quality - Collate & Monte Carlo Infrastructure as Code - Terraform Data Integration & Transformation - Python, DBT, Fivetran, Airflow CI/CD - Github Actions/Jenkins Business Intelligence - Looker Experience and Attributes we'd like to see Platform Engineering Expertise Extensive experience in platform engineering; designing, building …
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
and in event-driven and eventual-consistency systems using Kafka, .Net, Java, REST APIs, AWS, Terraform and DevOps. Nice to have: experience in data pipelines and modelling using SQL, DBT, ETL, Data Warehousing, Redshift and Python, and an ecommerce and mobile applications background. Additional Information: We're a community here that cares as much about your life outside work as how you …
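Eventual-consistency systems like those described rely on idempotent event handling, because at-least-once delivery means a message can be replayed. This is a language-agnostic sketch in Python; the event shapes are hypothetical, and a real system would pair Kafka consumer offsets with a durable processed-id store rather than in-memory structures:

```python
# Events may be delivered more than once (at-least-once semantics),
# so the handler tracks processed ids and applies each event exactly once.
events = [
    {"id": "e1", "account": "A", "delta": 50},
    {"id": "e2", "account": "A", "delta": -20},
    {"id": "e1", "account": "A", "delta": 50},  # duplicate redelivery
]

balances = {}
processed = set()

for event in events:
    if event["id"] in processed:
        continue  # already applied; skip the duplicate safely
    processed.add(event["id"])
    balances[event["account"]] = balances.get(event["account"], 0) + event["delta"]

print(balances)  # → {'A': 30}
```

Without the dedupe step the replayed `e1` would be applied twice and the balance would drift to 80; idempotency is what makes the system converge to the correct state despite redelivery.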
experience in data leadership – ideally within fintech, SaaS, or regulated tech environments. Technical depth across data engineering, analytics, or data science. Hands-on familiarity with modern data stacks – SQL, dbt, Airflow, Snowflake, Looker/Power BI. Understanding of the AI/ML lifecycle – including tooling (Python, MLflow) and best-practice MLOps. Comfortable working across finance, risk, and commercial functions. Experience …
hands-on technically/40% hands-off leadership and strategy Proven experience designing scalable data architectures and pipelines Strong Python, SQL, and experience with tools such as Airflow, dbt, and Spark Cloud expertise (AWS preferred), with Docker/Terraform A track record of delivering in fast-paced, scale-up environments Nice to have: Experience with streaming pipelines, MLOps, or modern …
Employment Type: Full-Time
Salary: £110,000 - £120,000 per annum, Inc benefits
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
roles, ideally within consumer tech, logistics, or e-commerce. Strong proficiency in Python, SQL, and machine learning frameworks. Experience with cloud platforms (Azure, AWS, GCP) and tools like Databricks, DBT, Airflow, or Terraform. Familiarity with AI/ML applications and modern analytics tooling. Excellent communication skills and ability to work independently in a fast-paced environment. Why Join? Be part …
solutions. You'll work closely with senior engineers and stakeholders, supporting data pipeline development and helping improve data accessibility. Key Responsibilities Assist in building and maintaining data pipelines using dbt and Snowflake, ensuring they are scalable and efficient. Collaborate with analytics engineers and stakeholders across Product, Marketing, and Analytics teams to support data needs. Contribute to BI tool adoption and …
direct experience - we want to hire people to grow into the role and beyond. About the team: Python is our bread and butter. The wider data platform team uses dbt, Snowflake, and Looker to model, transform, and expose data for analytics and reporting across the business. We use Docker and Kubernetes to manage our production services. We use Github Actions …
enterprise b2c who you will have heard of/used, who are seeking 2x Data Engineers to join them asap on an initial 6 months contract. Python, SQL, AWS, dbt, Airflow (spark/scala a bonus) 6 months Outside IR35 (ltd company) £500-600 per day Asap start
West London, London, United Kingdom Hybrid / WFH Options
Young's Employment Services Ltd
Tech Lead, Data Engineering Manager etc. Proven success with modern data infrastructure: distributed systems, batch and streaming pipelines Hands-on knowledge of tools such as Apache Spark, Kafka, Databricks, DBT or similar Experience building, defining, and owning data models, data lakes, and data warehouses Programming proficiency in Python, Pyspark, Scala or Java. Experience operating in a cloud-native environment (e.g. …)
platforms (e.g. Databricks, Azure, AWS or GCP native stacks). Experience with platform observability and CI/CD for data platforms. Hands-on experience with modern data engineering tools such as dbt, Fivetran, Matillion or Airflow. History of supporting pre-sales activities in a product or consultancy-based business. What Kubrick offers: We are a fast-moving and fast-growth business which is doing …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
engineering teams to set up a data warehouse using modern data modelling techniques, focusing on Kimball methodology and a Medallion architecture (bronze, silver, gold layers). Develop and maintain DBT projects and configure incremental loads with built-in unit testing. Support data pipeline orchestration with Airflow and work with AWS cloud tools. Help deliver a production-ready Data Mart with …
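The incremental loads with built-in unit testing mentioned above typically follow a high-water-mark pattern: only source rows newer than the latest timestamp already in the target are appended. A short Python sketch can illustrate the idea (dbt expresses the same thing declaratively with `is_incremental()`; the table shapes and dates below are assumptions for illustration):

```python
# Bronze: raw rows as landed. Silver already holds rows up to a watermark.
bronze = [
    {"id": 1, "updated_at": "2024-01-01", "value": 10},
    {"id": 2, "updated_at": "2024-01-02", "value": 20},
    {"id": 3, "updated_at": "2024-01-03", "value": 30},
]
silver = [r for r in bronze if r["updated_at"] <= "2024-01-02"]

def incremental_load(target, source):
    """Append only source rows newer than the target's high-water mark."""
    watermark = max((r["updated_at"] for r in target), default="")
    new_rows = [r for r in source if r["updated_at"] > watermark]
    target.extend(new_rows)
    return len(new_rows)

loaded = incremental_load(silver, bronze)
assert loaded == 1  # built-in unit test: only the 2024-01-03 row is new
print(len(silver))  # → 3
```

Rerunning `incremental_load(silver, bronze)` loads zero rows, which is the idempotency property that makes incremental models safe to re-execute; the gold layer would then aggregate silver into the Kimball-style marts the listing describes.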