of the AWS services like Redshift, Lambda, S3, Step Functions, Batch, CloudFormation, Lake Formation, CodeBuild, CI/CD, GitHub, IAM, SQS, SNS, Aurora DB • Good experience with DBT, Apache Iceberg, Docker, Microsoft BI stack (nice to have) • Experience in data warehouse design (Kimball and lakehouse, medallion and data vault) is a definite preference as is knowledge of …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
practices around CI/CD, infrastructure-as-code, and modern data tooling Introduce and advocate for scalable, efficient data processes and platform enhancements Tech Environment: Python, SQL, Spark, Airflow, dbt, Snowflake, Postgres, AWS (S3), Docker, Terraform Exposure to Apache Iceberg, streaming tools (Kafka, Kinesis), and ML pipelines is a bonus What We're Looking For: 5+ years in Data Engineering …
time and analytical data pipelines, metadata, and cataloguing (e.g., Atlan) Strong communication, stakeholder management, and documentation skills Preferred (but not essential): AWS or Snowflake certifications Knowledge of Apache Airflow, DBT, GitHub Actions Experience with Iceberg tables and data product thinking Why Apply? Work on high-impact, high-scale client projects Join a technically elite team with a strong culture of …
London, South East, England, United Kingdom Hybrid / WFH Options
Boston Hale
Key Responsibilities: Design and maintain scalable data pipelines across diverse sources including CMS, analytics, ad tech, and social platforms. Lead engineering efforts to automate workflows using tools like Airflow, dbt, and Spark. Build robust data models to support dashboards, A/B testing, and revenue analytics. Collaborate with cross-functional teams to deliver actionable insights and support strategic initiatives. Drive …
Greater Leeds Area, United Kingdom Hybrid / WFH Options
Corecom Consulting
Iceberg. Experience with data warehouse design, ETL/ELT, and CI/CD pipelines (GitHub, CodeBuild). Knowledge of infrastructure as code, performance tuning, and data migration. Exposure to DBT, Docker, or Microsoft BI stack is a plus. What’s on offer: Opportunity to work on large-scale, high-impact data projects. A collaborative, supportive environment with scope to grow …
London, South East, England, United Kingdom Hybrid / WFH Options
Boston Hale
for a Data Engineer to join their London-based team. Key Responsibilities: Design and maintain scalable data pipelines across diverse sources. Automate and optimise workflows using tools like Airflow, dbt, and Spark. Support data modelling for analytics, dashboards, and A/B testing. Collaborate with cross-functional teams to deliver data-driven insights. Work with cloud platforms (GCP preferred) and …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
day a week in a central London office High-growth scale-up with a strong mission and serious funding Modern tech stack: Python, SQL, Snowflake, Apache Iceberg, AWS, Airflow, dbt, Spark Work cross-functionally with engineering, product, analytics, and data science leaders What You'll Be Doing Lead, mentor, and grow a high-impact team of data engineers Define and …
Leeds, West Yorkshire, England, United Kingdom Hybrid / WFH Options
Robert Walters
e.g., S3, Glue, Lambda, Redshift, Athena, EMR). Experience designing and building data lakes and modern data platforms. Proficiency with Python, SQL, and data pipeline orchestration tools (e.g., Airflow, dbt). Strong understanding of data modelling, ETL/ELT processes, and distributed systems. Knowledge of data security, governance, and compliance best practices. Excellent leadership, communication, and stakeholder engagement skills. What …
and data quality across the business Requirements: Strong experience in SQL, Python, Spark, and cloud infrastructure (AWS, Databricks preferred) Hands-on experience building production data pipelines, coding transformations (e.g., dbt), and implementing data quality controls Familiarity with version control (Git), CI/CD concepts, and BI tools (e.g., Tableau) Experience handling large, inconsistent, or non-standard datasets with attention to …
Blackburn, Lancashire, North West, United Kingdom Hybrid / WFH Options
Graham & Brown
migration experience Strong SQL and database management skills Experience with Microsoft Dynamics AX and NetSuite ERP is highly desirable Proficiency in ETL/ELT tools such as SSIS, Talend, dbt, or equivalent Programming skills in Python, R, or similar for data ingestion and automation Hands-on experience integrating data from digital platforms (BigQuery, Google Analytics, Meta Ads, social media …
of Python and SQL (experience with Snowflake highly desirable) Knowledge of BI tools such as Superset, Tableau, PowerBI or similar is desirable Knowledge of orchestration tools such as Airflow, DBT or Google Cloud Dataflow is a bonus Analytical and problem-solving skills, with a deep curiosity for fraud detection Excellent attention to detail to ensure quality of project delivery for …
data solutions Proficiency in Python, SQL, and data modelling tools (e.g. Erwin, Lucidchart) Strong knowledge of AWS services (Lambda, SNS, S3, EKS, API Gateway) Familiarity with Snowflake, Spark, Airflow, DBT, and data governance frameworks Preferred: Certifications in cloud/data technologies Experience with API/interface modelling and CI/CD (e.g. GitHub Actions) Knowledge of Atlan and Iceberg tables …
using Git. Proven background in data modelling and architecture design. Nice to have: Experience with Azure services (e.g., App registration, Storage Accounts). Familiarity with Terraform (especially Snowflake provider), DBT, Power Automate, and CI/CD pipelines (Azure DevOps or others). Experience working on greenfield builds. Asset management experience.
hands-on technically/40% hands-off leadership and strategy Proven experience designing scalable data architectures and pipelines Strong Python, SQL, and experience with tools such as Airflow, dbt, and Spark Cloud expertise (AWS preferred), with Docker/Terraform A track record of delivering in fast-paced, scale-up environments Nice to have: Experience with streaming pipelines, MLOps, or modern …
Employment Type: Full-Time
Salary: £110,000 - £120,000 per annum, Inc benefits
London, South East, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
Strong technical background (5+ years) in building scalable data platforms Excellent communication and stakeholder management skills Hands-on experience with modern data tools and technologies: Python, SQL, Snowflake, Airflow, dbt, Spark, AWS, Terraform A collaborative mindset and a passion for mentoring and developing others Comfortable balancing technical decisions with business needs Nice to have: experience with Data Mesh, real-time …
Longbridge, City and Borough of Birmingham, West Midlands (County), United Kingdom Hybrid / WFH Options
Kerv Digital
Azure ecosystem. Required Experience: Creating mapping specifications between legacy and CRM, ERP, Finance applications Integration to D365, Dataverse solutions or other SaaS applications Data Quality Frameworks: Soda.io, Great Expectations, DBT Creation of fault-tolerant data ingestion pipelines in SSIS/KWS/Data Factory/data flows using Linked Services, Integration Datasets Extracting data from a variety of sources including …
. Strong communication and stakeholder management skills. Preferred: Cloud certifications (AWS, etc.). Experience with API data modelling, CI/CD (GitHub Actions). Familiarity with tools like Airflow, DBT, and Atlan. Don't wait, apply now! Reference: DRI/AMC/DA Postcode: SW1 4DF #dari
less structured environments and propose scalable solutions Experience mentoring or managing juniors (or ready to step into that responsibility) Background in commodities, trading, finance, or shipping highly desirable Bonus: DBT experience, data governance knowledge, and exposure to analytics/data science workflows This is a hands-on role in a collaborative, face-to-face culture - perfect for someone who thrives …
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
with architects, analysts, and business stakeholders to unlock insights and enable innovation. What You'll Be Doing: Design and build robust, automated data pipelines using Azure Data Factory, Synapse, dbt, and Databricks. Integrate data from enterprise systems (e.g. Dynamics, iTrent, Unit4) into a unified data platform. Cleanse, transform, and model data to support BI tools (e.g., Power BI) and AI …