making. Work with stakeholders to understand data needs and provide insights through presentations and reports. Deliver data-driven recommendations to support business objectives. Build and optimize data pipelines using dbt, ensuring clean and accessible data. Monitor data quality and implement validation processes in collaboration with data engineers. Create scalable data models in Snowflake using dbt and identify opportunities for efficiency … Information Systems, Finance, or a related field. Proven experience in a Data Analyst/Analytics Engineer role, preferably in the payments industry with issuer processors. Proven experience in SQL, dbt, and Snowflake. Proficiency in building and managing data transformations with dbt, with experience in optimizing complex transformations and documentation. Hands-on experience with Snowflake as a primary data warehouse, including More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
You'll play a key role in scaling analytics infrastructure, optimizing pipelines, and mentoring fellow engineers. Key Responsibilities: Build and optimize data models across bronze to gold layers using dbt and Kimball methodology Own and manage the semantic layer for BI tools like Looker and Power BI Implement rigorous data quality and testing frameworks Drive CI/CD practices with … like GitHub Actions and Terraform Lead technical decisions and mentor junior engineers Collaborate across engineering, data science, and product teams to deliver business impact Skills & Experience: Expert in SQL , dbt , and cloud data warehouses (e.g., BigQuery, Redshift) Strong experience with Airflow , Python , and multi-cloud environments (AWS/GCP) Proven background in designing and scaling analytics solutions in agile environments More ❯
of data from any source — whether databases, applications, or files — into lakehouses like Snowflake, Databricks, and Redshift. With pipelines that just work and features like advanced data transformation using dbt Core and end-to-end pipeline observability, we’re focused on making robust data pipelines accessible to everyone. London (in person) At Etleap, we’re redefining how data teams build … We are looking to add senior engineers to our core engineering team More ❯
West London, London, United Kingdom Hybrid / WFH Options
McGregor Boyall Associates Limited
Technical Requirements: Advanced proficiency in Python and modern software engineering practices. Experience architecting solutions using major cloud platforms (Azure, AWS, GCP). Familiarity with technologies such as Databricks, Airflow, dbt, Snowflake, GitHub CI/CD, and infrastructure-as-code. Strong background in several of the following areas: Cloud Engineering, Data Platform Architecture, DevOps, MLOps/LLMOps. Ideal Profile More ❯
you will be doing Data Modelling & Engineering: Gather requirements from stakeholders and design scalable data solutions Build and maintain robust data models and exposures in the data warehouse using dbt, Snowflake, and Looker Document architectural decisions, modelling challenges, and outcomes in a clear and structured way Wrangle and integrate data from multiple third-party sources (e.g., Salesforce, Amplitude, Segment, Google … or in a similar role You will have proven experience in designing and maintaining data models for warehousing and business intelligence You will have advanced SQL skills; experience with dbt and/or Looker is strongly preferred You will be proficient with modern data platforms (e.g., Snowflake, dbt, AWS, GCP, Looker, Tableau, Airflow) You will have experience with version control More ❯
experience in data collection, preprocessing, and integration from various sources, ensuring accuracy, consistency, and handling missing values or outliers. Proficient in designing and implementing ELT pipelines using tools like dbt, with strong knowledge of data warehousing, data lake concepts, and data pipeline optimization. Skilled in SQL for data manipulation, analysis, query optimisation, and database design. Artificial Intelligence and Machine Learning More ❯
ready datasets. Promote CI/CD, DevOps, and data reliability engineering (DRE) best practices across Databricks environments. Integrate with cloud-native services and orchestrate workflows using tools such as dbt, Airflow, and Databricks Workflows. Drive performance tuning, cost optimisation, and monitoring across data workloads. Mentor engineering teams and support architectural decisions as a recognised Databricks expert. Essential Skills & Experience: Demonstrable … security, and compliance, including Unity Catalog. Excellent communication, leadership, and problem-solving skills. Desirable: Databricks certifications (e.g., Data Engineer Associate/Professional or Solutions Architect). Familiarity with MLflow, dbt, and BI tools such as Power BI or Tableau. Exposure to MLOps practices and deploying ML models within Databricks. Experience working within Agile and DevOps-driven delivery environments. More ❯
pragmatic approach to how we apply new technologies. Our Tech Stack TypeScript (Full-stack) React + Next.js, Tailwind, Prisma, tRPC PostgreSQL, MongoDB, Redis Serverless, AWS, Google Cloud, Github Actions DBT, BigQuery Terraform Python Requirements You're opinionated and want to help us change the legal system for the better You have a track record of delivering exceptional work and can More ❯
apply new technologies. Our Tech Stack Must have: TypeScript (Full-stack), React (+ Next.js, Tailwind, Prisma, tRPC) Nice to have: PostgreSQL, MongoDB, Redis Serverless, AWS, Google Cloud, Github Actions DBT, BigQuery Terraform Python Requirements You're opinionated and want to help us change the legal system for the better You have a track record of delivering exceptional work and can More ❯
DATA ENGINEER - DBT/AIRFLOW/DATABRICKS 4-MONTH CONTRACT £450-550 PER DAY OUTSIDE IR35 This is an exciting opportunity for a Data Engineer to join a leading media organisation working at the forefront of data innovation. You'll play a key role in designing and building the data infrastructure that supports cutting-edge machine learning and LLM initiatives. … contractor to accelerate delivery of critical pipelines and platform improvements. THE ROLE You'll join a skilled data team to lead the build and optimisation of scalable pipelines using DBT, Airflow, and Databricks. Working alongside data scientists and ML engineers, you'll support everything from raw ingestion to curated layers powering LLMs and advanced analytics. Your responsibilities will include: Building and … maintaining production-grade ETL/ELT workflows with DBT and Airflow Collaborating with AI/ML teams to support data readiness for experimentation and inference Writing clean, modular SQL and Python code for use in Databricks Contributing to architectural decisions around pipeline scalability and performance Supporting the integration of diverse data sources into the platform Ensuring data quality, observability, and More ❯
Walsall, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Adecco
unlock their full potential. Liaise with stakeholders to shape clear, actionable data requirements - even when things are complex or incomplete. What You'll Bring Strong hands-on expertise in DBT, SQL, Snowflake, and orchestration tools like Airflow. Experience with Azure Data Factory and visualisation tools such as Power BI. Deep knowledge of agile data development methodologies and best practices. A More ❯
Requirements: 5+ years of experience working with production systems or data warehouses. Proficiency in Python, Snowflake/SQL, and experience with orchestration tools like Argo or Airflow. Knowledge of DBT is preferred. Experience with Retool is a plus. Get to know DevsData : We are a technology consulting company and a recruitment agency, delivering software solutions to clients from Europe and More ❯
in a modern cloud environment. You will play a key role in building and optimizing our Medallion architecture (Bronze, Silver, Gold layers), working with modern tools such as Databricks, dbt, Azure Data Factory, and Python/SQL to support critical business analytics and AI/ML initiatives. Key Responsibilities ETL Development: Design and build robust and reusable ETL/ELT … pipelines through the Medallion architecture in Databricks. Data Transformation: Create and manage data models and transformations using dbt, ensuring clear lineage, version control, and modularity. Pipeline Orchestration: Develop and manage workflow orchestration using Azure Data Factory, including setting up triggers, pipelines, and integration runtimes. System Maintenance: Monitor, maintain, and optimize existing data pipelines, including cron job scheduling and batch … privacy regulations. Required Skills and Experience 5+ years of experience in data engineering or similar roles. Strong experience with Databricks, including notebooks, cluster configuration, and Delta Lake. Proficiency in dbt for transformation logic and version-controlled data modeling. Deep knowledge of Azure Data Factory, including pipeline orchestration and integration with other Azure services. Experience with data integration (e.g., APIs, JSON More ❯
are relied upon to ensure our systems are trusted, reliable and available. The technology underpinning these capabilities includes industry leading data and analytics products such as Snowflake, Tableau, DBT, Talend, Collibra, Kafka/Confluent, Astronomer/Airflow, and Kubernetes. This forms part of a longer-term strategic direction to implement Data Mesh, and with it establish shared platform … are achievable, and driving a culture of iterative improvement. Modern data stack - hands-on deployment and governance of enterprise technologies at scale (e.g. Snowflake, Tableau, DBT, Fivetran, Airflow, AWS, GitHub, Terraform, etc.) for self-service workloads. Thought leadership and influencing - deep interest in data platforms landscape to build well-articulated proposals that are supported by More ❯
to inform strategic decisions both at the Board/Executive level and at the business unit level. Key Responsibilities Design, develop, and maintain scalable ETL pipelines using technologies like dbt, Airbyte, Cube, DuckDB, Redshift, and Superset Work closely with stakeholders across the company to gather data requirements and set up dashboards Promote a data-driven culture at Notabene and train, upskill … working in data engineering with large sets of data. Ex: millions of students, transactions. Proven experience in building and maintaining ETL pipelines and data infrastructure Strong experience working with dbt core/cloud Business-savvy and capable of interfacing with finance, revenue and ops leaders to build our business intelligence Expertise in data governance, best practices, and data security, especially More ❯
in data engineering roles, preferably for a customer-facing data product Expertise in designing and implementing large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar Strong programming skills in languages such as SQL, Python, Go or Scala Demonstrable understanding and effective use of AI tooling in your development More ❯
ensuring quality, cataloging, and discoverability. Enforce data compliance standards, including PII masking, data obfuscation, and RBAC. Oversee the design, development, deployment, and maintenance of data products, including dashboards, reports, dbt models, and machine learning models. Ensure the quality, accuracy, and reliability of data analysis and modelling. Collaborate with the Software Engineering team to build and maintain a robust data infrastructure. More ❯
Birmingham, Staffordshire, United Kingdom Hybrid / WFH Options
Kerv Digital for Digital Transformation
Azure ecosystem. Required Experience: • Creating mapping specifications between legacy and CRM, ERP, Finance applications • Integration to D365, Dataverse solutions or other SaaS applications • Data Quality Frameworks: Soda.io, Great Expectations, DBT • Creation of fault-tolerant data ingestion pipelines in SSIS/KWS/Data Factory/data flows using Linked Services, Integration Datasets • Extracting data from a variety of sources including More ❯
Longbridge, City and Borough of Birmingham, West Midlands (County), United Kingdom Hybrid / WFH Options
Kerv Digital
Azure ecosystem. Required Experience: Creating mapping specifications between legacy and CRM, ERP, Finance applications Integration to D365, Dataverse solutions or other SaaS applications Data Quality Frameworks: Soda.io, Great Expectations, DBT Creation of fault-tolerant data ingestion pipelines in SSIS/KWS/Data Factory/data flows using Linked Services, Integration Datasets Extracting data from a variety of sources including More ❯
into high-quality shared libraries. Review Python code changes and assist with design decisions across dev teams when appropriate. Transition legacy data pipelines into new target architecture using Snowflake, DBT, and data governance tools. Process and handle large volumes of data efficiently. Optimize performance for expensive processes. Communicate complex technical concepts effectively to technical and non-technical stakeholders. What We More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
strategy within ambitious software businesses. Specifically, you can expect to be involved in the following: Designing and developing full-stack data pipelines and platforms using modern tools such as dbt, Airflow, and cloud infrastructure Cleansing, enriching and modelling data to generate commercial insights and power C-level dashboards Delivering scalable solutions that support internal use cases and extend directly to … finance, sales) and building tools that serve business needs Background in startups or scale-ups with high adaptability and a hands-on approach Experience with modern data tools (e.g. dbt, Airflow, CI/CD) and at least one cloud platform (AWS, GCP, Azure) Strong communication skills and a track record of credibility in high-pressure or client-facing settings BENEFITS More ❯
Employment Type: Full-Time
Salary: £100,000 - £110,000 per annum, Inc benefits
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
S3) Deep experience with Snowflake as a data warehouse. Proficiency in Python or Scala for data processing Excellent communication and stakeholder management skills Preferably some experience with Terraform and dbt (although these are not essential) Benefits: Competitive salary Performance bonus scheme 100% remote working (UK only) 26 days holiday + Bank holidays + Birthday off Opportunity for growth and development More ❯
Liverpool, Merseyside, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
S3) Deep experience with Snowflake as a data warehouse. Proficiency in Python or Scala for data processing Excellent communication and stakeholder management skills Preferably some experience with Terraform and dbt (although these are not essential) Benefits: Competitive salary Performance bonus scheme 100% remote working (UK only) 26 days holiday + Bank holidays + Birthday off Opportunity for growth and development More ❯