ready datasets. Promote CI/CD, DevOps, and data reliability engineering (DRE) best practices across Databricks environments. Integrate with cloud-native services and orchestrate workflows using tools such as dbt, Airflow, and Databricks Workflows. Drive performance tuning, cost optimisation, and monitoring across data workloads. Mentor engineering teams and support architectural decisions as a recognised Databricks expert. Essential Skills & Experience: Demonstrable … security, and compliance, including Unity Catalog. Excellent communication, leadership, and problem-solving skills. Desirable: Databricks certifications (e.g., Data Engineer Associate/Professional or Solutions Architect). Familiarity with MLflow, dbt, and BI tools such as Power BI or Tableau. Exposure to MLOps practices and deploying ML models within Databricks. Experience working within Agile and DevOps-driven delivery environments. More ❯
with petabyte-scale data sets and developing integration layer solutions in Databricks, Snowflake, or similar large platforms. Experience with cloud-based data warehousing and transformation tools like Delta Lake Tables, dbt, Fivetran, or Snowflake. Proficiency in machine learning and familiarity with open-source machine learning ecosystems. Proven experience as an Integration Engineer or similar role, with a strong understanding of integration More ❯
pragmatic approach to how we apply new technologies. Our Tech Stack: TypeScript (Full-stack), React + Next.js, Tailwind, Prisma, tRPC, PostgreSQL, MongoDB, Redis, Serverless, AWS, Google Cloud, GitHub Actions, dbt, BigQuery, Terraform, Python. Requirements: You're opinionated and want to help us change the legal system for the better. You have a track record of delivering exceptional work and can More ❯
apply new technologies. Our Tech Stack. Must have: TypeScript (Full-stack), React (+ Next.js, Tailwind, Prisma, tRPC). Nice to have: PostgreSQL, MongoDB, Redis, Serverless, AWS, Google Cloud, GitHub Actions, dbt, BigQuery, Terraform, Python. Requirements: You're opinionated and want to help us change the legal system for the better. You have a track record of delivering exceptional work and can More ❯
West Bend, Wisconsin, United States Hybrid / WFH Options
Delta Defense
security programs within cloud-first architectures integrated with other large third-party SaaS/ecommerce platforms (Snowflake, Redshift, Databricks). Experience working with tech stack/tools: Python, Snowflake, dbt, Fivetran, Kafka, Tableau, Git, Informatica, Kestra, Excel, and/or related technologies. Experience working with tech stack/tools: JS, PHP, PostgreSQL, Kubernetes, Kafka/Airflow, NeonDB, Cloudflare, GitLab, Doppler More ❯
compliance requirements. You have Experience with cloud data warehouses such as Redshift. Skills in modelling and querying data in SQL and Python. Experience with ETL/ELT tooling including dbt and Airflow. Experience with CI/CD and infrastructure-as-code within the AWS cloud. Also desirable: familiarity with AWS data tools such as EMR, MWAA, and MSK. You are Someone who More ❯
DATA ENGINEER - DBT/AIRFLOW/DATABRICKS 4-MONTH CONTRACT £450-550 PER DAY OUTSIDE IR35 This is an exciting opportunity for a Data Engineer to join a leading media organisation working at the forefront of data innovation. You'll play a key role in designing and building the data infrastructure that supports cutting-edge machine learning and LLM initiatives. … contractor to accelerate delivery of critical pipelines and platform improvements. THE ROLE You'll join a skilled data team to lead the build and optimisation of scalable pipelines using dbt, Airflow, and Databricks. Working alongside data scientists and ML engineers, you'll support everything from raw ingestion to curated layers powering LLMs and advanced analytics. Your responsibilities will include: Building and … maintaining production-grade ETL/ELT workflows with dbt and Airflow; collaborating with AI/ML teams to support data readiness for experimentation and inference; writing clean, modular SQL and Python code for use in Databricks; contributing to architectural decisions around pipeline scalability and performance; supporting the integration of diverse data sources into the platform; ensuring data quality, observability, and More ❯
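To make the day-to-day of the listing above concrete, here is a minimal sketch of a dbt-plus-Airflow pipeline of the kind it describes. All DAG names, paths, and the project layout are illustrative assumptions, not details from the listing.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical layout: an ingestion script plus a dbt project on the worker.
DBT_ARGS = "--project-dir /opt/dbt --profiles-dir /opt/dbt"

with DAG(
    dag_id="media_elt",                # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Land raw source data before any modelling runs.
    ingest = BashOperator(
        task_id="ingest_raw",
        bash_command="python /opt/pipelines/ingest.py",
    )
    # Build the curated layers, then test them so bad data fails loudly.
    dbt_run = BashOperator(task_id="dbt_run", bash_command=f"dbt run {DBT_ARGS}")
    dbt_test = BashOperator(task_id="dbt_test", bash_command=f"dbt test {DBT_ARGS}")

    ingest >> dbt_run >> dbt_test
```

Running dbt test as a downstream task is one common way to meet the "data quality, observability" requirement: a failed test halts the DAG rather than propagating bad rows into the curated layers.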
variety of data sources with, for example, Apache Camel, Apache Pulsar or Kafka, dlt, Python, or Airbyte. Analytics Engineering: model data warehouses, both batch and real-time, using, for example, ClickHouse and dbt or SQLMesh. Business Intelligence: build visuals that answer critical business questions in, for example, Metabase, Databricks, or Power BI. ML/AI: use modern tools to predict or use LLMs … available data, for example creating insights for a wholesale client with data warehousing using an Azure, AWS, GCP, or on-premise architecture including Apache Kafka/Pulsar, SQLMesh/dbt, ClickHouse/Databend, and Metabase/Superset. Build state-of-the-art systems that solve client-specific challenges, for example building agentic LLMs with RAG (retrieval-augmented generation) using More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
Ideal Candidate * Strong proficiency in Python and SQL for data wrangling and automation * Experience with ML workflows, model deployment, and feature engineering * Familiarity with modern data platforms (e.g., BigQuery, dbt, Looker) * Understanding of data modelling and cloud-based infrastructure (AWS or GCP) * Comfortable working in a fast-paced, product-focused environment * Proactive, curious, and capable of balancing technical depth with … business understanding * Excellent communication skills and a collaborative mindset Tech Stack/Tools: Python, SQL, dbt, Spark or Databricks, GCP (Open to AWS/Azure), CI/CD tooling Benefits * Company profit share scheme * Bupa private healthcare with 24/7 GP access * Up to 8% employer pension contribution * 25 days holiday + bank holidays * Access to GymPass Gold (national More ❯
Walsall, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Adecco
unlock their full potential. Liaise with stakeholders to shape clear, actionable data requirements - even when things are complex or incomplete. What You'll Bring Strong hands-on expertise in dbt, SQL, Snowflake, and orchestration tools like Airflow. Experience with Azure Data Factory and visualisation tools such as Power BI. Deep knowledge of agile data development methodologies and best practices. A More ❯
Requirements: 5+ years of experience working with production systems or data warehouses. Proficiency in Python, Snowflake/SQL, and experience with orchestration tools like Argo or Airflow. Knowledge of dbt is preferred. Experience with Retool is a plus. Get to know DevsData: We are a technology consulting company and a recruitment agency, delivering software solutions to clients from Europe and More ❯
in a modern cloud environment. You will play a key role in building and optimizing our Medallion architecture (Bronze, Silver, Gold layers), working with modern tools such as Databricks, dbt, Azure Data Factory, and Python/SQL to support critical business analytics and AI/ML initiatives. Key Responsibilities ETL Development: Design and build robust and reusable ETL/ELT … pipelines through the Medallion architecture in Databricks. Data Transformation: Create and manage data models and transformations using dbt, ensuring clear lineage, version control, and modularity. Pipeline Orchestration: Develop and manage workflow orchestration using Azure Data Factory, including setting up triggers, pipelines, and integration runtimes. System Maintenance: Monitor, maintain, and optimize existing data pipelines, including cron job scheduling and batch … privacy regulations. Required Skills and Experience 5+ years of experience in data engineering or similar roles. Strong experience with Databricks, including notebooks, cluster configuration, and Delta Lake. Proficiency in dbt for transformation logic and version-controlled data modeling. Deep knowledge of Azure Data Factory, including pipeline orchestration and integration with other Azure services. Experience with data integration (e.g., APIs, JSON More ❯
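For orientation, here is a minimal sketch of a Bronze/Silver/Gold flow of the kind this listing describes, written as PySpark against Delta tables. The table and column names are invented for illustration; a real Databricks implementation would add schema enforcement, incremental loads, and ADF-triggered scheduling.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` already exists

# Bronze: land the raw feed as-is, stamped with ingestion metadata.
bronze = (spark.read.json("/mnt/landing/orders/")  # hypothetical source path
          .withColumn("_ingested_at", F.current_timestamp()))
bronze.write.format("delta").mode("append").saveAsTable("bronze.orders")

# Silver: deduplicate and apply basic quality rules.
silver = (spark.table("bronze.orders")
          .dropDuplicates(["order_id"])
          .filter(F.col("order_total") >= 0))
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")

# Gold: business-level aggregate ready for analytics and ML features.
gold = (spark.table("silver.orders")
        .groupBy("customer_id")
        .agg(F.sum("order_total").alias("lifetime_value")))
gold.write.format("delta").mode("overwrite").saveAsTable("gold.customer_ltv")
```

Each layer reads only from the one below it, which is what gives the Medallion pattern its clear lineage: raw data is never mutated, and every Gold table can be rebuilt from Silver.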
are relied upon to ensure our systems are trusted, reliable, and available. The technology underpinning these capabilities includes industry-leading data and analytics products such as Snowflake, Tableau, dbt, Talend, Collibra, Kafka/Confluent, Astronomer/Airflow, and Kubernetes. This forms part of a longer-term strategic direction to implement Data Mesh, and with it establish shared platform … are achievable, and driving a culture of iterative improvement. Modern data stack - hands-on deployment and governance of enterprise technologies at scale (e.g. Snowflake, Tableau, dbt, Fivetran, Airflow, AWS, GitHub, Terraform, etc.) for self-service workloads. Thought leadership and influencing - deep interest in the data platforms landscape to build well-articulated proposals that are supported by More ❯
to inform strategic decisions both at the Board/Executive level and at the business unit level. Key Responsibilities Design, develop, and maintain scalable ETL pipelines using technologies like dbt, Airbyte, Cube, DuckDB, Redshift, and Superset. Work closely with stakeholders across the company to gather data requirements and set up dashboards. Promote a data-driven culture at Notabene and train, upskill … working in data engineering with large sets of data (e.g., millions of students or transactions). Proven experience in building and maintaining ETL pipelines and data infrastructure. Strong experience working with dbt Core/Cloud. Business savvy and capable of interfacing with finance, revenue, and ops leaders to build our business intelligence. Expertise in data governance, best practices, and data security, especially More ❯
in data engineering roles, preferably for a customer-facing data product. Expertise in designing and implementing large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar. Strong programming skills in languages such as SQL, Python, Go, or Scala. Demonstrable understanding and effective use of AI tooling in your development More ❯
Birmingham, Staffordshire, United Kingdom Hybrid / WFH Options
Kerv Digital for Digital Transformation
Azure ecosystem. Required Experience: • Creating mapping specifications between legacy and CRM, ERP, Finance applications • Integration to D365, Dataverse solutions or other SaaS applications • Data Quality Frameworks: Soda.io, Great Expectations, dbt • Creation of fault-tolerant data ingestion pipelines in SSIS/KWS/Data Factory/data flows using Linked Services, Integration Datasets • Extracting data from a variety of sources including More ❯
Rednal, Birmingham, United Kingdom Hybrid / WFH Options
Kerv
Azure ecosystem. Required Experience: • Creating mapping specifications between legacy and CRM, ERP, Finance applications • Integration to D365, Dataverse solutions or other SaaS applications • Data Quality Frameworks: Soda.io, Great Expectations, dbt • Creation of fault-tolerant data ingestion pipelines in SSIS/KWS/Data Factory/data flows using Linked Services, Integration Datasets • Extracting data from a variety of sources including More ❯
communicator in engaging technical staff and business partners. Bachelor's degree or higher in STEM or a quantitative discipline preferred. Preferred Skills: Full-spectrum experience with modern data platforms (Snowflake, dbt, Sigma, Alation). Proficient with ETL platforms such as Fivetran and Prefect. Hands-on experience with data lineage and observability platforms. Knowledge of Agile methodologies and project management tools (JIRA More ❯
Christchurch, Dorset, United Kingdom Hybrid / WFH Options
Wearebasis
will be valuable to ensure solutions are fit for purpose. Tooling Familiarity: Experience with relevant QA and data-related tools, which may include (but are not limited to) Airflow, Stitch, dbt, Great Expectations, Datafold, and AWS. CI/CD Integration: Familiarity with CI/CD pipelines and the ability to integrate testing processes within automated deployment workflows. Communication skills: Clear, candid More ❯
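As a rough illustration of what such QA tooling automates, the sketch below hand-rolls a few batch assertions in pandas; frameworks like Great Expectations or Datafold formalise the same idea with declarative suites and dataset diffing. The dataset, columns, and rules here are hypothetical.

```python
import pandas as pd

def check_orders(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality failures; an empty list means the batch passes."""
    failures = []
    if df["order_id"].duplicated().any():
        failures.append("duplicate order_id values")
    if (df["order_total"] < 0).any():
        failures.append("negative order_total values")
    if df["customer_id"].isna().any():
        failures.append("missing customer_id values")
    return failures

# Wired into a CI/CD pipeline, a non-empty result fails the build
# before bad data reaches downstream consumers.
batch = pd.read_parquet("orders.parquet")  # hypothetical input file
problems = check_orders(batch)
assert not problems, f"data quality failures: {problems}"
```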
Proven ability to design and develop scalable data pipelines. Strong collaboration skills with business stakeholders and cross-functional teams. Experience with infrastructure as code (ideally Terraform). Desirable: Experience using dbt and implementing dbt best practices. Overview: Job Title: Senior Data Engineer – Data Modelling & Engineering. Location: London area. Duration: 6 months. Why Apply? This is a fantastic opportunity to contribute to More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
strategy within ambitious software businesses. Specifically, you can expect to be involved in the following: Designing and developing full-stack data pipelines and platforms using modern tools such as dbt, Airflow, and cloud infrastructure. Cleansing, enriching, and modelling data to generate commercial insights and power C-level dashboards. Delivering scalable solutions that support internal use cases and extend directly to … finance, sales) and building tools that serve business needs. Background in startups or scale-ups with high adaptability and a hands-on approach. Experience with modern data tools (e.g. dbt, Airflow, CI/CD) and at least one cloud platform (AWS, GCP, Azure). Strong communication skills and a track record of credibility in high-pressure or client-facing settings. BENEFITS More ❯
Employment Type: Full-Time
Salary: £100,000 - £110,000 per annum, Inc benefits
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
S3) Deep experience with Snowflake as a data warehouse. Proficiency in Python or Scala for data processing. Excellent communication and stakeholder management skills. Preferably some experience with Terraform and dbt (although these are not essential). Benefits: Competitive salary Performance bonus scheme 100% remote working (UK only) 26 days holiday + Bank holidays + Birthday off Opportunity for growth and development More ❯
Liverpool, Merseyside, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
S3) Deep experience with Snowflake as a data warehouse. Proficiency in Python or Scala for data processing. Excellent communication and stakeholder management skills. Preferably some experience with Terraform and dbt (although these are not essential). Benefits: Competitive salary Performance bonus scheme 100% remote working (UK only) 26 days holiday + Bank holidays + Birthday off Opportunity for growth and development More ❯
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
S3) Deep experience with Snowflake as a data warehouse. Proficiency in Python or Scala for data processing. Excellent communication and stakeholder management skills. Preferably some experience with Terraform and dbt (although these are not essential). Benefits: Competitive salary Performance bonus scheme 100% remote working (UK only) 26 days holiday + Bank holidays + Birthday off Opportunity for growth and development More ❯