… experience in data leadership – ideally within fintech, SaaS, or regulated tech environments. Technical depth across data engineering, analytics, or data science. Hands-on familiarity with modern data stacks – SQL, dbt, Airflow, Snowflake, Looker/Power BI. Understanding of the AI/ML lifecycle – including tooling (Python, MLflow) and best-practice MLOps. Comfortable working across finance, risk, and commercial functions. Experience …
… hands-on technically/40% hands-off leadership and strategy. Proven experience designing scalable data architectures and pipelines. Strong Python, SQL, and experience with tools such as Airflow, dbt, and Spark. Cloud expertise (AWS preferred), with Docker/Terraform. A track record of delivering in fast-paced, scale-up environments. Nice to have: experience with streaming pipelines, MLOps, or modern …
Employment Type: Full-Time
Salary: £110,000 - £120,000 per annum, inc. benefits
London, South East, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
Strong technical background (5+ years) in building scalable data platforms. Excellent communication and stakeholder management skills. Hands-on experience with modern data tools and technologies — Python, SQL, Snowflake, Airflow, dbt, Spark, AWS, Terraform. A collaborative mindset and a passion for mentoring and developing others. Comfortable balancing technical decisions with business needs. Nice to have: experience with Data Mesh, real-time …
… solutions. You'll work closely with senior engineers and stakeholders, supporting data pipeline development and helping improve data accessibility. Key Responsibilities: Assist in building and maintaining data pipelines using dbt and Snowflake, ensuring they are scalable and efficient. Collaborate with analytics engineers and stakeholders across Product, Marketing, and Analytics teams to support data needs. Contribute to BI tool adoption and …
Longbridge, City and Borough of Birmingham, West Midlands (County), United Kingdom Hybrid / WFH Options
Kerv Digital
… Azure ecosystem. Required Experience: creating mapping specifications between legacy and CRM, ERP, and Finance applications; integration to D365, Dataverse solutions, or other SaaS applications; data quality frameworks (Soda.io, Great Expectations, DBT); creation of fault-tolerant data ingestion pipelines in SSIS/KWS/Data Factory/data flows using Linked Services and Integration Datasets; extracting data from a variety of sources including …
… direct experience - we want to hire people to grow into the role and beyond. About the team: Python is our bread and butter. The wider data platform team uses dbt, Snowflake, and Looker to model, transform, and expose data for analytics and reporting across the business. We use Docker and Kubernetes to manage our production services. We use GitHub Actions …
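For illustration only, a minimal Python sketch of the kind of stack this listing describes: querying a Snowflake table (such as a dbt-built mart) from Python. The account settings, schema, and fct_orders table are hypothetical placeholders, not details taken from the listing.

```python
import snowflake.connector

# All connection details below are hypothetical placeholders.
conn = snowflake.connector.connect(
    account="my_account",
    user="analytics_user",
    password="********",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="MARTS",
)
try:
    cur = conn.cursor()
    # Query a (hypothetical) dbt-built mart that a Looker dashboard might also read.
    cur.execute(
        "select order_month, sum(revenue) as revenue "
        "from fct_orders group by order_month order by order_month"
    )
    for order_month, revenue in cur.fetchall():
        print(order_month, revenue)
finally:
    conn.close()
```

In a setup like the one described, the transformations themselves would typically live in dbt models and the reporting layer in Looker; this sketch only shows the Python-to-Snowflake seam.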
… enterprise B2C who you will have heard of/used, seeking 2x Data Engineers to join them ASAP on an initial 6-month contract. Python, SQL, AWS, dbt, Airflow (Spark/Scala a bonus). 6 months, Outside IR35 (Ltd company). £500-600 per day. ASAP start.
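As an illustrative sketch of the Python/Airflow toolset named in this listing (not the client's actual pipeline), a minimal Airflow 2.x DAG with two dependent tasks; the DAG id, schedule, and task bodies are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull data from a source system (e.g. an API or an S3 bucket).
    print("extracting")


def transform():
    # Placeholder: in a real pipeline this step might trigger a dbt run instead.
    print("transforming")


with DAG(
    dag_id="example_daily_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task  # run transform only after extract succeeds
```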
… less structured environments and propose scalable solutions. Experience mentoring or managing juniors (or ready to step into that responsibility). Background in commodities, trading, finance, or shipping highly desirable. Bonus: DBT experience, data governance knowledge, and exposure to analytics/data science workflows. This is a hands-on role in a collaborative, face-to-face culture - perfect for someone who thrives …
West London, London, United Kingdom Hybrid / WFH Options
Young's Employment Services Ltd
… Tech Lead, Data Engineering Manager, etc. Proven success with modern data infrastructure: distributed systems, batch and streaming pipelines. Hands-on knowledge of tools such as Apache Spark, Kafka, Databricks, DBT or similar. Experience building, defining, and owning data models, data lakes, and data warehouses. Programming proficiency in Python, PySpark, Scala or Java. Experience operating in a cloud-native environment (e.g. …
… platforms (e.g. Databricks, Azure, AWS or GCP native stacks). Experience with platform observability and CI/CD for data platforms. Hands-on experience with modern data engineering tools such as dbt, Fivetran, Matillion or Airflow. History of supporting pre-sales activities in a product or consultancy-based business. What Kubrick offers: We are a fast-moving and fast-growth business which is doing …
… science, Engineering, or a related field. 5+ years of experience in data engineering, with a strong focus on Snowflake and AWS. Proficiency in SQL, Python, and ETL tools (StreamSets, DBT, etc.). Hands-on experience with Oracle RDBMS. Data migration experience to Snowflake. Experience with AWS services such as S3, Lambda, Redshift, and Glue. Strong understanding of data warehousing concepts and …
… with architects, analysts, and business stakeholders to unlock insights and enable innovation. What You'll Be Doing: Design and build robust, automated data pipelines using Azure Data Factory, Synapse, dbt, and Databricks (a minimal pipeline sketch follows this listing). Integrate data from enterprise systems (e.g. Dynamics, iTrent, Unit4) into a unified data platform. Cleanse, transform, and model data to support BI tools (e.g., Power BI) and AI …
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
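As referenced in the listing above: a minimal PySpark sketch of the kind of cleanse/transform step that might run on Databricks after an ingestion tool such as Data Factory lands raw files. The file paths, column names, and source system are hypothetical, not taken from the listing.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("crm_cleanse").getOrCreate()

# Hypothetical raw export landed by an ingestion tool such as Data Factory.
raw = spark.read.option("header", True).csv("/mnt/raw/dynamics/accounts.csv")

cleansed = (
    raw.dropDuplicates(["account_id"])                         # drop duplicate records
       .withColumn("created_date", F.to_date("created_date"))  # normalise the date type
       .filter(F.col("account_id").isNotNull())                # remove unusable rows
)

# A simple reporting-friendly aggregate that a BI tool such as Power BI could consume.
summary = cleansed.groupBy("region").agg(F.count("*").alias("account_count"))

summary.write.mode("overwrite").parquet("/mnt/curated/account_summary")
```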
… with architects, analysts, and business stakeholders to unlock insights and enable innovation. What You'll Be Doing: Design and build robust, automated data pipelines using Azure Data Factory, Synapse, dbt, and Databricks. Integrate data from enterprise systems (e.g. Dynamics, iTrent, Unit4) into a unified data platform. Cleanse, transform, and model data to support BI tools (e.g., Power BI) and AI …
South West London, London, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
… support. Required Skills and Qualifications: 4+ years of experience in troubleshooting and supporting B2B enterprise applications. Advanced knowledge and hands-on experience with Spark. Experience building, maintaining, and debugging DBT pipelines. Strong proficiency in developing, monitoring, and debugging ETL jobs. Deep understanding of SQL and experience with Databricks, Snowflake, BigQuery, Azure, Hadoop, or CDP environments. Hands-on technical support experience …
Strong experience with Snowflake data warehousing. Proficiency in Azure Data Services (e.g., Data Factory, Synapse, Blob Storage). Solid understanding of ETL/ELT pipelines. Experience with SQL, Python, or dbt is a plus. Ability to work independently and communicate effectively in a remote setting. Benefits: salary up to £70,000; fully remote working; opportunity to work on cutting-edge data …
… join our Data & Analytics team. This role is instrumental in delivering clean, modern, and efficient data solutions across cloud-native platforms. Key Responsibilities: Develop solutions across Snowflake, Azure, and DBT platforms. Lead migration and optimisation of applications using Azure cloud-native services. Write clean, testable, and maintainable code following industry standards. Implement CI/CD pipelines and test automation using …
… data platforms, data integration, or enterprise SaaS. Data platform knowledge: strong familiarity with data warehouses/lakehouses (Snowflake, Databricks, BigQuery), orchestration tools (Airflow, Prefect), streaming (Kafka, Flink), and transformation (dbt). Technical proficiency: solid understanding of REST/GraphQL APIs, SDK development, authentication/authorization standards (OAuth, SSO), and best practices in developer experience. Customer empathy: strong customer empathy for …
… senior levels. Entrepreneurial mindset with a bias for action and delivery. Desirable but not essential: sector experience in advertising, media, or technology; familiarity with tools such as Power BI, dbt, Snowflake, Python/R, and cloud platforms (Azure preferred); experience with financial modelling, forecasting, and operational analytics. VCCP DE&I Statement: We believe that DE&I is about creating …