s Degree (Engineering/Computer Science preferred but not required) or equivalent experience required. Deep proficiency in Python, SQL, cloud platforms (AWS, GCP, Azure), data warehousing (Snowflake), orchestration (Airflow, Rundeck), and streaming (Kafka). Continuous engagement with Data Science and Analytics colleagues to understand requirements for our data assets and empower them with the best possible data, to create high value …
Product Owner or Product Owner for data/analytics platforms. Understanding of the software development lifecycle with a data-centric lens. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Proven experience translating data and analytics requirements into actionable backlog items. Knowledge of regulatory and compliance frameworks (e.g., GDPR, CCPA) as …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Motability Operations Limited
Product Owner or Product Owner for data/analytics platforms. Understanding of the software development lifecycle with a data-centric lens. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Proven experience translating data and analytics requirements into actionable backlog items. Knowledge of regulatory and compliance frameworks (e.g., GDPR, CCPA) as …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
Product Owner or Product Owner for data/analytics platforms. Understanding of the software development lifecycle with a data-centric lens. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Proven experience translating data and analytics requirements into actionable backlog items. Knowledge of regulatory and compliance frameworks (e.g., GDPR, CCPA) as …
Employment Type: Permanent, Part Time, Work From Home
in trust metrics or customer experience analysis. Knowledge of dashboard design and data visualization best practices. Experience with cloud-based data infrastructure (AWS). Familiarity with modern data stack tools (Airflow, dbt, etc.). Why This Role Matters: Judge.me is at an inflection point. As the market leader in Shopify reviews, we've chosen to build our future with Shopify because …
such as Tableau, with a focus on optimizing underlying data structures for dashboard performance. Ingestion and orchestration tools: skilled in using pipeline orchestration and data ingestion tools such as Airflow and Stitch, along with Python scripting for integrating diverse data sources. Large-scale data processing: proficient with distributed query engines like AWS Athena or Spark SQL for working with datasets …
systems (e.g. Git) Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure). Exposure to orchestration tools such as Kubeflow pipelines or Airflow. Familiarity with DBT or similar tools for modelling data in data warehouses. Desire to build interpretable and explainable ML models (using techniques such as SHAP). Desire to quantify the …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Starling Bank Limited
systems (e.g. Git) Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure). Exposure to orchestration tools such as Kubeflow pipelines or Airflow. Familiarity with DBT or similar tools for modelling data in data warehouses. Desire to build interpretable and explainable ML models (using techniques such as SHAP). Desire to quantify the …
In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and … significant impact, we encourage you to apply! Job Responsibilities ETL/ELT Pipeline Development: Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow. Implement batch and real-time data processing solutions using Apache Spark. Ensure data quality, governance, and security throughout the data lifecycle. Cloud Data Engineering: Manage and optimize … effectiveness. Implement and maintain CI/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Develop and optimize large-scale data processing pipelines using Apache Spark and PySpark. Implement data partitioning, caching, and performance tuning techniques to enhance Spark-based workloads. Work with diverse data formats (structured and unstructured) to support advanced analytics and …
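The ETL/ELT responsibilities described above follow the standard extract-transform-load pattern. A minimal sketch in plain Python (deliberately omitting the Airflow and PySpark APIs so it stays self-contained; the record fields, quality rule, and function names are illustrative assumptions, not taken from any listing):

```python
def extract(rows):
    # Simulate ingesting raw records; in production this might read
    # from blob storage or an event stream.
    return [r for r in rows if r is not None]

def transform(rows):
    # Apply a simple quality rule and normalise a field, standing in
    # for a PySpark transformation step.
    return [
        {"id": r["id"], "amount": round(float(r["amount"]), 2)}
        for r in rows
        if r.get("id") is not None and r.get("amount") is not None
    ]

def load(rows, sink):
    # Append validated records to a sink, a stand-in for a warehouse write.
    sink.extend(rows)
    return len(rows)

# Run the stages end to end -- the unit an Airflow DAG would schedule:
raw = [{"id": 1, "amount": "10.5"}, None,
       {"id": 2, "amount": "3.333"}, {"id": None, "amount": "9"}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

In a real deployment, each stage would typically be a separate task in an Airflow DAG, with the transform expressed as a Spark job rather than a list comprehension.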
In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and … desire to make a significant impact, we encourage you to apply! Job Responsibilities Data Engineering & Data Pipeline Development: Design, develop, and optimize scalable data workflows using Python, PySpark, and Airflow. Implement real-time and batch data processing using Spark. Enforce best practices for data quality, governance, and security throughout the data lifecycle. Ensure data availability, reliability and performance through … data processing workloads. Implement CI/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Build and optimize large-scale data processing pipelines using Apache Spark and PySpark. Implement data partitioning, caching, and performance tuning for Spark-based workloads. Work with diverse data formats (structured and unstructured) to support advanced analytics and machine learning …
normalisation is required for the role. Equally, strong ML experience and proficiency in Python and SQL are essential, ideally with experience using data processing frameworks such as Kafka, NoSQL, Airflow, TensorFlow, or Spark. Finally, experience with cloud platforms like AWS or Azure, including data services such as Apache Airflow, Athena, or SageMaker, is essential. This is a …
City of London, London, United Kingdom Hybrid / WFH Options
Owen Thomas | Pending B Corp™
with cloud platforms (GCP preferred). Experience with CI/CD pipelines and version control. Proficiency in data visualisation tools (e.g. Tableau, PowerBI). Exposure to tools like DBT, Apache Airflow, Docker. Experience working with large-scale datasets (terabyte-level or higher). Excellent problem-solving capabilities. Strong communication and collaboration skills. Proficiency in Python and SQL (or …
to manage competing technical requirements across complex systems. Strong communication and stakeholder engagement skills, enabling you to translate technical solutions into business value. Desirable experience with tools like Snowflake, Apache Airflow, and AWS certification (or demonstrable equivalent knowledge) will help you thrive from day one. You'll benefit from: Our compensation package includes a competitive salary, company bonus …
data engineering or a related field. Proficiency in programming languages such as Python, Spark, SQL. Strong experience with SQL databases. Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF). Experience with cloud platforms (Azure preferred) and related data services. There's no place quite like BFS and we're proud of that. And it …
DATA ENGINEER - DBT/AIRFLOW/DATABRICKS 4-MONTH CONTRACT £450-550 PER DAY OUTSIDE IR35 This is an exciting opportunity for a Data Engineer to join a leading media organisation working at the forefront of data innovation. You'll play a key role in designing and building the data infrastructure that supports cutting-edge machine learning and LLM … to accelerate delivery of critical pipelines and platform improvements. THE ROLE You'll join a skilled data team to lead the build and optimisation of scalable pipelines using DBT, Airflow, and Databricks. Working alongside data scientists and ML engineers, you'll support everything from raw ingestion to curated layers powering LLMs and advanced analytics. Your responsibilities will include: Building and … maintaining production-grade ETL/ELT workflows with DBT and Airflow; collaborating with AI/ML teams to support data readiness for experimentation and inference; writing clean, modular SQL and Python code for use in Databricks; contributing to architectural decisions around pipeline scalability and performance; supporting the integration of diverse data sources into the platform; ensuring data quality, observability …
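The data-quality responsibility mentioned above is commonly implemented as dbt-style schema tests on each model. A hedged sketch of the two most common checks (`not_null` and `unique`) in plain Python; the `orders` table and `order_id` column are hypothetical examples, and real dbt would express these as YAML-configured tests compiled to SQL:

```python
def not_null(rows, column):
    # dbt-style not_null test: return the rows that fail (column missing).
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    # dbt-style unique test: return the values that appear more than once.
    seen, dupes = set(), set()
    for r in rows:
        value = r.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

# Hypothetical staging data with one null key and one duplicate key:
orders = [{"order_id": 1}, {"order_id": 2}, {"order_id": 2}, {"order_id": None}]
null_failures = not_null(orders, "order_id")
dupe_failures = unique(orders, "order_id")
```

An orchestrator such as Airflow would run these checks as a task after each load and fail the pipeline (or raise an observability alert) when either list is non-empty.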
Brighton, Sussex, United Kingdom Hybrid / WFH Options
Burns Sheehan
Lead Data Engineer £75,000-£85,000 AWS, Python, SQL, Airflow Brighton, hybrid working Analyse customer behaviour using AI & ML We are partnered with a private equity backed company who provide an AI-powered, guided selling platform that helps businesses improve online sales and customer experience. They are looking for a Lead Data Engineer to lead a small team … experience in a Senior Data Engineering role. Comfortable owning and delivering technical projects end-to-end. Strong in Python, SQL, and cloud platforms (AWS or comparable). Experience with Airflow, Snowflake, Docker (or similar). Familiarity with coaching and mentoring more junior engineers, leading 1:1s and check-ins. Wider tech stack: AWS, Python, Airflow, Fivetran, Snowflake … Enhanced parental leave and pay If you are interested in finding out more, please apply or contact me directly! Lead Data Engineer £75,000-£85,000 AWS, Python, SQL, Airflow Brighton, hybrid working Analyse customer behaviour using AI & ML Burns Sheehan Ltd will consider applications based only on skills and ability and will not discriminate on any grounds.
pipelines to power next-gen data products in the commodities industry. Ensure data quality using the latest analytics and monitoring tools. Design and build robust pipelines with tools like Airflow and DBT. Create scalable infrastructure on Azure using technologies like Terraform. Write clean, high-quality, reusable code aligned with best practices. Drive innovation by bringing your own ideas; your … in a fast-paced startup or agile environment. Strong background in schema design and dimensional data modeling. Able to communicate data architecture clearly with internal stakeholders. Experience with Azure, Airflow, DBT, Kubernetes, GitHub. Bonus points for: open-source contributions, an active GitHub profile, and curiosity for the latest in tech. A natural problem-solver who loves making things work.
City of London, London, United Kingdom Hybrid / WFH Options
Harnham
within ambitious software businesses. Specifically, you can expect to be involved in the following: Designing and developing full-stack data pipelines and platforms using modern tools such as dbt, Airflow, and cloud infrastructure. Cleansing, enriching and modelling data to generate commercial insights and power C-level dashboards. Delivering scalable solutions that support internal use cases and extend directly to … sales) and building tools that serve business needs. Background in startups or scale-ups with high adaptability and a hands-on approach. Experience with modern data tools (e.g. dbt, Airflow, CI/CD) and at least one cloud platform (AWS, GCP, Azure). Strong communication skills and a track record of credibility in high-pressure or client-facing settings. BENEFITS …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
within ambitious software businesses. Specifically, you can expect to be involved in the following: Designing and developing full-stack data pipelines and platforms using modern tools such as dbt, Airflow, and cloud infrastructure. Cleansing, enriching and modelling data to generate commercial insights and power C-level dashboards. Delivering scalable solutions that support internal use cases and extend directly to … sales) and building tools that serve business needs. Background in startups or scale-ups with high adaptability and a hands-on approach. Experience with modern data tools (e.g. dbt, Airflow, CI/CD) and at least one cloud platform (AWS, GCP, Azure). Strong communication skills and a track record of credibility in high-pressure or client-facing settings. BENEFITS …
Employment Type: Full-Time
Salary: £100,000 - £110,000 per annum, Inc benefits