Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Opus Recruitment Solutions Ltd
SC cleared Software Developers (Python & AWS) to join a contract until April 2026. Inside IR35. Weekly travel to Newcastle. Around £400 per day. Skills: Python, AWS Services, Terraform, Apache Spark, Airflow, Docker.
Birmingham, West Midlands, England, United Kingdom
Harnham - Data & Analytics Recruitment
contributing to how data is used to drive commercial decisions - particularly around pricing, revenue, and customer insight. Key responsibilities include: Manage and maintain the company's data warehouse (Python, Airflow, DBT, Kimball) Ensure data pipelines are robust, accurate, and performant Maintain and develop cloud infrastructure using Infrastructure as Code (Terraform) Identify opportunities to improve data processes, architecture, and efficiency … and data during the ongoing merger YOUR SKILLS AND EXPERIENCE: A successful Data Engineer will bring: Strong SQL and Python skills Experience managing or building data warehouses Familiarity with Airflow and modern data engineering workflows Interest in cloud infrastructure and IaC principles Proactive mindset - not just maintaining systems, but improving them THE BENEFITS: You will receive a salary of …
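The warehouse stack above (Python, Airflow, DBT, Kimball) centres on dimensional modelling. As an illustrative sketch only, with all table names, columns, and data invented rather than taken from the listing, a minimal Kimball-style star schema joins a fact table of measures to a descriptive dimension, shown here with Python's built-in sqlite3:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per customer, holding descriptive attributes.
cur.execute("CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, region TEXT)")

# Fact table: one row per sale, holding measures plus a foreign key
# into the dimension (the "star" join point).
cur.execute("""CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    amount REAL)""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?)",
                [(1, "North"), (2, "South")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0)])

# Typical dimensional query: aggregate fact measures grouped by a
# dimension attribute.
rows = cur.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer d USING (customer_key)
    GROUP BY d.region ORDER BY d.region
""").fetchall()
print(rows)  # [('North', 150.0), ('South', 75.0)]
```

In a DBT-managed warehouse the same shape would be expressed as models rather than raw DDL, but the fact/dimension split is identical.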
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Norton Rose Fulbright LLP
Azure/Microsoft Fabric/Data Factory) and modern data warehouse technologies (Databricks, Snowflake) Experience with database technologies such as RDBMS (SQL Server, Oracle) or NoSQL (MongoDB) Knowledge of Apache technologies such as Spark, Kafka and Airflow to build scalable and efficient data pipelines Ability to design, build, and deploy data solutions that explore, capture, transform, and utilize …
City of London, London, United Kingdom Hybrid / WFH Options
Oliver Bernard
Lead Data Engineer - Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery OB have partnered with a leading scale-up business in the AdTech space, where Data is at the forefront of everything they do, and they are currently hiring for a Lead Data Engineer to join their team and work on the development of their Data platform. This will … on experience working in Data heavy environments, with Real-Time Data Pipelines, Distributed Streaming Pipelines and strong knowledge of Cloud Environments. Lead Data Engineer - Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery Key Skills and Experience: Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery Prior experience working as a Lead Engineer or Tech Lead Pays £100k-£130k + … in Central London To be considered, you must be UK based and unfortunately visa sponsorship is unavailable 2-stage interview process! Lead Data Engineer - Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery
North London, London, United Kingdom Hybrid / WFH Options
Oliver Bernard
Lead Data Engineer - Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery OB have partnered with a leading scale-up business in the AdTech space, where Data is at the forefront of everything they do, and they are currently hiring for a Lead Data Engineer to join their team and work on the development of their Data platform. This will … on experience working in Data heavy environments, with Real-Time Data Pipelines, Distributed Streaming Pipelines and strong knowledge of Cloud Environments. Lead Data Engineer - Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery Key Skills and Experience: Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery Prior experience working as a Lead Engineer or Tech Lead Pays £100k-£130k + … in Central London To be considered, you must be UK based and unfortunately visa sponsorship is unavailable 2-stage interview process! Lead Data Engineer - Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery
core of VAULT is big data at scale. Our systems handle massive ingestion pipelines, long-term storage, and high-performance querying. We leverage distributed technologies (Kafka, Spark, Flink, Cassandra, Airflow, etc.) to deliver resilient, low-latency access to trillions of records, while continuously optimizing for scalability, efficiency, and reliability. We'll trust you to: Build high-performance, distributed data … pipelines and services Lead technical direction and ensure alignment with business needs Collaborate across teams using modern open-source tech (Kafka, FastAPI, Airflow, Spark, etc.) Mentor junior engineers and contribute to an inclusive team culture You will need to have: Bachelor's in Computer Science, Engineering, or related field (or equivalent experience) Proven professional experience in an object-oriented … Deep background in distributed, high-volume, high-availability systems Fluency in AI development tools We would love to see: Experience with big data ecosystems (Kafka, Spark, Flink, Cassandra, Redis, Airflow) Familiarity with cloud platforms (AWS, Azure, GCP) and S3-compatible storage SaaS/PaaS development experience Container technologies (Docker, Kubernetes) Bloomberg is an equal opportunity employer and we value …
and prioritize opportunities to rapidly deliver new capabilities Build and maintain robust MLOps pipelines to support scalable, reproducible, and automated model development, deployment, and monitoring Leverage tools such as Airflow for workflow orchestration and MLflow for experiment tracking, model registry, and lifecycle management, ensuring strong CI/CD practices and model governance Essential Skills Bachelor’s degree in science … skills Practical experience in the development of machine learning models and/or deep learning to solve complex science and engineering problems Experience with MLOps tools and practices, including Airflow, MLflow, and containerization (e.g., Docker) A passion for gaining insight into real-world datasets and clearly communicating through data visualization techniques Interest in material discovery, computer vision, handling big …
observability, and automation. Collaborate with Data Science and Analytics teams to ensure clean, reliable data supports modelling, reporting, and operational use-cases. Implement and maintain orchestration frameworks (e.g., Dagster, Airflow, Prefect). Develop and maintain complex data processing systems (e.g., event streaming, usage rating, CDR flows). Advocate for good engineering practice: version control, documentation, reproducibility, modularity, and scalability. … and Data Engineering. What You Bring Advanced experience in Python and SQL Strong experience building ETL/ELT pipelines and data transformations Hands-on experience with orchestration frameworks (Dagster, Airflow, Prefect, dbt) Ability to analyse and structure complex datasets Strong communication skills with both technical and non-technical colleagues Desirable Experience with Spark, Dask, or Polars Experience with containerisation …
Job title: Data Analyst Client: Elite FinTech Salary: £65,000-£100,000 + Bonus Location: London Skills: SQL, Python, PySpark, Airflow, Linux The role: My client are looking for a Data Analyst to join their team. Responsibilities: Playing a key role in all Data related activities for a wide range of datasets, used by Quants and Traders Working closely … working as a Data Analyst, ideally within FinTech or Financial Services Exposure to Derivatives or other Financial instruments Strong SQL experience Strong Python experience (PySpark, Pandas, Jupyter Notebooks etc.) Airflow/Algo for workflow management Git Any exposure to compressed file formats such as Parquet or HDF5 highly advantageous Linux/Bash skills highly desirable Please apply ASAP for …
maintaining data warehouses that seamlessly stitch together data from production databases and clickstream event data Hands-on experience with Hive query development and optimization, and building workflows (preferably using Airflow) Hands-on experience with building data pipelines in a programming language like Python Hands-on experience with building and maintaining Tableau dashboards and/or Jupyter reports Working understanding … Keywords: ETL, big data analytics, data warehouse design/architecture, Tableau dashboards and/or Jupyter reports, data engineering, data visualization, AWS, Tableau, D3, or Plotly, Node.js or Flask apps. Preferred: Airflow
cloud foundation, including networking, IAM, security, and “everything as code” with Terraform/Terramate. Implement and maintain self-healing architectures, focusing on Kubernetes. Support critical engineering platforms: multi-tenant Airflow, BigQuery, and PostgreSQL clusters. Enhance developer experience via remote dev environments, self-hosted CI/CD pipelines, and observability tools. What They're Looking For: Hands-on experience with … or equivalent professional experience. Tech Stack: Cloud: GCP Orchestration: Kubernetes IaC: Terraform/Terramate Dev Environments: Remote development tools CI/CD: Self-hosted pipelines, Argo Workflows Data Platforms: Airflow, BigQuery, PostgreSQL Observability: Grafana, Prometheus Deployments: Helm For more information, contact Maria Ciprini at Harrington Starr, or click "Apply" to start your application.
Sunderland, Tyne and Wear, England, United Kingdom
Reed
data solutions. Provide hands-on technical guidance on data design, modelling, and integration, ensuring alignment with architectural standards. Drive the adoption of tools such as Alation, Monte Carlo, and Airflow to improve data lineage, quality, and reliability. Ensure data security, privacy, and compliance are integral to all architecture and integration designs. Act as a bridge between business and technology … Glue, Azure Blob Storage, Google BigQuery). Expertise in data modelling (Dimensional, Data Vault, Enterprise). Experience designing and implementing modern data architectures. Proficiency with integration/orchestration tools (Airflow, dbt, Glue). Strong communication and stakeholder management skills. Experience with metadata, cataloguing, and data quality tools, and knowledge of data governance and GDPR. Benefits: Opportunity to work in …
code and testing principles. Develop tools and frameworks for data governance, privacy, and quality monitoring, ensuring full compliance with data protection standards. Create resilient data workflows and automation within Airflow, Databricks, and other modern big data ecosystems. Implement and manage data observability and cataloguing tools (e.g., Monte Carlo, Atlan, DataHub) to enhance visibility and reliability. Partner with ML engineers … deploy, and scale production-grade data platforms and backend systems. Familiarity with data governance frameworks, privacy compliance, and automated data quality checks. Hands-on experience with big data tools (Airflow, Databricks) and data observability platforms. Collaborative mindset and experience working with cross-functional teams including ML and analytics specialists. Curiosity and enthusiasm for continuous learning - you stay up to …
and operational reporting needs. Define best practices for Snowflake usage, data ingestion, transformation, and performance tuning. Develop and optimize ETL/ELT pipelines using tools such as Snowpipe, dbt, Airflow, Informatica, or Talend . Collaborate with data engineers, BI developers, and data scientists to ensure consistent and efficient data access. Implement data governance, security, and compliance standards (RBAC, masking … end Snowflake data architecture, including schema design, data modeling (3NF, star/snowflake, data vault), and ELT frameworks. Define data ingestion, transformation, and consumption patterns leveraging tools like DBT, Airflow, Matillion, Fivetran, or Informatica. Establish architecture blueprints for multi-region, multi-tenant, and secure Snowflake deployments. Lead migration from legacy data warehouses (Teradata, Oracle, SQL Server, Redshift, BigQuery, etc.) …
buckinghamshire, south east england, united kingdom Hybrid / WFH Options
Rightmove
rollback/recovery processes. Using MLOps tools (e.g., Vertex Pipelines, Kubeflow, Weights & Biases) for experiment tracking, model registry, and automated deployment. Leveraging Docker/Kubernetes and workflow orchestration tools (Airflow, Prefect, Dagster). Collaborating with product, design, and engineering teams to deliver ML features that directly impact customer experience. Translating model performance into business metrics (e.g., accuracy vs cost … mature organizations or teams operating at significant scale (e.g., web-scale, distributed systems, cloud-native environments). Brings expertise in MLOps: CI/CD pipelines, Docker, Kubernetes, workflow orchestration (Airflow, Prefect), and automation. Has experience across and understands the full ML lifecycle. Can design for long-term scalability, reliability, and resilience. Has strong programming skills with Python – essential. …
Data & Insight Engineer Salary: £55,000-£60,000 + Share Options + Biannual Bonus Location: Central London (3 days in office) About the Company/Role Fast-growing food-tech scale-up founded by leaders from high-growth consumer brands.
City of London, London, United Kingdom Hybrid / WFH Options
Harnham
deployment . 🔧 What You’ll Do Architect and develop backend systems (Python, FastAPI, Flask, NodeJS) Design and implement RAG pipelines (LangChain, Qdrant, embeddings) Build ETL & CI/CD workflows (Airflow, dbt, MLFlow) Integrate AI/LLM components into backend services Ensure reliability, scalability, and maintainability across systems Collaborate with a small, elite team of engineers and AI researchers 💼 About … You 4+ years of backend experience (Python, distributed systems, async APIs) Proven experience integrating AI or LLM systems (LangChain, vector DBs, etc.) Hands-on with some of Airflow, dbt, MLFlow, ETL, and CI/CD workflows Experience in early-stage startup environments — self-starter mindset Excited by ambiguity, ownership, and building from zero to one Ideally: background in workflow …
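The RAG pipeline work described above ultimately rests on embedding documents and retrieving the nearest ones for a query. As a minimal sketch, assuming toy hand-written vectors in place of real model embeddings (a production pipeline would use LangChain with a vector store such as Qdrant, as the listing notes), the core retrieval step is a cosine-similarity ranking:

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def retrieve(query_vec, docs, k=1):
    # Rank documents by similarity to the query embedding; return the top k.
    ranked = sorted(docs, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return [d["text"] for d in ranked[:k]]

# Toy 3-dimensional "embeddings"; real ones come from an embedding model
# and typically have hundreds or thousands of dimensions.
docs = [
    {"text": "refund policy", "vec": [0.9, 0.1, 0.0]},
    {"text": "shipping times", "vec": [0.1, 0.9, 0.2]},
]
print(retrieve([0.8, 0.2, 0.1], docs))  # ['refund policy']
```

The retrieved passages are then concatenated into the LLM prompt; a vector database replaces the linear scan here with an approximate nearest-neighbour index at scale.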
Milton Keynes, England, United Kingdom Hybrid / WFH Options
beatmysalary
to PostgreSQL. 1. Develop & Schedule SQL Views via DAGs Design and implement SQL views aligned with business needs, prioritizing clarity, reusability, and efficiency. Build and manage workflow orchestrations (e.g., Airflow DAGs) to automate those views, ensuring reliable execution on daily, weekly, or customized schedules. 2. Execute Cross‐Platform ETL with AWS Glue Develop, deploy, and maintain AWS Glue jobs … Set up secure connectivity, schedule jobs via cron or trigger mechanisms, and ensure data pipelines are reliable and idempotent. 3. Monitor, Troubleshoot & Resolve Incidents Continuously oversee ETL workflows in Airflow and AWS Glue, proactively responding to alerts and errors. Conduct root cause analysis for pipeline failures—whether due to schema mismatches or performance bottlenecks—and apply robust fixes. Document …
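On the requirement above that pipelines be reliable and idempotent: a common pattern is to make each scheduled load an upsert keyed on its partition, so a retried run overwrites its own output rather than duplicating it. A minimal sketch with Python's built-in sqlite3 (table and column names are invented for illustration; in this role the load would run inside an Airflow task against PostgreSQL, which supports the same ON CONFLICT syntax):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE daily_totals (day TEXT PRIMARY KEY, total REAL)")

def load(day, total):
    # Upsert keyed on the day partition: re-running the same load
    # updates the existing row instead of inserting a duplicate,
    # which is what makes a scheduled ETL job safe to retry.
    cur.execute(
        "INSERT INTO daily_totals VALUES (?, ?) "
        "ON CONFLICT(day) DO UPDATE SET total = excluded.total",
        (day, total),
    )

load("2025-01-01", 100.0)
load("2025-01-01", 100.0)  # simulated retry: still exactly one row
rows = cur.execute("SELECT * FROM daily_totals").fetchall()
print(rows)  # [('2025-01-01', 100.0)]
```

An Airflow DAG would wrap each such load in a task and supply the partition key from the schedule's logical date, so backfills and reruns target the same rows deterministically.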