Akkodis is a global leader in engineering, technology, and R&D, harnessing the power of connected data to drive digital transformation and innovation for a smarter, more sustainable future. As part of the Adecco Group, Akkodis combines the expertise of …
Watford, Hertfordshire, East Anglia, United Kingdom
Akkodis
cross-functional teams to build scalable data pipelines and contribute to digital transformation initiatives across government departments.
Key Responsibilities
Design, develop and maintain robust data pipelines using PostgreSQL and Airflow or Apache Spark
Collaborate with frontend/backend developers using Node.js or React
Implement best practices in data modelling, ETL processes and performance optimisation
Contribute to containerised deployments … essential
Operate within Agile teams and support DevOps practices
What We're Looking For
Proven experience as a Data Engineer in complex environments
Strong proficiency in PostgreSQL and either Airflow or Spark
Solid understanding of Node.js or React for integration and tooling
Familiarity with containerisation technologies (Docker/Kubernetes) is a plus
Excellent communication and stakeholder engagement skills
Experience …
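As an illustrative sketch only (not part of the listing), the extract–transform–load pipeline work described above can be outlined in plain Python. sqlite3 stands in for the PostgreSQL target here, and the table, column, and function names are all hypothetical:

```python
import sqlite3

def extract(rows):
    # Extract: raw source records (an in-memory list stands in for an API or file feed)
    return rows

def transform(rows):
    # Transform: normalise names and drop incomplete records
    return [(name.strip().title(), amount) for name, amount in rows if amount is not None]

def load(conn, rows):
    # Load: write the cleaned records into the target table
    conn.execute("CREATE TABLE IF NOT EXISTS payments (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")  # sqlite3 as a stand-in for PostgreSQL
raw = [(" alice ", 10.0), ("BOB", 5.5), ("carol", None)]
load(conn, transform(extract(raw)))
print(conn.execute("SELECT name, amount FROM payments ORDER BY name").fetchall())
# → [('Alice', 10.0), ('Bob', 5.5)]
```

In production such steps would typically become individual Airflow tasks (or Spark jobs) rather than direct function calls, so each stage can be retried and monitored independently.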
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Opus Recruitment Solutions Ltd
SC cleared Software Developers (Python & AWS) to join a contract till April 2026.
Inside IR35
SC cleared
Weekly travel to Newcastle
Around £400 per day
Contract till April 2026
Skills:
- Python
- AWS Services
- Terraform
- Apache Spark
- Airflow
- Docker
Role: Snowflake Engineer/Developer
Location: Manchester, UK
Work Mode: Permanent - Hybrid - 4 Days/Week (Mandatory)
JOB DESCRIPTION: Mandatory
Possess good knowledge in Cloud computing, Snowflake, DBT and Airflow
Very good working knowledge in data models, viz. Dimensional Data Model, ER Data Model and Data Vault
Very good working knowledge in writing SQL queries
Very good working knowledge in …
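As a minimal sketch (not taken from the listing), the Dimensional Data Model mentioned above pairs a fact table of measurable events with dimension tables of descriptive attributes; a typical query aggregates facts grouped by a dimension attribute. sqlite3 stands in for Snowflake here, and all table and column names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Dimension table: descriptive attributes, one row per product (hypothetical schema)
conn.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT)")
# Fact table: measurable events keyed to the dimension
conn.execute("CREATE TABLE fact_sales (product_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO dim_product VALUES (?, ?)", [(1, "books"), (2, "games")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?)", [(1, 20.0), (1, 5.0), (2, 12.5)])

# Typical star-schema query: join facts to a dimension and aggregate
rows = conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)  # → [('books', 25.0), ('games', 12.5)]
```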
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Norton Rose Fulbright LLP
Azure/Microsoft Fabric/Data Factory) and modern data warehouse technologies (Databricks, Snowflake)
Experience with database technologies such as RDBMS (SQL Server, Oracle) or NoSQL (MongoDB)
Knowledge in Apache technologies such as Spark, Kafka and Airflow to build scalable and efficient data pipelines
Ability to design, build, and deploy data solutions that explore, capture, transform, and utilize …
City of London, London, United Kingdom Hybrid / WFH Options
Oliver Bernard
Lead Data Engineer - Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery
OB have partnered with a leading scale-up business in the AdTech space, where Data is at the forefront of everything they do, and they are currently hiring for a Lead Data Engineer to join their team and work on the development of their Data platform. This will … on experience working in Data heavy environments, with Real-Time Data Pipelines, Distributed Streaming Pipelines and strong knowledge of Cloud Environments.
Lead Data Engineer - Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery
Key Skills and Experience:
Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery
Prior experience working as a Lead Engineer or Tech Lead
Pays £100k-£130k + … in Central London
To be considered, you must be UK based and unfortunately visa sponsorship is unavailable
2-stage interview process!
North London, London, United Kingdom Hybrid / WFH Options
Oliver Bernard
Lead Data Engineer - Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery
OB have partnered with a leading scale-up business in the AdTech space, where Data is at the forefront of everything they do, and they are currently hiring for a Lead Data Engineer to join their team and work on the development of their Data platform. This will … on experience working in Data heavy environments, with Real-Time Data Pipelines, Distributed Streaming Pipelines and strong knowledge of Cloud Environments.
Lead Data Engineer - Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery
Key Skills and Experience:
Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery
Prior experience working as a Lead Engineer or Tech Lead
Pays £100k-£130k + … in Central London
To be considered, you must be UK based and unfortunately visa sponsorship is unavailable
2-stage interview process!
Oxford District, South East England, United Kingdom
Alloyed
and prioritize opportunities to rapidly deliver new capabilities
Build and maintain robust MLOps pipelines to support scalable, reproducible, and automated model development, deployment, and monitoring
Leverage tools such as Airflow for workflow orchestration and MLflow for experiment tracking, model registry, and lifecycle management, ensuring strong CI/CD practices and model governance
Essential Skills
Bachelor’s degree in science … skills
Practical experience in the development of machine learning models and/or deep learning to solve complex science and engineering problems
Experience with MLOps tools and practices, including Airflow, MLflow, and containerization (e.g., Docker)
A passion for gaining insight into real-world datasets and clearly communicating through data visualization techniques
Interest in material discovery, computer vision, handling big …
Job title: Data Analyst
Client: Elite FinTech
Salary: £65,000-£100,000 + Bonus
Location: London
Skills: SQL, Python, PySpark, Airflow, Linux
The role: My client is looking for a Data Analyst to join their team.
Responsibilities:
Playing a key role in all Data related activities for a wide range of datasets, used by Quants and Traders
Working closely … working as a Data Analyst, ideally within FinTech or Financial Services
Exposure to Derivatives or other Financial instruments
Strong SQL experience
Strong Python experience (PySpark, Pandas, Jupyter Notebooks etc.)
Airflow/Algo for workflow management
Git
Any exposure to compressed file formats such as Parquet or HDF5 highly advantageous
Linux/Bash skills highly desirable
Please apply ASAP for …
London (City of London), South East England, United Kingdom
Harrington Starr
cloud foundation, including networking, IAM, security, and "everything as code" with Terraform/Terramate.
Implement and maintain self-healing architectures, focusing on Kubernetes.
Support critical engineering platforms: multi-tenant Airflow, BigQuery, and PostgreSQL clusters.
Enhance developer experience via remote dev environments, self-hosted CI/CD pipelines, and observability tools.
What They're Looking For:
Hands-on experience with … or equivalent professional experience.
Tech Stack:
Cloud: GCP
Orchestration: Kubernetes
IaC: Terraform/Terramate
Dev Environments: Remote development tools
CI/CD: Self-hosted pipelines, Argo Workflows
Data Platforms: Airflow, BigQuery, PostgreSQL
Observability: Grafana, Prometheus
Deployments: Helm
For more information, contact Maria Ciprini at Harrington Starr, or click "Apply" to start your application.
London, South East England, United Kingdom Hybrid / WFH Options
Compare the Market
re Looking For
Must Have
Practical experience deploying ML models into production environments
Strong Python development skills and understanding of ML model structures
Familiarity with tools such as MLflow, Airflow, SageMaker, or Vertex AI
Understanding of CI/CD concepts and basic infrastructure automation
Ability to write well-tested, maintainable, and modular code
Strong collaboration skills and a growth … ML delivery
Nice to Have
Experience working in regulated sectors such as insurance, banking, or financial services
Exposure to Databricks, container orchestration (e.g. Kubernetes), or workflow engines (e.g. Argo, Airflow)
Familiarity with real-time model deployment, streaming data, or event-driven systems (e.g. Kafka, Flink)
Interest in MLOps, model governance, and responsible AI practices
Understanding of basic model evaluation …
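As an illustrative sketch only (not part of the listing), the "basic model evaluation" mentioned above usually starts with accuracy, precision, and recall computed from a model's binary predictions; the labels and predictions below are hypothetical:

```python
def evaluate(y_true, y_pred):
    # Basic binary-classification evaluation: accuracy, precision, recall
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    return {
        "accuracy": correct / len(y_true),
        "precision": tp / (tp + fp) if tp + fp else 0.0,
        "recall": tp / (tp + fn) if tp + fn else 0.0,
    }

# Hypothetical ground-truth labels and model predictions
metrics = evaluate([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
print(metrics)  # accuracy 0.6, precision and recall both 2/3
```

In an MLflow-based workflow such metrics would typically be logged against a tracked run so deployments can be gated on them, rather than printed.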