development of a new global data platform (PaaS) Ensure scalable storage solutions (data lakes, data warehouses) to handle structured and unstructured data. Implement ETL/ELT pipelines using Dagster, Airflow, or similar tools. Optimize performance and scalability for large data volumes. Govern data security, compliance, and access controls. Development & DevOps: Strong programming and scripting skills in Python. Knowledge of More ❯
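Several listings on this page centre on ETL/ELT pipelines orchestrated with Dagster or Airflow. Stripped of scheduling, retries, and observability, such a pipeline is a chain of dependent extract/transform/load steps. The stdlib-only sketch below illustrates that shape; the function names and sample data are illustrative assumptions, not any employer's code:

```python
# Minimal ETL sketch: three dependent steps run in order. Orchestrators
# like Airflow or Dagster layer scheduling, retries, and monitoring on
# top of exactly this kind of dependency chain.

def extract() -> list[dict]:
    # Stand-in for pulling rows from a source system.
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "3.2"}]

def transform(rows: list[dict]) -> list[dict]:
    # Cast string fields to numeric types; silently drop malformed rows.
    out = []
    for row in rows:
        try:
            out.append({"id": row["id"], "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue
    return out

def load(rows: list[dict], target: list) -> int:
    # Stand-in for writing to a warehouse table; returns rows written.
    target.extend(rows)
    return len(rows)

def run_pipeline() -> list[dict]:
    warehouse: list[dict] = []
    load(transform(extract()), warehouse)
    return warehouse
```

In a real orchestrator each function would become a task (an Airflow operator or a Dagster asset) so the dependency chain is declared rather than hard-coded.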
CI/CD processes GitHub experience including actions and CI/CD Data Modeling experience, including extensive experience designing dimensional models based on business use cases and reporting needs Airflow experience (Task scheduler and orchestrator) Python experience (Programming Language) Soft Skills: Interpersonal skills to engage and communicate effectively with customers and audiences of different backgrounds within the organization Please More ❯
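The dimensional-modelling requirement above refers to star schemas: fact tables of measurable events joined to descriptive dimension tables. A toy example using the stdlib `sqlite3` module (table names, columns, and figures are invented for illustration):

```python
import sqlite3

# Toy star schema: a fact table keyed to a dimension table, the core
# pattern behind dimensional modelling for reporting use cases.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity INTEGER, revenue REAL);
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'),
                                   (2, 'Gadget', 'Hardware');
    INSERT INTO fact_sales VALUES (10, 1, 3, 30.0),
                                  (11, 2, 1, 15.0),
                                  (12, 1, 2, 20.0);
""")

def revenue_by_category(conn: sqlite3.Connection) -> dict:
    # Typical reporting query: aggregate the fact table grouped by a
    # dimension attribute.
    rows = conn.execute("""
        SELECT p.category, SUM(f.revenue)
        FROM fact_sales f JOIN dim_product p USING (product_key)
        GROUP BY p.category
    """).fetchall()
    return dict(rows)
```

The design choice being tested in interviews for such roles is usually which measures belong in the fact table versus which descriptive attributes belong in dimensions.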
role. A strong understanding of data quality concepts, methodologies, and best practices. Proficiency in SQL and data querying for data validation and testing purposes. Hands-on experience with Snowflake, Airflow or Matillion would be ideal Familiarity with data integration, ETL processes, and data governance frameworks. Solid understanding of data structures, relational databases, and data modelling concepts. Excellent analytical and More ❯
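The "SQL and data querying for data validation" requirement above typically means running automated quality checks against tables. A stdlib-only sketch of that pattern (the `customers` table and the specific checks are illustrative assumptions):

```python
import sqlite3

# Sketch of SQL-based data quality checks: each check is a query that
# returns 1 (pass) or 0 (fail). Tools like dbt tests or Matillion
# validation jobs industrialise this same idea.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, email TEXT);
    INSERT INTO customers VALUES (1, 'a@x.com'), (2, NULL), (2, 'b@x.com');
""")

def run_quality_checks(conn: sqlite3.Connection) -> dict:
    checks = {
        # Every row must have a non-null email.
        "no_null_emails": "SELECT COUNT(*) = 0 FROM customers WHERE email IS NULL",
        # id must be unique across the table.
        "unique_ids": "SELECT COUNT(*) = COUNT(DISTINCT id) FROM customers",
        # Table must not be empty.
        "non_empty": "SELECT COUNT(*) > 0 FROM customers",
    }
    return {name: bool(conn.execute(sql).fetchone()[0])
            for name, sql in checks.items()}
```

With the deliberately dirty sample data above, the null-email and unique-id checks fail while the non-empty check passes.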
similarly complex operational businesses Experience using LLMs or AI tools to structure and extract meaning from unstructured data Experience automating workflows and deploying model pipelines (e.g. MLFlow, GCP Vertex, Airflow, dbt or similar) Exposure to business planning, pricing, or commercial decision-making Familiarity with geospatial data Experience in fast-scaling startups or operational teams We're flexible on experience More ❯
.NET framework backend services and React frontends. You'll utilise tools such as Terraform for infrastructure-as-code (IaC), AWS (Lambda, EC2, EKS, Step Functions, VPC etc.) for ETL, Airflow pipelines, Snowflake, and ensure architectural alignment with AI/ML initiatives and data-driven services. You will serve as the go-to engineer for: End-to-end data migration More ❯
Spring Boot, Python/FastAPI - Frontend: TypeScript, React, Next.js. Headless CMS, Design systems. - CI/CD: ArgoCD, GitHub Actions - Infrastructure: GCP, Kubernetes, Terraform, Grafana - Data: Postgres, dbt, BigQuery, Airflow, Hex What You'll Be Doing Igniting Potential: Lead, mentor, and empower a high-performing team of 3-6 engineers, focusing relentlessly on outcomes that wow our members, not More ❯
only clean, high-quality data flows through the pipeline. Collaborate with research to define data quality benchmarks. Optimize end-to-end performance across distributed data processing frameworks (e.g., Apache Spark, Ray, Airflow). Work with infrastructure teams to scale pipelines across thousands of GPUs. Work directly with the leadership on the data team roadmaps. Manage the … on experience in training and optimizing classifiers. Experience managing large-scale datasets and pipelines in production. Experience in managing and leading small teams of engineers. Expertise in Python, Spark, Airflow, or similar data frameworks. Understanding of modern infrastructure: Kubernetes, Terraform, object stores (e.g. S3, GCS), and distributed computing environments. Strong communication and leadership skills; you can bridge the gap More ❯
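The distributed-processing work this posting describes reduces to a common shape: partition a dataset, process partitions concurrently, merge the results. Spark and Ray do this across clusters; the stdlib sketch below shows the same shape on one machine (the partitioning scheme and the filter transform are illustrative assumptions):

```python
from concurrent.futures import ThreadPoolExecutor

# Partition -> parallel map -> merge: the skeleton that distributed
# frameworks like Spark and Ray scale out across machines.

def partition(data: list, n_parts: int) -> list:
    # Round-robin split into roughly equal partitions.
    return [data[i::n_parts] for i in range(n_parts)]

def clean_partition(rows: list) -> list:
    # Stand-in transform: keep only non-negative values.
    return [r for r in rows if r >= 0]

def run_distributed(data: list, n_parts: int = 4) -> list:
    with ThreadPoolExecutor(max_workers=n_parts) as pool:
        results = pool.map(clean_partition, partition(data, n_parts))
    # Merge partition outputs; cross-partition ordering is not preserved,
    # matching the usual semantics of distributed transforms.
    return [x for part in results for x in part]
```

For CPU-bound transforms a real system would use processes or separate workers rather than threads; the structure is the same.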
Basingstoke, Hampshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
e.g., Archimate), and cloud platforms (AWS, Azure) Hands-on experience with DevSecOps tooling, automation platforms (Power Platform, UiPath), and secure software development practices Familiarity with data integration pipelines (Kafka, Apache Airflow), API management, and scripting (Python) Strong understanding of software design patterns including microservices, cloud-native, and OO design Eligible and willing to undergo high-level security clearance More ❯
Southampton, South East England, United Kingdom Hybrid / WFH Options
Anson Mccade
e.g., Archimate), and cloud platforms (AWS, Azure) Hands-on experience with DevSecOps tooling, automation platforms (Power Platform, UiPath), and secure software development practices Familiarity with data integration pipelines (Kafka, Apache Airflow), API management, and scripting (Python) Strong understanding of software design patterns including microservices, cloud-native, and OO design Eligible and willing to undergo high-level security clearance More ❯
ML Pipelines: Engage with data and ML scientists to plan the architecture for end-to-end machine learning workflows. Implement scalable training and deployment pipelines using tools such as Apache Airflow and Kubernetes. Perform comprehensive testing to ensure reliability and accuracy of deployed models. Develop instrumentation and automated alerts to manage system health and detect issues in real More ❯
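The "instrumentation and automated alerts" duty above usually means comparing live model metrics against a baseline and flagging degradation. A minimal sketch, assuming metric names and a 10% relative-drop threshold that are purely illustrative:

```python
# Sketch of automated model-health alerting: flag any metric that has
# degraded beyond a relative tolerance versus its baseline value.

def check_model_health(baseline: dict, current: dict,
                       tolerance: float = 0.10) -> list:
    """Return alert messages for metrics degraded beyond `tolerance`."""
    alerts = []
    for metric, base_value in baseline.items():
        value = current.get(metric)
        if value is None:
            alerts.append(f"{metric}: missing from current report")
        elif base_value > 0 and (base_value - value) / base_value > tolerance:
            alerts.append(f"{metric}: {value:.3f} vs baseline {base_value:.3f}")
    return alerts
```

In production this check would run on a schedule (e.g. as an Airflow task) and route non-empty alert lists to a paging or chat system.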
features. Rapid Prototyping: Create interactive AI demos and proofs-of-concept with Streamlit, Gradio, or Next.js for stakeholder feedback; MLOps & Deployment: Implement CI/CD pipelines (e.g., GitLab Actions, Apache Airflow), experiment tracking (MLflow), and model monitoring for reliable production workflows; Cross-Functional Collaboration: Participate in code reviews, architectural discussions, and sprint planning to deliver features end-to More ❯
Bath, Somerset, South West, United Kingdom Hybrid / WFH Options
Cathcart Technology
FastAPI with either Flask or Django for API frameworks for web services AWS CDK experience with Infrastructure as code for AWS deployments Docker Containerization for local development Prefect or Airflow or Dagster experience for pipeline orchestration frameworks What you'll be doing: Develop and optimize FastAPI applications integrated with AWS CDK infrastructure. Collaborate closely with developers, subject matter experts … with AWS services, deployment processes, and infrastructure as code approaches AWS CDK, Terraform. Comfortable working with Docker containers for local development. Familiarity with pipeline orchestration frameworks such as Prefect, Airflow, or Dagster. Excellent communication skills with a collaborative mindset. Contract Details: Location: Fully remote UK based candidates only. Length: 9 months. Rate: £450 to £465 per day Outside IR35. More ❯
efficiency and predictive capabilities, with demonstrable performance metrics. Desirable Bachelor's degree in Computer Science or Software Engineering. Experience deploying AI within AWS and MS Azure. Experience using Docker, Airflow and OpenShift. Cloud Certification. Data science model review, code refactoring, model optimization, containerisation, deployment, versioning, monitoring of model quality and non-functional requirements. The role offers a strong More ❯
such as Lightdash, Looker or Tableau Comfortable with version control, testing frameworks, and CI/CD in a data context Familiarity with Python and orchestration tools like Dagster or Airflow is highly desirable Experience with Ecommerce and/or Subscription services is desirable The Interview Process Meet & Greet Call with a Talent Partner Call with the Hiring Manager and More ❯
What we’d like to see from you: • Extensive experience designing and deploying ML systems in production • Deep technical expertise in Python and modern ML tooling (e.g. MLflow, TFX, Airflow, Kubeflow, SageMaker, Vertex AI) • Experience with infrastructure-as-code and CI/CD practices for ML (e.g. Terraform, GitHub Actions, ArgoCD) • Proven ability to build reusable tooling, scalable services More ❯
Chantilly, Virginia, United States Hybrid / WFH Options
Edgesource
tickets, and resolving operational issues Desired Qualifications (NOT REQUIRED): Deep understanding of DevOps, CI/CD, and infrastructure as code Experience Angular/Postgres/Nifi/MongoDB/Airflow Experience with agile development methodologies Excellent written and verbal communication skills Strong problem-solving skills and attention to detail Cloud and Security+ certifications Working at Edgesource: As an ISO More ❯
What we’d like to see from you: Extensive experience designing and deploying ML systems in production Deep technical expertise in Python and modern ML tooling (e.g. MLflow, TFX, Airflow, Kubeflow, SageMaker, Vertex AI) Experience with infrastructure-as-code and CI/CD practices for ML (e.g. Terraform, GitHub Actions, ArgoCD) Proven ability to build reusable tooling, scalable services More ❯
Use AWS and Azure cloud services to provide the necessary infrastructure, resources, and interfaces for data loading and LLM workflows. Use Python and large-scale data workflow orchestration platforms (Airflow) to build software artifacts for ETL, integrating diverse data formats and storage technologies, and incorporate them into robust data workflows and dynamic systems. You May be a Good Fit More ❯
Karlsruhe, Baden-Württemberg, Germany Hybrid / WFH Options
Cinemo GmbH
Requirements: Several years of proven experience in MLOps, including end-to-end machine learning lifecycle management Strong programming skills in Python and C++ Familiarity with MLOps tools like MLFlow, Airflow, or Kubeflow. Experience designing and managing CI/CD pipelines for machine learning projects with experience in CI/CD tools (e.g., Jenkins) Proficiency in automation tools for streamlining … deployment workflows Automate repetitive and manual processes involved in machine learning operations to improve efficiency Implement and manage MLOps solutions on AWS, leveraging Terraform for infrastructure as code Technologies: Airflow, AWS, CI/CD, Cloud, Computer Vision, Embedded, Support, Jenkins, Kubeflow, Machine Learning, Mobile, Python, Terraform, C++, DevOps More: Cinemo is a global provider of highly innovative infotainment products More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
to our bespoke data pipeline and associated API services. Key Requirements 5+ years of Python experience with frameworks like Flask, FastAPI, and Django. Strong command of orchestration tools (e.g. Prefect, Airflow), Docker, and AWS infrastructure (CDK, Terraform). Solid understanding of API services, authentication methods (JWT, SSO), and clear, pragmatic communication skills. Maintain, upgrade, and improve existing systems and custom More ❯
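JWT is named in the requirements above. The sketch below is a stripped-down, stdlib-only illustration of how an HS256 token signature works (header, payload, and HMAC over both); it omits expiry and claims validation, and a real service should use a maintained library such as PyJWT rather than this:

```python
import base64
import hashlib
import hmac
import json

# Illustration of the JWT HS256 scheme: base64url(header).base64url(payload)
# signed with HMAC-SHA256. For teaching only; not a production library.

def _b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_token(payload: dict, secret: bytes) -> str:
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_token(token: str, secret: bytes) -> bool:
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(sig, expected)
```

The signature binds the payload to the secret, so a token altered in transit (or signed with the wrong key) fails verification.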
London, South East, England, United Kingdom Hybrid / WFH Options
US TECH SOLUTIONS LIMITED
robust data solutions Qualifications: 5+ years of experience with SQL and Python Strong background in data modeling and data visualization (Tableau, MicroStrategy, etc.) Hands-on experience with Azure Data Factory, Airflow, or similar ETL tools Ability to work independently while collaborating with cross-functional teams Experience with data analysis and metric definition is a plus Nice to Have: Prior experience in support More ❯
centred around a software product, and have solid Python coding skills, and expertise with cloud infrastructure (preferably AWS). Familiarity with Containers and MLE tools such as MLflow and Airflow is essential, with any knowledge of AI SaaS or GenAI APIs being a bonus. But what truly matters is your passion for learning and advancing technology. In return More ❯