…experience in media/digital environments
- Strong Python for automation workflows and data pipelines
- Proficiency in SQL, data warehousing, and cloud platforms (GCP/Azure preferred)
- Experience with dbt, Airflow, or similar data modeling tools
- Understanding of digital marketing data (GA4, ad servers, programmatic platforms)
- Team management/mentoring experience with a collaborative leadership style
- Excellent communication skills for diverse …
- Build CI/CD pipelines for ML models, agents, and GenAI applications.
- Deploy LLMs and orchestrate multi-agent systems (LangChain, LangGraph, or custom).
- Create and maintain data pipelines (Airflow, dbt, Step Functions).
- Implement monitoring, observability, and automated retraining (CloudWatch, Prometheus, Grafana, MLflow).
- Embed strong security, compliance, and guardrail practices across all ML workflows.
- Mentor junior engineers …
…warehousing and transformation.
- Strong SQL skills and understanding of modern data architecture principles.
- Hands-on experience with Tableau for enterprise-grade dashboard development.
- Familiarity with orchestration tools (e.g., Prefect, Airflow), data quality frameworks, and metadata tools.
- Proficiency in Git, CI/CD, and scripting with Python.
- Excellent communication skills and ability to work collaboratively across technical and business teams.
…environments, including ERPs and CRMs. You'll collaborate closely with client stakeholders to translate ambiguous requirements into clean, maintainable solutions that drive real impact. Familiarity with tools such as Airflow, dbt, Databricks, dashboarding frameworks, and TypeScript is a strong plus as you help deliver end-to-end production-ready systems.

Interview Process
1. Teams conversation (introductory chat)
2. Technical take-home …
…TypeScript
- Backend: Python (FastAPI, Flask, or Django), ideally with geospatial data processing
- Cloud: AWS (Lambda, ECS, RDS, S3, API Gateway, etc.)
- DevOps: Docker, CI/CD pipelines, Git
- Orchestration: Airflow or similar tools

Eligo Recruitment is acting as an Employment Business in relation to this vacancy. Eligo is proud to be an equal opportunity employer dedicated to fostering diversity …
…contexts.

Bonus Experience (Nice to Have)
- Exposure to large language models (LLMs) or foundational model adaptation.
- Previous work in cybersecurity, anomaly detection, or behavioural analytics.
- Familiarity with orchestration frameworks (Airflow or similar).
- Experience with scalable ML systems, pipelines, or real-time data processing.
- Advanced degree or equivalent experience in ML/AI research or applied science.
- Cloud platform …
London, South East, England, United Kingdom Hybrid/Remote Options
Method Resourcing
…model performance in production

Automation Architecture
- Deep knowledge of automation tools including GitHub Actions, Terraform, and Ansible
- Experience with business process automation (RPA) tools like Appian
- Workflow orchestration experience (Airflow, Prefect)
- Ability to build custom automation frameworks using Python or similar languages

Full-Stack Development
- Solid software engineering background with proficiency in Python, JavaScript/TypeScript, Java, or Go …
…and PySpark
- Exposure to generative AI platforms or a passion for building AI-powered solutions
- Ability to lead client delivery in dynamic, fast-paced environments
- Familiarity with tools like Airflow, Databricks, or dbt is a plus

What's on offer:
- Salary up to £75,000
- Bonus and equity options
- Hybrid working: 3 days in a vibrant central London office …
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
…and cost optimisation.
- Strong understanding of cloud-native architectures (preferably AWS) and modern data engineering approaches.
- Experience designing ingestion, transformation, and orchestration workflows using tools such as dbt, Matillion, Airflow, or similar.
- Expertise in data modelling, data governance, and enterprise data strategy.
- Excellent client-facing and communication skills, with the ability to influence senior stakeholders.

Why Join?
Work on …
- Background in agentic systems, reasoning pipelines, or semantic/ontology-based architectures.
- Familiarity with knowledge graphs, schema reconciliation, or semantic modeling.
- Experience across modern data infrastructure (e.g., dbt, BigQuery, Airflow) and cloud environments (AWS/GCP).
- Prior startup or early-stage experience; strong preference for candidates who have built AI or data infrastructure at scale.
- A builder mindset …
…to solve problems through technology.
- Automation Strategy: Architect automation solutions leveraging GitHub Actions, Terraform, Ansible, and RPA tools (e.g., Appian). Build custom automation frameworks and orchestrate workflows using Airflow or Prefect.
- AI/ML Integration: Design and implement AI/ML solutions using frameworks such as TensorFlow, PyTorch, and Scikit-learn.
An experienced problem solver with a curious …
London, South East, England, United Kingdom Hybrid/Remote Options
Searchability
…TO HAVE:
- Experience integrating AI models into production systems using GCP, AWS, or Azure.
- Familiarity with vector databases, embedding models, or retrieval-augmented generation (RAG).
- Knowledge of Docker, Airflow, or MLOps pipelines.
- Strong understanding of AI ethics, data privacy, and responsible model deployment.

TO BE CONSIDERED...
Please either apply online or email me directly at . By applying for …
…of building performant, maintainable, and testable systems
- Solid background in microservices architecture
- Proficiency with Postgres & MongoDB (relational + non-relational)
- Experience with event-driven architectures and asynchronous workflows (Kafka, Airflow, etc.)
- Solid coding practices (clean, testable, automated)
- The mindset of a builder: thrives in fast-paced startup environments, takes ownership, solves complex challenges
Bonus points if you've worked with …
London, South East, England, United Kingdom Hybrid/Remote Options
Oscar Technology
…influence business performance.
- Ability to operate effectively in a fast-paced, scaling organisation.
- Excellent communication skills and a collaborative approach.

Desirable Skills
- Experience with analytics engineering tools (e.g., dbt, Airflow).
- Familiarity with experimentation frameworks and A/B testing platforms.
- Exposure to cloud-based data environments (e.g., Google Cloud Platform).
- Experience within digital consumer businesses or online …
…code.
- Implement TMS (Tealium IQ, Adobe Analytics, GTM, and Adobe Dynamic Tag Manager) changes.
- Integrate data sources via web and REST APIs.
- Data piping and modelling using SQL, dbt, Airflow, ETL, data warehousing, Redshift, and Python.
- Transfer knowledge of the business processes and requirements to the development teams.
- Collaborate with Product, Marketing, and Development teams to collect business requirements …
South East London, London, United Kingdom Hybrid/Remote Options
Stepstone UK
…to automate complex business or security processes and build intelligent solutions
- Solid understanding of software security principles, secure coding practices, and security automation
- Experience with data orchestration frameworks (e.g., Airflow, Dagster, Prefect) to manage and scale automation pipelines
- Familiarity with infrastructure-as-code (Terraform), CI/CD systems (Buildkite or similar), and container orchestration platforms (Kubernetes, Docker)
- Strong problem …
London, South East, England, United Kingdom Hybrid/Remote Options
Xact Placements Limited
…research, staging, and production environments.
- Design and implement model registries, versioning systems, and experiment tracking to ensure full reproducibility of all model releases.
- Deploy ML workflows using tools like Airflow or similar, managing dependencies from data ingestion through model deployment and serving.
- Instrument comprehensive monitoring for model performance, data drift, prediction quality, and system health.
- Manage infrastructure as code …