… Engineering etc. Software development experience in Python or Scala; an understanding of Big Data technologies such as Spark, messaging services like Kafka or RabbitMQ, and workflow management tools like Airflow; SQL & NoSQL expertise, ideally including Postgres, Redis, MongoDB; experience with AWS, and with tools like Docker & Kubernetes. As well as this you will be someone willing to take …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
… as Azure Data Factory (ADF) and Python. Ensuring the quality of raw datasets to empower Data Analysts in creating robust data models. Deploying and managing data tools on Kubernetes (Airflow, Superset, RStudio Connect). Supporting Data Analytics through the management of dbt, DevOps, and deployment rules. You will have the opportunity to work end-to-end, making meaningful contributions …
… systems and APIs (RESTful/GraphQL), with solid experience in microservices and databases (SQL/NoSQL). You know your way around big data tools (Spark, Dask) and orchestration (Airflow, dbt). You understand NLP and have experience working with Large Language Models. You're cloud-savvy (AWS, GCP, or Azure) and comfortable with containerization (Docker, Kubernetes). You …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
… and familiarity with Git; a strong communicator, eager to learn, and naturally curious; comfortable working across multiple business areas with varied responsibilities. Nice-to-haves: exposure to tools like Prefect, Airflow, or Dagster; familiarity with Azure SQL, Snowflake, or dbt. Tech stack/tools: Python, SQL (on-prem + Azure SQL Data Warehouse), Git. Benefits: £35,000 - £40,000 starting …
… from you. Key Responsibilities: - Design and build high-scale systems and services to support data infrastructure and production systems. - Develop and maintain data processing pipelines using technologies such as Airflow, PySpark and Databricks. - Implement Dockerized high-performance microservices and manage their deployment. - Monitor and debug backend systems and data pipelines to identify and resolve bottlenecks and failures. - Work collaboratively …
… adaptable to a fast-paced startup environment, comfortable with ambiguity and evolving responsibilities. Work Authorization: must be eligible to work in the US or UK. Preferred Experience: data orchestration tools (e.g. Airflow, Prefect); experience deploying, monitoring, and maintaining ML models in production environments (MLOps); familiarity with big data technologies (e.g. Spark, Hadoop); background in time-series analysis and forecasting; experience with data …
You have experience with RAG (Retrieval-Augmented Generation) systems, vector databases, and embedding models for knowledge extraction. You can architect complex workflows. You have experience with workflow orchestration tools (Airflow, Prefect, Temporal) or have built custom pipeline systems for multi-step autonomous processes. You bridge science and engineering. You are comfortable with scientific computing libraries (NumPy, SciPy, pandas) and …
… strategy. Expertise in causal inference methods and forecasting. Expertise in data querying languages (e.g. SQL) and scripting languages (e.g. Python, R). Experience with data architecture technologies such as Airflow, Databricks, and dbt. Preferred qualifications: experience in technology, financial services and/or a high-growth environment; experience with Excel and finance systems (e.g. Oracle). Equal opportunity: Airwallex …
… warehouse provider, e.g. Databricks, GCP, Snowflake. The following would be nice to have: experience in the following languages: Python; experience with the following tools: GitHub, Lightdash, Elementary, CircleCI, Databricks, Airflow, Kubernetes, DuckDB, Spark; data modelling techniques, e.g. Kimball, OBT. Why else you'll love it here: wondering what the salary for this role is? Just ask us! On a …
… AI delivery • Proven track record deploying ML systems in production at scale (batch and/or real-time) • Strong technical background in Python and ML engineering tooling (e.g. MLflow, Airflow, SageMaker, Vertex AI, Databricks) • Understanding of infrastructure-as-code and CI/CD for ML systems (e.g. Terraform, GitHub Actions, ArgoCD) • Ability to lead delivery in agile environments, balancing …
Previous experience working in e-commerce, retail, or the travel industry; experience designing and analysing large-scale A/B test experiments; mastery of workflow orchestration technologies such as Airflow, Dagster or Prefect; expert knowledge of technologies such as: Google Cloud Platform, particularly Vertex AI; Docker and Kubernetes; Infrastructure as Code. Experience establishing data science best practices across an …
… .NET framework backend services and React frontends. You'll utilise tools such as Terraform for infrastructure-as-code (IaC), AWS (Lambda, EC2, EKS, Step Functions, VPC etc.) for ETL, Airflow pipelines, Snowflake, and ensure architectural alignment with AI/ML initiatives and data-driven services. You will serve as the go-to engineer for: end-to-end data migration …
… intelligence tool, providing reliable data access to users throughout Trustpilot. Design, build, maintain, and rigorously monitor robust data pipelines and transformative models using our modern data stack, including GCP, Airflow, dbt, and potentially emerging technologies like real-time streaming platforms. Develop and manage reverse ETL processes to seamlessly integrate data with our commercial systems, ensuring operational efficiency. Maintain and …
… gaming, food, health care). Knowledge of regulatory reporting and treasury operations in retail banking; exposure to Python, Go or similar languages; experience working with orchestration frameworks such as Airflow/Luigi; have previously used dbt, Dataform or similar tooling; used to Agile ways of working (Kanban, Scrum). The Interview Process: our interview process involves 3 main stages …
… an understanding of their migration challenges; familiarity with version control (Git) and modern deployment workflows; proficiency in troubleshooting data pipeline failures and performance bottlenecks; experience with data pipeline orchestration (Airflow or similar is a plus). Nice-to-have experience: exposure to other parts of the modern data stack (e.g., Fivetran, dbt Metrics Layer, Tableau); experience in CI/CD …
Understands modern software delivery methodologies and project management tools and uses them to drive successful outcomes. Technical requirements: Cloud Data Warehouse (BigQuery, Snowflake, Redshift etc.), advanced SQL, dbt, Airflow (or similar tool), ELT, Looker (or similar tool). Perks of Working at Viator: competitive compensation packages (routinely benchmarked against the latest industry data), including base salary and annual bonuses …
Senior Data Engineer - 100% Remote - B2B Contract. Full-time position with flexible working hours (overlap with US required). We're looking for a Senior Data Engineer for a company that facilitates freelancing and remote work. Their platform provides a marketplace …
… with new methodologies to enhance the user experience. Key skills: Senior Data Scientist experience; commercial experience in Generative AI and recommender systems; strong Python and SQL experience; Spark/Apache Airflow; LLM experience; MLOps experience; AWS. Additional information: this role offers a strong salary of up to £95,000 (depending on experience/skill) with hybrid working …