NoSQL (e.g., MongoDB and Firestore). SQL querying (e.g., BigQuery, Snowflake), including the ability to work with complex data structures and very large data volumes. Orchestration services (e.g., Airflow, Luigi, Cloud Composer). Proactive, independent, responsible, and attentive to detail. Eager and able to learn, analyse, resolve problems, and improve the standard of BVGroup data infrastructure. Degree in …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
and familiarity with Git. Strong communicator, eager to learn, and naturally curious. Comfortable working across multiple business areas with varied responsibilities. Nice-to-Haves: Exposure to tools like Prefect, Airflow, or Dagster. Familiarity with Azure SQL, Snowflake, or dbt. Tech Stack/Tools: Python, SQL (on-prem + Azure SQL Data Warehouse), Git. Benefits: £35,000 - £40,000 starting …
from you. Key Responsibilities: - Design and build high-scale systems and services to support data infrastructure and production systems. - Develop and maintain data processing pipelines using technologies such as Airflow, PySpark, and Databricks. - Implement Dockerized high-performance microservices and manage their deployment. - Monitor and debug backend systems and data pipelines to identify and resolve bottlenecks and failures. - Work collaboratively …
with remote stakeholders Familiarity with AI development tools such as Cursor, GitHub Copilot, or Claude. BS degree in Computer Science or related engineering field Nice to Have Experience with Airflow, Celery, AWS and/or Azure, Postgres Experience with API platform development Experience with Go Ready to be part of AI transformation at Abnormal AI? Apply Now! Once you …
adaptable to fast-paced startup environment, comfortable with ambiguity and evolving responsibilities. Work Authorization: Must be eligible to work in US or UK. Preferred Experience: Data orchestration tools (e.g., Airflow, Prefect); experience deploying, monitoring, and maintaining ML models in production environments (MLOps); familiarity with big data technologies (e.g., Spark, Hadoop); background in time-series analysis and forecasting; experience with data …
You have experience with RAG (Retrieval-Augmented Generation) systems, vector databases, and embedding models for knowledge extraction. You can architect complex workflows. You have experience with workflow orchestration tools (Airflow, Prefect, Temporal) or have built custom pipeline systems for multi-step autonomous processes. You bridge science and engineering. You are comfortable with scientific computing libraries (NumPy, SciPy, pandas) and …
or in a similar role, with a focus on data management. Strong programming skills in languages such as Python, Shell Scripting, or similar. Experience with automation frameworks (e.g., Jenkins, Airflow, or similar) and version control tools (e.g., Git). Proficiency in working with databases (SQL, NoSQL) and cloud platforms (AWS, Azure, GCP) for data storage and processing. Solid understanding …
strategy. Expertise in causal inference methods and forecasting. Expertise in data querying languages (e.g. SQL) and scripting languages (e.g. Python, R). Experience with data architecture technologies such as Airflow, Databricks, and dbt. Preferred qualifications: Experience in technology, financial services and/or a high growth environment. Experience with Excel and Finance systems (e.g. Oracle). Equal opportunity Airwallex …
warehouse provider e.g. Databricks, GCP, Snowflake. The following would be nice to have: Experience in the following languages: Python. Experience with the following tools: GitHub, Lightdash, Elementary, CircleCI, Databricks, Airflow, Kubernetes, DuckDB, Spark. Data modelling techniques e.g. Kimball, OBT. Why else you'll love it here: Wondering what the salary for this role is? Just ask us! On a …
AI delivery • Proven track record deploying ML systems in production at scale (batch and/or real-time) • Strong technical background in Python and ML engineering tooling (e.g. MLflow, Airflow, SageMaker, Vertex AI, Databricks) • Understanding of infrastructure-as-code and CI/CD for ML systems (e.g. Terraform, GitHub Actions, ArgoCD) • Ability to lead delivery in agile environments, balancing …
on data storage, transfer, and processing, ensuring data is readily available and secure. Proficient in cloud data services (e.g., AWS, Azure), scripting (e.g., Python, Bash), and orchestration tools (e.g., Airflow, Jenkins). Demonstrated experience with data pipeline development, ETL/ELT processes, and cloud-native data management. Strong understanding of network segmentation, data governance, and cross-domain data transfer …
Previous experience working in e-commerce, retail, or the travel industry Experience designing and analysing large-scale A/B test experiments Mastery of workflow orchestration technologies such as Airflow, Dagster or Prefect Expert knowledge of technologies such as: Google Cloud Platform, particularly Vertex AI Docker and Kubernetes Infrastructure as Code Experience establishing data science best practices across an …
.NET framework backend services and React frontends. You'll utilise tools such as Terraform for infrastructure-as-code (IaC), AWS (Lambda, EC2, EKS, Step Functions, VPC, etc.) for ETL, Airflow pipelines, Snowflake, and ensure architectural alignment with AI/ML initiatives and data-driven services. You will serve as the go-to engineer for: End-to-end data migration …
AWS) is a plus, but not the primary focus Bonus Points For Experience with model optimization techniques (e.g., quantization, pruning) Familiarity with MLOps tools like MLflow, Weights & Biases, or Airflow (for orchestration) Contributions to open-source ML projects or published research in computer vision Experience with C++ or other compiled languages for performance-critical components Why Join Us? At …
Christchurch, Dorset, United Kingdom Hybrid / WFH Options
Wearebasis
backend APIs will be valuable to ensure solutions are fit for purpose. Tooling Familiarity: Experience with relevant QA and data-related tools, which may include (but is not limited to) Airflow, Stitch, dbt, Great Expectations, Datafold, and AWS. CI/CD Integration: Familiarity with CI/CD pipelines and the ability to integrate testing processes within automated deployment workflows. Communication …
intelligence tool, providing reliable data access to users throughout Trustpilot Design, build, maintain, and rigorously monitor robust data pipelines and transformative models using our modern data stack, including GCP, Airflow, dbt, and potentially emerging technologies like real-time streaming platforms Develop and manage reverse ETL processes to seamlessly integrate data with our commercial systems, ensuring operational efficiency Maintain and …
gaming, food, health care). Knowledge of regulatory reporting and treasury operations in retail banking. Exposure to Python, Go or similar languages. Experience working with orchestration frameworks such as Airflow/Luigi. Have previously used dbt, Dataform or similar tooling. Used to Agile ways of working (Kanban, Scrum). The Interview Process: Our interview process involves 3 main stages …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
Monzo
gaming, food, health care). Knowledge of regulatory reporting and treasury operations in retail banking. Exposure to Python, Go or similar languages. Experience working with orchestration frameworks such as Airflow/Luigi. Have previously used dbt, Dataform or similar tooling. Used to Agile ways of working (Kanban, Scrum). The Interview Process: Our interview process involves 3 main stages …
an understanding of their migration challenges Familiarity with version control (Git) and modern deployment workflows Proficiency in troubleshooting data pipeline failures and performance bottlenecks Experience with data pipeline orchestration (Airflow or similar is a plus) Nice-to-have experience: Exposure to other parts of the modern data stack (e.g., Fivetran, dbt Metrics Layer, Tableau) Experience in CI/CD …