with scripting languages such as Python or Scala. Demonstrated ability to create impactful dashboards using Power BI and Tableau. Deep understanding of data pipeline orchestration tools (e.g., Airflow, dbt, Azure Data Factory). Experience working in agile, cross-functional teams, collaborating closely with data analysts, engineers, and business stakeholders. Strong problem-solving skills, attention to detail, and More ❯
AND EXPERIENCE: The ideal Head of Data Platform will have: Extensive experience with Google Cloud Platform (GCP), particularly BigQuery Proficiency with a modern data tech stack, including SQL, Python, Airflow, dbt, Dataform, Terraform Experience in a mid-large sized company within a regulated industry, with a strong understanding of data governance. A strategic mindset, leadership skills, and a hands More ❯
New Malden, Surrey, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
hands-on background in data engineering, with 5+ years working on modern data platforms Experience leading cloud data migrations - GCP and BigQuery strongly preferred Proficiency in SQL, Python, dbt, Airflow, Terraform and other modern tooling Excellent understanding of data architecture, governance, and DevOps best practices Proven leadership or team management experience within a regulated or mid-to-large tech More ❯
. NoSQL (e.g., MongoDB and Firestore). SQL querying (e.g., BigQuery, Snowflake), including the ability to work with complex data structures and very large data volumes. Orchestration services (e.g., Airflow, Luigi, Cloud Compose). Proactive, independent, responsible and attentive to detail. Eager and able to learn, analyse, resolve problems, and improve the standard of BVGroup data infrastructure. Degree in More ❯
/snowflake schemas) Deep expertise in dbt - including documentation, testing, and CI/CD Proficiency with Python or Bash for automation and orchestration Familiarity with pipeline orchestration tools (e.g., Airflow) Knowledge of data governance, lineage, and quality assurance practices Experience working in cloud-native environments (preferably AWS) Comfortable using Git-based workflows for version control If you'd like More ❯
Brighton, East Sussex, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
best practices in testing, data governance, and observability. Lead roadmap planning and explore emerging technologies (e.g. GenAI). Ensure operational stability and support incident resolution. Tech Stack Python, SQL, Airflow, AWS, Fivetran, Snowflake, Looker, Docker (You don't need to tick every box - if you've worked with comparable tools, that's great too.) What We're Looking For More ❯
systems and APIs (RESTful/GraphQL), with solid experience in microservices and databases (SQL/NoSQL). You know your way around big data tools (Spark, Dask) and orchestration (Airflow, dbt). You understand NLP and have experience working with Large Language Models. You're cloud-savvy (AWS, GCP, or Azure) and comfortable with containerization (Docker, Kubernetes). You More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Method Resourcing
contexts. Bonus Experience (Nice to Have) Exposure to large language models (LLMs) or foundational model adaptation. Previous work in cybersecurity, anomaly detection, or behavioural analytics. Familiarity with orchestration frameworks (Airflow or similar). Experience with scalable ML systems, pipelines, or real-time data processing. Advanced degree or equivalent experience in ML/AI research or applied science. Cloud platform More ❯
Proficiency in SQL and Python Experience with AWS cloud and analytics platforms such as Redshift, Dataiku, and Alation Familiarity with open-source technologies like Presto and Airflow A passion for data and coding, with a focus on user experience A learning mindset, can-do attitude, and effective communication skills The ability to work on innovative projects More ❯
drug discovery, combining quantum-inspired physics with generative models Real-World Impact: Every feature shipped helps scientists prioritize molecules and design better candidates, faster Modern Stack & Challenges: Python, FastAPI, Airflow, Snowflake, Kubernetes, ML workflows, scientific infra, data engineering at scale High Ownership, High Impact: Engineers contribute to architecture, tooling, and scientific decision-making Interdisciplinary Team: Collaborate with chemists, physicists More ❯
optimisation, or demand forecasting problems. Experience with data visualisation tools (e.g., Power BI, Tableau). Familiarity with version control (Git), cloud platforms (GCP, AWS, or Azure), or orchestration tools (Airflow). Experience working in or with aviation, travel, retail, or transportation industries. Experience working in Agile product teams or similar collaborative environments. These are full-time positions and More ❯
from you. Key Responsibilities: - Design and build high-scale systems and services to support data infrastructure and production systems. - Develop and maintain data processing pipelines using technologies such as Airflow, PySpark and Databricks. - Implement dockerized high-performance microservices and manage their deployment. - Monitor and debug backend systems and data pipelines to identify and resolve bottlenecks and failures. - Work collaboratively More ❯
adaptable to fast-paced startup environment, comfortable with ambiguity and evolving responsibilities Work Authorization: Must be eligible to work in US or UK Preferred Experience: Data orchestration tools (e.g., Airflow, Prefect). Experience deploying, monitoring, and maintaining ML models in production environments (MLOps). Familiarity with big data technologies (e.g., Spark, Hadoop). Background in time-series analysis and forecasting. Experience with data More ❯
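The time-series forecasting background this listing asks for can be sketched with a deliberately naive baseline: a trailing moving-average forecast in standard-library Python. The function name and sample series are illustrative assumptions, not part of any employer's stack.

```python
from collections import deque

def moving_average_forecast(series, window=3, steps=2):
    """Forecast future points as the mean of the trailing window,
    rolling each prediction back into the window. A naive baseline
    often used to sanity-check more sophisticated models."""
    history = deque(series[-window:], maxlen=window)
    forecasts = []
    for _ in range(steps):
        prediction = sum(history) / len(history)
        forecasts.append(prediction)
        history.append(prediction)  # roll the forecast forward
    return forecasts

print(moving_average_forecast([10, 12, 11, 13, 12], window=3, steps=2))
# → [12.0, 12.333333333333334]
```

In practice, roles like this would reach for statsmodels, Prophet, or learned models; a baseline this simple is mainly useful as a benchmark to beat.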
You have experience with RAG (Retrieval-Augmented Generation) systems, vector databases, and embedding models for knowledge extraction. You can architect complex workflows. You have experience with workflow orchestration tools (Airflow, Prefect, Temporal) or have built custom pipeline systems for multi-step autonomous processes. You bridge science and engineering. You are comfortable with scientific computing libraries (NumPy, SciPy, pandas) and More ❯
strategy. Expertise in causal inference methods and forecasting. Expertise in data querying languages (e.g. SQL) and scripting languages (e.g. Python, R). Experience with data architecture technologies such as Airflow, Databricks, and dbt. Preferred qualifications: Experience in technology, financial services and/or a high growth environment. Experience with Excel and Finance systems (e.g. Oracle). Equal opportunity Airwallex More ❯
warehouse provider e.g. Databricks, GCP, Snowflake The following would be nice to have Experience in the following Languages: Python Experience with the following tools: Github Lightdash Elementary CircleCI Databricks Airflow Kubernetes DuckDB Spark Data Modelling Techniques e.g. Kimball, OBT Why else you'll love it here Wondering what the salary for this role is? Just ask us! On a More ❯
sources, whether that's batch files or real-time streams. You'll have set up and worked with ETL and ELT tools like Dagster, AWS Glue, Azure Data Factory, Airflow or dbt, and you can decide what tools are right for the job. You'll have an understanding of how Node.js and TypeScript fit into a modern development environment More ❯
Understands modern software delivery methodologies and project management tools and uses them to drive successful outcomes Technical requirements Cloud Data Warehouse (BigQuery, Snowflake, Redshift, etc.) Advanced SQL dbt Airflow (or similar tool) ELT Looker (or similar tool) Perks of Working at Viator Competitive compensation packages (routinely benchmarked against the latest industry data), including base salary and annual bonuses More ❯
Required Skills and Qualifications: Proven experience as an Apache Airflow SME or Lead Developer in a production-grade environment. Strong understanding of Airflow internals, including scheduler, executor types (Celery, Kubernetes), and plugin development. Experience with workload orchestration and autoscaling using KEDA (Kubernetes-based Event Driven Autoscaler), and familiarity with Celery for distributed task execution and background job More ❯
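The Airflow scheduler this listing references is, at heart, a dependency-graph resolver: it dispatches a task to an executor only once all of its upstream tasks have succeeded. That ordering can be sketched with Python's standard-library graphlib; the task names below are hypothetical, and a real pipeline would declare them via Airflow's own DAG API rather than a plain dict.

```python
from graphlib import TopologicalSorter

# Hypothetical task graph: each key depends on the tasks in its set.
# Airflow's scheduler resolves the same kind of graph before handing
# runnable tasks to an executor (Celery, Kubernetes, ...).
dag = {
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"transform", "quality_check"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)
```

Here `static_order()` yields every task after all of its dependencies, so `extract` always precedes `transform`, and `load` runs last.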
unsupervised, and reinforcement learning methods. Experience with GCP services such as Vertex AI, BigQuery ML, Dataflow, AI Platform Pipelines, and Dataproc. Solid knowledge of distributed systems, data streaming (e.g., Apache Beam, Kafka), and large-scale data processing. ML Ops: Hands-on experience with continuous integration/deployment (CI/CD) for ML, model versioning, and monitoring. Business Acumen: Ability to understand marketing and advertising concepts like customer lifetime value (CLV), attribution modeling, real-time bidding (RTB), and audience targeting. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team Comfortable in a fast-paced environment Have excellent written and verbal English skills Last but not least More ❯
City of London, London, United Kingdom Hybrid / WFH Options
QiH Group
unsupervised, and reinforcement learning methods. Experience with GCP services such as Vertex AI, BigQuery ML, Dataflow, AI Platform Pipelines, and Dataproc. Solid knowledge of distributed systems, data streaming (e.g., Apache Beam, Kafka), and large-scale data processing. ML Ops: Hands-on experience with continuous integration/deployment (CI/CD) for ML, model versioning, and monitoring. Business Acumen: Ability to understand marketing and advertising concepts like customer lifetime value (CLV), attribution modeling, real-time bidding (RTB), and audience targeting. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team Comfortable in a fast-paced environment Have excellent written and verbal English skills Last but not least More ❯
techniques and tools Strong hands-on experience with cloud data platforms, specifically Google Cloud Platform - BigQuery Experience with leading ETL/ELT tools and data pipeline orchestration (e.g., Dataflow, Apache Airflow, Talend, Informatica) Advanced SQL skills and deep knowledge of various database technologies (relational, columnar, NoSQL) Practical experience in establishing data governance frameworks, data quality initiatives, and Master More ❯