designing scalable ML infrastructure on cloud platforms (AWS SageMaker, GCP AI Platform, Azure ML, or equivalent). Solid understanding of data-engineering concepts: SQL/NoSQL, data pipelines (Airflow, Prefect, or similar), and batch/streaming frameworks (Spark, Kafka). Leadership & Communication: Proven ability to lead cross-functional teams in ambiguous startup settings. Exceptional written and verbal communication skills—able …
complex data pipelines at scale Strong knowledge of distributed computing frameworks (Spark, Hadoop ecosystem) Experience with cloud-based data platforms (AWS, Azure, GCP) Proficiency in data orchestration tools (Airflow, Prefect, Dagster, or similar) Solid programming skills in Python, Scala, or Java Experience integrating ML workflows into production data systems Strong understanding of data modeling, ETL processes, and database design Demonstrated …
with monitoring and logging tools (e.g., Prometheus, Loki, Grafana) in application and data-intensive environments. Proficiency in Configuration Management tools (Chef, Puppet, Ansible) and data orchestration tools (e.g., Airflow, Prefect). Strong background in containerization using Docker and orchestration with Kubernetes. In-depth knowledge of Linux, SQL, cloud security, scripting for automation (Python, Bash), load balancing technologies, and CDNs. Agile …
e.g., AWS, Google Cloud Platform, Microsoft Azure) for data storage, processing, and deployment of data solutions. Data Pipeline Orchestration: Experience with workflow orchestration tools such as Apache Airflow or Prefect to manage and schedule data pipelines. Data Modelling: Strong understanding of data modelling concepts (e.g., star schema, snowflake schema) and best practices for designing efficient and scalable data architectures. Data …
Join to apply for the Senior Data Analytics Developer role at Coveo. Are you the expert who transforms data into strategic insights to drive our success? As a …
London, England, United Kingdom Hybrid / WFH Options
Disruptive Industries
working with large datasets, optimising data structures and queries for performance, particularly in relation to geo-spatial and temporal data (e.g. BigQuery). Familiarity with data orchestration solutions (e.g. Prefect, Airflow, AWS Step Functions). Desirable Technical Skills (Advantageous but not essential): Containerisation & Orchestration: Experience with Kubernetes and Helm for container management and deployment. Ideally experience with GitOps tooling such as …
in e-commerce, retail, or the travel industry Experience designing and analysing large-scale A/B test experiments Mastery of workflow orchestration technologies such as Airflow, Dagster or Prefect Expert knowledge of technologies such as: Google Cloud Platform, particularly Vertex AI Docker and Kubernetes Infrastructure as Code Experience establishing data science best practices across an organisation Perks of joining …
various cloud providers (AWS and GCP). Disaster recovery management leveraging cloud-specific capabilities. Cloud storage concepts (block storage/blob storage). Job scheduling tools such as Airflow, Prefect Scheduler and Slurm (or other HPC scheduler). Designing and maintaining CI/CD pipelines to ensure fast delivery and integration of the platform services. Contact: If this sounds like you, or …
commerce, retail, or the travel industry. Conducted and analysed large-scale A/B experiments Experience mentoring team members Experience with workflow orchestration technologies such as Airflow, Dagster or Prefect Experience with technologies such as: Google Cloud Platform, particularly Vertex AI Docker and Kubernetes Perks of joining us: Company pension contributions at 5% Individualised training budget for you to learn …
London, England, United Kingdom Hybrid / WFH Options
Our Future Health UK
Requirements: To succeed in this role you will already have some of the following skills: Hands-on experience of working with open-source data orchestration systems such as Dagster, Prefect, or Airflow Solid understanding of distributed compute engines such as Spark/Databricks Confidence using Docker, Kubernetes, and Helm in cloud environments Experience building software for public cloud environments (Azure …
game engine technologies Preferred Qualifications * Designing and implementing real-time pipelines. * Designing and implementing data pipelines for CV/ML systems. * Experience with workflow management engines (e.g. Airflow, Luigi, Prefect, Dagster, Google Cloud Composer, AWS Step Functions, Azure Data Factory, UC4, Control-M). * Experience with data quality and validation. * Experience querying massive datasets using Spark, Presto, Hive, Impala, etc. …
London, England, United Kingdom Hybrid / WFH Options
Apollo Solutions
and Apache Spark Experience working with streaming technologies such as Apache Kafka, Apache Flink, or Google Cloud Dataflow Hands-on experience with modern data orchestration tools like Dagster or Prefect Knowledge of data governance and cataloging tools like Great Expectations, Collibra, or Alation Experience in pricing, scoping, and proposal development for data engineering projects Ready to take your career to …
London, England, United Kingdom Hybrid / WFH Options
Tasman
specific requirements and available budget. Some of the products and platforms that you are likely to come across at Tasman are: AWS, GCP and Azure cloud environments; Airflow and Prefect; Snowflake, BigQuery, Athena, and Redshift; Airbyte, Stitch, Fivetran, and Meltano; dbt (both Cloud and Core); Looker, Metabase, Tableau and Holistics; Docker and Kubernetes; Snowplow, Segment, Rudderstack, and mParticle; Metaplane and …
and contribute to the continuous improvement of our data ecosystem. Lead the design and development of a scalable, modular, and maintainable Data Platform, using our key technologies (NATS, Snowflake, Prefect) running on Kubernetes in our AWS cloud environments. Build our Data Platform and solve complex data problems to deliver insights helping to build our trading data platform. Foster a culture … to thrive in this role which demands technical and data-driven results. Understanding of trading platforms/financial markets. Experience with backend technologies (e.g. Go, C#). Experience with Prefect to allow you to design, test and run your data pipelines with Python. Knowledge of PostgreSQL. Working with third-party data providers and ingesting real-time trading data feeds. Vitality …
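For illustration only (not taken from the listing above): a minimal sketch of the kind of Python pipeline the Prefect requirement refers to, assuming Prefect 2.x; the flow name, task names, and sample data here are hypothetical.

```python
# Minimal, hypothetical Prefect 2.x flow: extract -> transform -> load.
from prefect import flow, task


@task(retries=2)
def extract() -> list[dict]:
    # Placeholder for pulling rows from a data feed or API.
    return [{"symbol": "ABC", "price": 101.5}, {"symbol": "XYZ", "price": 99.2}]


@task
def transform(rows: list[dict]) -> list[dict]:
    # Placeholder transformation: tag each row with its source.
    return [{**row, "source": "demo"} for row in rows]


@task
def load(rows: list[dict]) -> None:
    # Placeholder load step; a real pipeline would write to a warehouse.
    print(f"loaded {len(rows)} rows")


@flow(name="demo-trading-etl")
def demo_etl() -> None:
    load(transform(extract()))


if __name__ == "__main__":
    demo_etl()  # runs locally; the same flow can be deployed to run on Kubernetes
```

Decorating plain Python functions with `@task` and `@flow` is what lets Prefect add scheduling, retries, and observability around otherwise ordinary pipeline code.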
writing Python Experience as Senior Engineer on a data warehouse, ideally Snowflake Experience building and/or maintaining a CI/CD pipeline Experience using modern orchestration tooling, e.g. Prefect, Luigi, Airflow Experience developing infrastructure in Terraform or a similar IaC tool Experience using Docker A positive and proactive attitude to problem-solving A team player who is willing to …
and Databricks: Deep understanding of Apache Spark principles and experience with Databricks notebooks, clusters, and workspace management. Orchestration tools expertise: Strong experience with workflow management tools like Apache Airflow, Prefect, or similar. This includes designing and implementing complex DAGs (Directed Acyclic Graphs) for pipeline orchestration. Cloud platform experience: Hands-on experience with AWS or Azure, including services related to data …
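For illustration only (again, not part of any listing above): a minimal sketch of what a DAG for pipeline orchestration looks like in Apache Airflow 2.x; the DAG id, task names, and schedule are hypothetical.

```python
# Minimal, hypothetical Airflow 2.x DAG: extract -> transform -> load.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    print("pulling raw data")  # placeholder


def transform() -> None:
    print("cleaning and joining")  # placeholder


def load() -> None:
    print("writing to the warehouse")  # placeholder


with DAG(
    dag_id="demo_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # 'schedule_interval' on Airflow versions before 2.4
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # The >> operator defines the DAG edges: extract runs before transform, which runs before load.
    extract_task >> transform_task >> load_task
```

Real-world DAGs differ mainly in scale: many more tasks, sensors, and branching, but the same pattern of declaring tasks and wiring their dependencies.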
Slurm) or orchestration technologies (e.g. Kubernetes) Excellent written and verbal communication skills. Ability to work well in a fast-paced environment. Nice to have: Experience with other orchestration technologies (Prefect, Airflow) Experience with advanced software engineering concepts. Experience with modern software development tooling, such as GitLab, Artifactory or Docker. Experience with infrastructure automation and configuration management, such as Ansible and …
Senior Software Engineer, United Kingdom (Remote). About the company: The mining industry has steadily become worse at finding new ore deposits, requiring >10X more capital to make discoveries compared to 30 years ago. The easy …
workflows 🧠 What We’re Looking For: Strong Python development skills, especially in a data engineering or quant dev context Experience with workflow orchestrators like Dagster (big plus), Airflow , or Prefect Familiarity with tools like Postgres , Kafka , Snowflake , and cloud-based storage Ability to thrive in a high-performance, collaborative trading environment If you're passionate about credit markets and want …
London, England, United Kingdom Hybrid / WFH Options
Scope3
Platform and/or Amazon Web Services Expertise in Python, SQL BigQuery or equivalent data warehouse experience (Redshift, Snowflake, etc.) Airflow or equivalent in-house data platform experience (Prefect, Dagster, etc.) Experience with ClickHouse Demonstrated experience perpetuating an inclusive and collaborative working environment Preference may be given to candidates with the following: Experience with Golang, TypeScript Experience working in …
higher in STEM or quantitative discipline preferred Preferred Skills: Full-spectrum experience with modern data platforms (Snowflake, dbt, Sigma, Alation). Proficient with ETL platforms such as Fivetran and Prefect Hands-on experience with data lineage and observability platforms Knowledge of Agile methodologies and project management tools (JIRA, Trello). The duties and responsibilities described here are not exhaustive and …
analytics and engineering teams. Builder at Heart: Comfortable rolling up your sleeves to code, model, and optimise - not just direct others. Bonus Points: Experience with orchestration frameworks (Airflow, Dagster, Prefect), Python, AWS ecosystem, and exposure to insurance, fintech, or regulated industries. Strategic Thinker: You thrive on aligning technical roadmaps with business goals, and you naturally think about building for a …
on Business, Marketing, Analytics, or Computer Science Experience with AWS Experience with JavaScript Basic understanding of digital interaction technologies such as live chat, virtual agents/chatbots Experience with Prefect Experience building User Interfaces What You Should Know About This Team Our Proof of Concept team is known for its collaborative spirit, a strong desire to learn, and, most importantly …