analytical data models using dbt (data build tool). Develop and optimize complex SQL queries for data extraction, transformation, and loading in Snowflake. Orchestrate and schedule data workflows using Apache Airflow, including developing custom Python operators and DAGs. Write efficient and maintainable Python scripts for data processing, automation, and integration with various data sources and APIs. Ensure data … and cost-effectiveness. Qualifications: Strong proficiency in SQL, particularly with Snowflake's features and functionalities. Extensive experience with dbt for data modeling, transformations, testing, and documentation. Solid experience with Apache Airflow for workflow orchestration and scheduling. Proficiency in Python for data manipulation, scripting, and automation. Experience with cloud platforms (e.g., AWS, Azure, GCP) and relevant data services. Understanding …
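As an illustration of the custom-operator and DAG development this role describes, here is a minimal sketch, assuming Airflow 2.4 or later; the operator, table, and DAG names are hypothetical, not from the listing:

```python
from datetime import datetime

from airflow import DAG
from airflow.models.baseoperator import BaseOperator


class StagingTruncateOperator(BaseOperator):
    """Hypothetical custom operator: clears a Snowflake staging table before a load."""

    def __init__(self, table: str, **kwargs):
        super().__init__(**kwargs)
        self.table = table

    def execute(self, context):
        # A real implementation would run the statement through a Snowflake hook;
        # here we only log what would be executed.
        self.log.info("TRUNCATE TABLE %s", self.table)


# Illustrative daily DAG wiring the custom operator into a schedule.
with DAG(
    dag_id="daily_snowflake_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    StagingTruncateOperator(task_id="truncate_staging", table="stg_orders")
```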
data cataloging and metadata management using tools like AWS Glue Data Catalog. Demonstrated self-sufficiency in exploring new tools, troubleshooting issues, and continuously improving processes. Hands-on experience with Apache Airflow for orchestrating complex data workflows and ensuring reliable execution. Understanding of cloud security and governance practices, including IAM, KMS, and data access policies. Experience with monitoring and …
and scalability. Contribute to the overall data strategy and architecture.
🔹 Tech Stack You’ll be working with:
Programming: Python, SQL, Scala/Java
Big Data: Spark, Hadoop, Databricks
Pipelines: Airflow, Kafka, ETL tools
Cloud: AWS, GCP, or Azure (Glue, Redshift, BigQuery, Snowflake)
Data Modelling & Warehousing
🔹 What’s on Offer
💷 £80,000 p.a. (Permanent role)
📍 Hybrid – 2 days per week in …
knowledge of Python and SQL (experience with Snowflake highly desirable). Knowledge of BI tools such as Superset, Tableau, Power BI, or similar is desirable. Knowledge of orchestration tools such as Airflow, dbt, or Google Cloud Dataflow is a bonus. Analytical and problem-solving skills, with a deep curiosity for fraud detection. Excellent attention to detail to ensure quality of project …
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
development efficiency and deployment effectiveness, including Azure DevOps or GitHub. Considerable experience designing and building operationally efficient pipelines, utilising core cloud components such as Azure Data Factory, BigQuery, Airflow, Google Cloud Composer, and PySpark. Proven experience in modelling data through a medallion-based architecture, with curated dimensional models in the gold layer built for analytical use. Strong …
large-scale bioinformatics datasets. Experience using Nextflow pipelines. Knowledge of NLP techniques and experience processing unstructured data, using vector stores and approximate retrieval. Familiarity with orchestration tooling (e.g. Airflow or Google Workflows). Experience with AI/ML-powered applications. Experience with Docker or containerized applications. Why GSK? Uniting science, technology and talent to get ahead of disease …
optimisation. This is an ideal role for someone looking to hit the ground running and work on complex challenges with autonomy. Role Requirements: Exceptional ability with tools such as Python, Airflow, SQL, and at least one cloud provider. Experience with forecasting, customer, and propensity models. Experience with building machine learning models and deploying at scale. 2:1 or above in Mathematics …
Slough, South East England, United Kingdom Hybrid / WFH Options
Medialab Group
colleagues. Nice to Have Skills: Experience working with GCP (BigQuery) or other modern cloud-native data warehouses (e.g. Snowflake, Redshift). Familiarity with data pipelining and orchestration systems (e.g. Airflow). Understanding of modern analytics architectures and data visualisation tools (we use Preset.io/Apache Superset). Exposure to CI/CD pipelines (GitLab CI preferred). Experience …
Slough, South East England, United Kingdom Hybrid / WFH Options
Hlx Technology
Collaborate with ML researchers and biologists to translate raw data into actionable insights and high-quality training data. Scale distributed systems using Kubernetes, Terraform, and orchestration tools such as Airflow, Flyte, or Temporal. Write clean, extensible, and well-tested code to ensure long-term maintainability and collaboration across teams. About You: We are looking for data and platform engineers … Experience designing and implementing large-scale data storage systems (feature stores, timeseries databases, warehouses, or object stores). Strong distributed systems and infrastructure skills (Kubernetes, Terraform, orchestration frameworks such as Airflow/Flyte/Temporal). Hands-on cloud engineering experience (AWS, GCP, or Azure). Strong software engineering fundamentals, with a track record of writing maintainable, testable, and extensible code. Familiarity …
management disciplines, including data integration, modeling, optimisation, data quality and Master Data Management. Experience with database technologies such as RDBMS (SQL Server, Oracle) or NoSQL (MongoDB). Knowledge of Apache technologies such as Spark, Kafka, and Airflow to build scalable and efficient data pipelines. Experience with migration projects and with management systems such as SAP …
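As a rough sketch of the kind of Spark-and-Kafka pipeline this listing refers to (not any specific employer's stack), a minimal PySpark Structured Streaming read from Kafka; the broker address and topic are placeholders, and running it requires Spark's Kafka connector package:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka_ingest").getOrCreate()

# Read a stream of events from Kafka; broker and topic are placeholders.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka delivers keys/values as bytes, so cast before downstream parsing,
# then write to the console sink purely for demonstration.
query = (
    events.selectExpr("CAST(value AS STRING) AS value")
    .writeStream.format("console")
    .start()
)
query.awaitTermination()
```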
the development of internal ML platforms, and help identify high-impact opportunities for rapid capability delivery. You'll also be responsible for building robust MLOps pipelines using tools like Airflow and MLflow, ensuring scalable and automated model deployment and governance.
What you'll need to succeed
Essential Skills:
* Bachelor's degree (2:1 or above) in science, engineering, mathematics … or computer science
* Strong Python development skills
* Practical experience with machine learning or deep learning in scientific/engineering contexts
* Familiarity with MLOps tools and practices (Airflow, MLflow) and containerisation (e.g., Docker)
* Passion for working with real-world datasets and data visualisation
* Interest in material discovery, computer vision, big data, and optimisation techniques
* Excellent communication and collaboration skills
* Problem …
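For context on the Airflow-and-MLflow pipelines mentioned above, a minimal sketch of the MLflow tracking step such a pipeline might automate; the run name, parameters, and toy model are illustrative, not taken from the listing:

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Toy data standing in for a real scientific dataset.
X, y = make_classification(n_samples=200, random_state=0)

with mlflow.start_run(run_name="baseline"):
    model = LogisticRegression(max_iter=1000).fit(X, y)
    # Record the hyperparameter, a training metric, and the model artifact
    # so a later deployment step can pick the run up for promotion.
    mlflow.log_param("max_iter", 1000)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    mlflow.sklearn.log_model(model, "model")
```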