4 of 4 Apache Airflow Jobs in North Yorkshire

Data and AI Consultant - Full time

Hiring Organisation
Staffworx
Location
York, North Yorkshire, UK
Employment Type
Full-time
- Vertex AI or Azure ML
- Data engineering and MLOps
- Experience with data warehouses such as Snowflake, BigQuery or Redshift
- Workflow orchestration using tools like Airflow or Dagster
- Experience with MLOps tools such as MLflow, Weights and Biases or similar
- Awareness of data and model drift, and how to monitor ...

Data Engineering Coach

Hiring Organisation
iO Sphere
Location
York, North Yorkshire, UK
Employment Type
Full-time
modelling)
- ETL/ELT pipeline design, build, and orchestration
- Docker and containerisation
- Cloud environments (Azure, AWS, or GCP)
- Version control (Git)
Bonus points for:
- Airflow, Prefect, dbt, or cloud-native orchestration
- CI/CD and deployment automation
- Data warehousing fundamentals (star schema, staging patterns)
Plus: Ability to explain complex ...

Lead Data Engineer

Hiring Organisation
Midnite
Location
York, North Yorkshire, UK
Employment Type
Full-time
team to make sure our data function not only delivers but also drives strategic decision-making. Our tech stack: Python, Docker, Dagster, dbt, Fivetran, Apache Iceberg, Snowflake, S3, Glue, ECS, and Omni. We're constantly evolving our stack and welcome input from engineering leaders on how we can improve …
- scaling data platforms in a high-growth or start-up environment
- Strong expertise in Python and SQL, with deep experience in orchestration frameworks (Dagster, Airflow, Prefect)
- Advanced knowledge of data modelling and architecture (Kimball dimensional modelling, Data Vault, etc.)
- Hands-on experience with dbt, modern data warehouses ...

Senior Data Engineer

Hiring Organisation
Fynity
Location
York, North Yorkshire, UK
Employment Type
Full-time
fast-paced, challenging environment.
Key Responsibilities:
- Design, implement, and debug ETL pipelines to process and manage complex datasets
- Leverage big data tools, including Apache Kafka, Spark, and Airflow, to deliver scalable solutions
- Collaborate with stakeholders to ensure data quality and alignment with business goals
- Utilize programming expertise …
… data engineering projects
- Good hands-on experience designing, implementing, and debugging ETL pipelines
- Expertise in Python, PySpark, and SQL
- Expertise with Spark and Airflow
- Experience designing data pipelines using cloud-native services on AWS
- Extensive knowledge of AWS services such as API Gateway, Lambda, Redshift, Glue, CloudWatch, etc. ...