4 of 4 Permanent Apache Airflow Jobs in Watford

Data and AI Consultant - Full-time

Hiring Organisation: Staffworx
Location: Watford, Hertfordshire, UK
Employment Type: Full-time
… Vertex AI or Azure ML
• Data engineering and MLOps
• Experience with data warehouses such as Snowflake, BigQuery or Redshift
• Workflow orchestration using tools like Airflow or Dagster
• Experience with MLOps tools such as MLflow, Weights and Biases or similar
• Awareness of data and model drift, and how to monitor ...
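
For context on the workflow-orchestration skills these listings keep asking for, here is a minimal sketch of an Airflow DAG, assuming Airflow 2.x and the TaskFlow API; the task names, data, and drift threshold are illustrative and not taken from any of the roles on this page.

from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_ml_pipeline():
    @task
    def extract():
        # Hypothetical: pull rows from a warehouse such as BigQuery or Snowflake.
        return [{"feature": 1.0, "label": 0}]

    @task
    def train(rows):
        # Hypothetical: fit a model and log metrics to an MLOps tool such as MLflow.
        return {"accuracy": 0.9, "n_rows": len(rows)}

    @task
    def check_drift(metrics):
        # Hypothetical drift check: fail the run if accuracy drops below a threshold.
        if metrics["accuracy"] < 0.8:
            raise ValueError("possible model drift - investigate before promoting")

    check_drift(train(extract()))


example_ml_pipeline()

Airflow derives the dependency graph from the task calls (extract, then train, then check_drift) and handles scheduling and retries per task, which is what "workflow orchestration" amounts to in practice.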

Data Engineering Coach

Hiring Organisation: iO Sphere
Location: Watford, Hertfordshire, UK
Employment Type: Full-time
… modelling)
• ETL/ELT pipeline design, build, and orchestration
• Docker and containerisation
• Cloud environments (Azure, AWS, or GCP)
• Version control (Git)
Bonus points for:
• Airflow, Prefect, dbt, or cloud-native orchestration
• CI/CD and deployment automation
• Data warehousing fundamentals (star schema, staging patterns)
Plus: Ability to explain complex ...
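
The "star schema, staging patterns" item above refers to warehouse-modelling basics: a central fact table keyed to small dimension tables. Below is a minimal, self-contained sketch using Python's built-in sqlite3; the table and column names are illustrative assumptions, not from the posting.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    -- Dimensions describe the "who" and "when"; the fact table holds the measures.
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date     (date_id INTEGER PRIMARY KEY, iso_date TEXT);
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES dim_customer(customer_id),
        date_id     INTEGER REFERENCES dim_date(date_id),
        amount      REAL
    );
    """
)
conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme Ltd')")
conn.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
conn.execute("INSERT INTO fact_sales VALUES (1, 1, 20240101, 99.50)")

# The typical star-schema query: join the fact table to its dimensions and aggregate.
rows = conn.execute(
    """
    SELECT c.name, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_id = f.customer_id
    GROUP BY c.name
    """
).fetchall()
print(rows)  # [('Acme Ltd', 99.5)]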

Engineering Manager - Cybersecurity Applications - Fully Remote

Hiring Organisation: Digital Saints | Start-up Talent Partner
Location: Watford, Hertfordshire, UK
Employment Type: Full-time
… start-ups or high-growth scale-ups
• Strong technical grounding across Node.js, TypeScript, React, MySQL and cloud-native architectures (AWS, ECS/K8s, Kafka, Airflow)
• Comfortable driving Agile and DevOps culture, and setting high standards for clean, testable, secure code in fast-moving environments
If you want ...

Senior Data Engineer

Hiring Organisation: Fynity
Location: Watford, Hertfordshire, UK
Employment Type: Full-time
… fast-paced, challenging environment.
Key Responsibilities:
• Design, implement, and debug ETL pipelines to process and manage complex datasets.
• Leverage big data tools, including Apache Kafka, Spark, and Airflow, to deliver scalable solutions.
• Collaborate with stakeholders to ensure data quality and alignment with business goals.
• Utilize programming expertise …
… data engineering projects
• Good hands-on experience of designing, implementing, and debugging ETL pipelines
• Expertise in Python, PySpark and SQL
• Expertise with Spark and Airflow
• Experience of designing data pipelines using cloud-native services on AWS
• Extensive knowledge of AWS services like API Gateway, Lambda, Redshift, Glue, CloudWatch, etc. ...
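
To make the "design, implement, and debug ETL pipelines" responsibility concrete, here is a minimal PySpark sketch of one extract-transform-load step of the kind an Airflow task might trigger; the S3 paths and column names are hypothetical, and a real deployment would add schema enforcement, error handling, and cluster configuration.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_etl").getOrCreate()

# Extract: read raw JSON events (path is hypothetical).
raw = spark.read.json("s3://example-bucket/raw/events/")

# Transform: drop rows missing key fields and derive a partition column.
clean = (
    raw.dropna(subset=["event_id", "event_ts"])
       .withColumn("event_date", F.to_date("event_ts"))
)

# Load: write partitioned Parquet for downstream warehouse loads
# (e.g. into Redshift via Glue or COPY).
clean.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/events/"
)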