Apache Airflow Jobs in Birmingham (5 of 5)

Data and AI Consultant - Full-time

Hiring Organisation: Staffworx
Location: Birmingham, UK
Employment Type: Full-time
• ... Vertex AI or Azure ML
• Data engineering and MLOps
• Experience with data warehouses such as Snowflake, BigQuery or Redshift
• Workflow orchestration using tools like Airflow or Dagster
• Experience with MLOps tools such as MLflow, Weights and Biases or similar
• Awareness of data and model drift, and how to monitor ...
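
For illustration, a minimal sketch of the kind of drift check the last bullet gestures at, assuming MLflow's Python client and NumPy; the PSI helper, metric name, run name, and sample data are all hypothetical stand-ins, not part of the listing.

import mlflow
import numpy as np

def psi(expected: np.ndarray, actual: np.ndarray, bins: int = 10) -> float:
    """Population Stability Index between a reference and a live sample."""
    edges = np.histogram_bin_edges(expected, bins=bins)
    e_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    a_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # Clip to avoid log(0) on empty buckets.
    e_pct, a_pct = np.clip(e_pct, 1e-6, None), np.clip(a_pct, 1e-6, None)
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

reference = np.random.normal(0.0, 1.0, 5_000)  # stand-in for training data
live = np.random.normal(0.3, 1.0, 5_000)       # stand-in for serving data

with mlflow.start_run(run_name="daily-drift-check"):
    # A PSI above ~0.2 is a common rule-of-thumb signal of meaningful drift.
    mlflow.log_metric("feature_psi", psi(reference, live))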

Data Engineering Intern

Hiring Organisation: Hireshire
Location: Birmingham, England, United Kingdom
... program supports beginners transitioning into the data domain.
Key Responsibilities:
• Learn to design and build data pipelines using tools such as Python, SQL, and Apache Airflow
• Assist in collecting, transforming, and loading (ETL) data from multiple sources
• Support data warehousing solutions using platforms like BigQuery, Snowflake, Redshift ...
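
For illustration, a minimal sketch of the extract-transform-load pipeline shape this internship describes, assuming Airflow 2.4+ (TaskFlow API with the schedule argument); the DAG name, tasks, and inline sample data are hypothetical.

from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_etl():
    @task
    def extract():
        # A real task would pull from an API or a source database.
        return [{"order_id": 1, "amount": "19.99"}, {"order_id": 2, "amount": "5.00"}]

    @task
    def transform(rows):
        # Cast string amounts to floats before loading.
        return [{**r, "amount": float(r["amount"])} for r in rows]

    @task
    def load(rows):
        # A real task would write to a warehouse such as BigQuery or Snowflake.
        print(f"loaded {len(rows)} rows")

    load(transform(extract()))

orders_etl()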

Analytics Engineer

Hiring Organisation: Xcede
Location: Birmingham, England, United Kingdom
... tools, helping the business make confident, data-driven decisions.
Why this role is exciting:
• Work in a modern data environment with tools like dbt, Airflow, Python, AWS, Databricks and Looker/BI platforms
• Build scalable data models, pipelines and self-serve analytics solutions
• Automate processes and improve operational efficiency ...
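
A common way the dbt and Airflow pieces of this stack fit together is to have Airflow schedule dbt runs; below is a minimal sketch assuming Airflow 2.4+ and the dbt CLI installed on the worker, with a hypothetical project path.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="analytics_dbt_build",
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Build the models, then run the project's data tests against them.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/analytics/dbt",   # hypothetical path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/analytics/dbt",  # hypothetical path
    )
    dbt_run >> dbt_test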

Engineering Manager - Cybersecurity Applications - Fully Remote

Hiring Organisation: Digital Saints | Start-up Talent Partner
Location: Birmingham, UK
Employment Type: Full-time
... start-ups or high-growth scale-ups
• Strong technical grounding across Node.js, TypeScript, React, MySQL and cloud-native architectures (AWS, ECS/K8s, Kafka, Airflow)
• Comfortable driving Agile and DevOps culture, and setting high standards for clean, testable, secure code in fast-moving environments
If you want ...

Senior Data Engineer

Hiring Organisation: Fynity
Location: Birmingham, UK
Employment Type: Full-time
... fast-paced, challenging environment.
Key Responsibilities:
• Design, implement, and debug ETL pipelines to process and manage complex datasets.
• Leverage big data tools, including Apache Kafka, Spark, and Airflow, to deliver scalable solutions.
• Collaborate with stakeholders to ensure data quality and alignment with business goals.
• Utilize programming expertise …
• … data engineering projects
• Good hands-on experience of designing, implementing, and debugging ETL pipelines
• Expertise in Python, PySpark and SQL
• Expertise with Spark and Airflow
• Experience of designing data pipelines using cloud-native services on AWS
• Extensive knowledge of AWS services like API Gateway, Lambda, Redshift, Glue, CloudWatch, etc. ...
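
For illustration, a skeleton of the PySpark-on-AWS ETL work this role centres on, assuming a configured Spark session with S3 access; the bucket, schema, and aggregation are hypothetical.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw order events (hypothetical S3 path).
raw = spark.read.json("s3://example-bucket/raw/orders/")

# Transform: drop bad records, then aggregate revenue per day.
daily = (
    raw.filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("created_at"))
       .groupBy("order_date")
       .agg(F.sum("amount").alias("revenue"))
)

# Load: write partitioned Parquet for downstream consumption
# (e.g. crawled by Glue and queried from Redshift Spectrum).
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_revenue/"
)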