4 of 4 Apache Airflow Jobs in the Midlands

Data Engineer 12 month FTC

Hiring Organisation
Harnham - Data & Analytics Recruitment
Location
Birmingham, West Midlands, England, United Kingdom
Employment Type
Full-Time
Salary
£70,000 - £75,000 per annum
… based environment (preferred). Provide input into best practices across the data function and help "keep the lights on."
Tech Stack
Core: Python, SQL, Airflow, dbt, Terraform, CI/CD, Power BI
Nice to have: Kimball methodology
Cloud: Azure (preferred)
What They're Looking For
2-4 years' experience …

Senior Software Engineer

Hiring Organisation
Previsico Limited
Location
Loughborough, Leicestershire, East Midlands, United Kingdom
Employment Type
Permanent, Work From Home
Salary
£80,000
… produce technical specifications. Strong interpersonal skills with the ability to work effectively with a wide range of stakeholders. Understanding of Git. Experience using Airflow or similar.
Benefits and Working Arrangements:
5% of annual salary in share options in line with the Company Policy at the end of a successfully …

Data Software Engineer Python Spark SaaS

Hiring Organisation
Client Server
Location
Nottingham, Nottinghamshire, East Midlands, United Kingdom
Employment Type
Permanent, Work From Home
… have strong Python backend software engineer skills
You have experience working with large data sets
You have experience of using PySpark and ideally also Apache Spark
You believe in automating wherever possible
You're a collaborative problem solver with great communication skills
Other technology in the stack includes: FastAPI … Django, Airflow, Kafka, ETL, CI/CD; experience with some or all would be beneficial, but you'll pick it up on the job
What's in it for you:
Salary to £100k + bonus
Health Insurance
Life Assurance
25 days holiday
Training and self-development budget
Impactful role

Data Engineer Airflow

Hiring Organisation
Harnham - Data & Analytics Recruitment
Location
Birmingham, West Midlands, England, United Kingdom
Employment Type
Full-Time
Salary
£70,000 - £75,000 per annum
… used to drive commercial decisions, particularly around pricing, revenue, and customer insight.
Key responsibilities include:
Manage and maintain the company's data warehouse (Python, Airflow, DBT, Kimball)
Ensure data pipelines are robust, accurate, and performant
Maintain and develop cloud infrastructure using Infrastructure as Code (Terraform)
Identify opportunities to improve … merger
YOUR SKILLS AND EXPERIENCE:
A successful Data Engineer will bring:
Strong SQL and Python skills
Experience managing or building data warehouses
Familiarity with Airflow and modern data engineering workflows
Interest in cloud infrastructure and IaC principles
Proactive mindset: not just maintaining systems, but improving them
THE BENEFITS