4 of 4 Remote Apache Airflow Jobs in Surrey

Senior Software Engineer (Python) - FinCrime

Hiring Organisation
Zepz
Location
Guildford, Surrey, UK
Employment Type
Full-time
Java and Spring, or are willing to learn on the job. Have strong familiarity with SQL. Experience with the following tools and frameworks: Apache Airflow and/or data pipelines, Celery, FastAPI and/or Flask, Postgres (ideally AWS Aurora), DynamoDB. Have experience working in the FinTech industry, particularly ...

Senior Software Engineer (Python) - FinCrime

Hiring Organisation
Zepz
Location
Woking, Surrey, UK
Employment Type
Full-time
Java and Spring, or are willing to learn on the job. Have strong familiarity with SQL. Experience with the following tools and frameworks: Apache Airflow and/or data pipelines, Celery, FastAPI and/or Flask, Postgres (ideally AWS Aurora), DynamoDB. Have experience working in the FinTech industry, particularly ...

Data Engineer

Hiring Organisation
Anson Mccade
Location
Woking, Surrey, South East, United Kingdom
Employment Type
Permanent, Work From Home
Salary
£65,000
governance, and data architecture. Familiarity with modern data architectures and cloud-based/distributed systems. Proficiency in SQL, Python, and data pipeline tools (e.g., Apache Airflow, Spark). Experience working with major cloud platforms (AWS, Azure, GCP) and big data technologies. Awareness of data governance, data sharing across ...

Data Architect - Reigate

Hiring Organisation
esure Group
Location
Reigate, Surrey, UK
Employment Type
Full-time
controls, including RBAC, SSO, token policies, and pseudonymisation frameworks. Develop resilient data flows for both batch and streaming workloads using technologies such as Kafka, Airflow, DBT, and Terraform. Shape data strategy and standards by contributing to architectural decisions, authoring ADRs, and participating in reviews, data councils, and platform enablement … native data infrastructures (Databricks, Snowflake), especially in AWS environments, is a plus. Experience in building and maintaining batch and streaming data pipelines using Kafka, Airflow, or Spark. Familiarity with governance frameworks, access controls (RBAC), and implementation of pseudonymisation and retention policies. Exposure to enabling GenAI and ML workloads ...