8 of 8 Apache Airflow Jobs in Slough

Data Engineer

Hiring Organisation
Anson Mccade
Location
Slough, Berkshire, South East, United Kingdom
Employment Type
Permanent, Work From Home
Salary
£65,000
... governance, and data architecture. Familiarity with modern data architectures and cloud-based/distributed systems. Proficiency in SQL, Python and data pipeline tools (e.g., Apache Airflow, Spark). Experience working with major cloud platforms (AWS, Azure, GCP) and big data technologies. Awareness of data governance, data sharing across ...
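
Several of these listings pair SQL and Python skills with Airflow for pipeline work. As a rough illustration of what that combination looks like in practice, here is a minimal Airflow 2.x DAG using the TaskFlow API; the DAG name, schedule, and sample rows are hypothetical, not taken from any of these roles.

```python
# Minimal Airflow 2.x DAG sketch using the TaskFlow API.
# The DAG name, schedule, and sample rows are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_sales_pipeline():
    @task
    def extract() -> list[dict]:
        # A real pipeline would pull rows from a source system via SQL here.
        return [{"order_id": 1, "amount": 42.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Keep only positive-amount orders.
        return [r for r in rows if r["amount"] > 0]

    @task
    def load(rows: list[dict]) -> None:
        # A real pipeline would write to a warehouse table here.
        print(f"loading {len(rows)} rows")

    load(transform(extract()))


daily_sales_pipeline()
```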

Sr. Software/Data Engineer, Autonomy (Databricks/Pipelines)

Hiring Organisation
Rivian
Location
Slough, Berkshire, UK
Employment Type
Full-time
... experience developing dashboards for analysis and reporting, leveraging tools such as Databricks Workspaces, Streamlit or Jupyter notebooks. Familiarity with workflow orchestration tools (e.g. Apache Airflow) or data transformation frameworks (e.g. dbt) is a plus. Familiarity with autonomous systems development (e.g., perception, planning, control) or relevant domains such as robotics ...
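
For the dashboarding side of this role, a bare-bones Streamlit app gives a feel for the tooling; the CSV path and column names below are made up for illustration.

```python
# Bare-bones Streamlit dashboard sketch. "telemetry.csv" and its columns
# are hypothetical stand-ins for an exported dataset.
# Run with: streamlit run app.py
import pandas as pd
import streamlit as st

st.title("Autonomy Telemetry Overview")
df = pd.read_csv("telemetry.csv")
st.metric("Runs analysed", len(df))
st.line_chart(df, x="timestamp", y="planner_latency_ms")
```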

Data Platform Engineer

Hiring Organisation
Parser
Location
Slough, Berkshire, UK
Employment Type
Full-time
... professional experience in Data Platform. Proficiency in SQL and programming languages like Python. Working knowledge of ETL and ELT frameworks and orchestration tools like Airflow and dbt. Experience working with cloud platforms, e.g. Azure. Knowledge of data lake and data warehouse concepts and distributed systems like Kafka, Spark, etc. ...
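
A common way to combine the Airflow and dbt skills this listing asks for is to have Airflow schedule dbt commands. The sketch below shows that conventional pattern, a BashOperator pair running dbt run and then dbt test; the DAG id and project directory are hypothetical.

```python
# Airflow orchestrating dbt: build the models, then run the tests.
# The dag_id and --project-dir path are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="elt_dbt",
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )
    dbt_run >> dbt_test  # tests only run if the models build
```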

Data Architect

Hiring Organisation
Xcede
Location
Slough, Berkshire, UK
Employment Type
Full-time
Develop and maintain secure data access controls, including RBAC, token policies, and anonymisation mechanisms. Support batch and real-time data flows using tools like Airflow, Kafka, Spark, and Terraform. Monitor cloud platform performance and implement cost-control measures while improving reliability. Collaborate with product, engineering, and governance teams ...
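
One standard shape for the batch and real-time flows mentioned here is Spark Structured Streaming reading from Kafka. A minimal sketch, assuming a hypothetical broker and topic, and the spark-sql-kafka connector on the classpath:

```python
# Spark Structured Streaming job consuming a Kafka topic. The broker
# address and topic name are hypothetical; requires the spark-sql-kafka
# connector package when submitting the job.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events-stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka values arrive as bytes; cast to string for downstream parsing.
query = (
    events.selectExpr("CAST(value AS STRING) AS payload")
    .writeStream.format("console")
    .start()
)
query.awaitTermination()
```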

AI Engineer (Remote)

Hiring Organisation
Lumenalta
Location
Slough, Berkshire, UK
Employment Type
Full-time
... into production at scale. Technical Skills: Strong backend Python; expertise with ML frameworks (TensorFlow, PyTorch, Scikit-Learn, etc.); familiarity with modern data pipeline tools (Airflow, Spark, Kafka) and workflow orchestration frameworks such as n8n or LangGraph. Applied AI: Background in enterprise applications such as chatbots, workflow automation, RAG pipelines ...
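
The RAG pipelines this role mentions centre on a retrieval step followed by generation. As a toy illustration of the retrieval step only, here is a scikit-learn TF-IDF ranker over a made-up document set; a production system would typically use embedding models and a vector store instead.

```python
# Toy retrieval step of a RAG pipeline: rank documents by TF-IDF cosine
# similarity to the query. Documents and query are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "Refunds are issued within 14 days of return receipt.",
    "Standard shipping takes 3 to 5 business days.",
    "The warranty covers manufacturing defects for two years.",
]
vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(docs)

query_vec = vectorizer.transform(["how long does shipping take?"])
scores = cosine_similarity(query_vec, doc_vectors)[0]
# The best-matching document would be passed as context to the generator.
print(docs[scores.argmax()])
```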

Director of Data Engineering and ML Platform

Hiring Organisation
Preply
Location
Slough, Berkshire, UK
Employment Type
Full-time
... product company (experience in edtech or marketplace environments is a plus). Deep technical expertise in modern data and ML stacks, e.g., Kafka, Spark, Airflow, SageMaker, MLflow, or similar cloud-native architectures. Practical understanding of machine learning workflows and the infrastructure required to support them (feature computation, training, deployment ...
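
Of the stack named here, MLflow is the experiment-tracking piece. A minimal sketch of logging a training run, with placeholder parameter and metric values:

```python
# Logging one training run to MLflow. The run name, parameter, and
# metric value are placeholders.
import mlflow

with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("model", "logistic_regression")
    mlflow.log_metric("val_auc", 0.87)
```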

Lead / Senior Software Engineer- Python

Hiring Organisation
Vortexa
Location
Slough, Berkshire, UK
Employment Type
Full-time
... code review, source control, build, test, deploy, and operations. Awesome If You: Are experienced in Rust/Java/Kotlin. Have experience with AWS, Apache Kafka, Kafka Streams, Apache Beam/Flink/Spark, especially deployment, monitoring & debugging. Have experience with productisation of Machine Learning research projects … Are familiar with Airflow or other workflow orchestration tools, and have worked with Kubernetes. Understand data lake systems, file formats like Parquet and ORC, and query tools like Athena. Have some relevant AWS or Kafka certifications. Benefits: A vibrant, diverse company pushing ourselves and the technology to deliver beyond the cutting edge. A team ...
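
On the Kafka side of this stack, a minimal Python producer gives a sense of the plumbing; it uses the confluent-kafka client, and the broker address, topic, and payload below are hypothetical.

```python
# Minimal Kafka producer using the confluent-kafka client. Broker, topic,
# key, and payload are hypothetical.
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "broker:9092"})
producer.produce(
    "vessel-positions",  # hypothetical topic
    key="imo-9321483",   # hypothetical vessel identifier
    value='{"lat": 51.51, "lon": -0.59}',
)
producer.flush()  # block until outstanding messages are delivered
```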

Senior Data Engineer

Hiring Organisation
EDF Trading
Location
Slough, Berkshire, UK
Employment Type
Full-time
... company's requirements. Design and build data storage solutions to address different use cases. Be technology agnostic and favour open standards (e.g., Parquet, Apache Iceberg). Find the right balance between cost and performance of the proposed solutions. Create scalable ingestion and transformation workflows, covering aspects such as observability … Standards: Hands-on experience with a data lakehouse implementation (either vendor-based or open source) which is based on open standards such as Parquet and Apache Iceberg. Experience in building data models, partition strategies and performance optimization (e.g. data compaction). Data Transformation & Lineage: Experience in performing data transformation ...
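
Partition strategies of the kind this listing describes start with how files are laid out on disk or in object storage. A small PyArrow sketch writing a date-partitioned Parquet dataset, with a hypothetical trades schema; an Iceberg table would add table metadata, snapshots, and compaction on top of files laid out much like these.

```python
# Writing a date-partitioned Parquet dataset with PyArrow. The schema is
# a hypothetical trades table; "lake/trades" is a made-up output path.
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table(
    {
        "trade_date": ["2024-01-01", "2024-01-01", "2024-01-02"],
        "desk": ["power", "gas", "power"],
        "notional": [1.2e6, 8.0e5, 2.5e6],
    }
)
# Produces lake/trades/trade_date=2024-01-01/... directories, one per value.
pq.write_to_dataset(table, root_path="lake/trades", partition_cols=["trade_date"])
```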