Apache Airflow Jobs in Berkshire

17 of 17 Apache Airflow Jobs in Berkshire

Statistical Data Scientist

Slough, England, United Kingdom
Hybrid / WFH Options
JR United Kingdom
… (pandas, xarray, SciPy/PyMC/PyTorch, or similar). Experience validating models with historical data and communicating results to non-specialists. Exposure to real-time data engineering (Kafka, Airflow, dbt). Track record of turning research code into production services (CI/CD, containers, etc.). Strong SQL and data-management skills; experience querying large analytical databases (Snowflake highly desirable), but …

Data Engineer – Quant Hedge Fund

Slough, England, United Kingdom
Hybrid / WFH Options
JR United Kingdom
… years' experience gained in a Hedge Fund, Investment Bank, FinTech or similar. Expertise in Python and SQL, and familiarity with relational and time-series databases. Exposure to Airflow and dbt, as well as Snowflake, Databricks or other cloud data warehouses, preferred. Experience implementing data pipelines from major financial market data vendors (Bloomberg, Refinitiv, FactSet, …). SDLC and DevOps: Git …

Backend Python Developer

Slough, England, United Kingdom
JR United Kingdom
Manage deployments with Helm and configuration in YAML. Develop shell scripts and automation for deployment and operational workflows. Work with Data Engineering to integrate and manage data workflows using Apache Airflow and DAG-based models. Perform comprehensive testing, debugging, and optimization of backend components. Required skills: Bachelor's degree in Computer Science, Software Engineering, or a related field; … and YAML for defining deployment configurations and managing releases. Proficiency in shell scripting for automating deployment and maintenance tasks. Understanding of DAG (Directed Acyclic Graph) models and experience with Apache Airflow for managing complex data processing workflows. Familiarity with database systems (SQL and NoSQL) and proficiency in writing efficient queries. Solid understanding of software development best practices, including …
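For illustration only, a minimal sketch of the kind of Airflow DAG-based workflow this role refers to; the DAG id, schedule and the extract/load callables are hypothetical placeholders, not taken from the posting.

```python
# Minimal Airflow DAG sketch (Airflow 2.4+ argument names); the dag_id,
# schedule and the two callables are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull source data (e.g. from an API or a database).
    return {"rows": 42}


def load(ti=None):
    # Placeholder: read the upstream result from XCom and persist it.
    rows = ti.xcom_pull(task_ids="extract")
    print(f"loading {rows}")


with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # DAG edge: load runs only after extract succeeds
```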

Machine Learning Engineer with Data Engineering expertise

Slough, England, United Kingdom
JR United Kingdom
… in Python with libraries like TensorFlow, PyTorch, or scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL, and data modeling. Familiarity with cloud platforms (AWS, Azure, GCP) for deploying ML and data solutions. Knowledge of MLOps …

Pricing & Revenue Data Scientist

Slough, England, United Kingdom
JR United Kingdom
… SQL, craft new features. Modelling sprint: run hyper-parameter sweeps or explore heuristic/greedy and MIP/SAT approaches. Deployment: ship a model as a container, update an Airflow (or Azure Data Factory) job. Review: inspect dashboards, compare control vs. treatment, plan the next experiment. Tech stack: Python (pandas, NumPy, scikit-learn, PyTorch/TensorFlow); SQL (Redshift, Snowflake or … similar); AWS SageMaker → Azure ML migration, with Docker, Git, Terraform, Airflow/ADF; optional extras: Spark, Databricks, Kubernetes. What you'll bring: 3-5+ years building optimisation or recommendation systems at scale. Strong grasp of mathematical optimisation (e.g., linear/integer programming, meta-heuristics) as well as ML. Hands-on cloud ML experience (AWS or Azure). Proven … Terraform. SQL mastery for heavy-duty data wrangling and feature engineering. Experimentation chops: offline metrics, online A/B test design, uplift analysis. Production mindset: containerise models, deploy via Airflow/ADF, monitor drift, automate retraining. Soft skills: clear comms, concise docs, and a collaborative approach with DS, Eng & Product. Bonus extras: Spark/Databricks, Kubernetes, big-data panel …
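As a hedged sketch only, the "ship a model as a container, update an Airflow job" step might look something like this using Airflow's Docker provider; the image name, command and schedule are invented for illustration, and on the Azure side an ADF pipeline would play the same role.

```python
# Sketch of a containerised scoring task orchestrated by Airflow
# (requires apache-airflow-providers-docker); image and command are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator

with DAG(
    dag_id="pricing_model_scoring",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    score = DockerOperator(
        task_id="score_prices",
        image="registry.example.com/pricing-model:latest",  # hypothetical image
        command="python score.py --run-date {{ ds }}",      # Jinja-templated logical date
        environment={"MODEL_STAGE": "production"},
    )
```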

Lead Data Engineer (Remote)

Bracknell, England, United Kingdom
Hybrid / WFH Options
Circana, LLC
In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and … desire to make a significant impact, we encourage you to apply! Job Responsibilities: Data Engineering & Data Pipeline Development: Design, develop, and optimize scalable data workflows using Python, PySpark, and Airflow. Implement real-time and batch data processing using Spark. Enforce best practices for data quality, governance, and security throughout the data lifecycle. Ensure data availability, reliability and performance through … data processing workloads. Implement CI/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Build and optimize large-scale data processing pipelines using Apache Spark and PySpark. Implement data partitioning, caching, and performance tuning for Spark-based workloads. Work with diverse data formats (structured and unstructured) to support advanced analytics and machine learning …
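A compact PySpark sketch of the batch-processing side of this role, purely illustrative: the storage paths, column names and the quality rule are assumptions, not taken from the posting.

```python
# Illustrative PySpark batch job: read raw records, apply a simple
# data-quality filter, aggregate, and write partitioned Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_sales_batch").getOrCreate()

# Hypothetical ADLS paths and schema.
raw = spark.read.parquet("abfss://raw@example.dfs.core.windows.net/sales/")

# Simple data-quality rule: keep only rows with a positive, non-null amount.
clean = raw.filter(F.col("amount").isNotNull() & (F.col("amount") > 0))

daily = (
    clean.groupBy("store_id", F.to_date("sold_at").alias("sale_date"))
         .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)

daily.write.mode("overwrite").partitionBy("sale_date").parquet(
    "abfss://curated@example.dfs.core.windows.net/daily_sales/"
)
```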

Solutions Architect (Data Analytics)

Slough, England, United Kingdom
JR United Kingdom
… technologies: Azure, AWS, GCP, Snowflake, Databricks. Must have: hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF, etc. Excellent consulting experience, with the ability to design and build solutions and actively contribute to RfP responses. Ability to act as a SPOC for all technical discussions across industry groups. Excellent design experience, with entrepreneurship skills to own and lead solutions for clients. Ability to define monitoring, alerting and deployment strategies for various services. Experience providing … skills. A minimum of 5 years' experience in a similar role. Ability to lead and mentor architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam, BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Designing Databricks-based solutions for …
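For context on the "Apache Spark, Beam or equivalent" requirement, a minimal Apache Beam pipeline sketch of the Dataflow-style processing mentioned above; the bucket paths and event fields are hypothetical.

```python
# Minimal Apache Beam batch pipeline; bucket paths and the event schema
# are placeholders. On GCP the pipeline options would select the Dataflow runner.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions()  # e.g. --runner=DataflowRunner --project=... on GCP

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadEvents" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
        | "Parse" >> beam.Map(json.loads)
        | "KeyByUser" >> beam.Map(lambda event: (event["user_id"], 1))
        | "CountPerUser" >> beam.CombinePerKey(sum)
        | "Format" >> beam.Map(lambda kv: f"{kv[0]},{kv[1]}")
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output/user_counts")
    )
```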

Senior Cloud and Data Architect

Slough, England, United Kingdom
JR United Kingdom
… technologies: Azure, AWS, GCP, Snowflake, Databricks. Must have: hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF, etc. Excellent consulting experience, with the ability to design and build solutions and actively contribute to RfP responses. Ability to act as a SPOC for all technical discussions across industry groups. Excellent design experience, with entrepreneurship skills to own and lead solutions for clients. Ability to define monitoring, alerting and deployment strategies for various services. Experience providing … A minimum of 5 years' experience in a similar role. Ability to lead and mentor architects. Required skills (mandatory, at least 2 hyperscalers): GCP, AWS, Azure, Big Data, Apache Spark, Beam, BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Preferred skills: designing Databricks-based …

Senior Cloud and Data Solution Architect

Slough, England, United Kingdom
JR United Kingdom
… technologies: Azure, AWS, GCP, Snowflake, Databricks. Must have: hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF, etc. Excellent consulting experience, with the ability to design and build solutions and actively contribute to RfP responses. Ability to act as a SPOC for all technical discussions across industry groups. Excellent design experience, with entrepreneurship skills to own and lead solutions for clients. Ability to define monitoring, alerting and deployment strategies for various services. Experience providing … skills. A minimum of 5 years' experience in a similar role. Ability to lead and mentor architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam, BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable skills: designing Databricks-based …

Data Orchestration Systems Analyst

Slough, England, United Kingdom
JR United Kingdom
…driven decision-making. What we'd like to see from you: 3–5 years of experience in data integration, orchestration, or automation roles. Solid experience with orchestration tools (e.g., Apache Airflow, MuleSoft, Dell Boomi, Informatica Cloud). Familiarity with cloud data platforms (e.g., AWS, Microsoft Azure, Google Cloud Platform) and related data movement technologies, including AWS Lambda and …

Generative AI Engineer

Slough, England, United Kingdom
JR United Kingdom
… Concept. Contributing to AI infrastructure. Building reliable, scalable, and flexible systems. Influencing opinion and decision-making across AI and ML. Skills: Python; SQL/Pandas/Snowflake/Elasticsearch; Airflow/Spark; familiarity with GenAI models/libraries. Requirements: 6+ years of relevant software engineering experience post-graduation. A degree (ideally a Master's) in Computer Science, Physics, Mathematics …

AI Product and Research Engineer

Slough, England, United Kingdom
JR United Kingdom
… features. Rapid Prototyping: Create interactive AI demos and proofs-of-concept with Streamlit, Gradio, or Next.js for stakeholder feedback. MLOps & Deployment: Implement CI/CD pipelines (e.g., GitLab Actions, Apache Airflow), experiment tracking (MLflow), and model monitoring for reliable production workflows. Cross-Functional Collaboration: Participate in code reviews, architectural discussions, and sprint planning to deliver features end-to-end …
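A small sketch of the experiment-tracking piece (MLflow) mentioned above; the experiment name, parameters and metric values are made up for illustration.

```python
# Illustrative MLflow experiment tracking; experiment name, parameters
# and metric values are placeholders, logged to the local tracking store.
import mlflow

mlflow.set_experiment("demo-ai-prototype")  # hypothetical experiment name

with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("model_name", "example-llm")  # placeholder parameter
    mlflow.log_param("temperature", 0.2)
    mlflow.log_metric("eval_accuracy", 0.87)       # placeholder offline metric
```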

Software/Data Engineer - Commodities Desk

Slough, England, United Kingdom
JR United Kingdom
… like ICE, CME, Reuters, Bloomberg. Candidates should have 3-6 years of relevant experience and ideally some exposure to commodities data sets. Strong Python skills and exposure to Airflow are essential. Please apply if you want to be part of this unique build-out.

Databricks Data Engineer | Energy Trading

Slough, England, United Kingdom
JR United Kingdom
… platforms including AWS, Azure or SAP. ETL/ELT Development; Data Modeling; Data Integration & Ingestion; Data Manipulation & Processing. Version Control & DevOps: skilled in GitHub, GitHub Actions, Azure DevOps. Glue, Airflow, Kinesis, Redshift; SonarQube, PyTest. If you're ready to take on a new challenge and shape data engineering in a trading-first environment, submit your CV today to be …

Founding Engineer (Full-Stack)

Slough, England, United Kingdom
JR United Kingdom
… core concepts in ML, data science and MLOps. Nice-to-have: built agentic workflows/LLM tool use; experience with MLflow, WandB, LangFuse, or other MLOps tools; experience with Airflow, Spark, Kafka or similar. Why Plexe? Hard problems: we're automating the entire ML/AI lifecycle from data engineering to insights. High ownership: the first 5 engineers write the …

Full Stack Developer

Slough, Berkshire, United Kingdom
Lumi Space
Lumi Space is empowering the future prosperity of Earth, making space scalable and sustainable using ground-based laser systems. We work with global companies and institutions to build products and services to precisely track satellites and remove the dangers of …
Employment Type: Permanent
Salary: GBP Annual

Principal Data Architect

Slough, England, United Kingdom
JR United Kingdom
… especially in retail and consumer sectors, and how data supports operational outcomes. Strong coding ability with SQL and Python, as well as experience working with data orchestration tools like Airflow or Dataform. Commercial experience with Spark and Databricks. Familiarity with leading integration and data platforms such as MuleSoft, Talend, or Alteryx. A natural ability to mentor others and provide …