Remote Apache Spark Jobs in England

1 to 25 of 309 Remote Apache Spark Jobs in England

Senior Data Engineer (Remote)

South East, United Kingdom
Hybrid / WFH Options
Circana
You will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a … to apply! Job Responsibilities ETL/ELT Pipeline Development: Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow. Implement batch and real-time data processing solutions using Apache Spark. Ensure data quality, governance, and security throughout the data lifecycle. Cloud …/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Develop and optimize large-scale data processing pipelines using Apache Spark and PySpark. Implement data partitioning, caching, and performance tuning techniques to enhance Spark-based workloads (an illustrative PySpark sketch of these techniques follows this listing). Work with diverse data formats (structured …
Employment Type: Permanent
Posted:
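The Circana listing above calls out data partitioning, caching, and performance tuning for Spark workloads. As a rough, illustrative sketch only (the storage paths, table, and column names are hypothetical and not taken from the posting), a PySpark batch job applying those techniques might look like this:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical example: partitioning, caching, and tuning a Spark batch job on Azure storage.
spark = (
    SparkSession.builder
    .appName("daily-spend-aggregation")
    .config("spark.sql.shuffle.partitions", "200")  # tune shuffle parallelism for the workload
    .getOrCreate()
)

# Read a (made-up) raw dataset from cloud storage.
orders = spark.read.parquet("abfss://raw@example.dfs.core.windows.net/orders/")

# Repartition on the aggregation key to spread the shuffle evenly,
# and cache because the DataFrame is reused by downstream aggregations.
orders = orders.repartition(200, "customer_id").cache()

daily_spend = (
    orders.groupBy("customer_id", F.to_date("order_ts").alias("order_date"))
          .agg(F.sum("amount").alias("daily_spend"))
)

# Write partitioned by date so downstream readers can prune partitions.
daily_spend.write.mode("overwrite").partitionBy("order_date").parquet(
    "abfss://curated@example.dfs.core.windows.net/daily_spend/"
)
```

The repartition/cache pattern and explicit shuffle-partition setting are just one common way to address the tuning responsibilities the ad describes; the right values depend entirely on data volumes and cluster size.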

Senior DevOps Engineer - London Office, Oxford St

London, England, United Kingdom
Hybrid / WFH Options
Biprocsi Ltd
automation to ensure successful project delivery, adhering to client timelines and quality standards. Implement and manage real-time and batch data processing frameworks (e.g., Apache Kafka, Apache Spark, Google Cloud Dataproc) in line with project needs. Build and maintain robust monitoring, logging, and alerting systems for client … in languages like Python, Bash, or Go to automate tasks and build necessary tools. Expertise in designing and optimising data pipelines using frameworks like Apache Airflow or equivalent (a minimal Airflow DAG sketch follows this listing). Demonstrated experience with real-time and batch data processing frameworks, including Apache Kafka, Apache Spark, or Google Cloud …
Posted:
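The Biprocsi listing above mentions designing and optimising data pipelines with Apache Airflow around Spark and Kafka workloads. A minimal, hypothetical Airflow DAG sketch (the task names, script paths, and spark-submit call are illustrative, not from the posting) could look like this:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical DAG: run a daily Spark batch job, then a data-quality check.
with DAG(
    dag_id="daily_events_batch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # one run per day
    catchup=False,
) as dag:

    run_spark_job = BashOperator(
        task_id="run_spark_job",
        # Illustrative spark-submit call; the script path and master are made up.
        bash_command="spark-submit --master yarn /opt/jobs/aggregate_events.py {{ ds }}",
    )

    validate_output = BashOperator(
        task_id="validate_output",
        bash_command="python /opt/jobs/validate_output.py {{ ds }}",
    )

    run_spark_job >> validate_output  # validation runs only after the Spark job succeeds
```

In practice a provider operator (for example a Dataproc or Databricks submit operator) would usually replace the raw spark-submit call, but the dependency ordering shown here is the core orchestration idea.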

Lead Data Engineer (Remote)

South East, United Kingdom
Hybrid / WFH Options
Circana
You will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a … Engineering & Data Pipeline Development: Design, develop, and optimize scalable data workflows using Python, PySpark, and Airflow. Implement real-time and batch data processing using Spark (a Structured Streaming sketch follows this listing). Enforce best practices for data quality, governance, and security throughout the data lifecycle. Ensure data availability, reliability, and performance through monitoring and automation. Cloud …/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Build and optimize large-scale data processing pipelines using Apache Spark and PySpark. Implement data partitioning, caching, and performance tuning for Spark-based workloads. Work with diverse data formats (structured and unstructured) …
Employment Type: Permanent
Posted:
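The listing above pairs batch processing with real-time processing in Spark. A hedged sketch of the streaming half using Spark Structured Streaming reading from Kafka (the broker address, topic, and checkpoint path are hypothetical, and the Kafka connector package must be on the classpath):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical example: windowed counts over a Kafka stream with Structured Streaming.
# Requires the spark-sql-kafka connector (e.g. org.apache.spark:spark-sql-kafka-0-10_2.12).
spark = SparkSession.builder.appName("events-stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # made-up broker address
    .option("subscribe", "page_views")                 # made-up topic name
    .load()
)

# Kafka values arrive as bytes; cast to string and count events in 5-minute windows.
windowed_counts = (
    events.selectExpr("CAST(value AS STRING) AS value", "timestamp")
          .groupBy(F.window("timestamp", "5 minutes"))
          .count()
)

query = (
    windowed_counts.writeStream
    .outputMode("complete")
    .format("console")  # swap for a Delta/parquet sink in a real pipeline
    .option("checkpointLocation", "/tmp/checkpoints/page_views")
    .start()
)
query.awaitTermination()
```

The same transformation logic can usually be shared with the batch path by swapping readStream/writeStream for read/write, which is one way teams keep batch and real-time code consistent.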

We are hiring: Data Scientist, London

London, United Kingdom
Hybrid / WFH Options
The Society for Location Analysis
learning libraries in one or more programming languages. Keen interest in some of the following areas: Big Data Analytics (e.g. Google BigQuery/BigTable, Apache Spark), Parallel Computing (e.g. Apache Spark, Kubernetes, Databricks), Cloud Engineering (AWS, GCP, Azure), Spatial Query Optimisation, Data Storytelling with (Jupyter) Notebooks …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Data Engineering Consultant

London, England, United Kingdom
Hybrid / WFH Options
Endava
Key Responsibilities Data Pipeline Development Architect, implement and maintain real-time and batch data pipelines to handle large datasets efficiently. Employ frameworks such as Apache Spark, Databricks, Snowflake or Airflow to automate ingestion, transformation, and delivery. Data Integration & Transformation Work with Data Analysts to understand source-to-target … ensure regulatory compliance (GDPR). Document data lineage and recommend improvements for data ownership and stewardship. Qualifications Programming: Python, SQL, Scala, Java. Big Data: Apache Spark, Hadoop, Databricks, Snowflake, etc. Cloud: AWS (Glue, Redshift), Azure (Synapse, Data Factory, Fabric), GCP (BigQuery, Dataflow). Data Modelling & Storage: Relational (PostgreSQL …
Posted:

Senior Data Engineer (Databricks) - UK

London, England, United Kingdom
Hybrid / WFH Options
Datapao
companies where years-long behemoth projects are the norm, our projects are fast-paced, typically 2 to 4 months long. Most are delivered using Apache Spark/Databricks on AWS/Azure and require you to directly manage the customer relationship alone or in collaboration with a Project … at DATAPAO, meaning that you'll get access to Databricks' public and internal courses to learn all the tricks of Distributed Data Processing, MLOps, Apache Spark, Databricks, and Cloud Migration from the best. Additionally, we'll pay for various data & cloud certifications, you'll get dedicated time for … seniority level during the selection process. About DATAPAO At DATAPAO, we are delivery partners and the preferred training provider for Databricks, the creators of Apache Spark. Additionally, we are Microsoft Gold Partners in delivering cloud migration and data architecture on Azure. Our delivery partnerships enable us to work in …
Posted:

Lead Data Scientist (Equity Only) - 1%

London, England, United Kingdom
Hybrid / WFH Options
Luupli
analytical, problem-solving, and critical thinking skills. 8. Experience with social media analytics and understanding of user behaviour. 9. Familiarity with big data technologies, such as Apache Hadoop, Apache Spark, or Apache Kafka. 10. Knowledge of AWS machine learning services, such as Amazon SageMaker and Amazon Comprehend. 11. Experience with …
Posted:

Python Software Engineer

London, England, United Kingdom
Hybrid / WFH Options
bigspark
Engineer - UK Remote About Us bigspark, a UK-based consultancy delivering next-level data platforms and solutions with a focus on exciting technologies including Apache Spark and Apache Kafka, and working on projects within Machine Learning, Data Engineering, Streaming and Data Science, is looking for a Python Software Engineer …
Posted:

Data Engineering Consultant

London, England, United Kingdom
Hybrid / WFH Options
Endava Limited
with business objectives. Key Responsibilities Architect, implement, and maintain real-time and batch data pipelines to handle large datasets efficiently. Employ frameworks such as Apache Spark, Databricks, Snowflake, or Airflow to automate ingestion, transformation, and delivery. Data Integration & Transformation Work with Data Analysts to understand source-to-target … ensure regulatory compliance (GDPR). Document data lineage and recommend improvements for data ownership and stewardship. Qualifications Programming: Python, SQL, Scala, Java. Big Data: Apache Spark, Hadoop, Databricks, Snowflake, etc. Data Modelling: Designing dimensional, relational, and hierarchical data models. Scalability & Performance: Building fault-tolerant, highly available data architectures.
Posted:

Senior Engineering Manager - Data Platform, 6-Month FTC

London, England, United Kingdom
Hybrid / WFH Options
DEPOP
platform teams at scale, ideally in a consumer or marketplace environment. Deep understanding of distributed systems and modern data ecosystems - including experience with Databricks, Apache Spark, Apache Kafka and dbt. Demonstrated success in managing data platforms at scale, including both batch processing and real-time streaming architectures.
Posted:

Sr Data Science Manager, Professional Services

London, United Kingdom
Hybrid / WFH Options
Databricks Inc
driving business value through ML Company first focus and collaborative individuals - we work better when we work together. Preferred Experience working with Databricks and Apache Spark Preferred Experience working in a customer-facing role About Databricks Databricks is the data and AI company. More than 10,000 organizations … data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. Benefits At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Data Consultant

London, England, United Kingdom
Hybrid / WFH Options
Apollo Solutions
manipulation and analysis, with the ability to build, maintain, and deploy sequences of automated processes. Bonus Experience (Nice to Have): Familiarity with dbt, Fivetran, Apache Airflow, Data Mesh, Data Vault 2.0, Fabric, and Apache Spark. Experience working with streaming technologies such as Apache Kafka, Apache …
Posted:

Senior Big Data Engineer

Newcastle upon Tyne, England, United Kingdom
Hybrid / WFH Options
Gaming Innovation Group
at: Object-oriented programming (Java) Data modelling using any database technologies ETL processes (ETLs are old school, we transfer in memory now) and experience with Apache Spark or Apache NiFi Applied understanding of CI/CD in change management Dockerised applications Used distributed version control systems Excellent team player …
Posted:

Big Data Engineer

Manchester, England, United Kingdom
Hybrid / WFH Options
Gaming Innovation Group
Object-oriented programming (Java) Data modeling using various database technologies ETL processes (transferring data in-memory, moving away from traditional ETLs) and experience with Apache Spark or Apache NiFi Applied understanding of CI/CD in change management Dockerized applications Using distributed version control systems Being an …
Posted:

Data Architect (Bristol - Hybrid, 2 Office Days)

London, England, United Kingdom
Hybrid / WFH Options
SBS
modelling, design, and integration expertise. Data Mesh Architectures: In-depth understanding of data mesh architectures. Technical Proficiency: Proficient in dbt, SQL, Python/Java, Apache Spark, Trino, Apache Airflow, and Astro. Cloud Technologies: Awareness and experience with cloud technologies, particularly AWS. Analytical Skills: Excellent problem-solving and …
Posted:

Senior Data Engineer

City of Westminster, England, United Kingdom
Hybrid / WFH Options
nudge Global Ltd
with cloud data platforms such as GCP (BigQuery, Dataflow) or Azure (Data Factory, Synapse) Expert in SQL, MongoDB and distributed data systems such as Spark, Databricks or Kafka Familiarity with data warehousing concepts and tools (e.g. Snowflake) Experience with CI/CD pipelines, containerization (Docker), and infrastructure-as-code …
Posted:

Senior Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
nudge
with cloud data platforms such as GCP (BigQuery, Dataflow) or Azure (Data Factory, Synapse) Expert in SQL, MongoDB and distributed data systems such as Spark, Databricks or Kafka Familiarity with data warehousing concepts and tools (e.g. Snowflake) Experience with CI/CD pipelines, containerization (Docker), and infrastructure-as-code …
Posted:

Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
Locus Robotics
and scaling data systems. Highly desired experience with Azure, particularly Lakehouse and Eventhouse architectures. Experience with relevant infrastructure and tools including NATS, Power BI, Apache Spark/Databricks, and PySpark. Hands-on experience with data warehousing methodologies and optimization libraries (e.g., OR-Tools). Experience with log analysis …
Posted:

Senior Data Engineer (Databricks)

London, England, United Kingdom
Hybrid / WFH Options
DATAPAO
most complex projects - individually or by leading small delivery teams. Our projects are fast-paced, typically 2 to 4 months long, and primarily use Apache Spark/Databricks on AWS/Azure. You will manage customer relationships either alone or with a Project Manager, and support our pre …
Posted:

Lead Data Scientist, Machine Learning Engineer 2025- UK

London, England, United Kingdom
Hybrid / WFH Options
Aimpoint Digital
industries Design and develop feature engineering pipelines, build ML & AI infrastructure, deploy models, and orchestrate advanced analytical insights Write code in SQL, Python, and Spark following software engineering best practices Collaborate with stakeholders and customers to ensure successful project delivery Who we are looking for We are looking for …
Posted:

Senior Data Engineer - Data Infrastructure and Architecture: C-4 Analytics

Wakefield, Yorkshire, United Kingdom
Hybrid / WFH Options
Flippa.com
Terraform, Flask, Pandas, FastAPI, Dagster, GraphQL, SQLAlchemy, GitLab, Athena. Your Trusted Companions: Docker, Snowflake, MongoDB, Relational Databases (e.g. MySQL, PostgreSQL), Dagster, Airflow/Luigi, Spark, Kubernetes. Your AWS Kingdom: Lambda, Redshift, EC2, ELB, IAM, RDS, Route53, S3 - the building blocks of cloud mastery. Your Philosophy: Continuous integration/deployments …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Director, Software Engineering, Data

London, England, United Kingdom
Hybrid / WFH Options
Cloudera
Data Engineering product area. This next-generation cloud-native service empowers customers to run large-scale data engineering workflows—using industry-standard tools like Apache Spark and Apache Airflow—with just a few clicks, across both on-premises and public cloud environments. You'll play a critical … lead their own teams across multiple time zones Oversee a global team, many of whom are active contributors to open source communities like the Apache Software Foundation Own both technical direction and people management within the team Ensure consistent, high-quality software delivery through iterative releases Hire, manage, coach …
Posted:

Data Architect - Senior Manager

City of London, England, United Kingdom
Hybrid / WFH Options
Staging It
modelling (relational, NoSQL) and ETL/ELT processes. Experience with data integration tools (e.g., Kafka, Talend) and APIs. Familiarity with big data technologies (Hadoop, Spark) and real-time streaming. Expertise in cloud security, data governance, and compliance (GDPR, HIPAA). Strong SQL skills and proficiency in at least one …
Posted:

Data Engineer

London, United Kingdom
Hybrid / WFH Options
Different Technologies Pty Ltd
delivery across a range of projects, including data analysis, extraction, transformation and loading, data intelligence, and data security, with proven experience in their technologies (e.g. Spark, cloud-based ETL services, Python, Kafka, SQL, Airflow). You have experience in assessing the relevant data quality issues based on data sources & use cases …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Senior Consultant - Business Intelligence Specialist

London, United Kingdom
Hybrid / WFH Options
Delta Capita Group
team-oriented environment. Preferred Skills: Experience with programming languages such as Python or R for data analysis. Knowledge of big data technologies (e.g., Hadoop, Spark) and data warehousing concepts. Familiarity with cloud data platforms (e.g., Azure, AWS, Google Cloud) is a plus. Certification in BI tools, SQL, or related …
Employment Type: Permanent
Salary: GBP Annual
Posted:
Apache Spark salaries in England
10th Percentile: £47,500
25th Percentile: £56,250
Median: £83,750
75th Percentile: £115,000
90th Percentile: £138,750