Apache Spark Jobs in the UK

1 to 25 of 110 Apache Spark Jobs in the UK

Solutions Architect - Big Data and DevOps (f/m/d)

England, United Kingdom
Stackable
robust way possible! Diverse training opportunities and social benefits (e.g. UK pension scheme). What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch. Proficiency in cloud-native technologies such as containerization and Kubernetes …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Senior Software Engineer

Manchester, Lancashire, United Kingdom
Anaplan Inc
production issues. Optimize applications for performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise documentation for software, processes … Experience with cloud platforms like AWS, GCP, or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar big data platforms for … processing large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired Skills: Asynchronous Programming: Familiarity with asynchronous programming tools like Celery or asyncio. Frontend Knowledge …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Databricks Azure Data Engineer x2 - UK Wide (Hybrid Working)

Nationwide, United Kingdom
Hybrid/Remote Options
Adecco
Engineer with an Azure focus, you will be an integral part of our team dedicated to building scalable and secure data platforms. You will leverage your expertise in Databricks, Apache Spark, and Azure to design, develop, and implement data warehouses, data lakehouses, and AI/ML models that fuel our data-driven operations. Skills/Experience: Design and … build high-performance data pipelines: Utilize Databricks and Apache Spark to extract, transform, and load data into Azure Data Lake Storage and other Azure services. Develop and maintain secure data warehouses and data lakehouses: Implement data models, data quality checks, and governance practices to ensure reliable and accurate data. Build and deploy AI/ML models: Integrate Machine … and best practices with a focus on how AI can support you in your delivery work. Solid experience as a Data Engineer or similar role. Proven expertise in Databricks, Apache Spark, and data pipeline development, and a strong understanding of data warehousing concepts and practices. Experience with Microsoft Azure cloud platform, including Azure Data Lake Storage, Databricks and Azure …
Employment Type: Permanent
Salary: £72000 - £80000/annum + Benefits
Posted:
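For context on the kind of work the Databricks/Azure listings above describe, here is a minimal, hypothetical PySpark sketch of an extract-transform-load step from raw files in Azure Data Lake Storage to a curated Delta dataset. The storage account, container, and column names are placeholders (not taken from any advert), and ADLS credentials are assumed to be configured on the cluster.

```python
# Illustrative only: a minimal PySpark ETL of the kind described in the listing above.
# Storage account, container, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("adls-etl-sketch").getOrCreate()

# Extract: raw CSV files landed in a hypothetical ADLS Gen2 "raw" container.
raw = spark.read.option("header", True).csv(
    "abfss://raw@examplestorage.dfs.core.windows.net/sales/"
)

# Transform: light deduplication and typing before curation.
clean = (
    raw.dropDuplicates()
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
)

# Load: write Delta into a curated zone that lakehouse tables can build on.
(clean.write.format("delta")
      .mode("overwrite")
      .save("abfss://curated@examplestorage.dfs.core.windows.net/sales/"))
```

In practice a job like this would typically be scheduled as a Databricks workflow or triggered from Azure Data Factory, as several of the adverts mention.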

Databricks Data Architect x2 - UK Wide (Hybrid Working)

Nationwide, United Kingdom
Hybrid/Remote Options
Adecco
an Azure and Databricks focus, you will be an integral part of our team dedicated to building scalable and secure data platforms. You will leverage your expertise in Databricks, Apache Spark, and Azure to design, develop, and implement data warehouses, data lakehouses, and AI/ML models that fuel our data-driven operations. Duties: Design and build high … performance data platforms: Utilize Databricks and Apache Spark to extract, transform, and load data into Azure Data Lake Storage and other Azure services. Design and oversee the delivery of secure data warehouses and data lakehouses: Implement data models, data quality checks, and governance practices to ensure reliable and accurate data. Ability to design, build and deploy AI/… to ensure successful data platform implementations. Your Skills and Experience: Solid experience as a Data Architect with experience in designing, developing and implementing Databricks solutions. Proven expertise in Databricks, Apache Spark, and data platforms with a strong understanding of data warehousing concepts and practices. Experience with Microsoft Azure cloud platform, including Azure Data Lake Storage, Databricks, and Azure …
Employment Type: Permanent
Salary: £80000 - £90000/annum + Benefits
Posted:

Data Architect

Basildon, Essex, UK
Coforge
technologies – Azure, AWS, GCP, Snowflake, Databricks. Must Have: Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs … skills. A minimum of 5 years' experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable Skills: Designing Databricks-based …
Employment Type: Full-time
Posted:

Data Architect

Chelmsford, Essex, UK
Coforge
technologies – Azure, AWS, GCP, Snowflake, Databricks. Must Have: Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs … skills. A minimum of 5 years' experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable Skills: Designing Databricks-based …
Employment Type: Full-time
Posted:

Data Engineer

City of London, London, England, United Kingdom
Equiniti
in Microsoft Fabric and Databricks, including data pipeline development, data warehousing, and data lake management. Proficiency in Python, SQL, Scala, or Java. Experience with data processing frameworks such as Apache Spark, Apache Beam, or Azure Data Factory. Strong understanding of data architecture principles, data modelling, and data governance. Experience with cloud-based data platforms, including Azure and …
Employment Type: Full-Time
Salary: Competitive salary
Posted:

Senior / Lead Data Engineer

London, UK
Sahaj Software
while staying close to the code. Perfect if you want scope for growth without going "post-technical." What you'll do: Design and build modern data platforms using Databricks, Apache Spark, Snowflake, and cloud-native services (AWS, Azure, or GCP). Develop robust pipelines for real-time and batch data ingestion from diverse and complex sources. Model and … for: Solid experience as a Senior/Lead Data Engineer in complex enterprise environments. Strong coding skills in Python (Scala or functional languages a plus). Expertise with Databricks, Apache Spark, and Snowflake (HDFS/HBase also useful). Experience integrating large, messy datasets into reliable, scalable data products. Strong understanding of data modelling, orchestration, and automation. Hands …
Employment Type: Full-time
Posted:

Senior / Lead Data Engineer

Slough, Berkshire, UK
Sahaj Software
while staying close to the code. Perfect if you want scope for growth without going "post-technical." What you'll do: Design and build modern data platforms using Databricks, Apache Spark, Snowflake, and cloud-native services (AWS, Azure, or GCP). Develop robust pipelines for real-time and batch data ingestion from diverse and complex sources. Model and … for: Solid experience as a Senior/Lead Data Engineer in complex enterprise environments. Strong coding skills in Python (Scala or functional languages a plus). Expertise with Databricks, Apache Spark, and Snowflake (HDFS/HBase also useful). Experience integrating large, messy datasets into reliable, scalable data products. Strong understanding of data modelling, orchestration, and automation. Hands …
Employment Type: Full-time
Posted:

Sr. Databricks Engineer (AWS)

Glasgow, Lanarkshire, Scotland, United Kingdom
eTeam Inc
optimizing scalable data solutions using the Databricks platform. Key Responsibilities: Lead the migration of existing AWS-based data pipelines to Databricks. Design and implement scalable data engineering solutions using Apache Spark on Databricks. Collaborate with cross-functional teams to understand data requirements and translate them into efficient pipelines. Optimize performance and cost-efficiency of Databricks workloads. Develop and … best practices for data governance, security, and access control within Databricks. Provide technical mentorship and guidance to junior engineers. Must-Have Skills: Strong hands-on experience with Databricks and Apache Spark (preferably PySpark). Proven track record of building and optimizing data pipelines in cloud environments. Experience with AWS services such as S3, Glue, Lambda, Step Functions, Athena …
Employment Type: Contractor
Rate: £350 - £400 per day
Posted:
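As a rough illustration of the AWS-to-Databricks migration work this listing describes, below is a minimal, hypothetical PySpark job that reads Parquet produced by an earlier Glue pipeline from S3 and lands an aggregated Delta table in Databricks. The bucket, schema, table, and column names are invented for the example, and S3 access plus an existing "analytics" schema are assumed.

```python
# Illustrative only: a minimal PySpark job of the kind such a migration produces.
# Bucket, schema, table, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-to-databricks-sketch").getOrCreate()

# Read Parquet previously written by a hypothetical AWS Glue job.
events = spark.read.parquet("s3://example-bucket/events/")

# A simple daily aggregation standing in for the original pipeline logic.
daily = (
    events.withColumn("event_date", F.to_date("event_timestamp"))
          .groupBy("event_date")
          .agg(F.count("*").alias("event_count"))
)

# Persist as a Delta table for downstream Databricks workloads.
daily.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_events")
```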

Senior Engineer

London, South East England, United Kingdom
Hong Kong Exchanges and Clearing
Required: Bachelor's degree in Computer Science, Software Engineering, Data Science, or a closely related field. Advantageous: Certifications or substantial hands-on experience with modern data pipeline tools (e.g., Apache Airflow, Spark, Kafka, dbt, or similar). Desirable: Familiarity with financial services regulatory frameworks (e.g., MiFID II, GDPR, SOX) and best practices for data governance. Required Knowledge and … Engineering: Hands-on experience with Java (Spring Boot), React, and Python, covering backend, frontend, and data engineering. Data Engineering Tools: Proficient with modern data engineering and analytics platforms (e.g., Apache Airflow, Spark, Kafka, dbt, Snowflake, or similar). DevOps & Cloud: Experience with containerisation (Docker, Kubernetes), CI/CD pipelines, and cloud platforms (e.g., AWS, Azure, GCP) is highly …
Posted:

Data & Analytics Practice: Data Architect role - Junior level

London, South East England, United Kingdom
Infosys Consulting - Europe
modelling tools, data warehousing, ETL processes, and data integration techniques. · Experience with at least one cloud data platform (e.g. AWS, Azure, Google Cloud) and big data technologies (e.g., Hadoop, Spark). · Strong knowledge of data workflow solutions like Azure Data Factory, Apache NiFi, Apache Airflow, etc. · Good knowledge of stream and batch processing solutions like Apache Flink, Apache Kafka. · Good knowledge of log management, monitoring, and analytics solutions like Splunk, Elastic Stack, New Relic, etc. Given that this is just a short snapshot of the role we encourage you to apply even if you don't meet all the requirements listed above. We are looking for individuals who strive to make an impact …
Posted:

Senior Data Engineer

Cardiff, South Glamorgan, Wales, United Kingdom
Yolk Recruitment
understanding of data modelling, warehousing, and performance optimisation. Proven experience with cloud platforms (AWS, Azure, or GCP) and their data services. Hands-on experience with big data frameworks (e.g. Apache Spark, Hadoop). Strong knowledge of data governance, security, and compliance. Ability to lead technical projects and mentor junior engineers. Excellent problem-solving skills and experience in agile …
Employment Type: Permanent
Salary: £75,000
Posted:

Senior Data Engineer

London (City of London), South East England, United Kingdom
Hybrid/Remote Options
Atreides Caseri Inc
platform components. Big Data Architecture: Build and maintain big data architectures and data pipelines to efficiently process large volumes of geospatial and sensor data. Leverage technologies such as Hadoop, Apache Spark, and Kafka to ensure scalability, fault tolerance, and speed. Geospatial Data Integration: Develop systems that integrate geospatial data from a variety of sources (e.g., satellite imagery, remote … driven applications. Familiarity with geospatial data formats (e.g., GeoJSON, Shapefiles, KML) and tools (e.g., PostGIS, GDAL, GeoServer). Technical Skills: Expertise in big data frameworks and technologies (e.g., Hadoop, Spark, Kafka, Flink) for processing large datasets. Proficiency in programming languages such as Python, Java, or Scala, with a focus on big data frameworks and APIs. Experience with cloud services … Science, or related field. Experience with data visualization tools and libraries (e.g., Tableau, Mapbox, Leaflet) for displaying geospatial insights and analytics. Familiarity with real-time stream processing frameworks (e.g., Apache Flink, Kafka Streams). Experience with geospatial data processing libraries (e.g., GDAL, Shapely, Fiona). Background in defense, national security, or environmental monitoring applications is a plus. Compensation and …
Posted:

Data Engineer

Cardiff, South Glamorgan, Wales, United Kingdom
Yolk Recruitment
pipelines and ETL processes. Proficiency in Python. Experience with cloud platforms (AWS, Azure, or GCP). Knowledge of data modelling, warehousing, and optimisation. Familiarity with big data frameworks (e.g. Apache Spark, Hadoop). Understanding of data governance, security, and compliance best practices. Strong problem-solving skills and experience working in agile environments. Desirable: Experience with Docker/Kubernetes …
Employment Type: Permanent
Salary: £50,000
Posted:

Data Pipeline Engineer - On-prem to AWS Migration

Bristol, Avon, England, United Kingdom
Pontoon
Azure, or GCP, with hands-on experience in cloud-based data services. Proficiency in SQL and Python for data manipulation and transformation. Experience with modern data engineering tools, including Apache Spark, Kafka, and Airflow. Strong understanding of data modelling, schema design, and data warehousing concepts. Familiarity with data governance, privacy, and compliance frameworks (e.g., GDPR, ISO27001). Hands …
Employment Type: Contractor
Rate: £700 - £750 per day
Posted:

Senior Software Engineer, Data, Platform - Enterprise Engineering

Manchester, Lancashire, United Kingdom
Roku, Inc
s expertise spans a wide range of technologies, including Java- and Python-based microservices, Data Platform services, AWS/GCP cloud backend systems, Big Data technologies like Hive and Spark, and modern Web applications. With a globally distributed presence across the US, India and Europe, the team thrives on collaboration, bringing together diverse perspectives to solve complex challenges. At … skills. We're excited if you have 7+ years of experience delivering multi-tier, highly scalable, distributed web applications. Experience working with distributed computing frameworks: Hive/Hadoop, Apache Spark, Kafka, Airflow. Working with programming languages Python, Java, SQL. Working on building ETL (Extraction, Transformation and Loading) solutions using PySpark. Experience in SQL/NoSQL database design …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Senior Data Engineer

London, South East England, United Kingdom
Solirius Reply
have framework experience within either Flask, Tornado or Django; Docker. Experience working with ETL pipelines is desirable, e.g. Luigi, Airflow or Argo. Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving data quality. Preparing data for predictive and prescriptive modelling. Hands-on coding experience, such as …
Posted:

Software Engineer, Data Products

London, South East England, United Kingdom
Hybrid/Remote Options
Yapily
systems. API & Microservices Architecture: Comfortable working with REST APIs and microservices architectures. Real-time Stream Processing: Understanding of real-time stream processing frameworks (e.g., PubSub, Kafka, Flink, Spark Streaming). BI Tools & Visualisation Platforms: Experience supporting BI tools or visualisation platforms (e.g. Looker, Grafana, Power BI, etc.). Data Pipelines & APIs: Experience in building and maintaining both batch …
Posted:

Lead Data Engineer

London, United Kingdom
Hybrid/Remote Options
CV Technical
platform. Candidate Profile: Proven experience as a Data Engineer, with strong expertise in designing and managing large-scale data systems. Hands-on proficiency with modern data technologies such as Spark, Kafka, Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. …
Employment Type: Permanent
Salary: £85000 - £100000/annum
Posted:

Senior Software Engineer

London, South East England, United Kingdom
Hybrid/Remote Options
LocalStack
on experience with cloud data platforms such as Snowflake, Redshift, Athena, or BigQuery, including optimization techniques and custom parsers/transpilers. Practical knowledge of distributed and analytical engines (e.g., Apache Spark, Trino, PostgreSQL, DuckDB) with skills in query engines, performance tuning, and integration in local and production environments. Experience building developer tooling such as CLI tools, SDKs, and …
Posted:

Databricks Engineer - £400PD - Remote

London, South East, England, United Kingdom
Hybrid/Remote Options
Tenth Revolution Group
pipelines. Understanding of data modelling, data warehousing concepts, and distributed computing. Familiarity with CI/CD, version control, and DevOps practices. Nice-to-Have: Experience with streaming technologies (e.g., Spark Structured Streaming, Event Hub, Kafka). Knowledge of MLflow, Unity Catalog, or advanced Databricks features. Exposure to Terraform or other IaC tools. Experience working in Agile/Scrum environments.
Employment Type: Contractor
Rate: £350 - £400 per day
Posted:

Databricks Engineer - £400PD - Remote

London, United Kingdom
Hybrid/Remote Options
Tenth Revolution Group
pipelines. Understanding of data modelling, data warehousing concepts, and distributed computing. Familiarity with CI/CD, version control, and DevOps practices. Nice-to-Have: Experience with streaming technologies (e.g., Spark Structured Streaming, Event Hub, Kafka). Knowledge of MLflow, Unity Catalog, or advanced Databricks features. Exposure to Terraform or other IaC tools. Experience working in Agile/Scrum environments.
Employment Type: Contract
Rate: £350 - £400/day
Posted:

Data Engineer

Sheffield, South Yorkshire, England, United Kingdom
Hybrid/Remote Options
DCS Recruitment
principles. Experience working with cloud platforms such as AWS, Azure, or GCP. Exposure to modern data tools such as Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (e.g., Kafka, Spark Streaming, Flink) is an advantage. Experience with orchestration and infrastructure tools such as Airflow, dbt, Prefect, CI/CD pipelines, and Terraform. What you get in return: Up to …
Employment Type: Full-Time
Salary: £50,000 - £60,000 per annum
Posted:

Senior Data Scientist

Manchester, Lancashire, United Kingdom
Hybrid/Remote Options
CHEP UK Ltd
plus work experience; BS & 5+ years of work experience; MS & 4+ years of work experience. Proficient with machine learning and statistics. Proficient with Python, deep learning frameworks, Computer Vision, Spark. Have produced production-level algorithms. Proficient in researching, developing, synthesizing new algorithms and techniques. Excellent communication skills. Desirable Qualifications: Master's or PhD level degree; 5+ years of work …
Employment Type: Permanent
Salary: GBP Annual
Posted:
Apache Spark
10th Percentile: £56,000
25th Percentile: £65,000
Median: £81,000
75th Percentile: £116,250
90th Percentile: £140,000