Apache Spark Jobs in the South East

26 to 50 of 117 Apache Spark Jobs in the South East

Data Engineer

Slough, South East England, United Kingdom
Fimador
scalable pipelines, data platforms, and integrations, while ensuring solutions meet regulatory standards and align with architectural best practices. Key Responsibilities: Build and optimise scalable data pipelines using Databricks and Apache Spark (PySpark). Ensure performance, scalability, and compliance (GxP and other standards). Collaborate on requirements, design, and backlog refinement. Promote engineering best practices including CI/CD … experience: Experience with efficient, reliable data pipelines that improve time-to-insight. Knowledge of secure, auditable, and compliant data workflows. Know-how in optimising performance and reducing costs through Spark and Databricks tuning. Ability to create reusable, well-documented tools enabling collaboration across teams. A culture of engineering excellence driven by mentoring and high-quality practices. Preferred Experience … Databricks in a SaaS environment, Spark, Python, and database technologies. Event-driven and distributed systems (Kafka, AWS SNS/SQS, Java, Python). Data Governance, Data Lakehouse/Data Intelligence platforms. AI software delivery and AI data preparation. More ❯
Posted:
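The Fimador listing above centres on batch pipelines built with Databricks and PySpark, with compliance-grade quality checks. Purely as an illustrative sketch (the paths, table name and column names below are invented, not taken from the advert), a pipeline of that shape might look like this:

```python
from pyspark.sql import SparkSession, functions as F

# All names below are hypothetical: a raw landing path, a curated Delta table,
# and an orders-style schema invented for the example.
SOURCE_PATH = "/mnt/raw/orders"
TARGET_TABLE = "analytics.orders_daily"

spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

# Ingest the raw layer (Parquet assumed for the sketch).
raw = spark.read.parquet(SOURCE_PATH)

# Basic quality gate: drop rows missing the key, negative amounts, and duplicates.
clean = (
    raw.filter(F.col("order_id").isNotNull())
       .filter(F.col("amount") >= 0)
       .dropDuplicates(["order_id"])
)

# Aggregate to a daily grain before publishing.
daily = (
    clean.groupBy(F.to_date("order_ts").alias("order_date"))
         .agg(F.count("*").alias("orders"), F.sum("amount").alias("revenue"))
)

# Delta output works on Databricks or wherever the delta-spark package is configured.
daily.write.format("delta").mode("overwrite").saveAsTable(TARGET_TABLE)
```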

Data Engineer

London (City of London), South East England, United Kingdom
Fimador
scalable pipelines, data platforms, and integrations, while ensuring solutions meet regulatory standards and align with architectural best practices. Key Responsibilities: Build and optimise scalable data pipelines using Databricks and Apache Spark (PySpark). Ensure performance, scalability, and compliance (GxP and other standards). Collaborate on requirements, design, and backlog refinement. Promote engineering best practices including CI/CD … experience: Experience with efficient, reliable data pipelines that improve time-to-insight. Knowledge of secure, auditable, and compliant data workflows. Know-how in optimising performance and reducing costs through Spark and Databricks tuning. Ability to create reusable, well-documented tools enabling collaboration across teams. A culture of engineering excellence driven by mentoring and high-quality practices. Preferred Experience … Databricks in a SaaS environment, Spark, Python, and database technologies. Event-driven and distributed systems (Kafka, AWS SNS/SQS, Java, Python). Data Governance, Data Lakehouse/Data Intelligence platforms. AI software delivery and AI data preparation. More ❯
Posted:

Data Engineer

Stroud, South East England, United Kingdom
Hybrid / WFH Options
Ecotricity
for best practice and technical excellence and be a person that actively looks for continual improvement opportunities. Knowledge and skills: Experience as a Data Engineer or Analyst; Databricks/Apache Spark; SQL/Python; BitBucket/GitHub. Advantageous: dbt, AWS, Azure DevOps, Terraform, Atlassian (Jira, Confluence). About Us: What's in it for you... Healthcare plan, life assurance More ❯
Posted:
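The Ecotricity role pairs Databricks/Apache Spark with SQL and Python. A minimal sketch of mixing the two APIs, with made-up meter-reading rows standing in for any real source, could be:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("meter-readings").getOrCreate()

# Invented demo rows standing in for a real source table.
readings = spark.createDataFrame(
    [("MPAN001", "2024-01-01", 12.4),
     ("MPAN001", "2024-01-02", 11.9),
     ("MPAN002", "2024-01-01", 30.1)],
    ["meter_id", "reading_date", "kwh"],
)

# Register the DataFrame so it can be queried with Spark SQL as well as Python.
readings.createOrReplaceTempView("readings")

monthly = spark.sql("""
    SELECT meter_id,
           date_trunc('month', to_date(reading_date)) AS month,
           SUM(kwh) AS total_kwh
    FROM readings
    GROUP BY meter_id, date_trunc('month', to_date(reading_date))
""")

monthly.show()
```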

Machine Learning Engineer

Oxfordshire, England, United Kingdom
Llama Recruitment Solutions
technical teams and stakeholders. Effective problem-solver who takes initiative in complex production settings. Experience with scientific computing, deep learning, big data, or health IT ontologies (e.g., PyTorch, JAX, Spark, HL7, FHIR) (desirable). Familiarity with cloud infrastructure (Azure/AWS), infrastructure as code, Kubernetes, Linux, Docker, data pipelines, and MLOps tools (desirable). Passion for biomedical topics and startup experience More ❯
Posted:

Machine Learning Engineer

Oxford District, South East England, United Kingdom
Llama Recruitment Solutions
technical teams and stakeholders. Effective problem-solver who takes initiative in complex production settings. Experience with scientific computing, deep learning, big data, or health IT ontologies (e.g., PyTorch, JAX, Spark, HL7, FHIR) (desirable). Familiarity with cloud infrastructure (Azure/AWS), infrastructure as code, Kubernetes, Linux, Docker, data pipelines, and MLOps tools (desirable). Passion for biomedical topics and startup experience More ❯
Posted:

Senior Data Engineer GCP - Finance

London, South East England, United Kingdom
Hybrid / WFH Options
Client Server
GCP including BigQuery, Pub/Sub, Cloud Composer and IAM. You have strong Python, SQL and PySpark skills. You have experience with real-time data streaming using Kafka or Spark. You have a good knowledge of Data Lakes, Data Warehousing, Data Modelling. You're familiar with DevOps principles, containerisation and CI/CD tools such as Jenkins or GitHub More ❯
Posted:
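The Client Server role asks for real-time streaming with Kafka or Spark. A minimal Spark Structured Streaming sketch reading from Kafka is shown below; the broker address, topic name and message schema are placeholders, and Spark's Kafka integration package must be on the classpath:

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

# Requires Spark's Kafka integration package (spark-sql-kafka) on the classpath;
# broker address, topic name and message schema are placeholders.
spark = SparkSession.builder.appName("payments-stream").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_ts", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "payments")
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

# One-minute revenue windows, tolerating five minutes of late-arriving data.
per_minute = (
    events.withWatermark("event_ts", "5 minutes")
          .groupBy(F.window("event_ts", "1 minute"))
          .agg(F.sum("amount").alias("total"))
)

# Console sink keeps the sketch self-contained; a real job would write to a table or topic.
query = per_minute.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```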

Senior Data Engineer GCP - Finance

Slough, South East England, United Kingdom
Hybrid / WFH Options
Client Server
GCP including BigQuery, Pub/Sub, Cloud Composer and IAM. You have strong Python, SQL and PySpark skills. You have experience with real-time data streaming using Kafka or Spark. You have a good knowledge of Data Lakes, Data Warehousing, Data Modelling. You're familiar with DevOps principles, containerisation and CI/CD tools such as Jenkins or GitHub More ❯
Posted:

Senior Data Engineer - Central Services

London, South East, England, United Kingdom
Norton Rose Fulbright LLP
data quality, or other areas directly relevant to data engineering responsibilities and tasks. Proven project experience developing and maintaining data warehouses in big data solutions (Snowflake). Expert knowledge in Apache technologies such as Kafka, Airflow, and Spark to build scalable and efficient data pipelines. Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize More ❯
Employment Type: Full-Time
Salary: Competitive salary
Posted:
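The Norton Rose Fulbright role combines Kafka, Airflow and Spark to feed a Snowflake warehouse. As a rough orchestration sketch only, an Airflow DAG with two placeholder tasks might look like the following; in practice each task would trigger a Spark job or a Snowflake load rather than print a message:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: a real task might consume a Kafka topic or call an API.
    print("extracting source data")


def load():
    # Placeholder: a real task might run a Spark job or COPY data into Snowflake.
    print("loading into the warehouse")


# DAG id and schedule are illustrative; `schedule` is the Airflow 2.4+ keyword
# (older 2.x releases use `schedule_interval`).
with DAG(
    dag_id="daily_reporting_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task
```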

Data Engineer SC Cleared

London, South East, England, United Kingdom
Sanderson
teams to build scalable data pipelines and contribute to digital transformation initiatives across government departments. Key Responsibilities: Design, develop and maintain robust data pipelines using PostgreSQL and Airflow or Apache Spark. Collaborate with frontend/backend developers using Node.js or React. Implement best practices in data modelling, ETL processes and performance optimisation. Contribute to containerised deployments (Docker/… within Agile teams and support DevOps practices. What We're Looking For: Proven experience as a Data Engineer in complex environments. Strong proficiency in PostgreSQL and either Airflow or Spark. Solid understanding of Node.js or React for integration and tooling. Familiarity with containerisation technologies (Docker/Kubernetes) is a plus. Excellent communication and stakeholder engagement skills. Experience working within More ❯
Employment Type: Contractor
Rate: Salary negotiable
Posted:
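The Sanderson role builds pipelines around PostgreSQL with Airflow or Spark. One hedged sketch of the Spark side, reading a PostgreSQL table over JDBC (connection details, table and column names are invented), is:

```python
from pyspark.sql import SparkSession

# The PostgreSQL JDBC driver must be on Spark's classpath, e.g. via
# spark-submit --packages org.postgresql:postgresql:42.7.3 (version illustrative).
spark = SparkSession.builder.appName("pg-extract").getOrCreate()

# Connection details, table and column names are placeholders; real credentials
# would come from a secret store rather than the job code.
cases = (
    spark.read.format("jdbc")
         .option("url", "jdbc:postgresql://db-host:5432/app")
         .option("dbtable", "public.cases")
         .option("user", "etl_user")
         .option("password", "***")
         .option("fetchsize", "10000")  # stream rows rather than buffering everything
         .load()
)

# Light transformation before landing in the analytics layer (path is hypothetical).
cases.filter("status = 'CLOSED'").write.mode("overwrite").parquet("/data/curated/closed_cases")
```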

Data Engineer

London, South East England, United Kingdom
Hybrid / WFH Options
Peaple Talent
a focus on having delivered in Microsoft Azure. Strong experience designing and delivering data solutions in Databricks. Proficient with SQL and Python. Experience using Big Data technologies such as Apache Spark or PySpark. Great communication skills, engaging effectively with Senior Stakeholders. Nice to haves: Azure Data Engineering certifications, Databricks certifications. What's in it for you: 📍Location: London More ❯
Posted:

Data Engineer

Slough, South East England, United Kingdom
Hybrid / WFH Options
Peaple Talent
a focus on having delivered in Microsoft Azure. Strong experience designing and delivering data solutions in Databricks. Proficient with SQL and Python. Experience using Big Data technologies such as Apache Spark or PySpark. Great communication skills, engaging effectively with Senior Stakeholders. Nice to haves: Azure Data Engineering certifications, Databricks certifications. What's in it for you: 📍Location: London More ❯
Posted:

Data Engineer

London (City of London), South East England, United Kingdom
Hybrid / WFH Options
Peaple Talent
a focus on having delivered in Microsoft Azure. Strong experience designing and delivering data solutions in Databricks. Proficient with SQL and Python. Experience using Big Data technologies such as Apache Spark or PySpark. Great communication skills, engaging effectively with Senior Stakeholders. Nice to haves: Azure Data Engineering certifications, Databricks certifications. What's in it for you: 📍Location: London More ❯
Posted:

Data Scientist

Uckfield, East Sussex, South East, United Kingdom
McGregor Boyall Associates Limited
Pydantic) for document processing, summarization, and clinical Q&A systems. Develop and optimize predictive models using scikit-learn, PyTorch, TensorFlow, and XGBoost. Design robust data pipelines using tools like Spark and Kafka for real-time and batch processing. Manage ML lifecycle with tools such as Databricks, MLflow, and cloud-native platforms (Azure preferred). Collaborate with engineering teams to More ❯
Employment Type: Permanent
Salary: £65,000
Posted:
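The McGregor Boyall role manages the ML lifecycle with MLflow alongside scikit-learn models. A small, self-contained sketch of experiment tracking, using synthetic data in place of any real clinical features, might be:

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic data stands in for real clinical features, which the advert does not describe.
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

with mlflow.start_run(run_name="baseline-gbm"):
    model = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05)
    model.fit(X_train, y_train)

    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

    # Log parameters, the headline metric and the fitted model so runs are comparable.
    mlflow.log_param("n_estimators", 200)
    mlflow.log_param("learning_rate", 0.05)
    mlflow.log_metric("roc_auc", auc)
    mlflow.sklearn.log_model(model, "model")
```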

Ab Initio Developer

Milton Keynes, South East England, United Kingdom
Experis
It, Express>It, Metadata Hub, and PDL. Hands-on experience with SQL, Unix/Linux shell scripting, and data warehouse concepts. Familiarity with big data ecosystems (Hadoop, Hive, Spark) and cloud platforms (AWS, Azure, GCP) is a plus. Proven ability to troubleshoot complex ETL jobs and resolve performance issues. Experience working with large-scale datasets and enterprise data More ❯
Posted:

Test Architect - Cloud Automation

London, South East England, United Kingdom
HCLTech
testing of ETL (extract, transform, load) processes and data warehousing. 3. Strong understanding of SQL for data querying and validation. 4. Knowledge of big data technologies such as Hadoop, Spark, or Kafka is a plus. 5. Familiarity with scripting languages like Python, Java, or shell scripting. 6. Excellent analytical and problem-solving skills with a keen attention to detail. More ❯
Posted:
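The HCLTech role centres on validating ETL output with SQL and big data tooling. A hedged, pytest-style sketch of reconciliation checks in PySpark (paths and column names are hypothetical) could look like this:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-validation").getOrCreate()

# Hypothetical staging and warehouse locations for the job under test.
source = spark.read.parquet("/data/staging/transactions")
target = spark.read.parquet("/data/warehouse/transactions")


def test_row_counts_match():
    # Completeness: the load should neither drop nor duplicate rows.
    assert source.count() == target.count()


def test_no_null_business_keys():
    # Integrity: every loaded row must carry its business key.
    assert target.filter(F.col("transaction_id").isNull()).count() == 0


def test_amount_totals_reconcile():
    # Accuracy: aggregate measures agree between layers within a small tolerance.
    src_total = source.agg(F.sum("amount")).first()[0]
    tgt_total = target.agg(F.sum("amount")).first()[0]
    assert abs(src_total - tgt_total) < 0.01
```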

Test Architect - Cloud Automation

Slough, South East England, United Kingdom
HCLTech
testing of ETL (extract, transform, load) processes and data warehousing. 3. Strong understanding of SQL for data querying and validation. 4. Knowledge of big data technologies such as Hadoop, Spark, or Kafka is a plus. 5. Familiarity with scripting languages like Python, Java, or shell scripting. 6. Excellent analytical and problem-solving skills with a keen attention to detail. More ❯
Posted:

Test Architect - Cloud Automation

London (City of London), South East England, United Kingdom
HCLTech
testing of ETL (extract, transform, load) processes and data warehousing. 3. Strong understanding of SQL for data querying and validation. 4. Knowledge of big data technologies such as Hadoop, Spark, or Kafka is a plus. 5. Familiarity with scripting languages like Python, Java, or shell scripting. 6. Excellent analytical and problem-solving skills with a keen attention to detail. More ❯
Posted:

Quantitative Developer

London, South East, England, United Kingdom
Robert Half
and technology teams. Exposure to low-latency or real-time systems. Experience with cloud infrastructure (AWS, GCP, or Azure). Familiarity with data engineering tools such as Kafka, Airflow, Spark, or Dask. Knowledge of equities, futures, or FX markets. Company: Rapidly growing hedge fund with offices globally, including London. Salary & Benefits: The salary range/rates of pay is dependent More ❯
Employment Type: Contractor
Rate: Salary negotiable
Posted:

Lead Data Engineer

London, South East, England, United Kingdom
Harnham - Data & Analytics Recruitment
on leadership and communication, ensuring all key builds and improvements flow through this individual. Working with a modern tech stack including AWS, Snowflake, Python, SQL, DBT, Airflow, Spark, Kafka, and Terraform, you'll drive automation and end-to-end data solutions that power meaningful insights. Ideal for ambitious, proactive talent from scale-up or start-up environments, this position More ❯
Employment Type: Full-Time
Salary: £90,000 - £115,000 per annum
Posted:

Senior Data Engineer

London, South East England, United Kingdom
Mastek
Databricks platform. Optimise data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure … Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and cost. Ability to program in languages such as SQL, Python, R, YAML and JavaScript. Data Integration: Integrate data from various sources, including relational databases, APIs, and … best practices. Essential Skills & Experience: 10+ years of experience in data engineering, with at least 3+ years of hands-on experience with Azure Databricks. Strong proficiency in Python and Spark (PySpark) or Scala. Deep understanding of data warehousing principles, data modelling techniques, and data integration patterns. Extensive experience with Azure data services, including Azure Data Factory, Azure Blob Storage More ❯
Posted:
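The Mastek role works with Delta Lake and Spark SQL on Azure Databricks. A minimal upsert sketch using the Delta Lake merge API is below; the table and path names are invented, and outside Databricks the delta-spark package would need to be configured:

```python
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

# On Databricks the Delta classes are available by default; elsewhere the
# delta-spark package and matching Spark configuration are required.
spark = SparkSession.builder.appName("delta-upsert").getOrCreate()

# Incremental change batch; the landing path and target table name are invented,
# and the target table is assumed to already exist in the metastore.
updates = spark.read.parquet("/mnt/landing/customers_incremental")
target = DeltaTable.forName(spark, "curated.customers")

# Upsert: update rows whose key already exists, insert the rest.
(
    target.alias("t")
          .merge(updates.alias("s"), "t.customer_id = s.customer_id")
          .whenMatchedUpdateAll()
          .whenNotMatchedInsertAll()
          .execute()
)
```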

Senior Data Engineer

Slough, South East England, United Kingdom
Mastek
Databricks platform. Optimise data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure … Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and cost. Ability to program in languages such as SQL, Python, R, YAML and JavaScript. Data Integration: Integrate data from various sources, including relational databases, APIs, and … best practices. Essential Skills & Experience: 10+ years of experience in data engineering, with at least 3+ years of hands-on experience with Azure Databricks. Strong proficiency in Python and Spark (PySpark) or Scala. Deep understanding of data warehousing principles, data modelling techniques, and data integration patterns. Extensive experience with Azure data services, including Azure Data Factory, Azure Blob Storage More ❯
Posted:

Senior Data Engineer

London (City of London), South East England, United Kingdom
Mastek
Databricks platform. Optimise data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure … Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and cost. Ability to program in languages such as SQL, Python, R, YAML and JavaScript. Data Integration: Integrate data from various sources, including relational databases, APIs, and … best practices. Essential Skills & Experience: 10+ years of experience in data engineering, with at least 3+ years of hands-on experience with Azure Databricks. Strong proficiency in Python and Spark (PySpark) or Scala. Deep understanding of data warehousing principles, data modelling techniques, and data integration patterns. Extensive experience with Azure data services, including Azure Data Factory, Azure Blob Storage More ❯
Posted:

GCP Data Engineer

London, South East England, United Kingdom
Hybrid / WFH Options
Peaple Talent
delivered solutions in Google Cloud Platform (GCP). Strong experience designing and delivering data solutions using BigQuery. Proficient in SQL and Python. Experience working with Big Data technologies such as Apache Spark or PySpark. Excellent communication skills, with the ability to engage effectively with senior stakeholders. Nice to haves: GCP Data Engineering certifications, BigQuery or other GCP tool certifications More ❯
Posted:
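The Peaple Talent GCP role combines BigQuery with PySpark. Assuming the spark-bigquery connector is available (it is bundled with Dataproc), a rough read-aggregate-write sketch with placeholder project, dataset and bucket names might be:

```python
from pyspark.sql import SparkSession, functions as F

# Assumes the spark-bigquery connector is on the cluster (bundled with Dataproc);
# project, dataset, table and bucket names are placeholders.
spark = SparkSession.builder.appName("bq-daily-revenue").getOrCreate()

orders = (
    spark.read.format("bigquery")
         .option("table", "my-project.sales.orders")
         .load()
)

daily = (
    orders.groupBy(F.to_date("created_at").alias("order_date"))
          .agg(F.sum("total").alias("revenue"))
)

# Writing back to BigQuery via the connector's indirect path needs a GCS staging bucket.
(
    daily.write.format("bigquery")
         .option("table", "my-project.analytics.daily_revenue")
         .option("temporaryGcsBucket", "my-staging-bucket")
         .mode("overwrite")
         .save()
)
```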

GCP Data Engineer

London (City of London), South East England, United Kingdom
Hybrid / WFH Options
Peaple Talent
delivered solutions in Google Cloud Platform (GCP). Strong experience designing and delivering data solutions using BigQuery. Proficient in SQL and Python. Experience working with Big Data technologies such as Apache Spark or PySpark. Excellent communication skills, with the ability to engage effectively with senior stakeholders. Nice to haves: GCP Data Engineering certifications, BigQuery or other GCP tool certifications More ❯
Posted:

GCP Data Engineer

Slough, South East England, United Kingdom
Hybrid / WFH Options
Peaple Talent
delivered solutions in Google Cloud Platform (GCP). Strong experience designing and delivering data solutions using BigQuery. Proficient in SQL and Python. Experience working with Big Data technologies such as Apache Spark or PySpark. Excellent communication skills, with the ability to engage effectively with senior stakeholders. Nice to haves: GCP Data Engineering certifications, BigQuery or other GCP tool certifications More ❯
Posted:
Apache Spark salary benchmarks in the South East:
25th Percentile: £61,250
Median: £65,000
75th Percentile: £73,250
90th Percentile: £78,800