PySpark Jobs in London

1 to 25 of 82 PySpark Jobs in London

Data Engineer

London, United Kingdom
Sandtech
Europe, the UK and the US. ABOUT THE ROLE Sand Technologies focuses on cutting-edge cloud-based data projects, leveraging tools such as Databricks, DBT, Docker, Python, SQL, and PySpark to name a few. We work across a variety of data architectures such as Data Mesh, lakehouse, data vault and data warehouses. Our data engineers create pipelines that support More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:

Senior Data Engineer

London, United Kingdom
Sandtech
Europe, the UK and the US. ABOUT THE ROLE Sand Technologies focuses on cutting-edge cloud-based data projects, leveraging tools such as Databricks, DBT, Docker, Python, SQL, and PySpark to name a few. We work across a variety of data architectures such as Data Mesh, lakehouse, data vault and data warehouses. Our data engineers create pipelines that support More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:

Senior Data Engineer

London, United Kingdom
Hybrid / WFH Options
Scott Logic Ltd
data engineering and reporting. Including storage, data pipelines to ingest and transform data, and querying & reporting of analytical data. You've worked with technologies such as Python, Spark, SQL, PySpark, Power BI etc. You're a problem-solver, pragmatically exploring options and finding effective solutions. An understanding of how to design and build well-structured, maintainable systems. Strong communication skills More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:

Senior Data Engineer

London, United Kingdom
Mars, Incorporated and its Affiliates
alignment and shared value creation. As a Data Engineer in the Commercial team, your key responsibilities are as follows: 1. Technical Proficiency: Collaborate in hands-on development using Python, PySpark, and other relevant technologies to create and maintain data assets and reports for business insights. Assist in engineering and managing data models and pipelines within a cloud environment, utilizing … technologies like Databricks, Spark, Delta Lake, and SQL. Contribute to the maintenance and enhancement of our progressive tech stack, which includes Python, PySpark, Logic Apps, Azure Functions, ADLS, Django, and ReactJs. Support the implementation of DevOps and CI/CD methodologies to foster agile collaboration and contribute to building robust data solutions. Develop code that adheres to high-quality … ideas to improve platform excellence. More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:

Senior Data Engineer Global Trading Technology Firm

London, United Kingdom
Out in Science, Technology, Engineering, and Mathematics
data governance processes. Requirements: 5+ years of experience in data engineering, with a strong focus on building scalable data platforms. Proficiency in Python and modern data libraries (e.g. Pandas, PySpark, Dask). Strong SQL skills and experience with cloud-native data tools (AWS, GCP, or Azure). Hands-on experience with tools like Airflow, Spark, Kafka, or Snowflake. Experience More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:

Data Engineer

London, United Kingdom
Harvey Nash Group
across the team. Skills & Experience Hands-on experience with Azure Databricks, Delta Lake, Data Factory, and Synapse. Strong understanding of Lakehouse architecture and medallion design patterns. Proficient in Python, PySpark, and SQL, with advanced query optimisation skills. Proven experience building scalable ETL pipelines and managing data transformations. Familiarity with data quality frameworks and monitoring tools. Experience working with Git More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:
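Several listings above (Harvey Nash, McGregor Boyall) mention Lakehouse architecture and medallion design patterns: bronze (raw), silver (cleaned), gold (business aggregates). A toy illustration of the layering, using pandas DataFrames as stand-ins for what would be Delta tables in Databricks (all table and column values are hypothetical):

```python
import pandas as pd

# Bronze: raw data landed as-is, including duplicates and bad records.
bronze = pd.DataFrame({
    "order_id": [1, 1, 2, 3],
    "customer": ["a", "a", "b", None],
    "amount": [10.0, 10.0, 25.0, 5.0],
})

# Silver: cleaned and conformed — deduplicated, invalid rows dropped.
silver = bronze.drop_duplicates(subset="order_id").dropna(subset=["customer"])

# Gold: business-level aggregate ready for reporting.
gold = silver.groupby("customer", as_index=False)["amount"].sum()
```

The value of the pattern is that each layer has a clear contract: bronze is replayable raw history, silver is the trusted source for analysts, and gold tables feed dashboards directly.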

Data Engineer

London, United Kingdom
Hybrid / WFH Options
Veeva Systems, Inc
cooperation with our data science team Experiment in your domain to improve precision, recall, or cost savings Requirements Expert skills in Java or Python Experience with Apache Spark or PySpark Experience writing software for the cloud (AWS or GCP) Speaking and writing in English enables you to take part in day-to-day conversations in the team and contribute More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:

Data Engineer Cyprus, Remote, United Kingdom, Remote

London, United Kingdom
Hybrid / WFH Options
Ocean Finance
Head of Data Platform and Services, you'll not only maintain and optimize our data infrastructure but also spearhead its evolution. Built predominantly on Databricks, and utilizing technologies like PySpark and Delta Lake, our infrastructure is designed for scalability, robustness, and efficiency. You'll take charge of developing sophisticated data integrations with various advertising platforms, empowering our teams with … and informed decision-making What you'll be doing for us Leadership in Design and Development : Lead in the architecture, development, and upkeep of our Databricks-based infrastructure, harnessing PySpark and Delta Lake. CI/CD Pipeline Mastery : Create and manage CI/CD pipelines, ensuring automated deployments and system health monitoring. Advanced Data Integration : Develop sophisticated strategies for … standards. Data-Driven Culture Champion : Advocate for the strategic use of data across the organization. Skills-wise, you'll definitely: Expertise in Apache Spark Advanced proficiency in Python and PySpark Extensive experience with Databricks Advanced SQL knowledge Proven leadership abilities in data engineering Strong experience in building and managing CI/CD pipelines. Experience in implementing data integrations with More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:
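The Ocean Finance role centres on Databricks with PySpark and Delta Lake, where incremental loads are typically expressed as a Delta `MERGE INTO` (upsert). A small helper that assembles such a statement; the table and column names are hypothetical, and in Databricks the resulting string would be executed via `spark.sql(...)`:

```python
def build_merge_sql(target: str, source: str, keys: list[str], cols: list[str]) -> str:
    """Assemble a Delta Lake MERGE INTO statement for an upsert.

    `target`/`source` are table names, `keys` the join columns,
    `cols` the non-key columns to update/insert.
    """
    on = " AND ".join(f"t.{k} = s.{k}" for k in keys)
    updates = ", ".join(f"t.{c} = s.{c}" for c in cols)
    all_cols = keys + cols
    insert_cols = ", ".join(all_cols)
    insert_vals = ", ".join(f"s.{c}" for c in all_cols)
    return (
        f"MERGE INTO {target} t USING {source} s ON {on} "
        f"WHEN MATCHED THEN UPDATE SET {updates} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )

# Hypothetical usage: upsert a staging load into a silver table.
sql = build_merge_sql("silver.orders", "staging.orders", ["order_id"], ["amount", "status"])
```

Delta Lake's `DeltaTable.merge` Python API expresses the same operation programmatically; the SQL form is shown here because it is self-contained.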

Data Software Engineer Python PySpark

City of London, London, United Kingdom
Hybrid / WFH Options
Client Server
Data Software Engineer (Python PySpark) Remote UK to £95k Are you a data savvy Software Engineer with strong Python coding skills? You could be progressing your career in a senior, hands-on Data Software Engineer role as part of a friendly and supportive international team at a growing and hugely successful European car insurance tech company as they expand … on your location/preferences. About you: You are degree educated in a relevant discipline, e.g. Computer Science, Mathematics You have a software engineering background with advanced Python and PySpark coding skills You have experience in batch, distributed data processing and near real-time streaming data pipelines with technologies such as Kafka You have experience of Big Data Analytics More ❯
Employment Type: Permanent, Work From Home
Salary: £95,000
Posted:
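The Client Server role asks for both batch processing and near real-time streaming pipelines over technologies such as Kafka. One common design is to keep the per-message parsing logic pure, so the same function serves a batch backfill and a stream consumer alike; a sketch with a hypothetical message shape:

```python
import json
from typing import Optional

def parse_event(raw: bytes) -> Optional[dict]:
    """Parse and validate one Kafka message payload.

    Returns a normalised record, or None for malformed/incomplete
    messages (which a real pipeline would route to a dead-letter topic).
    """
    try:
        event = json.loads(raw)
    except (json.JSONDecodeError, UnicodeDecodeError):
        return None
    if "user_id" not in event or "event_type" not in event:
        return None
    return {"user_id": str(event["user_id"]), "event_type": event["event_type"]}

# Batch usage: map over a list of payloads, dropping bad records.
payloads = [b'{"user_id": 7, "event_type": "click"}', b"not json", b'{"user_id": 8}']
records = [r for r in (parse_event(p) for p in payloads) if r is not None]
```

In the streaming case the same `parse_event` would be applied per message inside a consumer loop (or wrapped in a UDF for Spark Structured Streaming), so the validation rules stay identical across both paths.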

Data Engineer

London, South East, England, United Kingdom
Hybrid / WFH Options
McGregor Boyall
to performance optimisation and cost efficiency across data solutions. Required Skills & Experience: Proven hands-on experience with Azure Databricks, Data Factory, Delta Lake, and Synapse. Strong proficiency in Python, PySpark, and advanced SQL. Understanding of Lakehouse architecture and medallion data patterns. Familiarity with data governance, lineage, and access control tools. Experience in Agile environments, with solid CI/CD More ❯
Employment Type: Contractor
Rate: £400 - £425 per day
Posted:

Machine Learning Engineer with Data Engineering expertise (London)

London, UK
ZipRecruiter
in both data engineering and machine learning, with a strong portfolio of relevant projects. Proficiency in Python with libraries like TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL, and data modeling. Familiarity with More ❯
Employment Type: Full-time
Posted:

Data Engineering Associate

London, United Kingdom
Metyis AG
experience with Azure services such as Data Factory, Databricks, Synapse (DWH), Azure Functions, and other data analytics tools, including streaming. Experience with Airflow and Kubernetes. Programming skills in Python (PySpark) and scripting languages like Bash. Knowledge of Git, CI/CD operations, and Docker. Basic PowerBI knowledge is a plus. Experience deploying cloud infrastructure is desirable. Understanding of Infrastructure More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:

Senior Data Engineer

London, United Kingdom
Hybrid / WFH Options
LEGO Gruppe
infrastructure Excellent communication and collaboration skills Experience working with Git, practicing code reviews and branching strategies, CI/CD and testing in software solutions Proficiency in SQL, Python, and PySpark Ability to translate marketing needs into well-structured data products Deep understanding of data modeling concepts and building scalable data marts Basic experience with frontend technologies is a plus More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:

Data Engineer

London, United Kingdom
Hybrid / WFH Options
Automobile Association
and root cause analysis. Following agreed architectural standards and contributing to their continuous improvement. What do I need? Proficiency in Azure and its data related services. Strong SQL and PySpark skills, with a focus on writing efficient, readable, modular code. Experience of development on modern cloud data platforms (e.g. Databricks, Snowflake, RedShift). Familiarity of Data Lakehouse principles, standards More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:

MLOps Engineering Manager FullTime London

London, United Kingdom
Trainline plc
to have Experience deploying LLMs and agent-based systems Our technology stack Python and associated ML/DS libraries (scikit-learn, numpy, pandas, LightGBM, LangChain/LangGraph, TensorFlow, etc.) PySpark AWS cloud infrastructure: EMR, ECS, ECR, Athena, etc. MLOps: Terraform, Docker, Spacelift, Airflow, MLFlow Monitoring: New Relic CI/CD: Jenkins, GitHub Actions More information: Enjoy fantastic perks like More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:
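The Trainline stack above lists scikit-learn among its ML/DS libraries. A minimal end-to-end example of the kind of artifact an MLOps role deploys, bundling preprocessing and model into one object (the synthetic data here is purely illustrative):

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic two-class data: the label is the sign of the first feature.
X = rng.normal(size=(200, 3))
y = (X[:, 0] > 0).astype(int)

# A Pipeline bundles preprocessing and the model so they ship as one artifact,
# which is what tools like MLflow typically log and serve.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression()),
])
model.fit(X, y)
accuracy = model.score(X, y)
```

Logging the whole `Pipeline` (rather than the bare estimator) avoids training/serving skew, since the scaler's fitted parameters travel with the model.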

Machine Learning Engineer - Agentic AI

London, United Kingdom
Trainline
infrastructure Experience with graph technology and/or algorithms Our technology stack Python and associated ML/DS libraries (Scikit-learn, Numpy, LightGBM, Pandas, LangChain/LangGraph, TensorFlow, etc.) PySpark AWS cloud infrastructure: EMR, ECS, Athena, etc. MLOps: Terraform, Docker, Airflow, MLFlow More information: Enjoy fantastic perks like private healthcare & dental insurance, a generous work from abroad policy More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:

Machine Learning Engineer - Agentic AI (London)

London, UK
Trainline
infrastructure Experience with graph technology and/or algorithms Our technology stack Python and associated ML/DS libraries (Scikit-learn, Numpy, LightGBM, Pandas, LangChain/LangGraph, TensorFlow, etc.) PySpark AWS cloud infrastructure: EMR, ECS, Athena, etc. MLOps: Terraform, Docker, Airflow, MLFlow More information: Enjoy fantastic perks like private healthcare & dental insurance, a generous work from abroad policy More ❯
Employment Type: Full-time
Posted:

Machine Learning Engineer - Recommendations & Reinforcement Learning

London, United Kingdom
Trainline
designing, fine-tuning and developing GenAI models and building agentic AI systems Our technology stack Python and associated ML/DS libraries (Scikit-learn, Numpy, LightGBM, Pandas, TensorFlow, etc.) PySpark AWS cloud infrastructure: EMR, ECS, S3, Athena, etc. MLOps: Terraform, Docker, Airflow, MLFlow, Jenkins On call statement: Please be aware that our Machine Learning Engineers are required to be More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:

Machine Learning Engineer - Recommendations & Reinforcement Learning (London)

London, UK
Trainline
designing, fine-tuning and developing GenAI models and building agentic AI systems Our technology stack Python and associated ML/DS libraries (Scikit-learn, Numpy, LightGBM, Pandas, TensorFlow, etc.) PySpark AWS cloud infrastructure: EMR, ECS, S3, Athena, etc. MLOps: Terraform, Docker, Airflow, MLFlow, Jenkins On call statement: Please be aware that our Machine Learning Engineers are required to be More ❯
Employment Type: Full-time
Posted:

Databricks Engineer

London, United Kingdom
Tenth Revolution Group
Azure Databricks, handling ingestion from various data sources, performing complex transformations, and publishing data to Azure Data Lake or other storage services. Write efficient and standardized Spark SQL and PySpark code for data transformations, ensuring data integrity and accuracy across the pipeline. Automate pipeline orchestration using Databricks Workflows or integration with external tools (e.g., Apache Airflow, Azure Data Factory … in designing and implementing scalable ETL/ELT data pipelines in Azure Databricks, transforming raw data into usable datasets for analysis. Azure Databricks Proficiency: Strong knowledge of Spark (SQL, PySpark) for data transformation and processing within Databricks, along with experience building workflows and automation using Databricks Workflows. Azure Data Services: Hands-on experience with Azure services like Azure Data More ❯
Employment Type: Contract
Rate: £400 - £500/day
Posted:

Databricks Engineer

London, South East, England, United Kingdom
Tenth Revolution Group
Azure Databricks, handling ingestion from various data sources, performing complex transformations, and publishing data to Azure Data Lake or other storage services. Write efficient and standardized Spark SQL and PySpark code for data transformations, ensuring data integrity and accuracy across the pipeline. Automate pipeline orchestration using Databricks Workflows or integration with external tools (e.g., Apache Airflow, Azure Data Factory … in designing and implementing scalable ETL/ELT data pipelines in Azure Databricks, transforming raw data into usable datasets for analysis. Azure Databricks Proficiency: Strong knowledge of Spark (SQL, PySpark) for data transformation and processing within Databricks, along with experience building workflows and automation using Databricks Workflows. Azure Data Services: Hands-on experience with Azure services like Azure Data More ❯
Employment Type: Contractor
Rate: £400 - £500 per day
Posted:

Data Engineer

London, South East, England, United Kingdom
Harnham - Data & Analytics Recruitment
create bespoke, scalable data solutions Support data migration efforts from Azure to Databricks Use Terraform to manage and deploy cloud infrastructure Build robust data workflows in Python (e.g., pandas, PySpark) Ensure the platform is scalable, efficient, and ready for future AI use cases REQUIRED SKILLS & EXPERIENCE Strong experience with Azure and Databricks environments Advanced Python skills for data engineering … pandas, PySpark) Proficiency in designing and maintaining ETL pipelines Experience with Terraform for infrastructure automation Track record of working on cloud migration projects, especially Azure to Databricks Comfortable working onsite in London 2 days/week and engaging cross-functionally Strong communication and problem-solving abilities NICE TO HAVES Experience with Qlik or other data visualisation tools Exposure to More ❯
Employment Type: Contractor
Rate: £600 - £650 per day
Posted:

AWS Data Architect - Market Data

London, United Kingdom
Vertus Partners
Design and implement end-to-end data architecture on AWS using tools such as Glue, Lake Formation, and Athena Develop scalable and secure ETL/ELT pipelines using Python, PySpark, and SQL Drive decisions on data modeling, lakehouse architecture, and integration strategies with Databricks and Snowflake Collaborate cross-functionally to embed data governance, quality, and lineage into platform design … Serve as a trusted advisor to engineering and business stakeholders on data strategy and architecture What You Bring: Deep, hands-on expertise with AWS data services (Glue, Lake Formation, PySpark, Athena, etc.) Strong coding skills in Python and SQL for building, testing, and optimizing data pipelines Proven experience designing secure, scalable, and reliable data architectures in cloud environments Solid More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:

GCP Data Engineer

London Area, United Kingdom
Anson McCade
practice Essential Experience: Proven expertise in building data warehouses and ensuring data quality on GCP Strong hands-on experience with BigQuery, Dataproc, Dataform, Composer, Pub/Sub Skilled in PySpark, Python and SQL Solid understanding of ETL/ELT processes Clear communication skills and ability to document processes effectively Desirable Skills: GCP Professional Data Engineer certification Exposure to Agentic More ❯
Posted:

GCP Data Engineer

City of London, London, United Kingdom
Anson McCade
practice Essential Experience: Proven expertise in building data warehouses and ensuring data quality on GCP Strong hands-on experience with BigQuery, Dataproc, Dataform, Composer, Pub/Sub Skilled in PySpark, Python and SQL Solid understanding of ETL/ELT processes Clear communication skills and ability to document processes effectively Desirable Skills: GCP Professional Data Engineer certification Exposure to Agentic More ❯
Posted:
PySpark salaries in London:
10th Percentile: £58,000
25th Percentile: £82,500
Median: £110,000
75th Percentile: £126,563
90th Percentile: £148,750