PySpark Jobs in England

1 to 25 of 455 PySpark Jobs in England

Pricing Manager

Manchester, Lancashire, England, United Kingdom
Hybrid / WFH Options
Vermelo RPO
Knowledge of the technical differences between different packages for some of these model types would be an advantage. Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL). A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science). Experience of WTW’s Radar software is preferred. Proficient at communicating results in a concise …
Employment Type: Full-Time
Salary: Salary negotiable

Data Scientist - Commercial

Uxbridge, Middlesex, United Kingdom
Coca-Cola Europacific Partners
expertise in different machine learning techniques and commercial data science models, to drive impactful business outcomes. Explorations will be scaled to products, which requires proficiency in programming languages such as PySpark and knowledge of cloud platforms. Key Responsibilities: Develop and implement commercial data science solutions. Collaborate with business stakeholders to align AI initiatives with commercial objectives. Build and optimize commercial … Statistics, Data Science, or a related field. Strong expertise in Machine Learning modelling and statistical techniques. Proficiency in R/Python, SQL, and cloud platforms (e.g., Databricks, Azure). PySpark will be valued. Experience in developing commercial data science applications. Strong communication skills for conveying technical insights to stakeholders. Application: If this role is of interest to you, please …
Employment Type: Permanent
Salary: GBP Annual

Data Curation Developer

London, United Kingdom
Hybrid / WFH Options
ENGINEERINGUK
and value creation from data curation activities. Agile mindset with the ability to deliver prototypes quickly and iterate improvements based on stakeholder feedback. Experience in Python, Databricks, Delta Lake, PySpark, Pandas, and other data engineering frameworks, and applying them to achieve industry standards-compliant datasets. Strong communication skills and expertise to translate business needs into technical data requirements and processes …
Employment Type: Permanent
Salary: GBP Annual

Data Engineering Specialist

Coventry, Warwickshire, United Kingdom
Hybrid / WFH Options
Cadent Gas
inclusive, forward-thinking culture, and help drive the energy transition for the UK. Code & create - Develop complex SQL and ABAP CDS views for analytics and reporting. Transform & optimise - Use PySpark and Databricks to manipulate big data efficiently. Automate & schedule - Manage workflows, jobs and clusters for scalable data processing. Collaborate & deliver - Engage across agile teams to build high-impact solutions … Experience in building data pipelines and models in SAP Datasphere or SAP BW/4HANA. Advanced skills in SQL, data modelling, and data transformation. Familiarity with Databricks, Apache Spark, PySpark, and Delta Lake. Agile mindset with experience in DevOps and iterative delivery. Excellent communication and stakeholder engagement abilities. Sound like a fit? Let's build the future of data …
Employment Type: Permanent
Salary: GBP Annual

Senior Data Engineer (Remote)

South East, United Kingdom
Hybrid / WFH Options
Circana
team. In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and a … make a significant impact, we encourage you to apply! Job Responsibilities: ETL/ELT Pipeline Development: Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow. Implement batch and real-time data processing solutions using Apache Spark. Ensure data quality, governance, and security throughout the data lifecycle. Cloud Data Engineering: Manage and … and documentation. Required profile: Client-facing role, so strong communication and collaboration skills are vital. Proven experience in data engineering, with hands-on expertise in Azure Data Services, PySpark, Apache Spark, and Apache Airflow. Strong programming skills in Python and SQL, with the ability to write efficient and maintainable code. Deep understanding of Spark internals, including RDDs, DataFrames …
Employment Type: Permanent

Principal Data Engineer

Bristol, Avon, South West, United Kingdom
Hays
Work with the team to support ETL processes. What you'll need to succeed: Seasoned knowledge of the Azure Databricks platform and associated functionalities. Strong Python programming knowledge, ideally PySpark. A logical and analytical approach to problem-solving. Awareness of the modern data stack and associated methodologies. What you'll get in return: A rewarding contract providing exposure to …
Employment Type: Contract
Rate: £500 - £650 per day

Data Architect

Bracknell, Berkshire, South East, United Kingdom
Hybrid / WFH Options
Halian Technology Limited
business intelligence, reporting, and regulatory needs. Lead the integration and optimisation of large-scale data platforms using Azure Synapse and Databricks. Build and maintain robust data pipelines using Python (PySpark) and SQL. Collaborate with data engineers, analysts, and stakeholders to ensure data quality, governance, and security. Ensure all solutions adhere to financial regulations and internal compliance standards. Key Skills … Experience: Proven experience as a Data Architect within the financial services sector. Hands-on expertise with Azure Synapse Analytics and Databricks. Strong programming and data engineering skills in Python (PySpark) and SQL. Solid understanding of financial data and regulatory compliance requirements. Excellent stakeholder communication and documentation skills …
Employment Type: Contract
Rate: From £500 to £650 per day

Lead Data Engineer (Remote)

South East, United Kingdom
Hybrid / WFH Options
Circana
UK. In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and a … desire to make a significant impact, we encourage you to apply! Job Responsibilities: Data Engineering & Data Pipeline Development: Design, develop, and optimize scalable data workflows using Python, PySpark, and Airflow. Implement real-time and batch data processing using Spark. Enforce best practices for data quality, governance, and security throughout the data lifecycle. Ensure data availability, reliability and performance through … Implement CI/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Build and optimize large-scale data processing pipelines using Apache Spark and PySpark. Implement data partitioning, caching, and performance tuning for Spark-based workloads. Work with diverse data formats (structured and unstructured) to support advanced analytics and machine learning initiatives. Workflow Orchestration …
Employment Type: Permanent

GenAI Specialist

East London, London, United Kingdom
Hybrid / WFH Options
McGregor Boyall Associates Limited
s/PhD in Computer Science, Data Science, Mathematics, or related field. 5+ years of experience in ML modeling, ranking, or recommendation systems. Proficiency in Python, SQL, Spark, PySpark, TensorFlow. Strong knowledge of LLM algorithms and training techniques. Experience deploying models in production environments. Nice to Have: Experience in GenAI/LLMs. Familiarity with distributed computing …
Employment Type: Contract
Rate: £725 - £775 per day

Data Engineer

Coalville, Leicestershire, East Midlands, United Kingdom
Hybrid / WFH Options
Ibstock PLC
consistency across the data platform. Knowledge, Skills and Experience: Essential: Strong expertise in Databricks and Apache Spark for data engineering and analytics. Proficient in SQL and Python/PySpark for data transformation and analysis. Experience in data lakehouse development and Delta Lake optimisation. Experience with ETL/ELT processes for integrating diverse data sources. Experience in gathering, documenting …
Employment Type: Permanent, Work From Home

Data Solution Architect

Warwickshire, West Midlands, United Kingdom
Hybrid / WFH Options
Hays
We're looking for someone with strong technical expertise and a passion for solving complex business problems. You'll bring: Strong experience with SQL, SQL Server DB, Python, and PySpark. Proficiency in Azure Data Factory and Databricks (a must), plus Cloudsmith. Background in data warehousing and data engineering. Solid project management capabilities. Outstanding communication skills, translating technical concepts into …
Employment Type: Permanent, Work From Home
Salary: £80,000

Principal Data Engineer

London, England, United Kingdom
Epam
of engineers, architects, designers, and strategists as we continue to grow our Data & Analytics practice across Europe. We're looking for a Principal Data Engineer with Azure, Databricks, and PySpark experience to join our team in London. The ideal candidate will have a strong background in data engineering, extensive experience with Azure cloud services, and experience leading a technical team … on experience with Azure data services (Apache Spark, Azure Data Factory, Synapse Analytics, RDBMS such as SQL Server). Proven leadership and management experience in data engineering teams. Proficiency in PySpark, Python (with Pandas), T-SQL, SparkSQL, and experience with CI/CD pipelines. Strong understanding of data modeling, ETL processes, and data warehousing concepts. Knowledge of version control systems …

Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
Locus Robotics
Locus Robotics is a global leader in warehouse automation, delivering unmatched flexibility, unlimited throughput, and actionable intelligence to optimize operations. Powered by LocusONE, an AI-driven platform, our advanced autonomous mobile robots seamlessly integrate into existing warehouse environments to …

Senior Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
Artefact
Artefact is a new generation of data service provider, specialising in data consulting and data-driven digital marketing, dedicated to transforming data into business impact across the entire value chain of …

Principal Data Engineer

London, England, United Kingdom
Landmark Information Group
technical concepts to a range of audiences. Able to provide coaching and training to less experienced members of the team. Essential skills: Programming languages such as Spark, Java, Python, PySpark, Scala, etc. (minimum 2). Extensive Data Engineering and Data Analytics hands-on experience (coding/configuration/automation/monitoring/security/etc.). FME. Advanced Database and SQL …

Data Engineer

London, England, United Kingdom
Zodiac Maritime Ltd
Skills profile. Relevant experience & education: Hands-on experience with Azure Databricks, Delta Lake, Data Factory, and Synapse. Strong understanding of Lakehouse architecture and medallion design patterns. Proficient in Python, PySpark, and SQL (advanced query optimization). Experience building scalable ETL pipelines and data transformations. Knowledge of data quality frameworks and monitoring. Experience with Git, CI/CD pipelines, and …

Data Engineer

London, England, United Kingdom
PlayStation Network
of NoSQL DBs (DynamoDB, Cassandra). Knowledge of node-based architecture, graph databases and languages – Neptune, Neo4j, Gremlin, Cypher. Experience: 5+ years of experience with Databricks, Spark, Scala, PySpark, Python. 5+ years of experience in SQL and database technologies like Snowflake or equivalent. 3+ years of experience with data and ETL programming (Ab Initio). 3+ years of experience …

Global Data Engineer

Billericay, England, United Kingdom
Hybrid / WFH Options
epay, a Euronet Worldwide Company
capable of the following: Recommended: 2+ years of professional experience in a data engineering or similar role. Proficiency in Python, including use of libraries for data processing (e.g., pandas, PySpark). Experience working with Azure-based data services, particularly Azure Databricks, Data Factory, and Blob Storage. Demonstrable knowledge of data pipeline orchestration and optimisation. Understanding of SQL for data …

Staff Analytics Engineer - Cyber Data Platform

Digswell, England, United Kingdom
Tesco UK
management of data solutions using DevOps practices. Qualifications and Experience: Proven leadership in delivering robust data models, analytics frameworks, and providing technical guidance. Strong programming skills in Python/PySpark and SQL. Experience building data lake and data warehouse solutions on cloud platforms like Databricks on Azure, with a DaaS model. Knowledge of data lake and warehouse concepts, architectural …

Data Engineering Associate

London, UK
Metyis
hands-on experience with Azure: Data Factory, Databricks, Synapse (DWH), Azure Functions, Logic Apps and other data analytics services, including streaming. Experience with Airflow and Kubernetes. Programming languages: Python (PySpark), scripting languages like Bash. Knowledge of Git, CI/CD operations and Docker. Basic knowledge of PowerBI is a plus. Experience deploying cloud infrastructure is desirable. Understanding of Infrastructure …

Lead Data Engineer

London, England, United Kingdom
rmg digital
stack”. You’ll be expected to work across a broad tech landscape: Big Data & Distributed Systems: HDFS, Hadoop, Spark, Kafka. Cloud: Azure or AWS. Programming: Python, Java, Scala, PySpark – you’ll need two or more, Python preferred. Data Engineering Tools: Azure Data Factory, Databricks, Delta Lake, Azure Data Lake. SQL & Warehousing: Strong experience with advanced SQL and database …

Data and Analytics Architect - L1

Leeds, England, United Kingdom
Wipro Technologies
transaction processing with maintaining and strengthening the modelling standards and business information. Key Responsibilities: Build and optimize Prophecy data pipelines for large-scale batch and streaming data workloads using PySpark. Define end-to-end data architecture leveraging Prophecy integrated with Databricks, Spark, or other cloud-native compute engines. Establish coding standards, reusable components, and naming conventions using Prophecy … exposure to convert legacy ETL tools like DataStage and Informatica into Prophecy pipelines using the Transpiler component of Prophecy. Required skills & experience: 2+ years of hands-on experience with the Prophecy (using PySpark) approach. 5+ years of experience in data engineering with tools such as Spark, Databricks, Scala/PySpark or SQL. Strong understanding of ETL/ELT pipelines, distributed data …

Data Engineering Manager - Commercial & Supply

Leeds, England, United Kingdom
Driver Hire Borders
Job Title: Data Engineering Manager - Commercial & Supply. Location: Asda House. Employment Type: Full time. Contract Type: Permanent. Hours Per Week: 37.5. Salary: Competitive salary plus benefits. Category: Data Science. Closing Date: 27 June 2025. The role requires on-site presence …

Product Owner - Data Platform

London, England, United Kingdom
Skipton Building Society
collaboration skills across teams. Key Technologies (awareness of): Azure Databricks, Data Factory, Storage, Key Vault. Source control systems, such as Git. dbt (Data Build Tool), SQL (Spark SQL), Python (PySpark). Certifications (ideal): SAFe POPM or Scrum PSP. Microsoft Certified: Azure Fundamentals (AZ-900). Microsoft Certified: Azure Data Fundamentals (DP-900). What’s in it for you: We value work …
PySpark salary percentiles in England:
10th Percentile: £50,000
25th Percentile: £63,750
Median: £105,000
75th Percentile: £122,500
90th Percentile: £143,750