PySpark Jobs in the UK

51 to 75 of 96 PySpark Jobs in the UK

Data Engineer (AWS)

England, United Kingdom
Xcede
experience as a Data Engineer. Strong proficiency in AWS and relevant technologies. Good communication skills. Preferably experience with some or all of the following: PySpark, Kubernetes, Terraform. Experience in a start-up/scale-up or fast-paced environment is desirable. HOW TO APPLY Please register your interest by more »
Posted:

Senior Data Engineer - Remote first.

England, United Kingdom
Hybrid / WFH Options
Revoco
I have placed quite a few candidates with this organisation now, and all have given glowing reviews. They're the leader in legal representation comparison and have rather interesting, unique data to work with. They are using modern Azure more »
Posted:

Spark Architect

Leeds, England, United Kingdom
PRACYVA
months to begin with & it's extendable. Location: Leeds, UK (min 3 days onsite) Context: Legacy ETL code (for example DataStage) is being refactored into PySpark using Prophecy low-code/no-code and available converters. Converted code is causing failures/performance issues. Skills: Spark Architecture – component understanding around Spark … Data Integration (PySpark, scripting, variable setting etc.), Spark SQL, Spark Explain plans. Spark SME – Be able to analyse Spark code failures through Spark Plans and make correcting recommendations. Spark SME – Be able to review PySpark and Spark SQL jobs and make performance improvement recommendations. Spark SME – Be able … are Cluster level failures. Cloudera (CDP) – Knowledge of how Cloudera Spark is set up and how the runtime libraries are used by PySpark code. Prophecy – High-level understanding of low-code/no-code Prophecy set-up and its use to generate PySpark code. more »
Posted:
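As a rough illustration of the Spark-plan analysis this listing asks for, the sketch below shows how a converted PySpark job's physical plan can be inspected; the file paths, column names and join are hypothetical and not taken from the advert.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("plan-review").getOrCreate()

# Hypothetical inputs standing in for a converted DataStage job
orders = spark.read.parquet("/data/orders")
customers = spark.read.parquet("/data/customers")

summary = (
    orders.join(customers, "customer_id")
          .groupBy("country")
          .agg(F.sum("amount").alias("total_amount"))
)

# The formatted physical plan shows scans, joins and exchanges (shuffles),
# which is where failures and performance problems usually surface
summary.explain(mode="formatted")

Looking for unexpected sort-merge joins or large shuffle exchanges in that plan is a common starting point for the kind of performance recommendations described above.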

Senior Technical Business Analyst

London Area, United Kingdom
InfoCepts
and industry standards for the organization. Strong experience with Azure cloud services such as ADF, ADLS and Synapse. Proficiency in querying languages such as SQL, PySpark and Python, and familiarity with data visualization tools (e.g. Power BI). Strong communication skills to gather business requirements from stakeholders and propose best more »
Posted:

Lead Data Scientist

Leicester, England, United Kingdom
Harnham
STEM subject, e.g. Mathematics, Statistics or Computer Science. Experience in personalisation and segmentation with a focus on CRM. Retail experience is a bonus. Experience with PySpark/Azure/Databricks is a bonus. Experience of management. NEXT STEPS: If this role is of interest, please reach out to Joseph Gregory. more »
Posted:

Senior Data Scientist

London Area, United Kingdom
Higher - AI recruitment
experience-related problems such as workforce management, demand forecasting, or root cause analysis. Strong visualisation skills, including experience with Tableau. Familiarity with Databricks and PySpark for data manipulation and analysis. Familiarity with Git-based source control methodologies, including branching and pull requests. A self-starter, passionate about converting data more »
Posted:

Data Engineer

Greater London, England, United Kingdom
Hybrid / WFH Options
Agora Talent
come with scaling a company • The ability to translate complex and sometimes ambiguous business requirements into clean and maintainable data pipelines • Excellent knowledge of PySpark, Python and SQL fundamentals • Experience in contributing to complex shared repositories. What’s nice to have: • Prior early-stage B2B SaaS experience involving client more »
Posted:

Java Data Engineer

United Kingdom
Amber Labs
experience in Power BI would also be useful. You will be an Engineer with past experience with Java, data, and infrastructure (DevOps). Java, Python, PySpark. Mechanisms: MongoDB, Redshift, AWS S3. Environments/Infra: AWS (required), [AWS Lambda, Terraform] (nice to have). Platforms: Creating data pipelines within Databricks or equivalent more »
Posted:

Data Modeler

London Area, United Kingdom
Hybrid / WFH Options
Tata Consultancy Services
vision to convert data requirements into logical models and physical solutions. with data warehousing solutions (e.g. Snowflake) and data lake architecture, Azure Databricks/PySpark & ADF. retail data model standards - ADRM. communication skills, organisational and time management skills. to innovate, support change and problem-solve. attitude, with the ability more »
Posted:

Senior Data Engineer (AWS)

London Area, United Kingdom
Xcede
of 4 years' commercial experience. Strong proficiency in AWS and relevant technologies. Good communication skills. Preferably experience with some or all of the following: PySpark, Kubernetes, Terraform. Experience in a start-up/scale-up or fast-paced environment is desirable. HOW TO APPLY Please register your interest by more »
Posted:

Python Developer

Glasgow, Scotland, United Kingdom
McGregor Boyall
Python, PySpark, AWS, Oracle, Kafka, Banking. Become a key member of an agile team designing and delivering a market-leading, secure and scalable reporting product. The team is migrating from Oracle to AWS cloud, so experience with either is nice to have. You serve as a seasoned member of … candidates able to work under these requirements without visa sponsorship can be considered at this time. Required skills: - Either Python or PySpark experience. Ideally, you will be working as either a Python Developer (3+ years) or a Data Engineer using PySpark - Either AWS or Oracle. Candidates more »
Posted:

Senior Data & Analytics Developer

London, United Kingdom
Hybrid / WFH Options
McGregor Boyall Associates Limited
data warehouse design. Cloud data products such as: Data Factory, Event Hubs, Data Lake, Synapse, Azure SQL Server. Experience developing in Databricks and coding with PySpark and Spark SQL. Proficient in ETL coding standards. Data encryption techniques and standards. Knowledge of relevant legislation such as: Data Protection Act, EU Procurement more »
Employment Type: Contract, Work From Home
Posted:
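As a minimal, hypothetical sketch of the Databricks-style PySpark and Spark SQL work mentioned in the listing above, a DataFrame can be registered as a temporary view and then queried with SQL; the table and column names are illustrative only.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical source table
sales = spark.read.table("raw.sales")
sales.createOrReplaceTempView("sales_v")

# The same data can then be shaped with Spark SQL
monthly = spark.sql("""
    SELECT date_trunc('month', sale_date) AS month,
           SUM(amount) AS total_amount
    FROM sales_v
    GROUP BY date_trunc('month', sale_date)
""")

# Hypothetical target table
monthly.write.mode("overwrite").saveAsTable("reporting.monthly_sales")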

Cloud Data Engineer

Knutsford, England, United Kingdom
Hybrid / WFH Options
Capgemini
warehousing solutions. Familiarity with DBT (Data Build Tool) for managing transformations in the data pipeline. Strong programming skills in technologies like Python, Scala, Spark, PySpark or Ab Initio, Glue, Starburst, Snowflake, Redshift, Hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline more »
Posted:

Cloud Data Engineer

Northamptonshire, England, United Kingdom
Hybrid / WFH Options
Capgemini
warehousing solutions. Familiarity with DBT (Data Build Tool) for managing transformations in the data pipeline. Strong programming skills in technologies like Python, Scala, Spark, PySpark or Ab Initio, Glue, Starburst, Snowflake, Redshift, Hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline more »
Posted:

Technical Product Lead - Active SC Mandatory

London Area, United Kingdom
Infoplus Technologies UK Limited
Banking and Financial Services sector is advantageous. • Deep knowledge of or experience with as many of the following as possible: • Azure Cloud Data Components | Databricks | Python | PySpark | Terraform | APIs | Lakehouse | Data Mesh | NoSQL DBs | GitHub Person Specification: • Self-motivator with a desire to learn new skills and embrace new technologies in more »
Posted:

Python App Developer

Glasgow, Scotland, United Kingdom
Morgan McKinley
Cycle · Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security. Preferred qualifications, capabilities, and skills: · Skilled with Python or PySpark · Exposure to cloud technologies (Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka) · Experience with Big Data solutions or Relational DBs · Experience in the Financial Services Industry is more »
Posted:

Engineering Delivery Lead (AWS/Snowflake/Python/Scala) - UK based - 6 Month contract + extensions - £450-550/day + negotiable

England, United Kingdom
Orbis Group
experience leading a data engineering team. Key Tech: - AWS (S3, Glue, EMR, Athena, Lambda) - Snowflake, Redshift - DBT (Data Build Tool) - Programming: Python, Scala, Spark, PySpark or Ab Initio - Data pipeline orchestration (Apache Airflow) - Knowledge of SQL. This is a 6-month initial contract with a trusted client of ours. more »
Posted:
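As a small sketch of the Apache Airflow orchestration named in the key tech list above, a pipeline is usually expressed as a DAG of tasks (Airflow 2.x style); the DAG id, schedule and task bodies here are hypothetical placeholders.

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull data from a source system
    pass

def load():
    # Placeholder: write to the warehouse (e.g. Snowflake or Redshift)
    pass

with DAG(
    dag_id="example_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task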

Data Scientist/ Data Engineer

South Harting, England, United Kingdom
Hybrid / WFH Options
Adecco
insurance sector is advantageous. - Education: A degree in Computer Science, Data Science, Engineering, or a related field. Technical Skills: Strong proficiency in Python, SQL, PySpark, and Databricks. Demonstrated experience with modern NLP techniques and tools. Proven ability to create and manage data quality metrics and dashboards. Experience working in more »
Posted:

Data Engineer

London Area, United Kingdom
Harrington Starr
start interviewing ASAP. Responsibilities: Azure Cloud Data Engineering using Azure Databricks, Data Warehousing, Data Engineering. Very strong with the Microsoft Stack. ESSENTIAL knowledge of PySpark clusters. Python & C# scripting experience. Experience of message queues (Kafka). Experience of containerization (Docker). FINANCIAL SERVICES EXPERIENCE (Energy/commodities trading). If you have more »
Posted:

Lead Data Engineer

London, England, United Kingdom
Arrows
preferably GCP | Expertise in event-driven data integrations and click-stream ingestion | Proven ability in stakeholder management and project leadership | Proficiency in SQL, Python, PySpark | Solid background in data pipeline orchestration, data access, and retention tooling | Demonstrable impact on infrastructure scalability and data privacy initiatives | Collaborative spirit | Innovative problem more »
Posted:

Machine Learning Engineer

London Area, United Kingdom
Harnham
in STEM subjects. Strong experience in data pipelines and deploying ML models. Preference for experience in retail/marketing. Tech across: Python, AWS, Databricks, PySpark, A/B Testing, MLflow, APIs. Experience in feature engineering and third-party data. Apply below more »
Posted:
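As a brief, hypothetical illustration of the MLflow experience mentioned in the listing above, logging an experiment's parameters and metrics looks roughly like this; the run name, model choice and values are made up.

import mlflow

with mlflow.start_run(run_name="demand-forecast-baseline"):
    mlflow.log_param("model_type", "gradient_boosting")
    mlflow.log_param("n_estimators", 200)
    mlflow.log_metric("rmse", 12.3)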

Data Engineer

London Area, United Kingdom
Prism Digital
Lead Azure Data Engineer | PySpark (Python) & Synapse | Tech for Good/Charity … people comprising developers, data engineers, QA and DevOps. Essential skills required: Azure – solid experience required of the Azure Data ecosystem. Python – ESSENTIAL, as PySpark is used heavily; you will be tested on PySpark. Azure Synapse – ESSENTIAL, as it is used heavily. Spark. Azure Data Lake/Databricks/… architecture. Familiar with Synapse CI/CD. Azure Purview or another governance tool experience. Familiar with building Catalogs and lineage. more »
Posted:

Senior Data Engineer

Edinburgh, Scotland, United Kingdom
AI Connect
the opportunity to develop data pipelines into Azure & AWS, as well as contribute to some SAS development on legacy projects when required. Python or PySpark and SQL will be your bread and butter, with any experience of Airflow being a great bonus. The core skillset: Python/PySpark more »
Posted:

Python Solutions Architect Engineering Trading Finance London

Clerkenwell, England, United Kingdom
Joseph Harry Ltd
Software Engineer Data Enterprise Engineering Developer Programmer AWS GCP Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Apache Iceberg PySpark MWAA Arrow DBT gRPC protobuf Snowflake TypeScript Manager Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives more »
Posted:

Mid Level Data Scientist (Databricks / PySpark)

South Harting, England, United Kingdom
Adecco
insurance sector is advantageous. Education: A degree in Computer Science, Data Science, Engineering, or a related field. Technical Skills: Strong proficiency in Python, SQL, PySpark, and Databricks. Demonstrated experience with modern NLP techniques and tools. Proven ability to create and manage data quality metrics and dashboards. Experience working in more »
Posted:
PySpark salary percentiles (UK)
10th Percentile: £52,500
25th Percentile: £60,000
Median: £75,000
75th Percentile: £95,000
90th Percentile: £115,000