PySpark Job Vacancies

151 to 175 of 956 PySpark Jobs

Sr. Portfolio Data Engineer

Edinburgh, Scotland, United Kingdom
Addepar
… or a related technical field. Experience with object-oriented programming preferred. General familiarity with some of the technologies we use: Python, Apache Spark/PySpark, Java/Spring, Amazon Web Services, SQL, relational databases. Understanding of data structures and algorithms. Interest in data modelling, visualisation, and ETL pipelines. Knowledge …

Data Engineer

Glasgow, Lanarkshire, Scotland, United Kingdom
Purview Consultancy Services Ltd
Role: Data Engineer. Location: Glasgow (3 days a week). Key Requirements/Expertise. Primary skills: Data, Python, PySpark. Expertise in data engineering, leveraging technologies such as Snowflake, Azure Data Factory, ADLS, Databricks, etc. Expertise in writing SQL queries against any RDBMS, with query …
Employment Type: Contract

Data Engineer - AdTech

London, England, United Kingdom
Sainsbury's
… the engineering community. We are looking for an experienced Data Engineer confident in investigating, ingesting, and exploiting large datasets, with core skills in Python, PySpark, EMR, AWS, SQL, and CI/CD. At Sainsbury's Tech, within the Media Agency, we're unlocking petabytes of untapped potential. We have thousands of …

Senior Data Engineer

London, England, United Kingdom
Xcede
… DevOps and Infrastructure-as-Code (ideally using Terraform). Take ownership of system observability, stability, and documentation. Requirements: strong experience in Python (especially Pandas and PySpark) and SQL; proven expertise in building data pipelines and working with Databricks and Lakehouse environments; deep understanding of Azure (or similar cloud platforms), including …

Lead Architect - Data Engineering / Azure

London, England, United Kingdom
Fractal
… years of experience. Strong background in System Integration, Application Development, or Data-Warehouse projects across enterprise technologies. Experience with object-oriented languages (e.g., Python, PySpark) and frameworks. Expertise in relational and dimensional modelling, including big data technologies. Proficiency in Microsoft Azure components such as Azure Data Factory, Data Lake, SQL …

Data Engineer

Greater London, England, United Kingdom
Hybrid / WFH Options
Bounce Digital
… and external (eBay APIs) sources. Define data quality rules, set up monitoring/logging, and support architecture decisions. What You Bring: strong SQL and Python (PySpark); hands-on with GCP or AWS; experience with modern ETL tools (dbt, Airflow, Fivetran); BI experience (Looker, Power BI, Metabase); Git and basic CI …


Data Engineer

Leeds, England, United Kingdom
Asda
… team that's transforming how data powers retail, this is your opportunity. Your Role (Key Responsibilities): design, build, and optimise robust data pipelines using PySpark, Spark SQL, and Databricks to ingest, transform, and enrich data from a variety of sources. Translate business requirements into scalable and performant data engineering solutions. …

Junior Data Engineer

Stafford, England, United Kingdom
Hybrid / WFH Options
Agena Group
… Competent working with version control systems, e.g. Git/SVN. Desirable Skills: AWS platform knowledge or equivalent (Azure/GCP); experience working with PySpark, Pandas, and other data engineering libraries; ability to deploy applications using pipelines with IaC (CDK/Terraform); exposure to containerisation, e.g. Docker; experience with …

Senior Lead Software Engineer - AWS Data Platform Engineer

London, England, United Kingdom
ZipRecruiter
… within Cloud Data & Analytics Platforms. Practical experience in system design, application development, testing, and operational stability. Advanced knowledge of programming languages such as Python, PySpark, and SQL. Deep understanding of software applications and technical processes in disciplines such as cloud computing, AI, ML, and mobile development. Ability to independently solve design and …

Data Engineer (GCP)

London Area, United Kingdom
Hybrid / WFH Options
Anson McCade
… data platforms using Google Cloud Platform. Hands-on experience with GCP tools: BigQuery, Dataform, Dataproc, Composer, Pub/Sub. Strong programming skills in Python, PySpark, and SQL. Deep understanding of data engineering concepts, including ETL, data warehousing, and cloud storage. Strong communication skills with the ability to collaborate across …

Data Architect (GCP)

London Area, United Kingdom
Hybrid / WFH Options
Anson McCade
… data warehouses and data lakes. Expertise in GCP data services including BigQuery, Composer, Dataform, Dataproc, and Pub/Sub. Strong programming experience with Python, PySpark, and SQL. Hands-on experience with data modelling, ETL processes, and data quality frameworks. Proficiency with BI/reporting tools such as Looker or …

PySpark Salary Percentiles
10th Percentile: £50,000
25th Percentile: £62,500
Median: £105,000
75th Percentile: £122,500
90th Percentile: £139,750