PySpark Jobs in Leeds

1 to 7 of 7 PySpark Jobs in Leeds

Spark Scala Developer - Hybrid / Leeds - £450-£550

Leeds, West Yorkshire, Yorkshire, United Kingdom
Hybrid / WFH Options
Damia Group Ltd
…over 150 PB of data. As a Spark Scala Engineer, you will be responsible for refactoring legacy ETL code (for example, DataStage) into PySpark using Prophecy low-code/no-code and the available converters. Converted code is causing failures and performance issues. Your responsibilities as a Spark Scala Engineer …
Employment Type: Contract
Rate: £450 - £550 per day
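By way of illustration, a minimal PySpark sketch of the kind of job a converted DataStage stage pair might become. The paths, schema, and column names are hypothetical placeholders, not anything from the posting; declaring the schema explicitly is one of the usual fixes when converter output misbehaves.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import (DateType, DecimalType, StringType,
                               StructField, StructType)

spark = SparkSession.builder.appName("legacy-etl-refactor").getOrCreate()

# Explicit schema: converter output that relies on runtime schema inference
# is a common source of the failures/performance issues mentioned above.
schema = StructType([
    StructField("customer_id", StringType(), nullable=False),
    StructField("txn_date", DateType(), nullable=True),
    StructField("amount", DecimalType(18, 2), nullable=True),
])

transactions = spark.read.schema(schema).csv("/data/in/transactions", header=True)

# Rough equivalent of a DataStage Transformer + Aggregator stage pair.
daily_totals = (
    transactions
    .filter(F.col("amount").isNotNull())
    .groupBy("customer_id", "txn_date")
    .agg(F.sum("amount").alias("daily_total"))
)

daily_totals.write.mode("overwrite").parquet("/data/out/daily_totals")
```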

Azure Data Engineer

Leeds, England, United Kingdom
HCLTech
Job description: hands-on experience in business intelligence development or data engineering; solid experience with data modelling, data warehouse design, and data lake concepts and practices; exposure to working in a Microsoft Azure Data Platform environment; exposure to Azure Data Factory …
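For context, a minimal sketch of reading from a data lake in a Microsoft Azure Data Platform environment, assuming Spark with the ABFS driver available (as on Azure Databricks or Synapse). The storage account, container, path, and key are placeholders.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("adls-read").getOrCreate()

# Account-key auth is shown only for brevity; a service principal or
# managed identity is the usual production choice.
spark.conf.set(
    "fs.azure.account.key.mydatalake.dfs.core.windows.net",
    "<storage-account-key>",
)

# ADLS Gen2 paths use the abfss:// scheme: container@account/path.
sales = spark.read.parquet("abfss://raw@mydatalake.dfs.core.windows.net/sales/")
sales.printSchema()
```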

Senior Data Engineer

Leeds, West Yorkshire, Richmond Hill, United Kingdom
Hybrid / WFH Options
Department of Work & Pensions
…on-prem, but the direction of travel is cloud engineering. You'll be executing code in different places across the following tech stack: Azure, Databricks, PySpark and Pandas. You will steer the data engineering function within a wider product team. There'll be lots of connecting and interaction with stakeholders … inclusive environment where you can grow your career and make a real difference. Essential criteria: enterprise-scale experience with Azure data engineering tools, Databricks, PySpark and Pandas; experience of data modelling and transforming raw data into datasets; experience of building team capability through role modelling, mentoring, and coaching; able …
Employment Type: Permanent
Salary: £52,412 - £78,517/annum
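Since the stack pairs PySpark with Pandas, here is a minimal sketch of the usual interplay between the two, assuming Spark 3.x with PyArrow installed; the claims data is invented for illustration.

```python
import pandas as pd
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("pyspark-pandas").getOrCreate()

# Arrow makes the Pandas <-> Spark conversions below far cheaper (Spark 3.x).
spark.conf.set("spark.sql.execution.arrow.pyspark.enabled", "true")

# Small Pandas frame promoted to a distributed DataFrame.
claims = spark.createDataFrame(
    pd.DataFrame({"claim_id": [1, 2, 3], "status": ["open", "open", "closed"]})
)

# The heavy aggregation stays in Spark; only the small result returns to Pandas.
summary_pdf = claims.groupBy("status").count().toPandas()
print(summary_pdf)
```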

Data Engineer

Leeds, West Yorkshire, Richmond Hill, United Kingdom
Hybrid / WFH Options
Department of Work & Pensions
…currently on-prem, but the direction of travel is cloud engineering, and you'll be executing code across the following tech stack: Azure, Databricks, PySpark and Pandas. DWP Digital is a great place to work; we offer a supportive and inclusive environment where you can grow your career and … make a real difference. Essential criteria: commercial experience of Databricks, PySpark and Pandas; commercial experience of Azure data engineering tools such as Azure Data Factory, dedicated SQL pools and ADLS Gen2; experience of working with data lakes; an understanding of dimensional modelling. Details. Wages. Perks. You'll join …
Employment Type: Permanent
Salary: £35,711 - £41,567/annum
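As a sketch of the dimensional modelling the criteria mention: a fact table joined to a small dimension on a surrogate key, with the dimension broadcast. All table names and rows here are invented for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dimensional-model").getOrCreate()

# Dimension: surrogate key plus descriptive attributes.
dim_benefit = spark.createDataFrame(
    [(1, "Benefit A"), (2, "Benefit B")],
    ["benefit_key", "benefit_name"],
)

# Fact: foreign key plus measures.
fact_payments = spark.createDataFrame(
    [(1, 100.0), (1, 250.0), (2, 900.0)],
    ["benefit_key", "amount"],
)

# Star-schema join; broadcasting the small dimension avoids a shuffle.
report = (
    fact_payments
    .join(F.broadcast(dim_benefit), "benefit_key")
    .groupBy("benefit_name")
    .agg(F.sum("amount").alias("total_paid"))
)
report.show()
```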

Data Architect (Spark)

Leeds, West Yorkshire, Yorkshire, United Kingdom
Hybrid / WFH Options
Damia Group Ltd
Spark/PySpark Architect - 12 months+ - Inside IR35 - hybrid working, 3 days on site in Leeds. My client is a global consultancy looking for a number of Spark/PySpark architects to join them on a long-term programme. As the Spark architect, you will … objectives. Responsibilities: working on enterprise-scale cloud infrastructure and cloud services in one of the clouds (GCP); driving the data integration upgrade to PySpark; collaborating with multiple customer stakeholders; knowledge of working with cloud databases; excellent communication and solution presentation skills; able to analyse Spark code failures through … Spark Plans and make correcting recommendations; able to review PySpark and Spark SQL jobs and make performance-improvement recommendations; able to understand DataFrames/Resilient Distributed Datasets, identify any memory-related problems, and make corrective recommendations; able to monitor Spark jobs using wider tools such as …
Employment Type: Contract, Work From Home
Rate: £600 - £650 per day + Inside IR35
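"Analysing Spark code failures through Spark Plans" generally means reading the physical plan. A minimal sketch with toy datasets: spotting a sort-merge join against a tiny lookup table and recommending a broadcast hint (auto-broadcast is disabled here only so the problem pattern is visible in the plan).

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("plan-review").getOrCreate()

# Disable auto-broadcast so the problematic join strategy shows up.
spark.conf.set("spark.sql.autoBroadcastJoinThreshold", "-1")

orders = spark.range(1_000_000).withColumnRenamed("id", "order_id")
lookup = spark.range(100).withColumnRenamed("id", "order_id")

# "formatted" mode (Spark 3+) prints the physical plan with per-node detail;
# a SortMergeJoin against a 100-row table is the classic finding.
orders.join(lookup, "order_id").explain("formatted")

# Recommendation: an explicit broadcast hint, which overrides the threshold.
orders.join(F.broadcast(lookup), "order_id").explain("formatted")
```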

Spark Architect

Leeds, England, United Kingdom
PRACYVA
…months to begin with, and it's extendable. Location: Leeds, UK (min 3 days onsite). Context: legacy ETL code (for example, DataStage) is being refactored into PySpark using Prophecy low-code/no-code and the available converters. Converted code is causing failures/performance issues. Skills: Spark architecture – component understanding around Spark … Data Integration (PySpark, scripting, variable setting etc.), Spark SQL, Spark Explain plans. Spark SME – able to analyse Spark code failures through Spark Plans and make correcting recommendations. Spark SME – able to review PySpark and Spark SQL jobs and make performance-improvement recommendations. Spark SME – able … are cluster-level failures. Cloudera (CDP) – knowledge of how Cloudera Spark is set up and how the runtime libraries are used by PySpark code. Prophecy – high-level understanding of the low-code/no-code Prophecy setup and its use to generate PySpark code.
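One performance-improvement recommendation that comes up constantly when reviewing converted PySpark/Spark SQL jobs is right-sizing shuffle partitions. A minimal sketch assuming Spark 3.x, where adaptive query execution coalesces the near-empty partitions that the default spark.sql.shuffle.partitions=200 produces on small aggregations; the view name is a placeholder.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("aqe-tuning").getOrCreate()

# Adaptive query execution (Spark 3.x) re-plans at runtime and merges
# small shuffle partitions after the exchange.
spark.conf.set("spark.sql.adaptive.enabled", "true")
spark.conf.set("spark.sql.adaptive.coalescePartitions.enabled", "true")

spark.range(10_000).createOrReplaceTempView("events")

agg = spark.sql(
    "SELECT id % 10 AS bucket, COUNT(*) AS n FROM events GROUP BY id % 10"
)
agg.explain()  # the plan is wrapped in AdaptiveSparkPlan
agg.show()
```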

Scala Developer

Leeds, England, United Kingdom
Osmii
…Integration upgrades to PySpark. Mandatory skills: 12+ years of IT experience with a deep understanding of component integration around Spark Data Integration (PySpark, scripting, variable setting, etc.), Spark SQL, Spark Explain plans. Expertise as a Spark SME – able to analyze Spark code failures through Spark Plans and … to traverse and explain the architecture you have been part of and justify the use of any particular tool/technology. Proficiency in reviewing PySpark and Spark SQL jobs and making performance-improvement recommendations. Comprehensive understanding of DataFrames/Resilient Distributed Datasets, recognizing any memory-related problems … using wider tools such as Grafana to identify cluster-level failures. Knowledge of Cloudera (CDP) Spark and how the runtime libraries are used by PySpark code. High-level understanding of the low-code/no-code Prophecy setup and its use in generating PySpark code. Please reach out at Rhys.Kendall…
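On the memory-related problems around DataFrames/RDDs: a minimal sketch of one common corrective recommendation, persisting a reused DataFrame with a storage level that spills to disk rather than recomputing or failing under memory pressure. The dataset is a toy placeholder.

```python
from pyspark import StorageLevel
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("memory-review").getOrCreate()

big = spark.range(5_000_000).selectExpr("id", "id * 2 AS doubled")

# MEMORY_AND_DISK spills partitions that don't fit in executor memory
# instead of evicting and recomputing them on every reuse.
big.persist(StorageLevel.MEMORY_AND_DISK)

print(big.count())                            # materialises the cache
print(big.filter("doubled > 100").count())    # reuses the cached data
big.unpersist()
```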
PySpark salary percentiles in Leeds:
10th Percentile: £57,500
25th Percentile: £77,500
Median: £80,000
75th Percentile: £82,500
90th Percentile: £85,000