and over 150 PB of data. As a Spark Architect, you will be responsible for refactoring legacy ETL code (for example, DataStage) into PySpark using Prophecy's low-code/no-code tooling and the available converters; the converted code is currently causing failures and performance issues. The End Client Account is looking for … an enthusiastic Spark Architect with a deep understanding of the components around Spark data integration (PySpark, scripting, variable setting, etc.), Spark SQL, and Spark explain plans. Also able to analyse Spark code failures through Spark plans and make corrective recommendations; able to review PySpark and Spark SQL jobs and recommend performance improvements … a Spark architect who can demonstrate deep knowledge of how Cloudera Spark is set up and how the runtime libraries are used by PySpark code. Your benefits: as the Spark architect, you will have the opportunity to work with one of the biggest IT landscapes in the world.
Newcastle Upon Tyne, Tyne and Wear, North East, United Kingdom
Central Employment Agency (North East) Limited
engineering, with a proven track record in building and maintaining data platforms, preferably on AWS. Strong proficiency in Python, experience in SQL and PostgreSQL. PySpark, Scala or Java is a plus. Familiarity with Databricks and the Delta Lakehouse concept. Experience mentoring or leading junior engineers is highly desirable.
Newcastle Upon Tyne, Tyne and Wear, North East, United Kingdom
Asset Resourcing
or similar roles. Extensive ETL and data pipeline design experience, technology agnostic. Experience in creating and ensuring adherence to coding standards for Python and PySpark development. Experience developing CI/CD pipelines and deep knowledge of DevOps and source control best practices. Strong knowledge of DataOps practices and awareness …
Ways of Working: Participate actively in planning, stand-ups, and CI/CD processes. Skills Required: Experience with Spark-based data projects and Python (PySpark). Proficiency in designing data pipelines and ETL/ELT processes. Knowledge of machine learning algorithms and AWS services (MWAA, Glue, SageMaker). Strong …
a complete greenfield re-architecture from the ground up in Microsoft Azure. The tech you'll be playing with: Azure Data Factory, Azure Databricks, PySpark, SQL, DBT. What you need to bring: 3+ years of experience in building data pipelines; SQL experience in data warehousing; Python experience would be …
risk models, including scoring and model monitoring, with a strong preference for IRB. Extensive banking and financial services experience. Strong coding skills in Python and PySpark, with AWS preferred, as well as experience of working with very large data sets. A broad background in risk systems, methodologies and processes …
Sunderland, Tyne and Wear, Tyne & Wear, United Kingdom
Nigel Frank International
will have. Strong experience in a data engineering capacity working with SQL Server and Azure Databricks. Experience working with Python/Spark/PySpark. Experience creating data pipelines with Azure Data Factory. This is just a brief overview of the role. For the full information, simply apply …
Newcastle upon Tyne, Tyne and Wear, Tyne & Wear, United Kingdom
Nigel Frank International
To be successful in the role you will have: Strong Azure Data Platform experience. Strong understanding of Databricks. Coding experience with both Python/PySpark and SQL. Experience working with Azure Data Factory for creating ETL solutions. This is just a brief overview of the role. For the full …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
WRK DIGITAL LTD
and can query 1.5 billion rows per second. This role provides an opportunity to work on Big Data applications and learn technologies such as PySpark, Airflow, Trino, and more. It's a collaborative, fully remote UK-based team, passionate about their work and committed to having fun along the way.
Manchester, North West, United Kingdom Hybrid / WFH Options
Sanderson Recruitment
Job Title: PySpark Data Engineer - Azure Salary: up to £54,000 Mostly remote About Us: Join our innovative team where PySpark expertise meets Azure ingenuity. We're passionate about leveraging data to drive business success and looking for a skilled Data Engineer to join us in delivering high … impact solutions. Key Responsibilities: Develop secure, efficient data pipelines using PySpark for ingestion, transformation, and consumption within Azure. Ensure data quality and adherence to best practices throughout the pipeline life cycle. Design and optimise physical data models to meet business needs and storage requirements. Collaborate with cross-functional teams … to deliver BI solutions and reporting structures using Power BI. Experience & Qualifications: 2-5 years of experience in designing and implementing PySpark-based data solutions. Proficiency in SQL and Azure technologies (Data Factory, Synapse). Strong understanding of data life cycle management and CI/CD principles. Experience working …
verbal presentations. Investigate data trends and quality issues, using your initiative to suggest process improvements. Utilise statistical and analytical coding techniques, particularly in Python (PySpark) and R, ensuring compliance with statistical protocols to maximise trust and prevent unintended disclosure. Skills and Experience Required: Strong quantitative analysis skills, with experience …
Manchester, North West, United Kingdom Hybrid / WFH Options
Sanderson Recruitment
standards/architectural principles. Mentoring/coaching junior data engineers. Developing BI solutions including data marts/semantic layers/visualisations. Core Skills: Strong PySpark and SQL development skills. Experience with Azure Data Engineering. Ability to turn customer requests into actionable designs. Collaboration skills in an agile team environment.
Manchester, North West, United Kingdom Hybrid / WFH Options
Adria Solutions
across numerous sectors. They excel in innovation and leverage technology and data to achieve outstanding results. They seek a skilled Data Engineer with strong PySpark and SQL capabilities to join their vibrant technology and data teams. The Role: In this role, you'll build and implement data solutions for various … projects and ongoing products. You'll design and create data pipelines that adhere to technical specifications and data platform strategies. Key Skills: PySpark and SQL Development: Strong proficiency in PySpark and SQL, with a passion for advancing your data engineering career. Azure Data Engineering: Experience delivering data from source …