months to begin with and it's extendable. Location: Leeds, UK (min 3 days onsite) Context: Legacy ETL code (for example, DataStage) is being refactored into PySpark using Prophecy's low-code/no-code platform and available converters. The converted code is causing failures and performance issues. Skills: Spark Architecture – component understanding around Spark … Data Integration (PySpark, scripting, variable setting etc.), Spark SQL, Spark Explain plans. Spark SME – Able to analyse Spark code failures through Spark Plans and make corrective recommendations. Spark SME – Able to review PySpark and Spark SQL jobs and make performance improvement recommendations. Spark SME – Able … are Cluster-level failures. Cloudera (CDP) – Understanding of how Cloudera Spark is set up and how the runtime libraries are used by PySpark code. Prophecy – High-level understanding of the low-code/no-code Prophecy set-up and its use to generate PySpark code. more »
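The plan-analysis skill the listing above describes can be sketched in miniature: a hypothetical helper that scans the text of a Spark physical plan (as printed by `df.explain()`) for operators that commonly signal shuffle-heavy stages. The sample plan, operator list, and function name are assumptions for illustration, not output from Prophecy or a Cloudera cluster.

```python
# Minimal sketch: flag shuffle-heavy operators in a Spark physical plan.
# The sample plan text below is illustrative, not real converter output.

COSTLY_OPERATORS = ("Exchange", "SortMergeJoin", "CartesianProduct")

def find_hotspots(plan_text: str) -> list[str]:
    """Return plan lines naming operators that usually imply a shuffle."""
    return [
        line.strip()
        for line in plan_text.splitlines()
        if any(op in line for op in COSTLY_OPERATORS)
    ]

sample_plan = """\
== Physical Plan ==
*(5) Project [id#0L, total#12]
+- SortMergeJoin [id#0L], [id#5L], Inner
   :- Exchange hashpartitioning(id#0L, 200)
   :  +- Scan parquet orders
   +- Exchange hashpartitioning(id#5L, 200)
      +- Scan parquet customers
"""

hotspots = find_hotspots(sample_plan)
```

In practice the same reading is done by eye: two `Exchange` nodes feeding a `SortMergeJoin` suggest checking partition counts and whether one side is small enough to broadcast.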
Basingstoke, England, United Kingdom Hybrid / WFH Options
Blatchford
with various databases, e.g. MS SQL, Azure Cosmos DB. Skilled at optimizing large, complex SQL statements. Proficiency in Python and experience with PySpark. Experience using CI/CD, Microsoft Azure, and Azure DevOps in an agile environment. Knowledge of Azure ETL services, i.e. Data Factory, Synapse more »
Cycle · Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security Preferred qualifications, capabilities, and skills: · Skilled with Python or PySpark · Exposure to cloud technologies (Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka) · Experience with Big Data solutions or relational DBs · Experience in the Financial Services industry is more »
Surrey, England, United Kingdom Hybrid / WFH Options
Hawksworth
that are comfortable with terms like: Statistical Models, Computer Vision, Predictive Analytics, Data Visualization, Large Language Models (LLM), NLP, AI, Machine Learning, MLOps, Python, PySpark, and Azure. Flexible Working: The role is 60% work from home. Sponsorship: Sadly, sponsorship isn't available for these roles. About You: Bachelor's Degree more »
a complete greenfield re-architecture from the ground up in Microsoft Azure. The tech you'll be playing with: Azure Data Factory, Azure Databricks, PySpark, SQL, DBT. What you need to bring: 1-3 years' experience in building Data Pipelines; SQL experience in data warehousing; Python experience would more »
Guildford, England, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
Data Engineer We're partnering with a global financial services institution that is embarking on an ambitious project: to build an industry-leading data platform. The long-term vision is to migrate all data within the coming years more »
Newcastle Upon Tyne, Tyne and Wear, North East, United Kingdom Hybrid / WFH Options
DWP Digital
currently on-prem, but the direction of travel is cloud engineering, and you'll be executing code across the following tech stack: Azure, Databricks, PySpark and Pandas. DWP Digital is a great place to work; we offer a supportive and inclusive environment where you can grow your career and … make a real difference. Essential criteria: Commercial experience of Databricks, PySpark and Pandas Commercial experience of Azure data engineering tools such as Azure Data Factory, dedicated SQL pools and ADLS Gen 2 Experience of working with data lakes An understanding of dimensional modelling Details. Wages. Perks. You'll join more »
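The "understanding of dimensional modelling" criterion above can be illustrated with the smallest possible star schema: a fact table joined to a dimension table, then aggregated by a descriptive attribute. Pandas (one of the listed stack components) is used here for a self-contained sketch; the table and column names are invented for the example.

```python
import pandas as pd

# Illustrative star schema: a fact table joined to a dimension table.
# Table and column names are invented for this sketch.
fact_claims = pd.DataFrame(
    {"claim_id": [1, 2, 3], "region_key": [10, 20, 10], "amount": [250.0, 90.0, 40.0]}
)
dim_region = pd.DataFrame(
    {"region_key": [10, 20], "region_name": ["North East", "North West"]}
)

# Join fact to dimension on the surrogate key, then aggregate by the
# descriptive attribute - the basic query pattern star schemas serve.
report = (
    fact_claims.merge(dim_region, on="region_key", how="left")
    .groupby("region_name", as_index=False)["amount"]
    .sum()
)
```

The same join/aggregate shape carries over directly to PySpark DataFrames on Databricks, which is why the criteria list pairs the two.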
Knowledge of Spark architecture and modern Data Warehouse/Data Lake/Lakehouse techniques Build transformation tables using SQL. Moderate-level knowledge of Python/PySpark or an equivalent programming language. Power BI Data Gateways, DataFlows and permissions. Creation, utilisation, optimisation and maintenance of Relational SQL and NoSQL databases. Experienced working with more »
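The "build transformation tables using SQL" item above is the classic `CREATE TABLE ... AS SELECT` pattern. The sketch below runs it against SQLite purely so the example is self-contained; in the role described it would be Spark SQL or a warehouse engine, and all table and column names here are invented.

```python
import sqlite3

# Sketch of a SQL transformation table: derive a filtered, aggregated table
# from a raw one with CREATE TABLE ... AS SELECT. SQLite stands in for the
# warehouse engine; table and column names are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, status TEXT, value REAL);
    INSERT INTO raw_orders VALUES
        (1, 'complete',  100.0),
        (2, 'cancelled',  50.0),
        (3, 'complete',   25.0);

    -- Transformation table: completed orders only, totalled per status.
    CREATE TABLE orders_summary AS
        SELECT status, COUNT(*) AS n, SUM(value) AS total
        FROM raw_orders
        WHERE status = 'complete'
        GROUP BY status;
""")
row = conn.execute("SELECT status, n, total FROM orders_summary").fetchone()
```

Materialising the result as its own table (rather than re-running the SELECT) is what distinguishes a transformation table from an ad-hoc query.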
Manchester, North West, United Kingdom Hybrid / WFH Options
DWP Digital
currently on-prem, but the direction of travel is cloud engineering, and you'll be executing code across the following tech stack: Azure, Databricks, PySpark and Pandas. DWP Digital is a great place to work; we offer a supportive and inclusive environment where you can grow your career and … make a real difference. Essential criteria: Commercial experience of Databricks, PySpark and Pandas Commercial experience of Azure data engineering tools such as Azure Data Factory, dedicated SQL pools and ADLS Gen 2 Experience of working with data lakes An understanding of dimensional modelling Details. Wages. Perks. You'll join more »
Python, PySpark, AWS, Oracle, Kafka, Banking Become a key member of an agile team designing and delivering a market-leading, secure and scalable reporting product. The team is migrating from Oracle to AWS cloud, so experience with either is nice to have. You serve as a seasoned member of … candidates able to work under these requirements without visa sponsorship can be considered at this time. Required skills: - Either Python or PySpark experience. Ideally, you will be working as either a Python Developer (3+ years) or a Data Engineer using PySpark - Either AWS or Oracle. Candidates more »
Proficiency in modern programming languages and database querying languages. Comprehensive understanding of the Software Development Life Cycle and Agile methodologies. Familiarity with Python or PySpark and cloud technologies like AWS, Kubernetes, and Spark is highly desirable. Ideal Candidate: Someone with a knack for innovative solutions and a commitment to more »
STEM subject, e.g. Mathematics, Statistics or Computer Science Experience in personalisation and segmentation with a focus on CRM Retail experience is a bonus Experience with PySpark/Azure/Databricks is a bonus Experience of management NEXT STEPS: If this role is of interest, please reach out to Joseph Gregory. more »
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Synchro
contract position. If you possess a solid background in software application development, with experience in cloud or microservice architecture, and proficiency in Python or PySpark, we encourage you to apply. Python App Developer Requirements: Proficiency with Python or PySpark. Exposure to cloud technologies such as Airflow, Astronomer, Kubernetes, AWS more »
Blackpool, Lancashire, North West, United Kingdom Hybrid / WFH Options
DWP Digital
currently on-prem, but the direction of travel is cloud engineering, and you'll be executing code across the following tech stack: Azure, Databricks, PySpark and Pandas. DWP Digital is a great place to work; we offer a supportive and inclusive environment where you can grow your career and … make a real difference. Essential criteria: Commercial experience of Databricks, PySpark and Pandas Commercial experience of Azure data engineering tools such as Azure Data Factory, dedicated SQL pools and ADLS Gen 2 Experience of working with data lakes An understanding of dimensional modelling Details. Wages. Perks. You'll join more »
Sheffield, South Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
DWP Digital
currently on-prem, but the direction of travel is cloud engineering, and you'll be executing code across the following tech stack: Azure, Databricks, PySpark and Pandas. DWP Digital is a great place to work; we offer a supportive and inclusive environment where you can grow your career and … make a real difference. Essential criteria: Commercial experience of Databricks, PySpark and Pandas Commercial experience of Azure data engineering tools such as Azure Data Factory, dedicated SQL pools and ADLS Gen 2 Experience of working with data lakes An understanding of dimensional modelling Details. Wages. Perks. You'll join more »
South West London, London, United Kingdom Hybrid / WFH Options
DWP Digital
currently on-prem, but the direction of travel is cloud engineering, and you'll be executing code across the following tech stack: Azure, Databricks, PySpark and Pandas. DWP Digital is a great place to work; we offer a supportive and inclusive environment where you can grow your career and … make a real difference. Essential criteria: Commercial experience of Databricks, PySpark and Pandas Commercial experience of Azure data engineering tools such as Azure Data Factory, dedicated SQL pools and ADLS Gen 2 Experience of working with data lakes An understanding of dimensional modelling Details. Wages. Perks. You'll join more »
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
DWP Digital
currently on-prem, but the direction of travel is cloud engineering, and you'll be executing code across the following tech stack: Azure, Databricks, PySpark and Pandas. DWP Digital is a great place to work; we offer a supportive and inclusive environment where you can grow your career and … make a real difference. Essential criteria: Commercial experience of Databricks, PySpark and Pandas Commercial experience of Azure data engineering tools such as Azure Data Factory, dedicated SQL pools and ADLS Gen 2 Experience of working with data lakes An understanding of dimensional modelling Details. Wages. Perks. You'll join more »
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
DWP Digital
currently on-prem, but the direction of travel is cloud engineering, and you'll be executing code across the following tech stack: Azure, Databricks, PySpark and Pandas. DWP Digital is a great place to work; we offer a supportive and inclusive environment where you can grow your career and … make a real difference. Essential criteria: Commercial experience of Databricks, PySpark and Pandas Commercial experience of Azure data engineering tools such as Azure Data Factory, dedicated SQL pools and ADLS Gen 2 Experience of working with data lakes An understanding of dimensional modelling Details. Wages. Perks. You'll join more »
Northamptonshire, England, United Kingdom Hybrid / WFH Options
Capgemini
warehousing solutions. Familiarity with DBT (Data Build Tool) for managing transformations in the data pipeline. Strong programming skills in technologies like Python, Scala, Spark, PySpark or Ab Initio, plus Glue, Starburst, Snowflake, Redshift and hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline more »
Knutsford, England, United Kingdom Hybrid / WFH Options
Capgemini
warehousing solutions. Familiarity with DBT (Data Build Tool) for managing transformations in the data pipeline. Strong programming skills in technologies like Python, Scala, Spark, PySpark or Ab Initio, plus Glue, Starburst, Snowflake, Redshift and hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline more »
Liverpool office 2-3 days a week - rest remote* Senior Data Engineer, Data, Data Modelling, Migration, ETL, ETL Tooling (Informatica IICS & IDMC), SQL, Python or PySpark, Agile, migrating data from on-prem to cloud. *Informatica (IICS & IDMC) is essential* A top Reinsurance firm are looking for a Senior/Lead … cloud and tooling - Informatica and Azure are highly desired. Data Engineer, Data, Data Modelling, Migration, ETL, ETL Tooling (Informatica IICS & IDMC), SQL, Python or PySpark, Agile, migrating dat... more »
Saffron Walden, Essex, South East, United Kingdom Hybrid / WFH Options
EMBL-EBI
expertise and requirements. You have: A BSc or MSc in Computer Science or a related field. Expertise in Python, including popular libraries (NumPy, Pandas, PySpark) and frameworks (Django, Django REST Framework, FastAPI). Hands-on experience with both relational (e.g. PostgreSQL) and non-relational databases (e.g. Elasticsearch, Redis). Strong more »
related field Certifications such as Azure Data Engineer Associate are desirable. Knowledge of data ingestion methods for real-time and batch processing Proficiency in PySpark and debugging Apache Spark workloads. What’s in it for you? Annual bonus scheme – up to 10% Excellent pension scheme Flexible working Enhanced family more »