Permanent PySpark Jobs in the UK

1 to 25 of 80 Permanent PySpark Jobs in the UK

Senior Data Engineer

South East London, England, United Kingdom
Prism Digital
Job Description: Lead Azure Data Engineer | PySpark (Python) & Synapse | Tech for Good/Charity. Rate: £500-650 per day. Duration: 6-12 months. IR35: Outside. Location: Remote with occasional travel to London (once every two months max). Essential skills required: Azure – solid experience required of the Azure Data ecosystem; Python – ESSENTIAL as PySpark is … another governance tool experience. Familiar with building Catalogs and lineage. This is an urgent contract, so if you are interested apply ASAP. Lead Azure Data Engineer | PySpark (Python), Synapse, Data Lake | Tech for Good/Charity more »
Posted:

Lead Data Scientist

Exeter, England, United Kingdom
Staffworx Limited
Ability to optimise workflows and analysis for map-reduce processing; experience with BI software (Power BI, Tableau, Qlik Sense); any experience with data engineering, PySpark, Databricks, Delta Lakes beneficial; confident presenting complex problems in ways suitable to the target audience; experience leading or managing a small analysis team; familiarity working more »
Posted:

Senior Data Engineer

South East London, England, United Kingdom
Hybrid / WFH Options
MBN Solutions
understanding of Quality and Information Security principles. Experience with Azure and ETL tools such as ADF and Databricks. Advanced database and SQL skills, along with SQL, Python, PySpark, Spark SQL. Strong understanding of data model design and implementation principles. Data warehousing design patterns and implementation. Benefits: £50-£60k DOE. Mainly home based working. Twice a more »
Posted:

Senior IICS & IDMC Engineer (Data Engineer) - Liverpool - £90,000

Liverpool, England, United Kingdom
Tiger Resourcing Group
a week in the Liverpool office - rest remote**** Senior Data Engineer, Data, Data Modelling, Migration, ETL, ETL Tooling (Informatica IICS & IDMC), SQL, Python or PySpark, Agile, migrating data from on-prem to cloud. **** Informatica Cloud (IICS & IDMC) is essential - NOT PowerCenter**** A top insurance firm are looking for a … e.g. Scrum, SAFe) and tools (i.e. Jira, Azure DevOps). Data Engineer, Data, Data Modelling, Migration, ETL, ETL Tooling (Informatica IICS & IDMC), SQL, Python or PySpark, Agile, migrating data from on-prem to cloud. Seniority Level: Mid-Senior level. Industry: Insurance, Financial Services. Employment Type: Full-time. Job Functions: Information more »
Posted:

Senior Data Engineer

London Area, United Kingdom
Hybrid / WFH Options
HENI
scientific Python toolset. Our tech stack includes Airbyte for data ingestion, Prefect for pipeline orchestration, AWS Glue for managed ETL, along with Pandas and PySpark for pipeline logic implementation. We utilize Delta Lake and PostgreSQL for data storage, emphasizing the importance of data integrity and version control in our … testing frameworks. Proficiency in Python and familiarity with modern software engineering practices, including 12-factor, CI/CD, and Agile methodologies. Deep understanding of Spark (PySpark), Python (Pandas), orchestration software (e.g. Airflow, Prefect) and databases, data lakes and data warehouses. Experience with cloud technologies, particularly AWS Cloud services, with a more »
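As a rough sketch of the orchestration pattern this stack implies, the example below shows a Prefect flow wrapping a small Pandas transform. The file paths and column names (order_id, amount) are illustrative assumptions rather than details from the advert, and a production pipeline would more likely target Delta Lake or PostgreSQL than plain Parquet.

import pandas as pd
from prefect import flow, task

@task
def extract(source_path: str) -> pd.DataFrame:
    # Read raw records ingested upstream (e.g. by Airbyte) into a DataFrame.
    return pd.read_parquet(source_path)

@task
def transform(raw: pd.DataFrame) -> pd.DataFrame:
    # Example pipeline logic: drop incomplete rows and add a rounded column.
    clean = raw.dropna(subset=["order_id", "amount"]).copy()
    clean["amount_gbp"] = clean["amount"].round(2)
    return clean

@task
def load(clean: pd.DataFrame, target_path: str) -> None:
    # Persist the curated output; a real pipeline might write to Delta Lake
    # or PostgreSQL instead of plain Parquet.
    clean.to_parquet(target_path, index=False)

@flow
def orders_pipeline(source_path: str = "raw/orders.parquet",
                    target_path: str = "curated/orders.parquet") -> None:
    load(transform(extract(source_path)), target_path)

if __name__ == "__main__":
    orders_pipeline()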
Posted:

Data Engineer

London Area, United Kingdom
Harrington Starr
start interviewing ASAP. Responsibilities: Azure cloud data engineering using Azure Databricks; data warehousing; data engineering. Very strong with the Microsoft stack; ESSENTIAL knowledge of PySpark clusters; Python & C# scripting experience; experience of message queues (Kafka); experience of containerization (Docker); FINANCIAL SERVICES EXPERIENCE (Energy/commodities trading). If you have more »
Posted:

Python - Spark Big Data Software Engineer

Glasgow, Scotland, United Kingdom
Morgan McKinley
Proficiency in modern programming languages and database querying languages. Comprehensive understanding of the Software Development Life Cycle and Agile methodologies. Familiarity with Python or PySpark and cloud technologies like AWS, Kubernetes, and Spark is highly desirable. Ideal Candidate: Someone with a knack for innovative solutions and a commitment to more »
Posted:

Engineering Delivery Lead (AWS/Snowflake/Python/Scala) - UK based - 6 Month contract + extensions - £450-550/day + negotiable

England, United Kingdom
Orbis Group
experience leading a data engineering team. Key Tech: - AWS (S3, Glue, EMR, Athena, Lambda) - Snowflake, Redshift - DBT (Data Build Tool) - Programming: Python, Scala, Spark, PySpark or Ab Initio - Data pipeline orchestration (Apache Airflow) - Knowledge of SQL. This is a 6-month initial contract with a trusted client of ours. more »
Posted:

Machine Learning Engineer

London Area, United Kingdom
Harnham
in STEM subjects. Strong experience in data pipelines and deploying ML models. Preference for experience in retail/marketing. Tech across: Python, AWS, Databricks, PySpark, A/B testing, MLflow, APIs. Experience in feature engineering and third-party data. Apply below more »
Posted:

Lead Data Engineer with Palantir Foundry

United Kingdom
Saransh Inc
such as Code Repo, Code Workbook, Pipeline Build, migration techniques, Data Connection and security setup. Design and develop data pipelines, with excellent skills in PySpark and Spark SQL, hands-on with code build and deployment in Palantir. Must lead a team of 6-7 technical associates with PySpark more »
Posted:

Data Engineer

Bournemouth, Dorset, South West, United Kingdom
Hybrid / WFH Options
LV= General Insurance
experience of writing clean, well-documented, and unit-tested Python ETL processes to read, transform, and verify data using tools such as SQL and PySpark. Cloud technologies and architecture including Databricks, Kubernetes, Data/Delta Lake, Azure Machine Learning, Data Factory. Azure is preferable, but AWS and GCP experience more »
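As a loose illustration of the "unit-tested Python ETL" requirement above, here is a minimal sketch of a PySpark transform with a pytest-style test. The policy_id and premium column names are invented for the example, not taken from the advert.

from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F

def clean_policies(df: DataFrame) -> DataFrame:
    # Keep rows that have a policy number and cast the premium to a number.
    return (
        df.filter(F.col("policy_id").isNotNull())
          .withColumn("premium", F.col("premium").cast("double"))
    )

def test_clean_policies_drops_null_ids():
    spark = (
        SparkSession.builder.master("local[1]").appName("etl-tests").getOrCreate()
    )
    raw = spark.createDataFrame(
        [("P1", "100.50"), (None, "90.00")], ["policy_id", "premium"]
    )
    result = clean_policies(raw)
    assert result.count() == 1                         # null policy_id dropped
    assert dict(result.dtypes)["premium"] == "double"  # premium cast to double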
Employment Type: Permanent, Work From Home
Posted:

Data Engineer

Derby, England, United Kingdom
Mirai Talent
related field. Certifications such as Azure Data Engineer Associate are desirable. Knowledge of data ingestion methods for real-time and batch processing. Proficiency in PySpark and debugging Apache Spark workloads. What’s in it for you? Annual bonus scheme – up to 10%; excellent pension scheme; flexible working; enhanced family more »
Posted:

Senior Data Engineer

London Area, United Kingdom
Hybrid / WFH Options
MBN Solutions
Quality and Information Security principles. Experience with Azure and ETL tools such as ADF and Databricks. Advanced database and SQL skills, along with SQL, Python, PySpark, Spark SQL. Strong understanding of data model design and implementation principles. Data warehousing design patterns and implementation. Benefits: £50-£60k DOE. Mainly home based more »
Posted:

Data Engineer

North Yorkshire, England, United Kingdom
KDR Talent Solutions
a complete greenfield re-architecture from the ground up in Microsoft Azure. The tech you'll be playing with: Azure Data Factory, Azure Databricks, PySpark, SQL, DBT. What you need to bring: 1-3 years' experience in building data pipelines; SQL experience in data warehousing; Python experience would more »
Posted:

Software Developer

London Area, United Kingdom
Hybrid / WFH Options
Janus Henderson Investors
and ensuring best practices are understood and followed. Technical Skills and Qualifications: Expert knowledge of Python, including libraries/frameworks such as pandas, NumPy, PySpark. Good understanding of OOP, software design patterns, and SOLID principles. Good experience in Docker, Linux, and Airflow. Good knowledge more »
Posted:

Data Engineer

London Area, United Kingdom
Harrington Starr
of Python. Experience developing in the cloud (AWS preferred). Solid understanding of libraries like Pandas and NumPy. Experience in data warehousing tools like Snowflake, PySpark, Databricks. Commercial experience with performant database programming in SQL. Capability to solve complex technical issues, understanding risks before they arise. Please apply today more »
Posted:

Data Analytics Consultant

England, United Kingdom
Hybrid / WFH Options
Primus Connect
snowflake schemas. Knowledge of DevOps practices within a Power BI environment. Familiarity with Microsoft Fabric & Databricks. SQL databases expertise, data engineering with Python and PySpark, and knowledge of geospatial concepts and tools. As part of this engagement, you will work on initiatives that redefine business efficiency through AI. You more »
Posted:

Data Engineer

London Area, United Kingdom
Hybrid / WFH Options
Durlston Partners
building a massively distributed cloud-hosted data platform that would be used by the entire firm. Tech Stack: Python/Java, Dremio, Snowflake, Iceberg, PySpark, Glue, EMR, dbt and Airflow/Dagster. Cloud: AWS, Lambdas, ECS services. This role would focus on various areas of Data Engineering including: End more »
Posted:

Data Engineer

London Area, United Kingdom
Stanbrook Consulting
or similar frameworks. Problem-Solving: strong problem-solving skills and the ability to think quickly in a dynamic environment. Technical Skills: proficiency in Python, PySpark, Synapse, Databricks; strong SQL skills; CI/CD practices; pytest or similar. Desirable Skills and Experience: Azure/Cloud services; Jenkins, Octopus, and Git. more »
Posted:

Data Analyst

Harlow, England, United Kingdom
AutoProtect
to the ideas and delivery of the strategy; support data queries in SQL (T-SQL/ANSI-SQL) and support data pipelines using PySpark/Python, Databricks and AWS (Athena, Glue, S3); analyse data needs and coordinate new data requests and data change requests. Work with clients to … Pivot Charts). Experience of supporting data warehousing. Basic SQL experience and understanding of XML/JSON files. Basic knowledge/experience of either Python, PySpark, R, Scala etc. Experience using SQL, Power BI, Tableau or similar tools. Preferred: knowledge and experience of Financial Systems Support (Access Dimensions) or ServiceNow Support … and Administration. Strong knowledge of using SQL, Power BI, Tableau etc. Strong knowledge of using Python, PySpark, R, Scala etc. Experience of supporting IT applications and/or platforms. Experience of cloud data solutions (AWS, Google, Microsoft Azure), AWS preferred. Degree in Business Analytics or Technology, Computer Science, Math, Statistics more »
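For context on the kind of pipeline-support task described above, the sketch below loads JSON records with PySpark and answers a simple query in SQL. The bucket path and column names are placeholders, and reading an s3:// path assumes the runtime (for example Glue or Databricks) already has the S3 connector configured.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("claims-support-query").getOrCreate()

# claim_date is assumed to be an ISO yyyy-MM-dd string in the source JSON.
claims = spark.read.json("s3://example-bucket/claims/2024/*.json")
claims.createOrReplaceTempView("claims")

monthly_totals = spark.sql("""
    SELECT substring(claim_date, 1, 7) AS month,
           COUNT(*)                    AS claim_count,
           SUM(claim_amount)           AS total_amount
    FROM claims
    GROUP BY substring(claim_date, 1, 7)
    ORDER BY month
""")
monthly_totals.show()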
Posted:

Senior Data Engineer

London Area, United Kingdom
Prism Digital
Lead Azure Data Engineer | PySpark (Python) & Synapse | Tech for Good/Charity. Rate: £500-650 per day. Duration: 6-12 months. IR35: Outside. Location: Remote with occasional travel to London (once every two months max). Essential skills required: Azure – solid experience required of the Azure Data ecosystem; Python – ESSENTIAL … as PySpark is used heavily (you will be tested on PySpark); Azure Synapse – ESSENTIAL as it is used heavily; Spark; Azure Data Lake/Databricks/Data Factory; be happy to act as a lead and mentor to the other permanent Azure Data Engineers. This is the chance … tool experience. Familiar with building Catalogs and lineage. This is an urgent contract, so if you are interested apply ASAP. Lead Azure Data Engineer | PySpark (Python), Synapse, Data Lake | Tech for Good/Charity more »
Posted:

Senior Data Engineer

Greater Manchester, England, United Kingdom
Mexa Solutions
Senior Data Engineer. Up to £70k plus bonus. Manchester. Are you looking to take your Data Engineering career to the next level? This company uses extremely modern technologies, and you can be certain you will grow within a technical environment. more »
Posted:

Spark Architect

Leeds, England, United Kingdom
PRACYVA
months to begin with & it's extendable. Location: Leeds, UK (min 3 days onsite). Context: Legacy ETL code (for example DataStage) is being refactored into PySpark using Prophecy low-code/no-code and available converters. Converted code is causing failures/performance issues. Skills: Spark Architecture – component understanding around Spark Data Integration (PySpark … SQL, Spark Explain plans. Spark SME – be able to analyse Spark code failures through Spark plans and make correcting recommendations. Spark SME – be able to review PySpark and Spark SQL jobs and make performance improvement recommendations. Spark SME – be able to understand DataFrames/Resilient Distributed Datasets and understand any … there are cluster-level failures. Cloudera (CDP) – knowledge of how Cloudera Spark is set up and how the runtime libraries are used by PySpark code. Prophecy – high-level understanding of the low-code/no-code Prophecy setup and its use to generate PySpark code. more »
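As a brief illustration of the plan-analysis work this role describes, the sketch below generates a formatted physical plan for a small PySpark job, which is the usual starting point for spotting shuffles, join-strategy issues or scan problems in converted code. The table paths and column names are illustrative, not taken from the advert.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("plan-review").getOrCreate()

# Two tables produced by the converted pipeline (paths are illustrative).
orders = spark.read.parquet("/data/refactored/orders")
customers = spark.read.parquet("/data/refactored/customers")

joined = (
    orders.join(customers, "customer_id")
          .groupBy("region")
          .agg(F.sum("order_value").alias("total_value"))
)

# 'formatted' mode (Spark 3+) prints the physical plan with per-operator
# details, which is where exchanges (shuffles), join strategies and scan
# pruning problems show up when hunting performance regressions.
joined.explain(mode="formatted")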
Posted:

Data Engineer

Bristol, Avon, South West, United Kingdom
Hybrid / WFH Options
LV= General Insurance
experience of writing clean, well-documented, and unit-tested Python ETL processes to read, transform, and verify data using tools such as SQL and PySpark. Cloud technologies and architecture including Databricks, Kubernetes, Data/Delta Lake, Azure Machine Learning, Data Factory. Azure is preferable, but AWS and GCP experience more »
Employment Type: Permanent, Work From Home
Posted:

Data Engineer

Birmingham, England, United Kingdom
Hybrid / WFH Options
Lorien
and experience of this is a strong preference. However, other cloud platforms like AWS/GCP are acceptable. Coding – experience using Python with data (Pandas, PySpark) would be an advantage. Other languages such as C# would be beneficial but not essential. What’s next? If you believe you have the desired more »
Posted:
PySpark – salary percentiles for permanent roles (UK):
10th Percentile: £52,500
25th Percentile: £57,500
Median: £70,000
75th Percentile: £87,500
90th Percentile: £100,000