Data is the new Oil. Are you passionate about data? Does the prospect of dealing with massive volumes of data excite you? Do you want to build big-data solutions that process billions of records a day in a scalable …
… and manage data solutions that align with business needs and industry standards. The ideal candidate will have expertise in Java, SQL, Python, and Spark (PySpark & SparkSQL), while also being comfortable working with Microsoft Power Platform. Experience with Microsoft Purview is a plus. The role requires strong communication skills to … data standards. Key Responsibilities: 1. Data Architecture & Engineering: Design and implement scalable data architectures that align with business objectives. Work with Java, SQL, Python, PySpark, and SparkSQL to build robust data pipelines. Develop and maintain data models tailored to organizational needs. Reverse-engineer data models from existing live systems. …
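To make the PySpark and SparkSQL side of the stack concrete, here is a minimal pipeline sketch of the kind of work described above. It is illustrative only: the dataset, file paths, and column names are assumptions, not details from the advert.

```python
# Hypothetical sketch: a small PySpark + SparkSQL pipeline. Paths and columns are invented.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

# Ingest raw data (path is an assumption)
orders = spark.read.option("header", True).csv("/data/raw/orders.csv")

# Basic cleansing with the DataFrame API (PySpark)
clean = (
    orders
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("amount").cast("double") > 0)
)

# Expose the cleansed data to SparkSQL for downstream modelling
clean.createOrReplaceTempView("orders_clean")
daily_revenue = spark.sql("""
    SELECT date_trunc('day', order_ts) AS order_day,
           SUM(CAST(amount AS DOUBLE)) AS revenue
    FROM orders_clean
    GROUP BY date_trunc('day', order_ts)
""")

# Persist a modelled output for reporting
daily_revenue.write.mode("overwrite").parquet("/data/curated/daily_revenue")
```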
Greater London, England, United Kingdom Hybrid / WFH Options
trg.recruitment
Rate: Up to £600 per day 📆 Contract: 6 months (Outside IR35, potential to go perm) 🛠 Tech Stack: Azure Data Factory, Synapse, Databricks, Delta Lake, PySpark, Python, SQL, Event Hub, Azure ML, MLflow. We've partnered with a new AI-first professional services consultancy that's taking on the Big … and supporting team capability development. What You Need: ✔ 5+ years in data engineering or backend cloud development ✔ Strong Python, SQL, and Databricks skills (especially PySpark & Delta Lake) ✔ Deep experience with Azure: Data Factory, Synapse, Event Hub, Azure Functions ✔ Understanding of MLOps tooling like MLflow and integration with AI pipelines …
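As a rough illustration of the Databricks-style workflow this role describes (Delta Lake plus MLflow tracking), the sketch below writes a curated Delta table and logs a run. Paths, parameters, and metrics are assumptions for illustration, and it presumes a cluster with the Delta Lake and MLflow libraries available.

```python
# Hypothetical sketch: curate events into Delta Lake, then track a run with MLflow.
import mlflow
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("delta-mlflow-demo").getOrCreate()

# Curate raw events into a Delta table (source and target paths are assumptions)
events = spark.read.json("/mnt/raw/events")
curated = events.filter(F.col("event_type").isNotNull())
curated.write.format("delta").mode("overwrite").save("/mnt/curated/events")

# Track an experiment run; in practice model training would happen inside this block
with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("source_path", "/mnt/curated/events")
    mlflow.log_metric("row_count", curated.count())
```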
… of data for critical client projects. How to develop and enhance your knowledge of agile ways of working and of working in an open-source stack (PySpark/PySQL). Quality engineering professionals utilize Accenture delivery assets to plan and implement quality initiatives to ensure solution quality throughout delivery. As a … practices and contribute to data analytics insights and visualization concepts, methods, and techniques. We are looking for experience in the following skills: Palantir, Python, PySpark/PySQL, AWS or GCP. Set yourself apart: Palantir Certified Data Engineer, Certified cloud data engineering (preferably AWS). What's in it for you …
Senior Data Engineer - Wiltshire - 3 days in office - £65,000. About The Company: The company operates in both B2B and D2C markets, providing food solutions to institutions and individuals. With over 30 years of experience and a presence in 400 …
… of data for critical client projects. How to develop and enhance your knowledge of agile ways of working and of working in an open-source stack (PySpark/PySQL). Quality engineering professionals utilize Accenture delivery assets to plan and implement quality initiatives to ensure solution quality throughout delivery. As a … team members to provide regular progress updates and raise any risks/concerns/issues. Core skills we're working with include: Palantir, Python, PySpark/PySQL, AWS or GCP. What's in it for you: At Accenture, in addition to a competitive basic salary, you will also have …
… contract data engineers to supplement the existing team during the implementation phase of a new data platform. Main Duties and Responsibilities: Write clean and testable code using the PySpark and SparkSQL scripting languages to enable our customer data products and business applications. Build and manage data pipelines and notebooks, deploying code in a … Experience: Excellent understanding of Data Lakehouse architecture built on ADLS. Excellent understanding of data pipeline architectures using ADF and Databricks. Excellent coding skills in PySpark and SQL. Excellent technical governance experience such as version control and CI/CD. Strong understanding of designing, constructing, administering, and maintaining data warehouses …
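One way to read "clean and testable code" here is to keep transformations as pure functions over DataFrames so they can be unit tested without touching the lakehouse. The sketch below uses invented column names and ADLS paths as assumptions; it is not taken from the role itself.

```python
# Hypothetical sketch: a pure, unit-testable PySpark transformation in a lakehouse job.
from pyspark.sql import DataFrame, SparkSession, functions as F

def standardise_customers(raw: DataFrame) -> DataFrame:
    """Trim identifiers, normalise email case and drop rows without a customer id."""
    return (
        raw
        .withColumn("customer_id", F.trim("customer_id"))
        .withColumn("email", F.lower(F.col("email")))
        .filter(F.col("customer_id").isNotNull())
    )

if __name__ == "__main__":
    spark = SparkSession.builder.appName("customer-standardise").getOrCreate()
    # ADLS paths below are placeholders for a bronze-to-silver hop
    bronze = spark.read.format("delta").load("abfss://lake@account.dfs.core.windows.net/bronze/customers")
    silver = standardise_customers(bronze)
    silver.write.format("delta").mode("overwrite").save(
        "abfss://lake@account.dfs.core.windows.net/silver/customers"
    )
```

A unit test can then call standardise_customers on a small in-memory DataFrame built with spark.createDataFrame, which is what makes this style straightforward to cover in CI/CD.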
Senior Data Analyst - Pricing Data Engineering & Automation, CUO Global Pricing. Let's care for tomorrow. Whether it's aircraft, international business, offshore wind parks or Hollywood film productions, Allianz Commercial has an extensive range of risks covered when it comes …
Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
Corecom Consulting
… modern tech in a collaborative environment. What You'll Do: Design, build, and maintain scalable data pipelines. Optimize and automate data workflows. Work with PySpark, Python, and SQL to process and manage large datasets. Collaborate in a cloud-based environment to deliver efficient and reliable data solutions. What We're Looking For: Proven experience with Python, PySpark, and SQL. Strong understanding of data engineering principles and cloud infrastructure. Ability to work collaboratively and communicate technical concepts clearly. A passion for clean, efficient, and scalable code. Why Join Us? Supportive team environment with a strong focus on innovation. Opportunities …
Newcastle upon Tyne, Tyne & Wear Hybrid / WFH Options
Client Server
… e.g. Data Science, Mathematics, Statistics, Physics, Computer Science, Informatics or Engineering. You have strong experience with analytics and data manipulation software, e.g. R, Python, PySpark, SAS, SQL, SPSS. You take a consultative approach and have polished communication and stakeholder management skills. You're able to work independently and take … and inclusive environment. Health and wellbeing support. Volunteering opportunities. Pension. Apply now to find out more about this Data Scientist/Consultant (R, Python, PySpark, SAS, SQL, SPSS) opportunity. At Client Server we believe in a diverse workplace that allows people to play to their strengths and continually learn. …
London, South East England, United Kingdom Hybrid / WFH Options
Careerwise
… with data governance policies, industry regulations, and best practices to protect data privacy and security. Innovation: Stay up to date with the latest Databricks, PySpark, and Power BI features and industry trends to continuously improve our data capabilities, ensuring relevance to a Cyber Security Value Added Distributor. Qualifications: Education: Bachelor … of experience in data management, analytics, data architecture, and AI, with at least 5 years in a leadership role. Technical Skills: Proficiency in Databricks, PySpark, Knowledge Graphs, Neo4j (graph database and analytics), Power BI, SSRS, Azure Data Factory, and AI technologies. Strong understanding of data architecture, ETL processes, and …
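For a sense of how PySpark and a graph store such as Neo4j might sit together in a stack like this, here is a toy sketch that prepares an edge list and computes a simple dependency count before export. The "asset depends on asset" dataset is invented for illustration and is not part of the advert.

```python
# Hypothetical sketch: preparing graph data in PySpark ahead of loading it into a graph database.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("graph-prep").getOrCreate()

# Invented edge list: (source asset, target asset it depends on)
edges = spark.createDataFrame(
    [("web-portal", "auth-service"), ("web-portal", "orders-db"), ("auth-service", "users-db")],
    ["source", "target"],
)

# Simple graph metric: how many direct dependencies each node has
out_degree = edges.groupBy("source").agg(F.count("target").alias("dependency_count"))
out_degree.show()

# The edge list could then be exported as CSV and loaded into Neo4j (e.g. via LOAD CSV)
edges.write.mode("overwrite").option("header", True).csv("/tmp/graph/edges")
```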
Job Description: Engineering Lead - GCP Data. Key Responsibilities: 12+ years of overall IT experience with 10+ years in building data warehouse/data mart solutions. Experience in implementing an end-to-end data platform for analytics on cloud, from ingestion to …
Leeds, West Yorkshire, Yorkshire and the Humber, United Kingdom
Brio Digital
… with cross-functional teams to define requirements and deliver high-quality software solutions. Experience using TypeScript, React and Node. Implementing data processing pipelines using PySpark for efficient data analysis. Integrating with AWS services and utilizing cloud infrastructure for scalable and reliable applications. Conducting code reviews, debugging, and troubleshooting to … NHS. Strong knowledge of AWS services, particularly in designing and implementing solutions using services like Lambda, S3, EC2, and Redshift. Hands-on experience with PySpark for big data processing and analysis. Familiarity with database technologies such as PostgreSQL, MySQL, or MongoDB. Solid understanding of software development principles, design patterns … to developing innovative health tech solutions in a collaborative and dynamic environment. If you are a talented Python Developer with expertise in AWS and PySpark, we would love to hear from you. To apply, please submit your updated CV or email dom@briodigital.io for more information. Please note that …
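As a hedged illustration of the AWS-plus-PySpark work described, the sketch below reads raw Parquet from S3, aggregates it, and writes a curated extract back to S3, from where it could be loaded into Redshift. Bucket names and columns are invented assumptions, not details from the role.

```python
# Hypothetical sketch: PySpark aggregation over S3 data, ready for a downstream Redshift load.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("s3-aggregate").getOrCreate()

# s3a:// URIs assume the Hadoop S3A connector and AWS credentials are configured on the cluster
visits = spark.read.parquet("s3a://example-raw-bucket/visits/")

daily = (
    visits
    .withColumn("visit_date", F.to_date("visit_ts"))
    .groupBy("visit_date", "clinic_id")
    .agg(F.countDistinct("patient_id").alias("patients_seen"))
)

# Curated extract; Redshift could ingest this via COPY or query it via Spectrum
daily.write.mode("overwrite").parquet("s3a://example-curated-bucket/daily_visits/")
```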
Preston On The Hill, Cheshire, United Kingdom Hybrid / WFH Options
SF Recruitment
… just 1 day a week onsite at a modern North West HQ. Modern Tech Stack: Get hands-on with Power BI, Microsoft Fabric, Python (PySpark), and SQL. High-Impact Projects: From predictive analytics to performance optimisation, your work will drive measurable results across retail, supply chain, and beyond. Why … modern North West HQ (easily commutable from Cheshire, Liverpool, Manchester, Stoke). Modern Tech Stack: Get hands-on with Power BI, Microsoft Fabric, Python (PySpark), and SQL (no legacy reporting here). High-Impact Projects: From predictive analytics to performance optimisation, your work will drive measurable results across retail, supply …
Luton, England, United Kingdom Hybrid / WFH Options
easyJet
… literacy and capabilities within the team. Requirements of the Role: • Experience in a commercial environment, with a strong background in Python, SQL, and preferably PySpark. • Comprehensive understanding of the data science product lifecycle, from development to production. • Proven ability in building relationships, ownership, delivery, and developing talent. • Excellent communication skills, able to simplify complex concepts and influence without authority. • A collaborative spirit, valuing collective success over individual achievements. Essential Skills: • Expertise in Python, PySpark, SQL. • Strong experience in building and optimising machine learning models. • Demonstrated ability to lead initiatives and projects with minimal supervision. Benefits: Competitive base salary …
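To illustrate what "building and optimising machine learning models" can look like in PySpark, here is a minimal pyspark.ml pipeline tuned with cross-validation. The feature names, label column, data path, and parameter grid are assumptions for illustration only.

```python
# Hypothetical sketch: a small pyspark.ml pipeline optimised via cross-validated grid search.
from pyspark.ml import Pipeline
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.evaluation import BinaryClassificationEvaluator
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.tuning import CrossValidator, ParamGridBuilder
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("model-tuning").getOrCreate()
train = spark.read.parquet("/data/features/train")  # assumed feature table with a binary "label"

assembler = VectorAssembler(inputCols=["recency", "frequency", "monetary"], outputCol="features")
lr = LogisticRegression(featuresCol="features", labelCol="label")
pipeline = Pipeline(stages=[assembler, lr])

# Small grid over regularisation settings to "optimise" the model
grid = (ParamGridBuilder()
        .addGrid(lr.regParam, [0.01, 0.1])
        .addGrid(lr.elasticNetParam, [0.0, 0.5])
        .build())

cv = CrossValidator(estimator=pipeline,
                    estimatorParamMaps=grid,
                    evaluator=BinaryClassificationEvaluator(labelCol="label"),
                    numFolds=3)

best_model = cv.fit(train).bestModel
```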
… You'll work closely with clients to understand their customer behaviour through deep data analysis and predictive modelling. You'll leverage tools such as Python, PySpark, SQL, and Hadoop to build and deploy models that influence customer strategy across areas like propensity, churn, segmentation, and more. Key responsibilities include: Developing … considered for this role, you will ideally have: Solid experience working with customer data and applying predictive modelling techniques. Proficiency in SQL, Python/PySpark, and exposure to big data environments such as Hadoop. Commercial experience in the FMCG or retail space is highly desirable. Previous experience working in …
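As a sketch of the segmentation side of this work, the example below clusters customers with pyspark.ml KMeans. The behavioural features, data path, and cluster count are invented assumptions, not details from the role.

```python
# Hypothetical sketch: customer segmentation with KMeans on scaled behavioural features.
from pyspark.ml.clustering import KMeans
from pyspark.ml.feature import StandardScaler, VectorAssembler
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("customer-segmentation").getOrCreate()
customers = spark.read.parquet("/data/features/customers")  # assumed behavioural feature table

# Assemble and scale the behavioural features before clustering
assembler = VectorAssembler(inputCols=["spend_90d", "visits_90d", "basket_size"], outputCol="raw_features")
scaler = StandardScaler(inputCol="raw_features", outputCol="features")

assembled = assembler.transform(customers)
scaled = scaler.fit(assembled).transform(assembled)

# Assign each customer to one of k behavioural segments
kmeans = KMeans(k=5, seed=42, featuresCol="features", predictionCol="segment")
segments = kmeans.fit(scaled).transform(scaled)
segments.groupBy("segment").count().show()
```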
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
GMA Consulting
Data Science Manager - Bristol. THIS IS A HYBRID ROLE; YOU MUST BE ABLE TO COMMUTE TO BRISTOL. EXCELLENT BENEFITS PACKAGE AND WORKING ENVIRONMENT. The Company: The company is a leader in its field and is an Insurance business with an …