London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
integration. Proficient in SQL for complex analytical transformations and optimisations. Comfortable working in agile teams and using Azure DevOps for CI/CD workflows. Nice to Have Python or PySpark for automation and data quality testing. Knowledge of data governance and security frameworks (RBAC, masking, encryption). Prior experience in financial services or insurance environments. All candidates must complete …
and development plan beyond generic certifications. Provide a Rough Order of Magnitude (ROM) cost for implementing the proposed roadmap. Essential Deep expertise in the Databricks Lakehouse Platform, including Python, PySpark, and advanced SQL. Strong practical knowledge of Microsoft Fabric. Proven experience in senior, client-facing roles with a consultancy mindset. Background in technical coaching, mentorship, or skills assessment. Excellent …
data platforms, and integrations, while ensuring solutions meet regulatory standards and align with architectural best practices. Key Responsibilities: Build and optimise scalable data pipelines using Databricks and Apache Spark (PySpark). Ensure performance, scalability, and compliance. Collaborate on requirements, design, and backlog refinement. Promote engineering best practices including CI/CD, code reviews, and testing. Research and introduce new …
Reigate, Surrey, England, United Kingdom Hybrid/Remote Options
esure Group
and influence decisions. Strong understanding of data models and analytics; exposure to predictive modelling and machine learning is a plus. Proficient in SQL and Python, with bonus points for PySpark, SparkSQL, and Git. Skilled in data visualisation with tools such as Tableau or Power BI. Confident writing efficient code and troubleshooting sophisticated queries. Clear and adaptable communicator, able to …
data models and transformation pipelines using Databricks, Azure, and Power BI to turn complex datasets into reliable, insight-ready assets. You'll apply strong skills in SQL, Python, and PySpark to build efficient ELT workflows and ensure data quality, performance, and governance. Collaboration will be key as you partner with analysts and business teams to align data models with …
Studio and data transformation logic Azure Fabric, Azure Data Factory, Synapse, Data Lakes and Lakehouse/Warehouse technologies ETL/ELT orchestration for structured and unstructured data Proficiency in: PySpark, T-SQL, Notebooks and advanced data manipulation Performance monitoring and orchestration of Fabric solutions Power BI semantic models and Fabric data modelling DevOps deployment using ARM/Bicep templates …
high-impact systems. Line management or mentoring experience, with a genuine commitment to team growth and wellbeing. Strong hands-on skills in: AWS (or equivalent cloud platforms) Python/PySpark for data engineering and automation TypeScript, Node.js, React.js for full-stack development Solid grasp of distributed systems design, secure coding, and data privacy principles. Familiarity with fraud detection models …
explain commercial impact. Understanding of ML Ops vs DevOps and broader software engineering standards. Cloud experience (any platform). Previous mentoring experience. Nice to have: Snowflake or Databricks Spark, PySpark, Hadoop or similar big data tooling BI exposure (Power BI, Tableau, etc.) Interview Process Video call - high-level overview and initial discussion In-person technical presentation - based on a provided …
London, South East, England, United Kingdom Hybrid/Remote Options
Step 2 Recruitment LTD
PowerPoint presentations/reports and presenting to clients or colleagues Industry experience in the retail banking or wider financial services sector Additional technical experience in any of the following – PySpark, Microsoft Azure, VBA, HTML/CSS, JavaScript, jQuery, SQL, PHP, Power Automate, Power BI What we offer A highly competitive salary A genuinely compelling profit share scheme, with the …
for senior leadership as needed. 2. Technical Leadership AWS Expertise: Hands-on experience with AWS services, scalable data solutions, and pipeline design. Strong coding skills in Python, SQL, and PySpark. Optimize data platforms and enhance operational efficiency through innovative solutions. Nice to Have: Background in software delivery, with a solid grasp of CI/CD pipelines and DataOps …
money laundering, and financial crime across global platforms. The role includes direct line management of 5 engineers. We are looking for: Strong full-stack development skills in Python and PySpark, TypeScript (ideally also Node.js/React.js) and AWS (or another cloud provider) as a Technical Lead or Senior Engineer. Line management experience. The client is looking to offer up …
London, South East, England, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions Ltd
Strong expertise in Power BI – dashboarding, reporting, and data visualisation Advanced SQL skills for querying and data manipulation Experience with Databricks for scalable data processing Desirable Skills Familiarity with PySpark for distributed data processing
Oxford, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
Data Engineer Salary: Up to £75,000 I am working with a forward-thinking organisation that is modernising its data platform to support scalable analytics and business intelligence across the Group. With a strong focus on Microsoft technologies and cloud …
Reading, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
Data Engineer Salary: Up to £75,000 I am working with a forward-thinking organisation that is modernising its data platform to support scalable analytics and business intelligence across the Group. With a strong focus on Microsoft technologies and cloud …
London, South East, England, United Kingdom Hybrid/Remote Options
CODEVERSE LIMITED
flexible) Type: Permanent MANDATORY TECHNICAL SKILLS: Core Requirements: Databricks - Hands-on experience Snowflake - Practical knowledge Microsoft Fabric - Data Engineering & Data Visualization (includes Power BI under MS Fabric license) Python, PySpark & SQL - Strong expertise mandatory Note: MS Fabric includes Data Engineering and Power BI capabilities (sold under MS Fabric license), so experience with Fabric Data Engineering or Power BI development is essential. KEY POINTS: These are the most in-demand skills in the UK market currently Strong Python, PySpark, and SQL expertise is mandatory - please confirm candidates have solid hands-on experience Hybrid working with only 3-4 days per month in London office (very flexible remote) Looking for candidates with proven experience across all three platforms: Databricks, Snowflake …
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
and creating scalable AI workflows that bring real business impact. What you'll do: Deploy and customise a powerful data platform for global clients Build and optimise pipelines using PySpark, Python, and SQL Design scalable AI workflows with tools like Palantir Collaborate with client teams to deliver data-driven outcomes What we're looking for: 2-4 years' experience in data engineering or analytics Hands-on with PySpark, Python, and SQL A proactive problem-solver who thrives in a fast-moving startup Excellent communication and stakeholder skills Why join: £50,000-£75,000 + share options Hybrid working (2-3 days/week in Soho) Highly social, collaborative culture with regular events Work alongside top industry leaders shaping …
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
driven team of world-class technologists and business leaders working at the forefront of innovation and real-world impact. In this role, you will: Deliver complex data projects using PySpark and modern data tools Build scalable generative AI workflows using modern infrastructure Collaborate cross-functionally to ensure seamless delivery and adoption Drive innovation and continuous improvement across client engagements To be successful in this role, you will have: Proven experience in data engineering or integration Strong proficiency in Python and PySpark Exposure to generative AI platforms or a passion for building AI-powered solutions Ability to lead client delivery in dynamic, fast-paced environments Familiarity with tools like Airflow, Databricks or DBT is a plus What's on …
Location: Guildford – 5 days per week onsite What you’ll be doing: Design, build, and optimise scalable data pipelines and architecture Work hands-on with PySpark, Databricks, Azure, and Data Lake to deliver high-performance solutions Translate business needs into technical requirements and data-driven solutions Ensure best practice in data governance, quality, and security Manage and mentor engineers developing and scaling the data environment What you’ll need: Proven experience as a Data Engineer or Senior Data Engineer with some team leadership responsibilities Strong technical expertise with PySpark, Databricks, Azure, and Data Lake Deep understanding of data architecture and ETL processes in cloud environments The ability to balance coding and technical delivery with people management Excellent problem …
Seeking a hands-on data platform architect/engineer to reverse-engineer a legacy solution (currently on a VM) and migrate it to Microsoft Fabric. The goal is to stabilise critical data processes and lay the groundwork for a modular …