…on experience or strong interest in working with Foundry as a core platform
Forward-Deployed Engineering – delivering real-time solutions alongside users and stakeholders
Broader Skillsets of Interest:
Python & PySpark – for data engineering and workflow automation
Platform Engineering – building and maintaining scalable, resilient infrastructure
Cloud (AWS preferred) – deploying and managing services in secure environments
Security Engineering & Access Control – designing …
Central London, London, United Kingdom Hybrid / WFH Options
Gerrard White
…predictive modelling techniques; Logistic Regression, GBMs, Elastic Net GLMs, GAMs, Decision Trees, Random Forests, Neural Nets and Clustering
Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL)
A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science)
Experience of WTW's Radar and Emblem software is preferred
Proficient at communicating results in …
…teams as part of a wider trading project. The initial work on the project will involve abstracting code from these product teams into a shared, common Python library leveraging PySpark/DataFrames (see the sketch after this listing). You will then serve as an extension of these product teams, building microservices and libraries to solve their common needs.
Skills:
• Experience with Unit Testing
• Preferably …
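A minimal sketch of the kind of helper such a shared Python/PySpark library might expose, assuming one recurring step across the product teams is keeping the latest row per key; the function, column, and application names here are illustrative, not taken from the project.

# Illustrative shared-library helper; all names are hypothetical.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window


def latest_per_key(df: DataFrame, key_cols: list, ts_col: str) -> DataFrame:
    """Keep only the most recent row per key, a step many pipelines repeat."""
    w = Window.partitionBy(*key_cols).orderBy(F.col(ts_col).desc())
    return (
        df.withColumn("_rn", F.row_number().over(w))
          .filter(F.col("_rn") == 1)
          .drop("_rn")
    )


if __name__ == "__main__":
    spark = SparkSession.builder.appName("shared-lib-demo").getOrCreate()
    trades = spark.createDataFrame(
        [("T1", "2024-01-01", 100.0), ("T1", "2024-01-02", 101.5)],
        ["trade_id", "updated_at", "price"],
    )
    latest_per_key(trades, ["trade_id"], "updated_at").show()

Centralising helpers like this is what lets each product team retire its local copy and share one set of unit tests, which fits the unit-testing emphasis in the skills list.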
Azure Data Engineer - 1/2 days onsite
Summary: Join a team building a modern Azure-based data platform. This hands-on engineering role involves designing and developing scalable, automated data pipelines using tools like Data Factory, Databricks, Synapse, and …
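A rough, hypothetical illustration of one pipeline stage a Databricks job on such a platform might run; the paths, table name, and columns are assumptions, and the orchestration layer (Data Factory or Synapse pipelines triggering the job) sits outside this snippet.

# Illustrative Databricks/PySpark stage: land raw CSV files as a curated Delta table.
# All paths, names, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curate-orders").getOrCreate()

raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/raw/orders/")  # assumed landing path
)

curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("ingested_at", F.current_timestamp())
       .filter(F.col("order_total") > 0)
)

# Delta is the usual table format on Databricks; the target schema is assumed to exist.
curated.write.format("delta").mode("overwrite").saveAsTable("analytics.orders_curated")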
Data Analyst (SQL, PySpark, Python)
London - Hybrid (2 Days Onsite)
Inside IR35 - Daily Rate
Our client is seeking an experienced Data Analyst to support a key client engagement within their professional services division. This is an initial 1-year contract based in London, operating on a hybrid basis (2 days onsite per week) and falling inside IR35.
Key Responsibilities:
Independently manage data extraction, cleansing, and transformation activities
Develop production-grade analytics code using SQL, PySpark, and Python (see the sketch after this listing)
Analyze large-scale datasets to generate actionable insights and visualizations
Collaborate across technical and business teams to drive data-driven decision making
Required Experience:
3–5 years' hands-on experience in data analysis, insight generation, and data visualization
Proven expertise in SQL and PySpark, with solid working knowledge of Python
Background working with big data platforms, large datasets, and pipelines
Strong analytical thinking and a collaborative, self-directed working style
Experience in the Payments, FinTech, or financial services sector is highly desirable, though not mandatory
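A minimal, hypothetical sketch of the SQL-plus-PySpark analytics work the role describes; the payments table, column names, and filters are assumptions rather than details of the engagement.

# Illustrative cleanse-and-aggregate snippet combining the DataFrame API and Spark SQL.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("payments-insights").getOrCreate()

# Cleansing: drop obviously invalid rows from an assumed payments table.
payments = spark.table("raw.payments").filter(
    (F.col("amount") > 0) & F.col("merchant_id").isNotNull()
)
payments.createOrReplaceTempView("payments_clean")

# Transformation/insight: monthly volume and value per merchant, expressed in SQL.
monthly = spark.sql("""
    SELECT merchant_id,
           date_trunc('month', payment_ts) AS month,
           COUNT(*)    AS txn_count,
           SUM(amount) AS txn_value
    FROM payments_clean
    GROUP BY merchant_id, date_trunc('month', payment_ts)
""")

monthly.orderBy(F.desc("txn_value")).show(10)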