Responsibilities: Engage in the validation and approval sign-off of the firm's models across Liquidity Risk, Market Risk, and Counterparty Risk. Challenge model assumptions, implementations, and mathematical formulations. Review and oversee the monitoring of model performance, including outcomes analysis, verification, and benchmarking. Understand and communicate the risks of model limitations to senior management. Requirements … Education: PhD/Masters in a finance/mathematical/quantitative field. Prior Experience: 3-5 years in model validation of liquidity/market/counterparty risk models. Knowledge: Strong understanding of and experience working with ILST/VaR models. Technical: Python …
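The ILST/VaR experience requested above centres on measures like historical-simulation VaR. A minimal, hedged Python sketch of the one-day measure on synthetic P&L (all figures and names here are illustrative, not firm data):

```python
import numpy as np

# Hypothetical daily P&L series for illustration only.
rng = np.random.default_rng(0)
pnl = rng.normal(loc=0.0, scale=1_000_000, size=2500)  # ~10 years of daily P&L

def historical_var(pnl: np.ndarray, confidence: float = 0.99) -> float:
    """One-day historical-simulation VaR: the loss threshold exceeded on only
    (1 - confidence) of historical days, reported as a positive number."""
    return -np.percentile(pnl, 100 * (1 - confidence))

var_99 = historical_var(pnl, 0.99)
```

Benchmarking and outcomes analysis in validation would then compare such a figure against realised P&L exceedances over time.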
City of London, London, United Kingdom Hybrid / WFH Options
Albion Blake
gather requirements, design data flows, and help implement pricing models, APIs and analytical capabilities that drive better underwriting decisions. Key Responsibilities: Define and document requirements for pricing system integrations, model deployment and API workflows. Translate pricing models (Excel/Python/R) into technical specifications for developers and data engineers. Collaborate with underwriters and actuaries to improve model governance, version control and reporting. Support UAT, model validation and release testing. Deliver insights and metrics around model performance, quote success and underwriting efficiency. Partner with insurers and MGAs to streamline pricing and underwriting processes through automation and data. Required Skills & Experience: 3+ years in a Business Analyst or Technical Analyst role within insurance, reinsurance or … InsurTech. Experience with pricing or rating systems, underwriting workflows or model integration projects. Strong understanding of insurance data structures, rating factors and model governance. Skilled in SQL and BI tools (Power BI, Tableau, Looker). Comfortable working with APIs, data pipelines and collaborating with developers. Excellent communication and stakeholder management skills. Desirable: Exposure to cloud environments (AWS, Azure …
scale machine learning systems that forecast demand, optimise staffing, and improve operational performance across thousands of venues. Lead projects end-to-end, from data design and modelling through to validation, deployment, and monitoring. Develop AI systems across areas such as computer vision, forecasting, optimisation, and emerging generative or agentic models. Partner with engineers to design scalable ML pipelines, APIs …
London (City of London), South East England, United Kingdom
Photon
role centers on evaluating analytical workflows, modeling standards, experimentation culture, and applied business impact. This position is ideal for someone with a strong background in applied data science, model lifecycle design, and organizational data maturity, capable of analyzing current practices and defining what best-in-class looks like for scalable, responsible, and high-impact data science operations. Key … Responsibilities Practice Maturity Assessment: Evaluate current data science processes, tools, and team structures to determine capability strengths, weaknesses, and improvement areas. Framework Design: Develop and apply a structured maturity model to assess how data science work is conceived, executed, validated, and scaled. Model Lifecycle Review: Assess practices across data preparation, feature engineering, model development, validation, monitoring … Collaboration & Alignment: Work with AI and Data & AI Architects to connect findings from people, platform, and practice assessments into a unified capability map. Gap Identification: Identify gaps in model governance, documentation, and model-to-business translation and recommend actionable improvement pathways. Reporting & Advisory: Produce detailed reports summarizing data science maturity, practice gaps, and recommendations for scaling responsibly …
London (City of London), South East England, United Kingdom
E-Solutions
role centers on evaluating analytical workflows, modeling standards, experimentation culture, and applied business impact. This position is ideal for someone with a strong background in applied data science, model lifecycle design, and organizational data maturity, capable of analyzing current practices and defining what best-in-class looks like for scalable, responsible, and high-impact data science operations. Key … Responsibilities Practice Maturity Assessment: Evaluate current data science processes, tools, and team structures to determine capability strengths, weaknesses, and improvement areas. Framework Design: Develop and apply a structured maturity model to assess how data science work is conceived, executed, validated, and scaled. Model Lifecycle Review: Assess practices across data preparation, feature engineering, model development, validation, monitoring … Collaboration & Alignment: Work with AI and Data & AI Architects to connect findings from people, platform, and practice assessments into a unified capability map. Gap Identification: Identify gaps in model governance, documentation, and model-to-business translation and recommend actionable improvement pathways. Reporting & Advisory: Produce detailed reports summarizing data science maturity, practice gaps, and recommendations for scaling responsibly …
City of London, London, United Kingdom Hybrid / WFH Options
PIXIE
migration projects, ensuring data accuracy and completeness. Design, develop, and maintain dashboards and reports using Power BI (experience with Tableau is a plus). Perform data modeling, mapping, and validation to support business intelligence and reporting requirements. Communicate complex technical concepts effectively to both technical and non-technical stakeholders. Collaborate with cross-functional teams to identify opportunities for process …
London (City of London), South East England, United Kingdom
nLighten
Provide ongoing system support, resolve issues, and perform debugging and testing. • Ensure platform stability and security across multiple environments. Data Management & Quality • Design and manage data models, queries, and validation rules to ensure integrity and reliability. • Support data migration and ensure compliance with governance standards. Collaboration & Documentation • Work closely with stakeholders, project managers, and other IT specialists to translate …
London (City of London), South East England, United Kingdom
NJF Global Holdings Ltd
a global Lakehouse platform (Data Lake + OLAP) in AWS Develop scalable data pipelines in Python, working with SDKs and data libs Own end-to-end data modeling, ingestion, validation, and optimization for high-concurrency access Tune performance of cross-region, multi-format data stores (columnar, real-time, etc.) Deliver tailored solutions directly to quants and traders – real impact …
City of London, London, United Kingdom Hybrid / WFH Options
Experis UK
challenges. Collaborate with data scientists to transition prototypes into production-ready systems . Develop and maintain end-to-end ML pipelines for data ingestion, training, testing, and deployment. Optimise model performance, scalability, and reliability using MLOps best practices. Work with large-scale structured and unstructured datasets for model training and validation. Implement model monitoring, versioning, and retraining … e.g., MLflow, Kubeflow, SageMaker, Vertex AI). Experience with data engineering concepts — ETL pipelines, data lakes, and cloud data platforms. Proficiency with cloud services (AWS, Azure, or GCP) for model deployment and orchestration. Knowledge of containerization and orchestration tools (Docker, Kubernetes). Experience integrating ML models into production environments via APIs or microservices. Excellent problem-solving, analytical, and communication … skills. Preferred Qualifications Bachelor’s or Master’s degree in Computer Science , Data Science , Mathematics , or a related field. Familiarity with CI/CD pipelines for ML model deployment. Exposure to natural language processing (NLP) , computer vision , or reinforcement learning projects. Experience working in Agile/Scrum environments. Contract Details Location: Hybrid – London (onsite 2–3 days per week …
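Roles like the one above package preprocessing and the model into a single deployable artifact. A minimal scikit-learn sketch of that pattern, on synthetic data (the dataset, model choice, and parameters are illustrative, not the client's actual stack):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Synthetic stand-in for a real training set.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# A Pipeline keeps preprocessing and the model versioned and deployed as one unit,
# which is what makes downstream monitoring and retraining tractable.
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
pipe.fit(X_train, y_train)
test_accuracy = accuracy_score(y_test, pipe.predict(X_test))
```

In an MLOps setup, the fitted pipeline object (not just the model weights) is what gets registered, versioned, and served.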
London (City of London), South East England, United Kingdom
Investigo
the following experience & knowledge: Ideally you will have experience leading/mentoring a team while still being a hands-on data scientist, particularly with experience in model validation & setting up risk frameworks. Preferably you will have a background working in the financial services industry, either institutional or regulatory side, or as a consultant supporting financial institutions …
City of London, London, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
and preprocess structured and unstructured data from multiple internal and external sources. Perform exploratory data analysis (EDA) to identify trends, patterns, and anomalies. Design and implement data pipelines for model-ready datasets in collaboration with data engineering teams. Apply feature engineering and selection techniques to improve model accuracy and interpretability. Develop and validate machine learning and statistical models … models using appropriate metrics and perform hyperparameter tuning for optimal performance. Convert proof-of-concept models into production-grade pipelines in collaboration with MLOps and engineering teams. Required: Translate model outcomes into actionable insights through clear storytelling and visualizations. Build dashboards and reports using Power BI, Tableau, or Python-based visualization tools. Communicate findings to both technical and non … Partner with business analysts, architects, and domain experts to define use cases and success metrics. Contribute to the enterprise AI roadmap, bringing thought leadership on analytical methodologies. Document methodologies, model logic, and validation results for audit and reproducibility. Participate in Agile ceremonies, sprint planning, and client showcases. If you'd like to discuss this data scientist role in …
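The hyperparameter tuning the listing above asks for is commonly done with cross-validated grid search. A small illustrative sketch (toy data and a deliberately tiny parameter grid, not the client's models):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.ensemble import RandomForestClassifier

# Synthetic classification problem standing in for real business data.
X, y = make_classification(n_samples=300, n_features=8, random_state=0)

# Cross-validated search over an illustrative grid; each candidate is scored
# on held-out folds, guarding against overfitting to the training split.
grid = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [50, 100], "max_depth": [3, None]},
    cv=3,
)
grid.fit(X, y)
best_params = grid.best_params_
best_score = grid.best_score_
```

Documenting `best_params_` and the cross-validation score alongside the model is one simple way to meet the audit and reproducibility requirement mentioned above.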
Perform exploratory data analysis (EDA) to uncover trends and anomalies Design and implement data pipelines in collaboration with data engineering teams Apply feature engineering and selection techniques to enhance model performance Build and validate ML models for prediction, classification, clustering, and optimization Use libraries such as Scikit-learn, TensorFlow, and PyTorch for supervised and unsupervised learning Implement NLP, time … and machine learning Strong Python skills and familiarity with ML libraries (Scikit-learn, TensorFlow, PyTorch) Experience with data visualization tools (Power BI, Tableau, Matplotlib, Seaborn) Ability to translate complex model outputs into actionable business insights Excellent communication skills with both technical and non-technical audiences Familiarity with Agile methodologies and cross-functional collaboration Background in banking or financial services …
City of London, London, United Kingdom Hybrid / WFH Options
AVENSYS CONSULTING (UK) LTD
data stores using Python, Java, or Node.js backends. Collaborate with architects to define scalable and secure AI service architectures. Experience in the following: Implementing AI/ML pipelines for model training, validation, and deployment (using tools such as MLflow, Vertex AI, or Azure ML). Manage model evaluation, drift monitoring, and continuous improvement processes. Optimize inference performance … and cost (e.g., model compression, quantization, API optimization). Ensure compliance with AI ethics, security, and governance standards. Prepare and curate training datasets (structured/unstructured text, images, code). Apply data preprocessing, tokenization, and embedding generation techniques. Work with vector databases (Pinecone, Weaviate, FAISS, Chroma) for semantic retrieval use cases. Partner with business stakeholders to identify and shape …
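The semantic-retrieval use case above boils down to ranking stored embeddings by similarity to a query embedding, which is what vector databases like Pinecone or FAISS do at scale. A minimal numpy sketch of the core idea, with toy three-dimensional vectors standing in for real embedding-model output:

```python
import numpy as np

# Toy document embeddings; a real system would generate these with an embedding model.
doc_embeddings = np.array([
    [0.9, 0.1, 0.0],   # e.g. "pricing model docs"
    [0.0, 0.8, 0.2],   # e.g. "API reference"
    [0.1, 0.1, 0.9],   # e.g. "governance policy"
])

def top_k(query: np.ndarray, docs: np.ndarray, k: int = 1) -> list[int]:
    """Return indices of the k documents most similar to the query (cosine similarity)."""
    sims = docs @ query / (np.linalg.norm(docs, axis=1) * np.linalg.norm(query))
    return np.argsort(-sims)[:k].tolist()

query = np.array([1.0, 0.0, 0.1])  # query embedding closest in direction to doc 0
```

A vector database adds approximate-nearest-neighbour indexing, persistence, and metadata filtering on top of exactly this similarity ranking.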
City of London, London, United Kingdom Hybrid / WFH Options
MIDDLE8
ethics, and integration, shaping our innovation roadmap. Additionally, you will optimize algorithms with tools like Python, TensorFlow, PyTorch, scikit-learn, and cloud services (e.g., AWS SageMaker), including data analysis, model training, and validation. You will address issues in model accuracy, bias, and integration, complying with data privacy regulations while supporting growth. Finally, you will track AI trends and …
London (City of London), South East England, United Kingdom
Radley James
ETL pipelines, manage cloud-based and on-prem data environments, and integrate diverse datasets from external vendors, trading systems and internal sources. The role also involves implementing automated data validation processes, optimising data quality and performance, and ensuring high availability across all systems. Working closely with the portfolio manager, researchers and engineers, you will enable data-driven decision-making … tech firm Strong hands-on expertise in Python and modern ETL frameworks Experience designing and maintaining cloud-based data pipelines (e.g. AWS, Airflow, Snowflake) Deep understanding of data modelling, validation, and pipeline resilience Familiarity with financial or alternative datasets preferred …
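Automated data validation of the kind described above typically applies a set of batch-level rules before data is loaded. A minimal pandas sketch (column names and rules are illustrative, not the firm's actual schema):

```python
import pandas as pd

def validate_prices(df: pd.DataFrame) -> list[str]:
    """Return a list of rule violations for a batch; an empty list means it passes.
    Illustrative checks only -- production pipelines would load rules from config."""
    errors = []
    if df["price"].isna().any():
        errors.append("null prices")
    if (df["price"] <= 0).any():
        errors.append("non-positive prices")
    if df.duplicated(subset=["symbol", "date"]).any():
        errors.append("duplicate symbol/date rows")
    return errors

# Example batch with one deliberate duplicate row.
batch = pd.DataFrame({
    "symbol": ["AAPL", "AAPL", "MSFT"],
    "date": ["2024-01-02", "2024-01-02", "2024-01-02"],
    "price": [185.6, 185.6, 370.9],
})
issues = validate_prices(batch)
```

In a scheduled pipeline (e.g. Airflow), a non-empty result would fail the task and block the load, which is what keeps downstream research data trustworthy.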
implement ETL workflows to manage data extraction, transformation, and loading into Teradata. Build and maintain risk models for credit, operational, and market risk analysis. Conduct data profiling, cleansing, and validation to ensure accuracy and consistency. Collaborate with stakeholders to gather requirements and deliver data-driven solutions. Perform exploratory data analysis to uncover trends and support risk mitigation. Automate reporting … and dashboard creation using Python and BI tools. Optimize Teradata queries and ETL performance for efficient data processing. Document data flows, model logic, and technical specifications for transparency. Ensure compliance with data governance and contribute to continuous improvement initiatives. Your Profile: 3+ years of experience in data analysis and ETL development. Strong proficiency in Teradata SQL and Informatica PowerCenter. …