London (City of London), South East England, United Kingdom
Investigo
the following experience & knowledge: Ideally you will have experience leading/mentoring a team while remaining a hands-on data scientist, particularly with experience in model validation & setting up risk frameworks. Preferably you will have a background in the financial services industry, on either the institutional or regulatory side, or as a consultant supporting financial institutions
key contributor to our forward-looking research roadmap, including the build-out of new systematic franchises in nature, biodiversity, adjacent sustainability thematics and beyond. YOUR ROLE 1) Quantitative Research & Model Development Design, extend, and maintain quantitative models, metrics and investment frameworks, with rigorous back-testing, scenario design, and attribution. Integrate new indicators … and alternative datasets; formalise feature engineering and signal decay/robustness analysis; implement model risk controls, documentation, and reproducibility. Scale and commercialise proprietary metrics for investment use cases and new revenue lines (e.g., indices, APIs/subscriptions, data products). 2) Data Operations & Quality Control (Automation-First) Work with the systematic team and IT to automate ingestion, validation, and … lineage/audit trails; support data procurement and budgeting. 3) Platform Development & Process Management Co-develop with the systematic team and IT a scalable data/quant architecture (data pipelines, model services, APIs) and embed SRE practices (observability, resilience, cost efficiency). Lead automation of portfolio alignment and sustainability reporting; maintain production health, troubleshoot incidents, and drive continuous improvement.
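The "signal decay/robustness analysis" named in the role above can be illustrated with a short sketch: correlate a signal with forward returns at increasing lags and watch the correlation fade. This is a minimal illustration over synthetic data, not the firm's actual framework; the `decay_profile` helper and all data here are assumptions for the example.

```python
import numpy as np

def decay_profile(signal: np.ndarray, returns: np.ndarray, max_lag: int = 5) -> list:
    """Correlation of a signal with forward returns at increasing lags.

    A signal whose predictive power decays shows correlations that
    shrink toward zero as the lag grows.
    """
    profile = []
    for lag in range(1, max_lag + 1):
        # Align the signal at time t with the return realised at t + lag.
        ic = np.corrcoef(signal[:-lag], returns[lag:])[0, 1]
        profile.append(float(ic))
    return profile

# Synthetic example: returns are partly driven by the 1-step-lagged signal,
# so correlation should be strongest at lag 1 and fade afterwards.
rng = np.random.default_rng(0)
sig = rng.standard_normal(5000)
noise = rng.standard_normal(5000)
ret = np.empty(5000)
ret[1:] = 0.5 * sig[:-1] + noise[1:]
ret[0] = noise[0]

profile = decay_profile(sig, ret, max_lag=3)
```

A robustness check along these lines would typically be run per signal, per universe, and per regime before a signal enters production.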
London, South East England, United Kingdom Hybrid / WFH Options
Lorien
aligning priorities to measurable business outcomes and clear benefits tracking. Understand and communicate core AI technologies, keeping abreast of the latest changes and innovations. Work with data teams to validate model outputs, data quality and performance metrics. Ability to identify opportunities for automation and AI-driven innovation within existing systems and processes. Strong understanding of AI technologies and the data lifecycle
role centers on evaluating analytical workflows, modeling standards, experimentation culture, and applied business impact. This position is ideal for someone with a strong background in applied data science, model lifecycle design, and organizational data maturity — capable of analyzing current practices and defining what “best-in-class” looks like for scalable, responsible, and high-impact data science operations. Key … Responsibilities • Practice Maturity Assessment: Evaluate current data science processes, tools, and team structures to determine capability strengths, weaknesses, and improvement areas. • Framework Design: Develop and apply a structured maturity model to assess how data science work is conceived, executed, validated, and scaled. • Model Lifecycle Review: Assess practices across data preparation, feature engineering, model development, validation, monitoring … • Collaboration & Alignment: Work with AI and Data & AI Architects to connect findings from people, platform, and practice assessments into a unified capability map. • Gap Identification: Identify gaps in model governance, documentation, and model-to-business translation and recommend actionable improvement pathways. • Reporting & Advisory: Produce detailed reports summarizing data science maturity, practice gaps, and recommendations for scaling responsibly
City of London, London, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
and preprocess structured and unstructured data from multiple internal and external sources. Perform exploratory data analysis (EDA) to identify trends, patterns, and anomalies. Design and implement data pipelines for model-ready datasets in collaboration with data engineering teams. Apply feature engineering and selection techniques to improve model accuracy and interpretability. Develop and validate machine learning and statistical models … models using appropriate metrics and perform hyperparameter tuning for optimal performance. Convert proof-of-concept models into production-grade pipelines in collaboration with MLOps and engineering teams. Required: Translate model outcomes into actionable insights through clear storytelling and visualizations. Build dashboards and reports using Power BI, Tableau, or Python-based visualization tools. Communicate findings to both technical and non … Partner with business analysts, architects, and domain experts to define use cases and success metrics. Contribute to the enterprise AI roadmap, bringing thought leadership on analytical methodologies. Document methodologies, model logic, and validation results for audit and reproducibility. Participate in Agile ceremonies, sprint planning, and client showcases. If you'd like to discuss this data scientist role in …
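The hyperparameter tuning mentioned above is often done with scikit-learn's GridSearchCV; a dependency-light sketch of the same idea is a holdout grid search over a ridge regularisation strength. All function names and data here are illustrative assumptions, not any employer's pipeline.

```python
import numpy as np

def fit_ridge(X, y, lam):
    """Closed-form ridge regression: w = (X'X + lam*I)^-1 X'y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def grid_search(X_tr, y_tr, X_val, y_val, lambdas):
    """Pick the regularisation strength with the lowest validation MSE."""
    best_lam, best_mse = None, float("inf")
    for lam in lambdas:
        w = fit_ridge(X_tr, y_tr, lam)
        mse = float(np.mean((X_val @ w - y_val) ** 2))
        if mse < best_mse:
            best_lam, best_mse = lam, mse
    return best_lam, best_mse

# Synthetic regression problem with a small amount of noise.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 10))
w_true = rng.standard_normal(10)
y = X @ w_true + 0.1 * rng.standard_normal(200)
X_tr, X_val, y_tr, y_val = X[:150], X[150:], y[:150], y[150:]

best_lam, best_mse = grid_search(X_tr, y_tr, X_val, y_val, [0.01, 0.1, 1.0, 10.0])
```

In practice the holdout split would be replaced by cross-validation, which is exactly what GridSearchCV automates.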
Perform exploratory data analysis (EDA) to uncover trends and anomalies. Design and implement data pipelines in collaboration with data engineering teams. Apply feature engineering and selection techniques to enhance model performance. Build and validate ML models for prediction, classification, clustering, and optimization. Use libraries such as Scikit-learn, TensorFlow, and PyTorch for supervised and unsupervised learning. Implement NLP, time … and machine learning. Strong Python skills and familiarity with ML libraries (Scikit-learn, TensorFlow, PyTorch). Experience with data visualization tools (Power BI, Tableau, Matplotlib, Seaborn). Ability to translate complex model outputs into actionable business insights. Excellent communication skills with both technical and non-technical audiences. Familiarity with Agile methodologies and cross-functional collaboration. Background in banking or financial services
City of London, London, United Kingdom Hybrid / WFH Options
AVENSYS CONSULTING (UK) LTD
data stores using Python, Java, or Node.js backends. Collaborate with architects to define scalable and secure AI service architectures. Experience in the following: Implementing AI/ML pipelines for model training, validation, and deployment (using tools such as MLflow, Vertex AI, or Azure ML). Managing model evaluation, drift monitoring, and continuous improvement processes. Optimizing inference performance … and cost (e.g., model compression, quantization, API optimization). Ensuring compliance with AI ethics, security, and governance standards. Preparing and curating training datasets (structured/unstructured text, images, code). Applying data preprocessing, tokenization, and embedding generation techniques. Working with vector databases (Pinecone, Weaviate, FAISS, Chroma) for semantic retrieval use cases. Partner with business stakeholders to identify and shape …
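The semantic retrieval use case above is normally backed by a vector database such as FAISS or Pinecone, but the core ranking step, cosine similarity over embeddings, can be sketched in plain NumPy. The toy 4-dimensional "embeddings" below are invented for illustration; a real system would use vectors from an embedding model.

```python
import numpy as np

def top_k(query_vec, doc_vecs, k=2):
    """Rank documents by cosine similarity to the query embedding."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q  # cosine similarity of each (normalised) doc with the query
    order = np.argsort(scores)[::-1][:k]
    return [(int(i), float(scores[i])) for i in order]

# Toy embeddings standing in for real model output.
docs = np.array([
    [1.0, 0.0, 0.0, 0.1],   # doc 0
    [0.0, 1.0, 0.0, 0.0],   # doc 1
    [0.9, 0.1, 0.0, 0.2],   # doc 2 (close to doc 0)
])
query = np.array([1.0, 0.05, 0.0, 0.15])

results = top_k(query, docs, k=2)  # docs 0 and 2 rank above doc 1
```

A vector database does the same ranking over millions of vectors with approximate-nearest-neighbour indexes instead of a brute-force matrix product.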
ethics, and integration, shaping our innovation roadmap. Additionally, you will optimize algorithms with tools like Python, TensorFlow, PyTorch, scikit-learn, and cloud services (e.g., AWS SageMaker), including data analysis, model training, and validation. You will address issues in model accuracy, bias, and integration, complying with data privacy regulations while supporting growth. Finally, you will track AI trends and …
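A first-pass check for the "model accuracy, bias" concerns mentioned above is to compare accuracy across population segments. This is a minimal sketch with made-up labels and groups, not a complete fairness audit.

```python
from collections import defaultdict

def accuracy_by_group(y_true, y_pred, groups):
    """Per-group accuracy: a first check for accuracy disparity across segments."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for t, p, g in zip(y_true, y_pred, groups):
        totals[g] += 1
        hits[g] += int(t == p)
    return {g: hits[g] / totals[g] for g in totals}

# Invented labels/predictions for two segments A and B.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 1, 1, 1, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]

acc = accuracy_by_group(y_true, y_pred, groups)
# Group A: 3 of 4 correct (0.75); group B: 2 of 4 correct (0.50),
# a gap worth investigating before deployment.
```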
East London, London, United Kingdom Hybrid / WFH Options
MIDDLE8
City of London, London, United Kingdom Hybrid / WFH Options
MIDDLE8
Central London / West End, London, United Kingdom Hybrid / WFH Options
MIDDLE8
ETL pipelines, manage cloud-based and on-prem data environments, and integrate diverse datasets from external vendors, trading systems and internal sources. The role also involves implementing automated data validation processes, optimising data quality and performance, and ensuring high availability across all systems. Working closely with the portfolio manager, researchers and engineers, you will enable data-driven decision-making … tech firm. Strong hands-on expertise in Python and modern ETL frameworks. Experience designing and maintaining cloud-based data pipelines (e.g. AWS, Airflow, Snowflake). Deep understanding of data modelling, validation, and pipeline resilience. Familiarity with financial or alternative datasets preferred
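The automated data validation described above can be sketched as a simple schema check run inside a pipeline. This is an illustrative stand-alone version; the field names, bounds, and sample rows are assumptions, not any firm's production framework.

```python
def validate_rows(rows, schema):
    """Split rows into (clean, errors) against a schema of type and range checks.

    `schema` maps field -> (type, min, max); a bound of None skips that check.
    """
    clean, errors = [], []
    for i, row in enumerate(rows):
        problems = []
        for field, (ftype, lo, hi) in schema.items():
            value = row.get(field)
            if not isinstance(value, ftype):
                problems.append(f"{field}: expected {ftype.__name__}")
                continue
            if lo is not None and value < lo:
                problems.append(f"{field}: below {lo}")
            if hi is not None and value > hi:
                problems.append(f"{field}: above {hi}")
        if problems:
            errors.append((i, problems))
        else:
            clean.append(row)
    return clean, errors

# Hypothetical market-data rows: one good, one out of range, one mistyped.
schema = {"price": (float, 0.0, None), "volume": (int, 0, None)}
rows = [
    {"price": 101.5, "volume": 3000},
    {"price": -4.0, "volume": 1200},   # negative price: rejected
    {"price": "n/a", "volume": 500},   # wrong type: rejected
]
clean, errors = validate_rows(rows, schema)
```

In a production pipeline the same checks would run on every ingest, with rejected rows routed to a quarantine table and an alert rather than silently dropped.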
London (City of London), South East England, United Kingdom
Radley James
implement ETL workflows to manage data extraction, transformation, and loading into Teradata. Build and maintain risk models for credit, operational, and market risk analysis. Conduct data profiling, cleansing, and validation to ensure accuracy and consistency. Collaborate with stakeholders to gather requirements and deliver data-driven solutions. Perform exploratory data analysis to uncover trends and support risk mitigation. Automate reporting … and dashboard creation using Python and BI tools. Optimize Teradata queries and ETL performance for efficient data processing. Document data flows, model logic, and technical specifications for transparency. Ensure compliance with data governance and contribute to continuous improvement initiatives. Your Profile: 3+ years of experience in data analysis and ETL development. Strong proficiency in Teradata SQL and Informatica PowerCenter.
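The data profiling step mentioned above, in miniature: compute null rate, distinct count, and value range for a column before loading it. The `exposure` sample values are invented for illustration; real profiling would run per column across the whole extract.

```python
def profile_column(values):
    """Basic profiling stats for one column: null rate, distinct count, range."""
    non_null = [v for v in values if v is not None]
    stats = {
        "count": len(values),
        "null_rate": (len(values) - len(non_null)) / len(values),
        "distinct": len(set(non_null)),
    }
    # Only report a numeric range when every non-null value is numeric.
    if non_null and all(isinstance(v, (int, float)) for v in non_null):
        stats["min"] = min(non_null)
        stats["max"] = max(non_null)
    return stats

exposure = [1200.0, None, 880.5, 1200.0, None]
stats = profile_column(exposure)
# 5 values, 40% null, 2 distinct non-null values, range 880.5 to 1200.0
```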
London (City of London), South East England, United Kingdom
Capgemini
with a primary focus on deriving new markets and enhancing existing offerings. Possess a deep understanding of complex statistical distributions and leverage techniques such as Monte Carlo simulations in model development. Rigorously back-test and validate models to ensure their robustness, accuracy, and profitability in real-world betting scenarios. Drive and lead quantitative modelling initiatives, with a particular focus … findings and project outcomes clearly and persuasively to both technical and non-technical stakeholders, including senior leadership. Create basic reports and visualisations using tools such as Tableau to communicate model performance and insights. Required Skills and Experience: Proven experience as a Quantitative Analyst/Modeller with a track record of successfully leading and delivering impactful quantitative models in a … including a strong understanding of complex statistical distributions and Monte Carlo simulations. Highly proficient in Python for all modelling, analysis, and data manipulation work. Strong experience in back-testing, validation, and performance evaluation of quantitative models. Solid understanding of the end-to-end model development and deployment lifecycle in a production environment. Experience in deriving markets for various …
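Deriving a market with Monte Carlo simulation, as the role describes, can be sketched with an independent-Poisson goals model, a common toy assumption for football over/under markets rather than a production pricing model. The scoring rates below are invented for the example.

```python
import numpy as np

def over_under_prob(lam_home: float, lam_away: float, line: float = 2.5,
                    n_sims: int = 200_000, seed: int = 7) -> float:
    """Monte Carlo estimate of P(total goals > line) under independent
    Poisson scoring for each team."""
    rng = np.random.default_rng(seed)
    home = rng.poisson(lam_home, n_sims)
    away = rng.poisson(lam_away, n_sims)
    return float(np.mean(home + away > line))

# Hypothetical match: home scores at rate 1.4, away at 1.1,
# so total goals ~ Poisson(2.5) and P(over 2.5) is about 0.456.
p_over = over_under_prob(1.4, 1.1)
fair_odds = 1.0 / p_over  # fair decimal odds before the bookmaker's margin
```

Back-testing such a model would then compare these fair prices against realised outcomes and closing market prices over a historical sample.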