… are met. Continuous Improvement: Foster a culture of learning and optimization across teams and processes. Mentorship & Quality: Mentor team members and uphold quality standards through best practices in testing, validation, and delivery governance. Key Requirements: proven leadership experience with cross-functional teams; strong problem-solving abilities, especially with technical challenges; solid grasp of technology trends and ability to learn quickly. …
… concepts; explore Retrieval Augmented Generation (RAG) methods; solve industry-relevant problems. Design and implement artificial intelligence systems and applications that can simulate human intelligence processes through the creation and validation of algorithms, neural networks, and other machine learning techniques. Collaborate with cross-functional teams to ensure seamless integration of software components. Stay updated with the latest trends and advancements in …
… them into technical solutions. Ensure the scalability, reliability, and performance of Guidewire systems. Provide technical guidance and support to development teams throughout the project lifecycle. Conduct system testing and validation to ensure high-quality deliverables. Stay updated with the latest Guidewire technologies and industry trends. Requirements: Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience …
… like Power BI or Tableau. Design and run efficient SQL queries to extract and manipulate data. Develop and maintain BI tools and solutions to support decision-making. Perform data validation and quality control to ensure integrity and accuracy. Deliver ad hoc analysis and actionable insights to stakeholders. 💼 Required Skills & Experience: 3+ years in a Data Analyst role within complex business …
… we're looking for: a strong background in Statistics, Mathematics, Economics, Data Science, or a related field; several years of experience working within Risk Modelling, Risk Management, or Risk Validation; proven experience with statistical software, ideally Python or R; experience with advanced analytical techniques, including machine learning and predictive modelling; industry knowledge of forecasting in Automotive/Finance/…
… performance and scalability. Key responsibilities: design and build robust ETL pipelines using Python and AWS services; own and maintain Airflow workflows; ensure high data quality through rigorous testing and validation; analyse and understand complex data sets before pipeline design; collaborate with stakeholders to translate business requirements into data solutions; monitor and improve pipeline performance and reliability; maintain documentation of systems …
… support experimentation and model development; develop and deploy machine learning models in production environments; present insights and results to technical and non-technical stakeholders; apply statistical methods and model validation techniques to reach accurate conclusions; work with data engineers to ensure clean, usable datasets; contribute to ethical, privacy-aware, and user-centric data products; stay curious, research new methods, and …
… 3rd hire, making it a rare opportunity to join a consultancy with significant growth potential. THE ROLE: You will be doing the following daily: Lead the design, development, and validation of credit risk models, with a primary focus on IFRS 9. Have an in-depth understanding of Basel and IFRS 9 regulatory frameworks, ensuring full compliance throughout the model development …
This role is ideal for someone experienced in natural catastrophe modelling and confident working with large geospatial and financial loss datasets. You’ll contribute to the development, calibration, and validation of loss models that project climate-related physical impacts such as floods, subsidence, storms, droughts, and wildfires, helping clients assess and manage climate risks effectively. Key Responsibilities: develop loss models …
… process. Development of quantitative models for the evaluation of complex structured deals, support originators/traders in the development and implementation of trading strategies incorporating such models. Development and validation of quantitative models for use in transaction valuation and risk measurement within Commercial Risk. Risk data management - Ensure data in Risk systems are accurate, complete and accessible. Risk Platform …
… Build strong relationships across the business, working with finance, IT, and vendors to resolve issues and align on priorities. UAT & Testing: Coordinate and support User Acceptance Testing and system validation activities. Change & Training: Create training materials, support user onboarding, and contribute to change management planning to drive adoption. Reporting & Governance: Provide reporting and status updates to senior stakeholders, including …
… ML Platform - Build and evaluate models using SageMaker, Bedrock, Glue, Athena, and Redshift. Knowledge Transfer - Create documentation and mentor teams on MLOps best practices. Full ML Lifecycle - Manage training, validation, versioning, deployment, monitoring, and governance. API Development - Develop secure APIs using Apigee for enterprise AI functionality access. Automation - Build CI/CD pipelines using Jenkins and Maven for ML project …
… new feature creations - Work closely with business owners and operations staff to optimize various business operations - Establish scalable, efficient, automated processes for large-scale data analyses, model development, model validation, and model implementation - Mentor other scientists and engineers in the use of ML techniques. BASIC QUALIFICATIONS - Experience programming in Java, C++, Python, or a related language - Experience with SQL and an …
… Experience with Avaloq: Familiarity with Avaloq's physical data layer is essential. Knowledge of data quality frameworks: Understanding of profiling dimensions and industry-standard approaches (e.g., DQ checks, data validation rules). Nice to have: Experience with Avaloq Reporting & Data Extracts (ARD, ADF, SmartView): Familiarity with Avaloq-specific data export or reporting mechanisms would help accelerate development. Knowledge of Oracle Database …
… About the team: The India Machine Learning team works closely with the business and engineering teams …
… Vulnerability Management, managing partners and vendor deliverables, and developing a strategy for a world-class Vulnerability Management program. Your team will perform discovery scanning, risk assessments, mitigation activities, continuous validation, and lessons learned workshops to improve processes across Group Security and Verticals. We seek a diligent, dedicated, creative, and motivated individual with excellent communication skills, capable of building relationships with …
… team. Primary duties will include: analyze and interpret complex transaction reporting requirements across multiple regulatory frameworks (EMIR, MiFID, SFTR, CFTC, etc.), with focus on field population, data transformation, and validation rules; develop and maintain comprehensive data mapping documentation and data lineage diagrams, tracking data flows and transformations across various systems and products; drive end-to-end investigation efforts to identify …
… decision-making. Design, build, and deliver group-wide reporting programmes, primarily using Power BI, aligned with leadership priorities and business strategy. Ensure data quality and governance, maintaining the accuracy, validation, and trustworthiness of insights used across the business. Provide executive-level analytics, including: forecasting support to the CFO; strategic planning and hypothesis testing with the CEO; bespoke data support for …