a dynamic self-starter to join their innovative team in the heart of London. If you thrive in a fast-paced environment and want to contribute to cutting-edge data science applications, this is the perfect opportunity for you! About the Client: Our client is a leading global payments technology company that enables businesses of all sizes to accept payments … geographies. They handle billions of transactions annually, offering secure, scalable solutions that support innovation in the financial ecosystem. With deep industry expertise, they empower merchants to grow through advanced data insights and reliable infrastructure. What You'll Do: Design, develop, and deploy machine learning models that solve real-world problems. Collaborate with cross-functional teams to improve production systems … and ensure seamless integration of ML models. Drive the ML life cycle from data ingestion to model deployment and monitoring. Implement CI/CD pipelines and leverage DevOps practices to enhance operational efficiency. What You Need: A relevant BSc/MSc degree in a related field with strong industry experience. Proven expertise in ML engineering and MLOps, with …
investment adviser. Overview: We are looking for a Risk Analytics Manager to join our Risk team. This role is critical in ensuring the integrity of both input and output data across our in-house and third-party risk systems. The ideal candidate will have a strong background in risk management, combined with hands-on experience in software development and … data engineering. Responsibilities: Develop, validate, and maintain risk reporting tools and dashboards. Design and enhance risk metrics and reports to support trading decisions and risk oversight. Collaborate with stakeholders across trading, technology, and risk to understand evolving requirements and continuously improve the risk analytics and reporting framework. Oversee the quality and consistency of data used in internal and … external risk systems and processes. Troubleshoot and improve data ingestion, transformation, and analysis pipelines. Keep track of and communicate the key assumptions and methods used in the analytics. Requirements: 5+ years of experience in a risk analytics or similar role within a financial institution. Strong programming and scripting skills in Python and SQL. Experience in Django and JavaScript …
We’re seeking a proactive Trading Developer to architect and enhance the data and process backbone of our trading environment. You’ll design and implement backend APIs, real-time data pipelines, and monitoring frameworks—ensuring reliable, low-latency access to market and operational data. While your core focus is … tasks and responsibilities API & Service Development Design, develop, and maintain robust internal RESTful and event-driven APIs. Establish best practices for versioning, documentation, security, and monitoring. Build and manage data streaming pipelines using frameworks such as Spark, Kafka, or equivalents. Ensure high-availability and low-latency delivery of market and industry data. Deploy and operate services on Azure using … Functions, App Services, AKS (Kubernetes), and related services. Data Ingestion & Management Ingest diverse data sources (exchange feeds, third-party APIs, FIX/WebSocket interfaces). Implement ETL processes and data models to drive downstream analytics and trading strategies. Uptime & Monitoring Implement health checks, alerting, and observability (metrics, logs, tracing) to maintain 24/7 uptime. Rapidly …
Bloomberg runs on data. Our products are fuelled by powerful information. We combine data and context to paint the whole picture for our clients, around the clock - from around the world. In Data, we are responsible for delivering this data, news and analytics through innovative technology - quickly and accurately. We apply problem-solving skills to identify innovative … solutions to enhance our systems, products and processes - all while providing platinum customer support to our clients. The Role: The Content Acquisition team is an integral part of the Data department. We lead the governance, cost, and quality of the data we acquire, and we collaborate with the wider company to develop our data acquisition strategies ensuring … that they trust us with their valuable content. Establish, maintain and improve relationships with Content Partners to develop and implement technologies/workflows and facilitate efficient onboarding of new data sets. Collaborate with our team of motivated account managers, technical specialists, relationship managers and product managers to better understand our clients' data needs and balance the best of …
well as specialist practices Publicis Media Exchange (PMX), Performics, Publicis Sport & Entertainment, Publicis Media Content and NextTECHnow. Together they combine deep expertise in media investment, strategy, insights and analytics, data and technology, commerce, performance marketing and content. Publicis Media is part of Publicis Groupe and is present in more than 100 countries with over 23,500 employees worldwide. Our … nurture engineering talent, and contribute to the growth of our business. Responsibilities Guide a team of engineers in developing applications that empower our clients to optimize marketing campaigns through data-driven insights and automated actions, with a specific focus on leveraging LLMs and AI Own the technical roadmap for your team, aligning it with the overall product strategy and … LLMs and AI for various marketing applications, including: Automated content generation, optimization, and personalization Predictive analytics for campaign performance and optimization Customer journey orchestration and personalized messaging Real-time data analysis and actionable insights generation Drive the technical direction of the team, ensuring solutions are built with a focus on modularity, scalability, and maintainability, specifically in the context of …
length of service Sick Pay – Increasing with length of service The Role: We are looking for an experienced SQL Database Administrator. Key Responsibilities: Support and maintain a large-scale data warehouse environment, handling high volumes of structured data. Manage and process incoming data streams from external sources (e.g. cameras and monitoring systems). Develop and support server-side … data processing workflows to ensure timely and accurate data ingestion. Collaborate with 2nd and 3rd line support teams, providing expertise on complex data-related issues. Maintain and optimise the central database infrastructure, ensuring performance, reliability, and scalability. Work closely with stakeholders to deliver custom reports and analytics. Ensure data integrity, transformation, and storage aligns with business … and regulatory requirements. Participate in long-standing, enterprise-level projects requiring consistent support and system knowledge. Troubleshoot data warehouse and integration issues and implement enhancements for improved performance and reliability. Contribute to the design, development, and maintenance of ETL pipelines and reporting systems. Requirements: Microsoft SQL Server Database deployed on-premise and in the cloud. Always On replication to …
Manchester, England, United Kingdom Hybrid / WFH Options
Capgemini
tomorrow. Informed and validated by science and data. Superpowered by creativity and design. All underpinned by technology created with purpose. Your Role You will be joining our rapidly growing Data & AI Strategy team within Capgemini Invent, where we are at the forefront of designing and building winning strategies and operating models to transform the way in which organisations across … all industries and sectors leverage data & AI. We're building a successful team of experts across the data & AI lifecycle, and we're looking for talented people to strengthen our position as consulting leaders in data & AI strategy & operating model. Our Managers and Senior Managers marry best-in-class delivery with thought leadership to become trusted advisors … with senior clients. You will be leading teams of data & AI consultants and technical specialists from across Capgemini on the design and implementation of large and complex data & AI operating model work to transform their data & AI capabilities. We are looking for candidates with consulting experience in one or more of the following sectors: Transport, Energy …
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
Data Engineer (Snowflake) Hybrid working pattern 2 days per week in the London office My client, a large global Insurance company, is seeking a talented AWS Data Engineer with Snowflake experience to join their growing data practice within the UK. As a key member of this team, you will play a pivotal role in designing and implementing … data warehousing solutions using Snowflake and AWS. The ideal candidate must have: Strong experience as an AWS Data Engineer Lead the design and implementation of Snowflake [and Redshift] based data warehousing solutions within an AWS environment Extensive experience in AWS services, e.g. EC2, S3, RDS, DynamoDB, Redshift, Lambda, API Gateway Develop and optimize ETL processes using AWS … services (e.g. AWS Glue, Lambda) to ensure efficient data ingestion, transformation, storage, and cost optimization. Strong SQL skills for complex data queries and transformations ETL pipelining experience Experience with Python AWS and Snowflake certifications are a plus Insurance experience or Financial services experience advantageous Must have experience of working closely with business teams and communicating with …
a small, high-impact team responsible for the firm's most critical datasets - including corporate actions, fundamentals, and index membership data. Rather than chasing breadth, this team focuses on data depth: building a clean, unified, and intelligent data layer that underpins all major decisions across the business. We build distributed pipelines to ingest, clean, and expose daily datasets … to internal teams. This is an engineering-first team with strong autonomy - ideal for someone who enjoys taking ownership of data quality, system design, and intelligent automation. What You'll Do Own and improve automated data ingestion and processing pipelines Design and maintain distributed systems for reliable, scalable data access Identify and fix data anomalies … using statistical and algorithmic approaches Collaborate directly with internal users across research, trading, and operations Anticipate future data needs and help evolve a minimal, robust data schema Hard Requirements Strong coding skills in a statically typed language (e.g. Go, Java, C++) Solid background in distributed systems and relational databases Comfortable with data analysis; knowledge of statistics or …
Our talented people empower us, and we believe in being part of a team that is open, collaborative, entrepreneurial, passionate, and above all, fun. About the team The Enterprise Data & AI (EDAI) Team, led by the FIS Chief Data & AI Officer (CDAO), is focused on delivering cutting-edge Data, AI, and Machine Learning platforms and models. The … group is organized into four engineering Towers: Data Ingestion, Data Engineering, Data Warehousing, and Artificial Intelligence/Machine Learning. These are supported by three enabling Towers: Strategy & Operations, Customer Success, and Data Ops/Compliance. Based between the US, UK, and India, the team is 250 strong and committed to delivering value for FIS through … data and AI. What you will be doing You will play a pivotal role in the success of the new enterprise Data and AI platform developed by the EDAI team. This platform, built with industry-leading technology and the highest security standards, will become the enterprise's new benchmark – allowing FIS to anonymize and commercialize data to …
modern Silicon Valley innovation and time-tested Japanese quality craftsmanship. We leverage these complementary strengths to amplify the capabilities of drivers, foster happiness, and elevate well-being. Team Our data platform team is working on accelerating autonomous driving by providing access to petabytes of data collected by our fleet of autonomous and non-autonomous vehicles. Efficient, fast and … cost-effective access to data at large scale is key to tackle the hardest problems in AD/ADAS, from developing the Machine Learning (ML) models for perception and prediction of human driving patterns, to increasing the sophistication of our validation and simulation by identifying rare and interesting real-world driving situations. The data ecosystem developed by the … London team is a key building block for developing and testing modern AD/ADAS products that will impact millions of customers. Our ML and Data pipelines are built on top of the open-source Flyte orchestration framework and are deployed to AWS. Pipeline code is written in Python. We use SQS and Kafka to automate data connections …
Hybrid role based in London Role Overview The Data Scientist role will be responsible for designing, developing, and implementing advanced AI and machine learning models, utilizing both traditional and emerging approaches to address complex business challenges. This position is well-suited for candidates with a PhD/Master's or equivalent in Statistics, Machine Learning, or Artificial Intelligence with a passion … including credit, commercial, product, and operations, with a significant emphasis on credit risk. ML Ops Leadership: Define and execute an ML Ops framework to streamline model lifecycle management, including data ingestion, data transformation, model training, deployment, and monitoring. Collaborative Problem Solving: Work with commercial and product teams to align ML solutions with business goals, ensuring risk considerations … compliance with industry standards. Who you are Educational Background: PhD/Master's or equivalent in Statistics, Machine Learning or Artificial Intelligence. Deep Experience: 5+ years of experience in data science, including hands-on ML model development and production deployment. ML & Statistical Expertise: Expert in traditional machine learning and statistical methods (e.g., classification, regression, time series models), with deep …
Quantitative Developer - Java (Risk Technology) Millennium is a top-tier global hedge fund committed to leveraging innovations in technology and data science to solve complex business problems. The Risk Technology team is seeking a Quantitative Developer who will utilize Java, AWS, and data manipulation libraries to deliver data-driven risk management solutions for stakeholders such as Portfolio … London, and Singapore to develop risk analytics solutions for various asset classes (fixed-income, commodities, equities, etc.). Build, enhance, and maintain Java REST services and related systems. Develop data ingestion pipelines and core data systems to enable risk management analytics access via programmatic interfaces and web applications. Create and manage cloud applications on AWS. Work with … solutions to Portfolio and Risk Managers. Required skills/experience: Strong analytical and mathematical skills with interest or experience in quantitative finance. Good understanding of design patterns, algorithms, and data structures. Significant experience with modern Java. Experience with REST APIs and cloud services. Relational SQL database development experience. Unix/Linux command-line proficiency. Ability to work independently in …
you will play a key role in ensuring that critical business services have the capacity and resilience to meet regulatory and operational demands. You will focus on developing robust data pipelines and solutions that support capacity monitoring, reporting, and forecasting across complex IT environments. Role Overview: Develop ETL processes to extract and normalise capacity data from infrastructure sources. … Automate data ingestion and processing into SQL Server databases. Normalise raw infrastructure and monitoring data into structured formats suitable for graphing and trending. Collaborate with DBAs and technical teams to ensure transparency and supportability. Build data flows that support capacity reporting and trending, using modern technologies. Skills & Experience Required: 3+ years' experience in ETL/data processing roles. Proficiency with ETL tools, SQL Server, RESTful APIs. Experience working with infrastructure data (e.g., Nutanix, VMware, Dell EMC desirable). Scripting experience in Python, SQL (T-SQL), Excel VBA. Knowledge of data optimisation, large datasets, JSON/XML formats. Experience of setting up SQL DTSS and Control/M batch jobs useful. Excellent communication, analytical …
This is an exciting opportunity for a Power BI Specialist. Based in London, you will use your expertise to create insightful dashboards and reports that support data-driven decision-making. Client Details A £100 million turnover company. Description Design, build, and refine dashboards focusing on profit and loss, margin analysis, cost control, and operational KPIs across the Group. Enhance existing … dashboards to integrate with the new Microsoft Data Links repository, replacing scheduled refreshes with live data feeds. Use Power Query (M language) for data ingestion and transformation. Develop advanced DAX measures, including time intelligence calculations, dynamic reporting layers, and calculated tables. Manage the full dashboard development lifecycle, from stakeholder engagement and data modelling to prototyping … business units to gather requirements and shape outputs. Ensure dashboards are user-friendly, commercially relevant, and actionable for both finance and non-finance users. Work alongside internal ERP and data teams (who manage SQL and connectors) to maintain data consistency with Unit4 and Tagetik systems. Provide ad-hoc support for wider reporting and transformation projects, including benchmarking and …
Chat API using Python, LangChain, and OpenAI services on Azure Function Apps Monitor and resolve API bugs and performance issues to ensure high availability and reliability Oversee and troubleshoot data ingestion processes supporting the chatbot's functionality Collaborate with solution architects, product teams, and the cloud team to deliver a reliable and scalable solution Champion best practices in … strong knowledge of LangChain and OpenAI APIs Hands-on experience with Azure Function Apps and cloud-native development Strong background in API design, monitoring, and performance optimisation Experience managing data ingestion pipelines and integrating data sources Excellent problem-solving skills and a proactive, solution-oriented mindset Strong communication and leadership skills, with the ability to work cross …
financial sector. The ideal candidate will have deep experience and understanding of test automation frameworks, an understanding of COTS (Commercial Off-The-Shelf) products, and experience in Data Modernization/Transformation programmes. As an Automation Test Lead, you will be responsible for designing and executing test plans, coordinating with cross-functional teams, and ensuring the quality of … support Coordinate with cross-functional teams, including developers, business analysts, and project managers, to understand project requirements and develop test strategies Potentially also support, advise and assist in preparing test data and testing migrations, CI/CD, NFT and UAT Potentially create automated solutions around data inputs, outputs and data source-target comparisons Qualifications and Skills: Bachelor's degree … experience as an Automation Test Lead, with at least 5 years of experience in automation testing in the banking and financial sector Experience with cloud technologies, such as Azure Data Factory In-depth experience, knowledge and understanding of test automation frameworks with Karate, JSON, NUnit, Pytest, Eggplant, along with GitHub Actions pipelines and the IntelliJ IDE. In-depth understanding of …
Software Engineer, Systematic Equity Millennium is a top-tier global hedge fund with a strong commitment to leveraging market innovations in technology and data to deliver high-quality returns. About Us We are a well-established systematic equity trading group within a famous and prestigious global investment firm. Our team develops and maintains a plethora of sophisticated trading strategies … collaborate closely with quantitative researchers, portfolio managers, and supporting teams to build scalable, high-performance trading systems. Your primary responsibility will be to develop, optimize, and maintain complex algorithms, data pipelines, and software infrastructure that underpin our trading strategies while ensuring robust risk management and compliance standards. Location London Principal Responsibilities Lead the development and evolution of the quantitative … translate trading strategies into efficient, production-grade code Evaluate and integrate emerging technologies, libraries, and tools to drive continuous improvement Design, code, test, and deploy robust trading algorithms and data processing pipelines Develop real-time data ingestion systems and analytics frameworks for market data, risk metrics, and performance Optimize existing codebases for speed, reliability, and maintainability …
What is Human Native? At Human Native, we're building an AI data marketplace that ensures creators and rights holders are fairly compensated for their work while providing AI developers with high-quality, responsibly licensed training data. We believe in building AI the right way - ensuring transparency, fairness, and accessibility. This is a hard problem, and we need brilliant … minds to help us solve it. The Opportunity As an ML Engineer, you'll help us index, benchmark, and evaluate training datasets at scale. Your expertise with data, AI and ML training methodologies and evaluation techniques will advance the state of the art for developing AI. You will work across: Designing and developing benchmarks that allow our customers to … understand their value of data for training ML (quantifying dataset quality and biases). Deploy these benchmarks by implementing end-to-end data evaluation pipelines to be run on different datasets and ML models. Tools to visualise, analyze, and understand the attributes of datasets based on the evaluations. Develop ML models to transform, clean and understand data. Collaborating …
London, South East, England, United Kingdom Hybrid / WFH Options
Reed.co.uk
management Skills and Experience AI/ML Expertise: Strong understanding of LLMs, NLP, and AI-driven recommendations. Technical Leadership: Experience in AI product scalability, DevOps, and cloud architecture. Backend & Data Skills: Proficiency in Python, Node.js, PostgreSQL, MongoDB, API security. AI Model Deployment: MLOps, scaling, data engineering, and AI ethics awareness. Strategic Mindset: Ability to align technology with business … objectives and cost efficiency. Security & Compliance Knowledge: GDPR, API authentication, and observability best practices. Big data processing: Understanding data lakes, warehouses, and tools like Spark, Kafka, and Airflow. ETL Pipelines: Ability to evaluate data ingestion, transformation, and cleaning processes. DevOps & CI/CD: Hands-on knowledge of Jenkins, GitHub Actions, Terraform, and CloudFormation. What's in …
own Terraform to manage and provision our cloud infrastructure for machine learning operations. Oversee the transition to a real-time streaming architecture for our machine learning applications, ensuring efficient data ingestion, feature engineering, and model serving in a streaming context. Develop and implement a comprehensive monitoring framework to track model performance, identify potential issues, and ensure optimal model … health in production. Monitor model performance and update models as needed to adapt to new data and changing conditions. Collaborate closely with data scientists and engineers to ensure seamless integration of models into our existing systems and workflows. Stay abreast of the latest MLOps trends and technologies to continuously improve our processes and tools. Your Story You have … Kubernetes) and using orchestration tools such as Kubeflow (our preferred tool) or similar frameworks like Apache Airflow to manage and automate ML workflows. You have experience with real-time data streaming technologies such as Kafka and Confluent and feature stores in such settings. You are skilled in building and maintaining monitoring systems for machine learning models. You have excellent …
Greater London, England, United Kingdom Hybrid / WFH Options
Tata Consultancy Services
the end-to-end architecture of RMN platforms, covering sponsored ads, programmatic DSP, self-service portals, and dynamic ad placements. • Design scalable and modular RMN solutions, integrating Ad-Serving, Data Management Platforms (DMP), Customer Data Platforms (CDP), and Demand-Side Platforms (DSP). • Ensure real-time bidding (RTB) and programmatic advertising capabilities within the RMN ecosystem. • Develop a … analytics, and automated campaign optimization. • Implement Generative AI for automated ad creation, content personalization, and A/B testing. • Enhance Dynamic Creative Optimization (DCO) capabilities for personalized ad experiences. Data & Integration Framework • Define data ingestion, processing, and activation strategies across first-party, second-party, and third-party data sources. • Ensure seamless API integrations with RMN partners … attribution models to measure ad performance effectively. Ad-Tech Stack & Cloud Architecture • Design RMN solutions leveraging Google Cloud, AWS, or Azure, ensuring high availability and low-latency performance. • Optimize data pipelines, real-time analytics, and identity resolution mechanisms. • Oversee privacy-compliant data handling (GDPR, CCPA) and secure identity frameworks. Stakeholder Collaboration & Roadmap Execution • Work closely with engineering, product …
patches, and enhancements with minimal disruption to business operations. Collaborate with stakeholders to define requirements for system integrations, user needs, and architectural designs. Identify and implement solutions to improve data quality, ingestion/extraction processes, and overall data accessibility across SQL and other databases. Execute system failovers and disaster recovery procedures to minimize risk and downtime. Manage … with strong experience customizing Symitar for reporting and system enhancements. Expertise in Jack Henry system releases, including planning, testing, and deployment. In-depth experience with SQL Server management including data ingestion, query optimization, and backups. Strong project leadership skills and experience managing complex, cross-functional technical initiatives. Advanced troubleshooting, analytical, and strategic problem-solving abilities in high-pressure …