Data Engineer - BigQuery
Salary: £75,000-£80,000 | Role type: Permanent | Location: London/Hybrid

Our client is seeking a skilled Data Engineer to take the lead in developing and optimising their data systems. This role will focus on SQL database application development, utilising Google BigQuery as the new platform for data management, and supporting the … development of data analytics and visualisation tools such as Power BI. The Data Engineer will be responsible for ensuring seamless data ingestion, transformation, and reporting, as well as working with key stakeholders to provide timely and actionable insights.

Main Responsibilities
Engineering:
- Developing and maintaining business-critical data ingestion, analysis, and management systems.
- Utilising API and … ETL methods to integrate with internal systems.
- Building tools to minimise errors and improve customer experience.
- Automating data development and governance to improve data health.
- Designing procedures for system troubleshooting, capacity management, and scaling infrastructure.
Analytics:
- Developing scripts and data outputs to automate reporting and visualisations.
- Ensuring timely and easily translatable data delivery for business stakeholders.
City of London, London, United Kingdom Hybrid / WFH Options
SystemsAccountants
Job Title: Test Data Engineer
Location: London, UK (3 days in office) | SC Cleared: Required | Job Type: Full-Time | Experience: 5+ years

Job Summary: We are seeking a highly skilled and detail-oriented Test Data Engineer to join our growing team. As a Test Data Engineer, you will be responsible for designing, implementing, and executing test cases … applications in the banking domain. You will work closely with the development team to identify and resolve defects, ensure product quality, and deliver reliable and efficient software solutions.

Key Data-Testing Responsibilities:
- Test internal data ingestion and reference data ingestion
- Data ingress – Source to Landing Zone
- Landing Zone to Bronze
- Bronze to Silver – Data … tests
- Output Dataset labelling algorithm tests
- Vintaging labelling algorithm tests

Key Core-Testing Responsibilities:
- Plan, schedule, coordinate, prep, run and report manual tests when required; manage the defect lifecycle; prep test data, mainly for System Test, SIT and migrations testing.
- Analyse and prioritise test cases to ensure effective test coverage, identify defects, and track resolution to closure; participate in Defect Triage …
Lead Data Engineer – Azure Databricks
Remote – 2 Days a Month in Bristol | Contract Opportunity - £500-£650 a day DOE | Applicants Must be Eligible for SC Clearance

About the Role: TRIA is proud to be partnering with a purpose-driven, mission-led organisation that is using data to make a meaningful impact. As they scale their data capabilities … and embrace new technologies, they are seeking an experienced Principal Data Engineer to lead the advancement of their Azure-based platform.

The Opportunity: You will work alongside a collaborative data team to enhance and maintain robust data ingestion pipelines, facilitate a transition to Azure and Databricks, and help productionise AI models with monitoring and alerting frameworks … in place. This is a key role in supporting the organisation’s continued data maturity.

Key Responsibilities:
- Build and optimise scalable, reusable data pipelines
- Improve team efficiency through smart automation and streamlined processes
- Support the deployment and monitoring of AI models in production
- Contribute to the evolution of ETL processes across the data platform

Your Experience: Proven …
Senior Data Insights Analyst
📍 Oxford 💰 £50,000 - £55,000
🧠 Data-Driven Strategy | Revenue Optimisation | Business Impact

Key skills:
- Large data sets (Google Analytics, Google Ads, Meta Ads etc.)
- Power BI
- SQL
- VBA
- E-commerce environment

Are you an intellectually curious, highly motivated analyst with a passion for turning data into real business value? We’re on the … hunt for a Senior Data Insights Analyst to join our fast-growing team in Oxford — someone ready to roll up their sleeves and make a meaningful impact from day one. This role sits at the core of our business, where data isn’t just a resource — it’s the engine powering our product innovation, customer experience, and revenue … growth.

🔍 About the Role
As part of our high-performing Data Insight Team, you’ll be working with large datasets to extract critical trends, uncover growth opportunities, and drive fast-paced, data-informed decision-making. You'll influence key business strategies, using insights to directly shape product performance and customer satisfaction. If you’re someone who loves the …
At Insurwave, we are looking for remarkable people who thrive on making an exceptional contribution. We now have an exciting opportunity for a Data Scientist to play a key role in our Data and AI team. If making a difference gets you out of bed in the morning, then this could be the perfect opportunity and the start … of something incredible!

What will you be doing? As a Mid-Level Data Scientist, you will play a critical role within a multidisciplinary team of data scientists, analysts, and domain experts, developing advanced AI and analytics solutions for the Insurwave platform. This self-sufficient team is responsible for the entire delivery lifecycle—from design and development to testing … exposure analytics, and model asset behaviour for the commercial insurance market.

Responsibilities:
- Design, build, and deploy machine learning models that meet defined performance and business requirements
- Develop production-ready data science solutions and maintain high-quality, testable code using modern development practices
- Build and maintain APIs, data pipelines, and automated workflows to support model deployment, monitoring, and lifecycle …
Services AI Data Solution Principal (Services Technical PreSales), based London

Job Summary: The Services AI Data Solutions Principal is a customer-facing, technical presales leader responsible for driving Dell Technologies' AI and Data Services revenue across a wide portfolio of enterprise customers and industries. This role requires strong technical expertise in AI and the associated data engineering … and data management disciplines, strong consultative selling skills, executive-level communications, and business development acumen. The candidate will jointly lead and influence customers from initial opportunity discovery through proposal development and deal closure while collaborating closely with Dell sales, delivery, product, and partner teams. This role carries joint ownership of the sales pipeline within an aligned pod of professionals, quota-bearing … services portfolio, delivery capabilities, and financial objectives. Entrepreneurial focus to drive innovation, efficiency, and process/IP improvement opportunities; coach and enable Dell sales and presales teams on AI/Data solution positioning, capabilities, and value articulation. Build and maintain a strong personal network across Dell's global sales, product, engineering, and partner ecosystem to effectively orchestrate deal success. Customer …
… wasted, 1 incredible purpose. Together we will beat cancer.

Head of Data Technology & Operations
Salary: £83,000 - £93,000 per annum | Department: Chief Operating Office | Contract: Permanent | Hours: Full time, 35 hours per week | Location: Stratford, London (office-based with high …)

… need to go much further and much faster. That's why we're looking for someone talented, someone determined, someone like you. We're looking for a Head of Data Technology & Operations to join our Technology leadership team. In this crucial role, you will be accountable for the design, implementation, roll-out and continuous improvement of fit-for-purpose … data and data technology services and solutions. You'll provide leadership, direction, coaching and functional support to our Technology data teams, and you'll work closely with senior data directors across CRUK to ensure that our data initiatives can support and enable the direction and execution of CRUK's data strategies and ultimately maximise …
… waves in the industry. The role is paying up to £95,000 + benefits.

We are looking for:
- Deep understanding of Financial Services (FS)
- Deep expertise in the full data engineering lifecycle—from data ingestion through to end-user consumption
- Practical experience with modern data tools and platforms, including Redshift, Airflow, Python, dbt, MongoDB, AWS, Looker … and Docker
- Strong grasp of best practices in data modelling, transformation, and orchestration
- Proven ability to build and support both internal analytics solutions and external-facing data products
- Solid understanding of Agile methodologies (Scrum, Kanban) and modern software development workflows
- Skilled in using CI/CD pipelines, infrastructure-as-code, and scalable cloud-based data infrastructure
- Knowledgeable … in data governance, privacy regulations, etc.
- Experienced in leveraging metrics and observability tools to ensure platform reliability and team performance

In this role, your focus on technology innovation and excellence will be paramount to our success. Are you ready to join us on this exciting journey and make your mark on the future of our industry? …
Job Title: KDB Developer
Location: London - Hybrid | Type: Permanent Position

We are actively recruiting multiple KDB Developers at various levels for a fast-paced, data-driven organisation at the forefront of real-time analytics and high-performance computing. This organisation is a consultancy working with some of the leading names in the financial services sector.

The Role: As a … KDB Developer, you will be responsible for designing, developing, and maintaining high-performance applications and data analytics solutions using kdb+/q. You’ll work closely with quants, traders, and data scientists to deliver scalable systems and actionable insights from large volumes of time-series data.

Key Responsibilities:
- Design, implement, and optimise kdb+/q-based applications and … data pipelines
- Work on real-time data ingestion, transformation, and analysis
- Collaborate with stakeholders to gather requirements and translate them into technical solutions
- Maintain and enhance existing codebases, ensuring high availability and performance
- Contribute to architectural decisions and best practices for kdb+ systems
- Troubleshoot and resolve production issues quickly and effectively

Required Skills & Experience: Strong hands-on …
City of London, London, United Kingdom Hybrid / WFH Options
Synchro
… including relevant soft skills and software package proficiencies, are required.

Millennium is a top tier global hedge fund with a strong commitment to leveraging innovations in technology and data science to solve complex problems for the business. The Risk Technology team is looking for a Quantitative Developer who would leverage Python, cloud infrastructure (AWS), and scientific frameworks to provide … data-driven risk management solutions to Risk Managers, Portfolio Managers, and Business Management.

Responsibilities:
- Work at the intersection of Portfolio Management, Risk Management and Quantitative Research to develop risk analytics solutions for Equity Derivatives businesses.
- Collaborate with risk management for rapid prototyping and delivery of solutions to enhance risk metrics.
- Develop data ingestion pipelines and analytics for the generated … information.
- Design and implement cloud-native, data-intensive applications that effectively leverage AWS solutions.
- Mentor junior team members, fostering growth and collaboration within the team.
- Fit into the active culture of Millennium, judged by the ability to deliver timely solutions to Portfolio and Risk Managers.

Required Skills/Experience:
- Minimum 5 years of experience using Python and scientific Python …
… and ultimately, be more fulfilled.

Detailed JD:
• Architect and deploy scalable AI systems leveraging LLMs (GPT, LLaMA, Claude, etc.)
• Build and fine-tune SLMs/LLMs using domain-specific data (e.g., ITSM, security, operations)
• Design and optimize Retrieval-Augmented Generation (RAG) pipelines with vector DBs (e.g., FAISS, Chroma, Weaviate, Pinecone)
• Develop agent-based architectures using LangGraph, AutoGen, CrewAI, or … Azure, AWS, or GCP (Azure preferred)
• Experience with Kubernetes, Docker, and scalable microservice deployments
• Experience integrating with REST APIs, webhooks, and enterprise systems (ServiceNow, SAP, etc.)
• Solid understanding of data pipelines, ETL, and structured/unstructured data ingestion
Wandsworth, England, United Kingdom Hybrid / WFH Options
Tower Research Capital
… on the Central Execution Desk, directly contributing to scaling up Tower's Mid-Frequency Trading capabilities.

Responsibilities:
- Design, implement, and maintain high-performance services in Rust and Python for market-data ingestion, ML pipelines, and post-trade analytics
- Translate research prototypes into production-ready code, adding testing, monitoring, and CI/CD automation
- Optimise existing code for throughput, memory … footprint, and reliability on distributed systems
- Collaborate closely with quantitative researchers to iterate on data pipelines, simulation frameworks, and performance diagnostics

Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Mathematics, or a related STEM field
- 2-5 years of professional software-engineering experience, including production systems written in Python
- Proficiency in a systems language - Rust preferred (C … Go also acceptable) - and the desire to deepen that expertise
- Strong computer-science fundamentals: algorithms, data structures, concurrency, networking, and performance profiling
- Experience working with real-time and historical market data or other high-volume time-series data
- Proficiency with Linux development, Git, containers, and CI/CD workflows
- Familiarity with SQL and at least one columnar …
Director — Enterprise Data Architecture & Governance
Base c. £140-175k + bonus and LTIP (flexible for the right hire) | 35 days leave | 14% pension | full family healthcare

Purpose: Establish and scale a single enterprise data backbone for a global retail-investing and pensions platform (~£400bn AUA), enabling regulatory-grade lineage, real-time analytics, and API-driven product … innovation. You will be tasked with the front, middle and back-office solution, defining the interactions, integration and data model transformation. Candidates with no Investment or Asset Management experience are not being considered for this role at this time.

12-Month Success Metrics:
- Publish an enterprise-wide canonical data model covering custody, wrappers, trading, and digital touchpoints.
- Stand … as-code controls delivering automated Consumer Duty & CASS MI.
- Migrate 60% of legacy ETL into an event-stream/Snowflake lakehouse pattern, cutting report latency to < 60 sec.
- Embed a data-product operating model across UK & India hubs; uplift data-literacy score by 25%.

Core Accountabilities
Architecture & Design Authority: Define reference architectures, patterns, and guardrails for data …
Data Pipeline Development:
- Design and implement end-to-end data pipelines in Azure Databricks, handling ingestion from various data sources, performing complex transformations, and publishing data to Azure Data Lake or other storage services.
- Write efficient and standardized Spark SQL and PySpark code for data transformations, ensuring data integrity and accuracy across … the pipeline.
- Automate pipeline orchestration using Databricks Workflows or integration with external tools (e.g., Apache Airflow, Azure Data Factory).

Data Ingestion & Transformation:
- Build scalable data ingestion processes to handle structured, semi-structured, and unstructured data from various sources (APIs, databases, file systems).
- Implement data transformation logic using Spark, ensuring data is cleaned, transformed, and enriched according to business requirements.
- Leverage Databricks features such as Delta Lake to manage and track changes to data, enabling better versioning and performance for incremental data loads.

Data Publishing & Integration:
- Publish clean, transformed data to Azure Data Lake or other cloud storage solutions for consumption by analytics and reporting …
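As an illustrative sketch only (the schema, field names, and records below are invented, and a production Databricks pipeline would express this in PySpark over Delta tables rather than plain Python), the cleaning-and-transformation step described above amounts to typing, normalising, and de-duplicating raw ingested records:

```python
from dataclasses import dataclass
from datetime import date

# Raw records as they might land from ingestion: untyped strings,
# inconsistent whitespace, and duplicates. Purely hypothetical data.
bronze = [
    {"order_id": "1001", "amount": "250.00", "order_date": "2024-03-01"},
    {"order_id": "1001", "amount": "250.00", "order_date": "2024-03-01"},  # duplicate row
    {"order_id": "1002", "amount": " 99.50", "order_date": "2024-03-02"},
]

@dataclass(frozen=True)
class SilverOrder:
    """Cleaned, strongly typed record ready for downstream consumption."""
    order_id: int
    amount: float
    order_date: date

def to_silver(records):
    """Type, clean, and de-duplicate raw records by business key."""
    seen, out = set(), []
    for r in records:
        if r["order_id"] in seen:
            continue  # drop duplicate ingests of the same order
        seen.add(r["order_id"])
        out.append(SilverOrder(
            order_id=int(r["order_id"]),
            amount=float(r["amount"].strip()),
            order_date=date.fromisoformat(r["order_date"]),
        ))
    return out

silver = to_silver(bronze)
```

The same shape carries over to Spark: the de-duplication becomes `dropDuplicates`, the typing becomes a schema cast, and the result is written incrementally to a Delta table.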
… are known for being a pioneer in cyber insurance and are looking for a Python Developer to join their growing team. This is a unique role with exposure to the data space in a high-functioning team. This role sits at the heart of a cyber data innovation team, focused on building and evolving a proprietary platform that powers … risk intelligence and customer protection. You'll work across large-scale data ingestion, performant APIs, real-time event handling, and quality-first design - all aimed at delivering actionable insights that matter.

Responsibilities:
- Designing and developing robust data ingestion pipelines and high-performance APIs
- Working with large, fast-moving datasets to improve product capability and data quality
- Supporting infrastructure (IaC) and CI/CD processes in Azure and AWS
- Collaborating across teams to model data for both operational and analytical use cases
- Leading on data quality metrics, testing automation, and clean documentation
- Mentoring and contributing to a modern, pairing-friendly engineering culture

Technology Stack: Python, SQL (Postgres, SQL Server), Linux/WSL, Azure …
Quantitative Developer - Risk Technology

Millennium is a top tier global hedge fund with a strong commitment to technology and data science. We are looking for a Quantitative Developer to join our Risk Technology team in London. The developer will use Python, AWS and data manipulation libraries to provide risk managers with data-driven insights on our fund’s risk profile.

Responsibilities:
- Work closely with quants, risk managers and other technologists in New York, Tel Aviv and Singapore to develop risk analytics solutions for our Fixed-Income business.
- Develop micro-services, data ingestion pipelines and APIs to feed tools and dashboards that monitor Millennium’s market risks.
- Analyse data with pandas/polars to present data in the most impactful way possible.
- Create and manage cloud applications using AWS.

Required skills/experience:
- Passion for working in finance, and broad understanding of financial instruments and services.
- Strong analytical skills & problem-solving capabilities.
- Ability to work independently in a fast-paced environment.
- Minimum 2 years’ experience working with Python, and data analysis libraries (pandas/Polars/NumPy).
- Experience with REST APIs and cloud services.
- Relational SQL database development experience.
- Unix/Linux command-line experience.
- Detail-oriented, organised …
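For a flavour of the aggregations such risk services feed to monitoring dashboards, here is a minimal sketch. The desks, instruments, and sensitivities are invented for illustration; the actual tooling runs pandas/polars over live position data, not this toy code.

```python
from collections import defaultdict

# Hypothetical positions feed; in practice this would arrive via an
# ingestion pipeline and be analysed with pandas or polars.
positions = [
    {"desk": "rates", "instrument": "UST 10Y", "dv01": 12_500.0},
    {"desk": "rates", "instrument": "Bund 10Y", "dv01": -4_000.0},
    {"desk": "credit", "instrument": "CDX IG", "dv01": 2_250.0},
]

def dv01_by_desk(rows):
    """Sum interest-rate sensitivity (DV01) per desk for a risk dashboard."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["desk"]] += row["dv01"]
    return dict(totals)

print(dv01_by_desk(positions))
```

In pandas the same aggregation is a one-liner, `df.groupby("desk")["dv01"].sum()`; the service layer would expose the result through a REST API for the dashboards mentioned above.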
Machine Learning Engineer
Location: York or Manchester

This position is for an experienced Machine Learning Engineer to join a newly established data science team. The primary focus is on building and maintaining the infrastructure to support the full data science lifecycle from data ingestion to model deployment, monitoring, and upgrades within Azure and Databricks environments. The … engineer will work closely with data scientists in a collaborative, cross-functional setting, helping transition models from research into production.

Key Responsibilities:
- Own and develop the deployment framework for all data science services.
- Oversee how data flows into the data science lifecycle from the wider business data warehouse.
- Oversee the automation of the data science lifecycle (dataset build, training, evaluation, deployment, monitoring) for the move to production.
- Automate the data science pipeline (data prep to deployment).
- Collaborate with cross-functional teams to ensure smooth productionisation of models.
- Write clean, production-ready Python code.
- Apply …