London, South East England, United Kingdom Hybrid / WFH Options
Xcede Group
of new Trading desks across both the Systematic HFT and Discretionary businesses, they're now looking to hire an experienced Software Engineer/Data Engineer to work closely with Trading, Quant Research, and parallel Technology teams. This is a fast-paced, collaborative and innovative environment, suited to individuals … both autonomously and as part of a vibrant, rapidly expanding group. Your focus will be the design, build and implementation of advanced Big Data solutions to enable the capture, ingestion, storage, and provision of terabyte-sized alternate datasets. Tools you'll create include ETL/ELT data pipelines and frameworks, APIs, and data stores. Candidates suitable for this role will ideally have 3-8 years of blended Data Engineering and Software Engineering skills, including ETL/ELT pipelines for high-throughput (and low-latency) data, and expert Python skills, including …
combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title and Summary: Data Engineer II. Role description: Our Purpose: Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're … partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Data Engineer II Responsibilities: Independently execute projects through design, implementation, automation, and maintenance of large-scale enterprise ETL processes for a global client base. Develop … repeatable and scalable code that processes client data in an automated and efficient manner, ensuring data availability in the platform is as close to real time as possible. Act as an expert data resource within the team. Manage the process of data delivery …
We're looking for experienced Data Engineers to help design and deliver a Remediation Data Hub for a major financial services client. This is an exciting opportunity to work on a high-impact program focused on regulatory redress and data integrity. Location: UK-based … (Remote, with ad-hoc and expensed client visits). Tech Stack: Oracle, Snowflake, Informatica, PL/SQL, PySpark, Power BI. Build scalable, auditable data pipelines for ingestion, transformation, and downstream use. Work across multiple platforms (Oracle, Snowflake, Informatica). Collaborate with architects, analysts, and stakeholders to align data solutions with business and regulatory requirements. Develop data models and logic for segmentation, audit trails, and reporting. Support code reviews, testing, and documentation. Proficient in SQL, PL/SQL, or PySpark. Experienced with Oracle, Snowflake, and/or Informatica. Background in financial services, ideally with remediation …
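As a rough illustration of the auditable ingestion and transformation work described in the listing above, here is a minimal PySpark sketch; the file paths, column names, and audit fields are hypothetical and not taken from the listing.

```python
# Minimal sketch of an auditable ingestion step in PySpark (hypothetical names throughout).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("remediation_ingest").getOrCreate()

# Read a raw landing extract (assumed to be a headered CSV).
raw = spark.read.option("header", True).csv("/landing/remediation/accounts.csv")

# Standardise the data and stamp a simple audit trail for downstream reporting.
curated = (
    raw.withColumn("ingested_at", F.current_timestamp())
       .withColumn("source_system", F.lit("oracle_core"))
       .dropDuplicates(["account_id"])
)

# Persist to a curated zone for segmentation and Power BI reporting.
curated.write.mode("overwrite").parquet("/curated/remediation/accounts")
```

In practice the same pattern could write to Snowflake or Oracle tables rather than Parquet files, depending on where the hub's curated layer lives.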
global technical platform that supports these different processes and drives forward long-term solutions to enhance Group capabilities. We are looking for a Data Engineer to help achieve this goal - ideally someone who is comfortable diving into different tasks to support each team using a variety of coding … relationships across the company while explaining technical processes in the most appropriate way, and who keeps an eye on scalable solutions to support data growth. This is therefore an exciting opportunity to take on a role that combines complex data engineering, visual analytics and business-critical … and Airflow; experience in Kubernetes, Docker, Django, Spark and related monitoring tools for DevOps is a big plus (e.g. Grafana, Prometheus); experience with dbt for pipeline modeling is also beneficial; skilled at shaping needs into a solid set of requirements and designing scalable solutions to meet them; able to quickly understand …
Greater London, England, United Kingdom Hybrid / WFH Options
Pavegen
meets meaning. We’ve evolved from a kinetic energy pioneer into a next-generation AI-powered engagement platform that transforms footfall into energy, data, and dynamic experiences. Our mission? To revolutionise how people connect with the spaces around them - through intelligent, sustainable, and interactive technology. That’s where … management role. You'll already understand, and be in a position to architect, complex integrated environments that connect hardware (kinetic and solar floors), data collection modules, cloud infrastructure, and customer-facing digital experiences, while adding new technologies such as AI, Digital Twins and more, working with our in … lead the charge on building our smart infrastructure ecosystem - from IoT devices on the ground, through to digital twins, edge computing, and cloud data pipelines that drive actionable insight for our clients. Responsibilities: Technology Vision & Strategy: Own and evolve the company’s technology roadmap to support the next …
closely with internal teams to influence product features, particularly those focused on web standards (e.g., geolocation permissions). Partner Management: Identify and build a pipeline of new and existing partners suited to support varied technical feature rollouts. Stakeholder Collaboration: Work with cross-functional teams (sales, product, engineering) across regions … or solution engineering roles. Technical Knowledge: Strong grasp of JavaScript, HTML, and CSS for web consulting. Familiarity with SQL for partner and data pipeline management. Ability to read, understand, and troubleshoot code (full-time coding not required). Consulting Background: Experience in client-facing roles, ideally with a …
Osterley/Hybrid. Duration: 12 Months. We are seeking a highly skilled and experienced Back-End Developer with advanced expertise in Go (Golang), data analytics, and cloud-based backend services. This individual will play a key role in building scalable APIs, processing data pipelines, and supporting … services using Go (Golang). Design and implement RESTful APIs and GraphQL endpoints to support front-end applications. Process and optimize large-scale data pipelines for real-time analytics and reporting. Write clean, maintainable, and efficient code that adheres to best practices and coding standards. Utilize AWS cloud … services for scalable and secure backend application deployment. Ensure robust data storage and management using PostgreSQL and other database solutions. Implement authentication, authorization, and security best practices in backend services. Improve CI/CD pipelines for automated deployments and infrastructure management. Architecture and Performance: Lead performance optimization initiatives …
data to personalize interactions in a compliant manner, at scale (20m consumers and growing 10% MoM). Built out a data pipeline capable of handling millions of events daily. Enabled the use of multiple LLMs for a single interaction, optimized for the best performance while reducing …
We are hiring a Staff Software Engineer to join our Data Team. This is a crucial … position that will enable us to achieve project deliverables this year. You'll support our Genetic Data Squad, looking at withdrawals and supporting pipeline development and maintenance for our imputed releases. If you're looking for a new challenge, have experience working with genetic data … about human health and diseases, so future generations can live in good health for longer. Essential Duties and Responsibilities: Responsible for several interacting data pipelines/flows, ensuring these meet the user, business and technical requirements that have been prioritised. Leading hands-on development of new features, including …
in at least one key area of the software development stack, including but not limited to: frontend engineering, backend architecture design, data pipeline management, or distributed systems. Experience working in cross-functional teams and building integrations across diverse product stacks. Additional or Preferred Qualifications: Demonstrated full-stack … collaborate and contribute to a positive, inclusive work environment, fostering knowledge sharing and growth within the team. Responsibilities: Collaborate cross-functionally with product, data, and clinical teams to align on technical priorities and clarify scopes. Work independently across a wide range of the stack, integrating backend data …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
create modern systems using Microsoft Fabric. Act as a consultant, understanding client requirements and delivering the best solutions. Building and optimising data pipelines. Debugging issues. Keeping costs under control. Skills: Microsoft Fabric (Lakehouse and warehouse models); Azure tools (Data Factory, Synapse, SQL); Python experience … for programming; databases, SQL and NoSQL (Cosmos DB, KQL); data modelling: Kimball frameworks and 3NF. Nice to have: Databricks, Power BI, AI/ML, Azure infrastructure, DevOps. Certifications: Microsoft Certified: Fabric Analytics Engineer Associate; DP-203 Azure Data Engineering; AZ-305 Azure Solutions Architect. If this sounds …
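Purely as an illustration of the Kimball-style dimensional modelling mentioned in the listing above, here is a tiny Python sketch using pandas; the table and column names are invented, and in practice this kind of load would typically run inside Fabric pipelines or Spark rather than pandas.

```python
# Toy Kimball-style load: derive a customer dimension and a sales fact from a flat extract.
import pandas as pd

orders = pd.DataFrame({
    "order_id": [1, 2, 3],
    "customer_name": ["Acme", "Beta", "Acme"],
    "amount": [120.0, 75.5, 60.0],
})

# Dimension table: one surrogate key per distinct customer.
dim_customer = orders[["customer_name"]].drop_duplicates().reset_index(drop=True)
dim_customer["customer_key"] = dim_customer.index + 1

# Fact table: measures keyed by the surrogate key rather than the natural name.
fact_sales = orders.merge(dim_customer, on="customer_name")[["order_id", "customer_key", "amount"]]

print(dim_customer)
print(fact_sales)
```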
to innovation provide ample opportunities for professional growth and progression. Our Python Developer will enjoy working on a backend code base including algorithms, data pipelines, APIs and ML models. Apply now for immediate consideration for this exceptional Python Developer opportunity! Understanding Recruitment is acting as an employment agency …
and production support when needed. Key Requirements: Solid experience working with Openlink Endur, including configuration and customisation. Familiarity with downstream system integration and data pipelines around Endur. Strong knowledge of OpenJVS and SQL. Exposure to or experience with Java development. Experience with Linux, Git, RabbitMQ, FIX messaging, and …
existing tools. Enhancing automation for scaling infrastructure. Playing a key role in diversifying and scaling the platform. Evaluating options to replace the existing real-time data pipeline. Providing platform support to engineering. AppTek.ai provides AI-powered speech and language solutions including ASR, NMT, NLP/U, LLMs, and TTS, serving … diverse industries globally. Education Requirements: BS in a field related to Computational Linguistics or Computer/Data Science. Experience Requirements: 2+ years of industry experience (desirable for the Site Reliability Engineer role). Other Requirements: Strong knowledge of Linux. Strong knowledge of AWS. Docker. Scripting languages (Bash, Python). Familiarity … to monitor the platform. Respond to incidents, troubleshoot, and investigate root causes. Conduct post-incident investigations and reports. QED.ai provides AI-driven solutions for data scarcity in health and agriculture, offering tools for data digitization, geospatial mapping, and spectroscopy. Travel to exotic places around the world. Ask …
configurations. Manage Kubernetes clusters and containerized workloads. Administer GitLab infrastructure for CI/CD processes. Operate and maintain Kafka clusters for real-time data pipelines. Diagnose and resolve issues across systems, networks, containers, and applications. Use observability tools (Grafana, Prometheus, Kibana, Elasticsearch) to monitor system health. Automate system …
class AI-enabled platform in a high-ownership environment, where you’ll be leading development across the backend, ML service, ML infrastructure, and data pipelines. Fine-tune and deploy large language models for patent search, analysis, drafting, and other intellectual property workflows. Evaluate and improve large language model …
risk intelligence who are seeking a Senior Machine Learning Engineer. You will work across the end-to-end machine learning lifecycle, focusing on data engineering and maintaining real-time data pipelines. Your responsibilities will include developing and maintaining ETL processes, designing and implementing machine learning algorithms … and deploying and monitoring models in production environments. The Ideal Candidate. Education & Experience: Bachelor's or Master's degree in Computer Science, Data Science, or a related field. 7+ years of professional experience in areas such as fraud prevention or credit scoring. Machine Learning Expertise: Strong understanding of … machine learning algorithms and their practical applications. Experience with frameworks like TensorFlow, PyTorch, and scikit-learn. Data Engineering: Proficiency in developing and maintaining real-time data pipelines. Experience with ETL processes, Python, and SQL. Familiarity with big data technologies like Apache Hadoop and Apache …
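As a loose sketch of the model-building side of the role described above, assuming scikit-learn and entirely synthetic data (the feature meanings in the comment are invented for illustration):

```python
# Minimal fraud-scoring sketch with scikit-learn on synthetic data (illustrative only).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
X = rng.normal(size=(5000, 4))  # e.g. amount, velocity, device risk, geo distance
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=5000) > 1.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = GradientBoostingClassifier().fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]
print("Hold-out AUC:", roc_auc_score(y_test, scores))
```

In a production setting an estimator like this would sit behind the kind of real-time pipeline the listing mentions, with the ETL layer supplying the features.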