As a Senior Data Engineer, you will be responsible for developing complex data sources and pipelines into our data platform (e.g. Snowflake) along with other data applications (e.g. Azure, Airflow) and automation. The Senior Data Engineer will work closely with the Data Architecture, Business Analyst, and Data Steward teams to integrate and … align the requirements, specifications and constraints of each element. They will also help identify gaps in resources, technology, or capabilities, and work with the data engineering team to identify and implement solutions where appropriate. Work type: Contract. Length: initial 6 months. Work structure: hybrid, 2 days a week in London. Primary Responsibilities: Integrate data from multiple on-prem and cloud sources and systems. Handle data ingestion, transformation, and consolidation to create a unified and reliable data foundation for analysis and reporting. Develop data transformation routines to clean, normalize, and aggregate data. Apply data processing techniques to handle complex data structures, handle missing or inconsistent data, and …
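The cleaning, normalising, and aggregating routines described above can be sketched in Python with pandas. This is a minimal illustrative example, not the platform's actual codebase — the `region` and `amount` column names are invented for illustration:

```python
import pandas as pd

def clean_and_aggregate(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean, normalise, and aggregate a raw extract (illustrative only)."""
    df = raw.copy()
    # Normalise inconsistent text values so " North" and "north" group together
    df["region"] = df["region"].str.strip().str.lower()
    # Handle missing data: drop rows where the measure is absent
    df = df.dropna(subset=["amount"])
    # Aggregate to one row per region for downstream reporting
    return df.groupby("region", as_index=False)["amount"].sum()

raw = pd.DataFrame({
    "region": [" North", "north", "South ", "south", "East"],
    "amount": [10.0, 5.0, 7.0, 3.0, None],
})
summary = clean_and_aggregate(raw)
print(summary)  # two rows: north 15.0, south 10.0 (East dropped — missing amount)
```

In a real pipeline the same pattern would run per-source before consolidation into the warehouse.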
Our platform simplifies all administrative tasks, allowing entrepreneurs and small business owners to manage their business admin effortlessly and without the usual complexities. By harnessing cutting-edge technology and data analytics, coupled with outstanding customer service, we provide a seamless experience that emphasises simplicity, efficiency, and rapid execution. Our goal is to remove obstacles, enabling you to achieve success … fostering a collaborative learning environment. API Development: Design and build RESTful API endpoints using NodeJS and Express, including detailed API documentation. Database Management: Design, optimize, and maintain complex SQL data models, ensuring efficient data structures, query performance, and reliability. Full-Stack Development: Develop responsive and performant front-end applications using ReactJS, focusing on UX/UI best … a proactive and solution-oriented mindset. Soft Skills: Excellent communication, organisational skills, and the ability to work independently or within a team. Preferred Qualifications: Advanced Degrees: Master’s in Data Science or Software Engineering. Cloud & DevOps: Experience with AWS cloud services, Terraform, Docker, Kubernetes. Agile/DevOps: Experience with Agile methodologies, CI/CD pipelines, and project management tools …
from the US and the UK, including institutional VC funds and C-level executives of global technology companies. In this role, you will: Design and develop a highly-scalable data platform capable of ingesting and processing hundreds of terabytes of training data. Design and develop methodologies and metrics to better understand the underlying quality, structure and distribution of training … data. Architect training data validation, integrity and safety mechanisms for state-of-the-art ML models. Co-hire your future colleagues. Work closely with the founding team and contribute towards best practices, standards, and culture of the company. What we are looking for: Back-end development: Experience in back-end engineering developing data platforms or large-scale extract … transform-load (ETL) pipelines. Programming languages: Proficiency in Python for data pipelines, distributed systems and microservices. Cloud-native technologies: Experience in developing and deploying in cloud platforms (e.g., AWS, GCP or Azure), an understanding of containerisation (e.g., Docker) and infrastructure-as-code software (e.g., Terraform). Algorithms and data structures: Excellent understanding of core CS fundamentals …
from the US and the UK, including institutional VC funds and C-level executives of global technology companies. In this role, you will: Design and develop a highly-scalable data platform capable of ingesting and processing hundreds of terabytes of training data. Design and develop methodologies and metrics to better understand the underlying quality, structure and distribution of training … data. Architect training data validation, integrity and safety mechanisms for state-of-the-art ML models. Be given a high degree of autonomy and ownership over your work. Co-hire your future colleagues. Work closely with the founding team and contribute towards best practices, standards, and culture of the company. What we are looking for: Back-end development: 5+ … years of industry experience in back-end engineering developing data platforms or large-scale extract-transform-load (ETL) pipelines. Programming languages: Proficiency in Python for data pipelines, distributed systems and microservices. Cloud-native technologies: Experience in developing and deploying in cloud platforms (e.g., AWS, GCP or Azure), an understanding of containerisation (e.g., Docker) and infrastructure-as-code …
a different format of this document, please get in touch at [emailprotected] or call TCS London Office number 02031552100 with the subject line: “Application Support Request”. Role: Data Architect. Job Type: Permanent (Hybrid). Location: London, United Kingdom. Ready to utilize your skills in designing, creating, and managing data architecture? Join us as a Data Architect. … endless learning opportunities. • Be part of an exciting team where you will be challenged every day. • Build strong relationships with a diverse range of stakeholders. The Role: As a Data Architect, you will be responsible for designing, creating, and managing data architecture. You will also ensure that data is efficiently and securely stored, organized, and accessible across … the enterprise. Build the foundation for databases, data warehouses, data lakes, and other data storage solutions, ensuring they meet both business and technical requirements. Key responsibilities: - Develop and design the data architecture framework for the organization. - Create models for databases, data warehouses, data lakes, and other storage solutions to store and manage data …
tackle services that infringe our member companies’ rights. The ideal candidate will have well-rounded technical knowledge. They will have a solid understanding of both backend and frontend technologies, data structures, OOP, development best practices and have experience in relational databases and RESTful APIs. Strong communication skills and an analytical, solution-driven mindset are vital to this role. … across applications. Stay ahead of emerging technologies in full-stack and cloud development. Conduct testing, debugging, and continuously improve user experiences. Provide support in implementing solutions to improve the data mining strategy. May also provide support to other technical-related business activities. The selected candidate will receive training in specific processes and skills, as required. Requirements: Proven AWS qualifications … preferably in solutions architect, DevOps or data engineer. Well-versed in cloud computing, automated tests, microservices architecture, continuous delivery/integration and DevOps tools. Experience building and maintaining full-stack applications, including backend APIs and frontend user interfaces. Experience developing world-scale/multi-tenancy applications. Proficiency and commercial experience developing solutions in Python using FastAPI and/…
production issues/outages. Work in an agile environment, based on research, collaboration and learning from failure. Key Requirements: Experience across RFQ is ideal. Strong design (patterns), algorithms and data structures knowledge. Experience with the entire software development life cycle. Experience with building services from scratch and developing on top of existing components. Understanding of Continuous Delivery concepts. …
Job Summary: NineTech's client is looking for a highly skilled and experienced Data Architect to lead the design, development, and implementation of data architecture solutions across the organization. This role is an INSIDE IR35 position paying up to £650 P/D DOE. Below is a breakdown of the role. The ideal candidate will possess deep … expertise in ServiceNow, Oracle, Data Lakes, and Data Migration strategies. Knowledge of Elasticsearch and modern data engineering practices is highly desirable. Key Responsibilities: Design and maintain scalable, secure, and high-performing data architectures. Develop and implement data integration and migration strategies between legacy and modern systems. Lead architecture initiatives related to ServiceNow data … modeling, integration, and reporting. Architect and support enterprise Data Lake solutions, ensuring data availability, quality, and consistency. Drive the design and execution of data migration projects, especially those involving Oracle-based systems. Collaborate with stakeholders across departments to gather data requirements and translate them into technical specifications. Implement and maintain Elasticsearch clusters and solutions …
from the US and the UK, including institutional VC funds and C-level executives of global technology companies. In this role, you will: Design and develop a highly-scalable data platform capable of ingesting and processing hundreds of terabytes of training data. Design and develop methodologies and metrics to better understand the underlying quality, structure and distribution of training … data. Architect training data validation, integrity and safety mechanisms for state-of-the-art ML models. Be given a high degree of autonomy and ownership over your work. Co-hire your future colleagues. Work closely with the founding team and contribute towards best practices, standards, and culture of the company. What we are looking for: Back-end development: At … least 2–3 years of industry experience in back-end engineering developing data platforms or large-scale extract-transform-load (ETL) pipelines. Programming languages: Proficiency in Python for data pipelines, distributed systems and microservices. Cloud-native technologies: Experience in developing and deploying in cloud platforms (e.g., AWS, GCP or Azure), an understanding of containerisation (e.g., Docker) and …
SQL Server Database Developer – London Markets Insurance. A specialist London Markets insurer is building out a brand-new, modern data platform and is looking for an experienced SQL Server Database Developer to join their growing data team. With strong executive support and significant investment, this is a rare greenfield opportunity to help define and build core data … infrastructure from the ground up. The business is deeply committed to becoming a more data-driven organisation, and this role offers both stability and progression in a collaborative, high-impact environment. Why apply: Greenfield platform build with real architectural influence. Strong backing and long-term investment in data initiatives. Excellent career development opportunities within a growing team. … and optimise SQL Server databases to support analytics and operational workloads. Write high-quality, efficient T-SQL procedures, views, and functions. Develop ETL processes and support the integration of data from core insurance systems. Work closely with BI, data architecture, and business teams to shape data structures and pipelines. Support performance tuning, indexing, and optimisation across …
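The ETL pattern described — extract from a source system, transform, and load into a reporting table — can be sketched with Python's standard-library sqlite3 module. This is a hedged illustration: the real platform would target SQL Server and T-SQL, and the `policies` and `premium_summary` tables and their columns are invented for the example:

```python
import sqlite3

# In-memory database standing in for both the source system and the warehouse
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policies (id INTEGER, line_of_business TEXT, premium REAL)")
conn.executemany(
    "INSERT INTO policies VALUES (?, ?, ?)",
    [(1, "Marine", 1000.0), (2, "Marine", 500.0), (3, "Property", 2000.0)],
)

# Transform + load: aggregate premium per line of business into a reporting table
conn.execute("""
    CREATE TABLE premium_summary AS
    SELECT line_of_business, SUM(premium) AS total_premium
    FROM policies
    GROUP BY line_of_business
""")

rows = conn.execute(
    "SELECT line_of_business, total_premium FROM premium_summary ORDER BY line_of_business"
).fetchall()
print(rows)  # [('Marine', 1500.0), ('Property', 2000.0)]
```

In production the same aggregate would typically live in a T-SQL view or stored procedure rather than a `CREATE TABLE AS` statement.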
City of London, London, United Kingdom Hybrid / WFH Options
JobHeron
or master’s degree in computer science (or related field). 5+ years of professional software development experience. Proficient in C++ (concurrent programming techniques: shared memory, atomics and lock-free data structures). Strong understanding of data structures, algorithms, and software design principles. Excellent problem-solving and analytical skills. Working knowledge of the Linux C++ development environment: vim, gdb … make, valgrind etc. Experience working in Linux environments with a good command of shell, python, awk, sed. Motivation to understand/develop an understanding of various financial data elements and how they are used for trading. Ability to work collaboratively with a team and demonstrate passion for developing high-quality software. Proven experience in hands-on development and deployment of …
financial sector. The ideal candidate will have deep experience and understanding of test automation frameworks, an understanding of COTS (Commercial Off-The-Shelf) products, and experience in Data Modernization/Transformation programmes. As an Automation Test Lead, you will be responsible for designing and executing test plans, coordinating with cross-functional teams, and ensuring the quality of … support. Coordinate with cross-functional teams, including developers, business analysts, and project managers, to understand project requirements and develop test strategies. Potentially also support, advise, and assist with preparing test data, migration testing, CI/CD, NFT and UAT. Potentially create automated solutions around data inputs, outputs and data source-target comparisons. Qualifications and Skills: Bachelor's degree … experience as an Automation Test Lead, with at least 5 years of experience in automation testing in the banking and financial sector. Experience with cloud technologies, such as Azure Data Factory. In-depth experience, knowledge and understanding of test automation frameworks with Karate, JSON, NUnit, Pytest, Eggplant; along with the GitHub Actions pipeline and the IntelliJ IDE. In-depth understanding of …
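A source-target comparison of the kind mentioned — checking that every record extracted from a source system landed in the target — can be sketched as a small Python helper suitable for a Pytest suite. The function name, record shape, and key field are invented for illustration:

```python
def compare_source_target(source_rows, target_rows, key):
    """Compare two extracts by key column; report rows missing from either side."""
    source_keys = {row[key] for row in source_rows}
    target_keys = {row[key] for row in target_rows}
    return {
        "missing_in_target": sorted(source_keys - target_keys),
        "extra_in_target": sorted(target_keys - source_keys),
    }

# Toy migration check: row 1 was never migrated, row 4 appeared from nowhere
source = [{"id": 1}, {"id": 2}, {"id": 3}]
target = [{"id": 2}, {"id": 3}, {"id": 4}]
result = compare_source_target(source, target, key="id")
print(result)  # {'missing_in_target': [1], 'extra_in_target': [4]}
```

Wrapped in a Pytest test, an assertion that both lists are empty becomes a reusable migration-quality gate.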
City of London, London, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
service is 100% free for all UK applicants. Applying through Talent Hero gives you access to global opportunities without navigating the typical hiring grind. You'll work alongside US product, data, and engineering teams to build scalable ML pipelines and turn advanced algorithms into business value. We are looking for Machine Learning Engineers who are technically sharp, production-minded, and … excited to push the boundaries of intelligent systems. Responsibilities: Design, build, and deploy machine learning models to solve complex business problems. Develop scalable data pipelines for training, testing, and deploying models. Collaborate with data scientists, product teams, and software engineers to integrate ML into production. Optimize models for speed, accuracy, and efficiency in real-time environments. Monitor model … in a similar role (minimum 1 year). Strong proficiency in Python and popular ML frameworks (e.g., TensorFlow, PyTorch). Experience deploying machine learning models into production environments. Solid understanding of data structures, algorithms, and statistical learning. Familiarity with cloud platforms (AWS, Azure, or GCP) and ML pipeline orchestration. Bonus: Experience with deep learning, NLP, recommendation systems, or computer vision …
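The train → predict → evaluate loop at the core of such a pipeline can be sketched in pure Python. A nearest-centroid classifier stands in for a real model (production would use TensorFlow or PyTorch, as the ad notes), and the two-cluster data is synthetic:

```python
def train(features, labels):
    """Compute one centroid per class — the 'model' is a dict of centroids."""
    centroids = {}
    for label in set(labels):
        points = [f for f, l in zip(features, labels) if l == label]
        dim = len(points[0])
        centroids[label] = [sum(p[i] for p in points) / len(points) for i in range(dim)]
    return centroids

def predict(model, feature):
    """Assign the class whose centroid is nearest (squared Euclidean distance)."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(feature, centroid))
    return min(model, key=lambda label: dist(model[label]))

# Synthetic two-cluster training data
X = [(0.0, 0.1), (0.2, 0.0), (1.0, 1.1), (0.9, 1.0)]
y = ["low", "low", "high", "high"]
model = train(X, y)
print(predict(model, (0.1, 0.0)))  # low
print(predict(model, (1.0, 1.0)))  # high
```

The same shape — a fit step producing an artifact, a stateless predict step consuming it — is what gets wrapped in orchestration and monitoring when the pipeline moves to production.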
for 3 days on-site each week, and 2 days remote. Skills & Requirements: Specific Software Skills: Strong expertise in C++ development, with a deep understanding of object-oriented programming, data structures, and algorithms. Experience with version control systems (e.g., Git), build systems, and CI/CD pipelines. Knowledge of other programming languages (e.g., Python, Java) and development tools …
a client of ours looking to hire for one of their teams that is developing an AI-first product to support commercial real estate investment decisions. Our application pulls data from a variety of sources, applies market-leading machine learning, and presents insights through innovative visualizations. After proving our product’s value as an internal tool, we were acquired … attention to detail. Good communication skills and a team player. Proficient in Python: demonstrated experience working in teams of Python developers on large projects. Solid understanding of algorithms and data structures. Experience with building features using test-driven development. Solid experience using Git for version control. Solid grasp of data concepts (relational databases, data cleansing, validation). … AWS, Azure). Broad understanding of Financial Services/Capital Markets/Asset Management. Experience working with geospatial data. Experience in feature engineering for Machine Learning applications. Experience with data engineering frameworks. Portfolio of past experience (e.g., demos of past work, contributions to open source, blogs, talks). Technical Stack: Data Pipeline Stack: Python 3, pandas, GeoPandas, boto3 …
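The data cleansing and validation mentioned above, built test-first, might look like the following pure-Python sketch. The record shape (`id`, `price`) and the rejection rules are invented for illustration:

```python
def cleanse(records):
    """Validate and cleanse raw records: require an 'id', coerce 'price' to float,
    and reject rows that fail either rule (illustrative field names)."""
    clean, rejected = [], []
    for rec in records:
        try:
            if rec.get("id") is None:
                raise ValueError("missing id")
            clean.append({"id": int(rec["id"]), "price": float(rec["price"])})
        except (ValueError, TypeError, KeyError):
            rejected.append(rec)
    return clean, rejected

# TDD-style checks written against the expected behaviour before implementing
raw = [{"id": 1, "price": "10.5"}, {"id": None, "price": "3"}, {"id": 2, "price": "oops"}]
clean, rejected = cleanse(raw)
print(clean)          # [{'id': 1, 'price': 10.5}]
print(len(rejected))  # 2
```

In a test-driven workflow the two expectations in the comments would be written as failing tests first, then the `cleanse` body filled in to make them pass.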
City of London, London, United Kingdom Hybrid / WFH Options
idpp
Junior Data Analyst – Fintech & Commercial Analytics Team (6–12 months). Rate: £350.00 - £400.00 per day inside IR35. Location: Hybrid, Central London. About the Role: Our global financial services client is seeking a Data Analyst to join their Fintech & Commercial Analytics Team. This role is ideal for someone with strong technical skills and a passion for uncovering insights … from large datasets to support strategic decision-making. You’ll be responsible for implementing and maintaining PySpark data tables aligned with established data models and best practices. This includes translating business requirements into production-level code, particularly focused on payment cost analysis, and ensuring data quality through systematic validation. You’ll work closely with senior analysts and … stakeholders to refine data pipelines, improve infrastructure, and document technical processes to ensure team-wide knowledge sharing and operational consistency. About the Team: You’ll be a key contributor to the Commercial Analytics unit within our Fintech department, a team committed to innovation, collaboration, and impact. Responsibilities: Working independently to collect, prepare, and write production-ready PySpark code. Translating …
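A payment-cost aggregation with a data-quality gate, of the kind described, might look like the sketch below — written with pandas for brevity, though the production version would be PySpark (the `groupby`/`assign` logic translates directly to DataFrame operations there). The schemes, rates, and column names are invented for illustration:

```python
import pandas as pd

def payment_costs(payments: pd.DataFrame) -> pd.DataFrame:
    """Per-scheme payment cost: amount * rate + fixed fee, behind a quality gate."""
    # Systematic validation before any aggregation
    if (payments["amount"] < 0).any():
        raise ValueError("negative payment amounts in input")
    df = payments.assign(
        cost=payments["amount"] * payments["fee_rate"] + payments["fixed_fee"]
    )
    return df.groupby("scheme", as_index=False)["cost"].sum()

payments = pd.DataFrame({
    "scheme":    ["visa", "visa", "mastercard"],
    "amount":    [100.0, 200.0, 50.0],
    "fee_rate":  [0.01, 0.01, 0.02],
    "fixed_fee": [0.10, 0.10, 0.20],
})
result = payment_costs(payments)
print(result)  # mastercard 1.2, visa 3.2
```

Failing fast on invalid input, rather than silently aggregating it, is the "systematic validation" the ad describes.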
Voluntary Benefits: Health Cash Plan, Dental, Will Writing etc. Annual Leave: 23 days rising to 27 with length of service. Sick Pay: increasing with length of service. The Role: Data Architect. As the Data Architect, you will join a small but powerful data team with a mission to transform the business into a data-driven organisation. … You will be responsible for designing, developing, and maintaining the enterprise data architecture, ensuring alignment with business objectives and long-term scalability. Reporting to the Head of Data, this role will be responsible for defining and maintaining data standards, data models, and data governance frameworks to enable effective data management and analytics across the … organisation. The ideal candidate will have strong expertise in data architecture, data modelling, and data governance, with hands-on experience in Azure Databricks and Unity Catalog for metadata management and scalable data & analytics. Key responsibilities: Define and enforce enterprise data architecture principles, policies, and standards in collaboration with the Head of Data. Develop and maintain …
City of London, London, United Kingdom Hybrid / WFH Options
Synchro
Job Title: KDB Developer. Location: London - Hybrid. Type: Permanent Position. We are actively recruiting multiple KDB Developers at various levels for a fast-paced, data-driven organisation at the forefront of real-time analytics and high-performance computing. This organisation is a consultancy working with some of the leading names in the financial services sector. The Role: As a … KDB Developer, you will be responsible for designing, developing, and maintaining high-performance applications and data analytics solutions using kdb+/q. You’ll work closely with quants, traders, and data scientists to deliver scalable systems and actionable insights from large volumes of time-series data. Key Responsibilities: Design, implement, and optimise kdb+/q-based applications and … data pipelines. Work on real-time data ingestion, transformation, and analysis. Collaborate with stakeholders to gather requirements and translate them into technical solutions. Maintain and enhance existing codebases, ensuring high availability and performance. Contribute to architectural decisions and best practices for kdb+ systems. Troubleshoot and resolve production issues quickly and effectively. Required Skills & Experience: Strong hands-on experience …
years of hands-on work experience as a Java Developer. Strong technical background in Java and Spring Boot. Good experience in developing microservices. Knowledge of design patterns, data structures, and algorithms. Familiarity with microservices, SQL, Kafka, and relational databases. Exposure to Amazon Web Services (AWS) or cloud technologies. Good understanding of Docker, containers, and images. Knowledge of Java …
that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. Inviting applications for the role of Principal Consultant (4C), UI Developer! We are seeking a dynamic and results-oriented individual to join our team as Principal … front-end development best practices. Experience with state management libraries (e.g., Redux, Context API). Familiarity with RESTful APIs and integrating front-end with back-end services. Knowledge of data structures, design patterns, and algorithms. Awareness of the full development cycle, from analysis to first-level production support. Preferred Qualifications/Skills: AMPS messaging highly desirable. Financial knowledge …
Science, Mathematics, Physics, Engineering, or a similar field. 5+ years of hands-on experience developing Java applications in a financial services environment (Java 21). Excellent knowledge of algorithms and data structures, object-oriented design, and microservices architecture (this will be tested). Proven record of complex software delivery in the Interest Rates domain. Excellent Java 21, Spring Boot, strong …
City of London, London, England, United Kingdom Hybrid / WFH Options
Client Server Ltd
skills. You have a good knowledge of AWS. You have experience of working on highly scalable systems. You have a strong knowledge of Computer Science fundamentals such as OOP, Data Structures, and Design Patterns. You have experience with, or a strong interest in, Artificial Intelligence and are keen to explore the possibilities further (ChatGPT knowledge). You have strong analysis …
City of London, Greater London, UK Hybrid / WFH Options
Trust In SODA
deployment, ensuring high-quality delivery and continuous improvement. Database Mastery: You have a deep understanding of relational databases (ideally PostgreSQL) and extensive experience in building, managing, and optimizing complex data structures and queries. Problem Solver: You possess exceptional analytical and problem-solving skills, able to tackle complex technical challenges and deliver innovative solutions. Bonus Points: Experience within the …
The role. Position: Data Architect. Contract type: Full Time/Permanent. Reporting to: Head of Data. Location: London. Overview of role: As the Data Architect, you will join a small but powerful data team, with a mission to transform Zodiac Maritime into a data-driven organisation. You will be responsible for designing, developing, and maintaining … Zodiac Maritime’s enterprise data architecture, ensuring alignment with business objectives and long-term scalability. Reporting to the Head of Data, this role will be responsible for defining and maintaining data standards, data models, and data governance frameworks to enable effective data management and analytics across the organisation. The ideal candidate will have strong … expertise in data architecture, data modelling, and data governance, with hands-on experience in Azure Databricks and Unity Catalog for metadata management and scalable data & analytics. Key responsibilities and primary deliverables: Define and enforce enterprise data architecture principles, policies, and standards in collaboration with the Head of Data. Develop and maintain current (“as-is”) and …