As a Senior Data Engineer, you will be responsible for the development of complex data sources and pipelines into our data platform (i.e. Snowflake) along with other data applications (e.g. Azure, Airflow) and automation. The Senior Data Engineer will work closely with the Data Architecture, Business Analyst, and Data Steward teams to integrate and align the requirements, specifications, and constraints of each element of the requirement. They will also need to help identify gaps in the resources, technology, or capabilities required, and work with the data engineering team to identify and implement solutions where appropriate. Work type: Contract. Length: initial 6 months. Work structure: hybrid, 2 days a week in London. Primary Responsibilities: Integrate data from multiple on-prem and cloud sources and systems. Handle data ingestion, transformation, and consolidation to create a unified and reliable data foundation for analysis and reporting. Develop data transformation routines to clean, normalize, and aggregate data. Apply data processing techniques to handle complex data structures, handle missing or inconsistent data, and…
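As a minimal sketch of the kind of cleaning/normalisation routine described above (the field names, date format, and defaulting rules are illustrative assumptions, not taken from the advert):

```python
from datetime import datetime

def clean_record(raw: dict) -> dict:
    """Normalise one raw source record: trim identifiers, coerce numeric
    amounts, and standardise dates, defaulting missing values explicitly."""
    amount = raw.get("amount")
    trade_date = raw.get("trade_date")
    return {
        # Strip stray whitespace that often arrives from CSV/flat-file feeds.
        "customer_id": str(raw.get("customer_id", "")).strip(),
        # Coerce to float, treating empty strings and None as 0.0.
        "amount": float(amount) if amount not in (None, "") else 0.0,
        # Standardise to ISO-8601; leave None when the date is missing.
        "trade_date": (datetime.strptime(trade_date, "%d/%m/%Y").date().isoformat()
                       if trade_date else None),
    }
```

In practice, records that fail these coercions would typically be routed to a quarantine table for review rather than silently dropped.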
Our platform simplifies all administrative tasks, allowing entrepreneurs and small business owners to manage their business admin effortlessly and without the usual complexities. By harnessing cutting-edge technology and data analytics, coupled with outstanding customer service, we provide a seamless experience that emphasises simplicity, efficiency, and rapid execution. Our goal is to remove obstacles, enabling you to achieve success … fostering a collaborative learning environment. API Development: Design and build RESTful API endpoints using NodeJS and Express, including detailed API documentation. Database Management: Design, optimize, and maintain complex SQL data models, ensuring efficient data structures, query performance, and reliability. Full-Stack Development: Develop responsive and performant front-end applications using ReactJS, focusing on UX/UI best … a proactive and solution-oriented mindset. Soft Skills: Excellent communication, organisational skills, and the ability to work independently or within a team. Preferred Qualifications: Advanced Degrees: Master’s in Data Science or Software Engineering. Cloud & DevOps: Experience with AWS cloud services, Terraform, Docker, Kubernetes. Agile/DevOps: Experience with Agile methodologies, CI/CD pipelines, and project management tools…
Candidates should take the time to read all the elements of this job advert carefully. Please make your application promptly. About Bauer Media Audio Data & Decision Sciences Team The Data and Decision Sciences (DDS) team is at the core of Bauer Media Audio with a mission to leverage data as a strategic enabler across our nine European markets. Our goal is to provide trusted, actionable insights and robust data solutions that empower business growth, enhance audience engagement, and drive operational efficiency. The DDS team operates as a collaborative, cross-functional unit that bridges the gap between data and business strategy, delivering solutions that align with our organizational goals and market needs. We are structured to support our stakeholders with a combination of centralized capabilities and localized expertise, ensuring that data drives value across the entire organization. We see data not as a support function but as an integral business partner that collaborates across all domains and markets to deliver impactful business outcomes. As a Senior Analytics…
production support and L3 cover. Required qualifications, capabilities, and skills: Very strong experience in Python. Strong knowledge of security and authentication (e.g. OIDC, OAuth). Strong knowledge of algorithms and data structures. Knowledge of Unix shell scripting and SQL. Expertise in software design using hexagonal architecture and Domain-Driven Design. Experience in REST API design/development. Experience with build…
from the US and the UK, including institutional VC funds and C-level executives of global technology companies. In this role, you will: Design and develop a highly scalable data platform capable of ingesting and processing hundreds of terabytes of training data. Design and develop methodologies and metrics to better understand the underlying quality, structure, and distribution of training data. Architect training data validation, integrity, and safety mechanisms for state-of-the-art ML models. Co-hire your future colleagues. Work closely with the founding team and contribute towards the best practices, standards, and culture of the company. What we are looking for: Back-end development: Experience in back-end engineering developing data platforms or large-scale extract-transform-load (ETL) pipelines. Programming languages: Proficiency in Python for data pipelines, distributed systems, and microservices. Cloud-native technologies: Experience in developing and deploying in cloud platforms (e.g., AWS, GCP, or Azure), an understanding of containerisation (e.g., Docker) and infrastructure-as-code software (e.g., Terraform). Algorithms and data structures: Excellent understanding of core CS fundamentals…
from the US and the UK, including institutional VC funds and C-level executives of global technology companies. In this role, you will: Design and develop a highly scalable data platform capable of ingesting and processing hundreds of terabytes of training data. Design and develop methodologies and metrics to better understand the underlying quality, structure, and distribution of training data. Architect training data validation, integrity, and safety mechanisms for state-of-the-art ML models. Be given a high degree of autonomy and ownership over your work. Co-hire your future colleagues. Work closely with the founding team and contribute towards the best practices, standards, and culture of the company. What we are looking for: Back-end development: 5+ years of industry experience in back-end engineering developing data platforms or large-scale extract-transform-load (ETL) pipelines. Programming languages: Proficiency in Python for data pipelines, distributed systems, and microservices. Cloud-native technologies: Experience in developing and deploying in cloud platforms (e.g., AWS, GCP, or Azure), an understanding of containerisation (e.g., Docker) and infrastructure-as-code…
Job Description: Be one of the first applicants; read the complete overview of the role below, then send your application for consideration. Join the Chief Data & Analytics Office (CDAO) at JPMorgan Chase and be part of a team that accelerates the firm's data and analytics journey. We focus on ensuring data quality and security while leveraging … insights to promote decision-making and support commercial goals through AI and machine learning. As an AI/ML Lead Software Engineer within the Chief Data & Analytics Office, you will become part of a mission to modernize compliance through scalable and explainable AI. We are building a system that answers the question “Can I use this data?” not with guesswork, but with prediction/classification, logic, proof, and intelligent automation. Our work sits at the intersection of applied machine learning, AI reasoning systems, and data governance. We are designing the triage layer of an intelligent decision engine that combines ML-driven classification, LLM-assisted parsing, and formal logic-based verification. This is an opportunity to tackle complex, ambiguous…
a different format of this document, please get in touch at [emailprotected] or call the TCS London Office on 02031552100 with the subject line “Application Support Request”. Role: Data Architect. Job Type: Permanent (Hybrid). Location: London, United Kingdom. Ready to utilize your skills in designing, creating, and managing data architecture? Join us as a Data Architect. … endless learning opportunities. • Be part of an exciting team where you will be challenged every day. • Build strong relationships with a diverse range of stakeholders. The Role: As a Data Architect, you will be responsible for designing, creating, and managing data architecture. You will also ensure that data is efficiently and securely stored, organized, and accessible across the enterprise. Build the foundation for databases, data warehouses, data lakes, and other data storage solutions, ensuring they meet both business and technical requirements. Key responsibilities: - Develop and design the data architecture framework for the organization. - Create models for databases, data warehouses, data lakes, and other storage solutions to store and manage data…
tackle services that infringe our member companies’ rights. The ideal candidate will have well-rounded technical knowledge. They will have a solid understanding of both backend and frontend technologies, data structures, OOP, and development best practices, and will have experience in relational databases and RESTful APIs. Strong communication skills and an analytical, solution-driven mindset are vital to this role. … across applications. Stay ahead of emerging technologies in full-stack and cloud development. Conduct testing, debugging, and continuously improve user experiences. Provide support in implementing solutions to improve the data mining strategy. May also provide support to other technical-related business activities. The selected candidate will receive training in specific processes and skills, as required. Requirements: Proven AWS qualifications, preferably in solutions architect, DevOps, or data engineer roles. Well-versed in cloud computing, automated tests, microservices architecture, continuous delivery/integration, and DevOps tools. Experience building and maintaining full-stack applications, including backend APIs and frontend user interfaces. Experience developing world-scale/multi-tenancy applications. Proficiency and commercial experience developing solutions in Python using FastAPI and…
production issues/outages. Work in an agile environment, based on research, collaboration, and learning from failure. Key Requirements: Experience across RFQ is ideal. Strong knowledge of design (patterns), algorithms, and data structures. Experience with the entire software development life cycle. Experience with building services from scratch and developing on top of existing components. Understanding of Continuous Delivery concepts.
with business users and prioritize requirements. Preferred Qualifications, Capabilities, and Skills: - Software development experience in commodities, finance, or investment banking preferred, or willingness to learn rapidly. - Strong knowledge of data structures, algorithms, and enterprise architecture. - Ability to collaborate with and influence other technology teams in a constantly changing environment. About Us: J.P. Morgan is a global leader in…
Job Summary: NineTech's client is looking for a highly skilled and experienced Data Architect to lead the design, development, and implementation of data architecture solutions across the organization. This role is INSIDE IR35 and pays up to £650 P/D DOE. Below is a breakdown of the role. The ideal candidate will possess deep expertise in ServiceNow, Oracle, Data Lakes, and Data Migration strategies. Knowledge of Elasticsearch and modern data engineering practices is highly desirable. Key Responsibilities: Design and maintain scalable, secure, and high-performing data architectures. Develop and implement data integration and migration strategies between legacy and modern systems. Lead architecture initiatives related to ServiceNow data modeling, integration, and reporting. Architect and support enterprise Data Lake solutions, ensuring data availability, quality, and consistency. Drive the design and execution of data migration projects, especially those involving Oracle-based systems. Collaborate with stakeholders across departments to gather data requirements and translate them into technical specifications. Implement and maintain Elasticsearch clusters and solutions…
from the US and the UK, including institutional VC funds and C-level executives of global technology companies. In this role, you will: Design and develop a highly scalable data platform capable of ingesting and processing hundreds of terabytes of training data. Design and develop methodologies and metrics to better understand the underlying quality, structure, and distribution of training data. Architect training data validation, integrity, and safety mechanisms for state-of-the-art ML models. Be given a high degree of autonomy and ownership over your work. Co-hire your future colleagues. Work closely with the founding team and contribute towards the best practices, standards, and culture of the company. What we are looking for: Back-end development: At least 2–3 years of industry experience in back-end engineering developing data platforms or large-scale extract-transform-load (ETL) pipelines. Programming languages: Proficiency in Python for data pipelines, distributed systems, and microservices. Cloud-native technologies: Experience in developing and deploying in cloud platforms (e.g., AWS, GCP, or Azure), an understanding of containerisation (e.g., Docker) and…
SQL Server Database Developer – London Markets Insurance. A specialist London Markets insurer is building out a brand-new, modern data platform and is looking for an experienced SQL Server Database Developer to join their growing data team. With strong executive support and significant investment, this is a rare greenfield opportunity to help define and build core data infrastructure from the ground up. The business is deeply committed to becoming a more data-driven organisation, and this role offers both stability and progression in a collaborative, high-impact environment. Why apply: Greenfield platform build with real architectural influence. Strong backing and long-term investment in data initiatives. Excellent career development opportunities within a growing team. … and optimise SQL Server databases to support analytics and operational workloads. Write high-quality, efficient T-SQL procedures, views, and functions. Develop ETL processes and support the integration of data from core insurance systems. Work closely with BI, data architecture, and business teams to shape data structures and pipelines. Support performance tuning, indexing, and optimisation across…
Posted: 26.06.2025. Expiry Date: 10.08.2025.
City of London, London, United Kingdom Hybrid / WFH Options
JobHeron
or master's degree in computer science (or related field). 5+ years of professional software development experience. Proficient in C++ (concurrent programming techniques: shared memory, atomics, and lock-free data structures). Strong understanding of data structures, algorithms, and software design principles. Excellent problem-solving and analytical skills. Working knowledge of the Linux C++ development environment: vim, gdb, make, valgrind, etc. Experience working in Linux environments with a good command of shell, Python, awk, sed. Motivation to understand/develop an understanding of various financial data elements and how they are used for trading. Ability to work collaboratively with a team and demonstrate passion for developing high-quality software. Proven experience in hands-on development and deployment of…
financial sector. The ideal candidate will have deep experience and understanding of test automation frameworks, an understanding of COTS (Commercial Off-The-Shelf) products, and experience in Data Modernization/Transformation programmes. As an Automation Test Lead, you will be responsible for designing and executing test plans, coordinating with cross-functional teams, and ensuring the quality of … support. Coordinate with cross-functional teams, including developers, business analysts, and project managers, to understand project requirements and develop test strategies. Potentially also support, advise, and assist with preparing test data, and with migrations testing, CI/CD, NFT, and UAT. Potentially create automated solutions around data inputs, outputs, and data source-to-target comparisons. Qualifications and Skills: Bachelor's degree … experience as an Automation Test Lead, with at least 5 years of automation testing experience in the banking and financial sector. Experience with cloud technologies, such as Azure Data Factory. In-depth experience, knowledge, and understanding of test automation frameworks with Karate, JSON, NUnit, Pytest, and Eggplant, along with GitHub Actions pipelines and the IntelliJ IDE. In-depth understanding of…
City of London, London, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
service is 100% free for all UK applicants. Applying through Talent Hero gives you access to global opportunities without navigating the typical hiring grind. You'll work alongside US product, data, and engineering teams to build scalable ML pipelines and turn advanced algorithms into business value. We are looking for Machine Learning Engineers who are technically sharp, production-minded, and excited to push the boundaries of intelligent systems. Responsibilities: Design, build, and deploy machine learning models to solve complex business problems. Develop scalable data pipelines for training, testing, and deploying models. Collaborate with data scientists, product teams, and software engineers to integrate ML into production. Optimize models for speed, accuracy, and efficiency in real-time environments. Monitor model … in a similar role (minimum 1 year). Strong proficiency in Python and popular ML frameworks (e.g., TensorFlow, PyTorch). Experience deploying machine learning models into production environments. Solid understanding of data structures, algorithms, and statistical learning. Familiarity with cloud platforms (AWS, Azure, or GCP) and ML pipeline orchestration. Bonus: Experience with deep learning, NLP, recommendation systems, or computer vision…
City of London, London, United Kingdom Hybrid / WFH Options
Korn Ferry
time each week, and 2 days working remotely. Rate negotiable. Skills & Requirements: Specific Software Skills: Strong expertise in C++ development, with a deep understanding of object-oriented programming, data structures, and algorithms. Experience with version control systems (e.g., Git), build systems, and continuous integration/continuous deployment (CI/CD) pipelines. Knowledge of other programming languages (e.g., Python, Java) and development tools…
or C++. Experience with gRPC and Google Protocol Buffers. Experience with caching technologies, e.g. Redis. Experience with infrastructure-as-code software, e.g. Terraform. Experience using and designing schemas/data structures in resilient SQL and NoSQL databases (e.g. CockroachDB). Familiarity with front-end technologies, like ReactJS. Certified Kubernetes and public cloud knowledge (e.g. CKAD and AWS certifications)…
Duration: contract to run until 31/12/2025. Rate: up to £644 p/d Umbrella, inside IR35. We are seeking an experienced Data Modeller with proven expertise in the London Market insurance sector. The successful candidate will play a key role in designing and validating data models that support enterprise data initiatives. This includes working closely with data engineers, architects, and business stakeholders to ensure data structures are scalable, accurate, and aligned with business needs. Key Skills/Requirements: Design and maintain conceptual, logical, and physical data models to support reporting, analytics, and operational systems. Collaborate with data engineers and analysts to ensure models are implemented correctly and efficiently. Translate complex business requirements into scalable and maintainable data structures. Ensure data models comply with data governance, compliance, and London Market regulatory standards. Document data definitions, relationships, and lineage using industry-standard modeling tools. Support data quality initiatives by identifying gaps and inconsistencies in source systems and downstream usage. Qualifications: London…
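As a hedged sketch of what translating a logical model ("a policy has many premium instalments") into a physical one can look like, using SQLite for portability; the table and column names are invented for illustration and are not from any London Market standard:

```python
import sqlite3

# Toy physical model for a policy/premium relationship. Keys, NOT NULL
# constraints, and the CHECK rule encode decisions made at the logical stage.
DDL = """
CREATE TABLE policy (
    policy_id   TEXT PRIMARY KEY,
    insured     TEXT NOT NULL,
    inception   DATE NOT NULL
);
CREATE TABLE premium (
    premium_id  INTEGER PRIMARY KEY,
    policy_id   TEXT NOT NULL REFERENCES policy(policy_id),
    amount      NUMERIC NOT NULL CHECK (amount >= 0)
);
"""

def build_schema(conn: sqlite3.Connection) -> list:
    """Apply the DDL and return the created table names, sorted."""
    conn.executescript(DDL)
    rows = conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
    ).fetchall()
    return [r[0] for r in rows]
```

In a production warehouse the same decisions would be expressed in the platform's own DDL dialect, but the model, and its documentation, stay the same.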
City of London, London, United Kingdom Hybrid / WFH Options
idpp
Junior Data Analyst – Fintech & Commercial Analytics Team (6–12 months). Rate: £350.00 - £400.00 per day inside IR35. Location: Hybrid, Central London. About the Role: Our global financial services client is seeking a Data Analyst to join their Fintech & Commercial Analytics Team. This role is ideal for someone with strong technical skills and a passion for uncovering insights from large datasets to support strategic decision-making. You’ll be responsible for implementing and maintaining PySpark data tables aligned with established data models and best practices. This includes translating business requirements into production-level code, particularly focused on payment cost analysis, and ensuring data quality through systematic validation. You’ll work closely with senior analysts and stakeholders to refine data pipelines, improve infrastructure, and document technical processes to ensure team-wide knowledge sharing and operational consistency. About the Team: You’ll be a key contributor to the Commercial Analytics unit within our Fintech department, a team committed to innovation, collaboration, and impact. Responsibilities: Working independently to collect, prepare, and write production-ready PySpark code. Translating…
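The "systematic validation" mentioned above can be illustrated with a batch-gating check of this shape; it is shown in plain Python for brevity (in the role it would run over PySpark DataFrames), and the column names and rules are assumptions, not from the advert:

```python
def validate_payment_costs(rows):
    """Return a list of data-quality failures for a payment-cost extract.
    An empty list means the batch passes and can be published downstream."""
    errors = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Every payment must carry a unique, non-empty transaction id.
        txn = row.get("txn_id")
        if not txn:
            errors.append(f"row {i}: missing txn_id")
        elif txn in seen_ids:
            errors.append(f"row {i}: duplicate txn_id {txn}")
        else:
            seen_ids.add(txn)
        # Costs must be non-negative and plausible relative to the amount.
        cost, amount = row.get("cost", 0.0), row.get("amount", 0.0)
        if cost < 0:
            errors.append(f"row {i}: negative cost {cost}")
        elif amount > 0 and cost > amount:
            errors.append(f"row {i}: cost {cost} exceeds amount {amount}")
    return errors
```

Wiring a check like this into the pipeline means a bad extract fails loudly before it reaches reporting, rather than silently skewing payment-cost figures.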
Voluntary Benefits: Health Cash Plan, Dental, Will Writing, etc. Annual Leave: 23 days, rising to 27 with length of service. Sick Pay: increasing with length of service. The Role: Data Architect. As the Data Architect, you will join a small but powerful data team with a mission to transform the business into a data-driven organisation. You will be responsible for designing, developing, and maintaining the enterprise data architecture, ensuring alignment with business objectives and long-term scalability. Reporting to the Head of Data, this role will define and maintain data standards, data models, and data governance frameworks to enable effective data management and analytics across the organisation. The ideal candidate will have strong expertise in data architecture, data modelling, and data governance, with hands-on experience in Azure Databricks and Unity Catalog for metadata management and scalable data & analytics. Key responsibilities: Define and enforce enterprise data architecture principles, policies, and standards in collaboration with the Head of Data. Develop and maintain…