Mente's goal is to deeply understand the brain: to protect it from neurological disease and to enhance it in health. We do this by generating our own data, building brain foundation models, and translating discovery into real clinical and research impact. Role focus - Biological Data Infrastructure at Petabyte Scale. Key tasks: owning and scaling our data infrastructure by several orders of magnitude to handle multi-omic datasets of more than 100 petabytes, including data pipelines, distributed data processing, and storage systems; building a unified feature store for all our ML models and biological data analysis workflows; efficiently storing and loading petabytes of biological data for ML; processing and storing predictions and … evaluation metrics for large-scale biological forecasting and analysis models; implementing data versioning and point-in-time correctness systems for evolving biological datasets; and building observable, debuggable data pipelines that handle the complexity of multi-omic data sources. Expected growth - in 1 month you will be responsible for analyzing current data infrastructure bottlenecks and implementing initial optimizations to …
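One of the key tasks above, point-in-time correctness for evolving datasets, essentially means that any feature served to a model reflects only the data that was known at the time of the event being predicted, so later revisions cannot leak into training. Below is a minimal sketch of that idea using a pandas as-of join; the dataset, column names and values are hypothetical illustrations, not Mente's actual systems.

```python
import pandas as pd

# Hypothetical example: join each training event to the most recent assay
# measurement that was available *at or before* the event timestamp, so no
# future information leaks into the features.
events = pd.DataFrame({
    "sample_id": ["s1", "s2", "s1"],
    "event_time": pd.to_datetime(["2024-01-05", "2024-01-20", "2024-02-01"]),
}).sort_values("event_time")

measurements = pd.DataFrame({
    "sample_id": ["s1", "s2", "s1"],
    "measured_at": pd.to_datetime(["2024-01-01", "2024-01-10", "2024-01-30"]),
    "expression_level": [0.42, 0.31, 0.55],
}).sort_values("measured_at")

# merge_asof picks, for each event, the last measurement with
# measured_at <= event_time, matched per sample_id.
point_in_time_features = pd.merge_asof(
    events,
    measurements,
    left_on="event_time",
    right_on="measured_at",
    by="sample_id",
    direction="backward",
)
print(point_in_time_features)
```

The same as-of semantics scale up in a feature store or warehouse setting; the pandas version simply makes the rule visible on a handful of rows.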
SQL, ETL, Azure: a Senior Data Engineer is required to join a forward-thinking data team within a thriving city-based insurance group. This role will see you play a critical part in delivering reliable, scalable and business-focused data solutions. With a strong focus on Microsoft technologies and cloud-based tools, you'll work directly … making through data. The ideal candidate will have a strong background in insurance MI or reporting; experience within an MGA or insurance carrier is essential. Key responsibilities: deliver data solutions and changes that support evolving business requirements; build and maintain robust, scalable data pipelines using SQL and ETL best practices; collaborate with stakeholders to analyse, define and implement solutions to complex data challenges; proactively assess the impact of changes on the broader data model and ensure integrity is maintained; work alongside the MI/reporting team to ensure data is accurately reflected in dashboards and reporting tools; consult with business analysts, system owners and architects to align technical delivery with strategic objectives; build deep …
Ready for your next data engineering contract, and want to be part of a data-led company that's helped deliver over £100 million of social impact into communities? Our partner is looking for a Data Engineer on a contract basis. As the Data Engineer you will build, maintain and improve their data platform. You'll get stuck into every part of the data engineering ecosystem: building up the infrastructure and deployment pipelines, writing new ELT pipelines, performing transformations and preparing the data for analysis. Tech stack includes: Databricks, Azure Cloud, Python, SQL, C#, Azure DevOps pipelines, Terraform, Azure Data Factory (ADF). Location: fully remote - need to be based in the …
by Intrum, Europe's biggest credit management service, and have begun the next phase in our growth - expansion into 17 European markets over the next two years. The Role - Data Engineer: we are looking for our first dedicated data engineering hire to join our small but high-performing data team. You will be responsible for ensuring the scalability and reliability of our data infrastructure to power our machine learning and analytics workloads and support the company's growth. Our data stack: we work with a modern data stack built on Databricks and AWS, with Python and PySpark as our primary tools. In this role, you'll get to: own business-critical components and perform meaningful work with an impact on our company and our customers; design, implement and deliver data pipelines and infrastructure to support our machine learning and analytics products; collaborate as part of a fast-paced data team and help achieve the team's goals of delivering high-impact data products at scale; work with a modern data …
unique, however our collective purpose is to create an environment where everyone can 'Thrive in a rapidly evolving world'. We now have an exciting opportunity for an experienced Senior Data Engineer to join our fantastic team. This is a hybrid position, working at least two days a week in our London office. About the role: this is a new role in a growing team at the heart of Cognita's data transformation. You'll lead the design and architecture of data pipelines and integrations while remaining hands-on in delivery. Working with in-house data experts, regional stakeholders, and strategic partners, you'll build and optimise data pipelines on our Azure platform and help shape our future integration strategy. Who we are looking for: you have strong SQL and Python skills, solid data engineering fundamentals, and experience delivering pipelines in a cloud environment (Azure, AWS, or GCP). Experience with Azure Data Factory, Synapse, or Data Lake Gen2 is beneficial …
My client, based in the London area, is currently looking to recruit an experienced Data Engineer to join their AI Analytics team. They are one of the leaders within the AI space. They are currently going through a period of growth and are looking for an experienced Data Engineer to join their team. They are … demand and cutting-edge technology on the market right now. Main responsibilities: develop scalable systems and reusable components that serve as reference points for new team members; create efficient data workflows using distributed computing tools, particularly for large-scale processing and aggregation; build responsive, asynchronous APIs and backend processes capable of handling high data volumes with minimal delay; utilise AI-powered development tools to enhance productivity and code quality; collaborate within Agile teams to refine and monitor data collection systems using Scala and Java; apply sound engineering principles such as test-driven development and modular design. Preferred background: hands-on experience with Spark and Scala in commercial environments; familiarity with Java and Python; exposure to distributed data …
and Octopus Money. Check out the Seccl website for the latest on our products and our mission to shape the future of investments. The role: we're looking for a Senior Data Analytics Engineer to join our data team. You will play a key role in expanding and improving our data warehouse and BI capabilities. You … improvements in the team by suggesting improvements to existing processes, new technologies and ways of working. On a typical day you will be: working in an agile, T-shaped data team and with colleagues across all other teams; building and implementing a range of developments across areas including our pipelines, data warehouse and BI tool; hands-on in terms of development, testing, debugging, deployment and support; involved with specification and documentation; developing and promoting appropriate processes and best practice; providing expertise on promoting the best use of data and our tools throughout the company; helping develop and promote appropriate processes and best practice, always looking for better ways of working to drive scalability. You'll be successful if …
Technology is at the heart of Disney's past, present, and future. Disney Entertainment and ESPN Product & Technology is a global organization of engineers, product developers, designers, technologists, data scientists, and more - all working to build and advance the technological backbone for Disney's media business globally. The team marries technology with creativity to build world-class products, enhance … of people globally. Innovation: we develop and implement groundbreaking products and techniques that shape industry norms and solve complex and distinctive technical problems. The team is looking for a Senior Software Engineer to join our expanding quality engineering efforts across the Ads, Data, and eCommerce domains. As part of our mission to scale automation and enhance quality … s unique needs. In Ads, your work will help ensure that ad delivery systems serve relevant, timely, and high-quality advertisements, maximizing both viewer satisfaction and business revenue. For Data, you'll ensure the integrity and performance of large-scale pipelines that power analytics and insights used across the organization. In eCommerce, you'll help maintain seamless shopping experiences …
approach to technology, delivering bespoke software solutions and expert advice. Our clients are increasingly looking to us to help them make the best use of their data. In building data platforms and pipelines, our data engineers create the foundation for diverse data & analytics solutions, including data science and AI. They build data lakes and warehouses, create the processes to extract or access operational data, and transform siloed datasets into integrated data models that allow insight into business performance and problems, or the training of ML models. These are hands-on, client-facing roles, with openings at senior or lead level to suit your experience. You may be leading teams, setting technical direction, advising clients or solving tough engineering challenges. You'd also be expected to spend some time on-site with clients in the London area on an ad-hoc basis. Our data engineers combine a strong software engineering approach with solid data fundamentals and experience with modern tools. We're technology agnostic, and we're open-minded when it comes …
Job title: Data Engineer. Client: award-winning FinTech firm. Rate: up to £675 p/d. Duration: 6-month rolling contract, long-term engagement. Location: London. Skills: Python/Azure/Databricks/Snowflake. The role: design and build powerful data pipelines that fuel real-time decisions, machine learning, and next-gen analytics at scale. Transform raw data into trusted, usable assets - the foundation for everything from executive dashboards to product innovation. Own the data flow from ingestion to insight, using modern tools like Python, Azure, and Databricks. What you need: 7+ years of hands-on experience in data engineering roles; advanced skills in Python for data pipelines, transformation, and orchestration; a deep understanding of the Azure ecosystem (e.g. Data Factory, Blob Storage, Synapse); proficiency in Databricks (or strong equivalent experience with Apache Spark); a proven ability to work within enterprise-level environments with a focus on clean, scalable, and secure data solutions. If you are the right fit, contact me directly at bbirch@hunterbond.com.
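For a sense of what an ingestion-to-insight pipeline on this stack can look like, here is a minimal PySpark sketch that reads raw files, applies a simple cleaning transform and writes a Delta table. This is an illustrative outline under assumed names, not the client's actual pipeline: the storage path, column names and table name are hypothetical, and inside a Databricks notebook the `spark` session is already provided by the runtime.

```python
from pyspark.sql import SparkSession, functions as F

# Outside Databricks you create the session yourself; inside a Databricks
# notebook `spark` already exists.
spark = SparkSession.builder.appName("raw-to-trusted").getOrCreate()

# Hypothetical raw landing zone of JSON trade events in Azure storage.
raw = spark.read.json("abfss://landing@example.dfs.core.windows.net/trades/")

# Basic cleaning: drop malformed rows, normalise types, deduplicate,
# and stamp the load time.
trusted = (
    raw.dropna(subset=["trade_id", "amount"])
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("ingested_at", F.current_timestamp())
       .dropDuplicates(["trade_id"])
)

# Persist as a Delta table that dashboards and ML jobs can query.
(
    trusted.write.format("delta")
           .mode("overwrite")
           .saveAsTable("analytics.trusted_trades")
)
```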
Position: Mid-Level Analytics Engineer. Location: London (hybrid, 2 days per week in office). Salary: £50,000 - £65,000 per annum. Employment type: permanent, full-time. Work authorisation: candidates must reside in the UK and have existing work eligibility; sponsorship is not available. About us: we're a fast-growing, product-focused e-commerce business investing heavily in data to power smarter decisions across the company. Data plays a vital role in everything from product development to marketing performance. We're looking for a Mid-Level Analytics Engineer to support our data platform and contribute to our evolving analytics capabilities. Role overview: this role is ideal for someone with solid technical skills who enjoys working … collaboratively across teams to deliver accurate, actionable, and accessible data solutions. You'll work closely with senior engineers and stakeholders, supporting data pipeline development and helping improve data accessibility. Key responsibilities: assist in building and maintaining data pipelines using dbt and Snowflake, ensuring they are scalable and efficient; collaborate with analytics engineers and stakeholders across …
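The dbt-and-Snowflake work described here typically amounts to incremental transformations of raw data into analytics tables. As a rough illustration (not this company's actual models), the sketch below expresses the same incremental-upsert idea directly against Snowflake from Python; all connection details, table names and columns are hypothetical placeholders.

```python
import os
import snowflake.connector

# Hypothetical connection details pulled from environment variables.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="MARTS",
)

# Incremental upsert: merge only rows that arrived since the last load,
# which is roughly what a dbt incremental model compiles to.
merge_sql = """
MERGE INTO MARTS.FCT_ORDERS AS target
USING (
    SELECT order_id, customer_id, order_total, updated_at
    FROM RAW.ORDERS
    WHERE updated_at > (SELECT COALESCE(MAX(updated_at), '1970-01-01') FROM MARTS.FCT_ORDERS)
) AS source
ON target.order_id = source.order_id
WHEN MATCHED THEN UPDATE SET
    customer_id = source.customer_id,
    order_total = source.order_total,
    updated_at  = source.updated_at
WHEN NOT MATCHED THEN INSERT (order_id, customer_id, order_total, updated_at)
    VALUES (source.order_id, source.customer_id, source.order_total, source.updated_at)
"""

try:
    conn.cursor().execute(merge_sql)
finally:
    conn.close()
```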
renowned financial organisation you will be decommissioning a legacy commodities platform and then become part of a new platform team! What you'll need to succeed: great experience as a Data Engineer and strong recent financial services or insurance experience, working in regulated environments. You must be an expert with Databricks; Databricks certification is desirable. Experience with architecting and engineering Databricks solutions from scratch. Strong experience with Python or Spark. Expertise with modern data pipelines and running them in production. Fantastic communication skills and consultative/stakeholder management experience - you will help stakeholders use the platform. What you'll get in return: flexible working options available. What you need to do now: if you're interested in this role, click …
Employment Type: Contract
Rate: £675 - £700 per day (Inside IR35)
in small teams of exceptional people, who are relentlessly resourceful to solve problems and find smarter solutions than the status quo. Build the best technology in-house, using new data sources, machine learning and AI to make machines do the heavy lifting. About the role: as a Data Platform Engineer, you'll help shape and scale our data infrastructure, making analytics faster, more reliable, and cost-efficient. You'll work with AWS, Snowflake, Python, and Terraform, building tooling and onboarding new data sources. You'll collaborate closely with teams across the business, ensuring our platform is secure, scalable, and easy to use. Our team mission: we want to maximise business value by improving efficiency in our analytics, enhancing Lendable's competitive advantage through data utilisation. This means our data platform is designed to make extracting insights from data more efficient, while also ensuring cost control, compliance, and security. What will you be doing? This is a non-exhaustive list of the activities you would engage with in the role …
Your new company: working for a globally renowned bank. Your new role: you will be working as a Senior BI Engineer in a newly formed data product team. In this role you will be supporting Front Office operations, bringing strong expertise in Power BI and data engineering along with good stakeholder management experience. You will be designing and maintaining Power BI dashboards for trading, risk, and regulatory reporting, building data pipelines for real-time financial data, partnering with the front office to deliver analytics solutions, and optimising data models for front office reporting. What you'll need to succeed: strong business intelligence/data engineering experience with strong Power BI expertise; data engineering experience building production pipelines with Databricks; business experience working around the Front Office/Risk is a must; an understanding of regulatory reporting processes; financial product knowledge, e.g. equities, fixed income, derivatives; experience on Azure cloud platforms. What you'll get in return: flexible working options available. What you need to do now: if you're interested in …
Employment Type: Contract
Rate: £750 - £800 per day (Inside IR35)
At Hastings Direct, storing your data securely is very important to us. Please see our Data Protection Statement and Job Application Terms & Conditions for details on how your information will be stored. End date: September 1, 2025. Welcome to Hastings Direct. We're a digital insurance provider with ambitious plans to become The Best and Biggest in the UK market. We've made huge investments in our data and tech capabilities over the past few years, along with nurturing our 4Cs culture. We're proud of the journey we're on as a company and know that our continued success will rely on the … we would love to hear from you. Role overview - job details: you'll need to have demonstrable experience in: developing, documenting and maintaining robust processes that transform and clean data for input into a variety of live systems and analytical workflows; taking on larger projects where you will need to co-ordinate the division of tasks with colleagues and …
including examples such as Open Banking UK and PSD2. Programming proficiency: proficiency in programming languages such as Python, Scala, or Java is necessary for this role. Experience with cloud data platforms: the candidate should have experience working with cloud data platforms such as GCP (BigQuery, Dataflow) or Azure (Data Factory, Synapse). SQL expertise: being an expert …