Data Engineer - Subject Matter Expert (SME) Northern, VA area Full-Time On-site Position Contingent Upon Award Join Synertex LLC and bring your expertise to a mission that matters. We're looking for a Data Engineer SME to construct and maintain scalable data pipelines for AI training and analytics in support of critical government and intelligence initiatives. … If you're passionate about data engineering and thrive in an independent, mission-driven environment, this opportunity is for you. RESPONSIBILITIES: Construct scalable data pipelines. Maintain data pipelines to support AI training and analytics. Operate independently across multiple work areas (Full Performance level). Lead major technical assignments and supervise teams (Senior level). Provide strategic … with 6 years of relevant experience, OR PhD with 4 years of relevant experience Provides strategic technical/management leadership Join a mission-driven team advancing government and intelligence data initiatives. Apply today and become part of Synertex LLC's legacy of leadership and excellence.
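Purely as an illustration of the kind of pipeline step this listing describes, the sketch below shows a minimal Python batch-cleaning stage for AI training data. The file names and record fields are invented for the example and are not taken from the posting.

```python
# Minimal sketch of a training-data cleaning stage (hypothetical paths and fields).
import json
from pathlib import Path

def clean_records(source: Path, destination: Path) -> int:
    """Read raw JSONL records, drop incomplete ones, and write a curated file."""
    kept = 0
    with source.open() as src, destination.open("w") as dst:
        for line in src:
            record = json.loads(line)
            # Keep only records that carry both a label and non-empty text.
            if record.get("label") is not None and record.get("text"):
                dst.write(json.dumps(record) + "\n")
                kept += 1
    return kept

if __name__ == "__main__":
    total = clean_records(Path("raw_events.jsonl"), Path("curated_events.jsonl"))
    print(f"kept {total} records")
```

A real pipeline would add schema validation, deduplication, and orchestration, but the read-validate-write shape is the common core.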
of Products and Applications - Competition & Markets Authority - G6 Belfast, Cardiff, Edinburgh, London, Manchester About the job Job summary The Competition and Markets Authority has established an Executive Directorate for Data, Technology, and Insight (DTI), which encompasses our expertise in data science, data engineering, artificial intelligence, behavioural science, technology insight, and digital forensics, as well as technology
Technical Delivery Lead - GCP Location: London 3 days a week/Leeds 2 days a month Employment Type: Fixed Term Contract OR Umbrella Contract Business Unit: Banking/Data Management A global data and AI-driven transformation company is seeking an experienced Technical Delivery Lead to oversee the delivery of a scalable, enterprise-grade data platform on … coordinating cross-functional teams and aligning technology initiatives with business outcomes. Role Overview The Technical Delivery Lead will manage the end-to-end delivery life cycle of GCP-based data initiatives. This includes orchestrating teams, managing dependencies, and ensuring alignment with modern data principles and cloud best practices. The role emphasises leadership, stakeholder engagement, and proactive risk management … rather than hands-on engineering. Key Responsibilities Lead delivery of GCP-based data platform projects, ensuring strategic alignment and timely execution Coordinate cross-functional teams including data engineers, architects, QA, and DevOps Manage project phases from design through deployment, maintaining delivery velocity and quality Facilitate retrospectives and feedback sessions to drive continuous improvement Collaborate with architects to evolve
Contribute to clear and concise documentation for software, processes, and systems to ensure team alignment and knowledge sharing. Your Qualifications Experience: Professional experience in Python development or related software engineering roles. Python Proficiency: Strong knowledge of Python, including experience with web frameworks like Django, Flask, or FastAPI. Database Management: Solid experience with relational databases like PostgreSQL or MySQL and … functional teams. Cloud Services: Experience with cloud platforms like AWS, GCP, or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar … big data platforms for processing large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired Skills Asynchronous Programming: Familiarity with asynchronous programming tools like Celery
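For illustration only, a minimal FastAPI service of the general shape this listing references might look like the sketch below. The endpoint names, the Event model, and the in-memory store are invented for the example, not taken from the posting.

```python
# Illustrative FastAPI sketch (hypothetical endpoints and models).
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Event(BaseModel):
    source: str
    payload: dict

_events: list[Event] = []  # in-memory stand-in for a real database

@app.post("/events")
def ingest_event(event: Event) -> dict:
    """Accept an event and return the running count."""
    _events.append(event)
    return {"stored": len(_events)}

@app.get("/events/{index}")
def read_event(index: int) -> Event:
    """Return a previously ingested event by position."""
    if index < 0 or index >= len(_events):
        raise HTTPException(status_code=404, detail="event not found")
    return _events[index]
```

Saved as app.py, this can be run locally with `uvicorn app:app --reload`; a production service would swap the in-memory list for a proper database layer.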
City of London, London, United Kingdom Hybrid / WFH Options
Hlx Technology
Senior Data Engineer – AI & Neuroscience Location: London (Hybrid, Kings Cross) or San Francisco (Onsite/Hybrid) Employment: Full-time About the Company We are partnered with a pioneering biotech building the world's first brain foundation models — large-scale AI systems designed to deeply understand, protect, and enhance the human brain. By generating our own data, developing novel … with direct influence on the technical systems that will underpin scientific and clinical breakthroughs. The work is ambitious, interdisciplinary, and sits at the cutting edge of AI, biology, and data engineering. The Role We are seeking a Senior Data Engineer to lead the design and scaling of the company's core data infrastructure. You will be responsible … Your work will provide the backbone for training next-generation AI models, enabling researchers to extract real-world insights from raw biological data. Key Responsibilities Design and implement distributed data pipelines for multi-omic, neuroscience, and clinical datasets Build a unified feature store to serve ML training and downstream biological analysis Develop scalable storage, ingestion, and validation systems with
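As an indicative sketch of the ingestion-with-validation pattern this listing describes (the field names and checks below are entirely hypothetical, not company-specific), a typed validation step in plain Python could look like this:

```python
# Illustrative validation step for incoming records (field names are hypothetical).
from dataclasses import dataclass

@dataclass
class Sample:
    sample_id: str
    assay: str
    value: float

def validate(raw: dict) -> Sample | None:
    """Return a typed Sample if the raw record passes basic checks, else None."""
    try:
        sample = Sample(
            sample_id=str(raw["sample_id"]),
            assay=str(raw["assay"]),
            value=float(raw["value"]),
        )
    except (KeyError, TypeError, ValueError):
        return None
    return sample if sample.sample_id and sample.assay else None

raw_batch = [
    {"sample_id": "s-001", "assay": "rna_seq", "value": "12.5"},
    {"sample_id": "", "assay": "rna_seq", "value": "3.1"},   # rejected: empty id
    {"sample_id": "s-002", "assay": "proteomics"},           # rejected: missing value
]
curated = [s for r in raw_batch if (s := validate(r)) is not None]
print(f"accepted {len(curated)} of {len(raw_batch)} records")
```

At scale this logic would typically run inside a distributed framework, but keeping the validation rules as plain, testable functions is a common design choice.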
A leading consultancy working with an end client in the banking and financial services sector is seeking an experienced Technical Delivery Lead to oversee the delivery of a large-scale data platform built on Google Cloud Platform (GCP). This role requires proven experience delivering cloud-based data solutions within banking environments, with a strong understanding of regulatory frameworks … data governance, and stakeholder engagement in financial settings. Key Responsibilities Lead end-to-end delivery of GCP-based data platform initiatives. Coordinate cross-functional teams across data engineering, architecture, QA, and DevOps. Ensure alignment with Agile, DevOps, and CI/CD best practices. Manage stakeholder expectations and proactively mitigate delivery risks. Support architecture evolution and ensure … compliance with banking governance standards. Ideal Candidate 10+ years in technology delivery, with 3+ years leading cloud data projects (GCP preferred). Must have experience delivering data platforms in banking or financial services. Strong knowledge of GCP services (BigQuery, Dataflow, Pub/Sub). Familiarity with ETL/ELT, data lakehouse architectures, and cloud integration. Excellent leadership
Senior Software Engineer - Backend & Data | Fully Remote (UK) Founding Engineer | Python | GCP | Terraform | Event-Based Systems | Data Pipelines An innovative, mission-driven LegalTech SaaS start-up is looking for a Senior Product Engineer (Backend & Data) with deep experience in Python, Product development, and Data engineering. Your mission: own, build, and scale version 1 - the first commercial … UK only) Stack: Python, TypeScript, GCP, Pub/Sub, SQL & NoSQL, IaC (Terraform), CI/CD (GitHub Actions), Observability tools, AI tooling You'll join a remote-first, high-trust engineering team working with a modern, cloud-native stack - with real influence over technical decisions from day one. You'll take technical ownership of the backend to ensure it's robust, scalable … and ready for real customers, while adding new Data-driven features, optimising performance, and shaping the long-term Product roadmap. While your focus will be backend systems in Python, you'll also work across the stack, collaborate directly with users, and bring a strong Product mindset to every decision. This role is ideal for someone who thrives on turning early
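As a minimal, hedged sketch of the event-based pattern this stack implies, the snippet below publishes a JSON event with the google-cloud-pubsub client. The project ID, topic name, and payload are placeholders, and running it requires GCP credentials; it is not a description of the start-up's actual system.

```python
# Illustrative Pub/Sub publisher sketch; project and topic names are placeholders.
import json
from google.cloud import pubsub_v1

PROJECT_ID = "my-project"        # hypothetical
TOPIC_ID = "document-events"     # hypothetical

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

def publish_event(event_type: str, payload: dict) -> str:
    """Publish a JSON-encoded event and return the server-assigned message ID."""
    data = json.dumps(payload).encode("utf-8")
    future = publisher.publish(topic_path, data=data, event_type=event_type)
    return future.result()  # blocks until the publish is acknowledged

if __name__ == "__main__":
    message_id = publish_event("case.created", {"case_id": "c-123", "status": "draft"})
    print(f"published message {message_id}")
```

Downstream consumers would subscribe to the same topic, which is what lets the backend stay loosely coupled as new event-driven features are added.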
Software Developer - Subject Matter Expert - MOON 413-01 Country Intelligence Group is seeking a Full-Time Software Developer to support our client on the All Source Data Modeling and Analysis team. This team is responsible for executing complex Data Engineering, Science, Modeling, and Analysis initiatives in support of All Source Analysis Reports and Products. The selected candidate … will serve as a key contributor in developing full-stack solutions that integrate large-scale data processing, AI/ML modeling, and modern cloud-based architectures. The ideal candidate will bring a strong background in both front-end and back-end development, experience working with large language models (LLMs), and proficiency in modern software engineering tools and frameworks. … problem-solving mindset, effective collaboration with multidisciplinary teams, and a commitment to delivering high-impact technical solutions in support of mission-driven analysis. Tasks Performed: • Design and develop responsive, data-driven front-end applications using JavaScript and Angular. • Build scalable and secure backend systems; develop and integrate RESTful APIs. • Support AI/ML modeling, including the integration of large
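To illustrate the LLM-integration task mentioned above in the most general terms, here is a sketch of calling a model-serving endpoint over REST. The URL, payload shape, and response format are assumptions made for the example; the actual serving API in this programme is not described in the posting.

```python
# Illustrative sketch of calling a hypothetical model-serving endpoint over REST.
import requests

LLM_ENDPOINT = "https://llm.example.internal/v1/generate"  # hypothetical URL

def summarize(text: str, max_tokens: int = 256) -> str:
    """Send a summarisation prompt to the serving endpoint and return its text."""
    response = requests.post(
        LLM_ENDPOINT,
        json={"prompt": f"Summarize the following report:\n{text}", "max_tokens": max_tokens},
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("text", "")

if __name__ == "__main__":
    print(summarize("Example report body for testing the integration path."))
```

Keeping the model call behind a small function like this makes it straightforward to expose through a RESTful API and to swap serving backends later.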
Key Responsibilities Design and build high-performance tools and services to validate the reliability, performance, and correctness of ML data pipelines and AI infrastructure. Develop platform-level test solutions and automation frameworks using Python, Terraform, and modern cloud-native practices. Contribute to the platform's CI/CD pipeline by integrating automated testing, resilience checks, and observability hooks at … every stage. Lead initiatives that drive testability, platform resilience, and validation as code across all layers of the ML platform stack. Collaborate with engineering, MLOps, and infrastructure teams to embed quality engineering deeply into platform components. Build reusable components that support scalability, modularity, and self-service quality tooling. Mentor junior engineers and influence technical standards across the Test … Engineering Program. Required Qualifications Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field. 8+ years of hands-on software development experience, including large-scale backend systems or platform engineering. Expert in Python with a strong understanding of object-oriented programming, testing frameworks, and automation libraries. Experience building or validating platform infrastructure, with
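As a small, illustrative example of the "validation as code" idea this listing describes, the pytest sketch below tests a hypothetical pipeline transformation. The function under test and its rules are invented for the example, not part of the platform described.

```python
# Illustrative pytest sketch for validating a pipeline transformation step.
import pytest

def normalise_record(record: dict) -> dict:
    # Hypothetical transformation under test: lower-case keys, strip string values.
    return {k.lower(): v.strip() if isinstance(v, str) else v for k, v in record.items()}

def test_keys_are_lowercased():
    assert normalise_record({"Source": "sensor-1"}) == {"source": "sensor-1"}

def test_non_string_values_pass_through():
    assert normalise_record({"COUNT": 3}) == {"count": 3}

@pytest.mark.parametrize("bad", [None, [], 42])
def test_non_dict_input_raises(bad):
    with pytest.raises(AttributeError):
        normalise_record(bad)
```

Checks like these are typically wired into the CI/CD pipeline so that every change to a transformation is validated automatically before deployment.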
City of London, London, United Kingdom Hybrid / WFH Options
Hunter Bond
Job title: Data Engineer Client: Award Winning FinTech Firm Rate: Up to £725 p/d Duration: 6-Month Rolling Contract | Long-Term Engagement Location: London Skills: Python/Azure/Databricks/Snowflake The Role: Design and build powerful data pipelines that fuel real-time decisions, machine learning, and next-gen analytics at scale. Transform raw data into trusted, usable assets - the foundation for everything from executive dashboards to product innovation. Own the data flow from ingestion to insight - using modern tools like Python, Azure, and Databricks. What you need: 7+ years of hands-on experience in data engineering roles. Advanced skills in Python for data pipelines, transformation, and orchestration. Deep understanding … of the Azure ecosystem (e.g., Data Factory, Blob Storage, Synapse, etc.) Proficiency in Databricks (or strong equivalent experience with Apache Spark). Proven ability to work within enterprise-level environments with a focus on clean, scalable, and secure data solutions. If you are the right fit - contact me directly at bbirch@hunterbond.com
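Purely as an illustration of the kind of PySpark transformation this role involves (the storage paths, table, and column names below are invented, not taken from the posting), a minimal Databricks-style pipeline stage might look like this:

```python
# Minimal PySpark transformation sketch; paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("trade_enrichment").getOrCreate()

# Hypothetical raw landing zone in Azure Data Lake Storage.
raw = spark.read.parquet("abfss://raw@example.dfs.core.windows.net/trades/")

clean = (
    raw.dropDuplicates(["trade_id"])
       .withColumn("trade_date", F.to_date("trade_timestamp"))
       .filter(F.col("notional") > 0)
)

# Write the curated, partitioned output for downstream analytics and ML.
clean.write.mode("overwrite").partitionBy("trade_date").parquet(
    "abfss://curated@example.dfs.core.windows.net/trades/"
)
```

A production pipeline would add schema enforcement, data-quality checks, and orchestration (for example via Data Factory), but the ingest-clean-partition-write shape is the common backbone.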
We're seeking a hands-on Lead Data Engineer to guide a small team and oversee our data platform. You'll lead engineering efforts, manage DevOps workflows, and ensure robust configuration and documentation of Databricks environments. Responsibilities: Lead data engineering across the platform Supervise DevOps processes (branching, merging, CI/CD) Act as Databricks DBA … and manage environment promotion Maintain configuration, documentation, and monitoring Mentor and support Data Engineers Required Skills: Strong experience with Databricks, Azure Synapse, and Azure DevOps Proficient in SQL and PySpark Proven leadership in small engineering teams Skilled in configuration management and documentation Location: Warwickshire Type: Contract - 6 Months initially (high chance of extension) IR35 Status: Inside IR35 Rate
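As a hedged sketch of the environment-promotion and configuration-management idea mentioned above, the snippet below resolves table names per environment in a Databricks/PySpark job. The schema and table names are hypothetical, and a real promotion flow would be driven by pipeline configuration rather than hard-coded mappings.

```python
# Illustrative environment-aware table resolution for dev/test/prod promotion.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("env_promotion_demo").getOrCreate()

# Hypothetical mapping from environment to the schema it reads from.
SCHEMA_BY_ENV = {"dev": "sales_dev", "test": "sales_test", "prod": "sales_prod"}

def table_name(env: str, table: str) -> str:
    """Resolve a logical table name to the schema used by the given environment."""
    return f"{SCHEMA_BY_ENV[env]}.{table}"

orders = spark.table(table_name("dev", "orders"))
orders.groupBy("order_status").count().show()
```

Centralising this mapping in configuration is what allows the same notebook or job to be promoted unchanged from dev through test to prod.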
Newton-Le-Willows, Merseyside, North West, United Kingdom Hybrid / WFH Options
Linaker Limited
the smooth and efficient functioning and development of our databases and pipelines. You will be responsible for developing, optimising and managing these databases and ensuring a high level of data quality and integrity. You will be defining the principles and processes that underpin data management while ensuring we maintain the highest standards for data management, security, privacy … are repeatable and scalable. Create and manage database schemas, tables, stored procedures, indexes, and triggers. Ensure high availability of databases through clustering, replication, and other techniques. Implement and maintain data pipelines and ETL processes for real-time and batch data ingestion. Monitor database performance and proactively address issues related to tuning and optimisation. Ensure data security, privacy … integrity, and compliance with internal and external regulations. Collaborate with third-party suppliers, software developers, data analysts, DevOps, and other stakeholders to meet project requirements. Plan and implement database backups, restores, and disaster recovery strategies. Take an active role in support, finding the root cause of operational issues and preventing recurrences. Support the migration of databases and
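Since the posting does not name a specific DBMS, the sketch below uses the standard-library sqlite3 driver purely to illustrate the schema, index, and batch-ingest pattern described above; a production platform would use its own engine and migration tooling, and the table and column names here are invented.

```python
# Purely illustrative: schema, index, and batch-ingest pattern with sqlite3.
import sqlite3

conn = sqlite3.connect("pipeline.db")
conn.executescript("""
    CREATE TABLE IF NOT EXISTS readings (
        reading_id INTEGER PRIMARY KEY,
        site       TEXT NOT NULL,
        taken_at   TEXT NOT NULL,
        value      REAL
    );
    CREATE INDEX IF NOT EXISTS idx_readings_site_time ON readings (site, taken_at);
""")

rows = [
    ("site-a", "2024-01-01T00:00:00", 1.5),
    ("site-b", "2024-01-01T00:05:00", 2.0),
]
conn.executemany("INSERT INTO readings (site, taken_at, value) VALUES (?, ?, ?)", rows)
conn.commit()
conn.close()
```

The same principles — explicit schemas, indexes matched to query patterns, and parameterised batch inserts — carry over to whichever database engine underpins the platform.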
London, Manchester Square, United Kingdom Hybrid / WFH Options
Liberty CL Recruitment
for a Senior Backend Developer to join their award-winning Insurance division. As part of a mixed technical team including back-end developers, testers, front-end developers and actuarial data scientists, you'd be working on their market-leading insurance application, used by some of the largest London Market and International insurers to assess over £200bn in reserves. Your … to design Driving better integration between their C# and Python codebases, rewriting and restructuring code where needed to improve cohesion Address any performance bottlenecks and optimise performance of complex data workflows Championing best practice and guiding the junior team members What they're specifically looking for: C# development experience (.Net 8) with production-level proficiency Data engineering … data-intensive application experience Python Azure PaaS/SaaS tools (e.g. Azure Functions, App Services, Batch, Blob/Table/Queue Storage) Excellent problem-solving skills Why join them: It's an entrepreneurial environment pushing the boundaries, harnessing technology to enable more efficient analytics. A true people-first employer where you'll be supported to learn and develop
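For illustration of the Azure Functions work this listing mentions, and keeping to the Python side of the mixed codebase, here is a minimal HTTP-triggered function using the Python v2 programming model. The route, parameter, and stub response are hypothetical and are not taken from the application described above.

```python
# Illustrative Azure Functions (Python v2 model) HTTP endpoint; route and payload
# are hypothetical placeholders, not the actual reserving application's API.
import json
import azure.functions as func

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)

@app.route(route="reserves/{class_of_business}")
def get_reserves(req: func.HttpRequest) -> func.HttpResponse:
    """Return a stub reserve summary for the requested class of business."""
    class_of_business = req.route_params.get("class_of_business", "unknown")
    body = {"class_of_business": class_of_business, "reserve_estimate": None}
    return func.HttpResponse(
        json.dumps(body), status_code=200, mimetype="application/json"
    )
```

In practice an endpoint like this would delegate to the analytics code (whether C# or Python) and return computed results rather than a stub.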