About Us: ATREIDES is a leader in advanced geospatial technologies, big data solutions, and data analytics. We specialize in providing cutting-edge software platforms that enable our clients to process, analyze, and visualize large-scale geospatial and sensor data for critical decision-making. Our mission is to deliver innovative, scalable, and high-performance platforms … that help organizations unlock the full potential of their data for various industries, including defense, national security, environmental monitoring, and urban planning. We are seeking a highly skilled Senior Data Engineer with a strong focus on big data, data analytics, and geospatial data intelligence to join our engineering team. In … and technologies (AWS, Azure, GCP) for big data processing and platform deployment. Strong knowledge of data warehousing, data lakes, and data pipeline design for large-scale data integration and storage. Familiarity with machine learning and AI techniques for data analytics (e.g., classification, regression, clustering, anomaly detection). …
that solves real-world problems for large blue-chip companies and governments worldwide. We're pioneers of meaningful AI: our solutions go far beyond chatbots. We are using data and AI to solve the world's biggest issues in telecommunications, sustainable water management, energy, healthcare, climate change, smart cities, and other areas that have a real impact on … we're underpinned by over 300 engineers and scientists working across Africa, Europe, the UK and the US. ABOUT THE ROLE Sand Technologies focuses on cutting-edge cloud-based data projects, leveraging tools such as Databricks, DBT, Docker, Python, SQL, and PySpark, to name a few. We work across a variety of data architectures such as Data … data engineering projects. You will work closely with cross-functional teams and contribute to the strategic direction of our data initiatives. RESPONSIBILITIES Data Pipeline Development: Lead the design, implementation, and maintenance of scalable data pipelines for ingesting, processing, and transforming large volumes of data from various sources using tools such as …
that solves real-world problems for large blue-chip companies and governments worldwide. We're pioneers of meaningful AI: our solutions go far beyond chatbots. We are using data and AI to solve the world's biggest issues in telecommunications, sustainable water management, energy, healthcare, climate change, smart cities, and other areas that have a real impact on … we're underpinned by over 300 engineers and scientists working across Africa, Europe, the UK and the US. ABOUT THE ROLE Sand Technologies focuses on cutting-edge cloud-based data projects, leveraging tools such as Databricks, DBT, Docker, Python, SQL, and PySpark, to name a few. We work across a variety of data architectures such as Data … modelling, ETL (Extract, Transform, Load) processes, and big data technologies, it becomes possible to develop robust and reliable data solutions. RESPONSIBILITIES Data Pipeline Development: Design, implement, and maintain scalable data pipelines for ingesting, processing, and transforming large volumes of data from various sources using tools such as Databricks …
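For orientation, the following is a minimal, hypothetical PySpark sketch of the kind of ingest-and-transform step such a pipeline might contain; the paths, columns, and table names are invented for illustration and do not come from the posting.

```python
# Minimal PySpark sketch of an ingest-and-transform step (hypothetical paths and tables).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest_events").getOrCreate()

# Ingest raw JSON landed by an upstream feed (path is illustrative).
raw = spark.read.json("s3://example-bucket/landing/events/")

# Basic cleaning: drop incomplete rows, normalise types, add a load date, deduplicate.
cleaned = (
    raw.dropna(subset=["event_id", "event_ts"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("load_date", F.current_date())
       .dropDuplicates(["event_id"])
)

# Persist to a managed table for downstream analytics (table name is illustrative).
cleaned.write.mode("append").partitionBy("load_date").saveAsTable("analytics.events_clean")
```

In a Databricks setting the same logic would typically be scheduled as a job and the output written as a Delta table, but the transformation shape stays the same.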
Overview We are looking for a data engineer experienced in DevOps-based pipeline delivery, who can not only develop the pipeline but also establish the foundational framework for reusable data ingestion processes. The ideal candidate is proactive, a self-starter, and demonstrates a strong can-do attitude. While not essential, experience with Health Data … Framework Delivery: Responsible for building reusable, metadata-driven data pipelines within a framework to handle batch and near-real-time data feeds. Data Pipeline Development: Develop end-to-end data pipelines, including data load patterns, error handling, automation, and hardware optimisation. Requirements Formulation: Collaborate with Business Analysts, Architects, SMEs … updated on new data engineering technologies and best practices, especially in healthcare, and recommend adoption as needed. Lead proofs of concept and pilots, and develop data pipelines using agile and iterative methods. Qualifications Certifications such as DP-203 and AZ-900, or similar certification/experience. Essential skills Experience in working with healthcare data …
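As an illustration of the reusable, metadata-driven framework described above, the sketch below drives both batch and streaming feeds from a single configuration list with shared error handling; all names and the stubbed load functions are hypothetical, not the client's actual framework.

```python
# Hypothetical sketch of a metadata-driven loader: each feed is described by a small
# config record and one generic function dispatches to the appropriate load pattern.
import logging
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingestion")

@dataclass
class FeedConfig:
    name: str
    source_type: str      # e.g. "batch" or "stream"
    location: str         # path, topic, or connection string
    target_table: str

FEEDS = [
    FeedConfig("admissions", "batch", "/landing/admissions/", "staging.admissions"),
    FeedConfig("observations", "stream", "obs-topic", "staging.observations"),
]

def run_batch_load(location: str, target: str) -> None:
    log.info("Batch load %s -> %s", location, target)   # real implementation would read and write data

def run_stream_load(location: str, target: str) -> None:
    log.info("Stream load %s -> %s", location, target)  # real implementation would attach a consumer

def load_feed(feed: FeedConfig) -> None:
    """Dispatch on metadata so a new feed needs only a config entry, not new code."""
    try:
        if feed.source_type == "batch":
            run_batch_load(feed.location, feed.target_table)
        elif feed.source_type == "stream":
            run_stream_load(feed.location, feed.target_table)
        else:
            raise ValueError(f"unknown source type: {feed.source_type}")
    except Exception:
        log.exception("Feed %s failed", feed.name)  # shared error handling for every feed
        raise

if __name__ == "__main__":
    for feed in FEEDS:
        load_feed(feed)
```

The point of the pattern is that error handling, logging, and scheduling live in one place, while onboarding another feed is a metadata change rather than a code change.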
consistency, and proactively collaborating with upstream and downstream teams to enable seamless data flow across the organization. In this role, you will not only troubleshoot and resolve pipeline issues but also contribute to enhancing data architecture, implementing best practices in data governance and security, and ensuring the scalability and performance of data … stakeholders This position requires client presence between 25% and 50% of the time per month at the client's office, which is located in London. Key Responsibilities: Data Pipeline Development & Maintenance Build, maintain, and optimize scalable ETL/ELT pipelines using tools such as Dagster or similar. Ensure high data availability, reliability, and consistency through rigorous data validation and monitoring practices. Collaborate with cross-functional teams to align data pipeline requirements with business objectives and technical feasibility. Automate data workflows to improve operational efficiency and reduce manual intervention. Data Integrity & Monitoring Perform regular data consistency checks, identifying and resolving anomalies or discrepancies. Implement robust monitoring frameworks …
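Purely as an illustration of the Dagster-style ETL/ELT work mentioned above, here is a minimal asset graph with a simple consistency check built into the transform step; the asset names, sample data, and check are invented.

```python
# Minimal Dagster sketch (hypothetical assets): an extract step, a transform step,
# and a basic consistency check performed before data is handed downstream.
import pandas as pd
from dagster import Definitions, asset

@asset
def raw_orders() -> pd.DataFrame:
    # In practice this would pull from a source system; stubbed here with sample rows.
    return pd.DataFrame({"order_id": [1, 2, 2, 3], "amount": [10.0, 20.0, 20.0, None]})

@asset
def clean_orders(raw_orders: pd.DataFrame) -> pd.DataFrame:
    # Validation and cleaning: drop duplicates and rows with missing amounts,
    # then fail loudly if the result still violates expectations.
    cleaned = raw_orders.drop_duplicates(subset=["order_id"]).dropna(subset=["amount"])
    assert cleaned["order_id"].is_unique, "order_id must be unique after cleaning"
    return cleaned

defs = Definitions(assets=[raw_orders, clean_orders])
```

Declaring the dependency through the parameter name lets the orchestrator track lineage and rematerialise only what changed, which is what makes monitoring and consistency checks tractable at scale.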
Report to: Director of Data Science Location: Hybrid (2 days/week in office, London Holborn) Salary: 60-65k DOE About Us The data team at Times Higher Education works with real-world data inputs from a number of external sources, and our work is reliant on this data being easily … also need to be shared across the company and with external parties, for a growing number of applications. Your role will be focused on supporting the delivery of data to our various internal stakeholders, within and outside the data team. You will work closely with our Director of Data Science, Data Scientists … analyst/scientist with strong data engineering skills), who can work on ad-hoc data delivery as well as help strengthen our whole pipeline and governance, providing expert input and support to the team. Responsibilities Support the Data Delivery team in data collection and provision processes Support the Consultancy team in …
Senior Data Engineer (Azure/Databricks) Location: London - Scalpel Full time Job requisition ID: REQ05851 This is your opportunity to join AXIS Capital - a trusted global provider of specialty lines insurance and reinsurance. We stand apart for our … civil union status, family or parental status, or any other characteristic protected by law. Accommodation is available upon request for candidates taking part in the selection process. Senior Data Engineer (Azure/Databricks) Job Family Grouping: Chief Underwriting Officer Job Family: Data & Analytics Location: London How does this role contribute to our collective success? The Data … services. Implement end-to-end data pipelines, ensuring data quality, data integrity and data security. Troubleshoot and resolve data pipeline issues while ensuring data integrity and quality. Implement and enforce data security best practices, including role-based access control (RBAC), encryption, and compliance with industry …
Seasoned and HuffPost UK. The additional brands echo the existing business ethos and allow for increased audiences and further strategic diversification of revenue streams. About You A data engineer with advanced knowledge of SQL and hands-on experience with both relational and non-relational databases, supporting data needs in fast-paced, content-driven environments. Experience … to best practice methods with the current framework. Key Responsibilities and Accountabilities Design and Maintain Data Pipelines: Develop and maintain robust, scalable, and efficient data pipeline architecture to support current and future business needs. Engineering and Integration: Assemble large, complex datasets from a variety of structured and unstructured sources, ensuring they meet functional requirements. Process … using Python, SQL, and Google Cloud Platform (GCP) big data technologies, such as BigQuery, Dataflow, Dataproc and Cloud Storage. Business Intelligence Enablement: Prepare and transform pipeline data to support downstream analytics and feed BI tools (DOMO), enabling data-driven decision-making across the organization. Cross-Functional Collaboration: Partner with internal stakeholders …
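As a hedged sketch of the BI-enablement step described above, the snippet below refreshes an aggregated reporting table in BigQuery for a downstream dashboard; the project, dataset, table, and column names are hypothetical and chosen only to illustrate the shape of the work.

```python
# Illustrative sketch: aggregate raw events into a reporting table that a BI tool
# (e.g. DOMO) can read. Project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
CREATE OR REPLACE TABLE reporting.daily_article_views AS
SELECT
  DATE(event_ts) AS view_date,
  article_id,
  COUNT(*) AS views
FROM analytics.raw_events
GROUP BY 1, 2
"""

# Run the query and wait for it to finish; downstream dashboards read the result table.
client.query(sql).result()
print("reporting.daily_article_views refreshed")
```

In practice a job like this would be scheduled (for example via Cloud Composer or a Dataflow pipeline) rather than run by hand, but the transform-then-publish pattern is the same.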
Position: Senior Data Engineer Salary: Negotiable, depending on experience Location: Hybrid - Bristol, Manchester, or London Security Requirement: SC Clearance to start, and willing/able to obtain DV Clearance We are seeking Data Engineers with a keen interest/experience in AI/ML and strong proficiency in Python scripting to join our Consultancy client … working on meaningful projects in the Defence and Security sector. Required Experience: End-to-End Data Development: Strong experience with data pipelines, ETL processes, and workflow orchestration, demonstrating best practices across tech stacks. Strong Python Skills Diverse Data Handling: Familiarity with batch, streaming, real-time, and unstructured data sources. Architectural Design & Systems … AI Integration: Apply data engineering tools, integration frameworks, and query engines to create high-quality, standardised data for AI applications and reporting. Data Pipeline Development: Design and implement robust data pipelines and stores in collaboration with other engineers and developers. Innovative Problem Solving: Bring fresh approaches to challenging data …
throughout all levels of the client organization, generating results that allow our clients to thrive. What You'll Bring Join a dynamic team transforming our organization into a data-driven enterprise! The Data Layer Team is where we build essential data connectors, data pipelines, and capabilities to empower our clients and colleagues … with high-quality and reusable data. Our Data Layer Team focuses on creating scalable data solutions and advancing our data infrastructure, which is the foundation of AI products that will drive informed decision-making across the company. As the Data Layer Connectors & Ingestion Delivery Owner, you will oversee data engineering … practices, data connectors, and integrations, while driving continuous improvement and ensuring alignment with data governance, security, and compliance standards. You will lead the Platform Build & Connectors squad, composed of data engineers and architects, and you will collaborate closely with cross-functional teams to ensure timely delivery and operational excellence. Your responsibilities include defining the data …
are a brand embarking on a very exciting plan, and we are seeking new members of the team who want to contribute to this. THE ROLE As a Data Architect (contract) for Project Nova, you will be instrumental in shaping AllSaints' journey to become a truly data-driven, AI-powered organisation. You will lead the design … and implementation of our enterprise data architecture. This role is central to the Data & Integration workstream of Project Nova, building on our existing data warehouse and BI capabilities to accelerate our data transformation and enable greater use of AI-driven insight and actions. You will ensure our data assets … techniques and tools Strong hands-on experience with cloud data platforms, specifically Google Cloud Platform - BigQuery Experience with leading ETL/ELT tools and data pipeline orchestration (e.g., Dataflow, Apache Airflow, Talend, Informatica) Advanced SQL skills and deep knowledge of various database technologies (relational, columnar, NoSQL) Practical experience in establishing data governance frameworks …
Croydon, London, United Kingdom Hybrid / WFH Options
allowance based on a skills assessment Published on 10 July 2025. Deadline: 30 July 2025. Location: Croydon, Manchester, Sheffield. About the job Job summary As a Lead Data Architect, you will create and uphold data architecture across critical Home Office systems, enabling secure, data-driven services that keep the country safe. Working under the … vision set by the Principal Data Architect, you'll design robust data models, metadata systems, and governance frameworks to serve diverse stakeholders. Your leadership will span strategic planning, mentoring junior data architects, and aligning technical decisions with business objectives. If you thrive on architecting solutions that integrate emerging technologies such as cloud-native platforms … data lakehouses, streaming data pipelines and AI-powered analytics while collaborating with multidisciplinary teams, this role offers you the chance to elevate how the Home Office exploits data for maximum public impact. Due to business requirements this post is available on a full-time/flexible working basis. Where business needs allow, some …
Radius is seeking a highly talented Data Architect and Governance Lead for my client, which is going through a digital transformation. Data Strategy: Define and lead an enterprise-wide data strategy. Design and maintain a scalable, secure data architecture. Deep understanding of data governance, modern cloud data platforms, data quality management, metadata, lineage and data modelling. Experience with data integration, master data management (MDM) and designing scalable data pipelines. Data privacy, regulatory compliance, and security best practices. BI Tools: Power BI, Tableau, Looker; proficient in data modelling, SQL, and cloud platforms - Azure, AWS, GCP and … their data services. Key Responsibilities: Define and implement data architecture strategies in line with the Customer Journey programme. Guide the data-driven decision-making process, ensuring that data is effectively utilised to inform business strategies and initiatives. Develop a data plan to address data quality issues in line with …
Data Engineer/Scientist Department: Consultancy Employment Type: Full Time Location: United Kingdom/Hybrid Description As a Data Engineer/Scientist at Actica, you will have the opportunity to design, build, and maintain data pipelines while developing advanced analytics solutions that address business problems for high-profile UK public sector organisations. Your expertise … will enable organisations to maximise the value of their data assets through robust data infrastructure and sophisticated analysis, playing a key role in nationally critical projects that make a real difference to people's everyday lives. Locations: London, Guildford, Bristol, M4 corridor Hybrid working Roles and Responsibilities Actica recognises that data engineering, analytics, and … data science are distinct but interconnected disciplines. Our data engineers focus on building and maintaining the data infrastructure that enables analytics and data science work, while our data scientists and analysts focus on deriving insights and developing models. At Actica, we're at the forefront of the UK government's …
We believe in better. And we make it happen. Better content. Better products. And better careers. Working in Tech, Product or Data at Sky is about building the next and the new. From broadband to broadcast, streaming to mobile, Sky Q to Sky Glass, we never stand still. We optimise and innovate. We turn big ideas into the products … content and services millions of people love. And we do it all right here at Sky. The team is looking for a Data Architect to join the IoT platform team, supporting technical initiatives aimed at enhancing data architecture and improving the platform's data handling capabilities. This role will focus on designing scalable data models, ensuring data quality, and streamlining the integration of data sources for real-time analytics and platform performance. The Data Architect will also play a key role in implementing data governance policies to ensure compliance, security, and proper management of IoT data. Additionally, the role will involve aligning IoT data …
DEPT/AI has a single mission: to make the best work in the industry using Data & AI to enhance everything we do. This role sits within our Data & AI practice, which has deep expertise in leveraging AI. The team includes data strategists, consultants, data scientists and analysts that work alongside DEPT … and most challenging problems facing some of the best-loved brands in the world - and doing this alongside an experienced team. This role is part of our EMEA Data craft team, an integrated craft across Europe driving data engineering, data science, and AI initiatives. We combine our expertise in data architecture, analytics … and AI to deliver exceptional data-driven solutions that build lasting relationships with our clients, such as Foot Locker, Nikon, Rituals, and Philips. ROLE OVERVIEW The Lead Data Engineer is responsible for building and leading our data engineering capabilities across EMEA and setting technical standards. This senior technical leadership role will drive the strategic vision for …
or Manchester Permanent Full-time position Up to £120k DOE + fantastic benefits Closing date for applications is Tuesday 5th August 2025. We make health happen! The Lead Data Engineer will help shape the future of data-driven innovation in healthcare by building robust data systems that power intelligent decision-making. Reporting into the … senior level. Accountability for the implementation of the full data lifecycle - from design and development to deployment and optimisation - is highly desirable. Strong knowledge and experience of data pipelines, data modelling, database design, and data warehousing. Proficiency in Snowflake, Power BI and medallion architecture. Ideal candidates will have been accountable for building out a data …
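For illustration only, the medallion (bronze/silver/gold) layering mentioned above can be expressed as a chain of layer-to-layer transformations; the sketch below issues Snowflake SQL from Python, with the account details, schemas, and table names entirely invented.

```python
# Hypothetical sketch of medallion-style layering in Snowflake: raw data lands in a
# bronze table, is cleaned into silver, and is aggregated into gold for Power BI.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account", user="example_user", password="***",
    warehouse="ANALYTICS_WH", database="HEALTH_DATA",
)

steps = [
    # Silver: standardise and deduplicate the raw (bronze) feed.
    """
    CREATE OR REPLACE TABLE silver.claims AS
    SELECT DISTINCT claim_id, member_id, CAST(claim_date AS DATE) AS claim_date, amount
    FROM bronze.claims_raw
    WHERE claim_id IS NOT NULL
    """,
    # Gold: business-level aggregate consumed by the BI layer.
    """
    CREATE OR REPLACE TABLE gold.monthly_claim_totals AS
    SELECT DATE_TRUNC('month', claim_date) AS claim_month, SUM(amount) AS total_amount
    FROM silver.claims
    GROUP BY 1
    """,
]

cur = conn.cursor()
try:
    for sql in steps:
        cur.execute(sql)
finally:
    cur.close()
    conn.close()
```

The value of the pattern is that each layer has a clear contract: bronze is untouched source data, silver is cleaned and conformed, and gold is shaped for reporting tools such as Power BI.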
Intelligence in making their decisions. This leads to better business decisions with improved accuracy, reduced errors, and better outcomes across various aspects of the business. Team Description: The Data Ingestion and Pipelines (DIP) team is responsible for designing, building, and optimizing scalable data systems that integrate and process data from a variety of sources. … We develop efficient data pipelines that ensure seamless data flow into our data lake and transform it to support business intelligence and machine learning applications. Our work spans the entire data lifecycle: from ingesting data from clients, to developing complex ETL processes, to building systems for scraping external datasets (e.g. … you will play a key role in designing, developing, and maintaining scalable data infrastructure that supports our business intelligence and analytics efforts. Key Responsibilities: Data Pipeline Development: Design, develop, and maintain robust data pipelines and ETL processes to ingest, transform, and load data from diverse sources into our data …
DATA ARCHITECT - DV CLEARED NEW PERMANENT JOB OPPORTUNITY AVAILABLE WITHIN A LEADING NATIONAL SECURITY SME FOR A DATA ARCHITECT WITH ENHANCED DV CLEARANCE Permanent job opportunity for a Data Architect Leading National Security SME Salary up to £100,000 + Bonus London-based organisation in an easily accessible location To apply please call or … email WHO ARE WE? We are recruiting multiple Data Architects to support urgent National Security & Defence projects in London. Due to the nature of these projects you must hold enhanced DV Security Clearance. WHAT WILL THE DATA ARCHITECT BE DOING? You will be joining a leading SME which is working hard to support National Security … projects within UK Govt. departments in London. As part of a team, you will be responsible for designing and implementing data solutions in mission-critical areas. WE NEED THE DATA ARCHITECT TO HAVE… Current DV clearance - Enhanced Good at understanding complexity and abstracting that into a form that is consumable for a non-technical audience. Experience …
Senior Data and BI Solutions Analyst (Job requisition ID: JR-82411) Job Summary: Company: Live Nation Department: International Data Location: Farringdon, London Reports to: Vice President, Data and Media … International Working Hours: Full time Contract Type: Initially fixed term until the end of 2025. Role Description The Senior Data & BI Analyst will play a critical role in enabling data access, platform operations, and analytical data modeling across Live Nation's international data functions. You will manage data workflows, ensuring … seamless data access and usability in Databricks, BigQuery, Tableau, and Looker. You will lead the development of data models and analytical structures to support reporting teams and enable self-service analytics capabilities. The role requires strong collaboration with data engineering teams and business stakeholders to manage data migration, integration, and transformation efforts …
global advisory, broking, and solutions company. We work with clients across a wide range of industries, helping them manage risk, optimise benefits, and improve performance. As a Fabric Data Engineer, you will play a key role in leveraging Microsoft Fabric, Azure, and Python to design and build advanced data solutions in the insurance domain. Location: London … UK Role: Hybrid Workstyle (Full-time) Role Overview: As a Fabric Data Engineer at WTW, you will take ownership of developing and optimising data pipelines, workflows, and ETL processes. You will work with cutting-edge technologies to ensure that data is efficiently processed, stored, and made accessible for analysis. This role is a key … processes using Microsoft Fabric or Azure technologies. Manage and optimise notebooks, pipelines, and workflows to enhance the performance and efficiency of our data architecture. Data Pipeline Development & ETL: Build and maintain high-quality ETL pipelines to clean, transform, and enrich data from various sources. Ensure that pipelines are automated, scalable, and fault-tolerant …
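As one generic way to make a pipeline step "automated, scalable, and fault-tolerant" in the sense described above, the hypothetical helper below retries a transient failure with exponential backoff before surfacing it; it is a plain Python pattern, not WTW's or Microsoft Fabric's actual mechanism.

```python
# Hypothetical fault-tolerance helper: retry a flaky pipeline step with exponential
# backoff so transient source or network errors do not fail the whole run.
import time
from typing import Callable, TypeVar

T = TypeVar("T")

def with_retries(step: Callable[[], T], attempts: int = 3, base_delay: float = 2.0) -> T:
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception as exc:
            if attempt == attempts:
                raise                                   # out of retries: surface the failure
            wait = base_delay * 2 ** (attempt - 1)
            print(f"attempt {attempt} failed ({exc}); retrying in {wait:.0f}s")
            time.sleep(wait)
    raise RuntimeError("unreachable")

# Example usage with a stubbed extract step standing in for a real source call.
def extract_policies() -> list[dict]:
    return [{"policy_id": 1, "premium": 1200.0}]

rows = with_retries(extract_policies)
print(rows)
```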
Gett, Kaizen Gaming, and TransferGo. Our products are recognised by industry leaders like Gartner's Magic Quadrant, Forrester Wave and Frost Radar. Our tech stack: Superset and similar data visualisation tools. ETL tools: Airflow, DBT, Airbyte, Flink, etc. Data warehousing and storage solutions: ClickHouse, Trino, S3. AWS Cloud, Kubernetes, Helm. Relevant programming languages for data engineering tasks: SQL, Python, Java, etc. What you will be doing: Designing and developing scalable and efficient data pipelines, ETL processes, and data integration solutions to support data ingestion, processing, and storage needs. Ensuring data quality and reliability by implementing data validation, data cleansing, and data … quality monitoring processes. Optimising database performance by tuning queries, implementing indexing strategies, and monitoring and analysing system performance metrics. Collaborating with cross-functional teams to gather requirements, understand data needs, and develop data solutions that align with business objectives. Staying up to date with emerging technologies and industry trends in data engineering, and identifying …
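By way of illustration, an ingest-validate-load flow of the kind listed above might be orchestrated as a small Airflow DAG; the sketch below uses stub tasks, and the schedule, task names, and row counts are invented rather than taken from the posting.

```python
# Minimal Airflow sketch (hypothetical DAG): ingest, validate, and load steps wired
# into a daily run. Task bodies are stubs standing in for real pipeline code.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["example"])
def events_pipeline():
    @task
    def ingest() -> int:
        # Pull from the source system; return a row count for the downstream check.
        return 1000

    @task
    def validate(row_count: int) -> int:
        # Simple data-quality gate: fail the run if the feed looks empty.
        if row_count == 0:
            raise ValueError("no rows ingested")
        return row_count

    @task
    def load(row_count: int) -> None:
        print(f"loading {row_count} rows into the warehouse")

    load(validate(ingest()))

events_pipeline()
```

Expressing the flow as tasks with explicit dependencies is what gives the team retries, alerting, and backfills for free from the orchestrator.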
Join our rapidly expanding team as a hands-on Cloud Data Analytics Platform Engineer and play a pivotal role in shaping the future of data at Citi. We're building a cutting-edge, multi-cloud data analytics platform that empowers our users with secure, scalable, and efficient data insights. This role sits at the intersection of infrastructure, data engineering, and architecture, offering a unique opportunity to work with the latest cloud-native technologies and influence our data strategy. This is a hands-on role requiring deep technical skills and a passion for building and optimizing data platforms. What You'll Do: Architect and Build: Design and … data zones. Familiarity with data governance tools and frameworks. IaC Proficiency: Solid experience with Terraform and preferably Harness, Tekton, or Lightspeed for CI/CD pipeline management. Kubernetes Mastery: Strong command of Kubernetes, especially in the context of data processing workloads. Security Focus: A firm grasp of cloud security principles and best practices. …