engineering principles, big data technologies, cloud computing (specifically Azure), and experience working with large datasets. Key Responsibilities: Data Pipeline Development & Optimisation: Design, develop, and maintain robust and scalable data pipelines for ingesting, transforming, and loading data from various sources … effectively to both technical and non-technical audiences. Participate in code reviews and knowledge-sharing sessions. Automation & DevOps: Implement automation for data pipeline deployments and other data engineering tasks. Work with DevOps teams to implement and build CI/CD pipelines for environment deployments. Promote … SQL Database. Experience working with large datasets and complex data pipelines. Experience with data architecture design and data pipeline optimisation. Proven expertise with Databricks, including hands-on implementation experience and certifications. Experience with SQL and NoSQL databases. Experience with data quality …
Newcastle-upon-Tyne, Newcastle upon Tyne, Tyne and Wear, England
Government Digital & Data
the infected blood community to provide financial compensation to victims of infected blood on a UK-wide basis. This role will lead our data platform architecture and modelling team within the Data Operations arm of the IBCA Data Directorate. The Data Operations … team is responsible for developing and running safe and secure data solutions that provide a single source of truth for those going through their compensation journey. They are designing and building a new data platform using Amazon Web Services (AWS) and data management and … intelligence products using Databricks and Quantexa. We are taking a product-centric approach, treating data as a product, and are building squads around our products, with a focus on seamlessly paying compensation to those impacted by the infected blood scandal. You will focus on designing scalable and flexible …
advantage, and its technology-led service provides access to all major exchanges, order-flow management via screen, voice and DMA, plus award-winning data, insights and analytics. The Technology Department delivers differentiation, scalability and security for the business. Reporting to the COO, Technology provides digital tools, software services … our enterprise-wide services to end users and actively manages the firm's infrastructure and data. Within IT, Marex Technology has established a Data team that enables the firm to leverage data assets to increase productivity and improve business decisions, as well as maintain data compliance. The Data Team encompasses Database Administration, Data Engineering, Data Analysis, Data Architecture, Data Intelligence and AI expertise. In recent years, they have developed a Data Lakehouse architecture that is relied upon by different departments across …
get in touch at UKI.recruitment@tcs.com or call the TCS London Office on 02031552100 with the subject line: “Application Support Request”. Role: Data Architect Job Type: Permanent (Hybrid) Location: London, United Kingdom Ready to utilize your skills in designing, creating, and managing data architecture? Join … us as a Data Architect. Careers at TCS: It means more TCS is a purpose-led transformation company, built on belief. We do not just help businesses to transform through technology. We support them in making a meaningful difference to the people and communities they serve - our clients … an exciting team where you will be challenged every day. • Build strong relationships with a diverse range of stakeholders. The Role As a Data Architect, you will be responsible for designing, creating, and managing data architecture. You will also ensure that data is efficiently …
a newly created role within the NAO's Digital Services (DS) function with responsibility for supporting the development and continual improvement of NAO data & technology service composition and provision. They will support emerging tech to enable the automation or acceleration of relevant NAO processes and derive deeper insights … from corporate and client data. In this capacity, you will transform organizational data into structured formats suitable for analysis and decision-making. You will develop and test data models, explore local data sources, and construct pipelines from corporate repositories to data science … and machine learning models. Acting as a domain-specific collaborator to data engineers, you will facilitate the conversion of data into actionable intelligence, thereby contributing to the NAO's commitment to data-driven excellence. In this role, you will: Collaborate with subject matter experts …
Create and maintain optimal data pipeline architecture. Assemble large, complex data sets that meet functional/non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Build the infrastructure … variety of data sources using SQL and AWS 'big data' technologies. Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics. Work with stakeholders including the Executive, Product, Data … data tools: Hadoop, Spark, Kafka, etc. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra. Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. Experience with AWS cloud services: EC2, EMR, RDS, Redshift. Experience with stream-processing systems: Storm, Spark …
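The extract-transform-load pattern this listing describes can be sketched in plain Python. This is a minimal illustration only: the record fields, function names, and in-memory "warehouse" are invented stand-ins for the SQL/AWS sources and sinks the role actually uses.

```python
from dataclasses import dataclass

@dataclass
class Order:
    order_id: str
    amount_pence: int
    region: str

def extract(raw_rows):
    """Parse raw rows into typed records (a stand-in for reading from S3 or a database)."""
    return [Order(r[0], int(r[1]), r[2]) for r in raw_rows]

def transform(orders):
    """Aggregate revenue per region -- the kind of business metric the analytics tools expose."""
    totals = {}
    for order in orders:
        totals[order.region] = totals.get(order.region, 0) + order.amount_pence
    return totals

def load(totals, sink):
    """Write results to a destination (here a dict standing in for a warehouse table)."""
    sink.update(totals)
    return sink

raw = [("o1", "1250", "uk"), ("o2", "900", "uk"), ("o3", "400", "de")]
warehouse = load(transform(extract(raw)), {})
```

In a production pipeline each stage would be a separate task in an orchestrator such as Airflow, so that failures can be retried per stage rather than rerunning the whole flow.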
Company: Royal London Group Contract Type: Permanent Location: Wilmslow or Edinburgh or Glasgow Working style: Hybrid 50% home/office based The Group Data Office (GDO) is responsible for enhancing the effectiveness of how data is used across Royal London (RL). The function provides Data Governance, Data Management, Data Strategy, Analytics/AI and Insight capabilities. It sets and oversees data-related policies and standards, so data is collected, processed, and exploited in accordance with the Group's risk appetite. Role Overview Royal London is … models using data modelling tools (Idera ER/Studio or similar). Data engineering/data pipeline experience, with hands-on experience of integration tools such as Azure Databricks Notebooks, Azure Data Factory or PySpark. Python extremely beneficial. About …
of Data and Enterprise architecture to build data engineering processes in AWS using a modern tech stack. Responsibilities Data Pipeline Development: Design, build, and maintain scalable data pipelines to ingest, process, and store large sets of financial data from various … pipelines and integrations. Documentation & Best Practices: Maintain clear, well-organised documentation for all data engineering projects, including data pipeline processes, cloud infrastructure, and codebase. Data Governance & Security: Implement and enforce data governance policies and security practices, ensuring compliance with … financial regulations and protecting sensitive financial data. Troubleshooting & Optimisation: Monitor and troubleshoot data pipeline performance, addressing bottlenecks or issues promptly to ensure smooth data flow and operation. Requirements Several years of experience in data engineering, preferably in the financial services or a similar regulated …
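The troubleshooting responsibility above often comes down to handling transient failures (source outages, throttling) gracefully. A common technique is retry with exponential backoff; the sketch below is illustrative only, and the `flaky_ingest` step and its parameters are invented, not taken from the listing.

```python
import time

def run_with_retries(step, retries=3, base_delay=0.01):
    """Retry a flaky pipeline step with exponential backoff.

    `step` is any zero-argument callable; on the final failed attempt
    the original exception propagates so the orchestrator can alert."""
    for attempt in range(1, retries + 1):
        try:
            return step()
        except Exception:
            if attempt == retries:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))  # back off before retrying

# Simulated source that fails twice, then succeeds.
calls = {"n": 0}
def flaky_ingest():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient source outage")
    return "ok"

result = run_with_retries(flaky_ingest)
```

Real deployments usually add jitter to the delay and distinguish retryable errors (timeouts) from permanent ones (bad credentials), which should fail fast.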
allows customers to make a single monthly payment, to receive brand new equipment, and have maintenance costs taken care of. Role overview The Data Engineer will work within the data team to assist in the design, implementation, and maintenance of our Azure-based data warehouses. The data engineer will be involved in all aspects of the data warehouse, covering integration to new data sources, design and creation of new data pipelines as well as the optimization and maintenance of existing ones. They will also … Azure-based data warehouse technologies such as Azure Data Factory, Analysis Services, SQL Server and Azure Synapse. Data Pipeline Creation: Build and optimize ETL/ELT data pipelines using Azure Data Factory, Databricks, or similar services to ensure data …
Ripponden, Yorkshire and the Humber, United Kingdom
JLA Group
allows customers to make a single monthly payment, to receive brand new equipment, and have maintenance costs taken care of. Role overview The Data Engineer will work within the data team to assist in the design, implementation, and maintenance of our Azure-based data warehouses. The data engineer will be involved in all aspects of the data warehouse, covering integration to new data sources, design and creation of new data pipelines as well as the optimization and maintenance of existing ones. They will also … Azure-based data warehouse technologies such as Azure Data Factory, Analysis Services, SQL Server and Azure Synapse. Data Pipeline Creation: Build and optimize ETL/ELT data pipelines using Azure Data Factory, Databricks, or similar services to ensure data …
Data Engineer required by our market-leading, award-winning professional services organisation based in Yeovil. As the successful Data Engineer, you'll play a vital role in designing, building, and maintaining sophisticated data pipelines and ensuring the integrity of our client's extensive customer data. Your … to develop your expertise in data engineering. Key Responsibilities Design & Build Data Pipelines: Create and maintain scalable data pipeline architecture that supports business needs. Data Management: Assemble large, complex data sets to meet business and technical requirements. Process Improvement … years of hands-on experience with big data tools and frameworks. Technical Skills: Proficiency in SQL, Python, and data pipeline tools such as Apache Kafka, Apache Spark, or AWS Glue. Problem-Solving: Strong analytical skills with the ability to troubleshoot and resolve data …
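"Ensuring the integrity of customer data" typically means putting a validation gate in the pipeline that separates good records from bad ones before loading. The sketch below is a minimal, assumed shape of such a gate; the field names (`customer_id`, `email`) are invented for illustration.

```python
def validate_rows(rows, required=("customer_id", "email")):
    """Split records into valid and rejected sets -- a common data-quality
    gate run before a load step. Rejected rows are usually written to a
    quarantine table for investigation rather than silently dropped."""
    valid, rejected = [], []
    for row in rows:
        if all(row.get(field) for field in required):
            valid.append(row)
        else:
            rejected.append(row)
    return valid, rejected

rows = [
    {"customer_id": "c1", "email": "a@example.com"},
    {"customer_id": "c2", "email": ""},   # empty email -> rejected
    {"email": "b@example.com"},           # missing id -> rejected
]
valid, rejected = validate_rows(rows)
```

Frameworks such as Great Expectations or AWS Glue Data Quality generalise this idea with declarative rule sets, but the split-and-quarantine pattern is the same.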
on long-term career growth and development opportunities in an environment that embraces your unique skills and experience. Your role and responsibilities A Data Products Designer bridges the gap between data strategy, business outcomes, and user experience by designing data products that are valuable … usable, and scalable. They work closely with business stakeholders, data teams, and technology teams to ensure data products solve real problems, fit into existing business workflows, and deliver measurable impact. Key Responsibilities: Data Product Discovery & Framing: Facilitate discovery workshops with business, data, and technology stakeholders. Frame the problem statements, user needs, and business value of data products. Define clear hypotheses, success metrics, and KPIs. User-Centered Data Product Design: Understand and map user personas, journeys, and pain points. Translate user and business needs into data …
Crawley, Sussex, United Kingdom Hybrid / WFH Options
Rentokil Pest Control South Africa
The AI Engineer is a key role in the Data Platform Portfolio team, building the data platform and driving value from data across the business. The role requires strong technical skills and business acumen to help turn millions of potential data points into actionable insights … our customer retention rates, and drive operating efficiencies across the business. The primary goals of the team are: To build and run a data platform which can create and deliver analytics to colleagues and deliver regulatory reporting. Ingest and transform data from multiple systems, modelling data … implementing ELT pipelines using tools like dbt, with strong knowledge of data warehousing, data lake concepts, and data pipeline optimisation. Skilled in SQL for data manipulation, analysis, query optimisation, and database design. Artificial Intelligence and Machine Learning Understanding of machine learning …
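The ELT pattern the listing mentions (as used by dbt) loads raw data into the warehouse first and then transforms it there with SQL. The toy below uses Python's built-in SQLite as a stand-in for a real warehouse; the table and column names are invented for illustration.

```python
import sqlite3

# Load step: raw rows land in the warehouse untransformed (the "E" and "L").
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE raw_visits (branch TEXT, visits INTEGER)")
con.executemany(
    "INSERT INTO raw_visits VALUES (?, ?)",
    [("north", 10), ("north", 5), ("south", 7)],
)

# Transform step: build a modelled table from the raw one inside the
# warehouse (the "T"). In dbt this SELECT would live in a model file.
con.execute("""
    CREATE TABLE visits_by_branch AS
    SELECT branch, SUM(visits) AS total_visits
    FROM raw_visits
    GROUP BY branch
""")
rows = dict(con.execute("SELECT branch, total_visits FROM visits_by_branch"))
```

Doing the transform inside the warehouse, rather than in an external ETL tool, is what lets engines like BigQuery or Snowflake parallelise the heavy lifting.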
to transform care delivery worldwide, ensuring every patient receives the safest, highest-quality care. Through our innovative Healthcare Operations Platform, we're connecting data to unlock trusted insights that enable improved decision-making and help deliver safer healthcare for all. At RLDatix we're making healthcare safer, together. Our … culture makes it all possible. As a team, we collaborate globally to reach our ultimate goal: helping people. We're searching for a London-based Principal Data Engineer to join our Data and Reporting Platform team, so that we can design and implement a robust data strategy … warehousing, and ETL/ELT architecture at scale; Advanced programming skills in SQL and Python; experience with orchestration and data pipeline tools; Demonstrated success delivering complex data architectures in support of cross-functional business needs; Experience with implementing data governance frameworks …
Job Title: Lead Data Engineer (AWS) Business Unit/Segment: Data Management/Analytics Location: London, United Kingdom (Flexible hybrid working) Employment Type: Permanent Salary: £80k to £100k Summary of the Role: A leading global data and AI company is looking for a Lead … with emerging data technologies and recommend improvements to data engineering practices. Develop and enforce best practices for data pipeline orchestration, testing, and deployment. Provide mentorship, feedback, and leadership across project and operational initiatives. Collaborate with cross-functional teams to design a consistent and … CD practices and code versioning systems. Key Skills and Attributes: Cloud & Data Engineering: Deep expertise in AWS services and data pipeline development. Data Modelling & Architecture: Strong background in data warehousing and modern modelling frameworks. Leadership: Ability to lead teams, provide feedback …
London, South East England, United Kingdom Hybrid / WFH Options
Focused Futures Consultancy LTD
Job Title: Lead Data Engineer (AWS) Business Unit/Segment: Data Management/Analytics Location: London, United Kingdom (Flexible hybrid working) Employment Type: Permanent Salary: £80k to £100k Summary of the Role: A leading global data and AI company is looking for a Lead … with emerging data technologies and recommend improvements to data engineering practices. Develop and enforce best practices for data pipeline orchestration, testing, and deployment. Provide mentorship, feedback, and leadership across project and operational initiatives. Collaborate with cross-functional teams to design a consistent and … CD practices and code versioning systems. Key Skills and Attributes: Cloud & Data Engineering: Deep expertise in AWS services and data pipeline development. Data Modelling & Architecture: Strong background in data warehousing and modern modelling frameworks. Leadership: Ability to lead teams, provide feedback …
Job Title: Senior Data Engineer Location: Abu Dhabi Job Summary: As a Senior Data Engineer, you will be responsible for designing, developing, and maintaining advanced, scalable data systems that power critical business decisions. You will lead the development of robust data pipelines … role requires a deep understanding of modern data engineering practices, real-time processing, and cloud-native solutions. Key Responsibilities: Data Pipeline Development & Management: Design, implement, and maintain scalable and reliable data pipelines to ingest, transform, and load structured, unstructured, and real-time data … environments. Machine Learning & Advanced Analytics Enablement: Collaborate with data scientists to prepare and serve features for ML models. Maintain awareness of ML pipeline integration and ensure data readiness for experimentation and deployment. Documentation & Continuous Improvement: Maintain thorough documentation including technical specifications, data flow …
with length of service Sick Pay Increasing with length of service The Role: DataOps Engineer Job Description We are seeking a senior DataOps Engineer to serve as our first DataOps specialist in a growing team of Data Engineers and DevOps professionals. In this pivotal role … you will focus on operationalising and automating our data lifecycle to ensure that data workflows perform with reliability and efficiency. You will integrate CI/CD data pipelines, streamline deployment processes, enforce robust data governance, and optimise operational costs within our Microsoft …/CD data pipelines and strong programming skills in Python, SQL, Bash, and PySpark for automation. Strong aptitude for data pipeline monitoring and an understanding of data security practices such as RBAC and encryption. Implemented data and pipeline observability dashboards …
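The pipeline monitoring and observability this role describes usually starts with per-step metrics: how long each step took and whether it succeeded. A minimal sketch of that idea, with an in-memory list standing in for a real metrics backend (the names here are illustrative assumptions, not the company's stack):

```python
import time
from functools import wraps

metrics = []  # stand-in for an observability sink (e.g. a dashboard backend)

def monitored(step):
    """Record the duration and outcome of a pipeline step each time it runs."""
    @wraps(step)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            result = step(*args, **kwargs)
            status = "success"
            return result
        except Exception:
            status = "failure"
            raise
        finally:
            metrics.append({
                "step": step.__name__,
                "status": status,
                "seconds": time.perf_counter() - start,
            })
    return wrapper

@monitored
def ingest():
    return [1, 2, 3]

data = ingest()
```

Dashboards built on such metrics then alert on failure rates or duration regressions, which is typically what "pipeline observability dashboards" refers to in practice.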