London (City of London), South East England, United Kingdom
Mastek
a strong understanding of data engineering principles, big data technologies, cloud computing (specifically Azure), and experience working with large datasets. Key Responsibilities: Data Pipeline Development & Optimisation: Design, develop, and maintain robust and scalable data pipelines for ingesting, transforming, and loading data from various sources (e.g., APIs, databases, financial data … technical solutions. Communicate technical concepts effectively to both technical and non-technical audiences. Participate in code reviews and knowledge-sharing sessions. Automation & DevOps: Implement automation for data pipeline deployments and other data engineering tasks. Work with DevOps teams to implement and build CI/CD pipelines for environment deployments. Promote and implement DevOps best practices. … Azure Blob Storage, and Azure SQL Database. Experience working with large datasets and complex data pipelines. Experience with data architecture design and data pipeline optimisation. Proven expertise with Databricks, including hands-on implementation experience and certifications. Experience with SQL and NoSQL databases. Experience with data quality and data governance …
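The Azure/Databricks listing above describes building ingestion and transformation pipelines from sources such as APIs and databases. As a rough illustration only — the posting specifies no implementation — the sketch below shows a minimal PySpark job that reads raw records from a landing zone, applies simple cleansing, and writes a Delta table. All paths, table names, and columns are hypothetical.

```python
# Minimal PySpark ETL sketch (illustrative only; paths and schema are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("finance-ingest").getOrCreate()

# Ingest: read raw JSON files dropped by an upstream API extract.
raw = spark.read.json("abfss://landing@examplestorage.dfs.core.windows.net/finance_raw/")

# Transform: basic de-duplication, typing, and audit columns.
curated = (
    raw.dropDuplicates(["transaction_id"])
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("ingested_at", F.current_timestamp())
       .filter(F.col("transaction_id").isNotNull())
)

# Load: append to a curated Delta table, partitioned by business date.
(curated.write.format("delta")
        .mode("append")
        .partitionBy("business_date")
        .saveAsTable("finance_curated.transactions"))
```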
Senior Data Engineer - Azure, BI & Data Strategy Location: East Yorkshire Salary: £55,000-£65,000 depending on experience Contract Type: Permanent We’re looking for a Senior Data Engineer with strong experience in Azure Data Factory, Business Intelligence and data strategy to join a forward-thinking organisation modernising its data ecosystem. This is an excellent opportunity for a Senior Data Engineer who can bridge the gap between technology and business, ensuring that data from systems such as SAP, Salesforce and factory production feeds is connected, structured and leveraged for meaningful insight. The Senior Data Engineer will take ownership of developing and integrating data pipelines across the business, supporting enterprise reporting and enabling smart decision-making through Power BI and Azure-based solutions. Key Responsibilities Design, build and maintain robust data pipelines using Azure Data Factory and the wider Azure Data ecosystem Oversee the data lake architecture, integrating sources such as SAP, Salesforce and …
Employment Type: Permanent
Salary: £55000 - £65000/annum £55,000 to £65,000 + Benefits Package
we're proud to be consumer champions in the automotive space. We're constantly exploring smarter ways to connect people with the right cars - and that's where data engineering plays a key role. You'll join a focused data team of 4 professionals (2 Data Scientists, 1 BI Developer, and yourself as Data … CI/CD pipelines for our data infrastructure, ensuring automated testing and smooth deployments. Participating in code reviews and contributing to team standards for data pipeline development. Keeping on top of the latest datasets available within the automotive space and making recommendations about new data sources. Supporting and expanding our Microsoft Azure infrastructure … optimising it for data pipeline performance. Writing comprehensive data documentation for analytics-focused entities, accelerating your colleagues' understanding of available data. What You'll Need to Succeed Essential Requirements: 2-4 years of experience building robust data pipelines in a commercial environment or through complex personal projects. Strong Python skills including experience with …
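This listing mentions automated testing within CI/CD for data infrastructure. Purely as an illustration of that idea — not the team's actual setup — the sketch below shows small pytest checks that a pipeline step might run against a transformed dataset; the file path, columns, and thresholds are invented.

```python
# Illustrative data-quality checks that a CI/CD step might run (hypothetical schema).
import pandas as pd
import pytest


@pytest.fixture
def listings() -> pd.DataFrame:
    # In a real pipeline this would load the freshly transformed output,
    # e.g. a parquet file produced by the job under test.
    return pd.read_parquet("output/vehicle_listings.parquet")


def test_no_duplicate_listing_ids(listings):
    assert listings["listing_id"].is_unique


def test_prices_are_positive(listings):
    assert (listings["price_gbp"] > 0).all()


def test_required_columns_present(listings):
    required = {"listing_id", "make", "model", "price_gbp", "first_seen_at"}
    assert required.issubset(listings.columns)
```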
uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive. What You'll Do Join the Data Layer Team, a global portfolio transforming our organization into a data-driven enterprise! The Data Layer Team is a portfolio of 30 people who build … essential data platforms, products, and capabilities to empower our clients and colleagues with high-quality, actionable insights. Our focus is on creating scalable data solutions and advancing our data infrastructure to drive informed decision-making across the company. As a Use Case Enablement Product Analyst within BCG's Data Layer Team, you will … collaborate with the Use Case Enablement Product Owner and cross-functional teams to gather and analyze business and data requirements. Your role is critical in bridging the gap between business stakeholders and technical teams, ensuring that new GenAI use cases are well-scoped, feasible, and aligned with user needs. You will work with various GenAI use cases and …
and health security Detect: use cutting-edge environmental and biological surveillance to proactively detect and monitor infectious diseases and threats to health Analyse: use world-class science and data analytics to assess and continually monitor threats to health, identifying how best to control and mitigate the risks Respond: take rapid, collaborative and effective actions nationally and locally to … lead strong and sustainable global, national, regional and local partnerships designed to save lives, protect the nation from public health threats and reduce inequalities. Job overview The UKHSA Data Design & Architecture (DD&A) team is responsible for supporting data architecture and data solutions across the organisation. We are a small team of solution architects … data architects, and data modellers. We also work closely with other parts of the organisation who are also involved in the development and maintenance of infrastructure and software, including Programme Teams, Engineering & Enablement, Surveillance Systems, Data Services & Quality, Genomics, Enterprise Architecture and the Development and Operations team. The DD&A team supervises and actively develops …
City of London, London, United Kingdom Hybrid / WFH Options
Syntax Consultancy Limited
IR35) Databricks Engineer needed with active SC Security Clearance for a 6-month contract based in Central London (Hybrid). Developing a cutting-edge Azure Databricks platform for economic data modelling, analysis, and forecasting. Start ASAP in Nov/Dec 2025. Hybrid Working - 2 days/week remote (WFH), and 3 days/week working on-site from the … Central London office. A chance to work with a leading global IT and Digital transformation business specialising in Government projects: In-depth Data Engineering + strong hands-on Azure Databricks expertise. Azure Data Services, Azure Data Factory, Azure Blob Storage + Azure SQL Database. Designing, developing, building + optimising data pipelines, implementing … including data modelling techniques + data integration patterns. Experience of working with complex data pipelines, large data sets, data pipeline optimisation + data architecture design. Implementing complex data transformations using Spark, PySpark or Scala + working with SQL/MySQL databases. Experience with data …
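The contract above mentions complex Spark/PySpark transformations alongside SQL/MySQL sources. As a hedged sketch only — nothing here reflects the actual platform — the example below reads a table over JDBC and applies a windowed transformation; the connection details, table, and columns are placeholders.

```python
# Hypothetical PySpark transformation over a JDBC source (all identifiers are placeholders).
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("indicator-transform").getOrCreate()

# Read a reference table from a relational source over JDBC.
indicators = (
    spark.read.format("jdbc")
    .option("url", "jdbc:mysql://example-host:3306/econ")
    .option("dbtable", "quarterly_indicators")
    .option("user", "reader")
    .option("password", "REDACTED")  # in practice, pull from a secret scope
    .load()
)

# Derive quarter-on-quarter change per indicator using a window function.
w = Window.partitionBy("indicator_code").orderBy("quarter_end")
enriched = indicators.withColumn(
    "qoq_change",
    F.col("value") - F.lag("value", 1).over(w),
)

enriched.write.format("delta").mode("overwrite").saveAsTable("curated.indicator_changes")
```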
Purpose of the Role We are looking for a world-class Data Architect to drive the data flows and analysis that power F1. As a Lead Data Architect, you will be responsible for partnering with systems architects and business users to understand data management and integration requirements, which help the team optimize workflows and feature delivery. You will also partner with software engineers to ensure that data integration, database architectures, and analytics capabilities are world championship material. You will partner with business functions and platform engineering to ensure the highest standards of operational excellence for our data platforms including monitoring, alerting, and problem resolution, and will work closely … warehouse, data lake) systems. Ability to partner with less technical business users to understand, document, and optimize data flows between systems. Data Pipeline Design and Implementation: Experienced in designing and implementing data pipelines (ETL/ELT/streaming) for robust data ingestion, transformation, and distribution, ensuring access to …
skills and experience, we're looking for people who are innovative, commercial and value the work that they do. Our London office is currently looking for a Senior Data Engineer. The Role The Senior Data Engineer will be responsible for designing, developing, and maintaining the infrastructure and systems required for data storage, processing, and analysis. They play a crucial role in building and managing the data pipelines that enable efficient and reliable data integration, transformation, and delivery for all data users across the enterprise. Key Responsibilities Designs and develops data pipelines that extract data from various sources, transform it into the desired format, and load it into the appropriate data storage systems Collaborates with data scientists and analysts to optimize models and algorithms for data quality, security, and governance Integrates data from different sources, including databases, data warehouses, APIs, and external systems Ensures data consistency and integrity during the integration process, performing …
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
Position: Analytics Engineer Location: London (hybrid 3 times a week) Department: IT Type: 3 months rolling contract Outside IR35 The client is seeking an experienced Analytics Engineer to join its Data & Analytics team. The role focuses on building scalable data pipelines, transforming raw data into clean, reusable datasets, and enabling data-driven decision-making. … Key Responsibilities Design and build data products, with proficiency throughout the data lifecycle. Develop robust data models through close collaboration with business users and the engineering team. Partner with senior management, operational leads, and other stakeholders, coaching and supporting a data-driven culture, including KPI definition and reporting frameworks. Accountable for data extraction and the transformation of JSON & XML, utilising strong experience in metadata management. Collaborate with data engineers to develop data, enrich product design and integrate data for predictive models or machine learning. Deliver well-defined, transformed, tested, documented, and code-reviewed datasets for analysis. Evaluate and recommend improvements in data flow, influencing and …
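The Analytics Engineer role above calls out transforming JSON and XML into clean, reusable datasets. The snippet below is a generic illustration of that kind of flattening in Python — the feed structure and field names are made up, not taken from the client.

```python
# Illustrative flattening of JSON and XML payloads into tabular rows (hypothetical fields).
import json
import xml.etree.ElementTree as ET
import pandas as pd

def flatten_json_orders(payload: str) -> pd.DataFrame:
    records = json.loads(payload)
    # json_normalize expands nested objects into flat, underscore-joined columns.
    return pd.json_normalize(records, sep="_")

def flatten_xml_orders(payload: str) -> pd.DataFrame:
    root = ET.fromstring(payload)
    rows = [
        {
            "order_id": order.get("id"),
            "customer": order.findtext("customer"),
            "total": float(order.findtext("total", default="0")),
        }
        for order in root.findall("order")
    ]
    return pd.DataFrame(rows)

# Example usage with tiny inline payloads:
print(flatten_json_orders('[{"id": 1, "customer": {"name": "A"}, "total": 9.5}]'))
print(flatten_xml_orders('<orders><order id="1"><customer>A</customer><total>9.5</total></order></orders>'))
```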
Location - London, Bristol or Manchester (1 day a month onsite) Duration - 6 months Rate - £550 - £600pd (inside IR35) As a Data Engineer in the Cyber and Domains Protection Team you will: Work within an Agile team to support the development of dashboards and build automated reports to meet the needs of technical and non-technical users Work with … the data analyst and user researcher to update relevant data models to allow business intelligence data to meet the organisation's specific needs Develop business intelligence reports that can be automated, reused and shared with users directly Implement data flows to connect operational systems, data for analytics and business intelligence … for ETL and data cataloging Amazon Redshift or Athena for data warehousing and analytics Lambda for event-driven data processing. ETL/ELT pipeline development: experience in designing, building, and maintaining robust, automated data pipelines. You should be comfortable with both the theory and practical application of extracting, transforming, and loading …
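This posting lists Lambda for event-driven data processing alongside Glue, Redshift, and Athena. As a generic, hedged example of that pattern — not the team's actual code — the handler below reacts to an S3 object-created event and starts a Glue job; the bucket, key handling, and job name are invented.

```python
# Hypothetical AWS Lambda handler: on S3 upload, start a Glue ETL job.
import json
import boto3

glue = boto3.client("glue")

def handler(event, context):
    started = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Pass the new object's location to a (hypothetical) Glue job as an argument.
        response = glue.start_job_run(
            JobName="domains-ingest-job",
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
        started.append(response["JobRunId"])
    return {"statusCode": 200, "body": json.dumps({"job_runs": started})}
```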
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Job Title: Manager - Cloud Data Architect (AWS & Snowflake) Protiviti's UK ED&A practice offers a comprehensive range of data use cases delivered through various delivery and commercial routes. We work on the full data lifecycle with highly skilled and experienced data professionals. Our solutions range from data strategy and … governance through the development, design and implementation of advanced analytics and digitisation. About the Role We are seeking a Manager-level Cloud Data Architect with deep expertise in AWS cloud services, Snowflake, and AWS-native data platforms to lead the design and implementation of scalable, secure, and high-performance data solutions. You will play … a pivotal role in shaping our cloud data strategy, driving innovation, and mentoring teams to deliver enterprise-grade data architectures that enable advanced analytics and business intelligence for our clients. Key Responsibilities Architecture & Design Design end-to-end cloud data architectures using AWS and Snowflake, ensuring scalability, security, and performance. Define and promote best …
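Given the AWS and Snowflake focus here, the fragment below sketches one common pattern — staging files in S3 and loading them into Snowflake with a COPY INTO statement via the Python connector. The account, stage, and table names are placeholders, and this is only one of several ways such a load could be wired up.

```python
# Hypothetical load from an external S3 stage into Snowflake (identifiers are placeholders).
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="loader",
    password="REDACTED",  # in practice, use key-pair auth or a secrets manager
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # COPY INTO pulls staged parquet files into a raw table.
    cur.execute(
        """
        COPY INTO RAW.CUSTOMER_EVENTS
        FROM @RAW.S3_EVENTS_STAGE/events/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
        """
    )
finally:
    conn.close()
```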
Purpose of the Job: Design, build, and maintain robust data systems and pipelines that support data storage, processing, and analysis on the Cloud. Work with large datasets, ensuring data quality, scalability, and performance, while collaborating closely with data scientists, analysts, and other engineering teams to understand their data needs and … provide them with high-quality, accessible data. They are responsible for ensuring that the underlying data infrastructure supports the organization's broader data and business goals, enabling more effective data-driven decision-making. Key Accountabilities: Design and implement scalable, efficient, and secure data architectures, ensuring optimal data flow across systems in … and process across the team along with the BI lead. Influence the evolution of business and system requirements and contribute to the design of technical solutions to feed a delivery pipeline that increasingly employs Agile methods such as SCRUM and Kanban. You will be required to develop unit-tested code and then support test cycles including post-implementation validation. You …
Overview JOB TITLE: Senior Software Engineer (GCP) - Risk Data Lab LOCATION: London HOURS: 35 hours, full time WORKING PATTERN: Hybrid, 40% (or two days) in the above office location About this opportunity Become a part of our digital transformation journey and help us modernise our data environment! As a GCP Senior Software Engineer you will be … responsible for modernising our data stack by designing, developing and maintaining data pipelines, data solutions and data infrastructure on our Strategic Data Mesh Platform. You will collaborate with cross-functional teams to ensure the efficient flow and processing of data while serving data products to a … delivery expertise using Google Cloud Platform data technologies, particularly focussing on Cloud Storage, BigQuery, containerisation (GKE), and Data Build Tool (DBT) for pipeline processing. Responsibilities Design and develop scalable, reliable, efficient and cost-effective data pipelines and ELT processes on the GCP platform. GCP Professional Data Engineer certification …
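The GCP role above centres on Cloud Storage, BigQuery, and DBT-style transformation. As a rough, hedged sketch of one piece of that stack — not the bank's actual platform — the example below loads a CSV from Cloud Storage into a BigQuery table and runs a downstream aggregation; the project, dataset, table, and bucket names are placeholders.

```python
# Hypothetical GCS-to-BigQuery load plus a follow-up query (all names are placeholders).
from google.cloud import bigquery

client = bigquery.Client(project="example-risk-project")

# Load a CSV file from Cloud Storage into a raw table.
load_job = client.load_table_from_uri(
    "gs://example-risk-landing/positions/2025-01-31.csv",
    "example-risk-project.raw.positions",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    ),
)
load_job.result()  # wait for the load to complete

# Aggregate exposure per desk into a curated table.
query = """
CREATE OR REPLACE TABLE `example-risk-project.curated.desk_exposure` AS
SELECT desk, SUM(exposure) AS total_exposure
FROM `example-risk-project.raw.positions`
GROUP BY desk
"""
client.query(query).result()
```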
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Norton Rose Fulbright LLP
Practice Group/Department: Integrations/Development & Data Management Job Description We're Norton Rose Fulbright - a global law firm with over 50 offices and 7,000 employees worldwide. We provide the world’s preeminent corporations and financial institutions with a full business law service. At Norton Rose Fulbright, our strategy and our culture are closely entwined. We … as the relevant skills and experience, we're looking for people who are innovative, commercial and value the work that they do. We are embarking on an exciting Data Programme of work and are looking for a talented Data Engineer to join our team. You will play a crucial role in building and managing data pipelines that enable efficient and reliable data integration, transformation and delivery for all data users across the EMEA Region. 12-month FTC, with the possibility of being made permanent. Responsibilities Design and develop data pipelines that extract data from various sources, transform it into the desired format, and load it into the …
Bristol, Avon, England, United Kingdom Hybrid / WFH Options
Aspire Personnel Ltd
energy and utilities, financial services, government and public services, health and life sciences, and transport. The Data Engineer will have experience in AWS cloud technologies for ETL pipelines, data warehouse and data lake design/build, and data movement. You will join the business at a period of huge growth. WE … growth area for the business with a diverse and growing capability, and we are looking for a Data Engineer with experience in AWS cloud technologies for ETL pipelines, data warehouse and data lake design/build, and data movement. AWS data and analytics services (or open-source equivalent) such … requirements You thrive in problem-solving and analytical thinking You enjoy collaborating with multiple stakeholders in a fast-paced environment Experience in the design and deployment of production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, Scala, Spark, SQL. Experience performing tasks such as writing scripts, extracting data …
Data Architect Would you like to shape the future of data platforms and drive impactful software innovations? Do you thrive in collaborative, customer-focused environments where your ideas help guide strategic decisions? About Our Team - LexisNexis Intellectual Property, which serves customers in more than 150 countries with 11,300 employees worldwide, is part of RELX, a … achieve superior results. By helping our customers achieve their goals, we support the development of new technologies and processes that ultimately advance humanity. We are looking for a Data Architect with proven experience designing and implementing data platforms using Databricks. In this mid-level role, you will play a critical part in architecting scalable data solutions that drive analytics, data science, and business intelligence efforts. You will work cross-functionally with engineering, analytics, and infrastructure teams to transform raw data into valuable enterprise assets. Key Responsibilities: Designing and implementing cloud-native data architectures using Databricks and technologies such as Delta Lake, Spark, and MLflow. Developing and maintaining …
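Since this architect role is anchored on Databricks, Delta Lake, Spark, and MLflow, the snippet below sketches a typical Delta Lake upsert (MERGE) that a curated layer might use — offered only as a generic pattern, with table and column names invented rather than drawn from the posting.

```python
# Hypothetical Delta Lake upsert into a curated dimension table (illustrative names only).
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("patent-dim-upsert").getOrCreate()

# Incoming batch of changed records from an upstream ingest step.
updates = spark.read.format("delta").load("/mnt/landing/patent_assignees_delta")

target = DeltaTable.forName(spark, "curated.dim_patent_assignee")

(
    target.alias("t")
    .merge(updates.alias("s"), "t.assignee_id = s.assignee_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```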