Director, Data Governance Location: Global Reporting to: VP - Data Excellence WHO WE ARE Choreograph is WPP's global data products and technology company. Our mission is to transform marketing by building the fastest, most connected data platform that bridges marketing strategy to … scaled activation. We partner with agencies and clients to unlock the full value of data by combining technology, data, and analytics expertise. This is delivered through the Open Media Studio, an AI-enabled media and data platform designed for the future of advertising. Our … to ensure data governance principles are integrated into all aspects of data platform development and operation. Data Pipeline Governance: Implement governance controls over data pipelines, ensuring data lineage, data quality, and security are maintained throughout the …
London, England, United Kingdom Hybrid / WFH Options
Foreign, Commonwealth and Development Office
Senior Data Engineers - Foreign, Commonwealth and Development Office. London or East Kilbride - You will be … UK-based staff work in King Charles Street, London, Abercrombie House in East Kilbride and in Milton Keynes. These roles contribute to the FCDO's Data Strategy to improve insights for the business by provisioning it with reliable and trustworthy data sources and improving their delivery in a … secure manner, in compliance with data regulations. The team is primarily Azure focused, but other cloud technologies such as Oracle and Amazon Web Services (AWS) are used in the FCDO. The successful candidates will build complex data pipelines (both Extract, Transform and Load (ETL) and Extract, Load …
engineering principles, big data technologies, cloud computing (specifically Azure), and experience working with large datasets. Key Responsibilities: Data Pipeline Development & Optimisation: Design, develop, and maintain robust and scalable data pipelines for ingesting, transforming, and loading data from various sources … effectively to both technical and non-technical audiences. Participate in code reviews and knowledge-sharing sessions. Automation & DevOps: Implement automation for data pipeline deployments and other data engineering tasks. Work with DevOps teams to implement and build CI/CD pipelines for environment deployments. Promote … SQL Database. Experience working with large datasets and complex data pipelines. Experience with data architecture design and data pipeline optimization. Proven expertise with Databricks, including hands-on implementation experience and certifications. Experience with SQL and NoSQL databases. Experience with data quality …
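The ingest-transform-load responsibility described above can be sketched in heavily simplified form. This is an illustrative example only: the field names, quality rules, and in-memory "table" are invented for the sketch and do not reflect any specific employer's pipeline.

```python
from datetime import date

# Minimal ETL sketch (illustrative only): record shapes and quality rules
# below are hypothetical, not taken from any listing's actual system.

def extract(source_rows):
    """Pull raw records from an upstream source (here: an in-memory list)."""
    return list(source_rows)

def transform(rows):
    """Clean and type-cast records; drop rows that fail basic quality checks."""
    out = []
    for row in rows:
        if not row.get("id") or row.get("amount") is None:
            continue  # data-quality gate: reject incomplete rows
        out.append({
            "id": int(row["id"]),
            "amount": round(float(row["amount"]), 2),
            "loaded_on": date.today().isoformat(),
        })
    return out

def load(rows, target):
    """Append transformed rows to a target (a list standing in for a table)."""
    target.extend(rows)
    return len(rows)

raw = [{"id": "1", "amount": "10.5"}, {"id": None, "amount": "3"}, {"id": "2", "amount": "7"}]
table = []
loaded = load(transform(extract(raw)), table)
print(loaded)  # 2 rows survive the quality gate
```

In a real Azure/Databricks stack the same three stages would be Spark jobs over a data lake rather than list operations, but the shape — validate during transform, load only what passes — is the same.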
london (city of london), south east england, united kingdom
Mastek
engineering principles, big data technologies, cloud computing (specifically Azure), and experience working with large datasets. Key Responsibilities: Data Pipeline Development & Optimisation: Design, develop, and maintain robust and scalable data pipelines for ingesting, transforming, and loading data from various sources … effectively to both technical and non-technical audiences. Participate in code reviews and knowledge-sharing sessions. Automation & DevOps: Implement automation for data pipeline deployments and other data engineering tasks. Work with DevOps teams to implement and build CI/CD pipelines for environment deployments. Promote … SQL Database. Experience working with large datasets and complex data pipelines. Experience with data architecture design and data pipeline optimization. Proven expertise with Databricks, including hands-on implementation experience and certifications. Experience with SQL and NoSQL databases. Experience with data quality …
Job Description Job Title: Data Architect Location: London - 3 days travel to office SC Cleared: Required Job Type: Full-Time Experience: 10+ years Job Summary: We are seeking a highly experienced and visionary Data Architect to lead the design and implementation of the data … our cutting-edge Azure Databricks platform focused on economic data. This platform is crucial for our Monetary Analysis, Forecasting, and Modelling efforts. The Data Architect will be responsible for defining the overall data strategy, data models, data governance framework, and data integration patterns. This role requires a deep understanding of data warehousing principles, big data technologies, cloud computing (specifically Azure), and a strong grasp of data analysis concepts within the economic domain. Key Experience: Extensive Data Architecture Knowledge: They possess a …
advantage, and its technology-led service provides access to all major exchanges, order-flow management via screen, voice and DMA, plus award-winning data, insights and analytics. The Technology Department delivers differentiation, scalability and security for the business. Reporting to the COO, Technology provides digital tools, software services … our enterprise-wide services to end users and actively manages the firm's infrastructure and data. Within IT, Marex Technology has established a Data team that enables the firm to leverage data assets to increase productivity and improve business decisions, as well as maintain data compliance. The Data Team encompasses Database Administration, Data Engineering, Data Analysis, Data Architecture, Data Intelligence and AI expertise. In recent years, they have developed a Data Lakehouse architecture that is relied upon by different departments across …
get in touch at UKI.recruitment@tcs.com or call the TCS London Office on 02031552100 with the subject line: "Application Support Request". Role: Data Architect Job Type: Permanent (Hybrid) Location: London, United Kingdom Ready to utilize your skills in designing, creating, and managing data architecture? Join … us as a Data Architect. Careers at TCS: It means more TCS is a purpose-led transformation company, built on belief. We do not just help businesses to transform through technology. We support them in making a meaningful difference to the people and communities they serve - our clients … an exciting team where you will be challenged every day. • Build strong relationships with a diverse range of stakeholders. The Role As a Data Architect, you will be responsible for designing, creating, and managing data architecture. You will also ensure that data is efficiently …
on financial and tax planning, offering home finance and annuities propositions, and providing collective fund solutions to third party customers. Job Summary The Data Engineer is a hands-on technical role responsible for designing, developing, and maintaining data pipelines within the IT department. The pipelines will … be realised in a modern lake environment, and the engineer will collaborate in cross-functional teams to gather requirements and develop the conceptual data models. This role plays a crucial part in driving data-driven decision-making across the organisation, ensuring data availability, quality … guiding and upskilling more junior data engineers and setting data standards and guidelines. What you'll do Data Pipeline Development: Design, model, develop and maintain data pipelines to ingest, store, process, and present data. Ensure data quality, accuracy, and …
Description Data System Reliability Engineer (dSRE) Role Overview: A crucial role in CME's Cloud data transformation, the data SRE will be aligned to data product pods, ensuring that our data infrastructure is reliable, scalable, and efficient as the GCP … data footprint expands rapidly. Accountabilities: Automate data tasks on GCP Work with data domain owners, data scientists, and other stakeholders to ensure that data is consumed effectively on GCP Design, build, secure, and maintain data infrastructure, including data pipelines, databases, data warehouses, and data processing platforms on GCP Measure and monitor the quality of data on GCP data platforms Implement robust monitoring and alerting systems to proactively identify and resolve issues in data systems. Respond to …
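The monitoring-and-alerting accountability above amounts to checks like "has this table loaded within its SLA?". A minimal sketch of such a freshness monitor follows; the table names and the 24-hour SLA are invented for illustration, and a production version would read load timestamps from platform metadata rather than a dict.

```python
from datetime import datetime, timedelta, timezone

# Hedged sketch of a freshness monitor a data SRE might run on a schedule.
# Table names and the 24h SLA threshold are hypothetical.

FRESHNESS_SLA = timedelta(hours=24)

def stale_tables(last_loaded, now=None):
    """Return names of tables whose most recent load breaches the SLA."""
    now = now or datetime.now(timezone.utc)
    return sorted(t for t, ts in last_loaded.items() if now - ts > FRESHNESS_SLA)

now = datetime(2024, 1, 2, 12, 0, tzinfo=timezone.utc)
loads = {
    "trades": datetime(2024, 1, 2, 6, 0, tzinfo=timezone.utc),       # fresh
    "positions": datetime(2023, 12, 30, 6, 0, tzinfo=timezone.utc),  # stale
}
print(stale_tables(loads, now))  # ['positions']
```

The returned list would then feed an alerting channel (PagerDuty, email, etc.) so stale tables are investigated before downstream consumers notice.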
advantage, and its technology-led service provides access to all major exchanges, order-flow management via screen, voice and DMA, plus award-winning data, insights and analytics. The Technology Department delivers differentiation, scalability and security for the business. Reporting to the COO, Technology provides digital tools, software services … our enterprise-wide services to end users and actively manages the firm's infrastructure and data. Within IT, Marex Technology has established a Data team that enables the firm to leverage data assets to increase productivity and improve business decisions, as well as maintain data compliance. The Data Team encompasses Data Analysis, Data Architecture, Data Intelligence and Machine Learning expertise. In recent years, they have developed a Data Lakehouse architecture that is relied upon by different departments across the firm. Marex now seeks …
and is responsible for driving process transformation, risk reduction, and efficiencies through best-in-class technology solutions. This includes development of shared services, comprehensive data architecture, automation, end-to-end solution design for large-scale transformation initiatives, and adoption of Generative AI solutions. Job Purpose: The Central Operations Enterprise … Data Architect role is a critical hire in the group, becoming part of the team that drives data strategy, architecture, and design across critical initiatives: payments services, global reconciliations, regulatory reporting, liquidity management, and the strategic general ledger. The role requires hands-on experience in designing … executing, and managing enterprise-wide data architecture, including design of frameworks to standardize data management, consumption, and data quality controls. The successful candidate will work across disciplines in engineering, architecture, and technology teams spanning Operations, Finance, CDO, and Technology functions to simplify data …
Data Manager (added yesterday) Job Description: This is an exciting opportunity, with candidates to start and interview as soon as possible! The Data Manager must have a minimum of one year of clinical research experience, as they need an individual who can work independently as well as part of a … Day-to-day responsibilities will involve …
for large blue-chip companies and governments worldwide. We're pioneers of meaningful AI: our solutions go far beyond chatbots. We are using data and AI to solve the world's biggest issues in telecommunications, sustainable water management, energy, healthcare, climate change, smart cities, and other areas that … engineers and scientists working across Africa, Europe, the UK and the US. ABOUT THE ROLE Sand Technologies focuses on cutting-edge cloud-based data projects, leveraging tools such as Databricks, DBT, Docker, Python, SQL, and PySpark, to name a few. We work across a variety of data … You will work closely with cross-functional teams and contribute to the strategic direction of our data initiatives. RESPONSIBILITIES Data Pipeline Development: Lead the design, implementation, and maintenance of scalable data pipelines for ingesting, processing, and transforming large volumes of data from …
Data Pipeline Development: Design and implement end-to-end data pipelines in Azure Databricks, handling ingestion from various data sources, performing complex transformations, and publishing data to Azure Data Lake or other storage services. Write efficient and standardized Spark … SQL and PySpark code for data transformations, ensuring data integrity and accuracy across the pipeline. Automate pipeline orchestration using Databricks Workflows or integration with external tools (e.g., Apache Airflow, Azure Data Factory). Data Ingestion & Transformation: Build scalable data … for performance by tuning configurations, partitioning data, and caching intermediate results to minimize processing time and resource consumption. Continuously monitor and improve pipeline performance, addressing bottlenecks and optimizing for cost efficiency in Azure. Automation & Monitoring: Automate data pipeline deployment and management using tools like …
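The orchestration responsibility above — running ingest, transform, and publish steps in dependency order — is what tools like Airflow and Databricks Workflows do at scale. A toy dependency-ordered runner can sketch the idea; the task names and dependency graph here are hypothetical, and real orchestrators add scheduling, retries, and cycle detection on top.

```python
# Illustrative orchestration sketch: run pipeline tasks so that every task's
# upstreams complete first. Not a real Airflow/Databricks API, just the idea.

def run_dag(tasks, deps):
    """Run callables in an order that respects deps (task -> set of upstreams)."""
    done, order = set(), []
    def run(name):
        if name in done:
            return
        for upstream in deps.get(name, ()):
            run(upstream)  # recurse so upstream tasks run first
        tasks[name]()
        done.add(name)
        order.append(name)
    for name in tasks:
        run(name)
    return order

log = []
tasks = {
    "publish": lambda: log.append("publish"),
    "ingest": lambda: log.append("ingest"),
    "transform": lambda: log.append("transform"),
}
deps = {"transform": {"ingest"}, "publish": {"transform"}}
order = run_dag(tasks, deps)
print(order)  # ['ingest', 'transform', 'publish']
```

Even though "publish" is declared first, it runs last, because the runner walks its upstream chain before executing it — the same guarantee an orchestrator gives a Databricks job graph.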
London or East Kilbride, East Kilbride, South Lanarkshire, Scotland Hybrid / WFH Options
Government Digital & Data
UK-based staff work in King Charles Street, London, Abercrombie House in East Kilbride and in Milton Keynes. These roles contribute to the FCDO's Data Strategy to improve insights for the business by provisioning it with reliable and trustworthy data sources and improving their delivery in a … secure manner, in compliance with data regulations. The team is primarily Azure focused, but other cloud technologies such as Oracle and Amazon Web Services (AWS) are used in the FCDO. The successful candidates will build complex data pipelines (both Extract, Transform and Load (ETL) and Extract, Load … and Transform (ELT)) in the Azure cloud platforms. You will work with structured and unstructured data, data lakes, and data warehouses to service operational and analytical business needs. Job description The successful candidates will: Work collaboratively with Enterprise Data colleagues to develop pipelines …
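The ETL-vs-ELT distinction named in the listing comes down to where the transform happens: ETL transforms before loading, while ELT loads raw data first and transforms inside the warehouse. A minimal sketch of the contrast, with a plain dict standing in for the warehouse and a trivial cleaning function (both invented for illustration):

```python
# Sketch of ETL vs ELT: same transform, different placement.
# The dict "warehouse" and str.strip "transform" are illustrative stand-ins.

def etl(raw, warehouse, transform):
    warehouse["curated"] = [transform(r) for r in raw]  # transform, then load

def elt(raw, warehouse, transform):
    warehouse["raw"] = list(raw)  # load raw data first
    warehouse["curated"] = [transform(r) for r in warehouse["raw"]]  # transform in the warehouse

clean = str.strip
wh_etl, wh_elt = {}, {}
etl(["  a ", "b "], wh_etl, clean)
elt(["  a ", "b "], wh_elt, clean)
print(wh_etl)  # {'curated': ['a', 'b']}
print(wh_elt)  # {'raw': ['  a ', 'b '], 'curated': ['a', 'b']}
```

The practical difference is that ELT keeps the untransformed source available in the warehouse for reprocessing and audit, at the cost of storing it, which is why modern lake platforms such as Azure's tend to favour it.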
for large blue-chip companies and governments worldwide. We're pioneers of meaningful AI: our solutions go far beyond chatbots. We are using data and AI to solve the world's biggest issues in telecommunications, sustainable water management, energy, healthcare, climate change, smart cities, and other areas that … engineers and scientists working across Africa, Europe, the UK and the US. ABOUT THE ROLE Sand Technologies focuses on cutting-edge cloud-based data projects, leveraging tools such as Databricks, DBT, Docker, Python, SQL, and PySpark, to name a few. We work across a variety of data … Load) processes, and big data technologies, it becomes possible to develop robust and reliable data solutions. RESPONSIBILITIES Data Pipeline Development: Design, implement, and maintain scalable data pipelines for ingesting, processing, and transforming large volumes of data from various sources …