Birmingham, West Midlands, United Kingdom Hybrid/Remote Options
Crimson
and continuous improvement. Provided secure, scalable, reusable integration capabilities for delivery teams and projects. Strong knowledge of enterprise integration patterns and tools (APIs, ESB, messaging queues, event-driven architectures, ETL/ELT). Familiarity with cloud-native, hybrid, and middleware integration solutions (e.g., Azure Integration Services). Understanding of delivery methodologies (waterfall, agile, lean) as they impact integration planning and More ❯
and other relevant standards. Evaluate, select, and implement data governance tools (e.g. data catalogues, data quality tools). Lead the deployment of modern data platform technologies including Data Warehouses, ETL services, and cloud-based solutions. Promote a data-driven culture through organization-wide training and awareness initiatives. Develop and deliver role-specific training for data owners, custodians, and other key … and physical data modelling, as well as expertise in data warehousing, data lake architecture, and Master Data Management (MDM). The role also demands proficiency in data integration and ETL processes, including the design and orchestration of data pipelines and the use of ETL/ELT tools. Candidates should be experienced with a range of database technologies, particularly cloud-based More ❯
Strong expertise with MSSQL Server (schema design, tuning, indexing, profiling). Advanced SQL and dimensional data modelling (SCDs, fact/dim, conformed dimensions). Experience with PostgreSQL optimisation. Advanced Python skills. ETL/ELT Pipelines: Hands-on experience building pipelines using SSIS, dbt, Airflow, or similar. Strong understanding of enterprise ETL frameworks, lineage, and data quality. Cloud & Infrastructure: Experience designing and supporting … build, and maintain OLAP, Tabular, and Multidimensional models used across the business. Develop semantic models and robust data structures for Power BI and Excel cube connectivity. Create and optimise ETL/ELT pipelines integrating data from S3 and diverse source systems. Administer and tune MSSQL Server and PostgreSQL for high performance and reliability. Ensure model scalability, accuracy, consistency, and rapid More ❯
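The slowly changing dimension (SCD) modelling this listing asks for can be sketched in a few lines. Below is a minimal SCD Type 2 example using SQLite as a stand-in for MSSQL Server; the table and column names (dim_customer, valid_from/valid_to, is_current) are illustrative assumptions, not from the listing:

```python
import sqlite3

# Minimal SCD Type 2 sketch: when a tracked attribute changes, close the
# current dimension row and insert a new version. Schema is illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,
        is_current  INTEGER
    )""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Leeds', '2024-01-01', '9999-12-31', 1)")

def apply_scd2(conn, customer_id, new_city, effective):
    """Close the current row and open a new version if the attribute changed."""
    row = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id = ? AND is_current = 1",
        (customer_id,)).fetchone()
    if row and row[0] != new_city:
        # Expire the old version as of the effective date...
        conn.execute(
            "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
            "WHERE customer_id = ? AND is_current = 1",
            (effective, customer_id))
        # ...and open a new current version.
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
            (customer_id, new_city, effective))

apply_scd2(conn, 1, "Manchester", "2024-06-01")
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from").fetchall()
print(rows)  # → [('Leeds', 0), ('Manchester', 1)]
```

In a production warehouse the same close-and-insert logic is usually expressed as a single MERGE statement or handled by a tool such as dbt snapshots; the sketch only shows the versioning principle.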
Stockport, Cheshire, England, United Kingdom Hybrid/Remote Options
Robert Walters
role in designing, building, and scaling robust data platforms that support analytics, reporting, and advanced insight across the organisation. Key Responsibilities of the Role: Design, build, and maintain reliable ETL processes and end-to-end data pipelines. Develop and manage data warehousing solutions to support analytics and reporting needs. Work with both structured and unstructured data, ensuring high-quality data … Work both independently and collaboratively, demonstrating proactive problem-solving and critical thinking. Key Experience Needed: 5+ years of experience in Data Engineering or a similar role. Proven experience building ETL pipelines and large-scale data systems. Strong understanding of data warehousing, modelling, and transformation. Experience with cloud platforms (AWS preferred). Exposure to Snowflake, or willingness to learn. Familiarity with More ❯
deliver high-quality solutions in a fast-paced environment. Key Responsibilities Design and implement scalable data pipelines using Azure Data Factory, Databricks, and Synapse Analytics. Develop and optimise ETL processes for structured and semi-structured data. Work with SQL and Python for data transformation and modelling. Integrate data from multiple sources, ensuring accuracy, consistency, and performance. Collaborate with stakeholders … in enterprise environments. Strong hands-on expertise with Azure Data Factory, Databricks, Synapse, and Azure Data Lake. Proficiency in SQL, Python, and PySpark. Experience with data modelling, ETL optimisation, and cloud migration projects. Familiarity with Agile delivery and CI/CD pipelines. Excellent communication skills for working with technical and non-technical teams. Interested? Apply now or More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
ll Deliver End-to-end data solutions covering acquisition, engineering, modelling, analysis, and visualisation. Client workshops and communication at both technical and business levels. Design and implementation of robust ETL/ELT solutions using the Microsoft/Azure ecosystem (Fabric/Databricks). Development of data lakehouse architectures using a medallion design approach. Scalable engineering solutions that meet current and … Ability to build strong, collaborative relationships with teams and clients. Practical experience in data engineering or data warehousing with Azure/Microsoft or SQL Server technologies. Skills in developing ETL/ELT pipelines using Azure Synapse, Data Factory, Databricks, or Fabric, with SQL and Python. Strong understanding of data lake and lakehouse architectures. Experience working with large, complex datasets from More ❯
Atherstone, Warwickshire, England, United Kingdom Hybrid/Remote Options
Aldi
using industry-level best practices. Reporting to the Platform and Engineering Manager, the candidate will be required to design and manage data warehousing solutions, including the development of data models, ETL processes and data integration pipelines for efficient data consolidation, storage and retrieval, providing technical guidance and upskilling for the team, and conducting monitoring and optimisation activities. If you’re a … winning employer, apply to join #TeamAldi today! Your New Role: Project Management of demands and initiatives Lead the design and implementation of data warehousing Design and develop data models, ETL processes and data integration pipelines Complete Data Engineering end-to-end ownership of demand delivery Provide technical guidance for team members Providing 2nd or 3rd level technical support About You More ❯
analytics, or data management, with hands-on expertise in cloud-based data platforms (e.g., Azure, AWS, GCP, Snowflake). Proven expertise in designing, developing, and maintaining scalable data pipelines, ETL/ELT processes, and integrations to support advanced analytics. Experience with data governance frameworks, master data management (MDM), metadata management, and ensuring data compliance with global standards. Deep understanding of … fully leverage the data models, governance structures, and best practices established at the AOE level. Data Engineering & Infrastructure: Oversee the design, development, and maintenance of data pipelines, integrations, and ETL processes to ensure efficient data flow and accessibility for analytics use cases. Collaboration & Stakeholder Management: Act as the key connection between AOE D&A and PNE, facilitating knowledge-sharing, alignment More ❯
the metrics, responsible for architecting the robust data pipelines that fuel our clients' growth and performance. We stay curious by constantly exploring new technologies like BigQuery and cutting-edge ETL tools, and we get stuck in by collaborating closely with our strategy, development, and marketing teams to turn raw data into transparent, high-impact insights. If you love the challenge … OneTrust or Cookiebot. Build compelling dashboards and reports in tools like Looker Studio, Power BI, or Tableau, delivering key insights on web traffic and performance. Build and maintain ETL/ELT pipelines (e.g., acquisition → BigQuery → Looker Studio) to ensure reliable data flow for reporting. Manage data warehousing, including defining and optimising data schemas and staging tables for efficiency and … experience managing cookie consent platforms (e.g., OneTrust, Cookiebot). Strong expertise in SQL; it's essential for working with and optimising data warehouses. Proven experience building and maintaining ETL/ELT pipelines, whether custom or using tools like dbt, Fivetran, or Airbyte. Experience with BigQuery or other major cloud-based data warehouses. An adaptable, organised, and detail-oriented mindset More ❯
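The acquisition-to-warehouse pipeline described in this listing reduces to three steps: extract raw data, transform it, load it into a staging table. The sketch below uses SQLite as a local stand-in for a warehouse such as BigQuery; the stg_sessions schema and field names are assumptions for illustration only:

```python
import csv
import io
import sqlite3

# Toy ETL sketch: raw acquisition data -> typed rows -> staging table.
# sqlite3 stands in for a cloud warehouse; schema is illustrative.
RAW = """date,channel,sessions
2024-06-01,organic,120
2024-06-01,paid,80
2024-06-02,organic,95
"""

def extract(raw_csv):
    # Parse the raw feed into dict rows.
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    # Cast types and normalise channel labels before loading.
    return [(r["date"], r["channel"].strip().lower(), int(r["sessions"]))
            for r in rows]

def load(conn, rows):
    # Land the cleaned rows in a staging table for downstream reporting.
    conn.execute("CREATE TABLE IF NOT EXISTS stg_sessions "
                 "(date TEXT, channel TEXT, sessions INTEGER)")
    conn.executemany("INSERT INTO stg_sessions VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(RAW)))
total = conn.execute("SELECT SUM(sessions) FROM stg_sessions").fetchone()[0]
print(total)  # → 295
```

With a real warehouse the load step would typically hand off to a bulk-load API or a tool like dbt, Fivetran, or Airbyte (all named in the listing), but the extract/transform/load split stays the same.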
Contract Data Analyst (BigQuery) – Inside IR35 Location: Manchester - 2 days a week in the office Contract: 3 months (Inside IR35) Day Rate: Competitive (Inside IR35) Start Date: ASAP About the Role I'm seeking an experienced Data Analyst with deep More ❯
Sunbury-On-Thames, London, United Kingdom Hybrid/Remote Options
BP Energy
transformation, Data domain knowledge, Data Integration, Data Management, Data Manipulation, Data Sourcing, Data strategy and governance, Data Structures and Algorithms (Inactive), Data visualization and interpretation, Digital Security, Extract, transform and load, Group Problem Solving Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national More ❯
Huddersfield, West Yorkshire, Yorkshire, United Kingdom
Quest Global Engineering Limited
to help define migration scope, strategy, and approach. Collaborate with functional consultants and business stakeholders to capture data mapping requirements (including transformation rules). Develop, configure, execute, and optimise Extract-Transform-Load scripts. Support migration planning activities. Familiarity with Informatica or Pentaho would be advantageous. Execute migrations, carry out validation checks, and prepare status reports. Communicate data cleansing More ❯
data-driven insights to enable cost savings. What you need to succeed at GXO: Strong Excel and WMS skills (Manhattan) and experience with relational databases Understanding of data extraction, ETL processes, and data formats (CSV, JSON, Parquet) Ability to design impactful visuals that influence business decisions Proven experience in business partnering with operational and financial teams Excellent communication skills and More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions Ltd
as a skilled Salesforce Data Cloud specialist. Delivered two successful end-to-end Salesforce Data Cloud implementations. Strong expertise in designing scalable enterprise-level data architecture solutions. Experienced in ETL tools, data migration, and data cleansing practices. Proficient in writing and optimizing moderate to advanced SQL queries. Preferably a Salesforce Data Cloud Consultant certification holder. What to do next If More ❯
major transformation programmes. Key skills required Deep knowledge of PostgreSQL, Snowflake and Greenplum Snowflake internals, schemas, modelling, data lakes and integration patterns Data ingestion using Informatica, Talend and similar ETL tooling Strong experience handling JSON, XML, CSV and multi-source datasets Patroni expertise for HADR and streaming replication Backup, recovery, tuning and optimisation across Postgres, Snowflake and Greenplum Understanding of More ❯
reporting and data solutions that enhance efficiency, automate processes, and drive strategic decision-making. Your primary responsibilities will include gathering, cleaning, and integrating data from diverse sources, developing efficient Extract, Transform, Load (ETL) procedures, and utilizing BI tools like Tableau and Power BI to create clear, impactful dashboards and reports. A deep understanding and application of advanced Excel functionalities, Power More ❯
PostGIS, MS SQL Spatial and any other big data platform like MongoDB. Sufficient understanding of Vector Databases for the implementation of AI modules and techniques. Experience in geospatial data ETL (extraction, transformation and loading) to a spatial data warehouse environment, with a strong emphasis on CAD/BIM to GIS data processing with experience using ETL tools to automate data More ❯
or equivalent experience. Experience writing SQL queries and optimising query performance. Strong skills in Python. Experience with AWS or GCP. Demonstrated experience and responsibility with data, processes, and building ETL pipelines. Experience with cloud data warehouses such as Amazon Redshift, and Google BigQuery. Building visualizations using tools such as Looker Studio or equivalent. Experience in designing ETL/ELT solutions More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Adecco
operations. Your expertise will be crucial as we gear up for an exciting data warehouse migration from New York to London in 2026! Key Responsibilities Analyse and optimise SSIS ETL pipelines and batch jobs. Improve SQL performance through effective indexing and execution plans. Identify and resolve locking and blocking issues to enhance efficiency. Apply best practices to boost overall warehouse … is essential. Solid experience with SSAS and SSRS tools. Deep understanding of execution plans and performance tuning techniques. Strong troubleshooting and problem-solving skills. Proven history of improving complex ETL environments. Nice to Have Experience with C# for SSIS scripts. Proficiency in Python. Exposure to Power BI for data visualisation. Why Join Us? Be part of a vibrant team that More ❯
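The indexing and execution-plan work this role centres on can be demonstrated in miniature. The sketch below uses SQLite's EXPLAIN QUERY PLAN as a simplified analogue of a SQL Server execution plan; the orders table and index name are illustrative, and the exact plan wording varies by SQLite version, but the shift from a full scan to an index search is the point:

```python
import sqlite3

# Show how adding an index changes the execution plan for a filter query.
# Table, column, and index names are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders "
             "(id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

def plan(conn, sql):
    # EXPLAIN QUERY PLAN rows carry the human-readable detail in column 3.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(conn, query)   # full table scan, e.g. "SCAN orders"
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(conn, query)    # e.g. "SEARCH orders USING INDEX idx_orders_customer ..."
print(before)
print(after)
```

On SQL Server the equivalent check is reading the actual execution plan (a table scan versus an index seek) and the same trade-off applies: the index speeds the read at the cost of extra work on every write.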
London, England, United Kingdom Hybrid/Remote Options
Free-Work UK
data strategy, improve how teams access and use data, and ensure our platforms remain secure, resilient, and future-ready. design, build, and maintain scalable, high-quality data pipelines and ETL/ELT workflows that support analytics, reporting, and product development. assess, recommend, and implement modern data technologies and tooling to meet organisational requirements. develop and optimise data models, warehouses, and … primary development language, and the ability to write clean, scalable, production-ready code, then we want to hear from you. You should also have: demonstrable experience designing and building ETL/ELT data pipelines and integrating data from multiple upstream sources. proven cloud experience (Azure (preferred), AWS, or GCP), including deploying, managing, and supporting cloud-hosted data services and applications. … support them. We are seeking an experienced Data Engineer who will play a central role in building and supporting the new platform. The role will focus on developing the ETL data pipeline framework to integrate upstream master data from parliamentary business systems, as well as engineering new data services, including APIs and web-client applications. The Data Engineer will be More ❯
support a major government programme delivering secure, scalable data solutions. Key Responsibilities Design and implement data pipelines on AWS using services such as Glue, Lambda, S3, and Redshift. Develop ETL processes and optimise data workflows for performance and security. Collaborate with analysts and architects to ensure compliance with government security standards. Troubleshoot and resolve issues in complex cloud environments. Essential … Skills Strong experience with AWS services (Glue, Lambda, S3, Redshift, IAM). Proficiency in Python and SQL for data engineering tasks. Knowledge of data modelling, ETL frameworks, and best practices. Familiarity with security and compliance in government or regulated environments. Excellent communication and problem-solving skills. Active SC clearance (mandatory). Desirable Experience with Terraform or CloudFormation. Exposure to CI More ❯