Implementing data engineering best practices (e.g., source-to-target mappings, coding standards, data quality), working closely with the external party who set up the environment. Create and maintain ETL processes, data mappings, and transformations to orchestrate data integrations. Ensure data integrity, quality, privacy, and security across systems, in line with client and regulatory requirements. Optimize data solutions for performance and … up monitoring and data quality exception handling. Strong data modelling experience. Experience managing and developing CI/CD pipelines. Experience with Microsoft Azure products and services, and proficiency in ETL processes. Experience working with APIs to integrate data flows between disparate cloud systems. Strong analytical and problem-solving skills, with the ability to work independently and collaboratively. The aptitude …
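To ground the source-to-target mapping and data quality exception handling this listing refers to, here is a minimal, hedged Python sketch; the column names, mapping, and validation rules are hypothetical illustrations rather than anything specified by the role.

```python
import pandas as pd

# Hypothetical source-to-target mapping: source column -> target column
SOURCE_TO_TARGET = {
    "cust_id": "customer_id",
    "cust_nm": "customer_name",
    "ord_dt": "order_date",
}

def transform(source: pd.DataFrame) -> pd.DataFrame:
    """Rename source columns to target names per the mapping and apply basic typing."""
    target = source.rename(columns=SOURCE_TO_TARGET)[list(SOURCE_TO_TARGET.values())].copy()
    target["order_date"] = pd.to_datetime(target["order_date"], errors="coerce")
    return target

def quality_checks(target: pd.DataFrame) -> list[str]:
    """Return a list of data-quality exceptions rather than failing silently."""
    issues = []
    if target["customer_id"].isna().any():
        issues.append("customer_id contains nulls")
    if target["customer_id"].duplicated().any():
        issues.append("customer_id contains duplicates")
    if target["order_date"].isna().any():
        issues.append("order_date failed to parse for some rows")
    return issues

if __name__ == "__main__":
    src = pd.DataFrame({
        "cust_id": [1, 2, 2],
        "cust_nm": ["A", "B", "C"],
        "ord_dt": ["2024-01-01", "bad", "2024-02-01"],
    })
    print(quality_checks(transform(src)))
```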
Newcastle Upon Tyne, United Kingdom Hybrid / WFH Options
NHS Business Services Authority
deliver measurable outcomes. Experience working in an environment where written and verbal communication are essential to delivering outcomes. Desirable: Awareness of data engineering concepts, including data warehousing or ETL processes. Exposure to deployment pipelines or CI/CD concepts, with a willingness to learn how code is tested and promoted into production environments. Familiarity with reporting and dashboarding tools …
and enhance report performance and usability. Automation and Workflow Management: Use Power Automate to streamline data workflows and automate routine reporting tasks. Support cross-system data integration where applicable. ETL Development and Modernisation: Build and maintain ETL pipelines using best-practice methods and modern tooling. Support data transformation and movement across systems and platforms. Participate in the adoption and maintenance … T-SQL, including writing queries and procedures and managing database structures. Proficient in Power BI dashboard development and DAX formulas. Experience with Power Automate for data-driven workflows. Understanding of ETL concepts and processes. Exposure to modern data platforms such as Azure Data Lake, Databricks, or Microsoft Fabric is a bonus. Analytical Skills: Ability to understand complex data structures and derive …
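As an illustration of the kind of T-SQL-driven ETL step this listing describes, here is a minimal, hedged Python sketch using pyodbc; the connection string, database, and stored procedure name are hypothetical assumptions.

```python
import pyodbc

# Hypothetical connection details; server, database, credentials, and driver version are assumptions.
CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-server.database.windows.net;"
    "DATABASE=Reporting;"
    "UID=etl_user;PWD=change_me;Encrypt=yes;"
)

def run_nightly_load(load_date: str) -> None:
    """Call a (hypothetical) T-SQL stored procedure that stages and loads one day's data."""
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.execute("EXEC dbo.usp_LoadDailySales @LoadDate = ?", load_date)
        conn.commit()

if __name__ == "__main__":
    run_nightly_load("2024-06-01")
```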
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
data and analytics needs. Design and deploy end-to-end data solutions using Microsoft Fabric, encompassing data ingestion, transformation, and visualisation workflows. Construct and refine data models, pipelines, and ETL frameworks within the Fabric ecosystem. Leverage Fabric's suite of tools to build dynamic reports, dashboards, and analytical applications. Maintain high standards of data integrity, consistency, and system performance across …
intelligence and reporting tools like Tableau, Power BI, or similar. Experience with version control systems (e.g., Git). Ability to work in an Agile environment. Experience with Microsoft SQL. Experience with ETL tools and data migration. Experience with data analysis, data mapping, and UML. Experience with programming languages (Python, Ruby, C++, PHP, etc.). The ability to work with large datasets across …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
The Boeing Company
decision-making for our customers and partners. Position Responsibilities: Integrate various data sources and systems through a data pipeline into a data warehouse. Maintain and improve data warehouse Extract, Transform, Load (ETL) processes and the data mart. Quickly become familiar with Boeing Global IT-related processes to enable implementation of requests and discovery of opportunities to automate or otherwise improve processes. Develop …
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
using Azure Data Factory (ADF), ensuring efficient and reliable data movement and transformation. • Data modelling using Kimball, 3NF, or dimensional methodologies. • Utilize SQL and Python to extract, transform, and load data from various sources into Azure Databricks and Azure SQL/SQL Server. • Design and implement metadata-driven pipelines to automate data processing tasks. • Collaborate with cross-functional teams …
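As a rough sketch of what a metadata-driven pipeline like the one described here can look like, the following hedged Python/PySpark example drives ingestion from a small metadata list; the storage paths, formats, and table names are hypothetical, and it assumes a Databricks-style environment where Delta and the target schemas already exist.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("metadata_driven_ingest").getOrCreate()

# Hypothetical pipeline metadata; in practice this would live in a control table or config store.
PIPELINE_METADATA = [
    {"source_path": "abfss://raw@datalake.dfs.core.windows.net/sales/", "format": "parquet", "target_table": "bronze.sales"},
    {"source_path": "abfss://raw@datalake.dfs.core.windows.net/customers/", "format": "csv", "target_table": "bronze.customers"},
]

def ingest(entry: dict) -> None:
    """Read one source described by metadata and append it to its target Delta table."""
    reader = spark.read.format(entry["format"])
    if entry["format"] == "csv":
        reader = reader.option("header", "true")
    df = reader.load(entry["source_path"])
    df.write.format("delta").mode("append").saveAsTable(entry["target_table"])

for entry in PIPELINE_METADATA:
    ingest(entry)
```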
Build and optimize datasets for performance and reliability in Azure Databricks. Collaborate with analysts and business stakeholders to translate data requirements into robust technical solutions. Implement and maintain ETL/ELT pipelines using Azure Data Factory or Synapse Pipelines. Design and develop a fit-for-purpose enterprise data warehouse to serve reporting and analytics. Ensure data quality, data … Azure Data Lake Storage Gen2. Understanding of data warehouse concepts, dimensional modelling, and data architecture. Experience working with Delta Lake and large-scale data processing. Experience building ETL pipelines in Azure Data Factory or similar orchestration tools. Familiarity with version control systems (e.g., Git) and CI/CD practices. Preferred Qualifications: Experience in a manufacturing, FMCG, or retail …
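To illustrate the Delta Lake work mentioned above, here is a minimal, hedged PySpark sketch of an upsert into a dimension table; the staging path, table name, and join key are hypothetical.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta_upsert").getOrCreate()

# Hypothetical staging data and target table name.
updates = spark.read.format("parquet").load("abfss://staging@datalake.dfs.core.windows.net/dim_customer/")
target = DeltaTable.forName(spark, "gold.dim_customer")

# Upsert staged rows into the dimension: update matching keys, insert new ones.
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```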
Birmingham, West Midlands, England, United Kingdom Hybrid / WFH Options
Broster Buchanan Ltd
time data streaming, batch data processing, and data transformation processes. Experience with core tools such as Data Factory, Databricks, Synapse, Kafka, and Python. Any exposure to data migration/ETL would be highly beneficial, with SQL/T-SQL, SSIS, SSRS, and SSAS, as there is a large data migration project planned. Any previous experience with setting up data governance … and processes would be highly beneficial. Good proficiency in Power BI and Tableau. Strong knowledge of data integration techniques and ETL/ELT processes. Experience with data modelling, data warehousing, and data governance best practices. …
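As a hedged illustration of the Kafka and Databricks streaming skills listed here, the sketch below reads a Kafka topic with Spark Structured Streaming and appends it to a Delta table; broker addresses, topic, checkpoint path, and table name are hypothetical, and it assumes the Spark Kafka connector is available on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka_stream_ingest").getOrCreate()

# Hypothetical Kafka brokers and topic.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "orders")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast the value to string before parsing downstream.
events = raw.select(col("value").cast("string").alias("payload"), col("timestamp"))

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "abfss://checkpoints@datalake.dfs.core.windows.net/orders/")
    .outputMode("append")
    .toTable("bronze.orders_stream")
)
query.awaitTermination()
```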
ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and a desire to make a significant impact, we encourage you to apply! Job Responsibilities: ETL/ELT Pipeline Development: Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow. Implement batch and real-time data processing solutions using …
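As a minimal, hedged sketch of the Python/Airflow pipeline development described above (assuming Airflow 2.4+ where the `schedule` argument is used), the DAG below wires three placeholder ETL steps together; the DAG id and task bodies are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical task bodies; in a real pipeline these would call extraction and Spark jobs.
def extract(**context):
    print("extracting batch for", context["ds"])

def transform(**context):
    print("transforming batch for", context["ds"])

def load(**context):
    print("loading batch for", context["ds"])

with DAG(
    dag_id="daily_sales_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Run the steps strictly in sequence.
    t_extract >> t_transform >> t_load
```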
Essential Skills & Experience: Proven Data Engineering Expertise: Demonstrable experience designing, building, and maintaining complex data pipelines in a production environment. Strong Technical Foundation: Expert-level SQL and proficiency in ETL principles. We currently use SQL Server/SSIS, but are on a transformation journey of our data platform (AWS). Cloud Proficiency: Hands-on experience with at least one major cloud platform … AWS, Azure, or GCP) and its core data services (e.g., S3, Redshift, Lambda/Functions, Glue). Data Modelling: Deep understanding of ELT/ETL patterns and data modelling techniques. CRM/Customer Data Focus: Experience working directly with data from CRM systems (e.g., Salesforce, Dynamics 365, Hubspot) and understanding customer data structures. Leadership Potential: Experience leading projects or mentoring …
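To illustrate the AWS data services named here (S3, Lambda, Glue), below is a minimal, hedged Python sketch of a Lambda handler that starts a Glue job for each new CRM file landing in S3; the job name, argument names, and event wiring are hypothetical.

```python
import boto3

glue = boto3.client("glue")

# Hypothetical Glue job name; the handler assumes the bucket notification is configured to invoke it.
GLUE_JOB_NAME = "crm_contacts_to_redshift"

def lambda_handler(event, context):
    """Start a Glue job run for each object dropped into the raw CRM landing bucket."""
    runs = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        response = glue.start_job_run(
            JobName=GLUE_JOB_NAME,
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
        runs.append(response["JobRunId"])
    return {"started_runs": runs}
```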
/or Data Lake. Technologies: Azure Data Factory, Azure Synapse Analytics, Azure Data Lake, Azure Databricks, and Power BI. Experience creating low-level designs for data platform implementations. ETL pipeline development for integration with data sources and data transformations, including the creation of supplementary documentation. Proficiency in working with APIs and integrating them into data pipelines. Strong programming …
standards. Lead evaluation and integration of data tools, platforms, and technologies (e.g., Snowflake, Databricks, Azure Synapse, Kafka, dbt, Power BI). Oversee data integration strategy across the enterprise, including ETL/ELT pipelines, APIs, and event-driven data streaming. Contribute to the development of a Data Center of Excellence (CoE) by sharing knowledge, reusable components, and data standards. Provide technical …
Middlesbrough, Yorkshire, United Kingdom Hybrid / WFH Options
Causeway Technologies
products. Work with the Group Architect to align team standards and processes with Causeway's, and influence the evolution of those standards and processes. Essential experience: Experience with ETL/ELT processes and frameworks. Experience with CI/CD pipelines and Infrastructure as Code, and an understanding of SDLC principles for data engineering workflows. Previous background in a similar software engineering …
Telford, Shropshire, West Midlands, United Kingdom Hybrid / WFH Options
JLA Resourcing Ltd
and mentoring a team of data engineers, setting standards and best practices. Architecting and building end-to-end data solutions and streaming platforms. Designing and implementing data pipelines and ETL workflows using Pentaho BA. Creating engaging BI reports and dashboards using Power BI. Managing cloud-based data environments (AWS, Azure), ensuring scalability and resilience. Collaborating on project planning, risk management … environments. Essential experience includes: 5+ years in data engineering with leadership responsibilities. Deep expertise in AWS and/or Azure data platforms and services. Proficiency with Pentaho BA for ETL processes and data workflows. Strong skills in Power BI, Python, and Java. Agile development experience and a pragmatic, team-first approach. Excellent problem-solving and communication skills, including the ability … SSIS, AWS or Azure Data Factory. Familiarity with Hadoop, Jenkins, or DevOps practices including CI/CD. Cloud certifications (Azure or AWS). Knowledge of additional programming languages or ETL tools. This is a fantastic opportunity to take a leadership role on meaningful, large-scale government programmes while continuing to develop your skills and experience in an inclusive, innovative environment.
important to ensure the delivery of robust, future-proof data solutions. Key Skills: Experience developing modern data stacks and cloud data platforms. Capable of engineering scalable data pipelines using ETL/ELT tools, e.g. Apache Spark, Airflow, dbt. Expertise with cloud data platforms, e.g. AWS (Redshift, Glue), Azure (Data Factory, Synapse), Google Cloud (BigQuery, Dataflow). Proficiency in data processing …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Fruition Group
important to ensure the delivery of robust, future-proof data solutions. Key Skills: Experience developing modern data stacks and cloud data platforms. Capable of engineering scalable data pipelines using ETL/ELT tools, e.g. Apache Spark, Airflow, dbt. Expertise with cloud data platforms, e.g. AWS (Redshift, Glue), Azure (Data Factory, Synapse), Google Cloud (BigQuery, Dataflow). Proficiency in data processing …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
clients to gather requirements and deliver solutions. Be willing to engage and assist in pre-sales activities, bids, proposals, etc. Use key techniques such as Governance, Architecture, Data Modelling, ETL/ELT, Data Lakes, Data Warehousing, Master Data, and BI. Consistently utilise key processes in the engineering delivery cycle, including Agile and DevOps, Git, APIs, Containers, Microservices, and Data Pipelines.
that impact the data warehouse. Ensure data accuracy, consistency, and integrity across warehouse and source systems. Maintain and evolve the data dictionary and associated metadata for the warehouse and ETL systems. Mentor and support team members to build a high-performing, resilient data function. Keep up to date with industry developments and maintain relevant technical expertise. Complete all mandatory training …
Modelling: Design and implement scalable, high-performance data warehouse and data lake architectures. Develop conceptual, logical, and physical data models to support analytical requirements. Build and optimise data pipelines (ETL/ELT) using tools such as Azure Synapse, Snowflake, Redshift, or similar. Ensure robust data governance, security, and quality management practices. Support cloud data migrations and architecture modernisation initiatives. Front …
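As a small, hedged illustration of the dimensional modelling described above, the Python sketch below splits a flat extract into a customer dimension with a surrogate key and a fact table referencing it; the column names and data are invented.

```python
import pandas as pd

# Hypothetical flat extract from a source system.
orders = pd.DataFrame({
    "order_id": [1, 2, 3],
    "order_date": ["2024-01-05", "2024-01-05", "2024-01-06"],
    "customer_name": ["Acme", "Acme", "Globex"],
    "amount": [120.0, 80.0, 200.0],
})

# Dimension: one row per customer, with a surrogate key derived from the row index.
dim_customer = (
    orders[["customer_name"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .rename_axis("customer_key")
    .reset_index()
)

# Fact: measures plus a foreign key to the customer dimension.
fact_orders = orders.merge(dim_customer, on="customer_name")[
    ["order_id", "order_date", "customer_key", "amount"]
]

print(dim_customer)
print(fact_orders)
```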
platform for data analytics, including design and deployment of infrastructure. Expertise in creating CI/CD pipelines. Experience in creating FTP (SFTP/FTPS) configurations. Experience working with ETL/ELT workflows for data analytics. Degree in Computer Science, Mathematics, or a related subject. Highly desirable skills & exposure: Working collaboratively as part of an Agile development squad. Experience and knowledge …