The Role: Designing, building and maintaining data pipelines. Building and maintaining data warehouses. Data cleansing and transformation. Developing and maintaining ETL (extract, transform, load) processes to extract, transform, and load data from various sources into data warehouses. Validating charts and reports created by systems built in-house. Creating validation tools. Developing and maintaining data models and data tools. Monitoring and … Experience in the R programming language. Experience in the Python programming language. Experience in designing, building and maintaining data pipelines. Experience with data warehousing and data lakes. Experience in developing and maintaining ETL processes. Experience in developing data integration tools. Experience in data manipulation, data analysis and data modelling. Experience with cloud platforms (AWS, Azure, etc.). Experience in designing scalable, secure, and cost …
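The responsibilities above revolve around ETL: extracting data from source systems, transforming it, and loading it into a warehouse. A minimal sketch of that pattern in Python, assuming a CSV source file and using SQLite as a stand-in for the warehouse (file, table and column names are hypothetical):

```python
import sqlite3
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    # Extract: read raw records from a source file (hypothetical orders.csv)
    return pd.read_csv(path, parse_dates=["order_date"])

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Transform: cleanse and reshape before loading
    df = df.dropna(subset=["customer_id", "amount"])              # drop incomplete rows
    df["amount"] = df["amount"].round(2)                          # normalise currency precision
    df["order_month"] = df["order_date"].dt.to_period("M").astype(str)  # derive a reporting key
    return df

def load(df: pd.DataFrame, con: sqlite3.Connection) -> None:
    # Load: append cleansed rows into the warehouse fact table
    df.to_sql("fact_orders", con, if_exists="append", index=False)

if __name__ == "__main__":
    con = sqlite3.connect("warehouse.db")  # stand-in for the real warehouse
    load(transform(extract("orders.csv")), con)
    con.close()
```

In practice the load step would target the organisation's warehouse (Redshift, Synapse, Snowflake, etc.) and the pipeline would be scheduled and monitored rather than run by hand.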
Work as part of a collaborative team to solve problems and assist other colleagues. • Ability to learn new technologies, programs and procedures. Technical Essentials: • Expertise across data warehouse and ETL/ELT development, AWS preferred, with experience in the following: • Strong experience in some of the AWS services such as Redshift, Lambda, S3, Step Functions, Batch, CloudFormation, Lake Formation …
Knutsford, Cheshire, North West, United Kingdom Hybrid / WFH Options
The Veterinary Defence Society
improve data processes. Contribute to project teams, ensuring data requirements are addressed from the outset. Continuously improve BI tools and data engineering practices. Required Skills & Experience: Proficiency in SQL, ETL processes, data warehousing, and data modelling (MS SQL preferred). Proven experience in data engineering or analysis. Strong analytical and problem-solving skills. Excellent communication skills, able to explain technical concepts …
Spark, Databricks, or similar data processing tools. Strong technical proficiency in data modeling, SQL, NoSQL databases, and data warehousing. Hands-on experience with data pipeline development, ETL processes, and big data technologies (e.g., Hadoop, Spark, Kafka). Proficiency in cloud platforms such as AWS, Azure, or Google Cloud and cloud-based …
Middlesbrough, Yorkshire, United Kingdom Hybrid / WFH Options
Causeway Technologies
products. Work with the Group Architect to align team standards and processes to Causeway's, and influence the evolution of Causeway's standards and processes. Essential experience: Experience with ETL/ELT processes and frameworks. Experience with CI/CD pipelines and Infrastructure as Code, and an understanding of SDLC principles in data engineering workflows. Previous background in a similar software engineering …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
organisation to recruit a Junior Data Engineer. Hybrid working - London based. Role: The Junior Data Engineer will assist in the design, development, and maintenance of scalable data pipelines and ETL/ELT processes in Azure. Write efficient and reliable SQL queries for data extraction, transformation, and analysis. Support data integration from various sources (internal systems, third-party vendors) into centralised …
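To illustrate the kind of SQL extraction-and-transformation work the listing describes, a small self-contained example using SQLite in place of the Azure database; the sales table and its columns are invented for the sketch:

```python
import sqlite3

# Hypothetical sales table standing in for a centralised Azure data store
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, sold_at TEXT, amount REAL);
    INSERT INTO sales (region, sold_at, amount) VALUES
        ('North', '2024-01-05', 120.0),
        ('North', '2024-02-11',  80.5),
        ('South', '2024-01-20', 200.0);
""")

# Extraction and transformation in one query: filter, derive a month key, aggregate
query = """
    SELECT region,
           strftime('%Y-%m', sold_at) AS sales_month,
           ROUND(SUM(amount), 2)      AS total_amount,
           COUNT(*)                   AS order_count
    FROM sales
    WHERE amount > 0
    GROUP BY region, sales_month
    ORDER BY region, sales_month;
"""
for row in con.execute(query):
    print(row)
```

The same query shape (filter, derive a key, aggregate) carries over to Azure SQL or Synapse with only dialect-level changes.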
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
clients to gather requirements and deliver solutions. Be willing to engage and assist in pre-sales activities, bids, proposals etc. Use key techniques such as Governance, Architecture, Data Modelling, ETL/ELT, Data Lakes, Data Warehousing, Master Data, and BI. Consistently utilise key processes in the engineering delivery cycle including Agile and DevOps, Git, APIs, Containers, Microservices and Data Pipelines.
that impact the data warehouse. Ensure data accuracy, consistency, and integrity across warehouse and source systems. Maintain and evolve the data dictionary and associated metadata for the warehouse and ETL systems. Mentor and support team members to build a high-performing, resilient data function. Keep up to date with industry developments and maintain relevant technical expertise. Complete all mandatory training …
and develops proactive solutions for data quality and system performance issues. Act as a liaison between departments, facilitating communication and ensuring alignment on data-driven goals. Support and enhance ETL (Extract, Transform, Load) processes for seamless data integration. Provide ongoing training and guidance to individuals and teams on using data, tools, dashboards, and reports. Design, refine, and implement interactive reports …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Actica Consulting Limited
discuss how your expertise could strengthen our growing data practice. As a data engineer/scientist, you will: Data Engineering Focus: Design, implement, and maintain scalable data pipelines and ETL processes. Develop and maintain data warehouses and data lakes. Implement data quality monitoring and validation systems. Create and maintain data documentation and cataloguing systems. Optimize data storage and retrieval systems …
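For the data quality monitoring and validation mentioned above, a simple illustrative check pass over a pipeline output; the column names, rules and thresholds are assumptions rather than any specific framework:

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list[str]:
    issues = []
    # Completeness: key columns must not contain nulls
    for col in ("record_id", "event_date"):
        if df[col].isna().any():
            issues.append(f"null values found in {col}")
    # Uniqueness: the primary key must not be duplicated
    if df["record_id"].duplicated().any():
        issues.append("duplicate record_id values")
    # Validity: dates must parse and must not be in the future
    dates = pd.to_datetime(df["event_date"], errors="coerce")
    if dates.isna().any() or (dates > pd.Timestamp.now()).any():
        issues.append("invalid or future event_date values")
    return issues

# Small sample: flags the duplicate id and the future date
sample = pd.DataFrame({
    "record_id": [1, 2, 2],
    "event_date": ["2024-03-01", "2024-03-02", "2099-01-01"],
})
print(run_quality_checks(sample))
```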
East Horsley, Surrey, United Kingdom Hybrid / WFH Options
Actica Consulting Limited
discuss how your expertise could strengthen our growing data practice. As a data engineer/scientist, you will: Data Engineering Focus: Design, implement, and maintain scalable data pipelines and ETL processes. Develop and maintain data warehouses and data lakes. Implement data quality monitoring and validation systems. Create and maintain data documentation and cataloguing systems. Optimize data storage and retrieval systems …
visualisations and narratives. Support change management, adoption, and user training. Technical Skills (Required/Desirable): Azure Data Factory, Synapse, Databricks, Microsoft Fabric. Data warehouse & lakehouse architecture. Strong SQL, Python, ETL/ELT knowledge. Power BI and data visualisation tools. Agile delivery experience. You’ll Have: 5+ years in Microsoft data technologies. Strong consulting and stakeholder management experience. Excellent documentation and …
Burton-on-Trent, Staffordshire, England, United Kingdom Hybrid / WFH Options
Crimson
in Azure Data Pipeline development is key for this position. Key Skills & Responsibilities: Build and manage pipelines using Azure Data Factory, Databricks, CI/CD, and Terraform. Optimisation of ETL processes for performance and cost-efficiency. Design scalable data models aligned with business needs. Azure data solutions for efficient data storage and retrieval. Ensure compliance with data protection laws (e.g. …
Reading, Oxfordshire, United Kingdom Hybrid / WFH Options
Henderson Drake
initiatives. Ensure high standards of documentation and data security compliance. Technical Skills (desirable): Microsoft Azure Data Services (e.g., Azure Data Factory, Synapse, Databricks, Fabric). Data warehousing and lakehouse design. ETL/ELT pipelines. SQL, Python for data manipulation and machine learning. Big Data frameworks (e.g., Hadoop, Spark). Data visualisation (e.g., Power BI). Understanding of statistical analysis and predictive modelling. Experience …
Reading, Berkshire, United Kingdom Hybrid / WFH Options
Henderson Drake
initiatives. Ensure high standards of documentation and data security compliance. Technical Skills (desirable): Microsoft Azure Data Services (e.g., Azure Data Factory, Synapse, Databricks, Fabric). Data warehousing and lakehouse design. ETL/ELT pipelines. SQL, Python for data manipulation and machine learning. Big Data frameworks (e.g., Hadoop, Spark). Data visualisation (e.g., Power BI). Understanding of statistical analysis and predictive modelling. Experience …
platforms. Expertise in modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures. Proficiency in SQL and at least one programming language (e.g., Python, Scala, or Java). Demonstrated experience owning complex technical systems end-to-end, from …
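As a sketch of how an orchestrator like Airflow (named in the listing above) wires pipeline steps together, a minimal DAG with two dependent tasks; it assumes Airflow 2.4+ and uses placeholder task bodies:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull raw records from a source system
    print("extracting...")

def load():
    # Placeholder: write transformed records to the warehouse
    print("loading...")

with DAG(
    dag_id="example_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # run once per day
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```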
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Motability Operations Limited
Product Owner in Agile environments. Ideally experience managing Finance and data products or platforms, ideally data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Strong understanding of data quality frameworks, data contracts, and lineage. Proficient in using analytics and BI tools (e.g., Tableau, Power BI, Looker) to drive product …
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Motability Operations
Product Owner in Agile environments. Ideally experience managing Finance and data products or platforms, ideally data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Strong understanding of data quality frameworks, data contracts, and lineage. Proficient in using analytics and BI tools (e.g., Tableau, Power BI, Looker) to drive product …
Employment Type: Permanent, Part Time, Work From Home
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
Product Owner in Agile environments. Ideally experience managing Finance and data products or platforms, ideally data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Strong understanding of data quality frameworks, data contracts, and lineage. Proficient in using analytics and BI tools (e.g., Tableau, Power BI, Looker) to drive product …
Employment Type: Permanent, Part Time, Work From Home
scalability, security, and reliability. What we're looking for - Proven experience in data modelling and data engineering with a strong grasp of modern cloud architectures. - Expertise in data pipelines, ETL/ELT, and working with structured & unstructured data. - Strong skills in SQL, Python, Spark, or other relevant technologies. - Prior exposure to Microsoft Fabric is a huge plus -- but if you … of NHS information and associated datasets. 5. Collaborate with Data Analysts, and other stakeholders to understand data requirements and translate them into technical solutions. 6. Develop and implement efficient ETL (Extract, Transform, Load) processes to integrate data from various sources into centralised data repositories. 7. Document data architecture, processes, and workflows for reference and knowledge sharing. 8. Utilise programming languages … Extensive experience in data engineering within the healthcare sector, with a focus on NHS data systems. Proven track record of designing, developing, and maintaining large-scale data pipelines and ETL processes. In-depth knowledge of data modelling, database design, and data warehousing principles. Familiarity with healthcare data standards and compliance regulations. Significant experience of extracting data, manipulating, understanding, transforming, wrangling …
and management. Proficient SQL and/or Python capabilities - minimum 2-3 years hands-on experience. Comprehensive Data Engineering background - proven track record in enterprise data solutions. Experience with ETL processes and data transformation, preferably using Databricks. Strong foundation in Data Warehousing architectures and dimensional modeling. Familiarity with batch processing from relational database sources. Communication & Collaboration Skills of the Data …
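Dimensional modeling, mentioned above, typically means a star schema: a central fact table keyed to surrounding dimension tables. A small illustrative schema using SQLite; the table and column names are hypothetical:

```python
import sqlite3

# A tiny star schema: one fact table with foreign keys to two dimensions.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_name TEXT,
        region        TEXT
    );
    CREATE TABLE dim_date (
        date_key  INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240105
        full_date TEXT,
        month     TEXT
    );
    CREATE TABLE fact_sales (
        sale_id      INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        date_key     INTEGER REFERENCES dim_date(date_key),
        amount       REAL
    );
""")

# A typical analytical query joins the fact table to its dimensions
query = """
    SELECT c.region, d.month, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    JOIN dim_date d     ON d.date_key = f.date_key
    GROUP BY c.region, d.month;
"""
print(con.execute(query).fetchall())  # empty until the tables are populated
```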
data platforms in the cloud using the Azure D&A stack, Databricks and Azure OpenAI solutions. Proficiency in coding (Python, PL/SQL, Shell Script), relational and non-relational databases, ETL tooling (such as Informatica), and scalable data platforms. Proficiency in the Azure Data and Analytics stack; working knowledge of AWS and GCP data solutions. Good understanding of deploying AI solutions in Azure …
South East London, London, United Kingdom Hybrid / WFH Options
Datatech Analytics
knowledge of Microsoft Fabric, Azure Data Factory, Power BI, and related Azure tools. Strong proficiency in SQL, Spark SQL, and Python for data processing and automation. Solid understanding of ETL/ELT workflows, data modelling, and structuring datasets for analytics. Experience working with large, complex datasets and APIs across formats (CSV, JSON, Parquet, etc.). Familiarity with workflow automation tools (e.g. …
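For the Spark SQL and Parquet handling mentioned above, a minimal PySpark sketch; the file path and column names are assumptions:

```python
from pyspark.sql import SparkSession

# Start a local Spark session (on Fabric/Databricks a session is provided for you)
spark = SparkSession.builder.appName("events-summary").getOrCreate()

# Read a columnar dataset (Parquet) and expose it to Spark SQL
events = spark.read.parquet("/data/events.parquet")
events.createOrReplaceTempView("events")

# Spark SQL handles the transformation; the result could feed a Power BI dataset
summary = spark.sql("""
    SELECT event_type,
           date_trunc('month', event_time) AS event_month,
           COUNT(*) AS event_count
    FROM events
    GROUP BY event_type, date_trunc('month', event_time)
""")
summary.show()
spark.stop()
```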