Essential Skills & Experience: Proven Data Engineering Expertise: Demonstrable experience designing, building, and maintaining complex data pipelines in a production environment. Strong Technical Foundation: Expert-level SQL and proficiency in ETL principles. We currently use SQLSvr/SSIS, but are on a transformation journey of our data platform (AWS). Cloud Proficiency: Hands-on experience with at least one major cloud platform (e.g., AWS, Azure, or GCP) and its core data services (e.g., S3, Redshift, Lambda/Functions, Glue). Data Modelling: Deep understanding of ELT/ETL patterns and data modelling techniques. CRM/Customer Data Focus: Experience working directly with data from CRM systems (e.g., Salesforce, Dynamics 365, HubSpot) and understanding customer data structures. Leadership Potential: Experience leading projects or mentoring …
/or Data Lake. Technologies: Azure Data Factory, Azure Synapse Analytics, Azure Data Lake, Azure Databricks and Power BI. Experience creating low-level designs for data platform implementations. ETL pipeline development for integration with data sources and data transformations, including the creation of supplementary documentation. Proficiency in working with APIs and integrating them into data pipelines. Strong programming …
Knutsford, Cheshire, North West, United Kingdom Hybrid / WFH Options
The Veterinary Defence Society
improve data processes Contribute to project teams, ensuring data requirements are addressed from the outset Continuously improve BI tools and data engineering practices Required Skills & Experience Proficiency in SQL, ETL processes, data warehousing, and data modelling (MS SQL preferred) Proven experience in data engineering or analysis Strong analytical and problem-solving skills Excellent communication skills, able to explain technical concepts …
Chelsea and Westminster Hospital NHS Foundation Trust
objectives and user iteration Experience managing dependencies, risks, and resource constraints in a fast-paced environment Experience using SQL programming and understanding of data engineering concepts, data pipelines, ETL/ELT processes, and data warehousing Desirable Understanding of data modeling, dimensional modeling, and performance tuning. Skills and knowledge Essential Proficiency in PRINCE2 for project management and Excel, plus other …
Spark, Databricks, or similar data processing tools. Strong technical proficiency in data modeling, SQL, NoSQL databases, and data warehousing. Hands-on experience with data pipeline development, ETL processes, and big data technologies (e.g., Hadoop, Spark, Kafka). Proficiency in cloud platforms such as AWS, Azure, or Google Cloud and cloud-based …
environments. Database Design & Optimisation: Design and optimise complex SQL queries and relational databases (e.g., Amazon Redshift, PostgreSQL, MySQL) to enable fast, efficient data retrieval and analytics. Data Transformation: Apply ETL/ELT processes to transform raw financial data into usable insights for business intelligence, reporting, and predictive analytics. Collaboration with Teams: Work closely with the platform team, data analysts, and business … flow and operation. Requirements Several years of experience in data engineering, preferably in the financial services or similar regulated industries. Strong understanding of data engineering concepts, including data modelling, ETL/ELT processes, and data warehousing. Proven experience with AWS services (e.g., S3, Redshift, Lambda, ECS, ECR, SNS, EventBridge, CloudWatch, Athena, etc.) for building and maintaining scalable data solutions in … the cloud. Technical Skills (must have): Python: Proficient in Python for developing custom ETL solutions, data processing, and integration with cloud platforms. Terraform: Experience with Terraform to manage infrastructure as code, ensuring scalable and repeatable cloud environment provisioning. SQL: Advanced proficiency in SQL for querying and optimising relational databases. Version Control: Experience with GitHub for managing code, reviewing pull requests …
stored, and used across the organisation to support strategic decisions, ensure compliance, and drive digital transformation. What You’ll Be Doing Designing and managing SQL-based data infrastructure and ETL processes Developing and maintaining data models, stored procedures, and integrated data pipelines Building insightful dashboards and visualisations using Power BI or SSRS Supporting internal and external reporting needs, including funding … and transformation initiatives What We’re Looking For Strong experience with SQL Server databases and data warehousing Proficiency in Power BI, SSRS, or similar reporting tools Solid understanding of ETL techniques and data integration Knowledge of data governance, validation methods, and data protection standards Excellent communication skills, especially when translating technical data for non-technical users A proactive, detail-oriented …
generation technology. Data Support: Assist the Head of Data and Technology as a Subject Matter Expert in Data. Act as a senior custodian of the WRBU Data Platform; administering ETL/ELT pipelines, completing peer/code reviews, ensuring the integrity of WRBU data, and holding accountability for data integrity and accuracy. Data Warehousing: Help manage aspects of the data warehousing landscape and data movements, including assisting with solution design for new and existing requirements. Collaborate with the business to understand needs. Design and develop ETL/ELT processes, data lake storage architecture, data warehouse, data marts, cubes, reports and dashboards, in line with the company data management framework and enterprise data strategy. MI Development and Maintenance: Support the development …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
organisation to recruit a Junior Data Engineer. Hybrid working - London based. Role: The Junior Data Engineer will assist in the design, development, and maintenance of scalable data pipelines and ETL/ELT processes in Azure. Write efficient and reliable SQL queries for data extraction, transformation, and analysis. Support data integration from various sources (internal systems, third-party vendors) into centralised …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
clients to gather requirements and deliver solutions. Be willing to engage and assist in pre-sales activities, bids, proposals, etc. Use key techniques such as Governance, Architecture, Data Modelling, ETL/ELT, Data Lakes, Data Warehousing, Master Data, and BI. Consistently utilise key processes in the engineering delivery cycle, including Agile and DevOps, Git, APIs, Containers, Microservices and Data Pipelines.
that impact the data warehouse. Ensure data accuracy, consistency, and integrity across warehouse and source systems. Maintain and evolve the data dictionary and associated metadata for the warehouse and ETL systems. Mentor and support team members to build a high-performing, resilient data function. Keep up to date with industry developments and maintain relevant technical expertise. Complete all mandatory training …
platform for data analytics, including design and deployment of infrastructure. Expertise in creating CI/CD pipelines. Experience in creating FTP (SFTP/FTPS) configurations. Experience in working with ETL/ELT workflows for data analytics. Degree in Computer Science, Mathematics, or a related subject. Highly desirable skills & exposure: Working collaboratively as part of an Agile development squad. Experience and knowledge …
plus. Experience in data modeling and warehouse architecture design. Experience with data orchestration and workflow management tools such as Airflow. Solid understanding of data ingestion, transformation, and ELT/ETL practices. Expertise in data transformation frameworks such as dbt. Experience with Infrastructure as Code (IaC) tools, Terraform preferred. Experience with version control systems, GitHub preferred. Experience with continuous integration and delivery …
and develops proactive solutions for data quality and system performance issues. Act as a liaison between departments, facilitating communication and ensuring alignment on data-driven goals. Support and enhance ETL (Extract, Transform, Load) processes for seamless data integration. Provide ongoing training and guidance to individuals and teams on using data, tools, dashboards, and reports. Design, refine, and implement interactive reports …
Required Skills: Proficiency in BI tools (Power BI, Tableau, QlikView) and data visualization techniques Strong SQL skills and experience with relational databases (e.g., SQL Server, Oracle, MySQL) Familiarity with ETL processes and data warehousing concepts Experience in data modeling and designing effective reporting structures Knowledge of programming languages such as Python or R for data manipulation and analysis Strong analytical …
of actuaries, data scientists and developers. Our role in this mission is to pioneer advancements in the field of pensions and beyond, leveraging state-of-the-art technology to extract valuable and timely insights from data. This enables the consultant to better advise Trustees and Corporate clients on a wide range of actuarial-related areas. The Role As a Machine … model drift, data-quality alerts, scheduled re-training pipelines. Data Management and Preprocessing. Collect, clean and preprocess large datasets to facilitate analysis and model training. Implement data pipelines and ETL processes to ensure data availability and quality. Software Development. Write clean, efficient and scalable code in Python. Utilize CI/CD practices for version control, testing and code review. Work …
City of London, London, United Kingdom Hybrid / WFH Options
Datatech Analytics
SQL, Spark SQL, and Python for data processing and automation Knowledge of Microsoft Fabric and Azure Data Factory would be useful but not essential Power BI Solid understanding of ETL/ELT workflows, data modelling, and structuring datasets for analytics Experience working with large, complex datasets and APIs across formats Familiarity with workflow automation tools (e.g., Power Automate) and/or …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Actica Consulting Limited
discuss how your expertise could strengthen our growing data practice. As a data engineer/scientist, you will: Data Engineering Focus: Design, implement, and maintain scalable data pipelines and ETL processes Develop and maintain data warehouses and data lakes Implement data quality monitoring and validation systems Create and maintain data documentation and cataloguing systems Optimize data storage and retrieval systems …