are looking for someone who can demonstrate experience in the following areas: Commercial experience with implementing Fabric Strong Azure experience - Ideally using ADF, Databricks, ADLS etc. Data Engineering background - ETL development, data storage platforms such as Data Warehouse, Lake, or Lakehouse architectures You will ideally have come from a consultancy background, and therefore understand how to balance multiple projects with More ❯
quality best practices ✅ Excellent communication and stakeholder engagement skills ✅ Active SC Clearance Bonus Points For: Experience working in government, defence, or secure environments Familiarity with data migration tools and ETL processes Hands-on involvement in ERP implementations or upgrades Sound like you? Apply now or reach out directly for a confidential discussion. This is a high-profile programme that will More ❯
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
S3, Redshift, Athena, EMR. Snowflake experience (Not essential) Strong Python SQL & NoSQL databases CI/CD tools: GitHub Actions or CodePipeline Infrastructure as Code (Terraform is preferable) Experience building ETL/ELT pipelines Familiarity with data lake and data warehouse concepts Excellent communication and stakeholder engagement skills Benefits: Hybrid working (one day per week in Leeds office) Clear career progression More ❯
Salisbury, Wiltshire, South West, United Kingdom Hybrid / WFH Options
Anson Mccade
lakehouse architectures Strong grasp of cloud data platforms (AWS, Azure, GCP, Snowflake) Understanding of Data Mesh, Data Fabric, and data product-centric approaches Familiarity with Apache Spark, Python, and ETL/ELT pipelines Strong knowledge of data governance, lifecycle management, and compliance (e.g. GDPR) Consulting experience delivering custom data solutions across sectors Excellent leadership, communication, and Agile delivery experience Bonus More ❯
Gloucester, Gloucestershire, United Kingdom Hybrid / WFH Options
Pro Insurance
processes for various bordereaux (both Risk and Claims) using data ingestion and analysis tools such as Intrali, Quantemplate, or Matillion. The project focuses on the extraction, transformation, and loading (ETL) of data from various sources into data warehouses and data lakes. This is an ideal role for an aspiring Data Architect in the Insurance Industry. Pro operates a hybrid working More ❯
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Pro Insurance
processes for various bordereaux (both Risk and Claims) using data ingestion and analysis tools such as Intrali, Quantemplate, or Matillion. The project focuses on the extraction, transformation, and loading (ETL) of data from various sources into data warehouses and data lakes. This is an ideal role for an aspiring Data Architect in the Insurance Industry. Pro operates a hybrid working More ❯
of Database and Middleware systems Keeping database systems up-to-date with the latest service packs and security updates Troubleshooting issues with both databases and third party applications Managing ETL processes Key player in the migration of database systems to Azure Building and migration of SQL servers to appropriate versions Ensure database systems (Dev/Test/UAT/Live More ❯
ingestion and transformation to warehousing and modelling - all while ensuring data quality, integrity, and alignment with business goals. Key responsibilities will include: Leading the design, development, and deployment of ETL pipelines across diverse business systems Driving the data platform roadmap and delivery using best-in-class engineering practices Managing a high-performing data engineering team and mentoring technical talent Collaborating More ❯
. The Data Engineer will help drive the build of effortless, digital first customer experiences, simplifying the organisation by developing innovative data driven solutions through data pipelines, modelling and ETL design, inspiring to be commercially successful through insight and keep customers and the organisation's data safe and secure. What you'll do: Build advanced automation of data engineering pipelines More ❯
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Hunter Selection
party vendors Key Skills & Experience: Proven background designing and delivering enterprise-scale data platforms Strong knowledge of Microsoft Azure, Databricks, Delta Lake, and data warehousing Advanced data modelling and ETL/ELT optimisation experience Familiarity with regulatory frameworks such as IFRS 17, BCBS 239, and UK Data Protection Excellent stakeholder communication and cross-functional collaboration skills Prior experience in a More ❯
Employment Type: Permanent
Salary: £10000 - £85000/annum 33 days holiday, bonus + more
of their consulting offering and analytical capabilities as the company grows. Skills & Experience Python (incl. pandas, numpy, fastapi, dash/plotly) Database development: e.g. SQL, PostgreSQL, SQLAlchemy, data warehousing, ETL pipelines Cloud computing & DevOps: e.g. AWS (EC2, Lambda, S3), Docker, CI/CD, serverless architecture Frontend development: JavaScript/TypeScript, React, Dash, or similar frameworks Backend APIs: FastAPI, Flask, RESTful More ❯
tangible action plans. About You: Proven experience delivering mid-sized change projects, ideally in data or technology-rich environments. Solid understanding of modern data platforms (e.g., data lakes, warehouses, ETL pipelines). Familiarity with data management principles including data quality, lineage, and governance. Strong analytical skills to interpret complex data and influence stakeholders. Deep understanding of data technologies with excellent More ❯
Qualifications Experience in MarTech, AdTech, or B2B marketing technology, with a background in marketing data and analytics platforms. Familiarity with prompt engineering, LLM optimisation, and designing data processing pipelines (ETL). Agile development experience (Scrum, Kanban) and a track record of integrating AI into business applications. What you need to do now If you're interested in this role, click More ❯
Salesforce DX, and CI/CD pipelines. Strong understanding of Salesforce platform capabilities including Flow, roles, permissions, reports, and dashboards. Experience with Salesforce integration patterns (API/Middleware/ETL). Ability to translate business needs into platform solutions with minimal supervision. Excellent communication, stakeholder engagement, and documentation skills. Comfortable working in agile delivery teams and managing competing priorities. Desirable More ❯
and improve fit-for-purpose data that powers our products Design and implement proactive data quality management supported by client-focused metrics and process observability Develop data modeling and ETL skills within the team and collaborate with Engineering teams to develop a robust data product that can deliver value across multiple client-facing products Build processes and standards that enable More ❯
Stroud, England, United Kingdom Hybrid / WFH Options
Ecotricity
as a Data Analyst Managing and manipulating data experience Strong working knowledge of SQL for data analysis (MS-SQL/Databricks - T-SQL, View and table design, Experience of ETL) Expert with Power BI Advanced MS Excel (Power Query, Power Pivots) Stakeholder management and facilitation of decisions of all sizes Strong presentation skills for engaging senior stakeholders Ability to handle high More ❯
making business recommendations and influencing stakeholders - Experience in defining requirements, creating business requirements documents, understanding business processes and using data and metrics to draw business insights - Proficiency in SQL, ETL management, Excel (including VBA, pivot tables, array functions, power pivots, etc.) and data visualisation tools such as Tableau PREFERRED QUALIFICATIONS - MBA or Master's degree in Computer Science, Engineering More ❯
Relevant work experience in data science, machine learning, and business analytics Practical experience in coding languages e.g. Python, R, Scala, etc. (Python preferred) Proficiency in database technologies e.g. SQL, ETL, NoSQL, DW, and Big Data technologies e.g. PySpark, Hive, etc. Experienced working with structured and unstructured data e.g. text, PDFs, JPGs, call recordings, video, etc. Knowledge of machine More ❯
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Behavioural Insights Team
products or applications, at scale. Strong knowledge of the Python Data Science stack (e.g., pandas/polars, scikit-learn). Ability to independently develop and maintain robust Python-based ETL/ELT data pipelines. Ability to independently develop LLM-based tools/products (e.g., RAG workflows). Familiarity with version control tools such as Git/GitHub. In addition, we More ❯
you are a beacon of collaboration. BONUS POINTS Experience project managing the delivery of technical solutions, including exposure to agile product development methodologies Experience building underlying data pipelines and ETL, particularly useful if done using Amazon Web Services, Adverity, DBT, etc. Knowledge and experience using other programming and/or statistical languages (e.g. Python, R, etc.) Life at WPP Media More ❯
closely with the Data Architect to collaborate on Design of our data architecture and interpret into a build plan Lead the build and maintenance of scalable data pipelines and ETL processes to support data integration and analytics from a diverse range of data sources, Cloud storage, databases and APIs Deliver large-scale data processing workflows (ingestion, cleansing, transformation, validation, storage More ❯