the Role Designing, building and maintaining data pipelines. Building and maintaining data warehouses. Data cleansing and transformation. Developing and maintaining ETL (extract, transform, load) processes to extract, transform, and load data from various sources into data warehouses. Validating charts and reports created by systems built in-house. Creating validation tools. Developing and maintaining data models and data tools. Monitoring and … Experience in the R programming language. Experience in the Python programming language. Experience in designing, building and maintaining data pipelines. Experience with data warehousing and data lakes. Experience in developing and maintaining ETL processes. Experience in developing data integration tools. Experience in data manipulation, data analysis and data modelling. Experience with cloud platforms (AWS, Azure, etc.). Experience in designing scalable, secure, and cost …
Knutsford, Cheshire, North West, United Kingdom Hybrid / WFH Options
The Veterinary Defence Society
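The listing above defines ETL as extract, transform, load into a warehouse. A minimal sketch of that pattern, assuming a hypothetical CSV feed and an SQLite table standing in for a real warehouse:

```python
import csv
import io
import sqlite3

# Extract: read rows from a source (an in-memory CSV standing in for a real feed).
SOURCE_CSV = "id,amount\n1,10.50\n2,not_a_number\n3,7.25\n"

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

# Transform: cleanse and type-cast, dropping rows that fail validation.
def transform(rows):
    clean = []
    for row in rows:
        try:
            clean.append((int(row["id"]), float(row["amount"])))
        except ValueError:
            continue  # data cleansing: skip malformed records
    return clean

# Load: write the cleansed rows into a warehouse table.
def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS fact_sales (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO fact_sales VALUES (?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract(SOURCE_CSV)), conn)
    print(conn.execute("SELECT COUNT(*), SUM(amount) FROM fact_sales").fetchone())
    # → (2, 17.75): the malformed row is dropped during transformation
```

The table name and cleansing rule are illustrative only; in practice each stage would be a separately monitored, restartable step.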
improve data processes Contribute to project teams, ensuring data requirements are addressed from the outset Continuously improve BI tools and data engineering practices Required Skills & Experience Proficiency in SQL, ETL processes, data warehousing, and data modelling (MS SQL preferred) Proven experience in data engineering or analysis Strong analytical and problem-solving skills Excellent communication skills, able to explain technical concepts …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
clients to gather requirements and deliver solutions. Be willing to engage and assist in pre-sales activities, bids, proposals etc. Use key techniques such as Governance, Architecture, Data Modelling, ETL/ELT, Data Lakes, Data Warehousing, Master Data, and BI. Consistently utilise key processes in the engineering delivery cycle including Agile and DevOps, Git, APIs, Containers, Microservices and Data Pipelines.
platform for data analytics, including design and deployment of infrastructure. Expertise in creating CI/CD pipelines. Experience in creating FTP (SFTP/FTPS) configurations. Experience in working with ETL/ELT workflows for data analytics. Degree in Computer Science, Mathematics or related subject. Highly desirable skills & exposure: Working collaboratively as part of an Agile development squad. Experience and knowledge …
ideally within internal consultancy or transformation environments. Strong experience with PostgreSQL , Power BI , and low-code platforms (Budibase or similar). Solid programming skills, preferably in Python , especially for ETL development and support. Proficiency with version control systems (e.g., GitHub ), with an understanding of best practices for collaboration, review, and deployment. Familiarity with REST APIs , including how to design, consume …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Dept Holding B.V
of leading complex data initiatives Strong technical leadership experience, with the ability to guide and develop engineering teams Deep expertise in designing and implementing data architectures, data pipelines, and ETL processes Hands-on experience with cloud platforms (GCP, AWS, Azure) and their data-specific services Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, DBT) Experience with modern data …
in a consultancy and programme-delivery context. Experience with Zoho Analytics and Salesforce is a bonus. Key Responsibilities 1) Data Management & Reporting Write and maintain efficient SQL queries to extract, transform, and load (ETL) data from various sources Develop scalable and insightful dashboards and reports in Power BI to support programme performance tracking, operations, and compliance Monitor and troubleshoot data …
reporting in a consultancy and programme-delivery context. Experience with Zoho Analytics and Salesforce is a bonus. Key Responsibilities Data Management & Reporting Write and maintain efficient SQL queries to extract, transform, and load (ETL) data from various sources Develop scalable and insightful dashboards and reports in Power BI to support programme performance tracking, operations, and compliance Monitor and troubleshoot data …
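These listings centre on writing SQL that extracts and transforms data into reporting-ready shape for Power BI. A hedged sketch of the idea, with hypothetical table and column names, using SQLite purely for illustration:

```python
import sqlite3

# A small stand-in source table; in the role this would be a programme-delivery database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE enrolments (programme TEXT, status TEXT, fee REAL);
    INSERT INTO enrolments VALUES
        ('Maths',   'complete', 100.0),
        ('Maths',   'active',    90.0),
        ('Science', 'complete',  80.0);
""")

# A reporting query that extracts raw rows and transforms them in SQL
# (aggregation plus a derived completion rate), ready to load into a BI dataset.
REPORT_SQL = """
    SELECT programme,
           COUNT(*)  AS enrolments,
           SUM(fee)  AS total_fees,
           AVG(CASE WHEN status = 'complete' THEN 1.0 ELSE 0.0 END) AS completion_rate
    FROM enrolments
    GROUP BY programme
    ORDER BY programme
"""

for row in conn.execute(REPORT_SQL):
    print(row)
# → ('Maths', 2, 190.0, 0.5)
# → ('Science', 1, 80.0, 1.0)
```

Pushing the aggregation into SQL like this keeps the dashboard layer thin: Power BI only renders the pre-shaped result set rather than recomputing measures over raw rows.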
our technologies, workplaces, and colleagues to make our Group a great place for everyone. Including you. What you'll need Demonstrable experience designing and building scalable data pipelines and ETL/ELT workflows using modern data engineering best practices. Strong understanding of data modelling principles and experience designing data architectures for analytics and operational workloads. Deep familiarity with Google Cloud …
implementing data engineering best-practice (e.g., source-to-target mappings, coding standards, data quality, etc.), working closely with the external party who set up the environment. Create and maintain ETL processes, data mappings & transformations to orchestrate data integrations. Ensure data integrity, quality, privacy, and security across systems, in line with client and regulatory requirements. Optimize data solutions for performance and … up monitoring and data quality exception handling. Strong data modelling experience. Experience managing and developing CI/CD pipelines. Experience with Microsoft Azure products and services, and proficiency in ETL processes. Experience of working with APIs to integrate data flows between disparate cloud systems. Strong analytical and problem-solving skills, with the ability to work independently and collaboratively. The aptitude …
and stakeholder engagement And any experience of these would be really useful Experience with telephony platforms (e.g. Genesys, Twilio, Avaya, Nuance) Infrastructure as Code (Terraform, Ansible, Puppet) Data engineering (ETL, SQL, Power BI, Tableau) Secure-by-design principles and architectural patterns Interest in AI, IoT, or service mesh (e.g. Istio) NodeJS A passion for delivering business value through sound engineering …
field. Strong analytical and problem-solving skills. Good database and SQL skills Experience with data visualization tools (e.g., Power BI, Tableau, Looker). Basic understanding of data modelling and ETL concepts. Good communication and documentation skills. If you're interested, please apply below! INDMANJ 49567NB
Terraform Familiar with Cloud security, performance optimisation and monitoring tools (Azure Monitor, Log Analytics). Experience in IT service management environments (ITIL). Cross-functional project delivery. Experience with ETL/ELT processes and data modelling. Understanding of data governance, security, and compliance frameworks. Stakeholder Management. What you'll get in return In return, you will be rewarded with ongoing …
North West London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
ll Be Doing: Leading cross-functional teams of architects, engineers, and analysts to design and deliver scalable, modern data solutions Architecting robust cloud-native data platforms using modern RDBMS, ETL/ELT, and streaming tech Engaging stakeholders and owning the technical vision across full project lifecycles Shaping product-led data strategies using Data Mesh, Data Fabric, and event-driven patterns
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
design, content, and new learning features through robust A/B testing and statistical significance testing.
- Develop and maintain dashboards using Power BI, supported by SQL and Python-based ETL pipelines.
- Help shape a centralised Learning Analytics Framework that tracks key KPIs on learner success, engagement, and satisfaction.
- Translate complex analysis into accessible insight for stakeholders across product, learning, curriculum …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
and trends. Requirements:
- 3+ years of experience as a data engineer.
- Strong proficiency in AWS data services such as S3, Glue, Lambda, and Redshift.
- Experience with data modelling, ETL processes, and data warehousing concepts.
- Proficiency in SQL and Python.
Benefits:
- Competitive salary, benefits package and discretionary bonus.
- Opportunity to work on cutting-edge technology.
- Career growth and development opportunities.
Manchester, North West, United Kingdom Hybrid / WFH Options
InterQuest Group (UK) Limited
Troubleshoot integration issues and optimize pipeline performance Document workflows and maintain best practices for SnapLogic development Requirements: Proven hands-on experience with SnapLogic (Enterprise Integration Cloud) Strong understanding of ETL/ELT concepts and integration patterns Experience working with APIs, cloud platforms (e.g., AWS, Azure, GCP), and databases (SQL/NoSQL) Familiarity with REST, JSON, XML, and data mapping/…
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
using Azure Data Factory (ADF), ensuring efficient and reliable data movement and transformation.
• Data Modelling using Kimball, 3NF or Dimensional methodologies
• Utilize SQL and Python languages to extract, transform, and load data from various sources into Azure Databricks and Azure SQL/SQL Server.
• Design and implement metadata driven pipelines to automate data processing tasks.
• Collaborate with cross-functional teams …
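This listing asks for metadata-driven pipelines, where a configuration table describes what to copy and where, and one generic job iterates that configuration instead of hard-coding each source. A minimal sketch of the pattern; the table names and in-process copy logic are hypothetical stand-ins for ADF copy activities:

```python
import sqlite3

# Metadata: one entry per source describes the copy, rather than one pipeline per source.
PIPELINE_METADATA = [
    {"source_table": "crm_customers", "target_table": "stg_customers", "load_type": "full"},
    {"source_table": "erp_orders",    "target_table": "stg_orders",    "load_type": "full"},
]

def run_pipeline(conn, metadata):
    """Generic copy activity: drops and rebuilds each staging table listed in the metadata."""
    copied = {}
    for entry in metadata:
        src, tgt = entry["source_table"], entry["target_table"]
        conn.execute(f"DROP TABLE IF EXISTS {tgt}")
        conn.execute(f"CREATE TABLE {tgt} AS SELECT * FROM {src}")
        copied[tgt] = conn.execute(f"SELECT COUNT(*) FROM {tgt}").fetchone()[0]
    return copied

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE crm_customers (id INTEGER); INSERT INTO crm_customers VALUES (1), (2);
        CREATE TABLE erp_orders (id INTEGER);    INSERT INTO erp_orders VALUES (10);
    """)
    print(run_pipeline(conn, PIPELINE_METADATA))
    # → {'stg_customers': 2, 'stg_orders': 1}
```

Onboarding a new source then becomes a metadata row rather than new pipeline code, which is the main payoff of the pattern.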
Dashboard creation, Data Modelling, DAX, Power Query) Understanding of analysis methodologies essential. Previous experience with report building tools is essential (experience with SAP Business Objects advantageous). Awareness of ETL processes and database automation. Experience working with AI tools to generate automated insights advantageous Confident in Microsoft Excel and the full Microsoft suite (or cloud-based equivalents). From humble beginnings to …
Dashboard creation, Data Modelling, DAX, Power Query) Understanding of analysis methodologies essential. Previous experience with report building tools is essential (experience with SAP Business Objects advantageous). Awareness of ETL processes and database automation. Experience working with AI tools to generate automated insights advantageous Confident in Microsoft Excel and the full Microsoft suite (or cloud-based equivalents). Benefits In addition to …
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Pro Insurance
processes for various bordereaux (both Risk and Claims) using data ingestion and analysis tools such as Intrali, Quantemplate, or Matillion. The project focuses on the extraction, transformation, and loading (ETL) of data from various sources into data warehouses and data lakes. This is an ideal role for an aspiring Data Architect in the Insurance Industry. Pro operates a hybrid working …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Behavioural Insights Team
products or applications, at scale. Strong knowledge of the Python Data Science stack (e.g., pandas/polars, scikit-learn). Ability to independently develop and maintain robust Python-based ETL/ELT data pipelines. Ability to independently develop LLM-based tools/products (e.g., RAG workflows). Familiarity with version control tools such as Git/GitHub. In addition, we …