The Role: Designing, building, and maintaining data pipelines. Building and maintaining data warehouses. Data cleansing and transformation. Developing and maintaining ETL (extract, transform, load) processes to extract, transform, and load data from various sources into data warehouses. Validating charts and reports created by systems built in-house. Creating validation tools. Developing and maintaining data models and data tools. Monitoring and … Experience in the R programming language. Experience in the Python programming language. Experience in designing, building, and maintaining data pipelines. Experience with data warehousing and data lakes. Experience in developing and maintaining ETL processes. Experience in developing data integration tools. Experience in data manipulation, data analysis, and data modelling. Experience with cloud platforms (AWS, Azure, etc.). Experience in designing scalable, secure, and cost …
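The ETL pattern the role describes — extract from a source, transform, and load into a warehouse — can be sketched minimally in Python. This is an illustrative sketch only: the records and function names are hypothetical, and in-memory lists stand in for a real source system and data warehouse.

```python
# Minimal ETL sketch. In-memory lists stand in for a real source
# system and warehouse; the sample records are hypothetical.

def extract():
    # Extract: pull raw records from a source system.
    return [
        {"id": 1, "name": " Alice ", "amount": "10.50"},
        {"id": 2, "name": "Bob", "amount": "3.25"},
    ]

def transform(rows):
    # Transform: cleanse and normalise - trim whitespace, cast types.
    return [
        {"id": r["id"], "name": r["name"].strip(), "amount": float(r["amount"])}
        for r in rows
    ]

def load(rows, warehouse):
    # Load: append the cleaned rows to the target store.
    warehouse.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse[0]["name"], warehouse[0]["amount"])  # Alice 10.5
```

In a production pipeline each stage would typically be a separate, orchestrated task (e.g. in Airflow), but the extract/transform/load boundaries stay the same.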
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
clients to gather requirements and deliver solutions. Be willing to engage and assist in pre-sales activities, bids, proposals, etc. Use key techniques such as Governance, Architecture, Data Modelling, ETL/ELT, Data Lakes, Data Warehousing, Master Data, and BI. Consistently utilise key processes in the engineering delivery cycle including Agile and DevOps, Git, APIs, Containers, Microservices and Data Pipelines.
platform for data analytics, including design and deployment of infrastructure. Expertise in creating CI/CD pipelines. Experience in creating FTP (SFTP/FTPS) configurations. Experience in working with ETL/ELT workflows for data analytics. Degree in Computer Science, Mathematics or related subject. Highly desirable skills & exposure: Working collaboratively as part of an Agile development squad. Experience and knowledge …
ideally within internal consultancy or transformation environments. Strong experience with PostgreSQL, Power BI, and low-code platforms (Budibase or similar). Solid programming skills, preferably in Python, especially for ETL development and support. Proficiency with version control systems (e.g., GitHub), with an understanding of best practices for collaboration, review, and deployment. Familiarity with REST APIs, including how to design, consume …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Dept Holding B.V.
of leading complex data initiatives. Strong technical leadership experience, with the ability to guide and develop engineering teams. Deep expertise in designing and implementing data architectures, data pipelines, and ETL processes. Hands-on experience with cloud platforms (GCP, AWS, Azure) and their data-specific services. Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, dbt). Experience with modern data …
our technologies, workplaces, and colleagues to make our Group a great place for everyone. Including you. What you'll need: Demonstrable experience designing and building scalable data pipelines and ETL/ELT workflows using modern data engineering best practices. Strong understanding of data modelling principles and experience designing data architectures for analytics and operational workloads. Deep familiarity with Google Cloud …
implementing data engineering best practice (e.g., source-to-target mappings, coding standards, data quality, etc.), working closely with the external party who set up the environment. Create and maintain ETL processes, data mappings & transformations to orchestrate data integrations. Ensure data integrity, quality, privacy, and security across systems, in line with client and regulatory requirements. Optimize data solutions for performance and … up monitoring and data quality exception handling. Strong data modelling experience. Experience managing and developing CI/CD pipelines. Experience with Microsoft Azure products and services, and proficiency in ETL processes. Experience of working with APIs to integrate data flows between disparate cloud systems. Strong analytical and problem-solving skills, with the ability to work independently and collaboratively. The aptitude …
and stakeholder engagement. And any experience of these would be really useful: Experience with telephony platforms (e.g. Genesys, Twilio, Avaya, Nuance). Infrastructure as Code (Terraform, Ansible, Puppet). Data engineering (ETL, SQL, Power BI, Tableau). Secure-by-design principles and architectural patterns. Interest in AI, IoT, or service mesh (e.g. Istio). NodeJS. A passion for delivering business value through sound engineering …
field. Strong analytical and problem-solving skills. Good database and SQL skills. Experience with data visualization tools (e.g., Power BI, Tableau, Looker). Basic understanding of data modelling and ETL concepts. Good communication and documentation skills. If you're interested, please apply below! INDMANJ 49567NB
Terraform. Familiar with cloud security, performance optimisation, and monitoring tools (Azure Monitor, Log Analytics). Experience in IT service management environments (ITIL). Cross-functional project delivery. Experience with ETL/ELT processes and data modelling. Understanding of data governance, security, and compliance frameworks. Stakeholder management. What you'll get in return: In return, you will be rewarded with ongoing …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
and trends. Requirements: -3+ years of experience as a data engineer. -Strong proficiency in AWS data services such as S3, Glue, Lambda, and Redshift. -Experience with data modelling, ETL processes, and data warehousing concepts. -Proficiency in SQL and Python. Benefits: -Competitive salary, benefits package and discretionary bonus. -Opportunity to work on cutting-edge technology. -Career growth and development opportunities.
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Behavioural Insights Team
products or applications, at scale. Strong knowledge of the Python Data Science stack (e.g., pandas/polars, scikit-learn). Ability to independently develop and maintain robust Python-based ETL/ELT data pipelines. Ability to independently develop LLM-based tools/products (e.g., RAG workflows). Familiarity with version control tools such as Git/GitHub. In addition, we …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
The Vanguard Group
business. Product-specific knowledge: familiarity and experience within retail platforms, third-party recordkeepers, and client services models would be beneficial. AWS: experience with AWS (particularly Athena, S3 and Glue ETL) is again preferable, but an understanding of cloud computing infrastructure and capabilities is a big plus. Agile ways of working: understanding of Agile tools (specifically JIRA) and ways of working (specifically Kanban).
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
First Central Services
Finance, Actuarial, and Data Engineering to understand requirements and deliver high-impact data solutions. Expertise in data engineering and development - you'll be comfortable designing, building, and optimising scalable ETL/ELT pipelines using cloud-based technologies and SQL, ensuring data is accessible, accurate, and timely. Financial and actuarial knowledge - you'll bring a working understanding of finance and actuarial … you'll use Python to develop scripts for automation, data transformation, and analytics - enabling everything from reserves analysis to forecast simulations. What's involved: Design, build, and optimise scalable ETL/ELT workflows for financial and actuarial datasets. Integrate data from financial systems (e.g., Workday) and actuarial sources into centralised data platforms. Collaborate with finance and actuarial teams to understand …
cloud-native data pipelines. Comfortable collaborating with cross-functional teams (engineering, product, stakeholders). Nice to have: Familiarity with tools like dbt, Airflow, or Python scripting. Knowledge of data warehousing, ETL frameworks, and modern data stack best practices. Why this role: Our client has a clear vision for what success looks like in this role. You'll have direct ownership over …
Build and optimize datasets for performance and reliability in Azure Databricks. Collaborate with analysts and business stakeholders to translate data requirements into robust technical solutions. Implement and maintain ETL/ELT pipelines using Azure Data Factory or Synapse Pipelines. Design and develop a fit-for-purpose enterprise data warehouse to serve reporting and analytics. Ensure data quality, data … Azure Data Lake Storage Gen2. Understanding of data warehouse concepts, dimensional modelling, and data architecture. Experience working with Delta Lake and large-scale data processing. Experience building ETL pipelines in Azure Data Factory or similar orchestration tools. Familiarity with version control systems (e.g., Git) and CI/CD practices. Preferred Qualifications: Experience in a manufacturing, FMCG, or retail …
to work on Defence and National Security projects. WE NEED THE DATA SCIENCE ENGINEER TO HAVE.... * Enhanced DV Clearance. * Experience in Data Engineering/Data Science. * Ability to develop Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) workflows to move data from source systems to data stores. * Experience with one or more of the following supporting technologies: Apache Kafka … NiFi, Spark, Flink or Airflow etc. * Past experience working with SQL and NoSQL databases (e.g. PostgreSQL, Mongo, Elasticsearch, Accumulo or Neo4j). * Hands-on experience with distributed computing, ETL, data pipelines, and automated workflows. * Ability to use modern software languages (e.g. Python, Java or Go). TO BE CONSIDERED.... Please either apply by clicking online or emailing me directly to For … forward to hearing from you. DATA SCIENCE ENGINEER - EDV CLEARED. KEY SKILLS: DATA ENGINEER/SENIOR DATA ENGINEER/LEAD DATA ENGINEER/DATA SCIENTIST/SENIOR DATA SCIENTIST/ETL/DATABASE/PYTHON/JAVA/PLATFORM ETL/DATA GOVERNANCE/DV CLEARED/DV CLEARANCE/DEVELOPED VETTING/DEVELOPED VETTED/DEEP VETTING/DEEP VETTED
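The ad above distinguishes ETL from ELT: in ELT the raw data is loaded into the store first and transformed there, typically with SQL. A minimal sketch of the ELT shape, using SQLite purely as a stand-in warehouse (the table names and sample rows are hypothetical):

```python
# ELT sketch: load raw data first, then transform inside the store
# with SQL. SQLite is an in-memory stand-in for a real warehouse.
import sqlite3

raw = [("1", " alice "), ("2", "bob")]  # hypothetical raw source rows

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE staging (id TEXT, name TEXT)")
# Load step: insert the raw, untransformed rows as-is.
con.executemany("INSERT INTO staging VALUES (?, ?)", raw)

# Transform step happens *inside* the warehouse (the "T" after the "L"):
# cast types and trim whitespace with SQL rather than in application code.
con.execute("""
    CREATE TABLE clean AS
    SELECT CAST(id AS INTEGER) AS id, TRIM(name) AS name
    FROM staging
""")
rows = con.execute("SELECT id, name FROM clean ORDER BY id").fetchall()
print(rows)  # [(1, 'alice'), (2, 'bob')]
```

The same staging-then-transform split is what tools like dbt formalise at warehouse scale; in ETL, by contrast, the cleansing would run before the insert.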
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Deloitte LLP
and professional experience: Proven experience in leading and delivering complex data migration projects. Strong technical knowledge of data migration tools and techniques. Experience with various data migration methodologies (e.g., ETL, data warehousing). Excellent communication, stakeholder management, and problem-solving skills. Relevant certifications (e.g., Oracle certifications, data management certifications) or equivalent. Experience in a consulting environment. Connect to your business …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Leading Pharmaceutical Company - Manchester (Tech Stack: Data Engineer, Databricks, Python, Power BI, Azure, T-SQL, ETL, Agile Methodologies). About the Role: We are seeking a talented and experienced Data Engineer on behalf of our client, a leading Software House. This is a fully remote position, offering the opportunity to work with cutting-edge technologies and contribute to exciting projects … an experienced Data Engineer to join their team in Manchester. This hybrid position involves working within the pharmaceutical industry, focusing on the design, development, and maintenance of data pipelines, ETL processes, and databases. The role is ideal for someone passionate about improving processes, ensuring data quality, and maintaining compliance with regulatory standards, focusing on designing, developing, and maintaining data pipelines, ETL processes, and databases. If you are passionate about driving continuous improvement and ensuring data quality and compliance, we want to hear from you. Key Responsibilities: Design, develop, maintain, and optimise data pipelines, ETL processes, and databases. Drive continuous improvement by refining processes, products, and identifying new tools, standards, and practices. Collaborate with teams across the business to define solutions, requirements, and testing …
and implementing automated reporting solutions in Power BI. Support the operation of the business's applications which support the distribution of data to external customers. Data Modelling: Extract, transform, and load data from a variety of sources into Power BI, and develop and maintain data models which support data analysis and reporting requirements. Collaboration: Work closely with stakeholders to understand …
a week - this is usually on Tuesdays and Thursdays but can be subject to change. Strong benefits package. This role focuses on the development, performance, management, and troubleshooting of ETL processes, data pipelines, and data infrastructure. The successful candidate will ensure the effective and reliable operation of these systems, adopting new tools and technologies to stay ahead of industry best … practices. Collaboration across teams to define solutions, requirements, and testing approaches is essential, as is ensuring compliance with regulatory standards. Key Responsibilities: - Design, develop, maintain, and optimise data pipelines, ETL processes, and databases. - Drive continuous improvement by refining processes and identifying new tools and standards. - Collaborate with cross-functional teams to define solutions and testing approaches. - Ensure compliance with regulatory requirements …