ideally within internal consultancy or transformation environments. Strong experience with PostgreSQL, Power BI, and low-code platforms (Budibase or similar). Solid programming skills, preferably in Python, especially for ETL development and support. Proficiency with version control systems (e.g., GitHub), with an understanding of best practices for collaboration, review, and deployment. Familiarity with REST APIs, including how to design, consume …
of CI/CD tools such as Jenkins, or an understanding of their role. Experience with Apache Spark or Hadoop. Experience in building data pipelines. Experience of designing warehouses, ETL pipelines, and data modelling. Good knowledge of designing, building, using, and maintaining REST APIs. Good SQL skills with any mainstream database (Teradata, Oracle, MySQL, Postgres). Proficient Linux skills. Agile exposure …
implementing data engineering best practice (e.g., source-to-target mappings, coding standards, data quality, etc.), working closely with the external party who set up the environment. Create and maintain ETL processes, data mappings, and transformations to orchestrate data integrations. Ensure data integrity, quality, privacy, and security across systems, in line with client and regulatory requirements. Optimize data solutions for performance and … up monitoring and data-quality exception handling. Strong data modelling experience. Experience managing and developing CI/CD pipelines. Experience with Microsoft Azure products and services, and proficiency in ETL processes. Experience of working with APIs to integrate data flows between disparate cloud systems. Strong analytical and problem-solving skills, with the ability to work independently and collaboratively. The aptitude …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Parking Network BV
what will help you thrive in this role: 2-5 years in data engineering or a related field. Strong PySpark and advanced SQL skills. Practical experience building and maintaining ETL/ELT pipelines in Databricks. Familiarity with CI/CD pipelines and version control practices. Nice to have: experience using Databricks Asset Bundles (DAB); working knowledge of GCP and/…
and stakeholder engagement. And any experience of these would be really useful: experience with telephony platforms (e.g. Genesys, Twilio, Avaya, Nuance); Infrastructure as Code (Terraform, Ansible, Puppet); data engineering (ETL, SQL, Power BI, Tableau); secure-by-design principles and architectural patterns; interest in AI, IoT, or service mesh (e.g. Istio); NodeJS. A passion for delivering business value through sound engineering …
field. Strong analytical and problem-solving skills. Good database and SQL skills. Experience with data visualization tools (e.g., Power BI, Tableau, Looker). Basic understanding of data modelling and ETL concepts. Good communication and documentation skills. If you're interested, please apply below! INDMANJ 49567NB
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
design, content, and new learning features through robust A/B testing and statistical significance testing.
- Develop and maintain dashboards using Power BI, supported by SQL and Python-based ETL pipelines.
- Help shape a centralised Learning Analytics Framework that tracks key KPIs on learner success, engagement, and satisfaction.
- Translate complex analysis into accessible insight for stakeholders across product, learning, curriculum …
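As a sketch of the statistical significance testing mentioned above, a two-sided z-test comparing conversion rates between two variants can be done with the standard library alone. The variant sample sizes and conversion counts below are invented for illustration:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical experiment: variant B lifts course-completion rate from 10% to 12%.
z, p = two_proportion_z_test(conv_a=500, n_a=5000, conv_b=600, n_b=5000)
print(f"z={z:.2f}, p={p:.4f}")
```

In practice a library such as `scipy.stats` or `statsmodels` would usually be preferred, but the arithmetic is the same.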
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Interquest
Troubleshoot integration issues and optimize pipeline performance. Document workflows and maintain best practices for SnapLogic development. Requirements: Proven hands-on experience with SnapLogic (Enterprise Integration Cloud). Strong understanding of ETL/ELT concepts and integration patterns. Experience working with APIs, cloud platforms (e.g., AWS, Azure, GCP), and databases (SQL/NoSQL). Familiarity with REST, JSON, XML, and data mapping/…
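The data-mapping pattern mentioned above can be sketched in a few lines: a declarative mapping that renames and casts fields when moving a JSON record from a source API into a target schema. The field names here are invented for illustration; real SnapLogic pipelines express similar mappings through their graphical Snaps rather than code:

```python
# Declarative field mapping: target_field -> (source_field, cast).
# All names are hypothetical, for illustration only.
FIELD_MAP = {
    "customer_id": ("id", str),
    "full_name":   ("name", str),
    "created_at":  ("createdDate", str),
    "balance":     ("accountBalance", float),
}

def map_record(source: dict) -> dict:
    """Apply the mapping, casting values and skipping absent or null source fields."""
    out = {}
    for target, (src, cast) in FIELD_MAP.items():
        if src in source and source[src] is not None:
            out[target] = cast(source[src])
    return out

payload = {"id": 42, "name": "Acme Ltd", "createdDate": "2024-01-05",
           "accountBalance": "1250.75"}
print(map_record(payload))
# {'customer_id': '42', 'full_name': 'Acme Ltd', 'created_at': '2024-01-05', 'balance': 1250.75}
```

Keeping the mapping as data rather than code makes it easy to document, review, and extend as source schemas change.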
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
a real impact. What You’ll Bring: 2–5 years’ experience in data engineering or similar. Strong PySpark and advanced SQL skills. Hands-on experience with Databricks and building ETL/ELT pipelines. Familiarity with CI/CD and version control. Bonus Points For: Experience with Databricks Asset Bundles (DAB). Exposure to GCP, Azure, BigQuery, or Harness. What’s In …
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Pro Insurance
processes for various bordereaux (both Risk and Claims) using data ingestion and analysis tools such as Intrali, Quantemplate, or Matillion. The project focuses on the extraction, transformation, and loading (ETL) of data from various sources into data warehouses and data lakes. This is an ideal role for an aspiring Data Architect in the insurance industry. Pro operates a hybrid working …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
First Central Services
Knowledge of data platforms, processing, and data structures. Experience with machine learning libraries and tools. Familiarity with data query languages like SQL and HiveQL. Experience with data integration and ETL processes. Excellent problem-solving and analytical skills. Degree in Data Science or a related field. Behaviours: Self-motivated and enthusiastic. Determined and passionate about data and technology. Decisive, with the …
You'll join a fast-paced, delivery-focused data team responsible for building and optimising scalable, production-grade data pipelines and infrastructure. Key Responsibilities: Design and implement robust, scalable ETL/ELT pipelines using Databricks and Apache Spark. Ingest, transform, and manage large volumes of data from diverse sources. Collaborate with analysts, data scientists, and business stakeholders to deliver clean …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
First Central Services
Finance, Actuarial, and Data Engineering to understand requirements and deliver high-impact data solutions. Expertise in Data Engineering and Development - you'll be comfortable designing, building, and optimising scalable ETL/ELT pipelines using cloud-based technologies and SQL, ensuring data is accessible, accurate, and timely. Financial and Actuarial Knowledge - you'll bring a working understanding of finance and actuarial … you'll use Python to develop scripts for automation, data transformation, and analytics, enabling everything from reserves analysis to forecast simulations. What's involved: Design, build, and optimise scalable ETL/ELT workflows for financial and actuarial datasets. Integrate data from financial systems (e.g., Workday) and actuarial sources into centralised data platforms. Collaborate with finance and actuarial teams to understand …
cloud-native data pipelines. Comfortable collaborating with cross-functional teams (engineering, product, stakeholders). Nice to have: Familiarity with tools like dbt, Airflow, or Python scripting. Knowledge of data warehousing, ETL frameworks, and modern data stack best practices. Why this role: Our client has a clear vision for what success looks like in this role. You'll have direct ownership over …
with complex architecture, preferably Oracle HR transformations. Technical Skills: Programming Languages: Proficiency in Node.js, Java, Python, and TypeScript. Database Management: Experience with Oracle DB. Data Integration: Knowledge of Informatica ETL processes. Enterprise Systems: Familiarity with Oracle EBS or Fusion required. API Development: Expertise in RESTful services. Cloud Platforms: Experience with AWS and serverless architectures; Athena preferable. Testing: Familiarity with Jest …
be well versed in SQL scripting for querying, aggregating, and transforming data is required. Having good .NET C# and/or Java programming skills would be an advantage. Must haves: Extract, Transform and Load (ETL) experience. Querying, aggregation, and transformation of data using SQL. An understanding of relational database principles. Perks to this role: Monthly socials. Informal dress code. Great offices …
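The querying, aggregation, and transformation work described above can be illustrated with a minimal example, using SQLite (via Python's standard library) as a stand-in for whatever relational engine is in play. The table and column names are invented for illustration:

```python
import sqlite3

# In-memory database standing in for a production relational store.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL);
    INSERT INTO orders (region, amount) VALUES
        ('North', 120.0), ('North', 80.0), ('South', 250.0);
""")

# Aggregate and transform: total and average order value per region.
rows = conn.execute("""
    SELECT region,
           SUM(amount) AS total,
           ROUND(AVG(amount), 2) AS avg_amount
    FROM orders
    GROUP BY region
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('South', 250.0, 250.0), ('North', 200.0, 100.0)]
```

The same GROUP BY/aggregate pattern carries over directly to T-SQL, Oracle, or MySQL, with only minor dialect differences in functions like ROUND.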
defence sector. The successful candidate must have: SQL technologies skills (e.g. MS SQL, Oracle). NoSQL technologies skills (e.g. MongoDB, InfluxDB, Neo4j). Data exchange and processing skills (e.g. ETL, ESB, API). Development skills (e.g. Python). Big data technologies knowledge (e.g. Hadoop stack). This position offers a lucrative benefits package, which includes but is not limited to …
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Maxwell Bond
with hybrid flexibility. Collaborate with a welcoming team that values teamwork and innovation. As a SQL Developer, you'll: Design and develop efficient SQL procedures, views, and functions. Bring strong ETL experience. Optimise query performance and troubleshoot complex issues. Use tools like SSIS and ADF, with exposure to Synapse or SSAS data warehouse design. Integrate data across remote servers and leverage Microsoft …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Deloitte LLP
and professional experience. Proven experience in leading and delivering complex data migration projects. Strong technical knowledge of data migration tools and techniques. Experience with various data migration methodologies (e.g., ETL, data warehousing). Excellent communication, stakeholder management, and problem-solving skills. Relevant certifications (e.g., Oracle certifications, data management certifications) or equivalent. Experience in a consulting environment. Connect to your business …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Leading Pharmaceutical Company - Manchester (Tech Stack: Data Engineer, Databricks, Python, Power BI, Azure, TSQL, ETL, Agile Methodologies)
About the Role: We are seeking a talented and experienced Data Engineer on behalf of our client, a leading software house. This is a fully remote position, offering the opportunity to work with cutting-edge technologies and contribute to exciting projects … an experienced Data Engineer to join their team in Manchester. This hybrid position involves working within the pharmaceutical industry, focusing on the design, development, and maintenance of data pipelines, ETL processes, and databases. The role is ideal for someone passionate about improving processes, ensuring data quality, and maintaining compliance with regulatory standards. If you are passionate about driving continuous improvement and ensuring data quality and compliance, we want to hear from you.
Key Responsibilities:
- Design, develop, maintain, and optimise data pipelines, ETL processes, and databases.
- Drive continuous improvement by refining processes and products, and by identifying new tools, standards, and practices.
- Collaborate with teams across the business to define solutions, requirements, and testing …
and implementing automated reporting solutions in Power BI. Support the operation of the business's applications which support the distribution of data to external customers. Data Modelling: Extract, transform, and load data from a variety of sources into Power BI, and develop and maintain data models which support data analysis and reporting requirements. Collaboration: Work closely with stakeholders to understand …
environment - they'd love to meet you. 5 days on site | £30,000 per year
What You'll Be Doing
Assist our Data Engineer in developing Python scripts to extract, transform, and validate data against our MySQL database. Upload and structure data for our platform to support efficient reporting and analysis. Collaborate with stakeholders and our BI Analyst to understand … functionality, workflows, and changes clearly and consistently. Follow agile development principles and contribute to sprint planning and tracking. Participate in planning, designing, and developing our future migration of automated ETL processing of our data to a chosen cloud platform.
What We're Looking For
Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field. Strong grasp of … testing using pytest. Proficiency in SQL (we use MySQL, but experience with other SQL platforms is welcome). Experience with data manipulation and transformation using Pandas. Familiarity with ETL/ELT processes and data warehousing concepts. Understanding of cloud platforms (AWS or Azure). Basic knowledge of Git and GitHub for version control and collaboration. Awareness of best practices …
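The extract-transform-validate loop described in this role might look roughly like the sketch below. The column names and validation rules are invented, and the MySQL read is replaced by an in-memory frame (in the real pipeline it would be something like `pd.read_sql(query, mysql_conn)`) so the example is self-contained:

```python
import pandas as pd

# Stand-in for pd.read_sql("SELECT user_id, score FROM ...", mysql_conn).
raw = pd.DataFrame({
    "user_id": [1, 2, 2, 3],
    "score":   ["10", "15", "15", None],
})

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Deduplicate, drop incomplete rows, and cast score to numeric."""
    out = df.drop_duplicates().dropna(subset=["score"]).copy()
    out["score"] = pd.to_numeric(out["score"])
    return out

def validate(df: pd.DataFrame) -> None:
    """Fail fast if the cleaned frame breaks basic expectations (pytest-friendly)."""
    assert df["user_id"].is_unique, "duplicate users after cleaning"
    assert df["score"].between(0, 100).all(), "score out of range"

clean = transform(raw)
validate(clean)
print(clean.to_dict("records"))
```

Because `validate` is plain assertions, the same checks drop straight into a pytest suite, matching the unit-testing expectation in the requirements list.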
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Searchability
a plus bonus & excellent benefits. Work on complex, large-scale data pipelines with cutting-edge tech. We're looking for strong Python, SQL, and AWS skills, plus experience in ETL processes. Hybrid working - 2-3 days in the office. ABOUT THE CLIENT: Our client is a trailblazer in the global data insights space, delivering innovative and actionable solutions to some … working with flexible hours. £100 annual home office allowance. DATA ENGINEER ROLE: In this hands-on position, you'll develop and maintain a mix of real-time and batch ETL processes, ensuring accuracy, integrity, and scalability across vast datasets. You'll work with Python, SQL, Apache Spark, and AWS services such as EMR, Athena, and Lambda to deliver robust, high … solutions. KEY SKILLS/EXPERIENCE: Proven experience as a Data Engineer, with Python and SQL expertise. Familiarity with AWS services (or equivalent cloud platforms). Experience with large-scale datasets and ETL pipeline development. Knowledge of Apache Spark (Scala or Python) beneficial. Understanding of agile development practices, CI/CD, and automated testing. Strong problem-solving and analytical skills. Positive team player …
a week - this is usually on Tuesdays and Thursdays but can be subject to change. Strong benefits package. This role focuses on the development, performance, management, and troubleshooting of ETL processes, data pipelines, and data infrastructure. The successful candidate will ensure the effective and reliable operation of these systems, adopting new tools and technologies to stay ahead of industry best practices. Collaboration across teams to define solutions, requirements, and testing approaches is essential, as is ensuring compliance with regulatory standards.
Key Responsibilities:
- Design, develop, maintain, and optimise data pipelines, ETL processes, and databases.
- Drive continuous improvement by refining processes and identifying new tools and standards.
- Collaborate with cross-functional teams to define solutions and testing approaches.
- Ensure compliance with regulatory requirements …
maintain basic BI reports, dashboards, and visualizations to monitor key performance indicators (KPIs). Collaborate with business stakeholders to gather data requirements and translate them into reporting solutions. Support ETL processes by preparing and validating data from various sources for reporting accuracy. Perform exploratory data analysis to identify trends and support business initiatives. Monitor and maintain data quality, accuracy, and … in a BI Analyst, Data Analyst, or similar role. Familiarity with BI tools such as Power BI, Tableau, or Looker. Basic knowledge of SQL and relational databases. Understanding of ETL concepts and data preparation techniques. Strong analytical thinking and attention to detail. Good communication and teamwork skills. Preferred Qualifications: Exposure to data warehousing concepts or platforms. Awareness of data governance …