Data Engineer
Turn messy, fragmented data into something the business can actually trust.
About the client:
You will be working with a major UK-based organisation that operates across a portfolio of well-known brands, delivering at scale within a complex distribution environment. With deeply integrated ERP and finance systems, the business is now undergoing a significant shift in how it uses data. Following recent system changes, there is a clear focus on modernising reporting and building a more transparent, reliable data landscape that can support future growth.
Project overview:
This programme is focused on rebuilding critical financial reporting that has been disrupted, including complex margin and rebate calculations that sit at the heart of commercial performance. Alongside this, you will help design and deliver scalable data pipelines, transformation logic, and reporting outputs within a modern Azure environment. The work will establish a repeatable blueprint that can be rolled out across multiple business units, playing a key role in the transition away from legacy systems to a clean, future-ready data platform.
What you will be doing:
You will take ownership of building and maintaining robust data pipelines within Azure, working hands-on to deliver reliable, high-performance data solutions. This includes developing workflows in Databricks and/or Synapse, and transforming data from a range of sources such as CSV and Excel into clean, usable datasets. A key part of the role will involve translating complex pricing and margin logic into scalable data models, ensuring accuracy and consistency across reporting. You will also focus on improving data quality, validation processes, and overall performance, while supporting the move away from legacy reporting tools. Alongside this, you will contribute to the design of a scalable data architecture that can be leveraged across multiple business units.
Tech environment:
You will be working across a modern Azure stack including Data Lake, Synapse, Databricks, Data Factory, and SQL, with exposure to complex enterprise data environments.
What we are looking for:
We are looking for someone with strong experience across Azure data engineering tools and a proven track record of building ETL or ELT pipelines in production environments. You should have solid SQL and data transformation skills, along with experience handling large and complex datasets.
Equally important is your ability to work closely with analysts and business stakeholders, translating requirements into effective data solutions.
Nice to have:
Experience working with ERP or finance data would be highly beneficial, as would any background in transformation or migration programmes where legacy systems are being modernised.
Contract details: 6-month fixed-term contract with potential to extend or move into a permanent role
Salary: Up to £60,000
Location: West Yorkshire / Hybrid