We are currently recruiting for a Data Manager on a permanent basis to be responsible for leading the Performance team in managing the performance cycle at both strategic and departmental levels. This will include: development and implementation of appropriate performance monitoring and reporting systems; providing advice to senior leaders using intelligent analysis and research to provide assurance; managing the … performance team (data analysts); developing and implementing an outcome-focused performance management framework. The ideal candidate will have the following skills and experience: strong performance management and data analysis/reporting skills; designing performance management frameworks; data transformation; strong in the use of Power BI; working with complex data; ideally a data-orientated qualification and past experience.
Middlesbrough, North Yorkshire, United Kingdom
4M Recruitment
savvy Senior Insight Analyst to join their team! With more than 250 employees, the company prides itself on its customer-first culture, next-day delivery, and ongoing investment in data transformation; the business is modernising its reporting and analytics capability using Microsoft Fabric, SQL, and increasingly Python and no-code BI tools to deliver smarter insights that shape … sales and marketing to finance, supply chain, and warehousing. You'll work with complex, high-volume datasets to understand performance drivers, identify opportunities, and influence decision-making through clear data storytelling. This role is ideal for someone who enjoys blending technical analytics with commercial acumen, helping the business make smarter, faster, and more profitable decisions. Key areas of focus … include: Leading pricing and rebate analysis to improve margin and profitability Delivering insights into wholesale and distribution performance across thousands of SKUs Acting as a trusted data partner to commercial, finance, and operational teams Supporting marketing and eCommerce analytics; looking at things such as customer segmentation and campaign evaluation Collaborating with FP&A on forecasting and scenario analysis Your …
Milton Keynes, Buckinghamshire, United Kingdom Hybrid/Remote Options
National House Building Council
Working location: Milton Keynes, Hybrid. Employment type: Full time, Permanent. Job summary: We are seeking a Data Architect to shape how we design, govern, and use data across the organisation. You'll ensure that data is treated as a strategic asset, supporting business goals through well-structured, high-quality, and scalable solutions. Reporting to the Chief Data … engineers, analytics leads, and governance teams to set standards, guide architectural decisions, and optimise our Snowflake-based cloud platform. You'll also support initiatives involving AI, automation, real-time data, and reporting tools like Power BI and Tableau. What you'll be doing: Lead the delivery and optimisation of NHBC's enterprise data architecture and cloud data platform (Snowflake). Set and maintain data standards, models, and design patterns across domains. Ensure high-quality, secure, and well-governed data through collaboration with engineering and governance teams. Oversee integration pipelines, data transformations (dbt), and reporting alignment (Power BI, Tableau). Guide architectural decisions supporting AI, analytics, automation, and real-time data. Promote reusability and …
A leading global bank in London seeks a Head of Application Development for a major data transformation programme. You will manage a scrum team focused on data ingestion for Finance reporting. The ideal candidate has over 10 years of Agile/Scrum experience in financial services, strong knowledge of data platforms like Databricks, and excellent stakeholder …
Bournemouth, Dorset, South West, United Kingdom Hybrid/Remote Options
Sanderson Recruitment
strategic initiative within Financial Services aimed at delivering a unified, trusted view of customer data. We're seeking a highly skilled Lead Databricks Engineer to design and implement scalable data pipelines that form the backbone of our Lakehouse platform, enabling accurate analytics, reporting, and regulatory compliance. You'll work with cutting-edge technologies including Databricks, PySpark, and Azure Data Factory, applying best practices in data engineering and governance to support this critical programme. Lead Databricks Engineer: Key Responsibilities Build and maintain Databricks pipelines (batch and incremental) using PySpark and SQL. Orchestrate end-to-end workflows with Azure Data Factory. Develop and optimise Delta Lake tables (partitioning, schema evolution, vacuuming). Implement Medallion Architecture (Bronze, Silver … Gold) for transforming raw data into business-ready datasets. Apply robust monitoring, logging, and error-handling frameworks. Integrate pipelines with downstream systems such as Power BI. Collaborate with analysts, business teams, and engineers to deliver consistent, well-documented datasets. Support deployments and automation via Azure DevOps CI/CD. Gather and refine requirements from business stakeholders. Lead …
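The Medallion pattern this posting names can be sketched without any framework. Below is a minimal plain-Python illustration of the Bronze → Silver → Gold stage boundaries (clean/deduplicate, then aggregate); in the role itself each stage would be a Delta Lake table written via PySpark, and the record shapes here are invented for illustration:

```python
# Framework-free sketch of a Medallion (Bronze/Silver/Gold) flow.
# Each stage would normally be a Delta table written via PySpark;
# the customer/spend record shape below is hypothetical.

def to_silver(bronze_rows):
    """Clean and deduplicate raw (Bronze) records into typed Silver records."""
    seen = set()
    silver = []
    for row in bronze_rows:
        key = row.get("customer_id")
        if key is None or key in seen:
            continue  # drop malformed rows and duplicate ingests
        seen.add(key)
        silver.append({"customer_id": key, "spend": float(row.get("spend", 0))})
    return silver

def to_gold(silver_rows):
    """Aggregate Silver records into a business-ready (Gold) summary."""
    return {"customers": len(silver_rows),
            "total_spend": sum(r["spend"] for r in silver_rows)}

bronze = [
    {"customer_id": "c1", "spend": "120.5"},
    {"customer_id": "c1", "spend": "120.5"},   # duplicate ingest
    {"spend": "99"},                           # malformed: no key
    {"customer_id": "c2", "spend": "40"},
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'customers': 2, 'total_spend': 160.5}
```

The point of the layering is that each table only ever reads from the layer below it, which keeps reprocessing and incremental loads tractable.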
Design and implement Costpoint Icertis integrations for contract values, funding releases, and work authorizations. Define mapping between Icertis contract metadata and Costpoint project/accounting dimensions. Develop and test data transformation logic to ensure accurate posting and revenue recognition. Collaborate with enterprise architects, CLM solution architects, and finance leads to align obligations, modifications, and funding updates with accounting … treatment. Support data migration, reconciliation, and UAT across CLM and ERP environments. Provide documentation, test scripts, and defect remediation during SIT/UAT. Provide subject-matter leadership on no-CAS and CAS-compliant cost allocation and reporting. Mentor client and team members on Costpoint best practices and controls. Qualifications 8+ years implementing Deltek Costpoint with strong expertise in GL … with Icertis Contract Intelligence (ICI) Proven experience configuring Costpoint project structures, CLINs, and funding workflows. Experience with middleware or integration platforms (MuleSoft, Azure Logic Apps) Experience with API integration, data mapping, and Costpoint posting logic. Excellent collaboration and communication skills across contracts, finance, and IT stakeholders. Preferred Qualifications Experience in Federal contracting or regulated environments. Hands-on SQL/…
You'll be equally comfortable architecting new features, writing clean code, integrating with AWS services, and supporting production systems. AWS Cloud and SQL experience are essential; Unix scripting and data transformation skills are a bonus. Core Requirements General Enthusiastic, positive mindset with strong intellectual curiosity Genuine passion for programming and problem-solving Able to prioritise and manage workload … patterns Ability to assess, select, and integrate NPM packages with a focus on security and maintainability Cloud (AWS) S3: correct use of metadata/headers (e.g., Content-Disposition) DynamoDB: data modelling and access-pattern design (appropriate pk/sk selection) AWS SDK v3: List/Get/Put/Copy/Delete operations across S3 and DynamoDB Data …
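The "appropriate pk/sk selection" the posting asks for refers to DynamoDB single-table key design. A minimal sketch of the idea, using a plain dict in place of a table (a real implementation would issue a Query with a `begins_with` condition on the sort key via the AWS SDK; the customer/order entity names here are invented):

```python
# Sketch of DynamoDB single-table key design, with a dict standing in
# for the table. Real code would use Query(pk = :pk AND begins_with(sk, :p))
# through the AWS SDK; the CUST#/ORDER# naming below is hypothetical.

table = {}  # maps (pk, sk) -> item

def put(pk, sk, item):
    table[(pk, sk)] = item

def query_prefix(pk, sk_prefix):
    """Mimic a Query on one partition with a sort-key prefix condition."""
    return [item for (p, s), item in sorted(table.items())
            if p == pk and s.startswith(sk_prefix)]

# One partition per customer; the sort key encodes the entity type and a
# sortable date, so one Query fetches all orders for a customer in order.
put("CUST#42", "PROFILE", {"name": "Ada"})
put("CUST#42", "ORDER#2024-01-03", {"total": 10})
put("CUST#42", "ORDER#2024-02-11", {"total": 25})

orders = query_prefix("CUST#42", "ORDER#")
print(len(orders))  # 2
```

The design choice being tested in interviews like this: keys are chosen from the access patterns (fetch profile, list orders by date) rather than from the entities themselves.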
serverless PaaS solutions in Azure and GCP platforms. Deep knowledge of Container security and orchestration. Experience with compliance and application security tools. Understand Cloud (Azure/GCP) security features (data protection, IAM, network security, compliance frameworks) Ability to troubleshoot Network, DNS, Firewall, and routing. Understand and develop concepts related to deploying services via CI/CD pipeline. Experience with … AquaSec, Wiz.io, Defender, Sentinel, Google Chronicle, Splunk, etc.) Develop APIs and webhooks for multi-directional integration of cloud orchestration platform with system management systems, DevOps Tools, and cloud platforms. Data transformation and reporting for security compliance KPI/KRIs Integration of security tooling in enterprise deployment pipelines for developer feedback and runtime governance Integration of security reporting with …
Chatham, Kent, England, United Kingdom Hybrid/Remote Options
INTEC SELECT LIMITED
Head of Data Engineering. Our long-standing financial services client is seeking a Head of Data Engineering to lead a high-performing team responsible for delivering a complex data transformation. Looking for a true greenfield challenge? This is your chance to shape a transformation from the ground up, build cutting-edge data solutions, and lead a … high-performing team that’s driving change across the business. To succeed in this role, you’ll bring hands-on experience owning and delivering Azure-based data platforms (Databricks, Synapse), alongside strong technical capability in designing and implementing data ingestion pipelines (ETL/ELT) and introducing event-driven architectures (Kafka) to support scalable, real-time solutions. Our client offers a … LTIP, £7.5k car allowance, and a comprehensive benefits package. The role is hybrid, based in Wolverhampton or Chatham. Key Responsibilities: Lead, develop, and inspire a high-performing Data Engineering team, setting the vision, priorities, and best practices. Oversee the design, build, and optimisation of data ingestion, processing, and storage pipelines. Embed strong governance, documentation, and lifecycle management …
Wolverhampton, West Midlands, England, United Kingdom Hybrid/Remote Options
INTEC SELECT LIMITED
Head of Data Engineering. Our long-standing financial services client is seeking a Head of Data Engineering to lead a high-performing team responsible for delivering a complex data transformation. Looking for a true greenfield challenge? This is your chance to shape a transformation from the ground up, build cutting-edge data solutions, and lead a … high-performing team that’s driving change across the business. To succeed in this role, you’ll bring hands-on experience owning and delivering Azure-based data platforms (Databricks, Synapse), alongside strong technical capability in designing and implementing data ingestion pipelines (ETL/ELT) and introducing event-driven architectures (Kafka) to support scalable, real-time solutions. Our client offers a … LTIP, £7.5k car allowance, and a comprehensive benefits package. The role is hybrid, based in Wolverhampton or Chatham. Key Responsibilities: Lead, develop, and inspire a high-performing Data Engineering team, setting the vision, priorities, and best practices. Oversee the design, build, and optimisation of data ingestion, processing, and storage pipelines. Embed strong governance, documentation, and lifecycle management …
up, use your voice to drive change and help transform organisations and problem domains. Role: We are seeking an SDET (Software Development Engineer in Test) experienced in API automation, data validation, and AI-enabled systems. The ideal candidate has hands-on experience building automation frameworks in Python and exposure to validating AI or data-driven systems deployed at … scale. This role involves working closely with engineering and data teams to ensure the reliability and performance of applications that integrate AI/ML components, Agentic AI workflows, and large-scale data processing pipelines. Key Responsibilities: Design and implement API automation frameworks using Python (PyTest, Requests, or Robot Framework). Automate validation for AI-driven systems, including Agentic … AI, LLMs, and data pipelines. Test and validate RESTful APIs, data services, and backend microservices for reliability and performance. Build and maintain automated regression suites for large-scale, data-intensive systems. Integrate automated tests within CI/CD pipelines using Azure DevOps, GitHub Actions, or Jenkins. Work with data teams to validate data transformations, ETL …
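The API validation work described above typically reduces to asserting that every response body matches an agreed contract. A stdlib-only sketch of the kind of check a PyTest + Requests suite would run against each endpoint (the payload shape and field names here are invented):

```python
# Stdlib-only sketch of API response contract validation, the core check
# inside an automation framework like PyTest + Requests. In a real suite
# the payload would come from requests.get(...).json(); here it is inline
# and the schema is hypothetical.

def validate_response(payload, required_fields):
    """Return a list of validation errors for one API response body."""
    errors = []
    for field, expected_type in required_fields.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            errors.append(f"wrong type for {field}")
    return errors

schema = {"id": int, "status": str, "items": list}
good = {"id": 7, "status": "ok", "items": []}
bad = {"id": "7", "status": "ok"}

print(validate_response(good, schema))  # []
print(validate_response(bad, schema))   # ['wrong type for id', 'missing field: items']
```

Returning a list of errors rather than failing on the first one makes regression reports far more useful when an upstream schema drifts.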
using a modern IDE (such as IBM Rational Developer or VS Code) Experience integrating RPG applications with web services (REST/SOAP) Familiarity with APIs, JSON/XML parsing, and data transformation Strong experience with RPGLE (free and fixed format), including modern free-format syntax. Proficiency in IBM i (AS/400, iSeries) environment and tools (e.g., SEU, RDi … ACS, PDM, CL, VS Code). Understanding of DB2 for i (SQL, DDS, physical/logical files). Experience with service programs, modules, procedures, and sub-procedures. Knowledge of data structures, arrays, and error handling in RPGLE. Ability to read, maintain, and refactor legacy RPG (III/IV) code. Familiarity with job control, batch processing, and interactive programs. Version …
Arts London, helping to build the digital foundations that connect our systems and services. You'll be responsible for developing and maintaining integrations using the MuleSoft Anypoint Platform, ensuring data flows securely and efficiently across our systems. Working closely with architects, analysts, and product teams, you'll help deliver high-quality, scalable APIs that support our strategic goals and … a creative, forward-thinking environment. Experience Strong knowledge and experience in integration development, with a specific focus on the MuleSoft Anypoint platform. Strong knowledge of API development, integration patterns, and data transformation using DataWeave. Proficiency in Java and JavaScript; experience with GitHub and CI/CD tools (e.g. Jenkins, Azure DevOps). Familiarity with enterprise systems integration (e.g. Salesforce …
Experience working in low-code application development environments is a plus. 5+ years total work experience. Experience developing APIs. Experience in the MuleSoft platform or any related API development environment. Experience with data transformation using DataWeave, proficiency with Anypoint Studio, and knowledge of Mule connectors and API design. Experience with Jenkins CI/CD.
Manchester, Lancashire, United Kingdom Hybrid/Remote Options
Smart DCC
e.g., GDPR, ISO 27001). Strong leadership, communication, and stakeholder management skills across both technical and non-technical audiences. Experience integrating systems across diverse platforms using APIs, middleware, and data transformation techniques. Demonstrated ability to balance technical, commercial, and operational priorities to drive sound architectural decisions. Understanding of the UK Smart Meter Implementation Programme and the energy sector …
Central London, London, United Kingdom Hybrid/Remote Options
Ashdown Group
Director of Data Science & Engineering (Global Brand). A confidential global leader in consumer intelligence and forecasting is hiring a Director of Data Science & Engineering to lead its AI and data transformation. You'll shape and scale the organisation's data science and engineering strategy, embedding advanced AI, LLMs, and agentic models into real-world forecasting and product … platforms. What you'll do: Lead a high-performing team of data scientists and engineers Drive AI innovation using LLMs, CV, NLP & generative models Oversee architecture of scalable, cloud-native data platforms Partner with product, tech, and commercial leaders to deliver impact Ensure strong data governance, performance, and ethical AI practices What we're looking for: Strong strategic mindset … and stakeholder collaboration skills Excellent management capability Strong commercial ability 7+ years in data/AI roles, with 5+ years in leadership Proficiency with Python, SQL, ML frameworks (TensorFlow, PyTorch) Deep knowledge of LLMs, forecasting, and modern data stacks Background in data insights, media or trend forecasting is a bonus Why apply? You'll join a high …
software products. Strong understanding of Salesforce CRM systems and their applications in business environments. Exceptional leadership, communication, and interpersonal skills. Ability to think strategically and execute tactically. Experience leveraging data and insights to drive decision-making. Passion for customer success and a deep commitment to delivering exceptional customer experiences. 30% Travel Experience will be evaluated based on alignment to …
ETL/database testing and a solid understanding of API testing. The candidate will be responsible for building and maintaining test suites, executing end-to-end functional testing, performing data validation, and managing codebases and test scripts. Key Responsibilities: Automation Testing: Design, develop, and maintain robust test suites using Playwright/Selenium with TypeScript/Java. Implement automation frameworks … platforms. Track and report test coverage, ensuring comprehensive validation from ETL processes to UI. Identify, document, and follow up on defects, ensuring timely resolution. ETL/Database Testing: Conduct data validation and backend testing on SQL Server, MySQL, and NoSQL databases such as MongoDB and Cassandra. Automate data pipeline testing to ensure the integrity of data moving from SQL to NoSQL databases. Collaborate with data engineers to validate ETL processes and data transformations. Version Control: Manage the codebase and test scripts using Git, ensuring proper version control practices are followed. Collaborate with team members to maintain a clean and organized repository. Test Management: Organize and manage test cases in QTest, ensuring alignment with …
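Validating "the integrity of data moving from SQL to NoSQL databases", as this posting puts it, usually means comparing row counts and per-row checksums between source and target. A minimal sketch of that parity check in plain Python (the table rows here are invented; in a real suite the two lists would come from database queries):

```python
# Sketch of a data-pipeline parity check: compare what left the SQL
# source with what landed in the NoSQL target using row counts and
# order-independent per-row checksums. Row shapes are hypothetical.
import hashlib
import json

def row_fingerprint(row):
    """Stable checksum of one row, independent of field order."""
    canonical = json.dumps(row, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()

def compare_datasets(source_rows, target_rows):
    src = {row_fingerprint(r) for r in source_rows}
    tgt = {row_fingerprint(r) for r in target_rows}
    return {"count_match": len(source_rows) == len(target_rows),
            "missing_in_target": len(src - tgt),
            "unexpected_in_target": len(tgt - src)}

source = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
target = [{"name": "a", "id": 1}, {"id": 3, "name": "c"}]
print(compare_datasets(source, target))
# {'count_match': True, 'missing_in_target': 1, 'unexpected_in_target': 1}
```

Hashing a canonicalised JSON form means the check tolerates differing field order between the relational and document representations while still catching dropped or mutated rows.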
Peterborough, Cambridgeshire, England, United Kingdom Hybrid/Remote Options
eTech Partners
Power BI Developer Location: Peterborough (Hybrid – 2 days per week onsite) Overview This role is ideal for someone who enjoys transforming complex data into clear insights, building dynamic dashboards, and supporting data-driven decision-making across the business. You’ll be responsible for developing and maintaining Power BI reports, optimising data models, and working closely with stakeholders … to deliver high-quality BI solutions. Key Responsibilities Design, develop, and maintain Power BI dashboards, reports, and data models Develop DAX measures and use Power Query (M Language) for data transformation. Gather and analyse business requirements to translate them into scalable BI solutions. Ensure data accuracy, consistency, and security within Power BI Service. Manage publishing, permissions, and … refresh schedules in the Power BI environment. Collaborate with departments to provide actionable insights and improve data visibility. Skills & Experience Required Proven experience as a Power BI Developer Proficiency in DAX, Power Query, and data modelling. Strong understanding of SQL and relational databases. Excellent analytical and problem-solving skills. Confident in communicating with both technical and non-technical …
West London, London, United Kingdom Hybrid/Remote Options
McGregor Boyall Associates Limited
virtual assistance. Develop and implement intelligent automation and AI-driven solutions to optimize operational efficiency and reduce manual processes across the organization. Lead the design and orchestration of modern data integration workflows using cloud-native platforms such as Azure Data Factory and Matillion for efficient data movement and transformation. Promote a data-driven culture by empowering … Technical Proficiency: Deep understanding of low-code development and process automation, particularly within the Microsoft Power Platform ecosystem. Skilled in leveraging cloud technologies such as Azure for infrastructure and data solutions, with hands-on experience in Snowflake, Matillion, and QlikSense for modern analytics and data transformation. Leadership Capabilities: Demonstrated success in leading multidisciplinary IT teams, fostering collaboration across …
Seeking a hands-on data platform architect/engineer to reverse-engineer a legacy solution (currently on a VM) and migrate it to Microsoft Fabric. The goal is to stabilize critical data processes and lay the groundwork for a modular, scalable enterprise data platform. A working Microsoft Fabric-based replication of the legacy solution, with supporting documentation … and recommendations for future improvements. Analyze and document the existing legacy system. Rebuild and optimize data pipelines in Microsoft Fabric using PySpark. Conduct forensic analysis of data transformations and dependencies. Collaborate with data architects, engineers, and analysts. Troubleshoot data quality and integration issues. Provide recommendations for future modularization and scalability.
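One concrete output of the "forensic analysis of data transformations and dependencies" step above is a table-level dependency graph, from which a safe rebuild order for the new Fabric pipelines can be derived. A minimal sketch, assuming hypothetical table names (real lineage would come from analysing the legacy code):

```python
# Sketch: given discovered table-level dependencies in a legacy solution,
# derive a safe rebuild order for the replacement pipelines.
# Table names are invented; graphlib is in the stdlib from Python 3.9.
from graphlib import TopologicalSorter

# Each key maps a table to the set of tables it reads from,
# i.e. the tables that must be rebuilt before it.
dependencies = {
    "gold_report": {"silver_sales", "silver_customers"},
    "silver_sales": {"raw_sales"},
    "silver_customers": {"raw_customers"},
    "raw_sales": set(),
    "raw_customers": set(),
}

order = list(TopologicalSorter(dependencies).static_order())
print(order[-1])  # the report table comes last, after everything it reads
```

A topological order like this also makes circular dependencies in the legacy system fail loudly (graphlib raises `CycleError`), which is exactly the kind of issue a forensic migration wants surfaced early.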
virtual assistance. Develop and implement intelligent automation and AI-driven solutions to optimize operational efficiency and reduce manual processes across the organization. Lead the design and orchestration of modern data integration workflows using cloud-native platforms such as Azure Data Factory and Matillion for efficient data movement and transformation. Requirements: Technical Proficiency: Deep understanding of Microsoft Power … Platform ecosystem. Skilled in leveraging cloud technologies such as Azure for infrastructure and data solutions, with hands-on experience in Snowflake, and QlikSense for modern analytics and data transformation. Leadership Capabilities: Demonstrated success in leading multidisciplinary IT teams, fostering collaboration across technical and business functions. Adept at managing relationships with stakeholders at all levels, ensuring alignment between technology …