has built a reputation for delivering high-quality technology services across industries including banking, healthcare, telecommunications, and retail. The leading consultancy firm is seeking a skilled Azure Data Engineer to join on a 6-month contract with strong potential for extension. Want to be part of a team that thrives on excellence? Feel free to reach out … and apply today! Responsibilities: Design and build robust, scalable data pipelines that serve business-critical applications and analytics use cases. Modernise and migrate legacy ETL processes into cloud-native solutions using tools such as Azure Data Factory, Snowflake, and Databricks. Collaborate with cross-functional teams including data architects, analysts, and platform engineers to deliver … production-ready solutions. Proactively identify and resolve data quality issues, building in monitoring, testing, and automation where possible. Uphold engineering best practices through documentation, peer reviews, and continuous improvement. Skills/Must have: Extensive hands-on experience as a Data Engineer working across both on-prem and cloud data platforms. Strong expertise in Azure …
Front Office Data & Analytics Engineer - Crude & Products Apply locations: London, United Kingdom Full time Posted Yesterday Job requisition id R-016371 Main Purpose: We are recruiting a software engineer to work directly with traders and research analysts in our trading teams. This role offers the opportunity to operate in a fast-paced, data-driven trading environment … contributing to real-time data and software solutions. The engineer will collaborate with global trading teams based in Geneva, Houston, and Singapore, and will be part of the global data science and engineering team responsible for market and fundamental data ingestion, management, modeling, and analytics applications. The role involves leading the development of applications … innovative business setting. Knowledge, Skills, Abilities, and Responsibilities: Develop software components, frameworks, and microservices. Create core infrastructure and shared services for DnA applications. Build cloud-native big data platforms and analytics solutions. Maintain time-critical data pipelines (ETL/ELT). Apply SDLC and agile principles in software delivery. Implement domain-driven design. Problem-solve …
Creative Assembly is looking for a Power BI and SQL ninja with deep knowledge of games and our products to come and support our Game Data Analytics & Insights. As part of the Analytics & Insights team, you'll report to the Analytics & Insights Manager and work across all our projects in the studio. It's a critical role that will … tooling and automation Build and maintain dashboards using Power BI and Databricks SQL Produce analysis and insights for our Launch Report and Key Player Metrics processes Work with our data engineers to design, build and maintain reporting data pipelines Produce ad-hoc analysis on everything from player preferences, balancing, sentiment, title ownership and beyond Be able to … independently, managing multiple responsibilities and demands at once. From time to time, you may: Be asked to produce analysis, insights, or viz/dashboarding for our Production Intelligence data The stack you'll work with is: Power BI Databricks Knowledge, Skill and Experience Love video games and are highly motivated by the chance to play a key role …
Eastleigh, Hampshire, United Kingdom Hybrid / WFH Options
Spectrum IT Recruitment
Data Engineer - Energy Consultancy £45,000 DOE | Full-Time, Permanent | Hybrid (Southampton) Join a company driving the UK's journey to net zero. We're hiring a Data Engineer for a purpose-led energy consultancy helping organisations reduce energy consumption, lower carbon emissions, and cut costs through innovative solutions and expert guidance. This is a fantastic … opportunity to join a tech-forward team committed to sustainability, professional development, and a collaborative working culture. The Role As a Data Engineer, you'll work as part of the dedicated Data Team to support internal reporting and external customer platforms. You'll design and maintain scalable pipelines using modern cloud technologies and help build a … robust new data platform for real-time analytics. Key Responsibilities: Own and deliver assigned data projects and BAU tasks Maintain data pipelines using Azure Data Factory (ADF) and Synapse Analytics Transform raw CRM data into structured, curated datasets Develop event-driven data processes for internal and external use Contribute to best-practice …
Experience in Cloud Data Pipelines Building cloud data pipelines involves using Azure-native programming techniques such as PySpark or Scala and Databricks. These pipelines are essential for tasks like sourcing, enriching, and maintaining structured and unstructured data sets for analysis and reporting. They are also crucial for secondary tasks such as flow pipelines, streamlining … systems. Key Skills and Experiences Strong programming experience in Python Proven experience working on the MS Azure cloud platform and its native tech stack in designing and building data & AI solutions Experience with data modeling, ETL processes, and data warehousing Knowledge of big data tools and frameworks such as Spark, Hadoop, or …
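The sourcing-and-enrichment pattern described above is typically written as PySpark DataFrame transformations on Databricks; the pure-Python sketch below shows only the shape of the logic, and the field names, enrichment rule, and sample data are hypothetical:

```python
# Sketch of a source -> enrich -> load pipeline stage (illustrative only;
# a real cloud pipeline would use PySpark DataFrames, not Python lists).

def source(raw_rows):
    # Parse semi-structured input into typed records.
    return [{"id": r["id"], "amount": float(r["amount"])} for r in raw_rows]

def enrich(rows, reference):
    # Join each record against reference data (hypothetical region lookup).
    return [{**r, "region": reference.get(r["id"], "unknown")} for r in rows]

def load(rows):
    # Stand-in for writing to a curated table, keyed by record id.
    return {r["id"]: r for r in rows}

raw = [{"id": "a1", "amount": "10.5"}, {"id": "b2", "amount": "3.0"}]
ref = {"a1": "EMEA"}
curated = load(enrich(source(raw), ref))
```

Keeping each stage a small pure function mirrors how pipeline steps are kept independently testable before being deployed to an orchestrator.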
A leading Financial Services client in the City of London is now seeking an experienced Data Engineering Manager to join on a permanent basis. This role is offering a base of £85,000 + a strong benefits package and flexible working. The ideal Data Engineering Manager will come from a data engineering background and … have strong knowledge of SQL, Snowflake, Microsoft Azure, Azure Data Factory and Azure DevOps. The Engineering Manager will design, improve and maintain robust data pipelines within the data architecture. To be considered for this role you will need the following: Experience designing, improving and maintaining robust data pipelines Strong SQL programming skills. Knowledge … of other programming languages such as Python, C++ and Java beneficial Possesses a strong understanding of Snowflake - beneficial Experience managing small teams of Data Engineers Strong experience working in a cloud environment and knowledge of the following very beneficial: Microsoft Azure, Azure Data Factory and Azure DevOps Experience working in fast-paced Agile environments Creativity and …
with an average profit share of £3,532-£10,861. Softwire is one of the UK's leading digital technology consultancies, offering services in data, AI, cloud, CX, innovation, design, and software engineering. We deliver high-profile, mission-critical, and transformational projects for household names across the public and private sectors. Founded on the idea to be the … Our culture and values emphasize doing jobs properly, making work fulfilling and fun, extending kindness, and trusting each other to reduce bureaucracy. In brief We seek well-rounded data engineers capable of tackling complex data challenges, impressing clients, and impacting team output. We offer a supportive environment with career development, mentorship, and diverse project opportunities. You … will work with talented developers and data engineers, supported by a Technical Lead, on projects such as: Data modeling for Moorfields Eye Hospital's INSIGHT programme Maintaining data pipelines and dashboards on Elexon's Azure platform Implementing AI transcription services for the BBC using GenAI Migrating legacy Excel workflows for a financial services firm …
City of London, London, United Kingdom Hybrid / WFH Options
McCabe & Barton
A leading Financial Services client in London is undergoing a major data transformation and is hiring a Data Engineering Manager on a permanent basis. The role offers a base of £85,000, strong benefits, and flexible hybrid/remote working. We're looking for a hands-on leader from a forward-thinking, modern data environment … with a strong foundation in Data Engineering and experience managing a squad of 4-7 Data Engineers. You'll bring solid SQL skills, cloud expertise (preferably Azure), and familiarity with tools such as Snowflake, Databricks, ADF, Python, and GenAI. You'll manage a squad of engineers and play a key role in delivering scalable, modern data solutions … as part of a group-wide transformation. Requirements: Proven experience in fast-paced environments working with modern data architectures and tooling; Financial Services experience is a plus. Experience managing Data Engineering teams Strong SQL and cloud skills (Azure preferred) Familiar with modern data tools (e.g. Snowflake, Databricks, ADF, Python) Track record of building/ …
workers. Champion platform security: secrets management, zero-trust networking, least-privilege IAM, image provenance, and compliance-ready audit trails. Collaborate closely with AI & product engineers, shaping runtime environments, data pipelines, and workflow orchestration for large-scale model serving. Continuously improve reliability and cost efficiency through chaos testing, capacity planning, performance tuning, and proactive incident response. Requirements You May …
Comfortable providing technical guidance and helping resolve ambiguity on large-scale work. Preferred Qualifications Experience working with Gemini models and GCP-native AI tooling. Familiarity with real-time data pipelines or streaming architectures. Prior work on context-aware or intelligent product experiences. Experience with design systems or component libraries in a production setting. Passion for continuously learning and …
A popular British brand is seeking an experienced Data Engineer to play a pivotal role in a major data modernization initiative. With offices just outside of Bristol, this role will require being on-site 3 days per week to collaborate with your team and business stakeholders. As part of a transformative project leveraging Microsoft Fabric … you'll lead the design and implementation of a scalable, metadata-driven Medallion architecture. You'll work closely with cross-functional teams to deliver robust, trusted, and timely data solutions that power advanced analytics and business intelligence. What You'll Do: Architect and build scalable data pipelines using Microsoft Fabric, PySpark, and T-SQL Lead the … development of Star Schema Lakehouse tables to support BI and self-service analytics Collaborate with stakeholders to translate business needs into data models and solutions Mentor engineers and act as a technical leader within the team Ensure data integrity, compliance, and performance across the platform What You'll Bring: Expertise in Microsoft Fabric, Azure, PySpark, SparkSQL …
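The Medallion architecture mentioned above moves data through bronze (raw), silver (cleaned and conformed), and gold (business-ready) layers. A minimal pure-Python sketch of that layering follows; the field names and rules are hypothetical, and a real Fabric implementation would express each layer as PySpark transformations over Lakehouse tables:

```python
# Medallion layering sketch: bronze (raw) -> silver (clean) -> gold (aggregate).

def to_silver(bronze_rows):
    # Clean and standardise: normalise types, drop malformed records.
    silver = []
    for r in bronze_rows:
        try:
            silver.append({"sku": r["sku"].strip().upper(),
                           "qty": int(r["qty"])})
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine these rows
    return silver

def to_gold(silver_rows):
    # Aggregate into a business-ready fact: total quantity per SKU.
    totals = {}
    for r in silver_rows:
        totals[r["sku"]] = totals.get(r["sku"], 0) + r["qty"]
    return totals

bronze = [{"sku": " ab1 ", "qty": "2"}, {"sku": "AB1", "qty": "3"},
          {"sku": "xy9", "qty": "bad"}]
gold = to_gold(to_silver(bronze))
```

The design choice behind the layering is that raw data is never mutated: each layer is rebuilt from the one below it, so cleaning rules can change without re-ingesting sources.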
Cheltenham, Gloucestershire, England, United Kingdom
Searchability NS&D
CONTRACT PALANTIR DATA ENGINEER - DV CLEARED NEW OUTSIDE IR35 CONTRACT OPPORTUNITY AVAILABLE WITHIN A HIGHLY SECURE DELIVERY PROGRAMME FOR A DV CLEARED PALANTIR FOUNDRY DATA ENGINEER. Contract opportunity for a Palantir Data Engineer to support cutting-edge National Security projects £600 - £750 per day (Outside IR35) DV clearance is essential Based full-time onsite … using Foundry tools like Workshop Strong communication skills to collaborate with technical and military/non-technical stakeholders Experience integrating data from multiple sources into Foundry via Pipeline Builder and custom code Comfortable working on-site within a secure environment and at pace with end-users TO BE CONSIDERED… Please either apply by clicking online or … consent for us to process and submit your application to our client in conjunction with this vacancy only. KEY SKILLS: PALANTIR/FOUNDRY/DATA ENGINEER/PIPELINE BUILDER/ONTOLOGY/PYTHON/TYPESCRIPT/FULL STACK/DV CLEARED/LONDON/CONTRACT …
grow. We are seeking a highly analytical and technically skilled Strategic Finance Associate to join our dynamic team. This role is ideal for someone who enjoys working with data, building robust financial models, automating workflows, and developing dashboards that empower operational and strategic decision-making. What you'll be doing Lead and support the development, enhancement, and maintenance … of financial models integrating live data sources for scenario planning, budgeting, and forecasting. Conduct in-depth data analysis to uncover business trends, identify performance drivers, and generate actionable insights that influence strategic priorities. Build, automate, and maintain dashboards in BI tools to monitor key financial and operational metrics. Partner cross-functionally with Data Analytics … and Revenue Operations teams to ensure high-quality data pipelines, enhance data accuracy, and develop self-serve reporting capabilities. Use SQL and scripting tools to extract and manipulate large datasets in support of ad hoc and recurring analyses. Collaborate on headcount planning, incentive structure evaluation, and GTM strategy assessments using quantitative frameworks. Define and track key …
apply End Date: September 5, 2025 (30+ days left to apply) job requisition id R Role Summary: This role demands skills and perseverance to modernise the tech stacks and data pipelines of well-established StarMine Quant models in Python to provided specifications, and to build brand-new quantitative/predictive models with the help of machine learning algorithms, working in collaboration with our data science and Research teams. Build data pipelines to break down complicated calculations into stages. Main Responsibilities/Accountabilities: We engage with product owners, architects and other specialists to craft and build sophisticated, world-class analytics models to address our customer needs. Building and maintaining efficient, reusable, reliable and secure enterprise software, adhering … like Terraform. Proficient with Git. Proficient with CI/CD automation using GitLab or similar. Azure Fabric experience. Vision to apply machine learning and AI algorithms to the data. Education/Certifications: A relevant degree or equivalent experience is desirable, but the right approach to work is just as important. AWS/Azure certification a nice-to-have.
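Breaking a complicated calculation into pipeline stages, as the role describes, can be sketched as composing small named steps whose intermediate results remain inspectable. The stages and numbers below are purely illustrative, not StarMine's actual model logic:

```python
# Sketch: a model calculation decomposed into named, inspectable stages.

def run_pipeline(value, stages):
    # Apply each (name, fn) stage in order, recording intermediates so
    # every step can be validated and debugged independently.
    trace = []
    for name, fn in stages:
        value = fn(value)
        trace.append((name, value))
    return value, trace

stages = [
    # Clamp raw inputs to [0, 1] to limit outlier influence.
    ("winsorise", lambda xs: [min(max(x, 0.0), 1.0) for x in xs]),
    # Collapse the cleaned inputs to a single mean value.
    ("mean", lambda xs: sum(xs) / len(xs)),
    # Rescale the mean onto a 0-100 score.
    ("scale_0_100", lambda m: round(m * 100, 1)),
]
score, trace = run_pipeline([0.2, 1.4, -0.1, 0.5], stages)
```

Staging the computation this way also makes it straightforward to unit-test each step against the model specification before wiring the whole pipeline together.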
Didcot, Oxfordshire, South East, United Kingdom Hybrid / WFH Options
CV Screen Ltd
Job Title: Commercial Data Analyst - SQL Location: Didcot, Oxfordshire (Hybrid - 3 days in office, 2 days remote) Salary: £55,000 + Excellent Benefits Introduction Are you passionate about turning complex data into actionable insights? We have a fantastic opportunity for a skilled Commercial Data Analyst to join an exciting company in the retail sector … in the office, with 2 days from home. The company, a leader in sustainable retail, has been in business for over a decade and is known for leveraging data to drive customer experiences and business growth. Duties & Responsibilities Collaborate with data and tech teams to manage data pipelines across eCommerce platforms using GTM/… supply/demand across regions. Analyze customer acquisition channels, calculating ROI and attribution for various marketing efforts. What Experience is Required? At least 5 years of experience in data analysis with a strong focus on data extraction, analysis, and manipulation. Excellent experience in margins, revenue and pricing. Expertise in GTM/GA4, BigQuery, SQL, and Excel …
The financial services industry is being revolutionised by the power of AI, creating new opportunities and transforming the way businesses operate. Deloitte's AI & Data Financial Services team is at the forefront of this change, helping our clients harness the potential of AI to drive innovation and achieve their strategic objectives. As an AI Solution Designer, you'll … your opportunity Key Responsibilities: Lead client engagements to identify and define high-impact AI use cases across various business functions. Conduct feasibility assessments, develop solution architectures, and define data requirements for AI initiatives. Design end-to-end AI solutions, outlining data pipelines, model architectures, and deployment strategies. Collaborate closely with data scientists, engineers, and … Strong understanding of AI concepts, machine learning algorithms, and deep learning architectures. Familiarity with various AI solution design patterns and best practices for different business applications. Experience with data visualization tools and techniques to communicate insights and solution designs effectively. Knowledge of cloud computing platforms (e.g., AWS, Azure, GCP) and their AI/ML services. Basic understanding of …
Telford, Shropshire, United Kingdom Hybrid / WFH Options
Experis - ManpowerGroup
a new scrum team within Minerva Platform to develop and deliver the Ingestion and Risking within the SAS Platform including IDP. Key Responsibilities: Design, development, and deployment of data integration and transformation solutions using Pentaho, Denodo, Talend, and SAS. Architect and implement scalable data pipelines and services that support business intelligence and analytics platforms. Collaborate with … cross-functional teams to gather requirements, define technical specifications, and deliver robust data solutions. Champion Agile and Scrum methodologies, ensuring timely delivery of sprints and continuous improvement. Drive DevOps practices for CI/CD, automated testing, and deployment of data services. Mentor and guide junior engineers, fostering a culture of technical excellence and innovation. Ensure data … quality, governance, and security standards are upheld across all solutions. Troubleshoot and resolve complex data issues and performance bottlenecks. Key Skills: SAS 9.4 (DI), SAS Viya 3.x (SAS Studio, VA, VI). Platform LSF, Jira, Platform Support. GIT. Strong expertise in ETL tools: Pentaho, Talend. Experience with data virtualisation using Denodo. Proficiency in SAS …