My client, a FTSE 250 organisation, requires a Snowflake Data Engineer to provide data engineering support for the Data Science and Consumer Insight team. The contractor will be responsible for maintaining and optimising the existing Snowflake Lab environment, which ingests and transforms data from SAP BICC and other sources to support reporting and analytics.

Key deliverables:
- Managing and improving the existing data ingestion and transformation pipelines within Snowflake.
- Enhancing automation, reliability, and monitoring of daily update processes.
- Supporting and optimising Power BI data models that depend on the Snowflake Lab.
- Collaborating with the central data team to integrate newly available enterprise datasets into the Lab, ensuring efficient and controlled adoption.
- Supporting ad hoc data ingestion and preparation to meet the needs of analysts and data scientists, including custom datasets and one-off data requests.
- Contributing to the development of data handling procedures and best practices, helping to ensure long-term maintainability and scalability of the Lab environment.
- Ownership and day-to-day management of the Commercial Insight's …
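Improving the monitoring of daily update processes usually starts with a simple freshness check on each target table. The sketch below is illustrative only: the table names, timestamps, and 26-hour threshold are hypothetical, not taken from the posting, and a real implementation would read load timestamps from Snowflake metadata rather than a hard-coded dict.

```python
from datetime import datetime, timedelta

def find_stale_tables(last_loaded: dict, now: datetime, max_age_hours: int = 26) -> list:
    """Return names of tables whose last successful load is older than the threshold.

    A 26-hour threshold gives a daily pipeline a small grace window.
    """
    cutoff = now - timedelta(hours=max_age_hours)
    return sorted(name for name, ts in last_loaded.items() if ts < cutoff)

# Hypothetical example: one table loaded yesterday morning, one two days ago.
now = datetime(2024, 5, 2, 9, 0)
loads = {
    "SAP_SALES": datetime(2024, 5, 1, 7, 30),   # 25.5 hours old -> fresh
    "SAP_STOCK": datetime(2024, 4, 30, 7, 30),  # 49.5 hours old -> stale
}
stale = find_stale_tables(loads, now)
```

A check like this can feed an alerting step so that missed daily loads are surfaced before analysts notice stale dashboards.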
The Opportunity: We are seeking an experienced Analytics Engineer to join a leading marketing and brand analytics team. This role focuses on managing data ingestion and transformation projects, optimising dbt models, and supporting Looker dashboards. You will work closely with marketing and brand stakeholders to ensure data processes are accurate, documented, and maintainable.

Key Responsibilities:
- Own and manage brand and media data ingestion and transformation processes.
- Review and optimise dbt models (15-20 models) for completeness, accuracy, and automation.
- Maintain and document Snowflake assets, ingested tables, and analytics workflows.
- Support governance, process cleanup, and monthly reporting.
- Collaborate with marketing stakeholders, legal, and technical teams to ensure accurate data flow and reporting.
- Assist in delivering Looker dashboards and front-end experiences.

Technical Skills & Experience:
- Strong experience with dbt, Looker, and Snowflake (or other cloud data platforms).
- Knowledge of data ingestion and transformation processes (stakeholder-facing, not necessarily coding).
- Experience with marketing analytics and brand/media data preferred.
- Strong documentation, process governance, and problem-solving skills.

Soft Skills:
- Ability …
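Reviewing 15-20 dbt models for completeness typically means auditing their documentation as well as their logic. A minimal sketch of such an audit, assuming the models' `schema.yml` metadata has already been parsed into plain dicts (the model and column names here are hypothetical):

```python
def audit_model_docs(models: list[dict]) -> dict[str, list[str]]:
    """Return {model_name: [issues]} for models with missing documentation."""
    issues = {}
    for model in models:
        problems = []
        if not model.get("description"):
            problems.append("missing model description")
        undocumented = [c["name"] for c in model.get("columns", []) if not c.get("description")]
        if undocumented:
            problems.append("undocumented columns: " + ", ".join(undocumented))
        if problems:
            issues[model["name"]] = problems
    return issues

# Hypothetical parsed schema.yml entries:
models = [
    {"name": "stg_media_spend", "description": "Daily media spend by channel",
     "columns": [{"name": "spend", "description": "Spend in GBP"}]},
    {"name": "fct_brand_awareness", "description": "",
     "columns": [{"name": "score", "description": ""}]},
]
report = audit_model_docs(models)
```

In practice this kind of check can run in CI so that undocumented models are caught at review time rather than during a governance cleanup.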
Contract Opportunity: Lead Azure Data Engineer (Remote, £500/day, Outside IR35). We're hiring a Lead Azure Data Engineer to join our team on a hybrid contract based in London, supporting key finance stakeholders and transforming our data platform. This role offers the chance to shape the future of financial reporting through cutting-edge cloud engineering and data architecture.

- Location: Central London (UK-based candidates only)
- Rate: £500/day
- IR35 Status: Outside IR35
- Start Date: ASAP; interviews next week
- Duration: 6 months with potential for extension

The Role: You'll lead the design and delivery of scalable data products that support financial analysis and decision-making. Working closely with BI and Analytics teams, you'll help evolve our data warehouse and implement best-in-class engineering practices across Azure.

Key Responsibilities:
- Build and enhance ETL/ELT pipelines in Azure Databricks
- Develop facts and dimensions for financial reporting
- Collaborate with cross-functional teams to deliver robust data solutions
- Optimize data workflows for performance and cost-efficiency
- Implement governance and …
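Developing facts and dimensions comes down to splitting source rows into a dimension table with surrogate keys and a fact table that references them. The sketch below shows that split in plain Python; it is a toy illustration of star-schema modelling (the account data is invented), not the client's Databricks code, which would use PySpark or SQL over real ledgers.

```python
def build_dimension(rows, natural_key, attrs):
    """Assign incrementing surrogate keys to distinct dimension members."""
    dim, key_map = [], {}
    for row in rows:
        nk = row[natural_key]
        if nk not in key_map:
            sk = len(key_map) + 1
            key_map[nk] = sk
            dim.append({"sk": sk, natural_key: nk, **{a: row[a] for a in attrs}})
    return dim, key_map

def build_fact(rows, natural_key, key_map, measures):
    """Replace the natural key with its surrogate key, keeping only the measures."""
    return [
        {"dim_sk": key_map[r[natural_key]], **{m: r[m] for m in measures}}
        for r in rows
    ]

# Hypothetical GL extract: two postings to account 4000, one to 5000.
rows = [
    {"account": "4000", "account_name": "Revenue", "amount": 120.0},
    {"account": "5000", "account_name": "COGS", "amount": 80.0},
    {"account": "4000", "account_name": "Revenue", "amount": 30.0},
]
dim, key_map = build_dimension(rows, "account", ["account_name"])
facts = build_fact(rows, "account", key_map, ["amount"])
```

The design choice worth noting is that surrogate keys are assigned once per distinct natural key, so repeated postings to the same account share one dimension row.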
London, South East, England, United Kingdom Hybrid / WFH Options
Searchability NS&D
Palantir Foundry Data Engineer - DV Cleared. NEW CONTRACT OPPORTUNITY FOR A PALANTIR FOUNDRY DATA ENGINEER TO WORK ON A NATIONAL SECURITY PROJECT IN LONDON WITH DV CLEARANCE.

- Contract role in London for a Palantir Foundry Data Engineer
- Must hold DV Security Clearance
- Central London based
- Daily rate up to £800
- Hybrid position
- To apply, email: or call …

Who we are: We are seeking an experienced Palantir Foundry Data Engineer with current DV clearance to join a high-profile programme. This is a contract position offering hybrid working and a daily rate of up to £800. In this role, you will be responsible for designing, developing, and optimising data pipelines and integrations within Palantir Foundry, ensuring data is efficiently processed, transformed, and made available for analysis and operational use. You will collaborate closely with analysts, data scientists, and business stakeholders to deliver robust, secure, and scalable data solutions.

What we're looking for... Key Responsibilities:
- Develop and maintain data pipelines and workflows in Palantir Foundry
- Integrate diverse data sources, ensuring …
Irlam, Greater Manchester, United Kingdom Hybrid / WFH Options
First Recruitment Group
Our top telecoms client has a requirement for a Data Engineer, who will be required to work on a staff or contract basis in Irlam (Manchester). Role Purpose: The BI Data Engineer will be responsible for leading and developing reporting views to ensure data-driven insights are underpinned by a validated dataset, aligning to the data definitions set out by the business. In addition, this role will focus on data engineering: building and managing the infrastructure that allows Freedom Fibre to collect, store, and analyse data. The role will involve designing, building, and maintaining databases to ensure data is readily available, reliable, and efficient for various purposes, including data science and business intelligence. This will also include developing and maintaining API integrations with systems to extract and load data into the database. The role will also be responsible for designing, developing, and maintaining interactive Power BI dashboards using advanced SQL and …
Elasticsearch, Logstash, Kibana (ELK) Details: The candidate should have 8 years of relevant experience with ELK and should be able to develop and build the required pipelines for data ingestion using the Logstash component. They must have hands-on implementation experience with ELK; candidates who have only monitored an existing ELK setup will not be suitable. The candidate should also have exposure to cloud and Kafka, as well as exposure to the security domain, and a strong understanding of the Elastic stack: Elasticsearch, Kibana, Logstash, Fleet, and other integrations. Data engineering skills are required to design and develop pipelines that ingest data into Elastic Cloud, together with domain awareness. The candidate will be responsible for building all the required pipelines for data ingestion …
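The core of a Logstash ingestion pipeline is a filter stage that parses raw log lines into structured fields (in Logstash this is typically a `grok` filter). The Python sketch below mimics that parsing step with a named-group regex so the idea can be shown without a running ELK stack; the log format and field names are hypothetical, and a real pipeline would define this pattern in a Logstash config and ship the result to Elasticsearch.

```python
import re
from typing import Optional

# A grok-style pattern for a simple "timestamp host LEVEL message" line.
LOG_PATTERN = re.compile(
    r"(?P<timestamp>\S+) (?P<host>\S+) (?P<level>[A-Z]+) (?P<message>.*)"
)

def parse_line(line: str) -> Optional[dict]:
    """Parse one raw log line into structured fields, or None if it doesn't match.

    In Logstash, non-matching events would instead be tagged _grokparsefailure
    and routed to a dead-letter path for inspection.
    """
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

event = parse_line("2024-05-01T12:00:00Z web01 ERROR disk full")
```

Keeping unparseable lines (rather than dropping them) is the usual design choice, since silent data loss in an ingestion pipeline is hard to detect later.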
Migration Specialist to lead and support the migration of observability workloads from Splunk to Elasticsearch (ELK Stack). The ideal candidate will bring hands-on expertise in Splunk architecture, data ingestion, alerting, and dashboarding, along with experience migrating workloads to Elasticsearch. In addition to migration duties, the candidate will maintain and enhance existing Splunk infrastructure, provide incident support … and problem-solving skills.

Key Responsibilities - Migration:
- Develop and implement a comprehensive migration strategy from Splunk to Elasticsearch (ELK Stack).
- Assess existing Splunk configurations (dashboards, alerts, saved searches, data models) and recreate them in Kibana.
- Collaborate with Elastic teams to configure alerting and monitoring using Kibana, Elasticsearch Watcher, or third-party tools.
- Ensure migration plans include validation, rollback … and Support:
- Identify and resolve issues in Splunk and ELK environments.
- Assist teams with Splunk-related queries and optimization efforts.

Skills and Qualifications - Essential:
- Proven expertise with Splunk architecture, data ingestion, dashboarding, alerting, and administration.
- Experience migrating Splunk workloads to Elasticsearch (ELK Stack).
- Solid understanding of Kibana, Elasticsearch Watcher, and observability tooling.
- Proficiency in Linux/Unix …
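Recreating Splunk saved searches in Kibana means translating SPL search terms into Elasticsearch Query DSL. The sketch below handles only the simplest case, bare `field=value` terms mapped into a `bool`/`match` query; it is a toy translator to illustrate the mapping, not a general SPL parser (real SPL pipelines with `stats`, `eval`, and so on need case-by-case redesign as aggregations).

```python
def spl_to_es_query(spl: str) -> dict:
    """Translate a flat 'field=value field2=value2' Splunk search into an
    Elasticsearch bool query. Anything without '=' is ignored in this sketch."""
    musts = []
    for token in spl.split():
        if "=" in token:
            field, value = token.split("=", 1)
            musts.append({"match": {field: value}})
    return {"query": {"bool": {"must": musts}}}

# Hypothetical saved search: errors from one web host.
es_query = spl_to_es_query("status=500 host=web01")
```

The resulting dict is what would be posted to the `_search` endpoint or embedded in a Kibana saved query; an automated first pass like this leaves only the complex searches for manual migration.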
We are seeking a highly skilled Snowflake Developer to support the data infrastructure of a leading Asset Manager. The role will involve designing, developing, and optimising Snowflake-based solutions, integrating data from various sources, and supporting key business functions such as portfolio management, risk, operations, and compliance. INSIDE IR35. HYBRID WORKING.

Key Responsibilities:
- Design and implement ETL/ELT pipelines and data models using Snowflake and Azure Data Factory (ADF).
- Integrate and transform data from multiple sources, including structured and semi-structured formats, into Azure SQL and Snowflake.
- Build and maintain Azure Functions (optional but preferred) to support automation and data transformation processes.
- Collaborate with data consumers to ensure seamless integration of data for reporting and analytics via Power BI or QlikView.
- Optimise query performance and storage efficiency using Snowflake features such as clustering, partitioning, and resource monitoring.
- Support integration with investment systems, including Aladdin, where applicable, and help facilitate consistent and reliable data across platforms.

Experience:
- Proven experience developing Snowflake data platforms in enterprise environments.
- Strong proficiency with …
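A common building block in Snowflake ELT pipelines is the upsert, expressed as a `MERGE` statement from a staging table into a target. The sketch below generates such a statement from column lists; the table and column names are hypothetical, and generated SQL like this would normally be parameterised and executed via the Snowflake connector rather than string-built in production.

```python
def build_merge_sql(target: str, source: str, key_cols: list, update_cols: list) -> str:
    """Build a Snowflake-style MERGE (upsert) from a staging table into a target."""
    on = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    sets = ", ".join(f"t.{c} = s.{c}" for c in update_cols)
    ins_cols = ", ".join(key_cols + update_cols)
    ins_vals = ", ".join(f"s.{c}" for c in key_cols + update_cols)
    return (
        f"MERGE INTO {target} t USING {source} s ON {on} "
        f"WHEN MATCHED THEN UPDATE SET {sets} "
        f"WHEN NOT MATCHED THEN INSERT ({ins_cols}) VALUES ({ins_vals})"
    )

# Hypothetical dimension load from a staging table:
sql = build_merge_sql("DIM_ACCOUNT", "STG_ACCOUNT", ["ACCOUNT_ID"], ["NAME"])
```

The appeal of `MERGE` over delete-and-reload is that the target stays queryable and only changed rows are touched, which also keeps clustering and storage costs down.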
seeking an experienced AI Solution Architect to design and deliver an innovative AI-driven architecture that leverages Azure OpenAI to capture, analyse, and report on stakeholder sentiment from multiple data sources. This is an opportunity to lead a high-impact initiative in a collaborative, fast-paced environment.

Key Responsibilities:
- Solution Design & Architecture: Create a scalable AI-driven solution that integrates diverse data sources including emails, meeting recordings, consultation responses, Salesforce data, BI scores, CSAT scores, and media coverage.
- Data Capture & Storage: Implement automated and manual data ingestion processes, ensuring compliance with legal and regulatory requirements, and enabling full audit traceability.
- Data Analysis & Reporting: Deploy AI-powered sentiment and thematic analysis tools, and build … driven solutions using Azure OpenAI (or similar platforms).

Also required: a strong track record integrating Salesforce and BI tools; familiarity with media monitoring and sentiment analysis technologies; expertise in developing dashboards, data visualisations, and insight reporting tools; and strong communication and stakeholder engagement skills. This role offers the chance to shape and lead an AI solution from concept to delivery, working on …
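Once each item (email, consultation response, media mention) has been scored for sentiment, the reporting layer reduces those scores to per-source summaries. The sketch below shows that aggregation step only; it assumes the scores (on a -1 to 1 scale) have already been produced upstream, for example by an Azure OpenAI classification prompt, and all record values here are invented.

```python
from statistics import mean

def sentiment_summary(records: list) -> dict:
    """Aggregate per-item sentiment scores (-1..1) into a per-source report."""
    by_source = {}
    for rec in records:
        by_source.setdefault(rec["source"], []).append(rec["score"])
    return {
        src: {"n": len(scores), "mean": round(mean(scores), 3)}
        for src, scores in by_source.items()
    }

# Hypothetical pre-scored items from two ingestion sources:
records = [
    {"source": "email", "score": 0.5},
    {"source": "email", "score": -0.1},
    {"source": "media", "score": 0.8},
]
summary = sentiment_summary(records)
```

Keeping the raw per-item scores alongside the aggregates is what enables the full audit traceability the posting asks for: any reported mean can be traced back to the individual items behind it.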
travel to London or Corsham. Duration: 6-month initial contract. SC clearance required.

About the Role: You are a systems thinker with deep experience in backend engineering and distributed data infrastructure. You thrive in high-stakes environments, know how to ship reliable code fast, and are pragmatic about trade-offs. You understand the unique constraints of experimental, edge … This role focuses on building and adapting backend services to support experimental AI systems deployed in sensitive Defence contexts. You will help create and manage robust, portable pipelines for data ingestion, orchestration, model interaction, and querying, all within a classified, compute-constrained environment. While our SaaS platform handles production-scale workflows, this project demands thoughtful downscaling and adaptation, not replication. You will work closely with a cross-functional team of frontend engineers, data scientists, platform engineers, and external partners.

What You'll Do:
- Architect and implement backend services that support secure, air-gapped AI deployments, with a focus on NLP-based tooling
- Develop pipelines for transcription ingestion and real-time analytical insight generation
- Support graph and RAG …
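The retrieval step of a RAG pipeline ranks ingested documents against a query before any model sees them. In an air-gapped, compute-constrained environment, even a lexical term-overlap ranker can serve as a baseline before heavier embedding models are introduced. The toy sketch below shows that baseline; the documents are invented, and production systems would use a proper index (e.g. BM25 or vector search) rather than set intersection.

```python
def retrieve(query: str, docs: list, k: int = 2) -> list:
    """Rank documents by term overlap with the query; a toy RAG retrieval step.

    Scores by the number of shared lowercase whitespace tokens and keeps the
    top-k documents with a non-zero score, preserving input order on ties.
    """
    q_terms = set(query.lower().split())
    scored = [(len(q_terms & set(d.lower().split())), i) for i, d in enumerate(docs)]
    scored.sort(key=lambda t: (-t[0], t[1]))
    return [docs[i] for score, i in scored[:k] if score > 0]

# Hypothetical ingested transcripts and reports:
docs = [
    "tasking report alpha",
    "daily transcription of patrol radio traffic",
    "weather summary",
]
hits = retrieve("patrol radio transcription", docs)
```

The retrieved passages would then be packed into the model prompt; keeping retrieval lexical first makes the pipeline fully inspectable, which matters in classified settings.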
experiencing persistent performance challenges, including slow connection times, extended log-on durations, and overall poor user experience and system responsiveness. They have deployed uberAgent; the tool is actively collecting data, but they do not have the expertise to properly interpret the insights and telemetry, diagnose root causes, and formulate and implement remediation strategies. Additionally, there is a potential performance bottleneck related to data ingestion into Splunk, which may be impacting the effectiveness of uberAgent or the overall monitoring strategy.

Key Responsibilities:
- Deep-dive analysis of uberAgent telemetry
- Identification of key performance bottlenecks and root causes
- Evaluation of Splunk ingestion challenges and recommendations for optimisation
- Delivery of a detailed findings report and proposed next steps for remediation
- Collaborate with …
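Diagnosing extended log-on durations from telemetry usually starts with percentile analysis rather than averages, since a healthy median can hide a slow tail of affected sessions. The sketch below computes nearest-rank percentiles over a list of logon durations; the sample values are invented, and in practice the durations would come from uberAgent data queried out of Splunk.

```python
import math

def logon_percentiles(durations_s: list, ps=(50, 90, 95)) -> dict:
    """Nearest-rank percentiles of logon durations (seconds).

    For percentile p over n sorted samples, the nearest-rank index
    is ceil(p/100 * n) - 1.
    """
    vals = sorted(durations_s)
    if not vals:
        raise ValueError("no samples")
    return {p: vals[math.ceil(p / 100 * len(vals)) - 1] for p in ps}

# Hypothetical sample: ten logons between 10s and 100s.
stats = logon_percentiles([10, 20, 30, 40, 50, 60, 70, 80, 90, 100])
```

Comparing p50 against p95 per host or per user group is what separates a fleet-wide ingestion or profile problem from a handful of pathological machines.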