Requirements:
- Architect and build data pipelines on GCP integrating structured, semi-structured, and unstructured data sources.
- Design, implement, and optimize graph database solutions using Neo4j, Cypher queries, and GCP integrations.
- Develop ETL/ELT workflows using Dataflow, Pub/Sub, BigQuery, and Cloud Storage.
- Design graph models for real-world applications such as fraud detection, network analysis, and …
- Optimize performance of graph queries and design for scalability.
- Support ingestion of large-scale datasets using Apache Beam, Spark, or Kafka into GCP environments.
- Implement metadata management, security, and data governance using Data Catalog and IAM.
- Work across functional teams and clients in diverse EMEA time zones and project settings.

Thanks & Regards,
Pooja K | Technical Recruiter UK/
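The fraud-detection bullet above refers to a common graph-modelling pattern: finding accounts linked through a shared attribute such as a device. The sketch below is illustrative only and is not taken from the posting; the labels (`Account`, `Device`, `USES`) and the helper name are assumptions. In Neo4j the pattern would be expressed in Cypher; here the equivalent traversal is shown in plain Python so it runs with no database.

```python
# In Neo4j/Cypher, the shared-device fraud pattern might read (illustrative
# labels, not from the posting):
#   MATCH (a:Account)-[:USES]->(d:Device)<-[:USES]-(b:Account)
#   WHERE a <> b
#   RETURN a, b, d
# Below is a minimal, library-free Python equivalent over an edge list.

from collections import defaultdict

def shared_device_pairs(uses):
    """Given (account, device) edges, return triples (a, b, device)
    for distinct account pairs that use the same device."""
    by_device = defaultdict(set)
    for account, device in uses:
        by_device[device].add(account)
    pairs = set()
    for device, accounts in by_device.items():
        for a in accounts:
            for b in accounts:
                if a < b:  # emit each unordered pair once
                    pairs.add((a, b, device))
    return pairs

edges = [("alice", "d1"), ("bob", "d1"), ("carol", "d2")]
print(shared_device_pairs(edges))  # alice and bob share device d1
```

At Neo4j scale the database's index-backed pattern matching replaces this in-memory grouping; the toy version only shows the shape of the query.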
Data Engineer with C#, .NET, ASP.NET, SQL Server, SSIS and SSRS. Our client is a bank based in Central London looking to recruit a Data Engineer with at least 7 years' experience and the ability to work with C#, .NET, and SQL Server with SSIS. You must have solid expertise of … in-house systems, i.e. CORE, SharePoint interface, Equation, Kondor, Eximbills, end-of-day cycle. Support: support budgeting and financial planning processes for the Finance department, including loading and refreshing of data based on requirements. Understand and conduct the front-end functionality to amend and change hierarchical structures within the environment. Build data flows within the SQL environment in SSIS … processes, thereby eliminating the need for duplicate entry. Utilise web services to integrate cutting-edge technology into legacy systems such as Equation. Integrate with all the systems using data abstraction and connectivity layers, i.e. ODBC and ADO.NET. Duties: maintain knowledge of all applicable regulatory requirements, including the Bank's Risk and Compliance policies and procedures, and adhere to these
Job Reference: 322235
Position: Senior Data Engineer
Salary: Competitive salary plus car allowance, healthcare, annual bonus potential
Type: Permanent, full time
Location: Royal Mail Farringdon (3 days a week in the London office)
Benefits: Family-friendly support: enhanced maternity pay, paternity leave, adoption leave, and shared parental leave. Discounts and offers: more than 800 offers to help you save … more than our competitors put together. We have ambitious plans to grow market share both at home and globally while transforming our UK operation to increase efficiency and profit. Data and technology are pivotal to our success. We are seeking a Senior Data Engineer to play a key role in driving Royal Mail's data strategy … collaborating closely with our Data Analytics and Data Science teams. This role will ensure the highest engineering standards and best practices for product delivery. You will help implement innovative data use cases that significantly impact the business, with high visibility, leadership responsibility, and a direct role in delivering both capital and business-as-usual (BAU) projects on
Ready to shape the future of data at one of the fastest-growing companies in iGaming? This business is using data as more than just a reporting tool: it's the engine driving innovation and competitive edge across some of the industry's most successful brands. They're now looking for a data professional ready to help build and scale cutting-edge capabilities in the Azure ecosystem. In this role, you'll work closely with the Analytics & Data team to deliver impactful insights and power smarter decisions across the business. You'll be hands-on with modern Azure data tools and play a key role in maintaining and developing .NET-based solutions, so experience with … s vibrant Camden office four days a week, surrounded by a fast-paced, collaborative team working at the forefront of marketing analytics in iGaming. If you're passionate about data, thrive on variety, and want to make an impact where data truly matters, apply below
Our OEM client based in Whitley, Coventry is searching for a Powertrain Charging Test Data Engineer to join their team, inside IR35. This is a 12-month contract position initially, until 31st March 2026, with the potential for further extensions. Umbrella pay rate: £27.03 per hour. This role sits within the Powertrain Charging Systems Team, in the Charging … Validation & Verification organisational unit. The role will focus on delivering data from tests in support of design-release activity or technology advancement of electric vehicle charging infrastructure compatibility. The main activity for this role, as part of a team, would be to prepare the test facility or test parts with instrumentation that enables measurement of a wide range of charging system signals and variables, both on board and external to the vehicle. In addition, this role will support the collection of data in various environments such as test facilities, public roads, and proving grounds. The key outcome is to help create a modern, seamless, and stress-free charging experience for customers. Skills required: awareness of existing charging standards (e.g.
3RD FLOOR, 65-68, LEADENHALL STREET, LONDON, England
SOLIRIUS LTD
A fantastic opportunity for someone passionate about technology to gain practical skills, training, and experience delivering digital solutions alongside experienced consultants in a collaborative and fast-paced environment. Role: As a Data Engineer degree apprentice, you'll learn how to be accountable for undertaking and completing the analysis of software engineering business issues, for either the entire requirement or a subset thereof depending on complexity and/or scope size. You'll be instrumental in implementing Solirius' data solutions, or part thereof, depending on complexity, scope size, and technology, into the required environment (including the systems-test, user-test, or live environment as appropriate). You'll be directly involved in the creation of an overarching software solution to resolve our clients' business issues, for either the entire requirement or a subset thereof depending on complexity and/or scope size. You'll assist in building our data solutions, or part thereof, depending on complexity, scope size, and technology. As you grow in your role on our team, you may also be required to supervise the work
solve today and tomorrow's problems with the companies who need their help to do it. And with AI. An opportunity to get your hands dirty with data to help companies find the best talent, specifically focusing on research, innovation, science, and engineering. There is a huge and complex database, using AI to provide quantitative insights that are unmatched. These insights highlight underrepresented talent, particularly women. Day to day, you'll design and develop data pipelines, ensuring data is clean, transformed, and integrated efficiently into the database. The environment is collaborative, working closely with machine learning engineers and data scientists to facilitate seamless data analysis. You'll work within a dynamic team to support and enhance our AI-driven talent platform. Your background is in a data-heavy environment, with a natural inclination towards statistical analysis. A STEM degree is beneficial but not necessary. You should be comfortable in the cloud, preferably AWS, with good experience building, managing, and enhancing data pipelines. Proficiency in Python and SQL is essential, as well