prominent organisation in the public health sector, is dedicated to fostering health security and responding effectively to public health emergencies. With a focus on pathogen modelling, genomic sequencing, and data analytics, they are at the forefront of critical initiatives shaping national health standards. Role Summary: Our client seeks an HPC Engineer with cluster experience to support critical public health … Slurm, Grid Engine, IBM) and tune MPI-based applications for genomic and health modelling tasks. Conduct security assessments and deploy compliant systems using SIEM tools (e.g., Splunk). Oversee data ingestion/backups for petabyte-scale health datasets and perform performance tests (e.g., Linpack). Respond to urgent outages during health crises and support researchers with documentation and …
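As context for the Linpack requirement above: HPL results are normally judged as achieved GFLOPS against the cluster's theoretical peak. A minimal sketch of that arithmetic, with entirely hypothetical cluster numbers (node count, clock, FLOPs/cycle, and the measured result are all invented for illustration):

```python
def theoretical_peak_gflops(nodes, cores_per_node, ghz, flops_per_cycle):
    """Theoretical peak = nodes * cores * clock (GHz) * FLOPs issued per cycle."""
    return nodes * cores_per_node * ghz * flops_per_cycle

# Hypothetical cluster: 4 nodes, 64 cores each, 2.0 GHz, 16 FLOPs/cycle (FMA + wide SIMD)
peak = theoretical_peak_gflops(4, 64, 2.0, 16)

measured = 6553.6            # hypothetical HPL (Linpack) result in GFLOPS
efficiency = measured / peak # fraction of peak actually sustained

print(peak, round(efficiency, 2))  # 8192.0 0.8
```

In practice a well-tuned MPI/BLAS stack sustains a large fraction of peak on HPL; a low ratio is the usual trigger for interconnect or library tuning.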
I am working with a client in the education sector who are looking for a data engineer with experience across architecture & strategy to join on a part-time 12-month contract. 1-2 days per week. Fully remote. Outside IR35. Immediate start. 12-month contract. Essential: Been to school in the UK; Data Ingestion of APIs; GCP based (Google Cloud Platform); Snowflake …
ELK SME Extension. Professional experience in the design, maintenance and management of Elastic stacks (Elasticsearch, Logstash, Kibana). Experience of configuring and maintaining large Elastic clusters. Experience working with large data sets and Elastic indexing best practices. Good understanding of visualisation components and techniques in Elasticsearch. Proven experience in performance management and tuning of Elasticsearch environments. Strong experience in writing … data ingestion pipelines using Logstash and other big … All profiles will be reviewed against the required skills and experience. Due to the high number of applications we will only be able to respond to successful applicants in the first instance. We thank you for your interest and the time taken to apply.
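For context on the ingestion skills this listing asks for: whether documents arrive via Logstash or a custom pipeline, bulk loading into Elasticsearch ultimately uses the `_bulk` API's NDJSON framing (one action line, then one source line, with a trailing newline). A minimal stdlib-only sketch; the index name and event fields are hypothetical:

```python
import json

def to_bulk_ndjson(docs, index):
    """Frame documents as action/source NDJSON pairs for the
    Elasticsearch _bulk API."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))  # action line
        lines.append(json.dumps(doc))                           # source line
    # _bulk request bodies must end with a newline
    return "\n".join(lines) + "\n"

# Hypothetical log events
events = [
    {"host": "web-1", "level": "ERROR", "msg": "timeout"},
    {"host": "web-2", "level": "INFO", "msg": "ok"},
]
body = to_bulk_ndjson(events, "logs-2024")
print(body.count("\n"))  # 4 (two action lines + two source lines)
```

The resulting string would be POSTed to `/_bulk` with a client library; batching many documents per request, rather than indexing one at a time, is the core of the indexing best practice the listing refers to.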
built models for digital human, voice localisation and best-of-breed image & video generation models in the industry. The current products include: An agentic AI platform with high-quality data ingestion and conversational interfaces. Tools for content creation, automation, and reuse via AI-powered content studio agents. Media asset management, video collaboration, and supply chain components designed for … high-performance Product team including PMs, POs, and UI/UX. • Instil a performance culture with measurable success criteria (OKRs, adoption, monetisation, retention). • Foster close partnership with Engineering, Data, Commercial, and GTM leads. Cross-Functional and Executive Alignment • Act as a bridge between technical innovation and executive vision. Collaborate with the Chief Innovation Officer, CTO, Product Owners and … cross-functional collaboration skills. Preferred Qualifications: Background in AI-driven, synthetic media, or generative content technologies. Familiarity with product licensing models (SaaS, enterprise sales, usage-based pricing). Understanding of data provenance, consent management, and ethical AI practices. Experience working in a startup or growth-stage company that has scaled to maturity. Familiarity with localisation and global product considerations is a plus.
Crime Enhancement Project focused on Sanctions and PEP screening. What you'll do: Administer and configure LexisNexis Bridger Insight for sanctions and PEP screening workflows. Run screening jobs, manage data ingestion, and generate reports within Bridger. Set up users, permissions, and workflows tailored to project requirements. Collaborate with internal teams and external consultants to backfill and transition responsibilities. … Strong understanding of Sanctions and PEP screening processes. Background in Financial Crime, AML, or Compliance projects. Ability to manage screening engines, workflows, and user configurations. Comfortable running jobs, handling data files, and producing reports specific to Bridger functionality. Next steps: We have a diverse workforce and an inclusive culture at M&G plc, underpinned by our policies and our …
as of 12 months ending December 2024 totaled $13.8 billion. Experience: Minimum 10+ years. Strong knowledge of Hadoop, Kafka, SQL/NoSQL. Specialization in designing and implementing large-scale data pipelines, ETL processes, and distributed systems. Should be able to work independently with minimal help/guidance. Good understanding of Airflow, Data Fusion and Dataflow. Strong … background and experience in data ingestion, transformation, modeling and performance tuning. Migration experience from Cornerstone to GCP will be an added advantage. Support the design and development of the Big Data ecosystem. Experience in building complex SQL queries. Strong communication skills.
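For context on the "complex SQL" requirement: in warehouse ETL this usually means analytic constructs such as window functions, e.g. keeping only the latest record per key during deduplication. A small stdlib-only sketch, with in-memory SQLite standing in for BigQuery and an invented `events` table:

```python
import sqlite3

# In-memory SQLite stands in for the warehouse; table and columns are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (user_id TEXT, ts INTEGER, amount REAL);
INSERT INTO events VALUES
  ('a', 1, 10.0), ('a', 2, 30.0), ('b', 1, 5.0), ('b', 3, 7.5);
""")

# Keep only the latest event per user -- a common dedup step in ETL,
# expressed with a ROW_NUMBER() window function.
rows = conn.execute("""
    SELECT user_id, ts, amount FROM (
        SELECT user_id, ts, amount,
               ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY ts DESC) AS rn
        FROM events
    ) WHERE rn = 1
    ORDER BY user_id
""").fetchall()

print(rows)  # [('a', 2, 30.0), ('b', 3, 7.5)]
```

The same `ROW_NUMBER() OVER (PARTITION BY … ORDER BY …)` pattern carries over to BigQuery Standard SQL essentially unchanged.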