want to talk to you.

What Are The Responsibilities?
• Architect and implement AI-powered features for threat detection, pattern recognition, and automated intelligence gathering
• Design and build scalable data pipelines that process billions of data points using AI/ML models for entity extraction and relationship mapping
• Lead the integration of Large Language Models (LLMs) for … natural language processing of intelligence data
• Develop robust APIs and microservices that handle real-time threat analysis at scale
• Implement computer vision systems for image and video analysis in OSINT investigations
• Build and optimize vector databases for semantic search across massive intelligence datasets (illustrated in the sketch after this listing)
• Establish best practices for AI/ML model deployment, monitoring, and continuous improvement
• Mentor team …
• … with MLOps practices and model deployment pipelines
• Proficient in cloud AI services (AWS SageMaker/Bedrock)
• Deep understanding of distributed systems and microservices architecture
• Expert in data pipeline platforms (Apache Kafka, Airflow, Spark)
• Proficient in both SQL (PostgreSQL, MySQL) and NoSQL (Elasticsearch, MongoDB) databases
• Strong containerization and orchestration skills (Docker, Kubernetes)
• Experience with infrastructure as code (Terraform …)
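The listing above mentions vector databases for semantic search over intelligence data but names no specific stack. Purely as an illustration of the technique, here is a minimal sketch using sentence-transformers and FAISS; the model name, sample documents, and query are placeholder assumptions, not anything from the posting.

```python
# Minimal sketch: embedding-based semantic search with an in-memory FAISS index.
# Library choice, model name, and sample data are illustrative assumptions.
import faiss
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

documents = [
    "Threat actor registered lookalike domains over the weekend.",
    "New phishing kit observed targeting financial institutions.",
    "Benign marketing newsletter about product updates.",
]

# Embed and normalise so inner product equals cosine similarity.
doc_vecs = model.encode(documents, normalize_embeddings=True).astype("float32")

index = faiss.IndexFlatIP(doc_vecs.shape[1])  # flat index; a real deployment would shard/quantize
index.add(doc_vecs)

query_vec = model.encode(["suspicious domain registrations"], normalize_embeddings=True).astype("float32")
scores, ids = index.search(query_vec, 2)  # top-2 nearest documents
for score, i in zip(scores[0], ids[0]):
    print(f"{score:.3f}  {documents[i]}")
```

At the "massive intelligence datasets" scale the posting describes, the same pattern would typically sit behind an approximate-nearest-neighbour index or a managed vector store rather than a flat in-memory index.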
Markets Program Execution & Transformation – Data Acquisition Team
Belfast (Hybrid) - Essential | Approx. £415pd PAYE + Holiday (PAYE)

The Markets Program Execution & Transformation team partners with all Global Markets businesses and various functions—including Legal, Compliance, Finance, and Operations & Technology—to identify, mobilize, and deliver regulatory and cross-business transformation initiatives. The team's core mission is to design and … implement integrated solutions that are efficient, scalable, and client-focused.

This role sits within the Data Acquisition Team, playing a critical part in supporting multiple projects and workstreams. The focus is on assessing and delivering robust data solutions and managing changes that impact diverse stakeholder groups in response to regulatory rulemaking, supervisory requirements, and discretionary transformation … programs.

Key Responsibilities:
• Develop PySpark and SQL queries to analyze, reconcile, and interrogate data (see the sketch after this listing)
• Provide actionable recommendations to improve reporting processes—e.g., enhancing data quality, streamlining workflows, and optimizing query performance
• Contribute to architecture and design discussions in a Hadoop-based environment
• Translate high-level architecture and requirements into detailed design and code
• Lead and guide complex, high …
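The first responsibility above centres on PySpark reconciliation work. The sketch below is a hypothetical example of that kind of break-detection query under assumed table names and columns; it is not the bank's schema or code.

```python
# Minimal sketch of a PySpark reconciliation check between two trade feeds.
# "source_trades", "reported_trades", and their columns are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("recon-sketch").getOrCreate()

source = spark.table("source_trades")    # assumed upstream booking feed
target = spark.table("reported_trades")  # assumed downstream reporting copy

# Full outer join on the trade identifier, then classify each break.
recon = (
    source.alias("s")
    .join(target.alias("t"), F.col("s.trade_id") == F.col("t.trade_id"), "full_outer")
    .withColumn(
        "break_type",
        F.when(F.col("t.trade_id").isNull(), F.lit("missing_in_target"))
         .when(F.col("s.trade_id").isNull(), F.lit("missing_in_source"))
         .when(F.col("s.notional") != F.col("t.notional"), F.lit("notional_mismatch"))
         .otherwise(F.lit("matched")),
    )
)

# Summarise break counts for the reporting-quality recommendations mentioned above.
recon.groupBy("break_type").count().show()
```

A full outer join is used deliberately so that records missing from either side surface as explicit break categories rather than silently dropping out of the result.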
intelligently to user behaviour.
• Cross-functional execution: Lead product development from discovery through launch. Work closely with engineering, ML, design, and go-to-market teams. You will define data requirements, shape architecture and UX, and track product performance through clear metrics.
• Go-to-market and monetisation: Develop pricing and packaging strategies for AI features. Test business models and validate product-market fit. You will own the commercial success of your product areas.
• Data-driven learning loops: Create feedback systems that capture user behaviour, AI performance, and commercial outcomes. Use this data to run experiments, improve models, and refine user experiences.
• Customer and internal enablement: Engage directly with early users and create internal tools, documentation …
• … user-facing tools that use machine learning, especially in scheduling, coordination, or productivity software.
• Technical fluency: You are comfortable with concepts like embeddings, intent detection, prompt design, and data pipelines (a brief sketch follows this listing). You don't need to code, but you understand enough to speak with engineers and data scientists with confidence.
• Workflow thinking: You're passionate about redesigning …
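The technical-fluency bullet names embeddings and intent detection as concepts to be conversant in, not to implement. Purely to illustrate those concepts, here is a small sketch that matches a user request to the closest of a few assumed scheduling intents by embedding similarity; the intent labels, descriptions, and model are hypothetical.

```python
# Illustrative only: intent detection via embedding similarity.
# Intent names/descriptions and the model are assumptions, not the product's design.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

intents = {
    "reschedule_meeting": "move or rebook an existing meeting",
    "find_time": "find a free slot that works for several people",
    "cancel_event": "cancel an event on the calendar",
}

labels = list(intents)
intent_vecs = model.encode(list(intents.values()), normalize_embeddings=True)

query = "can we push Thursday's sync to next week?"
query_vec = model.encode(query, normalize_embeddings=True)

# Cosine similarity between the request and each intent description.
scores = util.cos_sim(query_vec, intent_vecs)[0]
best = labels[int(scores.argmax())]
print(best, float(scores.max()))
```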