engineers. Excellent problem-solving skills and experience in agile environments. Desirable: experience with streaming data (Kafka/Kinesis), Docker/Kubernetes, Terraform, CI/CD pipelines, NoSQL databases, and metadata management tools. Company benefits:
- Enhanced Parental Leave
- Generous annual leave
- Healthcare Plan
- Annual Giving Day: an extra day to give back to yourself or your community
- Cycle-to-work Scheme
…
CI/CD. Primary Responsibilities: Play a hands-on role as part of an Agile team. Actively contribute to the codebase and participate in peer reviews. Design and build a metadata-driven, event-based distributed data processing platform using technologies such as Python, Rust, EC2, ECS, S3, Glue, Athena, Lambda and Step Functions. Work collaboratively in the design, development, testing and …
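The listing above describes an event-based pipeline built from S3 events, Lambda, and Step Functions. As a minimal, hypothetical sketch of one such step (the bucket and key names are illustrative, and a real deployment would hand the result on to Glue, Athena, or another state in the workflow):

```python
import json

def handler(event, context=None):
    """Extract (bucket, key) pairs from an S3 event notification.

    A hedged sketch of an S3-triggered Lambda handler; the event shape
    follows the standard S3 notification format, but everything else
    (what happens downstream) is an assumption.
    """
    records = event.get("Records", [])
    objects = [
        {
            "bucket": r["s3"]["bucket"]["name"],
            "key": r["s3"]["object"]["key"],
        }
        for r in records
        if "s3" in r
    ]
    # Return the object list as the state output for the next step.
    return {"statusCode": 200, "body": json.dumps(objects)}
```

In a Step Functions state machine this handler's output would feed the next state; here it simply returns the discovered objects.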
solutions align with audit objectives. Perform comprehensive data analysis, validation, and integrity checks. Data Governance & Documentation: Ensure compliance with data governance standards and regulatory requirements. Maintain accurate data dictionaries, metadata, and workflow documentation for transparency and team collaboration. Optimization & Best Practices: Optimize dashboard and data pipeline performance through best practices in data visualization and coding standards. Contribute to the … financial services, banking, or regulatory environments. Familiarity with internal audit processes, risk management, or compliance frameworks. Understanding of data lineage, data governance tools, and best practices in metadata management. Why Join Us? Work on impactful projects that support critical audit and risk operations. Be part of a collaborative, expert-driven environment. Access to continuous learning, development, and …
based architectures. Experience in using Git, Azure DevOps, or GitHub Actions for version control, CI/CD, and collaborative data delivery. Robust understanding of data governance, data quality, and metadata management. Experience communicating technical information and data to a non-technical audience and working collaboratively with analysts, architects, and product owners to deliver data solutions that meet user and … on the introduction of foundational data management capabilities to improve trust, accessibility, and efficiency in an organisation that has limited data management capability and lacks data management practices, including governance, metadata standards, and quality controls. Design, implement, and optimise physical data models that align with pipeline architecture, using an approach that ensures efficient query performance, scalable storage, and robust integration …
which is an online learning platform available to all employees, featuring over 200,000 courses on a wide variety of business topics. Ranked #1 in AI Metadata & Search by Forrester, Aprimo sits at the cutting edge of technology and is paving new paths forward by incorporating AI (artificial intelligence) into our product offerings. We offer generative AI …
agreed ITSM tool (e.g., ServiceNow), maintaining clear documentation. Core VIM modules and technical scope:
- OpenText VIM Core: invoice processing, approvals, and exception workflows
- Document Processing (DP) Indexing: indexing invoice metadata and validation
- Exception Handling Framework (EHF): management and resolution of standard and custom exceptions
- VIM Analytics: monitoring throughput, backlog, and exception trends
- Business Rules Framework Plus (BRF+): rule-based invoice …
Job Role. Architecture & Solution Design: Define end-to-end MAM architecture, including ingest, storage, metadata, workflows, search and distribution. Design integration patterns between MAM and DAM/CMS, OTT platforms, storage, CDN, and analytics systems. Define live video workflows from contribution to encoding, packaging, CDN, and playback. Define the end-to-end search architecture across the solution. Translate business requirements into technical … blueprints and implementation plans. Drive performance, scalability, and security improvements for deployments. Development & Implementation: Build custom workflows, plugins, and APIs. Develop automation solutions for ingest, transcoding, metadata enrichment, QC, and archiving. Develop and maintain live video pipelines (ingest, encoding, packaging, delivery). Build and consume APIs for live media services (AWS MediaLive/MediaConnect, Azure Media Services, Wowza, etc.). Implement and … search engines (Elasticsearch, Solr, OpenSearch, or vendor-native). Implement API-driven integrations with third-party systems (e.g., Adobe, Avid, broadcast systems, DAMs, cloud storage). Configure user access, metadata schemas, and distribution workflows. Contribute to CI/CD pipelines, containerized deployments, and monitoring setup. Leadership & Collaboration: Provide technical leadership to developers and operations teams. Partner with product managers, media …
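The search work described in the listing above (Elasticsearch, Solr, OpenSearch) largely comes down to building query bodies against an asset index. A minimal sketch, assuming hypothetical index fields `title`, `description`, `tags`, and `asset_type` (not field names from the listing):

```python
import json

def build_asset_query(text, asset_type=None, size=20):
    """Build an Elasticsearch-style bool query for media asset metadata.

    Matches free text against title/description/tags, with an optional
    term filter on asset type. Field names and boosts are assumptions.
    """
    query = {
        "size": size,
        "query": {
            "bool": {
                "must": [
                    {"multi_match": {
                        "query": text,
                        # Boost title matches over description/tags.
                        "fields": ["title^2", "description", "tags"],
                    }}
                ],
                "filter": [],
            }
        },
    }
    if asset_type:
        query["query"]["bool"]["filter"].append(
            {"term": {"asset_type": asset_type}}
        )
    return json.dumps(query)
```

The same body works, with minor dialect differences, against OpenSearch; a vendor-native MAM search API would replace this with its own query language.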
risk, and technology teams to embed sustainable, scalable, and compliant data practices. Key Responsibilities: Define, implement, and maintain data governance policies, processes, and reporting structures. Facilitate governance initiatives, including metadata management, stewardship, and ownership models. Design and deliver reporting frameworks that track data quality, risk, and control metrics. Collaborate with client stakeholders to promote data literacy and embed governance roles … Lead the selection, design, and rollout of governance, quality, and reporting tooling solutions. About You: Strong track record in data governance and management within financial services. Deep understanding of metadata, lineage, stewardship, and data ownership frameworks. Experienced in developing and deploying governance and quality reporting frameworks. Excellent communicator capable of explaining complex governance concepts to senior stakeholders. Practical experience with …
Skills and Experience: Proven experience designing geospatial data architectures in large enterprise or public sector environments. Proficiency with EA Sparx for data modelling and architecture documentation. Deep knowledge of metadata and geospatial data standards (ISO 19115, INSPIRE and GEMINI). Understanding of coordinate reference systems, topology, geometry validation and data lineage. Experience conducting data quality and completeness assessments and implementing … governance model. Role and Responsibilities: Develop conceptual, logical, and physical data models for geospatial and land data using EA Sparx for design and documentation. Define data standards, schemas and metadata frameworks to ensure consistency and interoperability across the organisation. Design solutions that distinguish between live, validated and historical datasets while supporting both batch and streaming-based processing. Conduct gap analyses …
Migration Lead: Veeva (Vault Platform). Location: EU (remote or hybrid, occasional travel). Contract: 12–24 months. Sector: Life Sciences/Pharma/Biotech. About the Role: We're looking for an experienced Migration Lead to support a large-scale Veeva Vault transformation …
and optimise Service Cloud and Omni-Channel features to improve customer service efficiency. Integrate Salesforce with external systems using SOAP and REST-based APIs, the Bulk API, and the Metadata API. Collaborate with technical and functional teams to ensure seamless delivery. Participate in code reviews, deployment planning, and performance tuning. About You: We're looking for a hands-on Salesforce expert … with: 5–8 years' experience working on the Salesforce platform. Proven experience with Service Cloud and Omni-Channel. Strong understanding of Salesforce Web Services APIs (SOAP, REST, Bulk, and Metadata). Excellent problem-solving skills and attention to detail. The ability to work effectively in a fast-paced, global environment. This is initially a 6-month FTC with flexibility on salary …
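For context on the Salesforce REST API mentioned in the listing above: a SOQL query is issued as a GET against the `/services/data/<version>/query` endpoint with an OAuth bearer token. A minimal sketch of preparing such a request; the instance URL, token, and API version here are placeholders, not values from the listing:

```python
from urllib.parse import quote

def build_soql_request(instance_url, access_token, soql, api_version="v58.0"):
    """Return (url, headers) for a SOQL query via the Salesforce REST API.

    URL-encodes the SOQL text into the query endpoint's ?q= parameter and
    attaches the standard Authorization header. Obtaining the token (OAuth
    flow) and sending the request are out of scope for this sketch.
    """
    url = (
        f"{instance_url}/services/data/{api_version}/query/"
        f"?q={quote(soql)}"
    )
    headers = {"Authorization": f"Bearer {access_token}"}
    return url, headers
```

The returned pair can be fed to any HTTP client; the Bulk and Metadata APIs named in the listing follow the same authentication pattern but use different endpoints and payload formats.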