powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. FS Technology Consulting - AI and Data - Data Engineer - Senior Consultant/Manager - Dublin/Cork/Limerick/Galway General Information Location: Dublin, Cork, Limerick, or Galway. Available for VISA Sponsorship: No Business Area … Data & Analytics Contract Type: Full Time - Permanent EY is the only major professional services firm with an integrated Financial Services practice across Europe, the Middle East, India, and Africa (EMEIA). We connect our Asset Management, Banking and Capital Markets and Insurance clients to 6,500 talented people from 12 countries and 35,000 Financial Services colleagues around the … Azure Functions and Logic Apps for automation. Snowflake: Strong SQL skills and experience with Snowflake's architecture (virtual warehouses, storage, cloud services). Proficiency in Snowflake Streams & Tasks for CDC and automation. Experience with Snowflake Secure Data Sharing and Snowflake Marketplace. Familiarity with Snowpark for Python/Java-based transformations. Understanding of role-based access control, data masking More ❯
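The Streams & Tasks capability named above automates Change Data Capture: detecting which rows were inserted, updated, or deleted between two versions of a table. As a hedged illustration of that core idea only (the table rows and names below are invented for the example, not part of the role), the diff can be sketched in plain Python:

```python
def capture_changes(before, after):
    """Diff two snapshots of a table (dicts keyed by primary key) and
    emit CDC records, mimicking what a table stream tracks."""
    changes = []
    for key, row in after.items():
        if key not in before:
            changes.append({"op": "INSERT", "key": key, "row": row})
        elif before[key] != row:
            changes.append({"op": "UPDATE", "key": key, "row": row})
    for key, row in before.items():
        if key not in after:
            changes.append({"op": "DELETE", "key": key, "row": row})
    return changes

# Hypothetical customer table, before and after a batch of edits
before = {1: {"name": "Ada"}, 2: {"name": "Grace"}}
after = {1: {"name": "Ada L."}, 3: {"name": "Edsger"}}
print(capture_changes(before, after))
```

In Snowflake itself this bookkeeping is done for you: a stream on a table exposes the same insert/update/delete metadata without any manual diffing.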
Luton, Bedfordshire, South East, United Kingdom Hybrid / WFH Options
Stackstudio Digital Ltd
Job Title: CDC Qlik Talend Lead Location: Luton (Hybrid 2 to 3 days per week onsite) Job Type: Contract (Inside IR35) Job Summary: Join Tata Consultancy Services (TCS) as a CDC Qlik Talend Lead and play a pivotal role in delivering business-critical migration and integration projects for leading UK clients. You will utilize your expertise in Qlik Talend Cloud … and Change Data Capture (CDC) to lead end-to-end migration activities, develop robust data integration pipelines, and drive seamless collaboration across distributed teams. Key Responsibilities: Lead and execute end-to-end migration from on-cloud environments to Qlik Talend SaaS. Design, develop, and maintain scalable Talend Cloud pipelines and CDC workflows. Collaborate with cross-functional … components such as tMap, tJoin, tFileInput, etc. Solid experience in SQL, ETL/ELT design, and cloud data platforms. Strong understanding of Change Data Capture (CDC) concepts and data modeling. Experience integrating with Git and Talend CI tools. Excellent stakeholder management and communication skills across distributed teams. Person Specification: Independent, detail-oriented professional with a More ❯
of something bigger. MAIN PURPOSE OF THE ROLE & RESPONSIBILITIES IN THE BUSINESS: As a Senior Palantir Engineer (Senior Associate) at Kainos, you will be responsible for designing and developing data processing and data persistence software components for solutions which handle data at scale. Working in agile teams, Senior Data Engineers provide strong development leadership and take … responsibility for significant technical components of data systems. You will work within a multi-skilled agile team to design and develop large-scale data processing software to meet user needs in demanding production environments. YOUR RESPONSIBILITIES WILL INCLUDE: • Working to develop data processing software primarily for deployment in Big Data technologies. The role encompasses the full software … Experience of data visualisation and complex data transformations • Experience with streaming and event-processing architectures including technologies such as Kafka and change-data-capture (CDC) products • Expertise in continuous improvement and sharing input on data best practice. Embracing our differences At Kainos, we believe in the power of diversity, equity and inclusion. We are More ❯
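The streaming and CDC architectures this role mentions have a consumer side as well as a capture side: downstream services replay an ordered stream of change events to materialise current state. A minimal, illustrative sketch of that replay logic (the event shapes and keys are invented for the example; a real deployment would consume these from a Kafka topic):

```python
def apply_change_events(events):
    """Replay an ordered stream of change events (as a Kafka consumer
    of a CDC topic might) to materialise the current table state."""
    state = {}
    for event in events:
        op, key = event["op"], event["key"]
        if op in ("insert", "update"):
            state[key] = event["row"]
        elif op == "delete":
            state.pop(key, None)
    return state

# Invented event stream for illustration
events = [
    {"op": "insert", "key": "a1", "row": {"qty": 5}},
    {"op": "update", "key": "a1", "row": {"qty": 7}},
    {"op": "insert", "key": "b2", "row": {"qty": 1}},
    {"op": "delete", "key": "b2", "row": None},
]
print(apply_change_events(events))  # {'a1': {'qty': 7}}
```

Because the replay is deterministic, the same event log always rebuilds the same state, which is what makes log-based CDC pipelines recoverable after failure.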
part of something bigger. JOB PROFILE DESCRIPTION: As a Palantir Solution Architect (Manager) in Kainos, you'll be responsible for leading multi-skilled agile teams to design and deliver contemporary data solutions. You will be a quality-orientated pragmatist who balances trade-offs to successfully deliver complex solutions. You will be viewed as an authority figure for data … providing strong technical and thought leadership. Your responsibilities will include: • Successfully implementing functional and non-functional designs • Working closely with Ops and Infrastructure architects to productionise robust, resilient, and maintainable data solutions • Working closely with customer architects to agree functional and non-functional designs, advising, questioning, informing, and helping the customer to reach sound solution design outcomes • Working with your … and mentor those around you • Managing and estimating timelines underpinning solution delivery • Owning technical solution design as part of a pre-sales process • Making a significant contribution to the data community and wider data capability • Managing, coaching and developing a small number of staff, with a focus on managing employee performance and assisting in their career development. You More ❯
part of something bigger. JOB PROFILE DESCRIPTION: As a Palantir Technical Architect (Consultant) in Kainos, you'll be responsible for designing and delivering technical components as part of a larger data solution. You will work closely with Solution Architects and Customer Architects to integrate these components into quality data solutions. Your responsibilities will include: • Successfully implementing functional and non-functional … are upheld • Managing and estimating timelines underpinning technical component delivery • Contributing to or owning technical solution design as part of a pre-sales process • Making a significant contribution to the data analysis community and wider data and analytics capability • Managing, coaching and developing a small number of staff, with a focus on managing employee performance and assisting in their … You'll also provide direction and leadership for your team as you solve challenging problems together. MINIMUM (ESSENTIAL) REQUIREMENTS: • Experience of technical ownership for a component, sub-system or product (data ingestion, data streaming), including architecture, estimation, product planning and story creation • Proficient in client interaction including communication of technical decisions to non-technical audiences • Experience of applying standards More ❯
extensive experience in designing, implementing, and optimizing database solutions in a microservices-based environment. As a member of our team, you will contribute to the full lifecycle of our data persistence layer, from schema design and performance tuning to ensuring robust replication, disaster recovery, and seamless integration within our cloud-native microservice ecosystem. Responsibilities: Design & Develop Database Solutions: Architect … design, and implement highly optimized relational (e.g., MySQL, PostgreSQL, AWS Aurora, SQL Server) and NoSQL (e.g., MongoDB, DynamoDB, Redis) database schemas, ensuring data integrity, performance, and scalability for microservices. Performance Optimization & Tuning: Proactively analyze and optimize complex queries, implement efficient indexing strategies, and manage partitioning/sharding to ensure peak database performance and handle high throughput. Reliability & Disaster Recovery … in English spoken and written Nice-to-Haves: Experience with ETL/ELT pipeline design and tools (e.g., Apache Airflow). Familiarity with Change Data Capture (CDC) solutions. Knowledge of database services on other cloud platforms (e.g., Azure SQL Database, Google Cloud Spanner). Understanding of ORM frameworks (Entity Framework, Dapper, SQLAlchemy, Hibernate) from a database performance More ❯
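The partitioning/sharding responsibility above usually starts from one primitive: routing each record to a shard by a stable hash of its key, so the same key always lands on the same node. A hedged Python sketch of that primitive (the shard count and key names are illustrative; production systems often layer consistent hashing or range partitioning on top):

```python
import hashlib

def shard_for(key, num_shards):
    """Route a record to a shard by hashing its key. hashlib gives a
    result that is stable across processes, unlike Python's built-in
    hash(), which is randomised per interpreter run."""
    digest = hashlib.md5(str(key).encode()).hexdigest()
    return int(digest, 16) % num_shards

# The same customer id always resolves to the same shard
assert shard_for("customer-42", 4) == shard_for("customer-42", 4)

# Over many keys, traffic spreads across all shards
buckets = {shard_for(i, 4) for i in range(1000)}
print(sorted(buckets))
```

One known trade-off worth noting: with plain modulo routing, changing `num_shards` remaps almost every key, which is why resharding is a planned migration rather than a config change.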
london, south east england, united kingdom Hybrid / WFH Options
Intuition IT – Intuitive Technology Recruitment
Role: QA Engineer Location: London, UK (Hybrid) Employment Type: B2B Contract. Detailed Job Description: • 5-8 years of experience in ETL testing, Snowflake, DWH concepts and data-based testing (RDBMS, cloud-based technologies). • Strong SQL knowledge & debugging skills are a must. • Experience with Azure and Snowflake testing is a plus. • Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus. • Strong data warehousing concepts; ETL tools such as Talend Cloud Data Integration, Pentaho/Kettle. • Experience with JIRA and Xray defect management tools is good to have. • Exposure to financial domain knowledge is considered a plus. • Testing the data-readiness (data quality) address code or data …
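The data-readiness (data quality) testing listed above typically boils down to a handful of reconciliation checks run after each ETL load. A minimal, illustrative Python sketch of such checks (the column names, rows, and check set are invented for the example; real suites would run these as SQL against the warehouse):

```python
def data_readiness_report(source_rows, target_rows, key="id", not_null=()):
    """Minimal post-ETL checks: row-count reconciliation, key uniqueness
    in the target, and null checks on mandatory columns."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    keys = [r[key] for r in target_rows]
    if len(keys) != len(set(keys)):
        issues.append("duplicate keys in target")
    for col in not_null:
        nulls = sum(1 for r in target_rows if r.get(col) is None)
        if nulls:
            issues.append(f"{nulls} null(s) in {col}")
    return issues

# Invented source/target rows for illustration
src = [{"id": 1}, {"id": 2}]
tgt = [{"id": 1, "amount": 10.0}, {"id": 1, "amount": None}]
print(data_readiness_report(src, tgt, not_null=("amount",)))
```

An empty report means the load passed; each entry otherwise points the tester at a concrete reconciliation failure to raise as a defect.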
Newcastle Upon Tyne, Tyne And Wear, United Kingdom
La Fosse Associates Limited
and world-class solutions from our base in the heart of Newcastle City Centre. If you thrive in a fast-paced, intellectually stimulating environment and are passionate about technology, data, and solving complex problems, this could be the right opportunity for you. Our Culture As part of a new and growing business, you'll help shape our culture from … individuals who enjoy solving complex technical challenges and driving improvement. About the DBA Team Our Database Administration team is responsible for maintaining the reliability, performance, and robustness of our data platforms. We're looking for someone who: Has a broad interest in technology and enjoys solving a wide range of technical issues. Takes pride in maintaining high productivity, reliability … . Optimise database Servers and SQL queries, stored procedures, and schema designs. Collaborate with developers and business teams to improve SQL Server performance and usage. Work with cloud-based data technologies such as Azure SQL Server and Data Lake . Required Qualifications & Experience Minimum 2:1 degree in Computer Science or a related field, ideally from a Russell More ❯
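The query and index optimisation work described above rests on one trade: an index spends extra storage and write cost to replace full-table scans with direct lookups. A toy Python sketch of that trade-off (the table contents are invented; a real non-clustered index is a B-tree maintained by the database engine, not an in-memory dict):

```python
def build_index(rows, column):
    """Build a simple hash index (column value -> row positions), the
    idea behind a non-clustered index on a frequently filtered column."""
    index = {}
    for pos, row in enumerate(rows):
        index.setdefault(row[column], []).append(pos)
    return index

# Invented table: 1000 orders, 1 in 10 still open
rows = [{"id": i, "status": "open" if i % 10 == 0 else "closed"} for i in range(1000)]

# Full scan: touches every row to answer the filter
scan_hits = [r for r in rows if r["status"] == "open"]

# Index lookup: touches only the matching positions
idx = build_index(rows, "status")
index_hits = [rows[p] for p in idx.get("open", [])]
assert scan_hits == index_hits
print(len(index_hits))  # 100
```

The same results come back either way; the index simply narrows how many rows are touched, which is what query tuning with execution plans makes visible.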
Data Science Engineer – Azure | Databricks | AI Innovation Permanent | Birmingham (Hybrid) Amtis is proud to partner with an advanced, data-driven organisation — a business that’s not just talking about AI, but actively building intelligent systems powered by Azure, Databricks, and real-world machine learning applications. This is a hands-on engineering role where you’ll be at the … core of designing, developing, and optimising modern data platforms that enable predictive analytics, AI experimentation, and large-scale automation. You’ll work in an environment where data truly drives business decisions — not just dashboards. If you’re excited by high-volume, high-velocity data challenges and want to work on next-gen infrastructure that supports advanced analytics … Looking For Strong hands-on experience with Azure Databricks, Data Factory, Blob Storage, and Delta Lake Proficiency in Python, PySpark, and SQL Deep understanding of ETL/ELT, CDC, streaming data, and lakehouse architecture Proven ability to optimise data systems for performance, scalability, and cost-efficiency A proactive problem-solver with great communication skills and a passion More ❯
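The CDC and ETL/ELT patterns this role calls for commonly take a watermark-driven form: each run pulls only the rows modified since the last successful load, upserts them, and advances the watermark. A hedged Python sketch of that loop (column names and integer timestamps are invented for the example; on Databricks this would typically be a Delta Lake MERGE):

```python
def incremental_load(source, target, watermark):
    """Watermark-driven incremental ETL: pull only rows modified since
    the last successful load, upsert them, and advance the watermark."""
    new_rows = [r for r in source if r["updated_at"] > watermark]
    for row in new_rows:
        target[row["id"]] = row  # upsert (MERGE-style semantics)
    new_watermark = max((r["updated_at"] for r in new_rows), default=watermark)
    return target, new_watermark

# Invented rows; timestamps are plain integers for the sketch
source = [
    {"id": 1, "updated_at": 100, "v": "a"},
    {"id": 2, "updated_at": 205, "v": "b"},
    {"id": 1, "updated_at": 210, "v": "a2"},
]
target, wm = incremental_load(source, {}, watermark=200)
print(target, wm)
```

Because only post-watermark rows are scanned and written, the load cost tracks the change volume rather than the table size, which is the core of the cost-efficiency the posting asks for.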
City of London, London, United Kingdom Hybrid / WFH Options
Advanced Resource Managers
SQL Analyst, CDP & CRM Segmentation London based – hybrid working – 3 days on site 3-6 Month contract – Inside IR35 We’re standing up a Customer Data Platform (CDP) and need a hands-on SQL analyst to tighten audience segmentation, productionise segmentation logic in the warehouse, and push curated traits into Braze so CRM can operate campaigns without overloading the … data team. What you’ll do Own audience & trait definition Translate CRM/Marketing objectives into precise, reusable audience and trait specs Build and maintain warehouse-first segmentation tables/views with clear SLOs and documentation. Ship reliable data to Braze Design pipelines to push curated traits and audiences to Braze, including change-data-capture …
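The suppression logic and frequency caps mentioned above reduce to filtering an audience before it is pushed to the CRM tool: drop opted-out users and anyone already contacted too often inside a rolling window. A minimal Python sketch of that guardrail (the field names, cap, and window are invented for the example; in this role it would live as SQL in the warehouse):

```python
from datetime import datetime, timedelta

def eligible_audience(candidates, sends, cap=3, window_days=7, now=None):
    """Apply suppression and a frequency cap: drop opted-out users and
    anyone already contacted `cap` times within the rolling window."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=window_days)
    eligible = []
    for user in candidates:
        if user.get("opted_out"):
            continue  # compliance suppression
        recent = [t for t in sends.get(user["id"], []) if t >= cutoff]
        if len(recent) >= cap:
            continue  # frequency cap reached
        eligible.append(user["id"])
    return eligible

# Invented send history and candidates for illustration
now = datetime(2024, 1, 10)
sends = {"u1": [datetime(2024, 1, d) for d in (5, 6, 7)],
         "u2": [datetime(2024, 1, 8)]}
users = [{"id": "u1"}, {"id": "u2"}, {"id": "u3", "opted_out": True}]
print(eligible_audience(users, sends, cap=3, now=now))  # ['u2']
```

Running this filter in the warehouse before export keeps the counts pushed to Braze aligned with warehouse truth, which is exactly the QA reconciliation the role describes.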
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
site. Key Responsibilities: Work with the test team to create shakedown checklists, potentially covered by automation. Assist in the implementation of the TEM Strategy. Create environment-specific landscape and data flow maps. Build an environments management information repository. Assess NFT environment fit for purpose and document recommendations. Create additional test environment capacity. Provide Environment Management representation in daily scrums … working groups, and ad-hoc meetings. Required Skillsets: Strong skills and experience with data technologies such as IBM DB2, Oracle, MongoDB, Hive, Hadoop, SQL, Informatica, and similar tech stacks. Attention to detail and strong ability to work independently and navigate complex target end state architecture (Tessa). Strong knowledge and experience with Agile testing processes and methodologies, preferably Scaled More ❯
tuning techniques Experience with backup and disaster recovery processes Review the current Debezium deployment architecture, including Oracle connector configuration, Kafka integration, and downstream consumers. Analyze Oracle database setup for CDC compatibility (e.g., redo log configuration, supplemental logging, privileges). Evaluate connector performance, lag, and error handling mechanisms. Identify bottlenecks, misconfigurations, or anti-patterns in the current implementation. Provide a detailed … expertise in MongoDB architecture, including replication, sharding, backup, and recovery Strong hands-on experience with Debezium, especially the Oracle connector (LogMiner). Deep understanding of Oracle internals relevant to CDC: redo logs, SCNs, archive log mode, supplemental logging. Proficiency with Apache Kafka and Kafka ecosystem tools. Experience with monitoring and debugging Debezium connectors in production environments. Ability to analyze logs … to comply with CHAMP Security Requirements (including but not limited to CHAMP's IT Security Policies, especially the ISMS Policy and the Acceptable Use Policy, mandatory courses, confidentiality and data protection, use of company assets, and incident reporting). CHAMP Cargosystems is an equal opportunity employer and prohibits discrimination and harassment of any kind. We are committed to the More ❯
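For orientation, an Oracle connector registration of the kind this review covers is typically submitted to Kafka Connect as JSON like the following. All hostnames, credentials, and table names below are placeholders, and the exact property set should be verified against the Debezium version actually deployed:

```json
{
  "name": "oracle-cdc-connector",
  "config": {
    "connector.class": "io.debezium.connector.oracle.OracleConnector",
    "tasks.max": "1",
    "database.hostname": "oracle-host",
    "database.port": "1521",
    "database.user": "dbzuser",
    "database.password": "********",
    "database.dbname": "ORCLCDB",
    "topic.prefix": "app",
    "table.include.list": "INVENTORY.ORDERS",
    "log.mining.strategy": "online_catalog",
    "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
    "schema.history.internal.kafka.topic": "schema-changes.app"
  }
}
```

Several of the assessment points above map directly onto this file: `table.include.list` scopes which tables need supplemental logging, and `log.mining.strategy` is a common lever when diagnosing LogMiner lag.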
City of London, London, United Kingdom Hybrid / WFH Options
Intelix.AI
firm operating at the intersection of knowledge graphs, generative AI, and enterprise transformation. This role requires you to embed graph intelligence into mission-critical systems enabling explainable AI, unified data views, and advanced reasoning across regulated industries and industrial domains. You will drive the design, build, and deployment of knowledge graph + GenAI systems for high-impact clients. You … ll be part of a small elite team, bridging data, AI, and business outcomes — from model scoping through to production launch. Ship full systems Operate in regulated or industrial domains (e.g. manufacturing, life sciences, public sector) Embed client strategy and technical teams Stretching the frontier … hybrid KG + AI, graph + reasoning + M 🛠 Responsibilities Lead KG schema & ontology design across domains (assets, risk, supply chain, compliance) Build ingestion pipelines (ETL/streaming/CDC) and entity resolution for graph population Author complex queries (Cypher, GSQL, AQL, SPARQL etc. depending on stack) Integrate knowledge graph retrieval & reasoning into LLM/RAG/GraphRAG systems Develop More ❯
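The ingestion and entity-resolution responsibility above pairs two steps: resolve records that refer to the same real-world entity onto one node key, then accumulate relations into the graph. A deliberately crude Python sketch of both steps (records and the normalisation rule are invented for the example; production resolution uses much richer matching, and the graph would live in a store queried via Cypher or SPARQL):

```python
def normalise(name):
    """Crude entity-resolution key: lowercase and keep only alphanumerics.
    Real pipelines use far richer matching; this is only a sketch."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

def build_graph(records):
    """Ingest (subject, relation, object) triples into an adjacency map,
    merging nodes whose resolved keys collide."""
    graph = {}
    for subj, rel, obj in records:
        s, o = normalise(subj), normalise(obj)
        graph.setdefault(s, []).append((rel, o))
    return graph

# Invented records; "ACME Ltd." and "acme ltd" resolve to one node
records = [
    ("ACME Ltd.", "SUPPLIES", "Widget-9"),
    ("acme ltd", "LOCATED_IN", "Dublin"),
]
print(build_graph(records))
```

Merging the two spellings into a single node is what gives downstream GraphRAG retrieval a unified view of the entity instead of two disconnected fragments.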
requirements Participate with other coordinators across the country in a joint public health readiness challenge Define and manage the process for systems to test the submission of meaningful use data received directly from providers, and help to accept the data when successful Serve as the liaison between the state and facilities for issues regarding meaningful use and immunization … data Strong working knowledge and experience of … Oracle database and PL/SQL scripting Serve as the central point of contact for information and assistance regarding meaningful use for the Centers for Disease Control and Prevention (CDC), the Joint Public Health Informatics Taskforce (JPHIT), the Centers for Medicare & Medicaid Services (CMS), and the Office of the National Coordinator for Health Information Technology (ONC) Participate in projects and More ❯