Luton, Bedfordshire, South East, United Kingdom Hybrid/Remote Options
Stackstudio Digital Ltd
Job Title: CDC Qlik Talend Lead
Location: Luton (Hybrid, 2 to 3 days per week onsite)
Job Type: Contract (Inside IR35)
Job Summary: Join Tata Consultancy Services (TCS) as a CDC Qlik Talend Lead and play a pivotal role in delivering business-critical migration and integration projects for leading UK clients. You will utilize your expertise in Qlik Talend Cloud … and Change Data Capture (CDC) to lead end-to-end migration activities, develop robust data integration pipelines, and drive seamless collaboration across distributed teams.
Key Responsibilities: Lead and execute end-to-end migration from on-cloud environments to Qlik Talend SaaS. Design, develop, and maintain scalable Talend Cloud pipelines and CDC workflows. Collaborate with cross-functional … components such as tMap, tJoin, tFileInput, etc. Solid experience in SQL, ETL/ELT design, and cloud data platforms. Strong understanding of Change Data Capture (CDC) concepts and data modeling. Experience integrating with Git and Talend CI tools. Excellent stakeholder management and communication skills across distributed teams.
Person Specification: Independent, detail-oriented professional with a …
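For candidates less familiar with the CDC concepts this role calls for, the sketch below is a minimal, tool-agnostic illustration of what a change stream looks like when it is applied to a target. It is not a Talend or Qlik example; the event field names (`op`, `key`, `before`, `after`) and the in-memory "table" are illustrative assumptions only.

```python
# Conceptual sketch of applying Change Data Capture (CDC) events to a target.
# Field names (op, key, before, after) are illustrative assumptions, not a
# specific Talend or vendor format.
from typing import Any

def apply_cdc_event(target: dict[int, dict[str, Any]], event: dict[str, Any]) -> None:
    """Apply a single change event to an in-memory 'target table' keyed by id."""
    op = event["op"]            # "c" = create/insert, "u" = update, "d" = delete
    key = event["key"]
    if op == "c":
        target[key] = event["after"]   # insert the new row image
    elif op == "u":
        target[key] = event["after"]   # replace with the updated row image
    elif op == "d":
        target.pop(key, None)          # remove the deleted row

# Example: replaying a small change stream into an empty target table.
events = [
    {"op": "c", "key": 1, "after": {"name": "Alice", "city": "Luton"}},
    {"op": "u", "key": 1, "before": {"city": "Luton"}, "after": {"name": "Alice", "city": "London"}},
    {"op": "d", "key": 1, "before": {"name": "Alice", "city": "London"}},
]
table: dict[int, dict[str, Any]] = {}
for e in events:
    apply_cdc_event(table, e)
print(table)  # {} -- the row was inserted, updated, then deleted
```

In practice a CDC platform captures these events from database logs and delivers them in order; the pipeline design work described above is largely about doing this reliably at scale.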
…of something bigger.
MAIN PURPOSE OF THE ROLE & RESPONSIBILITIES IN THE BUSINESS: As a Senior Palantir Engineer (Senior Associate) at Kainos, you will be responsible for designing and developing data processing and data persistence software components for solutions which handle data at scale. Working in agile teams, Senior Data Engineers provide strong development leadership and take … responsibility for significant technical components of data systems. You will work within a multi-skilled agile team to design and develop large-scale data processing software to meet user needs in demanding production environments.
YOUR RESPONSIBILITIES WILL INCLUDE:
• Working to develop data processing software primarily for deployment in Big Data technologies. The role encompasses the full software …
• Experience of data visualisation and complex data transformations
• Experience with streaming and event-processing architectures, including technologies such as Kafka and change-data-capture (CDC) products
• Expertise in continuous improvement and sharing input on data best practice.
Embracing our differences: At Kainos, we believe in the power of diversity, equity and inclusion. We are …
…part of something bigger.
JOB PROFILE DESCRIPTION: As a Palantir Solution Architect (Manager) in Kainos, you'll be responsible for multi-skilled agile teams that design and deliver contemporary data solutions. You will be a quality-orientated pragmatist, balancing trade-offs to successfully deliver complex solutions. You will be viewed as an authority figure for data …, providing strong technical and thought leadership.
Your responsibilities will include:
• Successfully implementing functional and non-functional designs
• Working closely with Ops and Infrastructure architects to productionise robust, resilient, and maintainable data solutions
• Working closely with customer architects to agree functional and non-functional designs. Advising, questioning, informing, and helping the customer reach sound solution design outcomes.
• Working with your … and mentor those around you
• Managing and estimating timelines underpinning solution delivery
• Owning technical solution design as part of a pre-sales process
• Making a significant contribution to the data community and wider data capability
• Managing, coaching and developing a small number of staff, with a focus on managing employee performance and assisting in their career development.
You …
…part of something bigger.
JOB PROFILE DESCRIPTION: As a Palantir Technical Architect (Consultant) in Kainos, you'll be responsible for designing and delivering technical components as part of a larger data solution. You will work closely with Solution Architects and Customer Architects to integrate these components into quality data solutions.
Your responsibilities will include:
• Successfully implementing functional and non-functional … are upheld
• Managing and estimating timelines underpinning technical component delivery
• Contributing to or owning technical solution design as part of a pre-sales process
• Making a significant contribution to the data analysis community and wider data and analytics capability
• Managing, coaching and developing a small number of staff, with a focus on managing employee performance and assisting in their …
You'll also provide direction and leadership for your team as you solve challenging problems together.
MINIMUM (ESSENTIAL) REQUIREMENTS:
• Experience of technical ownership for a component, sub-system or product (data ingestion, data streaming), including architecture, estimation, product planning and story creation
• Proficient in client interaction, including communication of technical decisions to non-technical audiences
• Experience of applying standards …
Skills & Experience: 9+ years of hands-on Ab Initio development and ETL design experience. 1+ year of experience with Amazon EKS (Elastic Kubernetes Service Strong experience with large-scale data ingestion, ETL/ELT design, and distributed systems. Knowledge of Cloudera … ecosystem (Hadoop, Hive, HDFS, Spark, Impala, etc and Hadoop DWH support (3+ years Solid understanding of AWS services and cloud migration concepts. At least one project implementing Ab Initio CDC (ChangeDataCapture) in a data integration/ETL project. Strong SQL skills and ability to write optimized queries. CI/CD Jenkins/DevOps knowledge … is big plus Good understanding of OLTP and OLAP data models and data warehouse fundamentals. Commitment to high code quality, automated testing, and engineering best practices; ability to write reusable code components. Ability to unit test code thoroughly and troubleshoot production issues. Experience with Unix/Linux shell scripting. Ability to work independently and support junior developers. Some More ❯
…in integration gap analysis, performance analysis and proposing solutions based on varied product evaluations
• Expertise in integration technologies and platforms such as Mulesoft, files (using FTP, SFTP, MoveIT), MQ, CDC, Message Bus, Hub & Spoke, Pub/Sub
• Ability to evaluate the current system, run POCs and advise Business & Tech stakeholders on products fit for purpose, addressing performance and other bottlenecks in the BAU system
• Exposure to security standards for data in transit and at rest, such as TLS/SSL and data encryption
• Knowledge of UK GDPR, PII and PCI-DSS data standards
• Implementation of designs related to batch processing patterns involving Talend, Quartz or similar
• Experience of running projects in the capacity of Integration Architect on GDPR-related projects involving data sourcing, validation, integration, data disposition, auditing and reporting
• Ability to treat data as an asset and to architect and provide solutions aligned to it
• Working knowledge of Java 8+ with Clean Architecture, Mockito, Java Spring, Spring Boot, Spring Batch and Apache Camel
• Extensive experience working in Java, microservice architecture, containerisation, enterprise integration patterns and data storage in highly available, distributed products …
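For readers less familiar with the integration patterns named in this listing, the sketch below shows the publish-subscribe idea behind a hub-and-spoke design in plain Python. The topic and handler names are illustrative assumptions; a production system would use an MQ product, Kafka or similar rather than an in-memory hub.

```python
# Minimal illustrative sketch of a hub-and-spoke / publish-subscribe broker.
# Topic names and handlers are hypothetical; real deployments would use an
# MQ server, Kafka or an integration platform rather than an in-memory dict.
from collections import defaultdict
from typing import Any, Callable

class Hub:
    """Central hub: publishers send to a topic, subscribed spokes receive copies."""
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: Any) -> None:
        for handler in self._subscribers[topic]:
            handler(message)  # fan out to every subscriber of the topic

hub = Hub()
hub.subscribe("orders", lambda m: print("billing received:", m))
hub.subscribe("orders", lambda m: print("warehouse received:", m))
hub.publish("orders", {"order_id": 42, "sku": "ABC-1"})
```

The point of the pattern is that publishers and consumers only know the hub and the topic, not each other, which is what makes hub-and-spoke integration easier to extend than point-to-point links.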
…tuning techniques
Experience with backup and disaster recovery processes
Review the current Debezium deployment architecture, including Oracle connector configuration, Kafka integration, and downstream consumers.
Analyze Oracle database setup for CDC compatibility (e.g., redo log configuration, supplemental logging, privileges).
Evaluate connector performance, lag, and error handling mechanisms.
Identify bottlenecks, misconfigurations, or anti-patterns in the current implementation.
Provide a detailed …
…expertise in MongoDB architecture, including replication, sharding, backup, and recovery
Strong hands-on experience with Debezium, especially the Oracle connector (LogMiner).
Deep understanding of Oracle internals relevant to CDC: redo logs, SCNs, archive log mode, supplemental logging.
Proficiency with Apache Kafka and Kafka ecosystem tools.
Experience with monitoring and debugging Debezium connectors in production environments.
Ability to analyze logs …
…to comply with CHAMP Security Requirements (including but not limited to CHAMP's IT Security Policies, especially the ISMS Policy and the Acceptable Use Policy, mandatory courses, confidentiality and data protection, use of company assets, and incident reporting). CHAMP Cargosystems is an equal opportunity employer and prohibits discrimination and harassment of any kind. We are committed to the …
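To make the review scope above more concrete, here is a minimal sketch of registering a Debezium Oracle (LogMiner) connector with a Kafka Connect cluster. The hostnames, credentials, topic prefix and table list are hypothetical placeholders, and the exact property names depend on the Debezium version actually deployed.

```python
# Hypothetical sketch: registering a Debezium Oracle (LogMiner) connector via the
# Kafka Connect REST API. Hosts, credentials and table names are placeholders;
# property names follow recent Debezium releases and may differ per version.
import json
import requests

connector = {
    "name": "oracle-cdc-example",
    "config": {
        "connector.class": "io.debezium.connector.oracle.OracleConnector",
        "database.hostname": "oracle.example.internal",   # placeholder host
        "database.port": "1521",
        "database.user": "c##dbzuser",                     # CDC user with LogMiner privileges
        "database.password": "change-me",
        "database.dbname": "ORCLCDB",
        "topic.prefix": "oracle-src",                      # prefix for emitted Kafka topics
        "table.include.list": "INVENTORY.ORDERS",          # tables to capture
        # The source database must have archive log mode and supplemental logging
        # enabled for LogMiner-based capture to work.
        "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
        "schema.history.internal.kafka.topic": "schema-changes.oracle-src",
    },
}

# Kafka Connect exposes a REST endpoint for creating connectors.
resp = requests.post(
    "http://connect.example.internal:8083/connectors",
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector),
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```

Checking connector state and lag through the same REST API (e.g. GET /connectors/<name>/status) and Kafka Connect metrics is a common starting point for the performance and error-handling evaluation described above.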
…requirements
Participate with other coordinators across the country in jointly advancing the public health readiness challenge
Define and manage the process for systems to test the submission of meaningful use data received directly from providers, and help to accept the data when successful
Serve as the liaison between the state and facilities for issues regarding meaningful use and immunization data
Strong working knowledge and experience of … Oracle database and PL/SQL scripting
Serve as the central point of contact for information and assistance regarding meaningful use for the Centers for Disease Control and Prevention (CDC), the Joint Public Health Informatics Taskforce (JPHIT), the Centers for Medicare and Medicaid Services (CMS), and the Office of the National Coordinator for Health Information Technology (ONC)
Participate in projects and …