solve difficult problems in the areas of Anti-Money Laundering/Counter-Terrorist Financing, Identity Authentication & Verification, Fraud and Credit Risk mitigation, and Customer Data Management. You can learn more about LexisNexis Risk at the link below: risk.lexisnexis.com About our Team: The RiskNarrative team operates as an established, innovative … pain point or need based on your deep understanding of the customer. You will prioritize and communicate production customer issues and defects based on relevant data for timely resolution. You will also lead the demo to key stakeholders at program milestones. You will be part of our multi-functional product … the development and testing team to enhance our product offering. You will be analysing needs to develop a configuration strategy, as well as leading data integration efforts and partnering with product analysts, product, and other technical leaders to address new needs and deliver a seamless technology experience to …
Mandatory Skills You need to have the below skills. At least 12 years of IT experience, with a deep understanding of Spark components around Data Integration (PySpark, scripting, variable setting etc.), Spark SQL, and Spark Explain plans. Spark SME - Be able to analyse Spark code failures through Spark Plans … was used. Spark SME - Be able to review PySpark and Spark SQL jobs and make performance improvement recommendations. Spark SME - Be able to understand Data Frames/Resilient Distributed Datasets, understand any memory-related problems, and make corrective recommendations. Monitoring - Spark jobs using wider tools such … office and accept changes as per customer/Wipro policies. Your responsibilities: As a Spark Architect you will be working for the client's GDT (Global Data Technology) team, and you will be responsible for: Working on Enterprise-scale Cloud infrastructure and Cloud Services in one of the Clouds (GCP). …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Elliot Marsh
full project lifecycle. Oracle System Tester - Key Responsibilities: Develop and execute test plans, test cases, and test scripts for Oracle applications. Perform business-critical data initialisation system testing, system-to-system data integration testing, user acceptance testing, and regression testing. Identify, document, and track defects and issues …
using Prophecy low-code/no-code and available converters. Converted code is causing failures/performance issues. Skills: Spark Architecture – component understanding around Spark Data Integration (PySpark, scripting, variable setting etc.), Spark SQL, Spark Explain plans. Spark SME – Be able to analyse Spark code failures through Spark Plans … corrective recommendations. Spark SME – Be able to review PySpark and Spark SQL jobs and make performance improvement recommendations. Spark SME – Be able to understand Data Frames/Resilient Distributed Datasets, understand any memory-related problems, and make corrective recommendations. Monitoring – Be able to monitor Spark jobs …
responsible for contributing to all phases of the development lifecycle of UKIB’s Golden Source Analytics Platform (Microsoft Fabric). Aligned to the UKIB Data Strategy, this role will ensure the seamless provision of data for consumption within reporting & analytical data products via curated layers and reusable … semantic models. Core role accountabilities: Design, Build, Support and Develop UKIB’s Golden Source data platform (Microsoft Fabric). Ingest data with shortcuts, pipelines or dataflows to support Power BI and AI analytical use cases. Establish both curated and semantic layers within UKIB’s Golden Source data platform, aligned to UKIB Data Architecture and using Microsoft Fabric capabilities. Actively contribute to all stages of the data platform lifecycle as required (including design, development, testing & delivery phases) in line with our agreed architecture and design standards. Optimise table design, data models and semantic models …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Damia Group Ltd
and drive overall program objectives. Responsibilities: Working on Enterprise-scale Cloud infrastructure and Cloud Services in one of the Clouds (GCP). Drive Data Integration upgrade to PySpark. Collaboration with multiple customer stakeholders. Knowledge of working with Cloud Databases. Excellent communication and solution presentation skills. Able to … failures through Spark Plans and make corrective recommendations. Able to review PySpark and Spark SQL jobs and make performance improvement recommendations. Able to understand Data Frames/Resilient Distributed Datasets, understand any memory-related problems, and make corrective recommendations. Able to monitor Spark jobs using wider … time libraries are used by PySpark code. Mandatory Skills: At least 12 years of IT experience, with a deep understanding of Spark components around Data Integration (PySpark, scripting, variable setting etc.), Spark SQL, and Spark Explain plans. Spark SME - Be able to analyse Spark code failures through Spark …