… well as a set of practical actions to achieve. The workstreams range from frameworks focused on operating model, risk and controls, methodologies, data and data controls, and front office valuations and controls, through various market-risk-related workstreams, to P&L attribution analysis (PAA), IPV and Price Risk … compensation package and benefits; flexible work arrangements.

Responsibilities:
- Support the Price Risk Program Initiative lead(s) to drive execution of strategic deliverables aligned to Data, Data Controls and Architecture changes
- Lead or participate in working groups, workshops and stakeholder sessions to understand data and business requirements, define project … plans and manage timelines
- Understand the data quality issues associated with each data set, including end-to-end data flows and controls, and ensure these are addressed in the defined target-state solution with robust controls
- Work with relevant leadership as well as outside experts to design …
REQUIRED: DATA ENGINEER - SC CLEARED
LOCATION: FULLY REMOTE WITH OCCASIONAL TRAVEL TO CENTRAL LONDON
IR35 STATUS: INSIDE
DURATION: 6 MONTH INITIAL CONTRACT

Seeking a Data Engineer who has:
- A strong understanding of data concepts - data types, data structures, schemas (both JSON and Spark), schema management etc.
- Strong understanding of complex JSON manipulation
- Experience working with data pipelines built on custom Python/PySpark frameworks
- Strong understanding of the 4 core data categories (Reference, Master, Transactional, Freeform) and the implications of each, particularly managing/handling Reference Data
- Strong understanding of data security principles - data owners, access controls (row- and column-level), GDPR etc., including experience of handling sensitive datasets
- Strong problem-solving and analytical skills, particularly the ability to apply these intuitively (able to work a problem out rather than follow a work instruction to resolve it)
- Experience working in a support …
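The "complex JSON manipulation" requirement above is the kind of task shown in this minimal, standard-library-only sketch: flattening an arbitrarily nested JSON document into dot-notation keys, a common step before mapping records onto a tabular schema. The `flatten` helper and the sample record are illustrative assumptions, not part of any named framework.

```python
import json

def flatten(obj, prefix=""):
    """Recursively flatten nested dicts/lists into dot-notation keys."""
    out = {}
    if isinstance(obj, dict):
        for key, value in obj.items():
            out.update(flatten(value, f"{prefix}{key}."))
    elif isinstance(obj, list):
        for i, value in enumerate(obj):
            out.update(flatten(value, f"{prefix}{i}."))
    else:
        # Leaf value: strip the trailing dot from the accumulated prefix.
        out[prefix[:-1]] = obj
    return out

record = json.loads('{"id": 1, "tags": ["a", "b"], "meta": {"owner": "ops"}}')
print(flatten(record))
# {'id': 1, 'tags.0': 'a', 'tags.1': 'b', 'meta.owner': 'ops'}
```

The same flattened keys map directly onto column names when loading such records into a Spark or SQL table.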
Data Engineer
4 months - with extensions
Remote
Active SC clearance required
£640 per day inside IR35

REQUIRED
- Strong understanding of data concepts - data types, data structures, schemas (both JSON and Spark), schema management etc.
- Strong understanding of complex JSON manipulation
- Experience working with data pipelines built on custom Python/PySpark frameworks
- Strong understanding of the 4 core data categories (Reference, Master, Transactional, Freeform) and the implications of each, particularly managing/handling Reference Data
- Strong understanding of data security principles - data owners, access controls (row- and column-level), GDPR …
- … (both CLI usage and scripting)
- Git
- Markdown
- Scala

DESIRABLE
- Azure SQL Server as a Hive Metastore

DESIRABLE TECHNOLOGIES
- Azure Databricks
- Apache Spark
- Delta Tables
- Data processing with Python
- PowerBI (integration/data ingestion)
- JIRA

If you meet the above requirements, please apply for the vacancy to be contacted.
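Both listings stress row- and column-level access controls as a data security principle. A minimal sketch of the idea, assuming an in-memory policy check; the policy shape, field names, and sample rows are illustrative, not any specific product's API:

```python
# Row-level control: only rows matching the caller's allowed regions survive.
# Column-level control: each surviving row is projected onto visible columns.
rows = [
    {"region": "UK", "name": "Alice", "salary": 50000},
    {"region": "DE", "name": "Bob", "salary": 60000},
]

def apply_policy(rows, allowed_regions, visible_columns):
    """Filter rows by region, then drop columns the caller may not see."""
    return [
        {col: row[col] for col in visible_columns}
        for row in rows
        if row["region"] in allowed_regions
    ]

print(apply_policy(rows, {"UK"}, ["region", "name"]))
# [{'region': 'UK', 'name': 'Alice'}]
```

Platforms such as Databricks express the same two controls declaratively (row filters and column masks) rather than in application code, but the filtering-then-projection logic is the same.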