London (City of London), South East England, United Kingdom
HCLTech
Create a strategic roadmap for large enterprise initiatives. Must have experience in legacy modernization programs. Should be proficient at collaborating with cross-functional teams. Strong background and experience in data ingestion, transformation, modeling and performance tuning. Should have experience in designing and developing dashboards. Strong knowledge of Hadoop, Kafka, SQL/NoSQL. Should have experience in creating a roadmap to …
really do and delivering rapid-fire prototypes that become mission-critical apps. What you'll tackle: Spot the pain points across Equities, Credit, Rates and central Risk teams; translate data-flow headaches into buildable tech projects. Prototype at pace in Python, leveraging LLM co-pilots, to stand up analytics services, REST APIs and lightweight Dash/React front ends. … Own the full stack: data ingestion (SQL, files, feeds), business logic, web/UI, CI pipelines and Linux deployment. Codify best practice for safe, auditable LLM-assisted development; lead code reviews and knowledge-share sessions. Stay user-facing: whiteboard with quants, ask clarifying questions, iterate live and keep comms flowing. You in a nutshell: 5+ years … building production-grade Python systems for data-heavy businesses; comfortable with tests, version control and optimisation. Confident SQL, exposure to REST and messaging (RabbitMQ, Kafka, etc.), and enough Linux to troubleshoot on the command line. UI chops in Dash, Flask or React to whip up proof-of-concept dashboards. Solid grounding in financial products and risk metrics: equities, bonds and …
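The full-stack loop this role describes (ingest from SQL/files/feeds, apply business logic, surface results to a dashboard or API) can be sketched minimally. Everything below, table and column names included, is a hypothetical illustration, not the employer's actual stack:

```python
import sqlite3

def ingest(conn, rows):
    """Hypothetical ingestion step: land raw trade rows from a file/feed into SQL."""
    conn.execute("CREATE TABLE IF NOT EXISTS trades (desk TEXT, notional REAL)")
    conn.executemany("INSERT INTO trades VALUES (?, ?)", rows)
    conn.commit()

def notional_by_desk(conn):
    """Business-logic step: aggregate notional per desk, the kind of result
    a Dash/Flask dashboard or REST endpoint would then serve."""
    cur = conn.execute(
        "SELECT desk, SUM(notional) FROM trades GROUP BY desk ORDER BY desk"
    )
    return dict(cur.fetchall())

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    ingest(conn, [("Equities", 1_000_000.0), ("Credit", 250_000.0),
                  ("Equities", 500_000.0)])
    print(notional_by_desk(conn))  # {'Credit': 250000.0, 'Equities': 1500000.0}
```

In a real prototype the aggregation would sit behind a Flask/Dash view rather than a print, but the ingest-then-aggregate shape is the same.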
design, test and deploy AI projects. Azure AI/ML Engineer, key responsibilities: Build, develop and deploy AI applications using Python. Design and develop AI services. Set up and develop data ingestion pipelines and components. Develop search-related components using Azure AI Search. Develop and deploy AI/ML models. Build and maintain scalable, high-performance AI apps on …
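An ingestion pipeline feeding a search index typically chunks documents before upload. This is a generic, stdlib-only sketch of that step; the chunk size, overlap, and record shape are arbitrary assumptions, and the actual upload to Azure AI Search (via its SDK) is deliberately omitted:

```python
def chunk_text(text, size=200, overlap=50):
    """Split text into overlapping word-window chunks for search indexing."""
    if size <= overlap:
        raise ValueError("size must exceed overlap")
    words = text.split()
    step = size - overlap
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + size]))
        if start + size >= len(words):
            break  # last window already covers the tail
    return chunks

def to_search_documents(doc_id, text, **chunk_kwargs):
    """Shape chunks as id/content records, the generic form search
    indexes accept on upload (field names here are made up)."""
    return [{"id": f"{doc_id}-{i}", "content": c}
            for i, c in enumerate(chunk_text(text, **chunk_kwargs))]
```

The overlap keeps sentences that straddle a chunk boundary retrievable from both neighbouring chunks, a common trade-off between index size and recall.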
I am working with a client in the education sector who is looking for a data engineer with experience across architecture and strategy to join on a part-time 12-month contract. 1-2 days per week. Fully remote. Outside IR35. Immediate start. 12-month contract. Essential: been to school in the UK; data ingestion of APIs; GCP-based (Google Cloud Platform); Snowflake …
as of 12 months ending December 2024 totaled $13.8 billion. Experience: minimum 10+ years. Strong knowledge of Hadoop, Kafka, SQL/NoSQL. Specialization in designing and implementing large-scale data pipelines, ETL processes, and distributed systems. Should be able to work independently with minimal help/guidance. Good understanding of Airflow, Data Fusion and Dataflow. Strong … background and experience in data ingestion, transformation, modeling and performance tuning. Migration experience from Cornerstone to GCP will be an added advantage. Support the design and development of the Big Data ecosystem. Experience in building complex SQL queries. Strong communication skills.
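Orchestrators such as Airflow and Data Fusion boil down to running tasks in dependency order; a toy topological-sort runner, with made-up task names, illustrates the core idea (minus scheduling, retries, and distribution):

```python
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps):
    """Run task callables in dependency order -- the core idea behind
    an Airflow DAG, stripped of scheduling and retry machinery."""
    executed = []
    for name in TopologicalSorter(deps).static_order():
        tasks[name]()          # a real task would ingest/transform data here
        executed.append(name)
    return executed

if __name__ == "__main__":
    log = []
    # Hypothetical linear ETL chain: extract -> transform -> load.
    tasks = {n: (lambda n=n: log.append(n)) for n in ("extract", "transform", "load")}
    deps = {"transform": {"extract"}, "load": {"transform"}}
    print(run_pipeline(tasks, deps))  # ['extract', 'transform', 'load']
```

`graphlib.TopologicalSorter` (stdlib, Python 3.9+) raises `CycleError` on circular dependencies, which is exactly the validation a DAG engine performs before scheduling.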
focusing on ultra-reliable, low-latency integrations with major crypto exchanges. Key Responsibilities: Design and implement exchange connectivity modules (REST, WebSocket, FIX, proprietary APIs). Optimize order entry, market data ingestion, and account management for performance and reliability. Contribute to a fault-tolerant, scalable connectivity framework using Rust and TypeScript. Collaborate with trading, infrastructure, and product teams to …
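Much of exchange connectivity work is normalizing each venue's message format into one internal schema and reconnecting robustly when a WebSocket drops. A stdlib-only sketch of both pieces, with an invented message shape (real exchanges each define their own fields), keeping to Python for consistency with the other examples even though the role itself names Rust and TypeScript:

```python
import json

def parse_trade(raw):
    """Normalize a hypothetical exchange trade message into a flat record.
    The field names "s"/"p"/"q"/"m" are invented for illustration."""
    msg = json.loads(raw)
    return {
        "symbol": msg["s"],
        "price": float(msg["p"]),
        "qty": float(msg["q"]),
        "side": "buy" if msg["m"] else "sell",
    }

def backoff_delays(attempts, base=0.5, cap=30.0):
    """Capped exponential backoff schedule (seconds) for WebSocket reconnects."""
    return [min(cap, base * (2 ** i)) for i in range(attempts)]
```

The cap prevents reconnect gaps from growing unbounded during a long exchange outage; production connectors usually add jitter on top to avoid synchronized reconnect storms.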