cloud platforms (e.g., AWS, Azure, GCP). Experience deploying ML models or managing AI/ML workflows in production. Working knowledge of big data technologies like Spark, Hive, or Hadoop. Familiarity with MLOps tools (e.g., MLflow, Kubeflow, DataRobot). Education Bachelor's degree in Computer Science, Software Engineering, or a related technical field, or equivalent practical experience.
functional teams. Preferred Skills High-Performance Computing (HPC) and AI workloads for large-scale enterprise solutions. NVIDIA CUDA, cuDNN, TensorRT experience for deep learning acceleration. Big Data platforms (Hadoop, Spark) for AI-driven analytics in professional services. Please share your CV at payal.c@hcltech.com
orchestration tools like Apache Airflow or Cloud Composer. Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development. Scala is mandated in some
Engineering, Mathematics, Finance, etc. Proficiency in Python, SQL, and one or more of: R, Java, Scala. Experience with relational/NoSQL databases (e.g., PostgreSQL, MongoDB). Familiarity with big data tools (Hadoop, Spark, Kafka), cloud platforms (Azure, AWS, GCP), and workflow tools (Airflow, Luigi). Bonus: experience with BI tools, API integrations, and graph databases. Why Join Us? Work with large-scale
East London, London, United Kingdom Hybrid / WFH Options
McGregor Boyall Associates Limited
Strong knowledge of LLM algorithms and training techniques. Experience deploying models in production environments. Nice to Have: Experience in GenAI/LLMs. Familiarity with distributed computing tools (Hadoop, Hive, Spark). Background in banking, risk management, or capital markets. Why Join? This is a unique opportunity to work at the forefront of AI innovation in financial
South East London, England, United Kingdom Hybrid / WFH Options
Anson McCade
Knowledge of CI/CD processes and infrastructure-as-code. • Eligible for SC clearance (active clearance highly advantageous). Desirable Skills • Exposure to large data processing frameworks (e.g., Spark, Hadoop). • Experience deploying data via APIs and microservices. • AWS certifications (Solution Architect Associate, Data Analytics Speciality, etc.). • Experience in public sector programmes or government frameworks. Package & Benefits
Azure Functions. Strong knowledge of scripting languages (e.g., Python, Bash, PowerShell) for automation and data transformation. Proficient in working with databases, data warehouses, and data lakes (e.g., SQL, NoSQL, Hadoop, Redshift). Familiarity with APIs and web services for integrating external systems and applications into orchestration workflows. Hands-on experience with data transformation and ETL (Extract, Transform, Load) processes.
security principles, system performance optimisation, and infrastructure reliability. Experience working on large-scale, production-grade systems with distributed architectures. Nice to Have: Exposure to tools like Elasticsearch/Kibana, Hadoop/HBase, OpenSearch, or VPN/proxy architectures. The ideal candidate will: Bring technical vision, initiative, and a passion for exploring and implementing emerging technologies. Be a natural technical leader
Senior Data Scientist (PhD) - Canary Wharf We urgently require a Senior Data Scientist with a proven track record of at least 5 to 10 years as a Data Scientist. Must have a PhD, ideally in Maths or Physics, and an excellent understanding of