City of London, London, United Kingdom | Hybrid / WFH Options
McCabe & Barton
… expertise in some of the following: Python, SQL, Scala, and Java for data engineering. Strong experience with big data tools (Apache Spark, Hadoop, Databricks, Dask) and cloud platforms (AWS, Azure, GCP). Proficient in data modelling (relational, NoSQL, dimensional) and DevOps automation (Docker, Kubernetes, Terraform, CI/CD). Skilled …
… years of experience in designing, developing and deploying production-level deep learning recommendation models with proven business impact. Fluency in Python, Pandas/Dask, SQL, PyTorch or TensorFlow. Ability to write readable and maintainable code. Strong communication and storytelling skills with both technical and non-technical audiences. Ability to …
… Azure/GCP and AWS. Experience in automation of performance testing. Exposure to data environments is a plus (Airflow, EMR, SageMaker, Ray, TensorFlow, MLflow, Kubeflow, Dask). Working conditions: occasional out-of-hours conferencing with overseas colleagues; occasional out-of-hours or weekend work. A workplace that supports & role models …
… of Python, R, and Java. Experience scaling machine learning on data and compute grids. Proficiency with Kubernetes, Docker, Linux, and cloud computing. Experience with Dask, Airflow, and MLflow. MLOps, CI, Git, and Agile processes. Why you do not want to miss this career opportunity: we are a mission-driven firm …
… cases. Proficient in a deep learning stack such as PyTorch or TensorFlow. Working knowledge of parallelisation and async paradigms in Python, Spark, Dask, Ray. An awareness of and interest in economic, financial and general business concepts and terminology. Excellent written and verbal command of English. Strong problem-solving …
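To give context on the parallelisation paradigms this kind of role refers to, here is a minimal, hypothetical sketch (not taken from any posting): it builds a lazy task graph with dask.delayed and evaluates the independent tasks in parallel on a local thread pool.

```python
# Minimal illustrative sketch (hypothetical, not from the posting):
# parallelising independent tasks in Python with dask.delayed.
import dask
from dask import delayed


@delayed
def score(x):
    # Stand-in for a CPU- or IO-bound unit of work (hypothetical).
    return x * x


def run(values):
    tasks = [score(v) for v in values]                # build a lazy task graph
    return dask.compute(*tasks, scheduler="threads")  # evaluate tasks in parallel


if __name__ == "__main__":
    print(run(range(8)))  # (0, 1, 4, 9, 16, 25, 36, 49)
```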
… /CD, and modern development workflows. Solid understanding of data structures, algorithms, and performance tuning. Nice-to-have: experience with distributed computing frameworks (e.g. Dask, Spark); exposure to cloud platforms (AWS, GCP, Azure); familiarity with SQL and relational databases; experience working in agile environments …
… ML models into production environments, including both batch and real-time/streaming contexts. Proficiency working with distributed computing frameworks such as Apache Spark, Dask, or similar. Experience with cloud-native ML deployment, particularly on AWS, using services like ECS, EKS, Fargate, Lambda, S3, and more. Familiarity with orchestration and …
… to interpret performance metrics (e.g., CPU, memory, threads, file handles). Knowledge and experience in distributed computing: parallel computation on a single machine with tools like Dask, and distributed processing on public cloud. Knowledge of the SDLC and experience working through the entire project life cycle from start to finish. ABOUT …
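For illustration of what "parallel computation on a single machine" with Dask typically looks like, here is a minimal, hypothetical sketch (assumptions only, not drawn from the posting): it starts a local Dask cluster that uses the machine's cores and computes a simple aggregate over a chunked array.

```python
# Minimal hypothetical sketch: single-machine parallel computation with a
# local Dask cluster (not taken from the posting).
import dask.array as da
from dask.distributed import Client, LocalCluster

if __name__ == "__main__":
    # LocalCluster schedules work across local worker processes/threads.
    cluster = LocalCluster(n_workers=4, threads_per_worker=1)
    client = Client(cluster)

    # A chunked random array; each chunk can be processed in parallel.
    x = da.random.random((10_000, 10_000), chunks=(1_000, 1_000))
    print(x.mean().compute())  # aggregate computed across local workers

    client.close()
    cluster.close()
```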
C/C++ HPC experience: development and operations of highly distributed software systems, and distributed, multi-core, data-driven processing algorithm development (such as Dask, MPI and/or OpenMP …
Staines, Middlesex, United Kingdom | Hybrid / WFH Options
Industrial and Financial Systems
IFS is a billion-dollar revenue company with 7000+ employees on all continents. Our leading AI technology is the backbone of our award-winning enterprise software solutions, enabling our customers to be their best when it really matters, at the …