London, South East, England, United Kingdom (Hybrid / WFH Options)
Robert Half
…learn, TensorFlow, PyTorch, or Hugging Face
- Solid understanding of machine learning fundamentals and performance evaluation techniques
- Experience working in cloud platforms (AWS, GCP, or Azure) with MLOps tools (e.g. MLflow, SageMaker, Vertex AI)
- Comfortable working independently and delivering high-quality work to tight timelines
- Experience working in fast-paced environments or scale-up settings
Company: Market-leading financial services (fintech …
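For illustration only (not part of the listing): a minimal sketch of the evaluation-and-tracking workflow this kind of role describes, assuming scikit-learn plus a reachable MLflow tracking server; the dataset, experiment name, and hyperparameters are placeholders.

```python
# Hypothetical example: train a classifier, evaluate it, and log results to MLflow.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

mlflow.set_experiment("demo-classifier")  # hypothetical experiment name
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=200, random_state=42)
    model.fit(X_train, y_train)

    preds = model.predict(X_test)
    probs = model.predict_proba(X_test)[:, 1]

    # Log hyperparameters, evaluation metrics, and the fitted model artifact.
    mlflow.log_param("n_estimators", 200)
    mlflow.log_metric("accuracy", accuracy_score(y_test, preds))
    mlflow.log_metric("roc_auc", roc_auc_score(y_test, probs))
    mlflow.sklearn.log_model(model, "model")
```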
…backend development.
- Solid understanding of data processing and engineering workflows.
- Experience building APIs or services to support data or ML applications.
- Familiarity with ML model lifecycle and tooling (e.g. MLflow, Airflow, Docker).
- Strong problem-solving skills and the ability to work autonomously in a dynamic environment.
DESIRABLE SKILLS
- Experience supporting LLM training or retrieval-augmented generation (RAG).
- Familiarity …
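For illustration only: a hedged sketch of a small API serving an ML model, in the spirit of the "APIs or services to support ML applications" and MLflow lifecycle requirements above. It assumes FastAPI, a model already registered in the MLflow Model Registry, and a tracking URI configured via the MLFLOW_TRACKING_URI environment variable; the model name, stage, and feature schema are hypothetical.

```python
# Hypothetical sketch: a FastAPI service that serves predictions from an MLflow-registered model.
from typing import List

import mlflow.pyfunc
import pandas as pd
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

# Load once at startup; "demo-classifier" and the "Production" stage are placeholders.
model = mlflow.pyfunc.load_model("models:/demo-classifier/Production")

class PredictRequest(BaseModel):
    rows: List[dict]  # each dict is one feature row, e.g. {"age": 42, "balance": 100.0}

@app.post("/predict")
def predict(req: PredictRequest):
    frame = pd.DataFrame(req.rows)      # build a feature frame from the request payload
    preds = model.predict(frame)        # delegate scoring to the loaded pyfunc model
    return {"predictions": list(map(float, preds))}
```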
…across the Azure stack. Your responsibilities will include:
- Designing and deploying ETL pipelines using PySpark and Delta Lake on Databricks.
- Supporting the deployment and operationalisation of ML models with MLflow and Databricks Workflows.
- Building out reusable data products and feature stores for data science teams.
- Tuning performance across clusters, jobs, and workflows.
- Migrating legacy systems (SSIS/SQL) to Databricks …
- Strong background in data engineering and distributed processing.
- Hands-on knowledge of Azure Data Lake, Data Factory, or similar orchestration tools.
- Experience with ML model deployment, preferably using MLflow or similar tools.
- Proficient in SQL, Python, and cloud-based data pipelines.
- Comfortable in fast-paced, agile delivery environments.
DESIRABLE SKILLS
- Experience building or integrating feature stores.
- Familiarity with …
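For illustration only: a minimal sketch of the kind of PySpark-to-Delta Lake ETL step listed in the responsibilities above, assuming a Databricks-style environment where a SparkSession and Delta Lake support are already available; the source path, table name, and column names are made up.

```python
# Hypothetical PySpark -> Delta Lake ETL step (paths, columns, and table names are placeholders).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` is provided for you

# Extract: read raw CSV files landed in the data lake.
raw = (
    spark.read.option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/raw/transactions/")
)

# Transform: de-duplicate, fix types, and add an ingestion date column.
cleaned = (
    raw.dropDuplicates(["transaction_id"])
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("ingest_date", F.current_date())
)

# Load: append to a managed Delta table, partitioned by ingestion date.
(
    cleaned.write.format("delta")
    .mode("append")
    .partitionBy("ingest_date")
    .saveAsTable("analytics.transactions_clean")
)
```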