Dunstable, Bedfordshire, South East, United Kingdom
FBI &TMT
systems, deploying LLMs, and operationalizing models in production. Key Responsibilities: Design, develop, and deploy ML, Deep Learning, and LLM solutions. Implement scalable ML and data pipelines in Databricks (PySpark, Delta Lake, MLflow). Build automated MLOps pipelines with model tracking, CI/CD, and registry. Deploy and operationalize LLMs, including fine-tuning, prompt optimization, and monitoring. Architect secure … Mentor engineers, enforce best practices, and lead design/architecture reviews. Required Skills & Experience: 5+ years in ML/AI solution development. Recent hands-on experience with Databricks, PySpark, Delta Lake, MLflow. Experience with LLMs (Hugging Face, LangChain, Azure OpenAI). Strong MLOps, CI/CD, and model monitoring experience. Proficiency in Python, PyTorch/TensorFlow, FastAPI …
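For context on the stack this listing names (not taken from the posting itself), below is a minimal sketch of a Databricks-style pipeline that writes a Delta table and tracks a model with MLflow. It assumes a Databricks workspace where Delta Lake and MLflow are available; the paths, table name, columns, and model are hypothetical.

```python
# Illustrative sketch only: hypothetical paths, table names, and features.
import mlflow
import mlflow.sklearn
from pyspark.sql import SparkSession, functions as F
from sklearn.linear_model import LogisticRegression

spark = SparkSession.builder.getOrCreate()  # already provided on Databricks

# Clean raw events and persist them as a Delta table.
raw = spark.read.format("delta").load("/mnt/raw/events")  # hypothetical path
features = raw.dropna(subset=["label"]).withColumn("amount_log", F.log1p("amount"))
features.write.format("delta").mode("overwrite").saveAsTable("ml.features")

# Train a simple model, log a metric, and register the model so an automated
# MLOps pipeline can promote or deploy specific versions from the registry.
pdf = spark.table("ml.features").select("amount_log", "label").toPandas()
with mlflow.start_run():
    model = LogisticRegression().fit(pdf[["amount_log"]], pdf["label"])
    mlflow.log_metric("train_accuracy", model.score(pdf[["amount_log"]], pdf["label"]))
    mlflow.sklearn.log_model(model, "model", registered_model_name="demo-classifier")
```

Registering the model by name is what lets a CI/CD workflow pick up new versions from the MLflow registry rather than from ad hoc artifacts.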
Cambridge, East Anglia, United Kingdom Hybrid / WFH Options
KDR Talent Solutions
a scalable, company-wide Databricks Lakehouse platform on AWS. Be the hands-on technical expert, building and optimising robust ELT/ETL pipelines using Python, Spark, and Databricks (e.g., Delta Live Tables, Databricks Workflows). Work with unique, complex, and high-volume datasets from IoT-enabled robotic systems, manufacturing lines, and core business functions. Partner with data scientists and … BI teams to establish best-in-class data models, governance, and data quality standards within Delta Lake. Evangelise the benefits of the Lakehouse across the organisation, championing best practices and mentoring other engineers to build their Databricks capability. Own the data platform's roadmap, ensuring it is scalable, reliable, and secure as the company grows. What You'll Need … Proven, deep commercial experience with Databricks. You must have hands-on expertise with Delta Lake and the Lakehouse paradigm. Strong expertise in the AWS data ecosystem (e.g., S3, AWS Glue, Kinesis, IAM) and a deep understanding of how to build, secure, and optimise a Databricks platform within it. Expert-level Python and SQL skills, specifically for data engineering …
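As an illustration of the Lakehouse pattern this listing describes (not part of the posting), here is a brief Delta Live Tables sketch that ingests IoT telemetry from S3 and applies simple data-quality expectations. It only runs inside a Databricks DLT pipeline, and the bucket, table names, columns, and rules are assumptions.

```python
# Illustrative DLT sketch: hypothetical S3 path, columns, and expectations.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw robot telemetry landed in S3 (hypothetical bucket).")
def robot_telemetry_raw():
    return (
        spark.readStream.format("cloudFiles")            # Databricks Auto Loader
             .option("cloudFiles.format", "json")
             .load("s3://example-bucket/robot-telemetry/")
    )

@dlt.table(comment="Cleaned telemetry with basic data-quality rules applied.")
@dlt.expect_or_drop("valid_device", "device_id IS NOT NULL")
@dlt.expect_or_drop("valid_timestamp", "event_ts IS NOT NULL")
def robot_telemetry_clean():
    return (
        dlt.read_stream("robot_telemetry_raw")
           .withColumn("event_date", F.to_date("event_ts"))
    )
```

The expectations drop malformed rows and record quality metrics for the pipeline, which is one common way the governance and data-quality standards mentioned above are enforced in Delta Lake.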