… Engineering, or a related field
3+ years of experience in machine learning operations, data engineering, or related roles
AWS Proficiency: Strong understanding of AWS services (e.g., EC2, S3, Lambda, SageMaker, ECS) and cloud infrastructure management
Programming and ML Frameworks: Proficiency in Python and experience with ML frameworks such as scikit-learn, TensorFlow, or PyTorch
CI/CD Experience: Experience …
… scikit-learn, Hugging Face, etc.).
Strong software engineering background: data structures, algorithms, distributed systems, and version control (Git).
Experience designing scalable ML infrastructure on cloud platforms (AWS SageMaker, GCP AI Platform, Azure ML, or equivalent).
Solid understanding of data-engineering concepts: SQL/NoSQL, data pipelines (Airflow, Prefect, or similar; a minimal Airflow sketch follows below), and batch/streaming frameworks (Spark …
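The "data pipelines (Airflow, Prefect, or similar)" requirement above is easiest to illustrate with a short sketch. Below is a minimal, hypothetical Airflow example: the DAG id, task ids, and the extract/transform/load callables are all invented placeholders, not anything taken from these listings.

```python
# Minimal sketch of a daily batch pipeline in Apache Airflow (2.4+).
# All names (dag_id, task ids, callables) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pull raw data from a source system (placeholder).
    print("extracting raw data")


def transform():
    # Clean and aggregate the extracted data (placeholder).
    print("transforming data")


def load():
    # Write the result to a warehouse or feature store (placeholder).
    print("loading features")


with DAG(
    dag_id="daily_feature_refresh",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load  # linear dependency chain
```

The same three-step shape (extract, transform, load with explicit dependencies) carries over to Prefect or any comparable orchestrator.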
… and maintaining tools that support data science and MLOps/LLMOps workflows.
Collaborate with Data Scientists to deploy, serve, and monitor LLMs in real-time and batch environments using Amazon SageMaker and Bedrock (see the invocation sketch below).
Implement Infrastructure-as-Code with AWS CDK and CloudFormation to provision and manage cloud environments.
Build and maintain CI/CD pipelines using GitHub Actions, AWS CodePipeline …
…/MLOps experience with a strong focus on building and delivering scalable infrastructure for ML and AI applications using Python and cloud-native technologies
Experience with cloud services, especially Amazon Web Services (AWS): SageMaker, Bedrock, S3, EC2, Lambda, IAM, VPC, ECS/EKS.
Proficiency in Infrastructure-as-Code using AWS CDK or CloudFormation.
Experience implementing and scaling MLOps … workflows with tools such as MLflow, SageMaker Pipelines.
Proven experience building, containerising, and deploying using Docker and Kubernetes.
Hands-on experience with CI/CD tools (GitHub Actions, CodePipeline, Jenkins) and version control using Git/GitHub.
Strong understanding of DevOps concepts including blue/green deployments, canary releases, rollback strategies, and infrastructure automation.
Familiarity with security and compliance …
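To make the "deploy, serve, and monitor LLMs using Amazon SageMaker" responsibility concrete, here is a minimal sketch of calling an already-deployed SageMaker real-time endpoint with boto3. The endpoint name, region, and JSON payload schema are assumptions; real payloads depend entirely on how the model container was packaged.

```python
# Minimal sketch: invoke a deployed SageMaker real-time inference endpoint.
# "llm-demo-endpoint", the region, and the payload schema are hypothetical;
# they depend on how the model was packaged and deployed.
import json

import boto3

runtime = boto3.client("sagemaker-runtime", region_name="eu-west-2")  # region is an assumption

payload = {
    "inputs": "Summarise the following text: ...",
    "parameters": {"max_new_tokens": 128},
}

response = runtime.invoke_endpoint(
    EndpointName="llm-demo-endpoint",
    ContentType="application/json",
    Body=json.dumps(payload),
)

result = json.loads(response["Body"].read())
print(result)
```

Batch workloads would typically go through a SageMaker batch transform job or an asynchronous endpoint instead of this synchronous call.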
… with data privacy regulations.
Technical Competencies: this is a hands-on technical leadership role with advanced experience in most of the following technologies.
Cloud Platforms: AWS (Amazon Web Services): knowledge of services like S3, EC2, Lambda, RDS, Redshift, EMR, SageMaker, Glue, and Kinesis. Azure: proficiency in services like Azure Blob Storage, Azure Data Lake, VMs …
… Kafka, and Apache Flink.
ETL Tools: AWS Glue, Azure Data Factory, Talend, and Apache Airflow.
AI & Machine Learning: Frameworks: TensorFlow, PyTorch, scikit-learn, Keras, and MXNet. AI Services: AWS SageMaker, Azure Machine Learning, Google AI Platform.
DevOps & Infrastructure as Code: Containerization: Docker and Kubernetes. Infrastructure Automation: Terraform, Ansible, and AWS CloudFormation.
API & Microservices: API Development: RESTful API design and …
… Formation, Azure Purview. Data Security Tools: AWS Key Management Service (KMS), Azure Key Vault.
Data Analytics & BI: Visualization Tools: Tableau, Power BI, Looker, and Grafana. Analytics Services: AWS Athena, Amazon QuickSight, Azure Stream Analytics.
Development & Collaboration Tools: Version Control: Git (and platforms like GitHub, GitLab). CI/CD Tools: Jenkins, Travis CI, AWS CodePipeline, Azure DevOps.
Other Key …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
ZipRecruiter
… scalable Data Science, MLOps, and LLMOps workflows across the organisation.
Drive strategy and execution for deploying, serving, and monitoring large language models (LLMs) in real-time and batch environments using Amazon SageMaker, Bedrock, and related services.
Guide the use of Infrastructure-as-Code (IaC) practices with AWS CDK and CloudFormation to provision and manage secure and maintainable cloud environments. …
Extensive experience in DevOps/MLOps roles with demonstrated impact in building, scaling, and securing ML/AI infrastructure in cloud environments.
Strong experience with AWS services such as SageMaker, Bedrock, S3, EC2, Lambda, IAM, VPC, ECS/EKS, with a strong command of cloud solution architecture.
Advanced proficiency in Infrastructure-as-Code practices using AWS CDK, CloudFormation, or … Terraform in production environments.
Proven track record designing and operationalising end-to-end MLOps pipelines with tools such as MLflow, SageMaker Pipelines, or equivalent frameworks (see the tracking sketch below).
Extensive experience building and operating containerised applications using Docker and Kubernetes, including production-grade orchestration and monitoring.
Deep experience with CI/CD best practices with hands-on expertise in GitHub Actions, Jenkins, and …
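Several listings above ask for MLOps pipelines built around experiment tracking (MLflow, SageMaker Pipelines). As a minimal illustration, the sketch below logs parameters, metrics, and a trained model to an MLflow tracking store; the experiment name and the toy dataset are assumptions, and a real pipeline would point MLFLOW_TRACKING_URI at a shared server and model registry.

```python
# Minimal sketch of MLflow experiment tracking around a scikit-learn model.
# The experiment name and toy data are placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo-churn-model")  # hypothetical experiment name

with mlflow.start_run():
    params = {"n_estimators": 200, "max_depth": 8}
    model = RandomForestClassifier(**params, random_state=42).fit(X_train, y_train)

    mlflow.log_params(params)
    mlflow.log_metric("test_accuracy", accuracy_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(model, "model")  # stored as a run artifact
```

The same run-per-training-job pattern is what SageMaker Pipelines or equivalent frameworks wrap with scheduling, lineage, and deployment steps.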
… working with various data types: text, image, and video data.
Familiarity with AI/ML cloud implementations (AWS, Azure, GCP, NVIDIA) and cloud-based AI/ML services (e.g., Amazon SageMaker, Azure ML).
Domain experience: industry knowledge and experience in one of the following: BFSI, Manufacturing, Retail/Consumer Goods, Healthcare, Energy or Utilities, Tech.
Extensive experience working …
… model selection, hyperparameter tuning, model validation, model evaluation, and deployment for inference.
Hands-on expertise in deploying ML models at scale in production environments (via platforms such as AWS SageMaker or Azure ML), and optimising models for efficient inference using formats like ONNX and TensorRT (see the export sketch below).
Proficiency in Python and ML/engineering frameworks such as PyTorch, TensorFlow (including Keras …
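The ONNX point is worth a concrete example. This is a minimal sketch, with the ResNet-18 stand-in model and the input shape chosen purely as assumptions, of exporting a PyTorch model to ONNX and running it with ONNX Runtime; a common first step before further optimisation with TensorRT.

```python
# Minimal sketch: export a PyTorch model to ONNX and run it with ONNX Runtime.
# The ResNet-18 stand-in and the 3x224x224 input shape are hypothetical choices.
import numpy as np
import onnxruntime as ort
import torch
import torchvision.models as models

model = models.resnet18(weights=None).eval()      # stand-in for a trained model
dummy_input = torch.randn(1, 3, 224, 224)         # example input used to trace the graph

torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input"],
    output_names=["logits"],
    dynamic_axes={"input": {0: "batch"}},          # allow variable batch size
)

session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
batch = np.random.rand(4, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {"input": batch})
print(outputs[0].shape)  # (4, 1000) for the ResNet-18 stand-in
```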
About Apexon: Apexon brings together distinct core competencies – in AI, analytics, app development, cloud, commerce, CX, data, DevOps, IoT, mobile, quality engineering and UX, and our deep expertise in BFSI, healthcare, and life sciences – to help businesses capitalize on the …
… for data insights
Databricks/Dataiku
SQL for data access and processing (PostgreSQL preferred, but general SQL knowledge is important)
Latest Data Science platforms (e.g., Databricks, Dataiku, AzureML, SageMaker) and frameworks (e.g., TensorFlow, MXNet, scikit-learn)
Software engineering practices (coding standards, unit testing, version control, code review)
Hadoop distributions (Cloudera, Hortonworks), NoSQL databases (Neo4j, Elastic), streaming technologies (Spark …
… with Single Sign-On/OIDC integration and a deep understanding of OAuth, JWT/JWE/JWS (a minimal validation sketch follows below).
Solid understanding of backend performance optimization and debugging.
Knowledge of AWS SageMaker and data analytics tools.
Proficiency in frameworks such as TensorFlow, PyTorch, or similar.
Preferred Qualifications, Capabilities, and Skills:
Familiarity with LangChain, LangGraph, or any agentic frameworks is a strong plus.
Python …
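Since the listing above calls out OAuth and JWT/JWE/JWS specifically, here is a minimal token-validation sketch using PyJWT. The secret, audience, and claims are hypothetical, and HS256 is used only to keep the example self-contained; an OIDC integration would normally verify RS256 tokens against the identity provider's published JWKS.

```python
# Minimal sketch of JWT validation with PyJWT (pip install PyJWT).
# SECRET, EXPECTED_AUD, and the demo claims are hypothetical placeholders.
import jwt

SECRET = "change-me"        # assumption: shared signing secret (HS256 for brevity)
EXPECTED_AUD = "my-api"     # assumption: audience configured at the identity provider


def validate_token(token: str) -> dict:
    """Return the verified claims, or raise jwt.InvalidTokenError."""
    return jwt.decode(
        token,
        SECRET,
        algorithms=["HS256"],                   # pin the algorithm; never accept "none"
        audience=EXPECTED_AUD,
        options={"require": ["exp", "iat"]},    # expiry and issued-at must be present
    )


if __name__ == "__main__":
    demo = jwt.encode(
        {"sub": "user-1", "aud": EXPECTED_AUD, "iat": 1_700_000_000, "exp": 4_102_444_800},
        SECRET,
        algorithm="HS256",
    )
    print(validate_token(demo)["sub"])  # -> user-1
```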
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
… winning work with private and public sector clients, including participating in RFI/RFP processes, preparing bids, and delivering presentations.
Familiarity with data science platforms (e.g., Databricks, Dataiku, AzureML, SageMaker) and frameworks (e.g., Keras, TensorFlow, PyTorch, scikit-learn).
Experience deploying solutions on Cloud platforms (AWS, Azure, Google Cloud) using provisioning tools like Terraform.
Proven ability to deploy technologies …
… within large organisations, through e.g. the RFI/RFP process, as preferred bidder, documented bids and face-to-face presentations.
Experience of data science platforms (e.g. Databricks, Dataiku, AzureML, SageMaker) and machine learning frameworks (e.g. Keras, TensorFlow, PyTorch, scikit-learn).
Cloud platforms – demonstrable experience of building and deploying solutions to Cloud (e.g. AWS, Azure, Google Cloud) including Cloud provisioning …
… PyTorch, or similar.
Experience with data science tools and platforms (e.g., Jupyter, Pandas, NumPy, MLflow).
Familiarity with cloud-based AI tools and infrastructure, especially within the AWS ecosystem (SageMaker, Lambda, etc.).
Strong understanding of data structures, algorithms, and statistical analysis.
Experience working with ETL pipelines and structured/unstructured data (see the pandas sketch below).
Solid grasp of software engineering principles, version …
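As a small illustration of the "ETL pipelines and structured/unstructured data" point above, here is a minimal pandas sketch. The column names and the inline demo data are invented for illustration; a real extract step would read from a file, object store, or warehouse query.

```python
# Minimal pandas ETL sketch. Column names and demo data are hypothetical;
# to_parquet requires pyarrow or fastparquet to be installed.
import pandas as pd


def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean raw events and aggregate to a per-user daily summary."""
    df = raw.dropna(subset=["user_id", "amount"]).copy()
    df["event_date"] = pd.to_datetime(df["event_time"]).dt.date
    return (
        df.groupby(["event_date", "user_id"], as_index=False)
          .agg(total_amount=("amount", "sum"), events=("amount", "size"))
    )


if __name__ == "__main__":
    # Extract: stand-in for pd.read_csv("raw_events.csv") or a warehouse query.
    raw = pd.DataFrame(
        {
            "event_time": ["2024-05-01 09:00", "2024-05-01 10:30", "2024-05-02 11:00"],
            "user_id": ["u1", "u1", "u2"],
            "amount": [10.0, 5.5, None],
        }
    )
    summary = transform(raw)
    summary.to_parquet("daily_summary.parquet", index=False)  # Load
    print(summary)
```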
… to influence across teams
Familiarity with MLOps tooling and scalable cloud platforms
Nice to have:
Experience setting or influencing data science strategy in a growing organisation
Familiarity with AWS SageMaker or similar cloud-based ML tools
Published work, open-source contributions or community involvement
MSc or PhD in a quantitative field (e.g. statistics, computer science, mathematics)
What We Can …