Engineering, or a related field. 3+ years of experience in machine learning operations, data engineering, or related roles. AWS Proficiency: Strong understanding of AWS services (e.g., EC2, S3, Lambda, SageMaker, ECS) and cloud infrastructure management. Programming and ML Frameworks: Proficiency in Python and experience with ML frameworks such as scikit-learn, TensorFlow, or PyTorch. CI/CD Experience: Experience …
scikit-learn, Hugging Face, etc.). Strong software engineering background: data structures, algorithms, distributed systems, and version control (Git). Experience designing scalable ML infrastructure on cloud platforms (AWS SageMaker, GCP AI Platform, Azure ML, or equivalent). Solid understanding of data-engineering concepts: SQL/NoSQL, data pipelines (Airflow, Prefect, or similar), and batch/streaming frameworks (Spark …
London, England, United Kingdom Hybrid / WFH Options
BBC
and maintaining tools that support data science and MLOps/LLMOps workflows. Collaborate with Data Scientists to deploy, serve, and monitor LLMs in real-time and batch environments using Amazon SageMaker and Bedrock. Implement Infrastructure-as-Code with AWS CDK or CloudFormation to provision and manage cloud environments. Build and maintain CI/CD pipelines using GitHub Actions, AWS CodePipeline …/MLOps experience with a strong focus on building and delivering scalable infrastructure for ML and AI applications using Python and cloud-native technologies. Experience with cloud services, especially Amazon Web Services (AWS): SageMaker, Bedrock, S3, EC2, Lambda, IAM, VPC, ECS/EKS. Proficiency in Infrastructure-as-Code using AWS CDK or CloudFormation. Experience implementing and scaling MLOps … workflows with tools such as MLflow and SageMaker Pipelines. Proven experience building, containerising, and deploying using Docker and Kubernetes. Hands-on experience with CI/CD tools (GitHub Actions, CodePipeline, Jenkins) and version control using Git/GitHub. Strong understanding of DevOps concepts including blue/green deployments, canary releases, rollback strategies, and infrastructure automation. Familiarity with security and compliance …
with data privacy regulations. Technical Competencies: This is a hands-on technical leadership role requiring advanced experience in most of the following technologies. Cloud Platforms: AWS (Amazon Web Services): Knowledge of services like S3, EC2, Lambda, RDS, Redshift, EMR, SageMaker, Glue, and Kinesis. Azure: Proficiency in services like Azure Blob Storage, Azure Data Lake, VMs … Kafka, and Apache Flink. ETL Tools: AWS Glue, Azure Data Factory, Talend, and Apache Airflow. AI & Machine Learning: Frameworks: TensorFlow, PyTorch, scikit-learn, Keras, and MXNet. AI Services: AWS SageMaker, Azure Machine Learning, Google AI Platform. DevOps & Infrastructure as Code: Containerization: Docker and Kubernetes. Infrastructure Automation: Terraform, Ansible, and AWS CloudFormation. API & Microservices: API Development: RESTful API design and … Formation, Azure Purview. Data Security Tools: AWS Key Management Service (KMS), Azure Key Vault. Data Analytics & BI: Visualization Tools: Tableau, Power BI, Looker, and Grafana. Analytics Services: AWS Athena, Amazon QuickSight, Azure Stream Analytics. Development & Collaboration Tools: Version Control: Git (and platforms like GitHub, GitLab). CI/CD Tools: Jenkins, Travis CI, AWS CodePipeline, Azure DevOps. Other Key …
working with various data types: text, image, and video data. Familiarity with AI/ML cloud implementations (AWS, Azure, GCP, NVIDIA) and cloud-based AI/ML services (e.g., Amazon SageMaker, Azure ML). Domain experience: industry knowledge and experience in one of the following: BFSI, Manufacturing, Retail/Consumer Goods, Healthcare, Energy or Utilities, Tech. Extensive experience working …
About Apexon: Apexon brings together distinct core competencies – in AI, analytics, app development, cloud, commerce, CX, data, DevOps, IoT, mobile, quality engineering and UX, and our deep expertise in BFSI, healthcare, and life sciences – to help businesses capitalize on the …
for data insights. Databricks/Dataiku. SQL for data access and processing (PostgreSQL preferred, but general SQL knowledge is important). Latest data science platforms (e.g., Databricks, Dataiku, AzureML, SageMaker) and frameworks (e.g., TensorFlow, MXNet, scikit-learn). Software engineering practices (coding standards, unit testing, version control, code review). Hadoop distributions (Cloudera, Hortonworks), NoSQL databases (Neo4j, Elastic), streaming technologies (Spark …
with Single Sign-On/OIDC integration and a deep understanding of OAuth, JWT/JWE/JWS. Solid understanding of backend performance optimization and debugging. Knowledge of AWS SageMaker and data analytics tools. Proficiency in frameworks such as TensorFlow, PyTorch, or similar. Preferred Qualifications, Capabilities, and Skills: Familiarity with LangChain, LangGraph, or any agentic frameworks is a strong plus. Python …
PyTorch, or similar. Experience with data science tools and platforms (e.g., Jupyter, Pandas, NumPy, MLflow). Familiarity with cloud-based AI tools and infrastructure, especially within the AWS ecosystem (SageMaker, Lambda, etc.). Strong understanding of data structures, algorithms, and statistical analysis. Experience working with ETL pipelines and structured/unstructured data. Solid grasp of software engineering principles, version …
City of London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
within large organisations, through e.g. the RFI/RFP process, as preferred bidder, documented bids and face-to-face presentations. Experience of data science platforms (e.g. Databricks, Dataiku, AzureML, SageMaker) and machine learning frameworks (e.g. Keras, TensorFlow, PyTorch, scikit-learn). Cloud platforms – demonstrable experience of building and deploying solutions to Cloud (e.g. AWS, Azure, Google Cloud) including Cloud provisioning …
London, England, United Kingdom Hybrid / WFH Options
Novo Nordisk
computing, both on-premises and on cloud platforms, and experience in building end-to-end scalable ML infrastructure with on-premises DGX or cloud platforms including AWS EKS/SageMaker, Azure Machine Learning/AKS, or common ML platforms (ClearML, MLflow, Weights & Biases). Cloud & Automation: Strong understanding of AWS, Azure, containerization/Kubernetes, multiple automation/DevOps …
developing and deploying data-driven solutions, with a particular focus on OCR use-cases and LLM applications within AWS environments. Key Responsibilities: - AWS Data Science Tools: Hands-on with SageMaker, Lambda, Step Functions, S3, Athena. - OCR Development: Experience with Amazon Textract, Tesseract, and LLM-based OCR. - Python Expertise: Skilled in Pandas, NumPy, scikit-learn, PyTorch, Hugging Face Transformers … GDPR), PII handling. - Agile Working: Experience in Agile/Scrum teams (Jira, Azure DevOps). Essential Skills & Experience: - 5-7 years in a Data Science role - Strong experience with Amazon Bedrock and SageMaker - Python integration with APIs (e.g., ChatGPT) - Demonstrable experience with LLMs in AWS - Proven delivery of OCR and document parsing pipelines …
interested in building data and science solutions to drive strategic direction? Based in Tokyo, the Science and Data Technologies team designs, builds, operates, and scales the data infrastructure powering Amazon's retail business in Japan. Working with a diverse, global team serving customers and partners worldwide, you can make a significant impact while continuously learning and experimenting with cutting … working with large-scale data, excels in highly complex technical environments, and above all, has a passion for data. You will lead the development of data solutions to optimize Amazon's retail operations in Japan, turning business needs into robust data pipelines and architecture. Leveraging your deep experience in data infrastructure and passion for enabling data-driven business impact … and business teams to identify and implement strategic data opportunities. Key job responsibilities Your key responsibilities include: - Create data solutions with AWS services such as Redshift, S3, EMR, Lambda, SageMaker, CloudWatch etc. - Implement robust data solutions and scalable data architectures. - Develop and improve the operational excellence, data quality, monitoring and data governance. BASIC QUALIFICATIONS - Bachelor's degree in computer …
structures, system design, testing, and performance optimization. Excellent communication and collaboration skills across technical and non-technical teams. Preferred Qualifications, Capabilities, and Skills: Experience with AWS cloud stack (S3, SageMaker, Lambda, ECS, etc.). Experience working with structured data, tabular models, and metadata-driven platforms. Experience with regulated data systems, enterprise controls, or secure data processing workflows. Contributions to open …
London, England, United Kingdom Hybrid / WFH Options
Highnic
agent solutions that integrate with enterprise systems and cloud platforms. Develop and optimize RESTful and GraphQL APIs to facilitate AI-driven interactions. Utilize AWS services (Lambda, S3, API Gateway, SageMaker, DynamoDB, ECS, etc.) to deploy scalable AI solutions. Implement Full Stack JavaScript (Node.js, React.js, Express, TypeScript, Next.js, etc.) applications to support AI-driven interfaces. Work closely with data scientists …
Ranking, Recommender systems, Graph techniques, and other advanced methodologies. Deep knowledge of LLM techniques, including Agents, Planning, and Reasoning. Experience deploying ML models on cloud platforms such as AWS (SageMaker, EKS, etc.). Experience working with large-scale MLOps pipelines and deploying models to production. About Us J.P. Morgan is a global leader in financial services, providing strategic advice and …
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
lead technical position, ideally within a product-focused SaaS environment. Strong command of Python and relevant libraries for machine learning, data engineering, and automation. Experience working with AWS (e.g., SageMaker, Bedrock) and infrastructure-as-code tools like Terraform. Solid understanding of large-scale data pipelines, distributed systems, and microservice architectures. Comfortable working with LLMs, model fine-tuning, and building …
SQL skills and experience with relational databases. • Proficiency in data visualization tools such as Power BI, Matplotlib, Seaborn, or Plotly. • Experience with cloud platforms, especially AWS (e.g., S3, EC2, SageMaker). • Familiarity with tools like Jupyter, Snowflake, Docker, and Atlassian suite (Bitbucket, JIRA, Confluence). Soft Skills • Strong analytical and problem-solving abilities. • Excellent communication and teamwork skills. • Natural …
Language Model (LLM) techniques, including Agents, Planning, Reasoning, and other related methods. Experience with building and deploying ML models on cloud platforms such as AWS, using tools like SageMaker, EKS, etc. Experience working with large-scale MLOps pipelines, and deploying models to production services. About Us J.P. Morgan is a global leader in financial services, providing …