Associate Cloud Consultant, Data Analytics, AWS Professional Services, Public Sector Job ID: Amazon Web Services, Inc. - A97 At Amazon Web Services (AWS), we're hiring highly technical Data Analytics consultants to collaborate with our customers and partners on key engagements. Our consultants will develop and deliver proof-of … travel in their contract. Key job responsibilities In this role, you will work with our partners and customers and focus on our AWS offerings such as Amazon Kinesis, AWS Glue, Amazon Redshift, Amazon EMR, Amazon Athena, Amazon SageMaker, Amazon Bedrock, Amazon Q, and Amazon … ideal candidate will have extensive experience with design, development and operations that leverages deep knowledge of modern data services like DynamoDB, Amazon Kinesis, Apache Kafka, Apache Spark, Amazon SageMaker, and Amazon RDS, as well as other industry data technologies, structures, and standards. A day in the …
help them achieve business outcomes with AWS. Our projects are often unique, one-of-a-kind endeavors that no one has done before. At Amazon Web Services (AWS), we are helping large enterprises build AI solutions on the AWS Cloud. We apply predictive technology to large volumes of data … our customers. You will leverage the global scale, elasticity, automation, and high-availability features of the AWS platform. You will build customer solutions with Amazon SageMaker, Amazon Bedrock, Amazon Elastic Compute Cloud (EC2), AWS Data Pipeline, Amazon S3, AWS Glue, Amazon DynamoDB, Amazon Relational Database Service (RDS), Amazon Elastic Map Reduce (EMR), Amazon Kinesis, AWS Lake Formation, and other AWS services. You will collaborate across the whole AWS organization, with other consultants, customer teams, and partners on proof-of-concepts, workshops, and complex implementation projects. You will innovate and experiment to help …
the latest data analytics technologies? Would you like a career path that enables you to progress with the rapid adoption of cloud computing? At Amazon Web Services, we're hiring highly technical cloud architects specialised in data analytics to collaborate with our customers and partners to derive business value … Key job responsibilities Expertise - Collaborate with pre-sales and delivery teams to help partners and customers learn and use services such as AWS Glue, Amazon S3, Amazon DynamoDB, Amazon Relational Database Service (RDS), Amazon Elastic Map Reduce (EMR), Amazon Kinesis, Amazon Redshift, Amazon Athena, AWS Lake Formation, Amazon DataZone, Amazon SageMaker and Amazon QuickSight. Solutions - Deliver technical engagements with partners and customers. This includes participating in pre-sales visits, understanding customer requirements, creating consulting proposals and creating packaged data analytics service offerings. Delivery - Engagements include projects proving the …
career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying. Why AWS? Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating - that's … experience with model customization techniques such as fine-tuning, continued pre-training, and LLM-as-judge evaluation - Experience with optimization of models on GPUs, Amazon Silicon, or TPUs, as well as experience with open-source frameworks for building applications powered by LLMs, such as LangChain, LlamaIndex, and/or similar tools - Experience … building generative AI applications on AWS using services such as Amazon Bedrock and Amazon SageMaker. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your experience and skills. We value …
Azure. Experience with business intelligence tools like Tableau or Power BI. Experience working with LLMs. Experience working with AWS services like EC2, RDS (Postgres), SQS, SageMaker, MLflow, S3, API Gateway, ECS. Experience in UI frameworks like VueJS is a plus. About Us FactSet creates flexible, open data and software solutions …
Electrical Engineering, Computer Engineering or related field. Experience in containerization - Docker/Kubernetes. Experience in AWS cloud and services (S3, Lambda, Aurora, ECS, EKS, SageMaker, Bedrock, Athena, Secrets Manager, Certificate Manager etc.) Proven DevOps/MLOps experience provisioning and maintaining infrastructure leveraging some of the following: Terraform, Ansible, AWS …
accessing and processing data (PostgreSQL preferred but general SQL knowledge is more important). Familiarity with the latest data science platforms (e.g. Databricks, Dataiku, AzureML, SageMaker) and frameworks (e.g. TensorFlow, MXNet, scikit-learn). Knowledge of software engineering practices (coding practices for DS, unit testing, version control, code review).
OIDC integration and a deep understanding of OAuth, JWT/JWE/JWS. Strong skills in backend performance optimization and debugging. Knowledge of AWS SageMaker and data analytics tools. Proficiency in frameworks like TensorFlow, PyTorch, or similar. Preferred Qualifications, Capabilities, and Skills: Familiarity with LangChain, LangGraph, or other Agentic …
london, south east england, United Kingdom Hybrid / WFH Options
SR2 | Socially Responsible Recruitment | Certified B Corporation™
CI/CD pipelines, and test-driven development. Nice to Have: Experience within financial services or fintech. Familiarity with MLOps tools (e.g., MLflow, Kubeflow, SageMaker). Understanding of regulatory or compliance frameworks. Academic background in mathematics, statistics, or quantitative disciplines. Who You Are: A natural problem-solver and independent …
the RFI/RFP process, as preferred bidder, documented bids and face-to-face presentations. Experience of data science platforms (e.g. Databricks, Dataiku, AzureML, SageMaker) and machine learning frameworks (e.g. Keras, TensorFlow, PyTorch, scikit-learn). Cloud platforms - demonstrable experience of building and deploying solutions to Cloud (e.g. AWS, Azure …
Planning Analytics Sector: Data Science, Technology Role: Analyst Contract Type: Permanent Hours: Full Time DESCRIPTION The goal of Amazon Logistics (AMZL) is to build a world-class last mile operation. Amazon aims to exceed the expectations of our customers by ensuring that … tools experience: QuickSight/Tableau or similar tools 3. Scripting Experience: R/Python/C++ 4. (Optional) Experience with AWS solutions (S3, Athena, SageMaker) Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions … priority for Amazon. Please consult our Privacy Notice () to know more about how we collect, use and transfer the personal data of our candidates. Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran status, disability, or other legally protected status. Our inclusive …
london, south east england, United Kingdom Hybrid / WFH Options
SR2 | Socially Responsible Recruitment | Certified B Corporation™
a product-focused SaaS environment. Strong command of Python and relevant libraries for machine learning, data engineering, and automation. Experience working with AWS (e.g., SageMaker, Bedrock) and infrastructure-as-code tools like Terraform. Solid understanding of large-scale data pipelines, distributed systems, and microservice architectures. Comfortable working with LLMs …
Proficiency in programming languages such as Python, experience with AI/ML frameworks (e.g., TensorFlow, PyTorch), and experience with MLOps frameworks/tools (e.g. SageMaker Pipelines, Azure ML Studio, Vertex AI, Kubeflow, MLflow, Seldon, Evidently AI). What we offer Culture of caring: At GlobalLogic, we prioritize a culture of caring.
language models Proven experience with cloud platforms such as AWS, Azure, or Google Cloud Familiarity with tools and frameworks such as TensorFlow, PyTorch, MLflow, SageMaker, or Databricks Deep understanding of data architecture, APIs, and model deployment best practices Knowledge of MLOps and full model lifecycle management Excellent communication and …
and unsupervised learning techniques Experience in demand prediction, optimisation, or computer vision is advantageous Comfortable working with cloud platforms (preferably AWS) and services like SageMaker or Lambda Strong mathematical and statistical foundations, with a sharp eye for patterns and insights Willingness to build basic backend development skills (Python/ …
Collaborate with ML/AI Teams Package and deploy large‑language‑model (LLM) training jobs on distributed GPU clusters (Slurm, Ray, Kubeflow, or AWS SageMaker). Optimize model‑serving (Triton, vLLM, TorchServe) for low‑latency, high‑throughput inference. Cost & Performance Optimization Track cloud spend, right‑size resources, and introduce …
Experience across cloud platforms (AWS, Azure, GCP), with DevOps/MLOps tools (e.g. Terraform, Docker, Kubernetes) Expertise with data science platforms like Databricks, SageMaker, or AzureML Excellent stakeholder engagement and communication skills Proven experience managing teams and delivering complex projects Security Requirements You must be eligible for Developed …
data versioning, quality management, and CI/CD pipelines Experience with cloud platforms (e.g., AWS or Azure) and data tools such as Terraform or SageMaker is a plus Ideally, some hands-on experience building and maintaining data pipelines in a production environment What's on Offer: Competitive salary …
especially in deploying AI models or microservices. Cloud Services: Familiarity with cloud platforms like AWS, Azure, or GCP. Experience with services such as AWS SageMaker, Google AI, or Azure Machine Learning is a plus. Serverless Architectures: Exposure to serverless architectures, ideally edge compute environments such as Cloudflare Workers. Problem …
is essential, with preferred expertise in demand prediction, computer vision, or optimisation. Familiarity with the AWS cloud platform, particularly its AI/ML services (SageMaker and Lambda) and related data processing tools, is also …
/21, FCA PS21/3, DORA, NIST AI RMF). Experience with AI/ML platforms and monitoring tools (e.g., MLflow, Azure ML, SageMaker, Python). Excellent stakeholder management and communication skills. Desirable Skills: Technical background in AI/ML, data science, or software engineering. Experience with cloud …