Washington, DC, United States Hybrid / WFH Options
Neuma Consulting LLC
… in Python, Java, or C++
- Familiarity with TensorFlow, PyTorch, Hugging Face, scikit-learn, and related ML libraries
- Experience with Docker, Kubernetes, and cloud AI platforms such as AWS Bedrock, SageMaker, Azure ML, or GCP Vertex AI
- Working knowledge of data tools such as Spark, pandas, and SQL/NoSQL databases
- Security: TS clearance required
Nice to have: experience with LangChain
Binley, Midlands, United Kingdom Hybrid / WFH Options
Coventry Building Society
… a DataOps or DevOps approach.
- Demonstrate how to automate and manage data systems so they run smoothly and can grow easily.
- Experience with tools like AWS (S3, Glue, Redshift, SageMaker) or other cloud platforms.
- Familiarity with Docker, Terraform, GitHub Actions, and Vault for managing secrets.
- Experience coding in SQL, Python, Spark, or Scala to work with data.
- Experience with …
Arlington, Virginia, United States Hybrid / WFH Options
G2 Ops, Inc
… tuning or inference optimization.
- Understanding of vector databases (e.g., Qdrant, Pinecone) and semantic search techniques.
- Use of MLOps tools for CI/CD pipelines in AI (e.g., MLflow, Kubeflow, SageMaker).
AI for Systems Engineering:
- Experience working with SysML, MBSE tools, or digital engineering pipelines.
- Understanding of how to map or extract system design intent from technical documentation using …
Reston, Virginia, United States Hybrid / WFH Options
ICF
… updates in development, integration, and production environments. Set up, integrate, and maintain a scalable, stable set of CI/CD tools to support development, testing, and security scanning. Implement Amazon CloudWatch, Splunk, and other third-party monitoring solutions to provide continuous monitoring capabilities; track all aspects of the system, infrastructure, performance, and application errors; and roll up metrics. Analyze functional … with standard concepts, practices, and procedures such as the NIST, FISMA, FedRAMP, and Common Criteria regulations and standards. Familiarity with MLOps, the machine learning lifecycle, and the product landscape, for example Amazon SageMaker, Apache Airflow, Looker, Trifacta, etc. (you don't need to be an expert in all of these). Working knowledge of Linux. Professional skills: excellent communication and interpersonal skills …
San Antonio, Texas, United States Hybrid / WFH Options
IAMUS
… higher certification. Willingness to do a programming challenge during the interview process. Experience with Jupyter notebooks and Python data-science libraries. Experience with MLOps and related tools (e.g., MLflow, SageMaker, Bedrock) and the ability to build interactive, insightful dashboards for monitoring ML models. Place of performance: hybrid work in San Antonio, TX. Desired skills (optional): experience with NoSQL databases such as …
Collaborate closely with data scientists and software engineers to productionize prototypes into scalable, maintainable solutions.
Essential Skills & Experience:
- Hands-on experience with MLOps tools such as MLflow, Kubeflow, Amazon SageMaker, Vertex AI, or equivalent platforms.
- Deep understanding of cloud infrastructure services (AWS, Azure, GCP).
- Strong experience with CI/CD practices and containerization tools (Docker, Kubernetes). …
… of AI frameworks (TensorFlow, PyTorch, Keras)
- Experience with NLP and computer vision models
- Strong understanding of machine learning algorithms and statistical modeling
- Experience with cloud-based AI platforms (AWS SageMaker, Google Cloud AI Platform)
- Excellent problem-solving and communication skills
Qualifications:
- Master's degree in Computer Science, Artificial Intelligence, or a related field
- Certified AI Engineer or related certification (optional) …
City of London, London, England, United Kingdom Hybrid / WFH Options
Ada Meher
… LangGraph, LlamaIndex
- Experience with Hugging Face and LoRA/QLoRA for fine-tuning
- Experience with RAG and vector DBs, e.g. FAISS, Weaviate, Pinecone
- Any experience of MLOps with MLflow, AWS (SageMaker), CI/CD (GitHub Actions), or similar would benefit an application
The employer is well known not only for the forward-thinking approach they have to …
Baltimore, Maryland, United States Hybrid / WFH Options
US Main
… and S3.
- Manage dashboards and reporting tools like Qlik Sense and Splunk, linked with AWS data services.
- Support AI/ML initiatives with cloud-hosted experiments and pipelines (e.g., SageMaker; optional).
AWS technologies required:
- EC2, S3, RDS, API Gateway, CloudFront, CloudWatch, IAM
- DevOps experience with Jenkins, SonarQube, Spring Boot, MuleSoft …
City of London, London, United Kingdom Hybrid / WFH Options
i3
Senior Data Engineer | Permanent | City of London | £90,000-£100,000
Hybrid working available: City of London office / home. As a key member of this team, you will play a pivotal role in designing and implementing data warehousing solutions …
Sacramento, California, United States Hybrid / WFH Options
KK Tech LLC
… evidenced by Attachment II, Proposed Project Person Skill Summary Sheet, and resume:
- At least three (3) to five (5) years of experience with the AWS cloud platform and services such as Amazon Elastic Compute Cloud (EC2)/virtual machine (VM) instances, S3/cloud storage, serverless (AWS Lambda, Cloud Functions), and IAM (Identity & Access Management).
- At least three (3) to … and debug basic web applications, with familiarity with Representational State Transfer (REST) APIs, using Python, Node.js, or C#.
- At least three (3) years of experience using Docker, Kubernetes, Amazon Elastic Kubernetes Service (EKS), and container orchestration.
- At least three (3) to five (5) years of experience with structured query language (SQL) and/or non-SQL (NoSQL) databases …
- CloudWatch Logs, CloudTrail, or third-party tools like Splunk.
- Scripting experience with Bash, PowerShell, or equivalent.
- Experience with Artificial Intelligence/Machine Learning (AI/ML) cloud services (e.g., SageMaker) and multi-cloud architecture.
The proposed project person will be required to adhere to the Client's hybrid working model, currently working on-site 2-3 days per week …
… Python
- Experience with machine learning; familiar with Hugging Face, PyTorch, and similar ML tools and packages
- Familiarity with deploying and scaling ML models in the cloud, particularly with AWS and SageMaker
- Understanding of DevOps processes and tools: CI/CD, Docker, Terraform, and monitoring/observability
- Bonus: experience with vector databases, semantic search, or event-driven systems like Kafka
We … everyone, regardless of background, gender identity, sexual orientation, disability status, ethnicity, belief, age, family or parental status, and any other characteristic.
City of London, London, United Kingdom Hybrid / WFH Options
I3 Resourcing Limited
Senior Data Engineer (must have Snowflake)
Salary: £90,000-£100,000 with 15% bonus
Hybrid working: a couple of days per week in the City of London office
We are looking for:
- A good understanding of data engineering principles
- A good technical grasp of Snowflake …
Hollywood, Florida, United States Hybrid / WFH Options
INSPYR Solutions
… pipelines and integration frameworks.
- Administer and optimize Databricks and Apache Spark environments for data engineering workloads.
- Build and manage data workflows using AWS services such as Lambda, Glue, Redshift, SageMaker, and S3.
- Support and troubleshoot DataOps pipelines, ensuring reliability and performance across environments.
- Automate platform operations using Python, PySpark, and infrastructure-as-code tools.
- Collaborate with cross-functional teams … and PySpark.
- Expertise in Databricks, Apache Spark, and Delta Lake.
- Proficiency in AWS CloudOps and cloud security, including configuration, deployment, and monitoring.
- Strong SQL skills and hands-on experience with Amazon Redshift.
- Experience with ETL development, data transformation, and orchestration tools.
Nice to have / working knowledge:
- Kafka for real-time data streaming and integration.
- Fivetran and dbt for data …