agile sprints. Work with cross-functional teams, including data scientists, software engineers, and designers. Develop and implement state-of-the-art GenAI services leveraging Azure OpenAI models and the AWS Bedrock service. Required Qualifications, Capabilities, and Skills: Formal training or certification on software engineering concepts and proficient applied experience. Strong hands-on experience with at least one programming language (Python …
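Several of the roles in this list, including the one above, center on calling Bedrock-hosted models from Python. As a point of reference, a minimal sketch using boto3's Converse API might look like the following; the region, model ID, and prompt are illustrative assumptions rather than anything specified by the posting.

```python
# Minimal sketch: invoking a hosted foundation model through Amazon Bedrock
# using boto3's Converse API. Region, model ID, and prompt are illustrative.
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model; swap for your deployment
    messages=[{"role": "user", "content": [{"text": "Summarize our Q3 release notes."}]}],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```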
Bedford, Massachusetts, United States (Hybrid/Remote Options)
Credence
AI extensions like Cline and Claude Code, in addition to Q Developer. Strong communication skills and a client-oriented mindset. U.S. citizenship with eligibility for DoD Secret clearance. Requirements: Experience with Bedrock, Strands, Q Developer, Kiro, AgentCore Gateway. Exposure to AWS serverless technologies, including ones that enable event-based architecture (important for agentic AI systems). Experience with IaC tools (AWS …
McLean, Virginia, United States (Hybrid/Remote Options)
Credence
AI extensions like Cline and Claude Code, in addition to Q Developer. Strong communication skills and a client-oriented mindset. U.S. citizenship with eligibility for DoD Secret clearance. Requirements: Experience with Bedrock, Strands, Q Developer, Kiro, AgentCore Gateway. Exposure to AWS serverless technologies, including ones that enable event-based architecture (important for agentic AI systems). Experience with IaC tools (AWS …
Warner Robins, Georgia, United States (Hybrid/Remote Options)
Credence
AI extensions like Cline and Claude Code, in addition to Q Developer. Strong communication skills and a client-oriented mindset. U.S. citizenship with eligibility for DoD Secret clearance. Requirements: Experience with Bedrock, Strands, Q Developer, Kiro, AgentCore Gateway. Exposure to AWS serverless technologies, including ones that enable event-based architecture (important for agentic AI systems). Experience with IaC tools (AWS …
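The "event-based architecture" these listings call out usually means agent steps triggered by queue or bus events rather than one long-running process. A hypothetical sketch of a Lambda handler wired this way is below; the event shape, event-bus fields, and agent placeholder are assumptions, not anything taken from the postings.

```python
# Hypothetical sketch: an event-driven agent step on AWS Lambda. Work items
# arrive as SQS records; each step runs the agent once and, if more work
# remains, publishes a follow-up event to EventBridge so the loop continues
# without any long-running process.
import json
import boto3

events = boto3.client("events")

def run_agent_step(task: dict) -> dict:
    # Placeholder for the actual agent logic (e.g. a Bedrock model call plus
    # tool use); returns the step result and whether another step is needed.
    return {"result": f"handled {task['prompt']}", "needs_follow_up": False}

def handler(event, context):
    for record in event.get("Records", []):      # SQS batch shape (assumed)
        task = json.loads(record["body"])
        outcome = run_agent_step(task)
        if outcome["needs_follow_up"]:
            events.put_events(Entries=[{
                "Source": "agents.pipeline",
                "DetailType": "AgentStepCompleted",
                "Detail": json.dumps(outcome),
            }])
    # Assumes the SQS trigger has partial-batch responses enabled.
    return {"batchItemFailures": []}
```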
ML algorithms or technologies using Python. 2 years of experience with Retrieval-Augmented Generation (RAG). Experience deploying scalable AI/ML solutions in a public cloud such as AWS Bedrock, Google Cloud, or Azure. Experience designing, implementing, and scaling complex data pipelines for ML models and evaluating their performance. At this time, Capital One will not sponsor a new applicant …
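Several listings above, including the Capital One entry, ask for Retrieval-Augmented Generation (RAG) experience. A deliberately small, hypothetical sketch of the pattern follows: embed a query, pick the nearest document, and pass it to the model as context. The tiny in-memory index, model IDs, and sample documents are illustrative stand-ins for a real vector database and corpus.

```python
# Minimal sketch of Retrieval-Augmented Generation (RAG): retrieve the most
# relevant document by embedding similarity, then generate an answer grounded
# in it. The in-memory "index" stands in for a real vector store.
import json
import math
import boto3

bedrock = boto3.client("bedrock-runtime")

def embed(text: str) -> list[float]:
    # Assumed embedding model ID; other Bedrock embedding models work similarly.
    resp = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v2:0",
        body=json.dumps({"inputText": text}),
        contentType="application/json",
    )
    return json.loads(resp["body"].read())["embedding"]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

docs = ["Refunds are issued within 5 business days.", "Cards can be locked in the app."]
index = [(doc, embed(doc)) for doc in docs]

def answer(question: str) -> str:
    q_vec = embed(question)
    best = max(index, key=lambda item: cosine(q_vec, item[1]))[0]  # top-1 retrieval
    resp = bedrock.converse(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model
        messages=[{"role": "user", "content": [{"text": f"Context: {best}\n\nQuestion: {question}"}]}],
    )
    return resp["output"]["message"]["content"][0]["text"]
```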
client proposals/presentations, to onboarding delivery teams. Excellent communication and stakeholder management skills, with the ability to articulate technical concepts to non-technical stakeholders. Any experience with AWS Bedrock, Google Vertex, or AI Studio would be preferable. Experience leading teams and engagements. A degree in Computer Science, Data Science, or a related field; advanced degrees are preferred. Benefits …
in Python engineering, with a focus on backend and infrastructure tooling. Deep knowledge of AWS services (IAM, KMS, CloudFormation, API Gateway, S3, Lambda, ECS, Glue, Step Functions, MSK, EKS, Bedrock). Experience scaling platforms for AI/ML workloads and integrating generative AI tooling. Understanding of secure software development, cloud cost optimization, and platform observability. Ability to communicate complex …
Python, .NET/C# (key to have a background in C#). Knowledge of AI principles and AI ethics. Knowledge of data safety in LLM usage. Experience with: AWS (boto3, Bedrock, SageMaker, Lambda, S3, EC2); Azure (Azure OpenAI Service, Cosmos DB); Retrieval-Augmented Generation (RAG) and Graph RAG; embedding models and LLM training fundamentals. Damia Group Limited acts as an employment …
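The "data safety in LLM usage" point above usually comes down to keeping sensitive values out of prompts and logs. A deliberately simple, hypothetical sketch of pre-prompt redaction is shown below; the regex patterns are placeholders, and a production system would use a dedicated PII-detection service instead.

```python
# Illustrative sketch: redact obvious sensitive values before a prompt is sent
# to any hosted LLM. The patterns are simplistic placeholders, not a real
# PII detector; production systems would use a dedicated detection service.
import re

REDACTIONS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),       # US SSN-like numbers
    (re.compile(r"\b\d{13,19}\b"), "[CARD]"),               # long card-like numbers
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),    # email addresses
]

def redact(text: str) -> str:
    for pattern, label in REDACTIONS:
        text = pattern.sub(label, text)
    return text

prompt = redact("Customer jane.doe@example.com reported card 4111111111111111 declined.")
# -> "Customer [EMAIL] reported card [CARD] declined."
```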
create standardized, reusable components that accelerate agentic development across teams. Cloud Engineering: Develop secure, resilient, and scalable cloud-native solutions using AWS services (Lambda, ECS, S3, API Gateway, SageMaker, Bedrock, etc.) to support production-grade AI operations. Monitoring and Evaluation: Implement metrics, tracing, and evaluation pipelines that ensure transparency, reliability, and continuous improvement in agentic behavior. Integration and Governance …
your work immediately impact global financial operations - this is your platform. What you will do: Agent Architecture & Orchestration: Design and orchestrate LLM-based agentic workflows (using tools like AWS Bedrock, LangGraph, or Google ADK). You will move beyond simple linear chains to build agents capable of loops, reasoning, and self-correction. Data Integration & Tooling: Build secure, low-latency …
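As a rough illustration of the "loops, reasoning, and self-correction" idea, a LangGraph-style graph can route back to the drafting node until a check passes. This is a hedged sketch under assumed node names, state shape, and acceptance rule, not the team's actual design.

```python
# Hypothetical sketch of an agent loop with LangGraph: draft an answer,
# critique it, and loop back until the critique passes or a budget runs out.
from typing import TypedDict
from langgraph.graph import StateGraph, END

class AgentState(TypedDict):
    question: str
    draft: str
    attempts: int

def draft_answer(state: AgentState) -> dict:
    # Placeholder for an LLM call (e.g. via Bedrock); returns an updated draft.
    return {"draft": f"attempt {state['attempts'] + 1}: answer to {state['question']}",
            "attempts": state["attempts"] + 1}

def critique(state: AgentState) -> str:
    # Placeholder self-check: accept after three attempts. A real critic would
    # be another model call or a rule-based validator.
    return "done" if state["attempts"] >= 3 else "revise"

graph = StateGraph(AgentState)
graph.add_node("draft", draft_answer)
graph.set_entry_point("draft")
graph.add_conditional_edges("draft", critique, {"revise": "draft", "done": END})

app = graph.compile()
result = app.invoke({"question": "Why did settlement fail?", "draft": "", "attempts": 0})
```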
world applications. Knowledge of authentication, secret management, networking, and model access governance. Bonus Points For: Kong API Gateway, Kong Mesh, Flux CD; AWS stack: EC2, EKS, S3, SQS, DynamoDB, Bedrock; RESTful API development with FastAPI, microservices, Terraform, GitOps workflows; prompt evaluation tools such as Promptfoo; SQL and NoSQL experience: MySQL, PostgreSQL, MongoDB, Cassandra; exposure to RAG patterns and vector …
solutions. What You'll Bring: Strong background in software/platform/DevOps architecture with AI/ML exposure. Experience using Azure OpenAI and/or AWS Bedrock. Hands-on experience deploying agentic or autonomous AI systems in production. Proficiency in Python. Excellent communicator with consulting or client-facing experience. Knowledge of AI governance, interpretability, and bias …
in AI adoption, ensuring ethical and compliant implementation. Required Skills & Experience: Deep experience architecting and scaling enterprise-grade AI/ML systems. Proven expertise with cloud AI ecosystems (AWS Bedrock, SageMaker, Azure OpenAI, Azure ML). Strong command of agent frameworks, vector databases, and LLM orchestration tools. Ability to lead multi-disciplinary teams and manage complex AI projects. Advanced …
you had: Experience working with large codebases and collaborating with multiple engineering teams in large companies. Experience in diverse LLM deployment methods (e.g. hosted fine-tuned models via services like Bedrock, and running directly via engines like vLLM). Kraken is a certified Great Place to Work in France, Germany, Spain, Japan, and Australia. In the UK we are one of …
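To illustrate the contrast the listing above draws between hosted models and engines like vLLM, here is a minimal sketch of offline inference with vLLM's Python API; the model name and prompt are placeholder assumptions.

```python
# Minimal sketch of self-hosted inference with vLLM, as a contrast to calling a
# managed service like Bedrock. The model choice is a small placeholder.
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")                     # assumed small demo model
params = SamplingParams(temperature=0.7, max_tokens=64)

outputs = llm.generate(["Explain what a market maker does."], params)
for output in outputs:
    print(output.prompt, "->", output.outputs[0].text)
```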
and services used globally. Develop in one or more core technology areas: NoSQL (MongoDB, DocumentDB); graph databases (GraphDB, Neo4j); search technologies (ElasticSearch, OpenSearch, SOLR); ML/AI frameworks (SageMaker, Bedrock); cloud-based data engineering (AWS). Build and maintain cloud-native data pipelines, ensuring data quality, accuracy, and reliability. What We’re Looking For: 5+ years’ experience in software or …
role. Proven experience in Java 17, RESTful APIs, Oracle, Spring Boot, microservices architecture, Jenkins, Splunk, load testing, tuning, a user-focused approach, distributed systems and financial applications, Datadog, Dynatrace, Flyway, SageMaker, Bedrock, AgentCore. Familiarity with CI/CD pipelines, cloud platforms, and performance monitoring tools. Strong communication, problem-solving, and leadership abilities with a collaborative mindset. …
code platform experience (Power Platform) - disqualifying if absent. Python proficiency for AI development. AI frameworks experience (LangChain). Microsoft stack (SharePoint, Power Platform, Azure AI Foundry). Cloud AI services (Vertex, Bedrock, Azure AI). Experience leading teams and making technical decisions.
Join Us in Shaping the Future of AI at Barclays. We're launching an exciting new initiative at Barclays to design, build, and scale next-generation platform components that empower developers - including Quants and Strats - to create high-performance, AI …