practices. A solid understanding of networking protocols and concepts (TCP/IP, DNS, SSL/TLS, routing, etc.). Proficient with AWS services including EC2, ELB, VPC, IAM, CloudWatch, S3, VPC Lattice, Transit Gateway, VPN, and more. Practical knowledge of DevOps tools: Git, Jenkins, Docker, Ansible, Terraform. Strong scripting skills (Bash, Python, or equivalent). Candidates must be eligible …
improvement. Participate in sprint planning, technical design sessions, and architectural reviews. Required Skills & Experience: Strong proficiency in React. Deep experience with AWS services such as Lambda, API Gateway, DynamoDB, S3, and CloudFormation. Solid understanding of TypeScript, Node.js, and RESTful API design. Familiarity with DevOps practices and tools (e.g., GitHub Actions, Terraform, CloudWatch). Experience working in Agile environments and …
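As an illustration of the serverless pattern the listing above centres on (API Gateway in front of Lambda, persisting to DynamoDB), here is a minimal handler sketch. It is written in Python rather than the Node.js/TypeScript the role asks for, purely to keep the examples on this page in one language; the table name and payload fields are hypothetical.

```python
# Minimal AWS Lambda handler behind an API Gateway proxy integration,
# persisting the request body to DynamoDB. Table name and fields are
# illustrative placeholders, not details from the listing.
import json
import uuid

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("example-items")  # hypothetical table name


def handler(event, context):
    """Handle a POST from API Gateway and store the item."""
    body = json.loads(event.get("body") or "{}")
    item = {"id": str(uuid.uuid4()), **body}
    table.put_item(Item=item)
    return {
        "statusCode": 201,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(item),
    }
```

With the proxy integration, the request body arrives as a JSON string in event["body"]; in practice the table and function would typically be provisioned together via CloudFormation or Terraform, as the listing mentions.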
driven tools and workflows. · Build and integrate modular AI agents capable of real-world task execution in cloud-native environments. · Utilize AWS services such as Lambda, Step Functions, Bedrock, S3, ECS/Fargate, DynamoDB, and API Gateway to support scalable, serverless infrastructure. · Write production-grade Python code, following best practices in software design, testing, and documentation. · Build robust CI … track record with LangGraph, LangChain, or similar orchestration frameworks. · Expert in Python (asyncio, FastAPI preferred). · Hands-on experience building and deploying applications on AWS, particularly using Lambda, Fargate, S3, Step Functions, and DynamoDB. · Familiarity with AWS Bedrock is a plus. · Strong understanding of agentic patterns, prompt chaining, tool calling, and memory/state management in LLM applications. · Solid …
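To make the "agentic patterns" and "memory/state management" requirements above more concrete, below is a minimal sketch of a two-node agent graph, assuming LangGraph's StateGraph API. The node logic is stubbed; in a real agent the nodes would prompt a Bedrock-hosted model and dispatch actual tool calls.

```python
# A minimal LangGraph-style agent sketch: a two-node graph (plan -> act)
# sharing typed state. All node bodies are placeholders.
from typing import TypedDict

from langgraph.graph import END, StateGraph


class AgentState(TypedDict):
    task: str     # incoming request
    plan: str     # intermediate plan produced by the first node
    result: str   # final output


def plan_node(state: AgentState) -> dict:
    # Hypothetical planning step; a real agent would prompt an LLM here.
    return {"plan": f"break '{state['task']}' into tool calls"}


def act_node(state: AgentState) -> dict:
    # Hypothetical tool-calling step; a real agent would dispatch tools here.
    return {"result": f"executed: {state['plan']}"}


graph = StateGraph(AgentState)
graph.add_node("plan", plan_node)
graph.add_node("act", act_node)
graph.set_entry_point("plan")
graph.add_edge("plan", "act")
graph.add_edge("act", END)

agent = graph.compile()
print(agent.invoke({"task": "summarise a document", "plan": "", "result": ""}))
```

The returned dict from each node is merged into the shared state, which is one simple way the listing's "memory/state management" concern shows up in practice; prompt chaining is just a longer path through the same graph.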
ingestion processes from APIs and internal systems, leveraging tools such as Kafka, Spark, or AWS-native services. Cloud Data Platforms - Develop and maintain data lakes and warehouses (e.g., AWS S3, Redshift). Data Quality & Governance - Implement automated validation, testing, and monitoring for data integrity. Performance & Troubleshooting - Monitor workflows, enhance logging/alerting, and fine-tune performance. Data Modelling - Handle … IAM), and GDPR-aligned data practices. Technical Skills & Experience: Proficient in Python and SQL for data processing. Solid experience with Apache Airflow - writing and configuring DAGs. Strong AWS skills (S3, Redshift, etc.). Big data experience with Apache Spark. Knowledge of data modelling, schema design, and partitioning. Understanding of batch and streaming data architectures (e.g., Kafka). Experience with …
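As a concrete reference point for the Airflow requirement above ("writing and configuring DAGs"), here is a minimal DAG sketch, assuming a recent Airflow 2.x release: an extract-to-S3 task feeding a warehouse-load task. The DAG id, schedule, and task bodies are placeholders rather than details from the listing.

```python
# A minimal Apache Airflow DAG: one daily pipeline that stages raw data
# in S3 and then loads it into the warehouse. Task logic is stubbed.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_s3(**context):
    # Placeholder: pull from a source API and write the raw file to S3
    # (e.g. with boto3), keyed by the execution date for idempotent re-runs.
    pass


def load_to_warehouse(**context):
    # Placeholder: COPY the staged file from S3 into Redshift.
    pass


with DAG(
    dag_id="example_ingestion",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    extract >> load   # load runs only after extract succeeds
```

The >> dependency operator expresses the ordering an ingestion workflow relies on; validation, monitoring, and alerting tasks of the kind the listing describes would typically hang off the same DAG.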