Brockworth, England, United Kingdom Hybrid / WFH Options
Lockheed Martin
software pipelines for Cloud deployments. Understanding of the development challenges of moving on-prem workloads to the cloud. Hands-on experience using some AWS services such as EC2, EKS, Fargate, IAM, S3 and Lambda. Lockheed Martin is committed to upholding principles of equal opportunity, fostering a work environment that is aligned with our core values of integrity, respect, and exceptional performance. We …
a fast-paced environment. Front End & Web Technologies: Back End & Infrastructure: Experience with Docker, Elasticsearch, or Palantir. Hands-on experience with cloud platforms and services, particularly AWS (EC2, Lambda, S3, etc.). Background in backend or frontend architecture and development. Experience with Python modules such as ArcPy and the ArcGIS API for Python. Relevant experience using Esri technology and the …
experience in a technical support, DevOps, or infrastructure engineering role. Strong hands-on experience with Docker and Docker Swarm in a production environment. Proficiency with core AWS services: EC2, S3, CloudWatch, IAM, ELB, ECS/Fargate, and RDS. Solid understanding of Linux server environments, command-line operations, and scripting. Experience in supporting real-time or mission-critical systems (security … 3+ years in a technical support, DevOps, or systems engineering role. Hands-on experience with Docker, docker-compose, and container lifecycle management. Experience supporting or integrating MinIO or similar S3-compatible object storage. Technical Skills: Familiarity with advanced networking concepts and protocols (TCP/IP, DNS, DHCP, VLANs, routing, proxies, firewall configuration, etc.). Strong Ubuntu Linux system administration …
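Purely as illustration of the "MinIO or similar S3-compatible object storage" requirement above, a minimal Python sketch using boto3 pointed at a self-hosted endpoint; the endpoint URL, credentials and bucket name are placeholders, not details from the listing:

```python
import boto3

# Hypothetical endpoint, credentials and bucket: MinIO speaks the S3 API, so the
# standard boto3 S3 client works once it is pointed at the self-hosted endpoint.
s3 = boto3.client(
    "s3",
    endpoint_url="http://minio.internal:9000",
    aws_access_key_id="MINIO_ACCESS_KEY",
    aws_secret_access_key="MINIO_SECRET_KEY",
)

# Upload a file, then list the bucket to confirm the object landed.
s3.upload_file("backup.tar.gz", "backups", "2024/backup.tar.gz")
for obj in s3.list_objects_v2(Bucket="backups").get("Contents", []):
    print(obj["Key"], obj["Size"])
```

The same client code works unchanged against AWS S3 itself once the custom endpoint and static credentials are dropped.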
they do. You’ll use our tech stack: React + Next.js + TypeScript for our frontend code; Node.js + TypeScript for our backend code; AWS Lambda for serverless computing; S3 for scalable storage, and NoSQL databases like DynamoDB; SQS/SNS for messaging; continuous deployment with Terraform (IaC) and GitHub Actions; Vitest + React Testing Library for unit tests …
London, South East, England, United Kingdom Hybrid / WFH Options
Involved Solutions
Skills for the Senior Data Engineer: experience with event sourcing, dbt, or related data transformation tools; familiarity with PostgreSQL and cloud-native data services (Azure Event Hub, Redshift, Kinesis, S3, Blob Storage, OneLake, or Microsoft Fabric); understanding of machine learning model enablement and operationalisation within data architectures; experience working within Agile delivery environments. If you are an experienced Senior …
analysis and self-serve data exploration; Redis for caching and task queues; GitHub, Codeship and Heroku for seamless code review, integration and deployment; Claude Code for agentic coding; AWS S3, CloudFront and Lambda for storage and content delivery; Cypress for end-to-end testing; third-party APIs: OpenAI, Stripe, Twilio, Sendgrid, Mailchimp and more. Our tech culture: we follow …
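As a small illustration of the "AWS S3, CloudFront and Lambda for storage and content delivery" item above, a hedged boto3 sketch that issues a time-limited download URL for an S3 object; the bucket and key names are invented for the example:

```python
import boto3

# Hypothetical bucket and key: issue a short-lived download link for an object
# stored in S3 (in practice the asset would usually be served via CloudFront).
s3 = boto3.client("s3")
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "example-media-bucket", "Key": "uploads/report.pdf"},
    ExpiresIn=3600,  # link valid for one hour
)
print(url)
```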
knowledge of multiple data storage subsystems. Strong experience of Terraform, AWS, Airflow, Docker, GitHub/GitHub Actions, Jenkins/TeamCity. Strong AWS-specific skills for Athena, Lambda, ECS, ECR, S3 and IAM. Strong knowledge of industry best practices in development and a security-first mindset. Strong knowledge of using and developing applicable tool sets. Ability to interface competently with …
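Since this listing pairs Airflow with AWS services such as Athena and S3, here is a hedged sketch of what a minimal Airflow DAG driving an Athena query might look like (assuming Airflow 2.4+); the database, query and S3 output location are hypothetical:

```python
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def run_athena_query():
    # Kick off an Athena query; results land in the named S3 output location.
    athena = boto3.client("athena")
    athena.start_query_execution(
        QueryString="SELECT event_type, COUNT(*) FROM events GROUP BY event_type",
        QueryExecutionContext={"Database": "analytics"},  # placeholder database
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )


with DAG(
    dag_id="daily_event_summary",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="summarise_events", python_callable=run_athena_query)
```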
for a highly skilled Cloud DevOps Engineer to join the team on an initial 6-month contract: Kubernetes, specifically with EKS; Terraform (Enterprise); microservices; AWS experience including CloudFront, S3, EKS, load balancing, security, API Gateway, Transit Gateway; ArgoCD; OpenTelemetry; Istio; database experience ideally but not mandatory: RDS (Oracle/Postgres), MongoDB; GitLab CI/CD. If this role sounds like a …
sector department. We’re looking for an experienced AWS DevOps Engineer with strong Apigee expertise to join the team. 🛠️ Tech Stack: AWS (Lambda, API Gateway, CloudWatch, ECS, ECR, DynamoDB, S3); Azure DevOps CI/CD; GitHub Workflows; Apigee X. Key Responsibilities: build and maintain CI/CD pipelines using Azure DevOps & GitHub; deploy and monitor scalable solutions on AWS …
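For a concrete sense of the AWS side of that stack, a hedged Python sketch of a Lambda handler behind API Gateway (proxy integration) writing to DynamoDB; the table name, item shape and environment variable are assumptions for the example, not details from the listing:

```python
import json
import os

import boto3

# Table name comes from an environment variable; "items" is a placeholder default.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ.get("TABLE_NAME", "items"))


def handler(event, context):
    """Handle an API Gateway proxy event, persist the JSON body, and respond."""
    body = json.loads(event.get("body") or "{}")
    table.put_item(Item={"pk": body.get("id", "unknown"), "payload": json.dumps(body)})
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"stored": body.get("id", "unknown")}),
    }
```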
and ETL skills are a must. Understanding of data transformation, cleansing, and deduplication. Experience developing pipelines for both cloud and hybrid-cloud infrastructures. Experience in AWS utilizing services such as S3, the AWS CLI, Kinesis and RDS. Experience programming in Python or Java. Experience working in an Agile delivery environment. DTS offers an excellent compensation package. Contact: Ajay Bharbutta, Team Lead …
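To illustrate the kind of AWS-based pipeline work described (S3, Kinesis, Python), a minimal boto3 sketch that reads newline-delimited JSON records from S3 and pushes them onto a Kinesis stream; the bucket, key and stream names are invented:

```python
import json

import boto3

s3 = boto3.client("s3")
kinesis = boto3.client("kinesis")

# Placeholder locations; a real pipeline would be parameterised and batched.
obj = s3.get_object(Bucket="example-landing-bucket", Key="incoming/orders.jsonl")

for line in obj["Body"].iter_lines():
    record = json.loads(line)
    kinesis.put_record(
        StreamName="orders-stream",
        Data=json.dumps(record).encode("utf-8"),
        PartitionKey=str(record.get("order_id", "unknown")),
    )
```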
Troubleshoot issues and deliver solutions for high availability and reliability. What We’re Looking For. Experience: 4+ years in a senior technical role, ideally within data engineering. AWS stack: S3, EMR, EC2, Kinesis, Firehose, Flink/MSF. Python & SQL. Security and risk management frameworks. Excellent communication and collaboration skills. Desirable: event streaming and event streaming analytics in real-world …
end applications using Node.js, TypeScript, Next.js, and AWS CDK. Full-stack developer skills mix: strong proficiency in TypeScript and Node.js. Hands-on experience with AWS Lambda, API Gateway, DynamoDB, S3, CloudWatch, and related services. Experience with AWS CDK (TypeScript or Python) for infrastructure as code. Expertise in building React/Next.js front-end applications. Experience with serverless architectures and …
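To ground what "AWS CDK for infrastructure as code" means for the Lambda, API Gateway and DynamoDB services named here, a minimal hypothetical CDK v2 stack; it is sketched in Python (which the listing explicitly allows alongside TypeScript), and the construct names and asset path are placeholders:

```python
from aws_cdk import Stack, aws_apigateway as apigw, aws_dynamodb as dynamodb, aws_lambda as _lambda
from constructs import Construct


class ItemsApiStack(Stack):
    """Hypothetical stack: DynamoDB table + Lambda handler + REST API in front."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        table = dynamodb.Table(
            self, "ItemsTable",
            partition_key=dynamodb.Attribute(name="pk", type=dynamodb.AttributeType.STRING),
            billing_mode=dynamodb.BillingMode.PAY_PER_REQUEST,
        )

        handler = _lambda.Function(
            self, "ItemsHandler",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="app.handler",
            code=_lambda.Code.from_asset("lambda"),  # placeholder directory of handler code
            environment={"TABLE_NAME": table.table_name},
        )

        table.grant_read_write_data(handler)
        apigw.LambdaRestApi(self, "ItemsApi", handler=handler)
```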
hands-on experience with Databricks and Apache Spark (preferably PySpark). • Proven track record of building and optimizing data pipelines in cloud environments. • Experience with AWS services such as S3, Glue, Lambda, Step Functions, Athena, IAM, and VPC. • Proficiency in Python for data engineering tasks. • Familiarity with GitLab for version control and CI/CD. • Strong understanding of unit …
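As a sketch of the Spark-based pipeline work this listing describes, a short PySpark example that reads raw JSON from S3, deduplicates and cleans it, and writes partitioned Parquet back out; the paths, column names and dedup key are illustrative only:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_pipeline").getOrCreate()

# Placeholder input location; on Databricks this could equally be a mounted path.
raw = spark.read.json("s3://example-bucket/raw/events/")

cleaned = (
    raw.dropDuplicates(["event_id"])                     # remove duplicate events
       .withColumn("event_date", F.to_date("event_ts"))  # derive a partition column
       .filter(F.col("event_type").isNotNull())          # drop malformed rows
)

(cleaned.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-bucket/curated/events/"))
```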