/GitHub, SonarQube, CAST, TeamCity/Jenkins/Azure DevOps. Expert-level knowledge of telemetry and observability platforms such as the ELK stack, Grafana, Kibana, Azure Application Insights, Amazon CloudWatch, etc. Scripting languages, preferably Python and PowerShell. Database technologies, preferably MS SQL Server and PostgreSQL. Infrastructure as code – AWS CloudFormation/Terraform/Ansible …/Chef. AWS cloud-native development using EC2, Lambda, S3, Simple Queue Service, etc.
London (City of London), South East England, United Kingdom
RMG Digital
City of London, London, Canary Wharf, United Kingdom
Fusion People Ltd
scale Citrix/Terminal Server installations, be proficient in scripting and automation using languages such as Python, PowerShell, and Bash, have experience with infrastructure-as-code (IaC) tools such as Terraform or CloudFormation, and be familiar with DevOps practices, CI/CD pipelines, and related tools (Jenkins, GitLab CI/CD, or CircleCI), with the ability to monitor, optimise, and troubleshoot cloud infrastructure
Our client currently seeks an SC-cleared AWS DevOps Engineer to join their dynamic team on an initial 6-month contract. The role is 95% remote, with travel to the office required once a month. Key Skills and Responsibilities: Design, deliver, and support secure and scalable AWS infrastructure using services such as EC2, S3, ECS, and Fargate … Integrate SAST (Static Application Security Testing) and DAST (Dynamic Application Security Testing) tools into CI/CD pipelines to enforce secure development practices. Automate infrastructure provisioning using CloudFormation, Terraform, or CDK. Use tools such as Chef and Ansible for automated server configuration, patching, and environment standardization. Build and manage CI/CD pipelines with Jenkins, GitHub Actions, or AWS … on Linux-based systems, including log analysis and performance tuning. Lead technical triage and root-cause analysis for infrastructure-related issues. Develop and deploy applications using Docker and AWS Fargate. Use CloudWatch, CloudTrail, and third-party tools such as Datadog for performance monitoring and cost efficiency. Configure AWS networking (VPCs, TGWs) and enforce governance via AWS Config
Senior AWS Platform Engineer, Central London (2 days a week in office), £85,000 - £95,000 per annum + Generous Benefits Package + Shares. We are working with a London-based software company that has been quietly doing some incredible things across the Retail and Fintech spaces over the last 7 years. Now, they’re looking to bring … in a Senior AWS Platform Engineer to help scale their cloud infrastructure and DevOps capability. They’ve built a high-performing engineering team and are now investing further in the platform side of things as demand grows. Think modern, cloud-native architecture and a real emphasis on automation, scalability, and developer enablement. You’ll have the autonomy to … make technical decisions and help shape how platform engineering is done as the team continues to scale. Tech stack: AWS (core services: EC2, S3, IAM, etc.), Kubernetes (building and managing production clusters), Terraform (full IaC provisioning), Python (scripting, automation), GitHub Actions (CI/CD pipelines), Docker & Helm (containerised app deployments). What They’re Looking For: Strong
Retail and CPG, and Public Services. Consolidated revenues for the 12 months ending December 2024 totaled $13.8 billion. Job Summary: We are seeking a highly skilled and experienced AWS Lead Data Engineer who will build and lead the development of scalable data pipelines and platforms on AWS. The ideal candidate will have deep expertise in PySpark, Glue, Athena … AWS Lake Formation, data modelling, dbt, Airflow, and Docker, and will be responsible for driving best practices in data engineering, governance, and DevOps. Key Responsibilities:
• Lead the design and implementation of scalable, secure, and high-performance data pipelines using PySpark and AWS Glue.
• Architect and manage data lakes using AWS Lake Formation, ensuring proper access control … reporting.
• Collaborate with analysts and business stakeholders to understand data requirements and deliver robust solutions.
• Implement and maintain CI/CD pipelines for data workflows using tools such as AWS CodePipeline, Git, and GitHub Actions.
• Ensure data quality, lineage, and observability.
• Mentor junior engineers and establish coding and design standards across the team.
• Monitor and optimize performance of data pipelines
London (City of London), South East England, United Kingdom
HCLTech