Nice-to-have experience in the following areas: Experience with using or extending ArgoCD, Argo Rollouts, Spinnaker, or any cloud-native CI/CD platforms (AWS/Azure/GCP) Experience with continuous integration and continuous deployment (CI/CD), ideally both as a consumer and as someone who has enabled such solutions in their past roles Proficiency with a …
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
The Investigo Group
environment Deep understanding of infrastructure automation frameworks, SRE principles, and continuous delivery Hands-on skills with tools like Terraform, Ansible, Docker, Kubernetes Familiarity with cloud platforms (AWS, Azure, GCP) Programming/scripting in Python, Bash, or similar Strong knowledge of CI/CD tooling (GitHub Actions, Jenkins, Azure DevOps) Understanding of REST APIs, SQL, and containerisation Bonus Points For …
Position Description Company: Arion Systems, Inc. Corporate Headquarters: 15040 Conference Center Drive, Suite 200, Chantilly, VA 20151 POC: Terri Shaulis, Director of Recruiting Email: Telephone: ext 1125 Job Title: Senior Cloud Engineer Work Location: Chantilly, VA Primary Function: Secondary …
We are seeking a Lead Full Stack Software Developer, a technical team lead with strong systems, software, cloud, and Agile experience, to support a complex program providing Agile development and operations and maintenance for critical systems on a mission …
in CME's Cloud data transformation, the data SRE will be aligned to data product pods, ensuring our data infrastructure is reliable, scalable, and efficient as the GCP data footprint expands rapidly. Accountabilities: Automate data tasks on GCP Work with data domain owners, data scientists and other stakeholders to ensure that data is consumed effectively on GCP Design, build, secure and maintain data infrastructure, including data pipelines, databases, data warehouses, and data processing platforms on GCP Measure and monitor the quality of data on GCP data platforms Implement robust monitoring and alerting systems to proactively identify and resolve issues in data systems. Respond to incidents promptly to minimize downtime and data loss. Develop automation scripts and tools to … collaboration skills to work effectively in a team-oriented environment. Ideally a background in cloud computing and data-intensive applications and services, with a focus on Google Cloud Platform. 3+ years of experience in data engineering or data science. Experience with data quality assurance and testing. Ideally knowledge of GCP data services (BigQuery; Dataflow; Data Fusion …
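The monitoring duties described in this listing (measure data quality, alert proactively) can be illustrated with a minimal sketch. This is a hypothetical freshness check: the table name, lag threshold, and the `run_query` stub are all illustrative stand-ins, not anything from the listing; a real data SRE implementation would execute the SQL against a warehouse such as BigQuery and page an on-call system on failure.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical stand-in for a warehouse query (e.g. against BigQuery);
# a real check would execute the SQL against the data platform.
def run_query(sql: str) -> dict:
    return {"last_loaded_at": datetime.now(timezone.utc) - timedelta(minutes=30)}

def check_freshness(table: str, max_lag: timedelta) -> bool:
    """Return True if the table was loaded recently enough."""
    row = run_query(f"SELECT MAX(loaded_at) AS last_loaded_at FROM {table}")
    lag = datetime.now(timezone.utc) - row["last_loaded_at"]
    return lag <= max_lag

# A scheduled job would run this periodically and alert when it fails.
ok = check_freshness("analytics.orders", max_lag=timedelta(hours=1))
print("fresh" if ok else "stale")
```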
City of London, London, United Kingdom Hybrid / WFH Options
Inara
the ability to work across a range of tooling in the following areas: Programming: with at least one language; Go, Java, Python, Ruby, TypeScript etc Cloud: Either AWS, GCP, Azure Container/Orchestration: Docker, Kubernetes IaC: Terraform, CloudFormation, Crossplane, Ansible (Config Mgt) CI/CD: CircleCI, GitHub Actions, Jenkins, etc The Role: Hands-On: The Senior Engineering …
our front-end applications. This means we do data-intensive work for both OLTP and OLAP use cases. Our environments are primarily cloud-native spanning AWS, Azure and GCP, but we also work on systems running self-hosted open source services exclusively. We strive towards a strong code-first, data-as-a-product mindset at all times, where testing … for distributed data processing and analytics. Cloud Platforms: Deploy and manage data solutions on cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP), leveraging cloud-native services for data storage, processing, and analytics. Data Quality and Governance: Implement data quality checks, validation processes, and data governance policies to ensure accuracy, consistency, and … such as Apache Airflow, Informatica, or Talend. Knowledge of data governance and best practices in data management. Familiarity with cloud platforms and services such as AWS, Azure, or GCP for deploying and managing data solutions. Strong problem-solving and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying) Apache …
experience with Python and SQL is beneficial Practical experience with containerization and orchestration technologies (e.g., Kubernetes, Docker) Deep knowledge of cloud-native deployment architectures and experience with AWS, GCP, or Azure Demonstrated expertise in performance tuning, monitoring, debugging, and optimizing distributed data processing pipelines Proven experience conducting technical workshops, training sessions, or delivering professional services directly to customers Exceptional problem …
Windows, macOS Databases: proficiency in databases like MySQL, SQL Server, MongoDB, PostgreSQL, etc. Familiarity with platforms like Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform (GCP), etc. Proficiency in version control systems such as Git Knowledge of data and cloud security Knowledge of international software standards such as IEC 62304, IEC 62366, ISO/IEC …
Go development to join our Data Processing team at Snowplow. The Data Processing team is responsible for the applications that make up our core streaming pipeline product, running on GCP, AWS and Azure. You'll be responsible for solving complex problems in building, testing and maintaining our high-throughput real-time services, powering the next generation of Snowplow’s attribution … about building exceptional data pipelines, we want to hear from you! What You’ll Be Doing: Design, build and test real-time data services (e.g., identity graphs, attribution) on GCP/AWS/Azure, delivering reliable, high-quality code Build robust QA, unit and integration tests both within our Go projects, and using our Go-based automated QA framework Collaborate … practices Proficiency with tools like Terraform/IaC tooling and GitHub Actions Familiarity with containerization tools such as Docker Experience with cloud-based services and environments (e.g., AWS, GCP, Azure) Excellent problem-solving skills and attention to detail You approach software delivery pragmatically, balancing rapid learning with a commitment to reliable, trusted service for our customers You May Also …
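The "robust QA, unit and integration tests" responsibility in the Snowplow listing refers to its Go codebase; purely as a generic illustration of the testing pattern (not Snowplow's actual schema or API), a pipeline-style enrichment step and its unit tests might look like this in Python. The event shape and `enrich` logic are invented for the example.

```python
# Illustrative pipeline stage: the event fields and tagging rule are
# hypothetical, not taken from any real product's schema.
def enrich(event: dict) -> dict:
    """Attach a derived channel field, rejecting events without a user id."""
    if not event.get("user_id"):
        raise ValueError("missing user_id")
    return {**event, "channel": "paid" if event.get("utm_source") else "organic"}

def test_enrich_tags_paid_traffic():
    out = enrich({"user_id": "u1", "utm_source": "ads"})
    assert out["channel"] == "paid"

def test_enrich_rejects_anonymous_events():
    try:
        enrich({"utm_source": "ads"})
        assert False, "expected ValueError"
    except ValueError:
        pass

test_enrich_tags_paid_traffic()
test_enrich_rejects_anonymous_events()
print("all tests passed")
```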
London, England, United Kingdom Hybrid / WFH Options
NMI
with relational databases (e.g., MySQL, SQL Server, Oracle). Strong knowledge of security best practices (e.g. OWASP, PCI, SOC2, HIPAA). Proficiency with Google Cloud Platform (GCP), Amazon Web Services (AWS), or similar cloud environments. Demonstrated experience applying modern software development practices in a collaborative, agile environment. Excellent communication skills, with a proven ability to mentor …
London, England, United Kingdom Hybrid / WFH Options
Elasticsearch B.V
domain. Understanding and/or certification in one or more of the following technologies: Kubernetes, Linux, Java and databases, Docker, Amazon Web Services (AWS), Azure, Google Cloud (GCP), Kafka, Redis, VMs, Lucene. Occasional travel may be required. Bonus Points: Certifications and specialization in Data Science, Data Analytics, Data Engineering, Machine Learning, NLP, Data Infrastructure, analytics Deep understanding …
Python or PowerShell Production experience operating containerization technologies - ideally with Kubernetes and/or Docker. Proficiency with one or more public cloud providers such as Azure, AWS or GCP Proficiency using Infrastructure as Code (IaC) tools such as Terraform (preferred), Ansible, or CloudFormation. Experience with monitoring, observability and logging tools such as Datadog, Prometheus, Grafana, or similar. Proven track …
team for a contract until the end of November. Job Responsibilities/Objectives You will leverage your expertise in Software Development, along with your DevOps proficiency across AWS and GCP, to design and implement resilient frontend clients, backend services, infrastructure automation, and cloud-native solutions. This is an opportunity to work on high-impact systems within a secure, high … CI/CD pipelines in collaboration with DevOps and Security teams, with a focus on traceability and regulatory controls. Manage, monitor, and optimize cloud infrastructure across AWS and GCP, ensuring resilience, cost-efficiency, and data security. Collaborate closely with infrastructure, architecture, and cybersecurity teams to meet internal risk, compliance, and governance requirements. Support live systems, perform root cause analysis … scale, distributed systems. Proficient in Python and Go. Experience with Liquibase or similar tools for database change management and version control. Hands-on experience with AWS and/or GCP, including cloud-native services, networking, IAM, and cost optimization. Experience with other cloud providers is desirable. Proven experience with DevOps practices, including Infrastructure as Code (e.g., Terraform), CI …
Join Vonage and help us innovate cloud communications for businesses worldwide! Vonage is the emerging leader in the $100B+ cloud communications platform (CPaaS) market. Customers like Airbnb, Viber, WhatsApp, Snapchat, and many others depend on our APIs …
Hands-on expertise in modern data platforms, tools, and technologies, such as: Advanced data modelling - operational and analytical Python, SQL Databricks, Spark Orchestration frameworks such as Dataform, Airflow, and GCP Workflows. Modern architecture and cloud platforms (GCP, AWS, Azure) DevOps practices Data warehouse and data lake design and implementation Familiarity with containerization and IaC tools (e.g., Docker, Kubernetes, Terraform …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Hypercube Consulting
Bonus + Benefits TL;DR Role: Data engineer role Location: UK-based, fast-growing data and AI consultancy specialising in the energy sector Cloud: Strong AWS, Azure or GCP (multi-cloud desirable, certs a plus) Consultancy/Energy Experience: Highly beneficial, not essential Visa Sponsorship: Not available – you must already have the right to work in the UK … only part of the equation. Core - Ideally, you will have hands-on experience with the following in a previous role: Python SQL Cloud platform architecture (AWS, Azure, GCP) Data Lakes/Lakehouses and analytical tools (Databricks, Azure Fabric/OneLake, AWS Lake Formation, Spark, Athena, etc.) CI/CD and other DevOps practices such as IaC Testing Nice …
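The core Python-plus-SQL skill set this consultancy lists can be sketched with a tiny, self-contained example. An in-memory SQLite database stands in for the warehouse engines named above (Athena, Databricks SQL, etc.); the `readings` table and its not-null rule are invented for illustration, not from the listing.

```python
import sqlite3

# In-memory SQLite stands in for a warehouse engine; a real pipeline
# would run the same style of SQL check against Athena/Databricks/etc.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (site TEXT, mw REAL)")
conn.executemany("INSERT INTO readings VALUES (?, ?)",
                 [("solar_a", 4.2), ("wind_b", None), ("solar_a", 3.9)])

# A basic data-quality check: count rows violating a not-null rule.
bad = conn.execute("SELECT COUNT(*) FROM readings WHERE mw IS NULL").fetchone()[0]
total = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
print(f"{bad}/{total} rows fail the not-null check")
```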
boundaries. Preferred Experience - 3+ years of professional software development experience. - Exposure to containerization (e.g., Docker) and orchestration (e.g., Kubernetes). - Experience with cloud platforms (e.g., Azure, AWS, or GCP). - Familiarity with distributed systems, compilers, or low-latency applications. - Knowledge of DevOps practices and CI/CD pipelines, with experience in GitHub Actions or similar tools. Our Tech Stack …
infrastructure systems, ensuring alignment with business needs and technology standards. Minimum qualifications Proven experience managing and optimizing a diverse infrastructure stack. Extensive knowledge of cloud platforms (AWS, Azure, GCP) and infrastructure as code (Terraform, CloudFormation). Familiarity with service mesh technologies (Istio, Linkerd). Solid understanding of virtualization (VMware, Hyper-V), containerization (Docker, Kubernetes), and orchestration. Understanding of …
business platforms including Addepar, NetSuite, Salesforce, and other external and internal applications. Manage cloud-based data infrastructure on platforms such as Azure, Amazon Web Services, or Google Cloud Platform, with a focus on cost optimization, stability, scalability, and performance. Collaborate with business analytics and data science teams to ensure data environments are optimized for downstream consumption, including …
Job Description Maxar is seeking a talented and security-oriented Software Engineer with API and front-end development experience to join our Usage, Billing, and Metrics team within Maxar Geospatial Platform Core. Responsibilities: Develop, deploy, scale, and maintain highly …