Established ‘ethically minded’ FinTech specialising in cutting-edge Data Analytics is looking for a DevOps & Infrastructure Engineer. Wide variety of technology and cloud providers, including (but not limited to): AWS (for most of their own internal projects), Azure, and …
enterprise data pipelines. Mentor engineering teams and lead best-practice adoption across data architecture, orchestration, and DevOps tooling. Participate in technical workshops, executive briefings, and architecture reviews to evangelize GCP data capabilities. Required Qualifications: Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related technical field. 12+ years of experience in data architecture and data engineering with … BigQuery, Cloud Spanner, and Bigtable. Hands-on experience with orchestration tools such as Apache Airflow or Cloud Composer. Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for …
London) with a strong delivery and solutions background in Data Analytics and Cloud. The ideal candidate should have expertise in next-gen data technologies such as Microsoft, AWS, GCP, Snowflake, and Databricks. Sales, delivery, presales, and solutioning experience within Data Analytics is mandatory, along with the ability to deliver cloud-based enterprise platforms and data consulting skills is a … years of total experience in DWBI, Big Data, and Cloud Technologies. Implementation experience and hands-on experience in at least 2 of the cloud technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have hands-on experience on at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam, or equivalent). … Good written and verbal communication skills. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable Skills …
Qualifications: Proven experience in technical leadership roles, with a strong background in platform engineering and delivery. Experience with AI technologies. Proficiency in cloud platforms (e.g., AWS, Azure, GCP) and container orchestration (e.g., Kubernetes). Excellent stakeholder management skills with a track record of successfully delivering complex technical projects on time and within budget. Strong analytical and problem-solving …
Orchestration: In-depth knowledge and hands-on experience with Kubernetes and its managed counterparts (EKS, AKS, GKE). Public Cloud Expertise: In-depth knowledge of AWS, Azure, or GCP services, architecture, and best practices. DevOps Mastery: Excellent knowledge of CI/CD, containerisation, and serverless technologies. Networking: Expertise in designing and managing secure, scalable AWS network architectures (VPC, VPN … and Industry Involvement: Proven commitment to staying up to date with the latest trends and contributing to industry knowledge sharing. Qualifications: Certification in relevant cloud technologies (AWS, Azure, GCP) at an expert/professional level is highly desirable, e.g. AWS Solutions Architect Professional, AWS Certified Advanced Networking – Specialty (highly desirable). Certification in Kubernetes administration is desirable: Certified Kubernetes Administrator …
London, England, United Kingdom Hybrid / WFH Options
Our Future Health
makes. Requirements / Experience: Proficiency in cloud DevOps/platform engineering and large-scale live services. Azure experience preferred; experience with an additional cloud (AWS or GCP) also preferred. Hands-on experience with infrastructure-as-code tools such as Terraform, Ansible, Chef, Puppet. Strong experience coding and automating tasks in a high-level language, preferably Python. Experience in building …
London, England, United Kingdom Hybrid / WFH Options
Rightmove
who: Always pushes for continuous improvement and has strong attention to detail. Positive outlook, good people and communication skills. Has an automation mindset. Relevant technology we use: Google Cloud Platform, Google Kubernetes Engine with Anthos Service Mesh, Confluent Cloud, Incident.io, GitLab, Jira, Confluence, Slack, Teams, Elastic APM, Kibana, Java, Node, Python, JavaScript, Go, React. About …
Newcastle Upon Tyne, United Kingdom Hybrid / WFH Options
NHS Business Services Authority
Job summary: If you are looking for the opportunity to be responsible for the day-to-day reactive and proactive support of the platforms that host business-critical NHS Business Services Authority (NHSBSA) systems, as well as getting involved in …
Preferred Technical and Professional Experience: Experience with AI/ML workloads and infrastructure. Knowledge of cybersecurity best practices and threat mitigation. Familiarity with multiple cloud vendors (e.g., AWS, GCP, Azure). Seniority level: Mid-Senior level. Employment type: Full-time. Job function: Information Technology. Industries: IT Services and IT Consulting.
Ability to juggle tasks and projects in a fast-paced environment. PREFERRED QUALIFICATIONS: Professional experience with AWS and/or other cloud offerings such as Azure, Google Cloud Platform, etc. Programming or scripting skills with a combination of Java, Python, Perl, Ruby, C#, and/or PHP. Previous experience as a Software Engineer, Developer, Solution Architect …
Farnborough, England, United Kingdom Hybrid / WFH Options
Searchability NS&D
Grafana. Comfortable in fast-paced, agile environments. Excellent communication and problem-solving skills. Active SC or DV clearance required. NICE TO HAVE: Experience with cloud platforms (AWS, Azure, GCP). GitOps mindset and secure-by-design principles. Background in consultancy or client-facing delivery. WHY JOIN? Rapid career growth in a mission-led tech environment. Flexible working & supportive team culture …
contributions, and a variety of benefit options for you to choose from. Your future role: We are looking for Senior DevOps Engineers in the areas of Cloud Native Services (AWS, GCP, Azure), Software-Defined Networks, Operating Systems, and Infrastructure as Code within the Technology & Engineering unit of the newly formed Sovereign Cloud Delivery & Operations team in the Product Engineering board … CD) system. What you bring: Bachelor's degree in computer science, engineering, or a related field, or equivalent experience. Knowledge of cloud services such as AWS EC2, S3, GCP Compute Engine, Object Storage, Azure Compute, etc. Implementation experience with CI/CD pipelines, e.g. Jenkins, Concourse CI, ArgoCD, Azure Pipelines, CircleCI, GitLab. Knowledge of/certification on hyperscaler infrastructure such as AWS, Azure, GCP. Infra automation skills using Git, Ansible, and Terraform. Solid experience in installing, configuring, and troubleshooting UNIX/Linux-based environments. Strong scripting skills in any one of the following: shell scripts (bash/zsh), Ruby, Python. Strong knowledge of and experience with Helm is an added advantage. Knowledge of programming languages preferred: Python, Go, Groovy, Java, Rust …
Our working language is English; hence it’s very important to be proficient in it. Extensive knowledge and experience in one of the major clouds, including AWS, Azure, or GCP, with a comprehensive understanding and real-world implementation experience (we currently use AWS and Azure). Microservices in a cloud-native world: architecture, deployments, and engineering in the Kubernetes …
including architectures and user guides, helping to enforce data management standards. Participate in agile ceremonies and provide occasional client interaction. Engage in DataOps practices and improve data delivery performance. GCP: GCS, BigQuery, GKE, Artifact Registry, Vertex AI, App Engine, Datastore, Secret Manager, Pub/Sub. What do I need to bring with me? It is essential that your personal … data systems optimisation. Commitment to data quality and experience in synthetic data and AI/ML model deployment. Excellent communication skills and a collaborative, positive mindset. Willingness to learn. GCP certifications are a plus. We offer a comprehensive benefits package designed to support you as an individual. Our standard benefits include 25 days' annual leave, pension contribution, income protection, and …
Optimistically by Championing Growth and Development to Mobilize the Enterprise. What You’ll Need: Recent hands-on experience (7+ years) as a Cloud Engineer/DevOps/SRE (GCP/AWS/Azure). Proven work experience in automating tasks within a cloud-based environment, using infrastructure coding tools (Terraform, Terragrunt). Experience with configuration management tools (Ansible, …
Proficiency in at least one programming language (e.g. Python, Java, Go). Familiarity with IaC/IfC tools (e.g. Terraform). Experience with cloud platforms (AWS, Azure, or GCP) and their associated data services. Hands-on experience with continuous integration and deployment systems (e.g. Jenkins, Tekton). Practical experience with containerization and orchestration technologies, particularly Kubernetes. Familiarity with observability …
Familiarity with product management principles and collaboration with cross-functional teams. Skills and Competencies: Technical Proficiency: Strong hands-on experience with programming languages, cloud platforms (e.g., AWS, Azure, GCP), and modern development tools. Leadership: Ability to inspire, motivate, and guide engineering teams toward achieving shared goals. Problem Solving: Analytical and strategic thinking to address technical and delivery challenges. Communication …
with CI/CD pipelines and associated tools/frameworks. Containerisation: Good knowledge of container technologies such as Docker and Kubernetes. Cloud: Experience with cloud platforms like GCP, AWS, or Azure. Good understanding of cloud storage, networking, and resource management.
London, England, United Kingdom Hybrid / WFH Options
Attest
methodologies, CI/CD pipelines, security best practices, and cloud-based technologies (we use AWS, but we believe skills are transferable from similar products such as Azure and GCP). We want this person to delve into the challenges and focus on outcomes. This role might not be for you if … you want to work in a …
Preferred technical and professional experience: Experience with AI/ML workloads and infrastructure. Knowledge of cybersecurity best practices and threat mitigation. Familiarity with multiple cloud vendors (e.g., AWS, GCP, Azure). ABOUT BUSINESS UNIT: IBM Consulting is IBM’s consulting and global professional services business, with market-leading capabilities in business and technology transformation. With deep expertise in many …
London, England, United Kingdom Hybrid / WFH Options
Doit Intl
quickly learn and stay up to date with industry trends. Data certification is a major advantage (e.g., Stanford, Coursera, Udacity, MIT, eCornell, or any Data certification with AWS/GCP). BA/BS degree in Computer Science, Mathematics, Economics, or a related technical field, or equivalent practical experience. Be your truest self. Work on your terms. Make a difference.