don't ask that you have experience in all of this, but if you do, that's great!
- Java, which makes up the majority of our backend codebase
- AWS & GCP - we're cloud-native
- Microservice-based architecture
- Kubernetes (EKS)
- TeamCity for CI/CD (lots of teams are releasing code 15-20 times per day!)
- Terraform and Grafana
The team More ❯
tooling. Collaborate with engineering, security, and product teams to drive end-to-end reliability. Qualifications: 6+ years of DevOps/SRE experience in cloud environments (AWS, Azure, or GCP). Expertise in Kubernetes, Docker, Helm, and microservices architectures. Strong scripting skills in Python, Bash, or TypeScript. Proven track record of automating build, test, and deployment workflows. Clear communicator with More ❯
You'll Need 3-5+ years in DevOps, with proven CI/CD pipeline implementation experience. 3-5+ years' experience working with cloud platforms (AWS, Azure, or GCP). Strong scripting and automation skills (e.g., Python, JavaScript). Proficiency in tools such as Jenkins, CircleCI, GitHub/GitLab, Terraform, and Ansible. Familiarity with containerization (Docker), networking, and cloud More ❯
Key responsibilities include: Develop, maintain, and improve CI/CD pipelines using tools such as GitHub Actions, Jenkins, GitLab CI, or similar. Manage cloud infrastructure (OCI, AWS, Azure, or GCP) using Infrastructure as Code tools like Terraform or Serverless Functions. Monitor system health and performance using tools like Prometheus, Grafana, Datadog, or New Relic. Collaborate closely with development teams to More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Omnis Partners
Strong in Python and SQL , plus familiarity with Scala or Java Experience supporting AI/ML workflows and working with Data Scientists Exposure to cloud platforms: AWS , Azure , or GCP Hands-on with modern data tooling: Spark , Databricks , Snowflake , Airflow Solid grasp of data modelling, orchestration, and infrastructure-as-code (Terraform, Docker, CI/CD) Excellent communication and client-facing More ❯
Experience with containerized environments (e.g., Docker, Kubernetes). Exposure to various messaging platforms. Understanding of DevOps methodologies and CI/CD pipelines. Knowledge of cloud environments (e.g., AWS, Azure, GCP) and cloud-native deployments. Why Join Us? Be part of a mission-critical team enabling real-time data flows. Work with cutting-edge technologies and contribute to high-impact projects. More ❯
infrastructure and production workflows. Strong technical foundation in machine learning and software engineering Proficiency in Python and ML libraries (e.g., TensorFlow, PyTorch, scikit-learn) Experience with cloud platforms (AWS, GCP, Azure) Experience with CI/CD pipelines for machine learning (e.g., Vertex AI) Familiarity with data processing tools like Apache Beam/Dataflow Strong understanding of monitoring and maintaining models More ❯
pg_stat_statements, Prometheus, Grafana). • Knowledge of scripting languages for automation and tooling. Preferred Qualifications: • Experience with containerized environments (Docker, Kubernetes). • Familiarity with cloud platforms (AWS RDS, GCP Cloud SQL, or Azure Database for PostgreSQL). • Understanding of CI/CD pipelines and infrastructure as code (Terraform, Ansible). • Exposure to other RDBMS (e.g., Oracle, MySQL) or NoSQL More ❯
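As a small illustration of the query-triage work this listing describes, the hedged Python sketch below ranks queries by mean execution time, the way one might when reading `pg_stat_statements` output. The column names mirror that view, but the sample rows are invented for illustration:

```python
# Rank queries by mean execution time, as when triaging pg_stat_statements
# output. The sample rows below are invented for illustration only.

def rank_by_mean_time(rows):
    """rows: dicts with 'query', 'calls', and 'total_exec_time' (ms).

    Returns (mean_ms, query) pairs, slowest first.
    """
    ranked = []
    for r in rows:
        if r["calls"] == 0:
            continue  # skip never-executed entries to avoid dividing by zero
        mean_ms = r["total_exec_time"] / r["calls"]
        ranked.append((mean_ms, r["query"]))
    ranked.sort(reverse=True)
    return ranked

sample = [
    {"query": "SELECT * FROM orders WHERE id = $1", "calls": 1000, "total_exec_time": 500.0},
    {"query": "SELECT count(*) FROM events", "calls": 10, "total_exec_time": 4000.0},
]
# The count(*) query surfaces first: 400 ms mean vs 0.5 ms.
print(rank_by_mean_time(sample)[0][1])
```

In practice one would query the view directly (`SELECT query, calls, total_exec_time FROM pg_stat_statements`) rather than hard-code rows; this only shows the ranking step.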
Angular. Hands-on experience building and scaling ML models (LLMs, NLP, classification) and deploying them as APIs/services. Strong skills in MLOps: containerisation (Docker, Kubernetes), cloud deployment (AWS, GCP, Azure), and CI/CD pipelines. Experience with prompt engineering, LLM evaluation, and vector databases (Pinecone, Weaviate, FAISS). Excellent communication skills and cross-functional collaboration experience. Benefits: Our team More ❯
broker management, scaling, upgrades, and integration with high-throughput data pipelines. Experience managing PostgreSQL databases, including load analysis, query optimisation, and role management. Experience with modern cloud infrastructure such as GCP/AWS, Kubernetes, and Docker. Proficiency in Unix systems, ideally Linux (we use Ubuntu). Strong communication skills, with experience mentoring engineers and collaborating with stakeholders. Proven ability to resolve More ❯
/Spring Boot, TypeScript/React/Angular, Golang, or Python • Proven track record of delivering scalable and maintainable solutions • Hands-on experience with cloud platforms (AWS, Azure, or GCP) • Solid understanding of CI/CD pipelines, Git, and secure software development • Ability to navigate complex systems and communicate technical concepts clearly • Leadership experience and ability to influence team and More ❯
Chantilly, Virginia, United States Hybrid / WFH Options
RogueThink Inc
with InfoSec to integrate compliance and security scanning tools (e.g., SAST, DAST, SCA) into build pipelines. • Implement and monitor security controls and configurations in cloud platforms (AWS, Azure, or GCP). • Conduct vulnerability assessments and assist in remediation strategies. • Provide documentation and knowledge transfer to operations and development teams. • Collaborate with development teams on secure coding practices, especially in Node.js More ❯
Charles, Proxyman. CI tools like Bitrise, GitLab CI. Application security knowledge. Experience working with Android teams, website, or microservices. Knowledge of GraphQL, microservice architectures, Docker, Kubernetes, cloud platforms (AWS, GCP, Azure), UX principles, web technologies, Java/Kotlin. Additional Information: Application involves online assessment, CV upload, and questions. More info at Next steps include a screening call, tech assessment, and More ❯
Greater Bristol Area, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
experience 4+ years in a data engineering or similar role, building and maintaining data pipelines Experience with various cloud platforms (as opposed to specialising in one) such as AWS, GCP and Azure Strong experience with Python and SQL Experience working with data lakes, warehouses, and large-scale datasets Understanding of data privacy and security principles Exposure to life sciences, genomics More ❯
experience of these would be really useful Scripting or Automation - Basic knowledge of Bash or Python to automate routine tasks Cloud support knowledge - Understanding of cloud environments like OCP, GCP, Azure, AWS as well as Private Cloud solutions Soft skills & Emotional Intelligence - Staying calm under pressure and effectively handling stressful situations whilst continuing to communicate effectively Observability - Familiarity with monitoring More ❯
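The "automate routine tasks" skill above is the kind of thing often probed in interviews. As a hedged sketch (directory layout and the 7-day threshold are illustrative choices, not anything from the listing), here is a small Python routine that prunes stale files, a classic log-rotation-style chore:

```python
# Routine-task automation sketch: delete regular files older than
# `max_age_days` from a directory and report what was removed.
import time
from pathlib import Path

def prune_old_files(directory, max_age_days, now=None):
    """Delete files whose mtime is older than max_age_days; return their names."""
    now = time.time() if now is None else now
    cutoff = now - max_age_days * 86400  # seconds per day
    deleted = []
    for path in Path(directory).iterdir():
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
            deleted.append(path.name)
    return sorted(deleted)

# Example call (assumes /var/log/myapp exists; path is hypothetical):
# prune_old_files("/var/log/myapp", max_age_days=7)
```

The injectable `now` parameter keeps the function testable without touching the real clock; in Bash the equivalent one-liner would be `find DIR -type f -mtime +7 -delete`.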
and drivers. Experience with Git for version control. Hands-on experience with CI tools such as Jenkins or Harness. Experience packaging and deploying containers using Docker, K8S, AWS/GCP/OpenShift Experience with ALM, Zephyr, Jira, Confluence, Jenkins, Docker Strong analytical and troubleshooting skills. Excellent verbal and written communication skills. Why Join Citi? Citi is a global leader in More ❯
dbt, Airflow, and cloud data warehouses (Snowflake, BigQuery, Redshift) Strong understanding of data modelling, schema design, and building maintainable ELT/ETL pipelines Experience with cloud platforms (AWS, Azure, GCP) and infrastructure-as-code practices Familiarity with data visualisation tools (Tableau, PowerBI, Looker) and analytics frameworks Leadership & Communication Proven experience leading technical work streams and mentoring junior team members Exceptional More ❯
a focus on building scalable data systems and platforms. Strong expertise with modern data tools and frameworks such as Spark , dbt , Airflow , Kafka , Databricks , and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling , distributed systems , streaming architectures , and ETL/ELT pipelines . Proficiency in SQL and at least one programming language such as Python More ❯
low-level Linux issues, utilizing the command line, and shell scripting (bash) Strong fundamentals in data structures, design patterns, and algorithms Experience working with public clouds (e.g., AWS, Azure, GCP) Experience working with Docker and Kubernetes Understanding of authentication and authorization frameworks/standards (e.g., OAuth) Familiarity with hypervisors (e.g., VMWare, Hyper-V, VirtualBox, KVM) is a plus Familiarity with More ❯
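To make the "authentication and authorization frameworks (e.g., OAuth)" bullet concrete: OAuth deployments commonly carry claims in JWTs, which are three base64url segments joined by dots. A minimal sketch of splitting one apart is below; note it deliberately does not verify the signature, which a real service must do with a vetted library (e.g., PyJWT):

```python
# Hedged sketch: decode a JWT's header and payload WITHOUT verifying the
# signature. Never trust unverified claims in production code.
import base64
import json

def decode_jwt_unverified(token):
    """Return (header, payload) dicts from a compact-serialized JWT."""
    header_b64, payload_b64, _signature = token.split(".")

    def b64json(segment):
        # JWT segments use unpadded base64url; restore padding first.
        padded = segment + "=" * (-len(segment) % 4)
        return json.loads(base64.urlsafe_b64decode(padded))

    return b64json(header_b64), b64json(payload_b64)
```

The `alg` field in the decoded header tells the verifier which signature scheme to check; accepting it blindly (especially `"none"`) is a well-known JWT pitfall.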
databases , and Git . A knack for responsive design and mobile-first development. A mindset that blends innovation, curiosity, and precision . Bonus Points For: Cloud wizardry (AWS, Azure, GCP) Knowledge of AI tools (OpenAI, Document Intelligence) Experience with CI/CD pipelines and modern DevOps practices Security know-how (OWASP, data protection) Agile team experience - or just loving the More ❯