experience with agile methodologies and trunk-based development. Solid expertise in database modeling, optimization, and performance tuning, specifically PostgreSQL. Hands-on experience with event-driven queuing services (AWS SNS, Kafka, or similar). Experience with LLMs and AI agents. Experience with building resilient and high-performance backend architectures (e.g., REST APIs, microservices). Nice to have: certifications in AWS …
the bigger picture. Essential Skills: Significant demonstrable experience in one or both of: modern software engineering for digital products (ideally Java, microservice architectures, Hexagonal software architecture, Mongo and Kafka); or designing and implementing modern cloud infrastructure, DevOps, and automation (ideally AWS, Terraform, GitLab CI, Jenkins). Significant demonstrable experience of leading engineering teams, providing technical leadership and guidance …
working with internal technical staff and stakeholders. Practical experience of analysing requirements. Ability to quickly learn new technical skills. Coaching and mentoring. Desirable: Experience of working with Flask, MongoDB, Kafka, TypeScript, AWS. Experience of working with statically typed Python codebases. Experience defining, implementing or supporting software in the financial services industry (either traditional finance or crypto) preferred. Experience of Domain …
with database design and management using SQL and NoSQL databases. Familiarity with cloud infrastructure services like AWS, GCP, or Azure is a plus. Experience with message brokers such as Kafka or RabbitMQ. Strong analytical and problem-solving skills. Excellent communication skills and ability to work well in a team environment. Amber Group opens its doors to the curious problem …
St. Albans, Hertfordshire, United Kingdom Hybrid / WFH Options
Lindar
application architecture and microservices. Strong grasp of and professional experience with concurrent programming concepts, idempotency, and distributed transaction management in application logic. Familiarity with the technologies in our stack, including Kafka, Redis, and Kubernetes. Practical experience with MongoDB would be considered a big advantage. Knowledge of AngularJS is a plus. Passion in the eyes and swiftness in the fingers ;) What …
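Idempotency in application logic, as the listing above asks for, means a retried request must not repeat its side effects. A minimal sketch of the pattern (the in-memory store and names are illustrative assumptions, not from the listing; production code would persist keys in something like Redis or PostgreSQL):

```python
import threading

class IdempotentProcessor:
    """Runs an operation at most once per idempotency key.

    Retries with the same key replay the cached result instead of
    re-executing the side effect (e.g. charging a card twice).
    """

    def __init__(self):
        self._results = {}             # idempotency_key -> cached result
        self._lock = threading.Lock()  # guard against concurrent retries

    def run(self, idempotency_key, operation):
        with self._lock:
            if idempotency_key in self._results:
                return self._results[idempotency_key]  # duplicate: replay
            result = operation()
            self._results[idempotency_key] = result
            return result

# Usage: the side effect fires once even if the client retries.
calls = []
proc = IdempotentProcessor()
proc.run("order-42", lambda: calls.append("charged") or "ok")
proc.run("order-42", lambda: calls.append("charged") or "ok")  # retry: no second charge
```

The lock makes concurrent retries safe within one process; across processes the key check would need to live in a shared store.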
in setting up and managing monitoring, metrics, and alerting systems. Experience operating production-grade services at scale. Great to have: Experience with tools such as Terraform, SaltStack, MongoDB, Elasticsearch, Kafka, Prometheus, Grafana or HashiCorp Vault. Experience with securing applications, services, and data, including authentication, authorization, TLS, and encryption. Exposure to Kubernetes (administering, deploying, or developing apps on K8s clusters …
tools. Strong technical proficiency in data modeling, SQL, NoSQL databases, and data warehousing. Hands-on experience with data pipeline development, ETL processes, and big data technologies (e.g., Hadoop, Spark, Kafka). Proficiency in cloud platforms such as AWS, Azure, or Google Cloud and cloud-based data services (e.g., AWS Redshift, Azure Synapse Analytics, Google BigQuery). Experience with DataOps …
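Several of the listings above ask for hands-on ETL pipeline development. The core shape is three composable stages; a minimal sketch in plain Python (record shapes and stage names are illustrative assumptions, not any particular framework's API):

```python
def extract(rows):
    """Extract stage: yield raw records from a source (here, an in-memory list)."""
    yield from rows

def transform(records):
    """Transform stage: normalise fields and drop records that fail validation."""
    for rec in records:
        if rec.get("amount") is None:
            continue  # drop invalid records
        yield {"name": rec["name"].strip().lower(), "amount": float(rec["amount"])}

def load(records, sink):
    """Load stage: append cleaned records to a destination store."""
    for rec in records:
        sink.append(rec)

# Run the pipeline end to end over a toy dataset.
sink = []
raw = [{"name": " Alice ", "amount": "10"}, {"name": "Bob", "amount": None}]
load(transform(extract(raw)), sink)
# sink now holds one cleaned record; Bob was dropped in the transform stage
```

Because each stage is a generator, records stream through one at a time; the same shape scales to Spark or warehouse-backed loads by swapping the stage implementations.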
and Data pipelines are built on top of the open-source Flyte orchestration framework and are deployed to AWS. Pipeline code is written in Python. We use SQS and Kafka to automate data connections and leverage BigQuery and Elasticsearch for data storage. We believe strongly in automation and testing to ensure delivery of robust and correct systems. We are …
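The SQS/Kafka-driven automation described above boils down to consuming messages from a queue and dispatching each one to the data connection it triggers. A broker-free sketch of that dispatch pattern (message shapes and handler names are illustrative assumptions; real code would use boto3 or a Kafka client instead of `queue.Queue`):

```python
import queue

# Map a message "type" to the handler that runs the matching data connection.
handlers = {
    "sync_users": lambda msg: f"synced users from {msg['source']}",
    "reindex":    lambda msg: f"reindexed {msg['index']}",
}

def consume(q, processed):
    """Drain the queue, dispatching each message to its registered handler."""
    while not q.empty():
        msg = q.get()
        handler = handlers.get(msg["type"])
        if handler is None:
            continue  # unknown message type: skip (or route to a dead-letter queue)
        processed.append(handler(msg))

# Simulate two incoming events, as a broker would deliver them.
q = queue.Queue()
q.put({"type": "sync_users", "source": "crm"})
q.put({"type": "reindex", "index": "orders"})
processed = []
consume(q, processed)
```

With a real broker the loop would block on delivery rather than drain-and-exit, and handlers would need to be idempotent since SQS and Kafka both deliver at least once.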
end-to-end, deploy to production frequently, and see the real-world impact of what they build. You'll also get to work with a modern tech stack: Kotlin, Kafka, Kubernetes, Docker, AWS, Aurora Postgres and more. We work hybrid, with at least 2 days a week together in our Manchester office. A day in the life: Owning and …
or similar data structures and feeds. Exposure to microservices systems and container/cloud deployment and hosting designs. Experience leveraging LLM assistant tools judiciously for coding. Additional technology experience: Kafka, ELK, Mongo, DBaaS, SaaS, Tableau. Education: Bachelor's degree or equivalent experience operating in a similar role. This job description provides a high-level review of the types of …
as Azure (our preference) or AWS, including active participation in building CI/CD pipelines and instrumenting for operations, is required. Experience with streaming and messaging platforms, such as Kafka or NATS, is highly desirable. Proven experience in leading the design of complex software applications, preferably for corporate enterprise, where solutions comprise existing services alongside custom development. We are …
/Integration - Understanding of the application middleware space. Technical knowledge of messaging products such as IBM MQ, TIBCO EMS/RV and/or open-source messaging such as Kafka, RabbitMQ, ActiveMQ, etc. would be beneficial to the role. Enterprise Integration Patterns and solution architecture experience would be an asset. Knowledge of Digital patterns, application lifecycle management, continuous integration …
or non-relational databases, preferably PostgreSQL, DynamoDB, AWS Athena. Nice to haves: Experience with eCommerce. Experience with Docker and Kubernetes. Experience with event-driven architectures, preferably using RabbitMQ or Kafka. Experience in using production AWS infrastructure, ideally with Terraform. Additional Information: PMI and cash plan healthcare access with Bupa. Subsidised counselling and coaching with Self Space. Cycle to Work …
deliver on time. Collaborative mindset with experience working in cross-functional teams. Bonus Points: Experience with machine learning workflows and MLOps practices. Knowledge of real-time streaming data processing (Kafka, Kinesis). Background in consulting or client-facing technical roles. Contributions to open-source analytics or data engineering projects. What Success Looks Like in Your First Year: Technical Impact: Successfully …
Proficiency in Unix systems, ideally Linux (Ubuntu). Strong communication skills and experience mentoring engineers. Responsibilities: Your main responsibilities will involve working on: the data pipeline and underlying infrastructure, especially Apache Kafka and the PostgreSQL database; technologies like Kubernetes, Vault, Go microservices, and other cloud services; on-premise infrastructure. Your technical expertise will play a crucial role in these areas.
. Advanced skills in SQL and Python, with hands-on experience in relational databases across cloud and on-prem environments. Familiarity with modern data technologies such as Apache Spark, Kafka, or Snowflake. A comprehensive understanding of the data engineering lifecycle, including Agile delivery, DevOps, Git, APIs, containers, microservices, and pipeline orchestration. Nice to have: Databricks Certified Data …
platforms (Azure, AWS, GCP). Hands-on experience with SQL, data pipelines, data orchestration and integration tools. Experience in data platforms on premises/cloud using technologies such as: Hadoop, Kafka, Apache Spark, Apache Flink, object, relational and NoSQL data stores. Hands-on experience with big data application development and cloud data warehousing (e.g. Hadoop, Spark, Redshift, Snowflake, GCP BigQuery …
and sustainability, delivering under tight deadlines without compromising quality. Your Qualifications: 12+ years of software engineering experience, ideally in platform, infrastructure, or data-centric product development. Expertise in Apache Kafka, Apache Flink, and/or Apache Pulsar. Deep understanding of event-driven architectures, data lakes, and streaming pipelines. Strong experience integrating AI/ML models into production systems, including …
. Advanced skills in SQL and Python, with hands-on experience in relational databases across cloud and on-prem environments. Familiarity with modern data technologies such as Apache Spark, Kafka, or Snowflake. A comprehensive understanding of the data engineering lifecycle, including Agile delivery, DevOps, Git, APIs, containers, microservices, and pipeline orchestration. Nice to have: DP-203 Azure …
Data Science, Engineering, or a related field. Strong programming skills in languages such as Python, SQL, or Java. Familiarity with data processing frameworks and tools (e.g., Apache Spark, Hadoop, Kafka) is a plus. Basic understanding of cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services. Knowledge of database systems (e.g., MySQL, PostgreSQL, MongoDB) and data warehousing concepts. …
Comfortable balancing technical excellence with business priorities and constraints. Nice to have: Experience building Data Mesh or Data Lake architectures. Familiarity with Kubernetes, Docker, and real-time streaming (e.g. Kafka, Kinesis). Exposure to ML engineering pipelines or MLOps frameworks. What's it like to work at Zego? Joining Zego is a career-defining move. People go further here …