Cheltenham, Gloucestershire, United Kingdom Hybrid / WFH Options
Ripjar Ltd
Ripjar specialises in the development of software and data products that help governments and organisations combat serious financial crime. Our technology is used to identify criminal activity such as money laundering and terrorist financing, and enables organisations to enforce sanctions…
…in PowerShell, Bash, Python. Strong knowledge of computer networking and security concepts and troubleshooting. Experience with cloud technologies, e.g. AWS, Kubernetes, Lambda, RDS, DynamoDB. Experience with messaging technologies, e.g. Kafka, Solace, MQ. Experience leveraging AI to increase efficiency and troubleshoot issues. Familiarity with enterprise scheduling platforms, e.g. ActiveBatch, Autosys, Control-M. Experience using/building monitoring on platforms, e.g. …
Liverpool, England, United Kingdom Hybrid / WFH Options
BJSS
…organisations. You can expect to get involved in a variety of projects in the cloud (AWS, Azure, GCP), learning about and using data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3, Cloud Data Fusion, etc. About You: You're an engineer at heart and enjoy the challenge of building reliable, efficient data applications, systems, services…
Bristol, England, United Kingdom Hybrid / WFH Options
Ripjar
Ripjar specialises in the development of software and data products that help governments and organisations combat serious financial crime. Our technology is used to identify criminal activity such as money laundering and terrorist financing, and enables organisations to enforce sanctions…
Sheffield, England, United Kingdom Hybrid / WFH Options
BJSS
…organisations. You can expect to get involved in a variety of projects in the cloud (AWS, Azure, GCP), learning about and using data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3, Cloud Data Fusion, etc. About You: You're an engineer at heart and enjoy the challenge of building reliable, efficient data applications, systems, services…
…education or several years of software engineering experience (development). Experience in designing data models. SQL knowledge. Experience with software testing (preferably Cucumber). Experience with application integration (API, ActiveMQ, Kafka, data lake, etc.). Experience with object-oriented languages (preferably Java and C#). Experience with cloud services and CI/CD (preferably GitHub and OpenShift). Experience with scripting (PL…
Confidence working with third-party vendors and managing tech partnerships. Nice to have: AWS Solution Architect certification. Docker, Kubernetes, or container orchestration tools. Observability tools (e.g. New Relic) experience. Kafka, Flink, or IoT streaming tech exposure. Background in financial services or regulated environments. What’s in it for you? Competitive salary: Up to £100,000 per annum, dependent on…
…modern data architecture patterns. Familiarity with AI/ML workflows and their data requirements. Experience with API specifications and data integrations across ETL and streaming services (Glue, MSK, Kinesis, Kafka). Familiarity with AI developer tools like Cursor and GitHub Copilot, and a desire to use them to 10x your throughput. Collaborative approach and excellent communication skills. Strong problem-solving abilities…
…with cron jobs, job orchestration, and error monitoring tools. Good to have: Experience with Azure Bicep or other Infrastructure-as-Code tools. Exposure to real-time/streaming data (Kafka, Spark Streaming, etc.). Understanding of data mesh, data contracts, or domain-driven data architecture. Hands-on experience with MLflow and Llama…
Experience with observability platforms (OpenTelemetry) and distributed systems. Nice-to-have skills: Python programming and Linux system debugging; database administration (SQL, MongoDB, Redis); message broker and event streaming experience (Kafka); database performance optimisation skills. At Busuu we want to ensure that you have access to some great benefits: our centrally located offices are well-equipped with free breakfast, plenty…
…and MLflow for managing and tracking code deployment or model versions. Experience with cloud-based data platforms such as AWS or Google Cloud Platform. Nice to have: Experience with Kafka. Proven track record of leading large-scale data infrastructure in production. Experience with container technologies (such as Docker) and orchestration technologies (such as Kubernetes). Experience working in ecommerce or…
…network protocols and architectures (IS-IS, BGP, BMP, ARP, SNMP, CDP/LLDP) and network engineering, management, and operations. Experience with search and analytics engines/big data tools (OpenSearch, Kafka, Kibana, Telegraf, InfluxDB, Prometheus). Our preferred qualifications for this role: Basic understanding of AI and ML algorithms, including model training, testing, and deployment. Hands-on project experience in…
…Bash), VCS (Git), and Linux. Proven experience in cross-functional teams and the ability to communicate effectively about technical and operational challenges. Preferred Qualifications: Proficiency with scalable data frameworks (Spark, Kafka, Flink). Proven expertise with Infrastructure as Code and cloud best practices. Proficiency with monitoring and logging tools (e.g., Prometheus, Grafana). Working at Lila Sciences, you would have access to…
…in setting up and managing monitoring, metrics, and alerting systems. Experience operating production-grade services at scale. Great to have: Experience with tools such as Terraform, SaltStack, MongoDB, Elasticsearch, Kafka, Prometheus, Grafana, or HashiCorp Vault. Experience with securing applications, services, and data, including authentication, authorization, TLS, and encryption. Exposure to Kubernetes (administering, deploying, or developing apps on K8s clusters…
London, England, United Kingdom Hybrid / WFH Options
Bloomberg
…side code adheres to the latest best practices: utilizing Python >3.10, extensive test coverage, local development with Docker, and automated packaging and deployment, alongside leveraging open-source technologies like Kafka, RabbitMQ, Redis, Cassandra and ZooKeeper. By joining our team, you’ll have the opportunity to work on a modern tech stack that blends infrastructure (~80%) and application development…
…Ruby, and Python. Experience of relational, NoSQL, and columnar databases; we have PostgreSQL, Scylla, Mongo, Redis, ClickHouse, and more. Comfortable with Linux server management. Practical experience with Kubernetes (K8s), Kafka, Infrastructure as Code (IaC), and major cloud providers (AWS/GCP). Understanding of front-end frameworks (e.g., React) and web crawling/scraping methodologies is beneficial. Corsearch is…
Birmingham, England, United Kingdom Hybrid / WFH Options
QAD Inc
Uses database migration patterns such as “expand and contract” using go-migrate. Writing observable and testable code using libraries such as testify and mockgen. Publishing and consuming Avro-formatted Kafka messages. CI/CD with GitHub Actions. Trunk-based development and continuous delivery. Good collaboration skills at all levels with cross-functional teams. Highly developed ownership and creative thinking. Analytical thinking…
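The listing above calls out publishing and consuming Avro-formatted Kafka messages from Go services. As a rough sketch of what that involves, the snippet below encodes a record with the linkedin/goavro codec and publishes it with the segmentio/kafka-go writer; the listing does not name specific libraries, so treat those choices, along with the schema, topic name, and broker address, as illustrative assumptions.

```go
package main

import (
	"context"
	"log"

	"github.com/linkedin/goavro/v2"
	"github.com/segmentio/kafka-go"
)

// Hypothetical Avro schema; a real service would typically load this
// from a schema registry or a file rather than a string literal.
const userSchema = `{
  "type": "record",
  "name": "User",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "email", "type": "string"}
  ]
}`

func main() {
	codec, err := goavro.NewCodec(userSchema)
	if err != nil {
		log.Fatalf("parse schema: %v", err)
	}

	// Encode a native Go map into Avro binary form.
	payload, err := codec.BinaryFromNative(nil, map[string]interface{}{
		"id":    "42",
		"email": "jane@example.com",
	})
	if err != nil {
		log.Fatalf("encode: %v", err)
	}

	// Publish the encoded record; broker address and topic are placeholders.
	writer := &kafka.Writer{
		Addr:  kafka.TCP("localhost:9092"),
		Topic: "users",
	}
	defer writer.Close()

	if err := writer.WriteMessages(context.Background(), kafka.Message{
		Key:   []byte("42"),
		Value: payload,
	}); err != nil {
		log.Fatalf("publish: %v", err)
	}
}
```

On the consuming side, the same codec's NativeFromBinary call would decode the message value back into a Go map before processing.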
…modern data architecture patterns. Familiarity with AI/ML workflows and their data requirements. Experience with API specifications and data integrations across ETL and streaming services (Glue, MSK, Kinesis, Kafka). Familiarity with AI developer tools like Cursor and GitHub Copilot, and a desire to use them to 10x your throughput. Collaborative approach and excellent communication skills. Strong problem-solving abilities…
…professional experience in data engineering roles, preferably for a customer-facing data product. Expertise in designing and implementing large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar. Strong programming skills in languages such as SQL, Python, Go, or Scala. Demonstrable use and understanding of effective use of AI tooling…
Uses database migration patterns such as "expand and contract" using go-migrate. Writing observable and testable code using libraries such as testify and mockgen. Publishing and consuming Avro-formatted Kafka messages. CI/CD with GitHub Actions. Trunk-based development and continuous delivery. Soft skills: Good collaboration skills at all levels with cross-functional teams. Highly developed ownership and creative thinking…
London, England, United Kingdom Hybrid / WFH Options
Prima
…an Agile environment. Nice-to-have: Good knowledge of functional programming languages. Professional experience with at least one of Rust or Elixir. Knowledge of TDD. Knowledge of RabbitMQ/Kafka. Why you’ll love it here: We want to make Prima a happy and empowering place to work. So if you decide to join us, you can expect plenty…
…specifically AWS - enterprise software implementations. Experience with Agile software development. Experience with big data applications. Experience or interest in building AI capabilities for operations. Experience in data technologies like Kafka, Elastic, Spark, NiFi. Familiarity with Kubernetes deployment, Agile methodologies, and tools. Experience with IBM Netcool products. Experience with IBM watsonx products. Experience with LLM products and capabilities. Experience…
London, England, United Kingdom Hybrid / WFH Options
ZILO™
…days each week. Key Responsibilities: Design and develop highly scalable and reliable services in Go. Collaborate with cross-functional teams to design, develop, and test software solutions. Kafka integration and implementation with Go services. Leverage the corporate AI assistant and other strategic coding tools to enhance development workflows. Actively use AI tools to support code generation, debugging, documentation…
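The listing above mentions Kafka integration with Go services. A minimal consumer sketch using the segmentio/kafka-go client is shown below; the broker address, consumer group, and topic are illustrative assumptions rather than details from the listing, and a production service would add graceful shutdown and structured error handling.

```go
package main

import (
	"context"
	"log"

	"github.com/segmentio/kafka-go"
)

func main() {
	// Consumer-group reader; broker, group ID, and topic are placeholder values.
	reader := kafka.NewReader(kafka.ReaderConfig{
		Brokers: []string{"localhost:9092"},
		GroupID: "example-service",
		Topic:   "events",
	})
	defer reader.Close()

	for {
		// ReadMessage blocks until a message arrives and commits offsets
		// automatically when a GroupID is set.
		msg, err := reader.ReadMessage(context.Background())
		if err != nil {
			log.Printf("read: %v", err)
			return
		}
		log.Printf("partition %d offset %d: %s", msg.Partition, msg.Offset, string(msg.Value))
	}
}
```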
…experience of SDLC methodologies, e.g. Agile, Waterfall. Skilled in business requirements analysis with the ability to translate business information into technical specifications. Skills Required (desirable): Knowledge of streaming services (Flink, Kafka). Knowledge of dimensional modelling. Knowledge of NoSQL databases (DynamoDB, Cassandra). Knowledge of node-based architecture, graph databases, and languages (Neptune, Neo4j, Gremlin, Cypher). Experience: 5+ years of…