experience with deployment, configuration, and troubleshooting in live production systems. Experience with Messaging Systems: You have experience with distributed systems that use some form of messaging system (e.g. RabbitMQ, Kafka, Pulsar). The role focuses on RabbitMQ, and you will have time to acquire deep knowledge of it. Programming Proficiency: You have some proficiency in at least …
hyperparameter tuning, and model versioning. Strong social media data extraction and scraping skills at scale (Twitter v2, Reddit, Discord, Telegram, Scrapy, Playwright). Experience with real-time streaming systems (Kafka, RabbitMQ) and ingesting high-velocity data. Deep data-engineering expertise across Postgres, Redis, InfluxDB, and ClickHouse: schema design, indexing, and caching for sub-second reads. Experience deploying microservices in …
understanding of relational databases (e.g., PostgreSQL). Bonus: Advanced LookML knowledge and experience building data visualisation tools. Skilled in building and managing real-time and batch data pipelines using Kafka and DBT. Familiarity with Docker, Terraform, and Kubernetes for application orchestration and deployment. A strong numerical or technical background, ideally with a degree in mathematics, physics, computer science, engineering …
strengthen an application: Passion for transportation or sustainable technologies Deeper experience with parts of our stack, e.g. Go, TypeScript, React Terraform or other Infrastructure as Code tooling Exposure to Kafka, event-driven architectures, or message queues Familiarity with HashiCorp Vault or other secrets management tooling Deeper knowledge of CI/CD pipelines Experience in a start-up or scale …
Spark and Databricks AWS services (e.g. IAM, S3, Redis, ECS) Shell scripting and related developer tooling CI/CD tools and best practices Streaming and batch data systems (e.g. Kafka, Airflow, RabbitMQ) Additional Information Health + Mental Wellbeing PMI and cash plan healthcare access with Bupa Subsidised counselling and coaching with Self Space Cycle to Work scheme with options …
cloud computing. • You have additional nice-to-have experience in the following areas: database engines (Microsoft SQL Server, Aerospike, Vertica, Redis), building micro-services, operating systems and cloud, Kubernetes, Kafka, EMR, Spark. A variety of technical opportunities is one of the best things about working at The Trade Desk as a software engineer, which is why we do not …
Testing performance with JMeter or similar tools Web services technology such as REST, JSON or Thrift Testing web applications with Selenium WebDriver Big data technology such as Hadoop, MongoDB, Kafka or SQL Network principles and protocols such as HTTP, TLS and TCP Continuous integration systems such as Jenkins or Bamboo What you can expect: At Global Relay, there's …
Create and maintain Forms, Reports, Views, Workflows, Groups and Roles Create, maintain and enhance Dashboards and reporting, including scheduled reports Create and configure tool integrations with ServiceNow (Elastic, Netcool, Kafka, etc.) Coordinate and support application and platform upgrades Assist with design, creation and cataloguing of business process flows Work with other members of the ServiceNow development team to propose …
to ensure our systems are trusted, reliable and available. The technology underpinning these capabilities includes industry-leading data and analytics products such as Snowflake, Tableau, DBT, Talend, Collibra, Kafka/Confluent, Astronomer/Airflow, and Kubernetes. This forms part of a longer-term strategic direction to implement Data Mesh, and with it establish shared platforms that …
Norwich, Norfolk, United Kingdom Hybrid / WFH Options
Cooper Lomaz Recruitment Ltd
write clean, maintainable code The ability to work independently as well as in a team, remotely and in person Desirable Skills: REST (inc. authentication with OAuth, JWT, etc.) Apache Kafka GraphQL Docker Scala Company Benefits: Hybrid working pattern Pension Bonus Discount on products If you wish to work in a business that has strong values, offers passion for progress …
Norwich, Norfolk, United Kingdom Hybrid / WFH Options
Cooper Lomaz
clean, maintainable code The ability to work independently as well as in a team, remotely and in person Nice to have: REST (inc. authentication with OAuth, JWT, etc.) Apache Kafka GraphQL Docker Scala What the business can offer you: Hybrid working pattern Pension Bonus Discount on products If you wish to work in a business that has strong …
Glasgow, Lanarkshire, Scotland, United Kingdom Hybrid / WFH Options
Henderson Scott
a fast-paced environment. Tech Stack: You'll need extensive experience in complex software development, with a strong background in Java and Oracle . The platform also uses: Apache Kafka & Ignite Spring Kubernetes If you have a strong technical background and are looking for a significant role in a high-profile greenfield project, please get in touch for more …
management workflows across diverse data sources 4+ years of hands-on experience with Python and SQL/NoSQL databases, experience with technologies such as Amazon S3, AWS Lambda, Apache Kafka, and Apache Airflow Proven ability to mentor team members, clearly communicate complex concepts and methodologies, and effectively collaborate across diverse and distributed teams Leadership experience, either as a formal …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
developing talent and fostering a culture of learning and ownership Nice to Have Experience with Data Mesh or Data Lake architectures Familiarity with Kubernetes , Docker, and real-time streaming (Kafka, Kinesis) Exposure to ML engineering pipelines or MLOps Interested? If you're ready to lead a team at the forefront of data innovation in a mission-driven scale-up …
West London, London, England, United Kingdom Hybrid / WFH Options
Young's Employment Services Ltd
Data Engineer, Tech Lead, Data Engineering Manager etc. Proven success with modern data infrastructure: distributed systems, batch and streaming pipelines Hands-on knowledge of tools such as Apache Spark, Kafka, Databricks, DBT or similar Experience building, defining, and owning data models, data lakes, and data warehouses Programming proficiency in Python, PySpark, Scala or Java. Experience operating in a cloud …
and modular code suitable for production environments. Desirable Skills or Knowledge Familiarity with AMO physics or quantum machine learning. Experience with MLOps best practices. Knowledge of systems like Apache Kafka, MQTT or real-time data pipelines. Experience with cloud platforms (AWS, Azure, GCP). Deep expertise in time series modelling techniques (e.g., ARIMA, VAR, Prophet, LSTM). Solid grasp …
technology and product evolution We'd love to see: Experience with high volume, high availability distributed systems Good working knowledge of databases and messaging queues, preferably PostgreSQL and Apache Kafka Familiarity with JavaScript/TypeScript Bloomberg is an equal opportunity employer and we value diversity at our company. We do not discriminate on the basis of age, ancestry, color …
/CD pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform) Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new …
with the ability to clearly articulate technical concepts to both technical and non-technical stakeholders. An interest (or experience) in any Distributed Systems or Event-Driven Architectures (using Apache Kafka). Proven track record of designing and implementing software architecture for complex applications, including considerations for scalability, maintainability, and security. Familiarity with CI/CD pipelines and deployment strategies …
development - At least 6+ years of experience in Java - Deep knowledge and usage of SQL (Oracle or PostgreSQL dialects preferable) Nice to have: - Experience with Messaging Systems: RabbitMQ, Kafka, etc. - Experience with Kubernetes and Docker - Linux user …
governance and standards. Planning and risk management experience. Ability to understand, engage and work with all team capability areas, particularly with architects. Experience with Java, React, SQL, AWS cloud, Kafka, microservices, headless services & architecture. If you find this opportunity intriguing and aligned with your skill set, we welcome the submission of your CV without delay. …
Kubernetes. Strong ownership, accountability, and communication skills. Bonus Points For: Experience leading projects in SCIF environments. Expertise in Cyber Analytics, PCAP, or network monitoring. Familiarity with Spark, Dask, Snowpark, Kafka, or task schedulers like Airflow and Celery. More ❯