/Integration - Understanding of the application middleware space. Technical knowledge of messaging products such as IBM MQ, TIBCO EMS/RV and/or open-source messaging such as Kafka, RabbitMQ, ActiveMQ, etc. would be beneficial to the role. Enterprise Integration Patterns and solution architecture experience would be an asset. Knowledge of digital patterns, application lifecycle management, continuous integration
or non-relational databases, preferably PostgreSQL, DynamoDB, AWS Athena. Nice to haves: experience with eCommerce; experience with Docker and Kubernetes; experience with event-driven architectures, preferably using RabbitMQ or Kafka; experience using production AWS infrastructure, ideally with Terraform. Additional Information: PMI and cash plan healthcare access with Bupa; subsidised counselling and coaching with Self Space; Cycle to Work
hyperparameter tuning, and model versioning. Strong social media data extraction and scraping skills at scale (Twitter v2, Reddit, Discord, Telegram, Scrapy, Playwright). Experience with real-time streaming systems (Kafka, RabbitMQ) and ingesting high-velocity data. Deep data-engineering expertise across Postgres, Redis, InfluxDB, and ClickHouse: schema design, indexing, and caching for sub-second reads. Experience deploying microservices in production
onboarding to get up to speed in the below: Go to write our application code (there's an excellent interactive Go tutorial here); Cassandra for most persistent data storage; Kafka for our asynchronous message queue; Kubernetes and Docker to schedule and run our services; Envoy Proxy for RPC; AWS for most of our production infrastructure and GCP for most
Create and maintain Forms, Reports, Views, Workflows, Groups and Roles. Create, maintain and enhance Dashboards and reporting, including scheduled reports. Create and configure tool integrations with ServiceNow (Elastic, Netcool, Kafka, etc.). Coordinate and support application and platform upgrades. Assist with design, creation and cataloging of business process flows. Work with other members of the ServiceNow development team to propose
Proficiency in Unix systems, ideally Linux (Ubuntu). Strong communication skills and experience mentoring engineers. Responsibilities: Your main responsibilities will involve working on: data pipelines and underlying infrastructure, especially Apache Kafka and the PostgreSQL database; technologies like Kubernetes, Vault, Go microservices, and other cloud services; on-premise infrastructure. Your technical expertise will play a crucial role in these areas.
training opportunities and social benefits (e.g. UK pension scheme). What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch. Proficiency in cloud-native technologies such as containerization and Kubernetes. Strong knowledge of DevOps tools (Terraform, Ansible, ArgoCD, GitOps, etc.)
and sustainability, delivering under tight deadlines without compromising quality. Your Qualifications: 12+ years of software engineering experience, ideally in platform, infrastructure, or data-centric product development. Expertise in Apache Kafka, Apache Flink, and/or Apache Pulsar. Deep understanding of event-driven architectures, data lakes, and streaming pipelines. Strong experience integrating AI/ML models into production systems, including
Experience building and maintaining data pipelines in a live environment. Data Storage Solutions: Familiarity with data lakes, warehousing, and other data storage patterns. Advanced Tools: Experience with tools like Kafka, Jenkins, Athena, or Spark. This role will require 4 days per week onsite in the office in Camden; this is not optional and you must be open to this
Programming Languages: Python, Java, or Go. • Data Engineering Tools: Apache Kafka, Airflow (for orchestration), Spark (if needed for larger datasets). • OpenSearch/Elasticsearch: Indexing, querying, and optimizing. • Visualization Tools: Kibana, Grafana (for more advanced visualizations), React.js. • Cloud: AWS (Elasticsearch Service) or Azure (if using cloud infrastructure). Desired: • 15+ years of experience working in data engineering or software
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
and advocate for scalable, efficient data processes and platform enhancements. Tech Environment: Python, SQL, Spark, Airflow, dbt, Snowflake, Postgres, AWS (S3), Docker, Terraform. Exposure to Apache Iceberg, streaming tools (Kafka, Kinesis), and ML pipelines is a bonus. What We're Looking For: 5+ years in Data Engineering, including 2+ years in a leadership or management role. Experience designing and
Data Science, Engineering, or a related field. Strong programming skills in languages such as Python, SQL, or Java. Familiarity with data processing frameworks and tools (e.g., Apache Spark, Hadoop, Kafka) is a plus. Basic understanding of cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services. Knowledge of database systems (e.g., MySQL, PostgreSQL, MongoDB) and data warehousing concepts.
Comfortable balancing technical excellence with business priorities and constraints Nice to have Experience building Data Mesh or Data Lake architectures. Familiarity with Kubernetes, Docker, and real-time streaming (e.g. Kafka, Kinesis). Exposure to ML engineering pipelines or MLOps frameworks. What's it like to work at Zego? Joining Zego is a career-defining move. People go further here
other testing methodologies. Preferred: Familiarity with PostgreSQL and Snowflake. Preferred: Familiarity with web frameworks such as Django, Flask or FastAPI. Preferred: Familiarity with event streaming platforms such as Apache Kafka. Preferred: Familiarity with data pipeline platforms such as Apache Airflow. Preferred: Familiarity with Java. Preferred: Experience in one or more relevant financial areas (market data, order management, algorithmic trading
West London, London, England, United Kingdom Hybrid / WFH Options
Young's Employment Services Ltd
Data Engineer, Tech Lead, Data Engineering Manager etc. Proven success with modern data infrastructure: distributed systems, batch and streaming pipelines. Hands-on knowledge of tools such as Apache Spark, Kafka, Databricks, dbt or similar. Experience building, defining, and owning data models, data lakes, and data warehouses. Programming proficiency in Python, PySpark, Scala or Java. Experience operating in a cloud
or Java. Client-side web apps are written in React, and some services in Clojure, Java and Go. Our platform consists of: multiple Kubernetes clusters for container orchestration; Apache Kafka (and shortly Redis) for event messaging; Postgres for data storage; OpenStack Swift for object storage; Juniper & Cisco networking devices; a number of internally written tools for managing the
and techniques of computer science, engineering, and mathematical analysis to the development of complex architectures. We encourage you to apply if you have experience with any of the following: Kafka (required), Linux (required), Java (required), React (required), Docker/Kubernetes (required), DevSecOps (preferred), Cloud Services (preferably AWS), MPLS/Networking (preferred), GitLab/Jenkins (preferred), Jira (preferred). Required Education
charts • Deploying and working with Argo CD and Argo Workflows • Creating GitLab CI jobs • Familiar with S3/MinIO and related APIs • Publisher and subscriber queues like RabbitMQ or Kafka. We have deep experience in signal processing application and common services development for the National agencies of the Intelligence Community (IC) and the Department of Defense. We develop and
UI) Experience testing Windows applications, Oracle, and SQL Server databases. Understanding of Agile (Scrum/Kanban) and experience transitioning from Waterfall. Familiarity with Jira, Zephyr, and Confluence. Exposure to Kafka, Azure, Jenkins, and Java. What We're Looking For: 7+ years' experience in technology-focused testing roles. Proven ability to test complex, high-transaction user systems. Strong knowledge of
flight dynamics models. Experience developing software using C++ and Python. Experience working with a large-scale legacy software system. Experience with tools such as Confluence, Eclipse, Jira, Jenkins, JUnit, Kafka, and Spring Boot. Responsibilities: Design, develop, test, deliver, and maintain software for satellite ground systems. Modernize software systems and upgrade commercial off-the-shelf (COTS) and free and open
preventive actions. Maintain service dashboards, alerts, and incident tooling (e.g., PagerDuty, Datadog). Technical Expertise required for this engagement: Guide operational practices across services built using Java (Spring Boot), Kafka, MongoDB and related technologies. Oversee monitoring, observability, and performance tuning using Datadog, ELK, Prometheus, or similar tooling. Problem Management & Root Cause Elimination required: Lead proactive and reactive problem management
data-driven decision-making. What You'll Bring: Proven experience in cloud-based data engineering (Azure, Databricks). Strong Python/PySpark skills and DevOps automation experience. Familiarity with Kubernetes, Kafka/Event Hubs, and AI/ML platform integration. Certifications in Azure, Databricks, or data governance tools (a plus). A collaborative mindset and a passion for continuous improvement. What
or ideally commodities would be of particular interest. The ideal Senior .NET Developer will have: Minimum 5+ years C# and .NET development experience. Experience with enterprise messaging tools, e.g. Kafka, Azure Service Bus, etc. Experience working within a trading environment (energy or commodities preferred). Strong experience with distributed architecture and modern CI/CD practices (Docker, Kubernetes). Ability to