Server, Cosmos DB, Service Bus, Blob Storage. Message bus/queue OIDC, OAuth 2.0, JWTs. Preferred Technical Skills: Systems integration experience. Knowledge of Kubernetes, Kafka, Terraform, GitHub Actions, OpenTelemetry (OTel), NumPy, and Pandas. Required Soft Skills: Experience leading an agile software development team (preferably in a Scrum environment More ❯
Frontend development experience with HTML, CSS, TypeScript, Angular CI/CD: Docker, Kubernetes, GitLab or GitHub Database: PostgreSQL or any RDBMS experience Message Broker: Kafka or any message broker and any event-driven architecture experience Trimble's Inclusiveness Commitment We believe in celebrating our differences. That is why our More ❯
RDBMS, ideally PostgreSQL. Strong distributed systems and queuing experience: You have hands-on experience with distributed systems and queuing technologies such as SQS+SNS, Kafka or equivalent. Terraform Familiarity: You're comfortable using Terraform to manage infrastructure as code. Kubernetes Familiarity: You understand the basics of Kubernetes and have More ❯
a globally-distributed team A background in some of the following a bonus: Java experience Python experience Ruby experience Big data technologies: Spark, Trino, Kafka Financial Markets experience SQL: Postgres, Oracle Cloud-native deployments: AWS, Docker, Kubernetes Observability: Splunk, Prometheus, Grafana For more information about DRW's processing activities More ❯
Systems and Virtualisation (Windows and Linux). Infrastructure as Code and Operational Automation (e.g. Terraform, Ansible). Message Queueing and Streaming Fabrics (e.g. AMQP, Kafka, Kinesis). Docker and Kubernetes. Scripting (Shell and PowerShell). Basic Coding with a bias for Infrastructure (Python, Go, C#). IAM Policy and More ❯
Uxbridge, Middlesex, United Kingdom Hybrid / WFH Options
Avature
high standard and quality of products we create. Client-Side: Typescript, Next.js, React and various React ecosystem tools and libraries Infrastructure: AWS, Kubernetes, Terraform, Kafka, DynamoDB, PostgreSQL, Redis, ElasticSearch, Kibana, Grafana, and Prometheus. However, you should be comfortable using a variety of frameworks, languages, and tools and be happy More ❯
experience on RESTful APIs - interconnected software components interaction, engineering and testing (e.g. NMS applications, controllers, orchestrators, supervisory systems, etc.). Experience and understanding of Kafka messaging bus. Experience in using monitoring tools like Nagios, Grafana, Prometheus and Kibana is desired. Deployment environment: Kubernetes, Docker, microservices. Experience on Talos Kubernetes More ❯
in environments with AI/ML components or interest in learning data workflows for ML applications. Bonus if you have exposure to Kafka, Spark, or Flink. Experience with data compliance regulations (GDPR). What you can expect from us: Opportunity for annual bonuses Medical Insurance Cycle More ❯
business intelligence reports Experience of setting up and administering analytics services, such as Power BI or Pentaho BA Experience with streaming services, such as Kafka What we do for you: At Leidos we are PASSIONATE about customer success, UNITED as a team and INSPIRED to make a difference. We More ❯
looks like: Python for our application code, APIs and SDK SQL and NoSQL databases: PostgreSQL, Couchbase, DynamoDB Event-driven architecture, employing technologies such as Kafka, gRPC and Protobuf for event definitions Solver technology and algorithms to drive our workflow scheduling solver engine Deployment on AWS to IoT Greengrass, ECS More ❯
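The event-driven architecture mentioned above (Kafka topics carrying events defined in Protobuf) can be sketched as a minimal in-process publish/subscribe dispatcher. This is an illustration of the pattern only, not the team's actual stack; the event name and handler below are hypothetical:

```python
import collections
from typing import Any, Callable


class EventBus:
    """Minimal in-process event bus: handlers subscribe by event type,
    and publish() fans each event out to every registered handler."""

    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable[[Any], None]]] = collections.defaultdict(list)

    def subscribe(self, event_type: str, handler: Callable[[Any], None]) -> None:
        self._handlers[event_type].append(handler)

    def publish(self, event_type: str, payload: Any) -> None:
        # In a real Kafka deployment this would be a produce() to a topic,
        # with consumers on the other side; here it is a direct fan-out.
        for handler in self._handlers[event_type]:
            handler(payload)
```

In a production system the bus is replaced by Kafka itself: the payload would be a Protobuf-serialized event and each handler a consumer group, but the subscribe/publish shape is the same.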
Java or Scala. - Solid understanding of data architecture, data modeling, and designing large-scale data pipelines. - Experience with big data technologies (e.g., Hadoop, Spark, Kafka) and data storage solutions (e.g., Redshift, Snowflake, S3). - Ability to manage and prioritize multiple projects and deliver them on time. - Strong communication skills More ❯
libraries we use extensively. We implement the systems that require the highest data throughput in Java and C++. We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log shipping and monitoring, Docker More ❯
We are seeking an experienced Kafka Real-Time Architect to design and implement scalable, high-performance real-time data processing systems leveraging Apache Kafka. In this role, you will be responsible for architecting and managing Kafka clusters, ensuring system scalability and availability, and integrating Kafka with various … to addressing business data needs and ensuring optimal system performance. Key Responsibilities: Design & Architecture: Architect and design scalable, real-time streaming systems using Apache Kafka, ensuring they are robust, highly available, and meet business requirements for data ingestion, processing, and real-time analytics. Kafka Cluster Management: Configure, deploy … Kafka usage, providing insights into new tools, technologies, and practices. Staying Current with Industry Trends: Keep up to date with the latest Apache Kafka features, updates, and emerging industry best practices. Participate in Kafka-related forums, conferences, and workshops to maintain a cutting-edge understanding of the More ❯
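Key-based partitioning is central to the cluster design work this role describes: records with the same key must always land on the same partition so per-key ordering is preserved. Kafka's default partitioner hashes the key with murmur2; the sketch below uses MD5 purely as a deterministic stand-in hash to illustrate the idea:

```python
import hashlib


def partition_for(key: bytes, num_partitions: int) -> int:
    """Map a record key to a partition deterministically.

    Same key -> same partition, which is what preserves per-key ordering.
    MD5 here is an illustrative stand-in; Kafka's default partitioner
    actually uses murmur2 over the key bytes.
    """
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions
```

A consequence worth noting for capacity planning: because the mapping depends on `num_partitions`, increasing the partition count reshuffles keys, which is why partition counts are usually sized up front rather than grown casually.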
Skills Required Knowledge of pipelines or real-time streaming data (e.g., Kafka, Kinesis). Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes). Experience with infrastructure-as-code tools (e.g., Terraform, CloudFormation). Experience with development, ETL tools, and frameworks (e.g., Apache Airflow, Talend, Fivetran). Proficiency in More ❯
Experience working in government, defence, or highly regulated industries with knowledge of relevant standards. Experience with additional data processing and ETL tools like ApacheKafka, Spark, or Hadoop. Familiarity with containerization and orchestration tools such as Docker and Kubernetes. Experience with monitoring and alerting tools such as Prometheus, Grafana More ❯
. • In-depth knowledge of data warehousing concepts and tools (e.g., Redshift, Snowflake, Google BigQuery). • Experience with big data platforms (e.g., Hadoop, Spark, Kafka). • Familiarity with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud). • Expertise in ETL tools and processes (e.g., Apache NiFi More ❯
J2EE, Spring Boot experience - Candidate must have good exposure to Microservice-based architecture - Experience with Oracle database and SQL knowledge - Preferable to Have - exposure to Kafka/Hadoop/React JS/ElasticSearch/Spark - Good to Have - exposure to Fabric/Kubernetes/Docker/Helm More ❯
SQL, and one or more: R, Java, Scala Experience with relational/NoSQL databases (e.g., PostgreSQL, MongoDB) Familiarity with big data tools (Hadoop, Spark, Kafka), cloud platforms (Azure, AWS, GCP), and workflow tools (Airflow, Luigi) Bonus: experience with BI tools, API integrations, and graph databases Apply now and help More ❯
C# - .NET Core Golang Python and R with Jupyter and Azure DataBricks Postgres (and Timescale), Redis, document and column-based storage engines RabbitMQ and Kafka-style commit logs Dapr React, Redux, React-Router, Styled-Components, Express, TRPC GraphQL, MQTT However, the ideal candidate will have experience in Golang, Azure More ❯
Employment Type: Permanent
Salary: £75000 - £100000/annum Up to £100k basic + excellent benef
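The "Kafka-style commit log" mentioned in the listing above can be sketched as a minimal append-only log where each record gets a monotonically increasing offset and consumers read from an offset they track themselves. This is a simplified illustration of the concept, not Kafka's actual implementation:

```python
class CommitLog:
    """Minimal append-only commit log: records are never mutated or
    deleted, and each append returns the record's offset."""

    def __init__(self) -> None:
        self._records: list[bytes] = []

    def append(self, record: bytes) -> int:
        self._records.append(record)
        return len(self._records) - 1  # offset of the new record

    def read(self, offset: int, max_records: int = 10) -> list[bytes]:
        # Consumers poll from their own saved offset; the log itself
        # keeps no per-consumer state, just like a Kafka partition.
        return self._records[offset:offset + max_records]
```

The key property is that consumers, not the log, own their read position: two independent consumers can replay the same records at different speeds, which is what makes the pattern useful for event-driven systems.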
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Exalto Consulting ltd
pipelines for enterprise data and one other business domain. Building and orchestrating data pipelines and analytical processing for streaming data with technologies such as Kafka, AWS Kinesis or Azure Stream Analytics. Knowledge of data governance, privacy regulations (e.g. GDPR), and security best practices. Delivering data engineering and designing for More ❯
TCP, DNS, TLS. Familiarity with IaC (Terraform), CI/CD (GitHub Actions, Jenkins), and Agile workflows. Experience with containerisation (Docker, Kubernetes) and stream processing (Kafka a plus). Benefits Annual Bonus (10%) 30 days holiday + bank holidays Private Healthcare Life Assurance If this is of interest, please feel More ❯