… React, Angular, or Vue.js for full-stack development is a plus. Event-Driven Architecture: Experience with event-driven architectures or message queuing systems (e.g., Kafka, RabbitMQ) is beneficial. Education: A degree in Computer Science, Engineering, or a related field is preferred but not required. …
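As a concrete taste of the message-queuing requirement above, here is a minimal RabbitMQ publish sketch using the `pika` client; the broker address, queue name, and payload are placeholders, not taken from any listing.

```python
# Minimal RabbitMQ publish sketch with pika; broker address, queue
# name, and payload are placeholders.
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()

channel.queue_declare(queue="events")  # idempotent: creates the queue if absent
channel.basic_publish(exchange="", routing_key="events", body=b"order created")

connection.close()
```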
… engineering pipelines using ETL/ELT tools (Apache Airflow, Azure Data Factory), managing data warehousing solutions (Snowflake, BigQuery, Azure Synapse), and handling streaming platforms (Kafka, Event Hubs). Security and Compliance: Proven ability to embed robust security practices (e.g., Azure AD, OAuth, JWT) and maintain compliance with data privacy …
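To make the orchestration requirement concrete: in tools like Apache Airflow, pipelines are defined as code. Below is a minimal sketch of a DAG using the TaskFlow API, assuming a recent Airflow 2.x release; the DAG name, tasks, and data are hypothetical.

```python
# Minimal Airflow 2.x DAG sketch (TaskFlow API); names, schedule, and
# data are illustrative, not from any listing.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_etl():
    @task
    def extract():
        # Placeholder for pulling rows from a source system.
        return [{"order_id": 1, "amount": 42.0}]

    @task
    def load(rows):
        # Placeholder for writing to a warehouse such as Snowflake.
        print(f"loading {len(rows)} rows")

    load(extract())


orders_etl()
```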
… building RESTful APIs/WebSockets. Proficient in Scala and its ecosystem (e.g., Akka, Play Framework, SBT). Experience working with distributed messaging systems such as Kafka, ActiveMQ, RabbitMQ, etc. Experience with microservices architecture. Containerisation technologies (e.g., Docker, Kubernetes). Strong understanding of software design patterns, data structures, and algorithms. Experience with …
… the highest data throughput in Java. We implement most of our long-running services and analytics in C#. We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, ELK for logs, Grafana, Prometheus & InfluxDB for metrics, Docker and Kubernetes for containerisation …
… to know everything, but here are some other core technologies and what our environment looks like: Event-driven architecture, employing technologies such as NATS, Kafka, gRPC, and Protobuf for event definitions. SQL and NoSQL databases: PostgreSQL, Couchbase, DynamoDB. Observability platforms: Datadog, Grafana, Prometheus. Feature flags to enable us to …
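Several of these listings describe the same event-driven pattern: services publish serialized events to a broker rather than calling each other directly. As a minimal sketch, a Kafka publish using the `kafka-python` client; the broker address, topic, and JSON payload are placeholders, and teams using Protobuf for event definitions would swap in a Protobuf serializer.

```python
# Minimal event publish sketch with kafka-python; broker, topic, and
# payload are placeholders. Stacks described in these listings would
# typically serialize events with Protobuf instead of JSON.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("order-events", {"order_id": 1, "status": "created"})
producer.flush()  # block until the broker has acknowledged the send
```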
… Python for our application code, APIs and SDK, and scripting. SQL and NoSQL databases: PostgreSQL, Couchbase, DynamoDB. Event-driven architecture, employing technologies such as Kafka, gRPC and Protobuf for event definitions. WebSocket connections. Solver technology and algorithms to drive our workflow scheduling solver engine. Deployment on AWS to ECS …
… Scala for data processing. Practical experience with BigQuery, Cloud Dataflow, Cloud Dataproc, and Apache Beam. Experience with event-driven streaming platforms such as Apache Kafka or Pub/Sub. Familiarity with Terraform, Kubernetes (GKE), and Cloud Functions. Strong understanding of data modeling, data lakes, and data warehouse design. Knowledge …
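For a flavour of the Apache Beam work mentioned here: the same pipeline code runs locally on the DirectRunner and, with different pipeline options, on Cloud Dataflow. A minimal sketch with made-up data:

```python
# Minimal Apache Beam pipeline; runs locally on the DirectRunner and,
# with different pipeline options, on Cloud Dataflow. Data is made up.
import apache_beam as beam

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Create" >> beam.Create(["alpha", "beta", "gamma"])
        | "Upper" >> beam.Map(str.upper)
        | "Print" >> beam.Map(print)
    )
```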
… Experience with Java Spring Boot for integration microservice patterns. Knowledge of SOLID principles and clean code. Familiarity with streaming data, such as Apache Kafka, and AWS native messaging/streaming features. Ability to work with SQL and NoSQL data sources like Postgres and Mongo. Understanding of DevOps tooling …
… technologies and distributed systems such as Hadoop, HDFS, Hive, Spark, Databricks, Cloudera. Experience developing near-real-time event streaming pipelines with tools such as Kafka, Spark Streaming, Azure Event Hubs. Good understanding of the differences and trade-offs between SQL and NoSQL, ETL and ELT. Proven experience in DevOps …
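As an illustration of a near-real-time pipeline of this kind, a minimal PySpark Structured Streaming job that reads from Kafka and echoes messages to the console; the broker address and topic are placeholders, and the Spark-Kafka connector package is assumed to be on the classpath.

```python
# Minimal Spark Structured Streaming read from Kafka; broker and topic
# are placeholders; the spark-sql-kafka connector is assumed available.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka message values arrive as binary; cast for downstream parsing.
query = (
    events.selectExpr("CAST(value AS STRING) AS value")
    .writeStream.format("console")
    .start()
)
query.awaitTermination()
```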
… Python for our application code, APIs and SDK, and scripting. SQL and NoSQL databases: PostgreSQL, Couchbase, DynamoDB. Event-driven architecture, employing technologies such as Kafka, gRPC and Protobuf for event definitions. Solver technology and algorithms to drive our workflow scheduling solver engine. Deployment on AWS to IoT Greengrass, ECS …
… and NoSQL) and proficiency in designing efficient and scalable database schemas. Experience with workflow orchestration tools (Apache Airflow, Prefect) and data pipeline frameworks (Apache Kafka, Talend). Familiarity with cloud platforms (AWS, GCP or Azure) and their data services (AWS Glue, GCP Dataflow) for building scalable, cost-effective data …
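For contrast with the Airflow sketch above, here is the same extract-and-load shape expressed in Prefect 2.x, the other orchestrator this listing names; the flow and task names are illustrative.

```python
# Minimal Prefect 2.x flow sketch; flow, task names, and data are illustrative.
from prefect import flow, task


@task
def extract():
    # Placeholder for reading from a source system.
    return [1, 2, 3]


@task
def load(rows):
    # Placeholder for writing to a target store.
    print(f"loaded {len(rows)} rows")


@flow
def pipeline():
    load(extract())


if __name__ == "__main__":
    pipeline()
```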
… tools such as Jenkins, GitLab CI, or Azure DevOps. Understanding of microservices architecture and event-driven systems. Experience with message queue services such as Kafka, RabbitMQ, or AWS SQS. Knowledge of security and authentication mechanisms such as OAuth, JWT, and SAML. Awareness of different agile methodologies and ability to …
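To ground the OAuth/JWT requirement, a minimal JWT issue-and-verify round trip using the PyJWT library; the shared secret and claims are placeholders, and a production service would typically use asymmetric signing (e.g. RS256) with key rotation rather than a hard-coded secret.

```python
# Minimal JWT round trip with PyJWT; secret and claims are placeholders.
# Real services would prefer asymmetric signing (e.g. RS256) and rotation.
from datetime import datetime, timedelta, timezone

import jwt

SECRET = "change-me"  # placeholder only

token = jwt.encode(
    {
        "sub": "user-123",
        "exp": datetime.now(tz=timezone.utc) + timedelta(hours=1),
    },
    SECRET,
    algorithm="HS256",
)

claims = jwt.decode(token, SECRET, algorithms=["HS256"])
print(claims["sub"])  # -> user-123
```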
83zero Limited (City of London; hybrid/WFH options): … Data Fusion. NoSQL databases: DynamoDB/Neo4j/Elastic, Google Cloud Datastore. Snowflake data warehouse/platform. Streaming technologies and processing engines: Kinesis, Kafka, Pub/Sub and Spark Streaming. Experience of working with CI/CD technologies: Git, Jenkins, Spinnaker, GCP Cloud Build, Ansible, etc. Experience building …
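As a sketch of the streaming side of this stack, a minimal Google Cloud Pub/Sub publish; the project and topic IDs are placeholders and application-default credentials are assumed to be configured.

```python
# Minimal Google Cloud Pub/Sub publish sketch; project and topic IDs
# are placeholders; application-default credentials are assumed.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "my-topic")

future = publisher.publish(topic_path, b"payload", origin="sketch")
print(future.result())  # message ID once the publish completes
```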
… technologies and distributed systems such as Hadoop, HDFS, Hive, Spark, Databricks, Cloudera. Experience developing near-real-time event streaming pipelines with tools such as Kafka, Spark Streaming, Azure Event Hubs. Excellent experience in the data engineering lifecycle; you will have created data pipelines which take data through all layers …
… a globally distributed team. A background in some of the following is a bonus: Java experience; Python experience; Ruby experience; big data technologies: Spark, Trino, Kafka; financial markets experience; SQL: Postgres, Oracle; cloud-native deployments: AWS, Docker, Kubernetes; observability: Splunk, Prometheus, Grafana. For more information about DRW's processing activities …
… core skills we'd like to see: Python for our application code, APIs and SDK. Experience with event-driven architectures employing technologies such as Kafka, gRPC and Protobuf for event definitions. Experience working in a Kubernetes environment. Experience with SQL or NoSQL databases, e.g. PostgreSQL, Couchbase, DynamoDB. Owning technical …
… libraries we use extensively. We implement the systems that require the highest data throughput in Java and C++. We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log shipping and monitoring, Docker …
… GitHub Actions or similar. Knowledge and hands-on experience with Docker, Kubernetes. Exposure to the pub-sub model and event streaming platforms such as Kafka, Pulsar, or AWS Kinesis. Experience with build and package management tools such as npm, Maven, dotnet, etc. Experience with Postgres or other RDBMS environments …
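To illustrate the Kinesis option named here, a minimal publish with boto3; the stream name, region, and payload are placeholders, and AWS credentials are assumed to be configured in the environment.

```python
# Minimal Kinesis publish sketch with boto3; stream name, region, and
# payload are placeholders; AWS credentials are assumed configured.
import json

import boto3

kinesis = boto3.client("kinesis", region_name="eu-west-2")

kinesis.put_record(
    StreamName="events",
    Data=json.dumps({"event": "example"}).encode("utf-8"),
    PartitionKey="example-key",  # routes the record to a shard
)
```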
… or other advanced analytics infrastructure. Familiarity with infrastructure-as-code (IaC) tools such as Terraform or CloudFormation. Experience with modern data engineering technologies (e.g., Kafka, Spark, Flink). Why join YouLend? Award-Winning Workplace: YouLend has been recognised as one of the "Best Places to Work 2024" by …
… into different categories. Backend: Java, Node.js, C#, Python, PHP, Scala, Power Platform. Frontend: React, JavaScript, TypeScript, Angular. Data: PostgreSQL, Microsoft SQL Server, MongoDB, Apache Kafka, Neo4j, Amazon Athena. DevOps: AWS, Kubernetes, Azure, Jenkins, Docker, Ansible, Terraform, Dynatrace. Responsibilities: As part of the team, your day-to-day responsibilities will …