practices include OWASP guidelines/Top 10, SOC 2, and NCSC cloud security principles. Experience in data and orchestration tools, including some of dbt, Apache Airflow, and Azure Data Factory. Experience in programming languages, including some of Python, TypeScript, JavaScript, R, Java, and C#, producing services, APIs, Function Apps or Lambdas. …
with REST APIs Experience with Java Experience with full lifecycle agile software development projects Desired skills: Experience with Python. Experience building data products in Apache Avro and/or Parquet. On-the-job experience with Java software development. Experience deploying the complete DevOps lifecycle, including integration of build …
12+ years building and scaling distributed systems in agile/DevOps organizations. Expert in Java, Spring Boot, RESTful microservices, and event-driven architectures (Kafka, Avro). Strong frontend acumen with React, JavaScript/TypeScript, and API-first integration (OpenAPI). Deep familiarity with Kubernetes orchestration, container security, and infrastructure …
such as Relational and Columnar Databases, Data Integration (ETL), or API development. Familiarity with data formats like JSON, XML, and binary formats such as Avro or Google Protocol Buffers. Experience working with business and technical teams to develop Model Engineering solutions. Proficiency with tools like SQL, JavaScript, or Python …
building and maintaining complex data pipelines. These pipelines ingest and transform data from diverse sources (e.g., email, CSV, ODBC/JDBC, JSON, XML, Excel, Avro, Parquet) using AWS technologies such as S3, Athena, Redshift, Glue, and programming languages like Python and Java (Docker/Spring). What You’ll …
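To make the Athena side of the listing above concrete, here is a minimal sketch of querying S3-backed data via Athena with boto3. The database, table, and results bucket are hypothetical stand-ins, not names from the listing.

```python
import time

import boto3  # AWS SDK for Python

athena = boto3.client("athena", region_name="eu-west-2")

# Hypothetical database/table and results bucket -- substitute your own.
query = "SELECT source, COUNT(*) AS row_count FROM ingested_events GROUP BY source"

response = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = response["QueryExecutionId"]

# Athena runs queries asynchronously, so poll until a terminal state.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)[
        "QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```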
support for OpenLink and processes developed by the group Participate in capacity planning and performance/throughput analysis Consume and publish transaction data in Avro over Kafka Automation of system maintenance tasks, end-of-day processing jobs, data integrity checks and bulk data loads/extracts Release planning and …
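For context on the "Avro over Kafka" responsibility above, a minimal sketch of publishing a binary-encoded Avro record with fastavro and confluent-kafka. The topic name and transaction schema are invented for the example; production setups typically register schemas in a schema registry rather than embedding them in code.

```python
import io

from confluent_kafka import Producer
from fastavro import parse_schema, schemaless_writer

# Hypothetical transaction schema for illustration only.
schema = parse_schema({
    "type": "record",
    "name": "Transaction",
    "fields": [
        {"name": "id", "type": "string"},
        {"name": "amount", "type": "double"},
        {"name": "currency", "type": "string"},
    ],
})

producer = Producer({"bootstrap.servers": "localhost:9092"})

record = {"id": "tx-001", "amount": 150.25, "currency": "GBP"}

# Encode the record as raw Avro binary (no container header) and publish it.
buf = io.BytesIO()
schemaless_writer(buf, schema, record)
producer.produce("transactions", value=buf.getvalue())
producer.flush()
```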
Greater London, England, United Kingdom Hybrid / WFH Options
Quant Capital
worry. Additional exposure to the following is desired: Tech stack you will learn: Hadoop and Flink; Rust, JavaScript, React, Redux, Flow; Linux, Jenkins; Kafka, Avro, Kubernetes, Puppet. Involvement in the Java community. My client is based in London. Home working is encouraged but you will need to be able to …
Databases, and/or Data Integration (ETL) or API development. Knowledge of some Data Formats such as JSON, XML, and binary formats such as Avro or Google Protocol Buffers. Experience collaborating with business and technical teams to understand, translate, review, and playback requirements and collaborate to develop Model Engineering …
Previous experience in monitoring, tracking and optimising cloud compute and storage costs Experience working with RPC protocols and their formats, e.g., gRPC/protobuf, Apache Avro, etc. Experience with cloud-based (e.g., AWS, GCP, Azure) microservice, event-driven, and distributed architectures. Experience working in a fast-paced environment …
non-technical stakeholders. A strong grasp of Agile practices. Familiarity with NoSQL databases, big data technologies, and integrating diverse data formats (e.g., JSON, Parquet, Avro). Experience with data cataloguing, metadata management, and data lineage tools. Strong troubleshooting and problem-solving skills for complex data challenges. Desirable Experience & Knowledge …
Key Responsibilities Design and implement real-time data pipelines using tools like Apache Kafka, Apache Flink, or Spark Streaming. Develop and maintain event schemas using Avro, Protobuf, or JSON Schema. Collaborate with backend teams to integrate event-driven microservices. Ensure data quality, lineage, and observability across streaming …
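On the "event schemas using Avro, Protobuf, or JSON Schema" requirement above, one hedged illustration: validating an event against a JSON Schema with the jsonschema library. The schema and event payload are made up for the example; in practice the schema would live in a shared, versioned repository.

```python
from jsonschema import ValidationError, validate

# Hypothetical event schema -- illustrative only.
order_created_schema = {
    "type": "object",
    "required": ["order_id", "amount", "created_at"],
    "properties": {
        "order_id": {"type": "string"},
        "amount": {"type": "number", "minimum": 0},
        "created_at": {"type": "string", "format": "date-time"},
    },
    "additionalProperties": False,
}

event = {"order_id": "o-42", "amount": 19.99, "created_at": "2024-01-01T12:00:00Z"}

try:
    validate(instance=event, schema=order_created_schema)
    print("event conforms to schema")
except ValidationError as err:
    print(f"schema violation: {err.message}")
```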
datasets. Implement and manage Lake Formation and AWS Security Lake, ensuring data governance, access control, and security compliance. Optimise file formats (e.g., Parquet, ORC, Avro) for S3 storage, ensuring efficient querying and cost-effectiveness. Automate infrastructure deployment using Infrastructure as Code (IaC) tools such as Terraform or AWS CloudFormation.
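To make the file-format optimisation point above concrete, a minimal sketch of writing a partitioned Parquet dataset with pyarrow. The column names and output path are illustrative; on AWS the root path would typically be an s3:// URI. Partitioning on a low-cardinality column lets engines such as Athena prune whole directories instead of scanning every file.

```python
import pyarrow as pa
import pyarrow.parquet as pq

# Illustrative event data; a real pipeline would build this from ingested files.
table = pa.table({
    "event_date": ["2024-01-01", "2024-01-01", "2024-01-02"],
    "source": ["api", "batch", "api"],
    "value": [10.5, 3.2, 7.9],
})

# Write one directory per event_date value (hive-style partitioning).
pq.write_to_dataset(
    table,
    root_path="events_parquet",  # locally; e.g. "s3://bucket/events/" on AWS
    partition_cols=["event_date"],
)
```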
Birmingham, England, United Kingdom Hybrid / WFH Options
QAD
patterns, such as “expand and contract”, using go-migrate Writing observable and testable code using libraries such as testify and mockgen Publishing and consuming Avro-formatted Kafka messages CI/CD GitHub Actions Trunk Based Development & Continuous Delivery Soft Skills Good collaboration skills at all levels with cross-functional …
building robust, scalable data processing solutions to support critical intelligence operations. Key Responsibilities Data Pipeline Development Design and implement automated data ingestion pipelines using Apache NiFi, Python, and other enterprise data tools Develop ETL/ELT processes for structured and unstructured classified datasets Build real-time and batch processing … Qualifications: Active TS/SCI with CI poly Ability to work within JWICS and other classified networks Technical Skills 3+ years of experience with Apache NiFi for data flow automation Strong proficiency in Python for data processing and pipeline development Experience with SQL databases (PostgreSQL, Oracle, SQL Server) Knowledge of big data technologies (Hadoop, Spark, Kafka) Familiarity with cloud platforms and containerization (Docker, Kubernetes) Understanding of data formats (JSON, XML, Parquet, Avro) Professional Experience Bachelor's degree in Computer Science, Engineering, or related field 5+ years of experience in data engineering or related roles Experience working in classified …
experience with SQL, Data Pipelines, Data Orchestration and Integration Tools Experience in data platforms on-premises/cloud using technologies such as: Hadoop, Kafka, Apache Spark, Apache Flink, object, relational and NoSQL data stores. Hands-on experience with big data application development and cloud data warehousing (e.g. Hadoop, Spark, Redshift, Snowflake, GCP BigQuery) Expertise in building data architectures that support batch and streaming paradigms Experience with standards such as JSON, XML, YAML, Avro, Parquet Strong communication skills Open to learning new technologies, methodologies, and skills As the successful Data Engineering Manager you will be responsible for: Building …
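As a sketch of the batch-plus-streaming point above, a minimal PySpark structured-streaming job that reads from Kafka and lands Parquet files. The topic, bootstrap servers, and paths are placeholders, and the job assumes the spark-sql-kafka connector package is on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (
    SparkSession.builder
    .appName("kafka-to-parquet")  # requires the spark-sql-kafka package
    .getOrCreate()
)

# Read a stream of raw events from a placeholder Kafka topic.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "events")
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

# Land micro-batches as Parquet; the checkpoint directory makes the job restartable.
query = (
    events.writeStream.format("parquet")
    .option("path", "/data/events_parquet")
    .option("checkpointLocation", "/data/checkpoints/events")
    .start()
)
query.awaitTermination()
```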
ecosystem, including modules for Security, Web, and Data Expertise in building event-driven systems that deliver high reliability and performance Familiarity with message schemas (Avro, Protobuf) and strategies for evolving events Experience integrating with identity systems like Keycloak Strong grasp of system-level concerns like observability, availability, and resilience …
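On the "strategies for evolving events" point above, one common approach is Avro schema resolution: a consumer decodes data written with an old schema using a newer reader schema that adds a defaulted field. A minimal sketch with fastavro follows; both schemas are invented for the example.

```python
import io

from fastavro import parse_schema, schemaless_reader, schemaless_writer

# Writer (v1) schema: what the producer originally published.
writer_schema = parse_schema({
    "type": "record",
    "name": "UserEvent",
    "fields": [{"name": "user_id", "type": "string"}],
})

# Reader (v2) schema: adds a field with a default, which Avro
# schema resolution fills in when decoding v1 data.
reader_schema = parse_schema({
    "type": "record",
    "name": "UserEvent",
    "fields": [
        {"name": "user_id", "type": "string"},
        {"name": "channel", "type": "string", "default": "unknown"},
    ],
})

# Encode a v1 record, then decode it as v2.
buf = io.BytesIO()
schemaless_writer(buf, writer_schema, {"user_id": "u-1"})
buf.seek(0)

decoded = schemaless_reader(buf, writer_schema, reader_schema)
print(decoded)  # {'user_id': 'u-1', 'channel': 'unknown'}
```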