uses modern techniques to achieve the best results for its clients while empowering and supporting you to realize your full potential? We are an experienced team of Spring committers, Kafka contributors, ex-Pivotal, and ex-Google engineers. If you want to know more, keep reading or reach out for a chat. About us: We are a VC-backed fintech …
City of London, London, United Kingdom Hybrid / WFH Options
Chi Square Analytics
learning techniques, including clustering, Bayesian methods, reinforcement learning, and scalable modelling. Technical fluency in tools and platforms like Python or R, cloud environments, and big data infrastructure (e.g., Hadoop, Kafka, SQL). Exceptional stakeholder communication, able to translate complex technical ideas into strategic insights for executive audiences. Bonus: experience in industries like gaming, entertainment or technology would add value. Please …
small team. Proficiency in at least one object-oriented language, such as C# or Java. Familiarity with .NET Framework or .NET Core is preferred. Experience with message brokers like Kafka is an advantage. Knowledge of modern cloud technologies, including AWS services and infrastructure. At least 5 years of experience in the sportsbook industry. Join Our Team: We're a …
within DevEx/Developer Enablement/Developer Experience. A background in software engineering, with exposure to cloud-native tooling, microservices, and event-driven architectures. Familiarity with tools like GitLab, Kafka, and monitoring solutions like Grafana (or similar). A strong understanding of building internal platforms that improve developer workflows, CI/CD processes, and observability. A thoughtful, empathetic leadership …
and/or data pipelines. Experience with modern technologies and frameworks including FastAPI, PydanticAI, Haystack AI, OpenTelemetry, Procrastinate, databases like Postgres and Snowflake, queue systems such as SQS/Kafka, and Airflow. Hands-on experience with Kubernetes, including orchestration and lifecycle maintenance; you are not an SRE but you know how things run and what is required to ship …
accredited college or university plus four (4) years of systems engineering experience. Experience in production dataflow implementation, verification, and monitoring, with a deep understanding of data processing frameworks, especially Kafka and NiFi (NiFi experience limited to the point-and-click UI is insufficient). Proficiency in Python and Java programming is desired. Extensive experience in Tier 2 and 3 support …
Understanding of Data Architecture, including management of confidential data. Delivering against company and industry guidelines and regulations, e.g. cyber security and data protection. Experience with event-driven architectures like Kafka. Familiarity with architectural frameworks such as TOGAF, ArchiMate, cloud architectures, or comparable frameworks. Previous involvement with native mobile development, cross-platform/hybrid applications, API gateway management, and microservices. …
widely in our sector, we strongly encourage exploring new technologies and techniques. Some of the following experience is therefore desirable: • Practical experience using streaming technologies, including streaming platforms (e.g. Kafka), online algorithms (e.g. stochastic gradient descent), and fixed-memory data structures (e.g. Bloom filters). • Experience using next-generation machine learning techniques and tools, including deep neural networks and …
event-driven and data lake. Digital platform development, including experience developing colleague- and customer-facing applications. Banking software development. Deep technical awareness of cloud and data technology, such as Kafka, Databricks, microservices, and APIs. Some other highly valued skills may include: Technical design: analysis and documentation of system characteristics, including designing solutions to meet multi-faceted non-functional requirements. …
s ITSM, OMT, TSM, and TSOM modules. Assist in the design, creation, and maintenance of the ServiceNow application and modules. Architect and design tool integrations with ServiceNow (Elastic, Netcool, Kafka, etc.). Coordinate and support application and platform upgrades. Resolve identified vulnerabilities associated with STIGs and scans. Assist with the design and creation of business process flows. Work with other members of …
data from diverse sources, transform it into usable formats, and load it into data warehouses, data lakes or lakehouses. Big Data Technologies: Utilize big data technologies such as Spark, Kafka, and Flink for distributed data processing and analytics. Cloud Platforms: Deploy and manage data solutions on cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP), leveraging cloud … for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks, Spark, Kafka, and Flink. Experience in using modern data architectures, such as lakehouse. Experience with CI/CD pipelines, version control systems like Git, and containerization (e.g., Docker). Experience with … and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying), Apache Spark (for distributed data processing), Apache Spark Streaming, Kafka or similar (for real-time data streaming). Experience using data tools in at least one cloud service - AWS, Azure or GCP (e.g. S3, EMR, Redshift, Glue, Azure Data Factory …
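Several of the listings above ask for real-time streaming experience with Spark and Kafka. As a purely illustrative sketch (not taken from any posting), this is roughly what a minimal PySpark Structured Streaming job consuming a Kafka topic looks like; the broker address, topic name, and event schema are assumptions, and the spark-sql-kafka connector package must be available to Spark.

```python
# Minimal sketch: read JSON events from a Kafka topic with Spark Structured
# Streaming and print the parsed stream. All names below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("kafka-stream-example").getOrCreate()

# Assumed event schema for the sketch.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("payload", StringType()),
    StructField("created_at", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
       .option("subscribe", "events")                         # placeholder topic
       .load())

# Kafka delivers bytes; cast the value to string and parse the JSON payload.
parsed = (raw
          .select(from_json(col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

query = parsed.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```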
Data Architecture: Build and maintain big data architectures and data pipelines to efficiently process large volumes of geospatial and sensor data. Leverage technologies such as Hadoop, Apache Spark, and Kafka to ensure scalability, fault tolerance, and speed. Geospatial Data Integration: Develop systems that integrate geospatial data from a variety of sources (e.g., satellite imagery, remote sensing, IoT sensors, and … applications. Familiarity with geospatial data formats (e.g., GeoJSON, Shapefiles, KML) and tools (e.g., PostGIS, GDAL, GeoServer). Technical Skills: Expertise in big data frameworks and technologies (e.g., Hadoop, Spark, Kafka, Flink) for processing large datasets. Proficiency in programming languages such as Python, Java, or Scala, with a focus on big data frameworks and APIs. Experience with cloud services and … field. Experience with data visualization tools and libraries (e.g., Tableau, D3.js, Mapbox, Leaflet) for displaying geospatial insights and analytics. Familiarity with real-time stream processing frameworks (e.g., Apache Flink, Kafka Streams). Experience with geospatial data processing libraries (e.g., GDAL, Shapely, Fiona). Background in defense, national security, or environmental monitoring applications is a plus. Compensation and Benefits: Competitive …
London, South East, England, United Kingdom Hybrid / WFH Options
Be-IT Resourcing
data processing. Strong computer science fundamentals. Expertise in CI/CD and automation. Proficiency in languages such as Python, Java, Go, Rust, JavaScript, React, SQL. Experience with Docker, Kubernetes, Kafka. Familiarity with GCP, Azure, AWS. Understanding of cloud storage, networking, and resource provisioning. Desirable: Master's or PhD in Computer Science or related field; GCP or Kafka certifications …
and APIs using Java, Spring Boot, and microservices architecture. Build dynamic, responsive UIs using Angular, React, or Vue.js. Integrate Camunda workflows with REST APIs, databases, messaging systems (Kafka/RabbitMQ), and user interfaces. Configure and manage Camunda engine, external task workers, and job execution. Write unit and integration tests for both frontend and backend components. Collaborate with … with Camunda BPM (v7 or v8) - BPMN, DMN, external tasks, process orchestration. Experience in Angular, React, or Vue.js. Other Technologies: Integration experience with databases (PostgreSQL, MySQL, MongoDB). Familiarity with Kafka, RabbitMQ, or other messaging tools. Experience with Git, Maven/Gradle, Jenkins/GitLab CI. Experience with Camunda 8/Zeebe, Operate, Tasklist. Exposure to Docker, Kubernetes, Helm. Knowledge …
deep knowledge of security architecture on cloud platforms, you will provide technical leadership and strategic direction in building scalable, secure, and resilient cloud-native systems based on AWS, Kubernetes, Kafka, microservices, Java, and event-driven design. Key Responsibilities: Cloud Architecture Design: Lead the design of cloud-based systems and solutions, utilising AWS and modern architectural patterns such as microservices … auditing). Containerisation & Orchestration: Architect and oversee containerised environments using Kubernetes, ensuring high availability, scalability, and fault tolerance for critical applications. Event-Driven Systems: Architect event-driven systems using Kafka, designing and managing messaging frameworks to handle real-time data processing across distributed microservices. Microservices Architecture: Design and oversee the development of microservices-based systems, establishing best practices for …
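To ground the "event-driven systems using Kafka" responsibility described above, here is a minimal, hedged sketch of one microservice in such a design: it consumes a domain event from one topic and publishes a derived event to another using the confluent-kafka client. Broker address, topic names, group id, and message fields are all assumptions for illustration, not details from the posting.

```python
# Illustrative event-driven microservice loop: consume "order created" events
# and emit "payment requested" events. All names are placeholders.
import json
from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # placeholder broker
    "group.id": "payments-service",          # placeholder consumer group
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})

consumer.subscribe(["orders.created"])       # placeholder input topic

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        order = json.loads(msg.value())
        # Derive a new domain event from the consumed one (hypothetical fields).
        event = {"order_id": order.get("id"), "status": "payment-requested"}
        producer.produce("payments.requested",
                         json.dumps(event).encode("utf-8"))
        producer.poll(0)  # serve delivery callbacks
finally:
    consumer.close()
    producer.flush()
```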
to fulfill business requirements. Develop and maintain highly available distributed systems that support critical client workflows. Collaborate on building systems using a diverse stack of open-source technologies, including Kafka, FastAPI, Airflow, etc. Develop configurable, performant, and monitorable pipelines that prioritize scalability, efficiency, and operational resiliency. Collaborate with cross-organizational teams to drive project success. Mentor and coach junior … Computer Science, Engineering, Mathematics, or a related field, and/or related professional experience. Nice to Have: Familiarity with big data processing with highly scalable technologies such as Spark, Kafka, RabbitMQ, Redis, Flink, Airflow and Cassandra. Familiarity with cloud platforms like AWS, Azure, or GCP. Familiarity with S3-compliant data stores (e.g., AWS S3, Azure Blob Storage, GCP Cloud …
WE ARE HIRING OWNERS. Set of X is led by industry veterans who see government contracting as a good community with plenty of opportunity to go around. With a shared desire to give back, grow the community, and do great …
mission to empower safe and sustainable transport around the world through data insights. You will be the go-to person for our cloud data pipeline & underlying infrastructure, especially Apache Kafka and our PostgreSQL database. You will also work with tech like Kubernetes, Vault, Go microservices and other cloud services, as well as on-premise infrastructure. Your technical expertise … on problems, even when you are not familiar with the problem space, and you know when and how to ask for help. Requirements for the role: Expertise in Apache Kafka (ideally Kafka on Strimzi), including broker management, scaling, upgrades, and integration with high-throughput data pipelines. Experience managing PostgreSQL databases, including load analysis, query optimisation, and role management. Experience …
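For the broker and topic management work mentioned in that role, a rough sketch of routine administration against a Kafka cluster using confluent-kafka's AdminClient is shown below. The bootstrap address, topic name, and sizing are placeholders; note that in a Strimzi-managed cluster, topics are often managed declaratively through KafkaTopic custom resources rather than programmatically.

```python
# Rough sketch of topic administration with confluent-kafka's AdminClient.
# All values (broker, topic name, partition/replica counts) are illustrative.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "localhost:9092"})  # placeholder

# Inspect current cluster metadata: brokers and topics.
md = admin.list_topics(timeout=10)
print(f"brokers: {len(md.brokers)}, topics: {len(md.topics)}")

# Create a topic sized for a high-throughput pipeline (illustrative values).
futures = admin.create_topics(
    [NewTopic("vehicle-telemetry", num_partitions=12, replication_factor=3)]
)
for topic, fut in futures.items():
    try:
        fut.result()  # raises on failure, e.g. if the topic already exists
        print(f"created {topic}")
    except Exception as exc:
        print(f"failed to create {topic}: {exc}")
```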
Columbia, South Carolina, United States Hybrid / WFH Options
Systemtec Inc
processing using Databricks. AI and Machine Learning: Amazon Bedrock, AWS SageMaker, Unified Studio, RStudio/Posit Workbench, R Shiny/Posit Connect, Posit Package Manager, AWS Data Firehose, Kafka, Hive, Hue, Oozie, Sqoop, Git/Git Actions, IntelliJ, Scala. Responsibilities of the Data Engineer (AWS): Act as an internal consultant, advocate, mentor, and change agent providing expertise and …
customer and internal process improvements. Your tech stack & skillset consists of a foundation of knowledge in: programming paradigms and languages (Java, Spring); various tools (Apache Camel, Crucible, Maven, Kafka, SOAP, XML, XSLT); technical solutions and their implementation; communicators and conferencing tools; and project, time, and capacity management. Trimble's Inclusiveness Commitment: We believe in celebrating our differences. That …
troubleshooting, tuning) of web architecture and related applications such as the following: Apache, Nginx, Python, MySQL, Postgres, MongoDB, Postfix, CDN integrations. Experience managing data warehouse platforms & tooling, e.g. Hadoop, Kafka, Cassandra. Advanced knowledge and experience creating, maintaining and debugging shell scripts. The ideal candidate will be comfortable in "non-siloed" environments and have an appetite to research, test, and …