days/week. Flexibility is key to accommodate any schedule changes per the customer. Preferred Requirements Experience with big data technologies such as Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, Elasticsearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers, EKS, Diode, CI/CD, and Terraform is a plus. The role may require occasional on-call work. …
A solid understanding of key processes in the engineering delivery cycle, including Agile and DevOps, Git, APIs, containers, microservices, and data pipelines. Experience working with one or more of Kafka, Snowflake, Azure Data Factory, Azure Synapse, or Microsoft Fabric is highly desirable. Knowledge of data modelling and data architectures: Inmon, Kimball, Data Vault. About You A high level of drive …
clarity in a fast-paced, evolving environment Preferred Qualifications Experience in high-availability, high-scale environments (e.g., gaming, fintech, e-commerce, or similar industries) Familiarity with event-driven architectures, Kafka, or real-time processing systems Background in influencing technical strategy at a departmental or org-wide level Excitement for growing teams, building platforms, and contributing to company-wide technical …
file encryption, LDAP (Red Hat Directory Server), Java, Linux (CentOS), modern IDEs (IntelliJ, Eclipse, etc.), Agile Scrum process Use of automated testing tools like Selenium a plus Knowledge of ActiveMQ, Artemis, Kafka a plus $85,000 - $250,000 a year The pay range for this job, with multiple levels, is a general guideline only and not a guarantee of compensation or …
with execution algos, TCA, order-routing, or market-impact modelling Knowledge of statistical or machine-learning libraries (NumPy, pandas, scikit-learn, PyTorch) Experience building distributed systems with message buses (Kafka, ZeroMQ) and asynchronous I/O Experience with cloud or on-prem orchestration and scheduling frameworks (Kubernetes, HTCondor, SLURM) Benefits Tower's headquarters are in the historic Equitable …
productivity effectively and managing teams of 5-15 developers. Experience with modern technologies and data engineering platforms (Java/.NET, AWS/Azure, microservices, CI/CD pipelines, IaC, Kafka, Databricks, etc.) as well as modern enterprise and integration architectural patterns. Excellent communication, influencing, and negotiation skills, comfortably engaging with senior stakeholders to ensure successful outcomes. Client-focused, collaborative …
in developing production applications. Data Engineering & Streaming: Familiarity with data streaming tools and Elastic to ensure high performance in data-driven environments. Nice to Have: Neo4j; data streaming (e.g., Kafka); NVIDIA Triton; Octopus Deploy; LangSmith; Elastic About Definely Named in the top 25 of the prestigious Deloitte UK Technology Fast 50 in 2023, as well as the EMEA Technology Fast …
MLOps principles and best practices to deploy, monitor, and maintain machine learning models in production Familiarity with Git and MLflow for managing and tracking model versions Experience with Kafka is a big bonus Experience with cloud-based data platforms such as AWS or Google Cloud Platform Proven track record of running large-scale, mission-critical data infrastructure in …
5+ years of experience in cloud architecture and implementation Bachelor's degree in Computer Science, Engineering, a related field, or equivalent experience Experience with databases and data technologies (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) Experience in consulting, design, and implementation of serverless distributed solutions Experience in software development with an object-oriented language AWS experience preferred, with proficiency in a wide range of …
serverless architectures (Lambda, API Gateway, etc.) Kubernetes and container orchestration PostgreSQL and other relational databases Spark and distributed data processing TypeScript and Java programming languages Infrastructure as Code (Terraform) Kafka and message queuing systems Git/GitHub, including GitHub Actions for CI/CD About Chainalysis Blockchain technology is powering a growing wave of innovation. Businesses and governments around …
Create and maintain Forms, Reports, Views, Workflows, Groups and Roles • Create, maintain and enhance Dashboards and reporting, including scheduled reports • Create and configure tool integrations with ServiceNow (Elastic, Netcool, Kafka, etc.) • Coordinate and support application and platform upgrades • Assist with design, creation and cataloging of business process flows • Work with other members of the ServiceNow development team to propose …
Accessibility Testing. Has a working knowledge of a minimum of 5 of the following: API AWS Burp (security) BrowserStack Concourse/Jenkins Docker JMeter Karate/REST API MongoDB Kafka Oracle Playwright Postman/SoapUI Selenium Grid SQL TypeScript Unix-based systems Git XML Benefits Alongside your salary of £41,571, Companies House contributes £12,043 towards you …
edge computing tools such as AWS Greengrass & IoT Core Experience/knowledge with eventing systems and large data streaming for test automation Understanding of streaming systems/message brokers (Kafka, RabbitMQ) Experience in hyper growth startup-like environments, with demonstrated success in scaling and maturing software development processes A background in scaled manufacturing environments Previous experience with supply chain …
We are looking for the very top talent and we would be delighted if you were to join our team! In more detail, UST is a multinational company based in North America, certified as a Top Employer and Great Place to Work …
Engineering Manager - Delivery - Tech Lead - Microservices - CI/CD - Hybrid - £600/£650 (Inside IR35) - Immediate Starters Paying up to £650 Inside IR35. My client is seeking an experienced Engineering Manager to join their team. You will need strong commercial …
Data Engineer - Snowflake & Kafka Portugal or Spain (Remote) Full-time Permanent Fluent English required A leading technology consultancy is hiring a Data Engineer with strong expertise in Apache Kafka and Snowflake. You'll join a fast-growing, remote-first team delivering cutting-edge data solutions to global clients in finance, retail, and other industries. This is a … candidates based in Portugal or Spain, offering exciting projects, autonomy, and long-term career growth. What You'll Be Doing Design and manage real-time data pipelines using Apache Kafka Develop scalable ETL/ELT processes for ingesting data into Snowflake Collaborate with cross-functional teams, including analysts, data scientists, and DevOps Apply modern engineering best practices, including CI/CD and version control Provide guidance and mentorship to junior engineers What We're Looking For 5+ years of experience as a Data Engineer Expertise in Kafka Streams, Kafka Connect, or Confluent Kafka Hands-on experience with Snowflake (schema design, optimisation) Strong SQL skills and/or programming knowledge in Python, Java, Scala, or .NET …
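The listing above centres on streaming ingestion from Kafka into Snowflake. Snowflake loads far more cheaply via bulk COPY/Snowpipe than row-by-row inserts, so consumers typically buffer records into micro-batches before staging them. A minimal, illustrative sketch of that buffering pattern (function and field names are hypothetical, not any vendor's API):

```python
from itertools import islice
from typing import Iterable, Iterator

def micro_batches(records: Iterable[dict], batch_size: int) -> Iterator[list]:
    """Group a stream of consumed records into fixed-size batches.

    This is the buffering step a Kafka consumer performs before writing
    a staged file for a bulk Snowflake load; the Snowflake side itself
    (COPY INTO / Snowpipe) is omitted here.
    """
    it = iter(records)
    # islice pulls up to batch_size records; an empty batch ends the stream
    while batch := list(islice(it, batch_size)):
        yield batch

# Example: 7 consumed events buffered into batches of 3 before staging
events = [{"id": i} for i in range(7)]
batches = list(micro_batches(events, 3))
# batch sizes: 3, 3, 1
```

In practice the batch boundary is usually time- or size-based (whichever triggers first), and consumer offsets are committed only after the staged batch is durably loaded.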
Senior Engineering Manager – Data Platform Tech Stack Knowledge - Databricks, Kafka, AWS 1 Day a Week Onsite - London We are seeking a customer-centric Senior Engineering Manager – Data Platform to lead the teams responsible for building and evolving our core data infrastructure. In this role, you will oversee the development of our foundational data platform — encompassing experimentation frameworks, event ingestion … scale, ideally in consumer-facing or marketplace environments. Strong knowledge of distributed systems and modern data ecosystems, with hands-on experience using technologies such as Databricks, Apache Spark, Apache Kafka, and dbt. Proven success in building and managing data platforms supporting both batch and real-time processing architectures. Deep understanding of data warehousing, ETL/ELT pipelines, and analytics …
or are the technical lead for the execution. The system probably contains a combination of: Data integration: integrate a variety of data sources with, for example, Apache Camel, Apache Pulsar or Kafka, dlt, Python, Airbyte. Analytics engineering: model data warehouses, both batch and real-time, with, for example, ClickHouse and dbt or SQLMesh. Business intelligence: build visuals that answer critical business questions … k8s Translate business logic to available data, for example creating insights for a wholesale client with data warehousing on an Azure, AWS, GCP, or on-premise architecture including Apache Kafka/Pulsar, SQLMesh/dbt, ClickHouse/Databend, and Metabase/Superset. Build state-of-the-art systems that solve client-specific challenges, for example building agentic LLMs …
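The analytics-engineering work named above (modelling warehouses with dbt or SQLMesh) commonly revolves around incremental models: on each run, only new or changed rows are merged into the target, with the latest version of each business key winning. A small sketch of that merge semantics in plain Python, as a mental model rather than actual dbt/SQLMesh code (the `id`/`updated_at` column names are illustrative):

```python
def merge_incremental(existing: list[dict], new_rows: list[dict]) -> list[dict]:
    """Latest-record-wins merge, the semantics behind an incremental
    warehouse model: rows are keyed by a business key (`id`) and the
    copy with the greatest `updated_at` survives.
    """
    by_key = {row["id"]: row for row in existing}
    for row in new_rows:
        current = by_key.get(row["id"])
        # keep the incoming row only if it is as new or newer
        if current is None or row["updated_at"] >= current["updated_at"]:
            by_key[row["id"]] = row
    return sorted(by_key.values(), key=lambda r: r["id"])

# A later update to id=1 replaces the stale copy; id=3 is simply appended
existing = [{"id": 1, "updated_at": 1}, {"id": 2, "updated_at": 5}]
new_rows = [{"id": 1, "updated_at": 3}, {"id": 3, "updated_at": 2}]
merged = merge_incremental(existing, new_rows)
```

In dbt this corresponds to an incremental model with a `unique_key`; the warehouse executes the equivalent `MERGE` statement instead of Python.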
you know that organizing big data can yield pivotal insights when it is gathered from disparate sources. We need a Data Engineer who is experienced in upgrading and maintaining Kafka clusters in Kubernetes in AWS, as well as utilizing Kafka Schema Registry and Kafka Security Manager (KSM) to manage schema evolution and security. In this role, you'll use your expertise in designing, developing, and deploying Kafka clusters in a cloud environment, and with Kafka Schema Registry and Kafka Security Manager. Here, you will guide and mentor data engineers, developers, and data consumers in a fast-paced, agile environment and will oversee the assessment, design, building, and maintenance of scalable platforms for your clients. … Join us. The world can't wait. You Have: Experience with Kafka or Confluent in a containerized environment Experience with Apache NiFi in a containerized environment Experience creating data partitioning strategies and monitoring topics for performance Experience deploying and upgrading Kafka clusters in high availability containerized environments Experience utilizing observability platforms, including Prometheus, Grafana, or Elastic to configure …
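The "data partitioning strategies" requirement above refers to how record keys are mapped to Kafka partitions: Kafka's default partitioner hashes the key (with murmur2) modulo the partition count, so all records for one key land on one partition and keep their relative order. A simplified sketch of that property, substituting MD5 for murmur2 purely for illustration:

```python
import hashlib

def assign_partition(key: bytes, num_partitions: int) -> int:
    """Deterministically map a record key to a partition.

    Kafka's real default partitioner uses murmur2; MD5 is used here only
    to keep the sketch self-contained. The property that matters is that
    the same key always maps to the same partition, which preserves
    per-key ordering for consumers.
    """
    digest = hashlib.md5(key).digest()
    # take 4 bytes of the digest as an unsigned int, then bucket it
    return int.from_bytes(digest[:4], "big") % num_partitions

# All events for one user land on one partition, so their order is preserved
p1 = assign_partition(b"user-42", 12)
p2 = assign_partition(b"user-42", 12)
```

Note the flip side that makes this a design decision: changing `num_partitions` remaps keys, so repartitioning a topic breaks per-key ordering guarantees across the boundary.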
for data flow automation Strong proficiency in Python for data processing and pipeline development Experience with SQL databases (PostgreSQL, Oracle, SQL Server) Knowledge of big data technologies (Hadoop, Spark, Kafka) Familiarity with cloud platforms and containerization (Docker, Kubernetes) Understanding of data formats (JSON, XML, Parquet, Avro) Professional Experience Bachelor's degree in Computer Science, Engineering, or related field 5+ …