London, South East, England, United Kingdom Hybrid / WFH Options
Sanderson
to improve operational efficiency and service delivery Act as the sole DevOps Engineer within the squad, providing end-to-end engineering support Essential Skills & Experience: CI/CD Terraform Kafka Kubernetes (plus experience migrating away from Kubernetes to serverless) GitLab GitHub Actions (and ideally GitHub Enterprise) AWS services: Lambda, CloudWatch, S3, IAM, Athena, RDS Strong stakeholder management and communication More ❯
Stoke-on-Trent, Staffordshire, England, United Kingdom
OCC Computer Personnel
ensure a smooth transition of service. Working with cross-functional teams, outsourcing contracts, contract schedules, managed service commercial principles. Experience of data-driven technology stacks including Docker, Kubernetes and Kafka, data structures including both SQL and NoSQL such as Elastic and Mongo, experience with programming languages (Python, C#). The role requires someone with strong problem-solving skills More ❯
Belfast, County Antrim, Northern Ireland, United Kingdom Hybrid / WFH Options
Hays
You'll bring: Proven experience in data architecture, data modelling, and database design. Strong knowledge of cloud platforms (Azure, AWS, or GCP) and modern data technologies (e.g., Snowflake, Databricks, Kafka). Expertise in SQL, Python, or other data-centric programming languages. Familiarity with data governance, security, and compliance frameworks. Excellent communication and stakeholder management skills. Why Join Us? Competitive More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
Terraform A collaborative mindset and a passion for mentoring and developing others Comfortable balancing technical decisions with business needs Nice to have: experience with Data Mesh, real-time streaming (Kafka/Kinesis), or MLOps frameworks. What's on Offer: Competitive salary and equity Hybrid working model (1 day/week in central London office) Private medical insurance Generous holiday allowance More ❯
autonomy features Our Tech Stack Backend: Go C++ (Either or both - will allow upskill) Frontend: React + Next.js Databases: PostgreSQL, MongoDB, Redis, ClickHouse Cloud: AWS (S3), Railway Messaging: MQTT, Kafka Pay range and compensation package up to £75,000 with equity options and other benefits The role is mainly remote but will need to be onsite for deployments and More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
data products. Ensure solutions meet governance, compliance, and security standards. Skills & Experience: Proven experience as a Data Solution Architect or similar senior-level data architecture role. Strong knowledge of Kafka, Confluent, Databricks, Unity Catalog, and cloud-native architecture. Skilled in Data Mesh, Data Fabric, and product-led data strategy design. Experience with big data tools (e.g., Spark), ETL/ More ❯
IT-related education or several years of software engineering experience Experience in designing data models SQL knowledge Experience with software testing (preferably Cucumber) Experience with application integration (API, ActiveMQ, Kafka, data lake, etc.) Experience with object-oriented languages (preferably Java and C#) Experience with cloud services and CI/CD (preferably GitHub and OpenShift) Experience with scripting (PL/ More ❯
Scottsdale, Arizona, United States Hybrid / WFH Options
Moseley Technical Services, Inc
Software Engineering, or a related Science, Engineering or Mathematics field and 5+ years of job-related experience, or a Master's degree plus 3 years of job-related experience Kafka (required) Linux (required) Java (required) React (required) Docker/Kubernetes (required) Preferred Qualifications: DevSecOps (preferred) Cloud Services (preferably AWS) MPLS/Networking (preferred) GitLab/Jenkins (preferred) Jira (preferred More ❯
numeric discipline, or equivalent work experience Excellent written and spoken English Added bonus if you have: Unix/Linux experience Oracle experience Experience of event-driven distributed messaging (e.g. Kafka) Experience of financial markets and the trade lifecycle beneficial C# and any GUI development experience What we offer you: At FIS, you can learn, grow and make an impact More ❯
of experience in big data technology with experience ranging from platform architecture, data management, data architecture and application architecture High proficiency working with the Hadoop platform including Spark/Scala, Kafka, SparkSQL, HBase, Impala, Hive and HDFS in multi-tenant environments Solid base in data technologies like warehousing, ETL, MDM, DQ, BI and analytical tools; extensive experience in metadata management More ❯
contexts. Strong .NET software engineering background. Deep expertise in Kubernetes (on-prem and cloud) and Terraform. Experience with real-time, event-driven systems and message brokers (e.g., RabbitMQ, Kafka). Strong grasp of telemetry, observability, and performance monitoring in distributed systems. Track record of technical leadership and setting engineering standards. Nice to Have: Experience with OpenTelemetry, Prometheus, Grafana More ❯
and be a role model for your fellow technical colleagues. Refine our architecture, drive topics and share knowledge across the organisation. Technologies: AI, AWS, Cloud, Confluence, Docker, GitHub, JIRA, Kafka, Kanban, MS Teams, NestJS, NodeJS, PHP, Product Owner, React, TypeScript, Web, JavaScript More: Founded in 2016 and based in Munich, Planerio offers a modern operating system with intelligent duty More ❯
to travel to conferences, and dedicated time for your personal development What you'll be working with: •Backend: Distributed, event-driven core Java (90% of the code-base), MySQL, Kafka •Data analytics: Python & Jupyter notebooks, Parquet, Docker •Testing: JUnit, JMH, JCStress, Jenkins, Selenium, many in-house tools •OS: Linux (Fedora for development, Rocky in production) The LMAX way is More ❯
Logstash, and Kibana (ELK) applications including data ingest, storage, processing, and visualization. •Working knowledge and understanding of Linux. •Demonstrated experience with Event, SNMP V3, and Performance Management. •Experience with Kafka and LiveAction. •Demonstrated experience with virtualization technologies such as VMware or KVM. •Demonstrated experience with one or more enterprise network management tools, such as Palo More ❯
focus. Effective communicator in both written and verbal mediums. BENEFICIAL SKILLS & QUALIFICATIONS Prior experience working on an electronic trading platform, e.g. reference data, market data & FIX. Knowledge of Spring, Kafka, SQL and/or Linux. Prior experience designing and implementing distributed systems modelling complex workflows. Prior experience in the financial industry. Understanding of common data structures and optimisations regarding More ❯
Solid querying and data preparation skills. Data Architectures: Understanding of modern data systems (lakehouses, data lakes). Additional (nice-to-have) skills: Infrastructure as Code: Terraform or equivalent. Streaming: Kafka, Kinesis. Cloud certifications (AWS or Azure). Experience in consulting or the energy sector. Public engagement through blogging or speaking. Strong communication and stakeholder engagement. Integration with hybrid or More ❯
modeling, SQL, NoSQL databases, and data warehousing. Hands-on experience with data pipeline development, ETL processes, and big data technologies (e.g., Hadoop, Spark, Kafka). Proficiency in cloud platforms such as AWS, Azure, or Google Cloud and cloud-based data services (e.g., AWS Redshift, Azure Synapse Analytics, Google Bi More ❯
Maven or Gradle Bonus Points Experience with Spring Boot (used for DI and config management) Experience with ORMs (we use Spring Data over JPA/Hibernate) Basic understanding of Kafka Soft Skills Thinks critically and spots gaps in test coverage Can read and extend existing test code Cares about test value, not just coverage numbers What's in it More ❯
specialist, design and architecture experience - 7+ years of external or internal customer facing, complex and large scale project management experience - 5+ years of database (e.g. SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience - 3+ years of cloud-based solution (AWS or equivalent), system, network and operating system experience PREFERRED QUALIFICATIONS - AWS experience preferred, with proficiency in a wide range of More ❯
professional experience in data engineering roles, preferably for a customer-facing data product Expertise in designing and implementing large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar Strong programming skills in languages such as SQL, Python, Go or Scala Demonstrable use and understanding of AI tooling More ❯
Uses database migration patterns, such as "expand and contract", using go-migrate Writing observable and testable code using libraries such as testify and mockgen Publishing and consuming Avro-formatted Kafka messages CI/CD GitHub Actions Trunk-Based Development & Continuous Delivery Soft skills: Good collaboration skills at all levels with cross-functional teams Highly developed sense of ownership and creative thinking More ❯