Delta, Delta Live Tables, PyTest, Great Expectations (or similar). Building and orchestrating data and analytical processing for streaming data with technologies such as Kafka, AWS Kinesis or Azure Stream Analytics. Building data solutions for embedded analytics and/or SaaS product platforms. Knowledge of data governance, privacy regulations More ❯
Bristol, South West England, United Kingdom Hybrid / WFH Options
LHH
a strong focus on data accuracy, quality, and reliability Desirable (Nice to Have): Background in defence, government, or highly regulated sectors Familiarity with Apache Kafka, Spark, or Hadoop Experience with Docker and Kubernetes Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK Understanding of machine learning More ❯
Build efficient data models for real-time analytics. Proven experience in managing real-time data pipelines across multiple initiatives. Expertise in distributed streaming platforms (Kafka, Spark Streaming, Flink). Experience with GCP (preferred), AWS, or Azure for real-time data ingestion and storage. Strong programming skills in Python, Java More ❯
Databricks) and Big Data processing (e.g., Apache Spark, Beam). Proficiency in key technologies like BigQuery, Redshift, Synapse, Pub/Sub, Kinesis, Event Hubs, Kafka, Dataflow, Airflow, and ADF. Strong ETL and data modeling skills. Proven ability to design and implement effective database solutions, ensure compliance with regulations, and More ❯
experience in developing Java systems. Proven track record of leading a team and delivering projects with a commercial mindset. Prior experience with Event Sourcing (Kafka, Akka, Spark) and Data Distribution-based architectures. Experience with NoSQL (Mongo, Elastic, Hadoop), in-memory (MemSQL, Ignite) and relational (Sybase, DB2, SybaseIQ) data store More ❯
of building and managing real-time data pipelines across multiple initiatives, with a proven track record. Expertise in developing data backbones using distributed streaming platforms (Kafka, Spark Streaming, Flink, etc.). Experience working with cloud platforms such as AWS, GCP, or Azure for real-time data ingestion and storage. Ability More ❯
Data Engineer, or AWS Certified Data Analytics, or AWS Certified Solutions Architect Experience with big data tools and technologies like Apache Spark, Hadoop, and Kafka Knowledge of CI/CD pipelines and automation tools such as Jenkins or GitLab CI About Adastra For more than 25 years, Adastra Corporation More ❯
EMR, and Kinesis. Strong knowledge of data modeling, warehousing, and schema design. Experience with event-driven architectures, streaming data, and real-time processing using Kafka or Kinesis. Expertise in IaC (Infrastructure as Code) using Terraform, CloudFormation, or AWS CDK. Familiarity with DevOps and CI/CD practices for data More ❯
Experience with DevOps tools such as Docker, Kubernetes, Jenkins, etc. Innate curiosity about consumer behavior and technology Experience with event messaging frameworks like Apache Kafka A fan of movies and television is a strong plus. Required Education Bachelor's degree in Computer Science, Information Systems, Software, Electrical or Electronics More ❯
with delivery partners and third-party applications. Proficiency in Java and Spring Boot is essential. Familiarity with the Spring Framework and tools like Apache Kafka is a bonus. Knowledge of financial billing domains/systems Hands-on experience with microservices architecture, database programming, and event streaming in a cloud More ❯
ideally version 8+), database, C#, .Net Multi-threading JUnit Maven Logging frameworks Apache libraries GOF Patterns MS SQL Server/SQL & PL Experience in Kafka is a plus Experience in Monitoring solutions - Azure Insights/Grafana etc. is a plus ETRM/CTRM experience Knowledge of the Endur application More ❯
• In-depth knowledge of data warehousing concepts and tools (e.g., Redshift, Snowflake, Google BigQuery). • Experience with big data platforms (e.g., Hadoop, Spark, Kafka). • Familiarity with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud). • Expertise in ETL tools and processes (e.g., Apache NiFi More ❯
deliver high-quality products. Mandatory Skillsets: They need a hands-on developer with the below skills: Java Spring Boot Cassandra or any alternative NoSQL database Kafka (very critical) IBM MQ Messaging React JS (good to have, not mandatory as of now) Good to have: Individual Contributor Quick Learner Good communication More ❯
SQL , and one or more: R, Java, Scala Experience with relational/NoSQL databases (e.g., PostgreSQL, MongoDB) Familiarity with big data tools (Hadoop, Spark, Kafka), cloud platforms (Azure, AWS, GCP), and workflow tools (Airflow, Luigi) Bonus: experience with BI tools , API integrations , and graph databases Apply now and help More ❯
Beneficial Experience: Background in energy trading or commodities markets Exposure to containerization and orchestration (Kubernetes) Familiarity with event-driven architecture and message brokers (e.g., Kafka, Service Bus) This role offers the opportunity to work within one of the most innovative teams in the energy trading space, with long-term More ❯
London, South East England, United Kingdom Hybrid / WFH Options
RJC Group
C# - .NET Core Golang Python and R with Jupyter and Azure Databricks Postgres (and Timescale), Redis, document and column-based storage engines RabbitMQ and Kafka-style commit logs Dapr React, Redux, React-Router, Styled-Components, Express, TRPC GraphQL, MQTT However, the ideal candidate will have experience in Golang, Azure More ❯
Employment Type: Permanent
Salary: £75000 - £100000/annum Up to £100k basic + excellent benef
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Exalto Consulting ltd
industry standard solutions; Splunk or ELK would be beneficial Expertise in any public cloud (AWS preferred) Knowledge of Enterprise Integration Patterns with a deep understanding of Kafka would also be beneficial Expertise in any industry DevOps solutions (e.g. Jenkins, GitLab, AWS CodePipeline) Financial services experience would be advantageous Ability to take More ❯