and analytics teams to ensure seamless data flow across the organisation. You’ll own the full data lifecycle—from ingestion to transformation and delivery—leveraging modern cloud and streaming technologies. 🔧 Responsibilities: Design and build scalable, low-latency real-time data pipelines; integrate diverse data sources and ensure data accuracy and integrity; develop and maintain data models optimised for … availability; implement logging, monitoring, and alerting for infrastructure health; partner with cross-functional teams to deliver robust data solutions. 💡 What You’ll Bring: Strong hands-on experience building streaming data platforms; deep understanding of tools like Kafka, Flink, Spark Streaming, etc.; proficiency in Python, Java, or Scala; cloud experience with AWS, GCP, or Azure.
Services, Manufacturing, Life Sciences and Healthcare, Technology and Services, Telecom and Media, Retail and CPG, and Public Services. Consolidated revenues of $13 billion. Job Description: Required Skills: Spark (must have); Scala (must have); Hive & SQL (must have); Hadoop (must have); communication (must have); Banking/Capital Markets domain (good to have). Note: the candidate should know the Scala/… (Core) coding language; a PySpark profile will not help here. Responsibilities: A good Big Data resource with the below skill set: experience in Big Data technologies; real-time data processing platform (Spark Streaming) experience would be an advantage; consistently demonstrates clear and concise written and verbal communication; a history of delivering against agreed objectives; ability to multi-task and …
responsibilities. API & Service Development: Design, develop, and maintain robust internal RESTful and event-driven APIs. Establish best practices for versioning, documentation, security, and monitoring. Build and manage data streaming pipelines using frameworks such as Spark, Kafka, or equivalents. Ensure high-availability, low-latency delivery of market and industry data. Deploy and operate services on Azure … understand data requirements. Produce clear design docs and code reviews to foster team knowledge sharing. What we are looking for: Hands-on experience building and operating real-time streaming solutions (e.g., Spark Streaming, Kafka, Event Hubs); thorough understanding of Azure services (Functions, App Services, AKS); proven experience designing, integrating, and deploying API frameworks …
Excellent written and verbal communication skills, with the ability to collaborate effectively with cross-functional teams. Understanding of good engineering practices (DevSecOps, source-code versioning, …). Preferred: experience with streaming technologies like Spark Streaming or Kafka. Preferred: Infrastructure as Code (IaC) experience, preferably with Terraform. Preferred: experience designing and developing API data integrations using SOAP …
optimal. Strong experience in different software testing paradigms to ensure consistency and correctness of our data. Bonus Skills: Knowledge of real-time or stream processing systems (e.g. Kafka, Spark Streaming). Domain experience in energy, IoT, or working with unreliable/messy datasets. Frontend awareness—able to contribute to or reason about how data is visualized …
in R&D activities: evaluating new tools, improving system performance, and ensuring scalability. What We're Looking For: Knowledge of AWS data services: MSK (Kafka message broker), EMR (Spark/Flink), Glue, Kinesis, MWAA (Airflow); basic understanding of cloud environments (preferably AWS); basic experience with infrastructure-as-code tools (e.g., Terraform, AWS CDK, or CloudFormation); strong interest in … willingness to learn in a fast-paced environment. Nice to Have: Hands-on experience with Kubernetes (EKS or similar); AWS certification (Cloud Practitioner or Associate level); exposure to streaming frameworks (Flink, Spark Streaming); awareness of cloud security best practices. Why Join Us: Be part of a product-driven AdTech company making a global impact …
also requires experience in the following: experience in software development and management, specialized in Java/J2EE-based web application development; experience in Big Data processing systems using Spark, Python, PySpark and Databricks platforms; experience with data streaming solutions using Spark Streaming, Databricks Autoloader and Database Migration Service replication change logs; experience …
City of London, London, United Kingdom Hybrid / WFH Options
Atarus
across the team. 🧰 What You’ll Need: Strong experience leading data engineering teams in high-growth environments; deep expertise with real-time data processing tools (e.g. Kafka, Flink, Spark Streaming); solid hands-on knowledge of cloud platforms (AWS, GCP or Azure); strong proficiency in languages like Python, Java or Scala; familiarity with orchestration tools such as …
across the team 🧰 What You’ll Need Strong experience leading data engineering teams in high-growth environments Deep expertise with real-time data processing tools (e.g. Kafka, Flink, SparkStreaming) Solid hands-on knowledge of cloud platforms (AWS, GCP or Azure) Strong proficiency in languages like Python, Java or Scala Familiarity with orchestration tools such as More ❯
Skills & Experience: Strong programming skills (ideally Python), focusing on testable, maintainable code. Expertise in cloud services (ideally AWS and Databricks), emphasizing secure, scalable architectures. Experience with large-scale streaming data systems (e.g., Kafka, Spark Streaming), especially on Databricks. Proficiency with low-latency time-series databases (e.g., Apache Druid). Proven leadership in building and …
and understanding of current cyber security threats, actors and their techniques. Experience with data science, big data analytics technology stack, analytic development for endpoint and network security, and streaming technologies (e.g., Kafka, Spark Streaming, and Kinesis). Strong sense of ownership combined with a collaborative approach to overcoming challenges and influencing organizational change. Amazon is …
London, England, United Kingdom Hybrid / WFH Options
EMBS Technology
opportunity to be at the forefront of enterprise-scale AI adoption, working with AWS cloud-native event-driven architectures that will shape the future of AI-powered data streaming and automation. The Role: As an AWS Platform Engineer, you will play a key role in designing, deploying, and optimising event-driven architectures in an AWS cloud environment. You … will work alongside AI specialists, DevOps engineers, and data teams to build highly scalable, real-time data streaming solutions that drive AI-driven automation and decision-making. This is a client-facing role, requiring: exceptional communication skills to engage with both technical and non-technical stakeholders; a high-energy approach, working in a fast-paced, cutting-edge environment … remote work available for the remaining three days. Essential Skills and Experience: Strong expertise in AWS cloud services, with practical experience or theoretical knowledge of AWS MSK (Managed Streaming for Apache Kafka); deep understanding of Kafka (Apache Kafka, Confluent Kafka) and event-driven microservices architecture; proficiency in Infrastructure as Code tools (Terraform, AWS CloudFormation); experience with observability and …
recovery process/tools; experience in troubleshooting and problem resolution; experience in system integration. Knowledge of the following: Hadoop, Flume, Sqoop, MapReduce, Hive/Impala, HBase, Kafka, Spark Streaming; experience of ETL tools incorporating Big Data; shell scripting, Python. Beneficial Skills: Understanding of LAN, WAN, VPN and SD networks; hardware and cabling set-up experience …
and understanding of current cyber security threats, actors and their techniques. Experience with data science, big data analytics technology stack, analytic development for endpoint and network security, and streaming technologies (e.g., Kafka, Spark Streaming, and Kinesis). Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to …
GDPR, CCPA, SOC). Preferred Experience: Proficiency in SQL, data modeling, ETL processes, and modern data warehousing. Experience with functional programming and real-time stream processing (e.g., Flink, Spark Streaming, or similar). Demonstrated ability to handle production environments processing tens of thousands of events per second from diverse sources. This is an opportunity to join …