Watford, Hertfordshire, United Kingdom Hybrid / WFH Options
Digital Gaming Corp
data from sources like Facebook, Google Analytics, and payment providers. Using tools like AWS Redshift, S3, and Kafka, you'll optimize data models for batch and real-time processing. Collaborating with stakeholders, you'll deliver actionable insights on player behavior and gaming analytics, enhancing experiences and driving revenue with … robust ETL pipelines to integrate data from diverse sources, including APIs like Facebook, Google Analytics, and payment providers. Develop and optimize data models for batch processing and real-time streaming using tools like AWS Redshift, S3, and Kafka. Lead efforts in acquiring, storing, processing, and provisioning data More ❯
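As an illustration of the kind of ETL work this listing describes, here is a minimal, hedged sketch that pulls events from a JSON API, archives the raw payload to S3, and fans individual records out to a Kafka topic. The endpoint, bucket, and topic names are placeholders invented for the example, not details from the role.

```python
# Hedged sketch only: placeholder endpoint, bucket, and topic names.
import json

import boto3
import requests
from kafka import KafkaProducer

API_URL = "https://example.com/analytics/events"  # placeholder source API
RAW_BUCKET = "example-raw-events"                 # placeholder S3 bucket
TOPIC = "player-events"                           # placeholder Kafka topic

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
s3 = boto3.client("s3")


def ingest_batch() -> None:
    """Pull one page of events, archive the raw payload, stream each record."""
    events = requests.get(API_URL, timeout=30).json()
    s3.put_object(Bucket=RAW_BUCKET, Key="raw/events.json", Body=json.dumps(events))
    for event in events:
        producer.send(TOPIC, value=event)
    producer.flush()


if __name__ == "__main__":
    ingest_batch()
```

In practice a pipeline of this shape would add pagination, retries, and partitioned S3 keys before loading curated output into Redshift.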
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Smart DCC
the data environment. What will you be doing? Design and implement efficient ETL processes for data extraction, transformation, and loading. Build real-time data processing pipelines using platforms like Apache Kafka or cloud-native tools. Optimize batch processing workflows with tools like Apache Spark and Flink for … What are we looking for? Advanced proficiency with databases (SQL Server, Oracle, MySQL, PostgreSQL). Expertise in building and managing data pipelines and data processing workflows. Strong understanding of data warehousing concepts, schema design, and data modelling. Hands-on experience with cloud platforms (AWS, Azure, Google Cloud) for scalable More ❯
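To make the batch side of this listing concrete, below is a small PySpark sketch of a batch aggregation job; the S3 paths and the meter-reading column names are assumptions for illustration only.

```python
# Hedged sketch only: paths and column names are invented for the example.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-usage-batch").getOrCreate()

# Read raw extracts, aggregate per meter per day, write a curated table.
raw = spark.read.parquet("s3a://example-raw/usage/")  # placeholder input
daily = (
    raw.groupBy("meter_id", F.to_date("reading_ts").alias("reading_date"))
       .agg(F.sum("kwh").alias("total_kwh"))
)
daily.write.mode("overwrite").parquet("s3a://example-curated/daily_usage/")  # placeholder output
spark.stop()
```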
and implement streaming data pipelines using AWS EMR and PySpark to generate real-time (fast-moving) features for the feature store. Develop and maintain batch processing pipelines using DBT and BigQuery to generate batch (slow-moving) features, ensuring data quality, consistency and reliability. Work with Feast feature … recruiting and related purposes. Our Privacy Notice explains what personal information we will process, where we will process your personal information, its purposes for processing your personal information, and the rights you can exercise over our use of your personal information. More ❯
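A hedged sketch of what a "fast-moving" feature pipeline of this shape might look like with Spark Structured Streaming reading from Kafka; the topic, schema, and window length are assumptions, and a real pipeline would push the computed feature to the Feast online store rather than the console.

```python
# Hedged sketch only: assumes the Spark-Kafka connector is on the classpath;
# topic, schema, and window size are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("fast-features").getOrCreate()

schema = StructType([
    StructField("entity_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
    .option("subscribe", "events")                        # placeholder topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# A typical fast-moving feature: event count per entity over a 5-minute window.
features = (
    events.withWatermark("event_ts", "10 minutes")
          .groupBy(F.window("event_ts", "5 minutes"), "entity_id")
          .agg(F.count("*").alias("events_5m"))
)

# Console sink for the sketch; a real job would write to the online store.
features.writeStream.outputMode("update").format("console").start().awaitTermination()
```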
in Java, Spark, Scala (or Java). Production-scale hands-on experience writing data pipelines using Spark or any other distributed real-time/batch processing framework. Strong skill set in SQL/databases. Strong understanding of messaging technologies like Kafka, Solace, MQ, etc. Writing production-scale applications to use More ❯
Pydantic). Scalable Infrastructure: Design, implement, and optimise cloud-based solutions for AI workloads. AI Workload Optimisation: Architect and scale compute infrastructure for inference, batch processing, and real-time AI interactions. Event-Driven Architecture: Develop and maintain real-time, event-driven systems (SNS/SQS, Kafka, Redis Streams … resilience. Problem-Solving: Proven ability to debug and optimise distributed systems. Nice to Have AI Scaling: Experience running AI/ML workloads in production (batch vs. real-time inference, GPU optimisation). Vector Databases: Familiarity with vector search and retrieval systems. Kubernetes: Hands-on experience managing and deploying Kubernetes More ❯
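For the event-driven side of this listing, a minimal sketch of a worker that long-polls an SQS queue (one of the transports the listing names) and hands each message to a stand-in handler; the queue URL and payload shape are hypothetical.

```python
# Hedged sketch only: queue URL and payload shape are hypothetical.
import json

import boto3

QUEUE_URL = "https://sqs.eu-west-1.amazonaws.com/123456789012/inference-jobs"  # placeholder
sqs = boto3.client("sqs")


def handle(event: dict) -> None:
    """Stand-in for dispatching one inference or batch job."""
    print("processing", event.get("job_id"))


def poll_forever() -> None:
    """Long-poll the queue, process each message, then delete it."""
    while True:
        resp = sqs.receive_message(
            QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20
        )
        for msg in resp.get("Messages", []):
            handle(json.loads(msg["Body"]))
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])


if __name__ == "__main__":
    poll_forever()
```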
and debugging code-related performance issues, SQL tuning, etc. Experience with AWS services such as S3, RDS, Aurora, NoSQL, MSK (Kafka). Experience with batch processing/ETL using Glue Jobs, AWS Lambda, and Step Functions. Experience with designing bespoke, tailored front-end solutions (GUI-based) using open More ❯
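As a small illustration of the Lambda/Step Functions pattern this listing mentions, a hypothetical Lambda step that promotes one staged S3 object to a curated prefix, with the bucket and keys supplied by the state machine input (all names are placeholders).

```python
# Hedged sketch only: bucket and key names come from the (hypothetical)
# state machine input; this is not any real project's handler.
import boto3

s3 = boto3.client("s3")


def handler(event, context):
    """One Step Functions task: promote a staged object to the curated prefix."""
    bucket = event["bucket"]
    source_key = event["source_key"]
    target_key = "curated/" + source_key.split("/")[-1]
    s3.copy_object(
        Bucket=bucket,
        CopySource={"Bucket": bucket, "Key": source_key},
        Key=target_key,
    )
    return {"status": "ok", "target_key": target_key}
```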
London Based Investment Bank You will: Design and build foundation components that will underpin our data mesh ecosystem. Build enterprise-class real-time and batch solutions that support mission-critical processes. Build solutions in line with our Digital Principles. Partner with our Product team(s) to create sustainable and … hands-on engineering in large-scale, complex enterprise(s), ideally in the banking/financial industry. Worked with modern tech - data streaming, real-time & batch processing and compute clusters. Working knowledge of relational and NoSQL databases, designing and implementing scalable solutions. Experience of working in a continuous architecture environment More ❯
business objectives. Manage the technical product backlog, prioritising system enhancements and platform stability. Work closely with engineering and architecture teams to ensure API integrations, batch processing, and data flow across platforms. Lead and oversee broker and partner onboarding, ensuring seamless integration (direct, API, or via Software House solutions More ❯
Hampton Magna, Warwickshire, UK Hybrid / WFH Options
Telent
of level 3 technical incident resolution along with the delivery of continuous improvements. What you’ll do: Day-to-day monitoring of integration and batch processes, remediating errors swiftly to mitigate business impact. Day-to-day technical administration of corporate applications inc. Oracle & MS SQL More ❯
Newcastle Upon Tyne, Tyne And Wear, United Kingdom Hybrid / WFH Options
Accenture
Knowledge of database technologies such as PostgreSQL or other relational databases. Knowledge of microservices development and various integration patterns. Knowledge of event-driven development and batch processing. Experience in application development across full-stack technologies, including integrations with Power Platform, messaging services, RabbitMQ, APIs, Kafka live streaming or More ❯
Experience with deep learning frameworks like TensorFlow or PyTorch. Knowledge of ML model deployment options (e.g., Azure Functions, FastAPI, Kubernetes) for real-time and batch processing. Experience with CI/CD pipelines (e.g., DevOps pipelines, GitHub Actions). Knowledge of infrastructure as code (e.g., Terraform, ARM Templates, Databricks Asset … Bundles). Understanding of advanced machine learning techniques, including graph-based processing, computer vision, natural language processing, and simulation modeling. Experience with generative AI and LLMs, such as LlamaIndex and LangChain. Understanding of MLOps or LLMOps. Familiarity with Agile methodologies, preferably Scrum. We are actively seeking candidates for More ❯
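To illustrate the real-time deployment option this listing names, a minimal FastAPI scoring endpoint; the model file and feature names are placeholders, and the app would be served with uvicorn.

```python
# Hedged sketch only: model file and feature names are placeholders.
# Run with: uvicorn scoring_app:app --reload
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.pkl")  # any scikit-learn-style estimator


class Features(BaseModel):
    feature_a: float
    feature_b: float


@app.post("/predict")
def predict(features: Features) -> dict:
    """Score one observation in real time."""
    score = model.predict([[features.feature_a, features.feature_b]])[0]
    return {"score": float(score)}
```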