Croydon, Surrey, England, United Kingdom Hybrid/Remote Options
eTeam Inc
Job Title: Senior Engineer with Node.js and Python, or Back End (Java and ideally Kafka)
Max rate: £537/Day on Umbrella
Location: Croydon (Hybrid)
Clearance required: SC Transfer (Active SC only)
Contract Duration: 5 months
Experience of Java. Experience of Spring framework or equivalent. Knowledge of software design patterns and when to apply them. Excellent knowledge of development
/Kubernetes and CI/CD. Proven track record optimising apps for performance, memory, and scalability. SQL/NoSQL database experience, including deployment and integration. Knowledge of messaging systems (Kafka, RabbitMQ, Pub/Sub). Excellent communication and analytical skills. Nice to have: Telecom-specific protocols (SMPP, SIP), OSS/BSS integrations, or network APIs. Event-driven systems, CQRS
It would be a plus if you also possess previous experience in: TDD/ATDD/BDD, DDD, Pair/Ensemble Programming; Wildfly, Oracle SQL, AWS, Docker/Kubernetes, Kafka, Jenkins; GWT, Jest, Vite, Cypress, Playwright, eslint, esbuild, webpack, web components
with data security, privacy, and compliance frameworks ● Exposure to machine learning pipelines, MLOps, or AI-driven data products ● Experience with big data platforms and technologies such as EMR, Databricks, Kafka, Spark ● Exposure to AI/ML concepts and collaboration with data science or AI teams ● Experience integrating data solutions with AI/ML platforms or supporting AI-driven analytics
A culture of engineering excellence driven by mentoring and high-quality practices. Preferred Experience: Databricks in a SaaS environment, Spark, Python, and database technologies. Event-driven and distributed systems (Kafka, AWS SNS/SQS, Java, Python). Data Governance, Data Lakehouse/Data Intelligence platforms. AI software delivery and AI data preparation.
Richmond, Surrey, South East, United Kingdom Hybrid/Remote Options
Client Server
stack: data pipeline orchestration tools (e.g. Airflow, Prefect, Dagster), cloud data platforms (e.g. AWS, S3, Glue, Athena, Redshift, Kinesis), data warehouse concepts and dimensional modelling, data streaming tools (e.g. Kafka, Kinesis). You're collaborative with great communication skills. What's in it for you: Salary to £50k. Remote working (including abroad). Paid for training and certifications. Home office budget
Reigate, England, United Kingdom Hybrid/Remote Options
esure Group
deployment. Ensure security and governance through robust access controls, including RBAC, SSO, token policies, and pseudonymisation frameworks. Develop resilient data flows for both batch and streaming workloads using technologies such as Kafka, Airflow, DBT, and Terraform. Shape data strategy and standards by contributing to architectural decisions, authoring ADRs, and participating in reviews, data councils, and platform enablement initiatives. Qualifications: What we'd love you … and regulatory reporting requirements. Direct exposure to cloud-native data infrastructures (Databricks, Snowflake), especially in AWS environments, is a plus. Experience in building and maintaining batch and streaming data pipelines using Kafka, Airflow, or Spark. Familiarity with governance frameworks, access controls (RBAC), and implementation of pseudonymisation and retention policies. Exposure to enabling GenAI and ML workloads by preparing model-ready and vector-optimised