Bracknell, Berkshire, United Kingdom Hybrid / WFH Options
Techex
… troubleshooting
- Experience of public cloud platform architecture/design
- CCNP or higher, or an equivalent non-Cisco qualification (Routing & Switching or Data Centre/SDN)
- Experience with any of Influx, Redis, Kafka, Grafana or Kibana
Our Values and Benefits: We have held Great Place to Work accreditation for the past two years, and we seek out individuals who enjoy developing their professional …
Milton Keynes, Buckinghamshire, UK Hybrid / WFH Options
Algo Capital Group
- Participate in an on-call rotation to ensure 24/7 coverage of global trading systems
- Use the latest technology stacks, such as AWS, Java 17, Python 3, HDF5, Kubernetes, Kafka and Argo
Candidate Requirements
- Years of experience in financial technology support, preferably in electronic trading
- Strong understanding of market data structures and exchange connectivity
- Linux/Unix system administration skills … or similar languages
- Knowledge of monitoring tools and alerting frameworks
- SQL experience, including queries/updates/table creation/basic database maintenance
- Exposure to data technologies such as Kafka, Spark or Delta Lake is useful but not mandatory
- Bachelor's degree in Computer Science, Engineering, or related technical field
This role offers competitive compensation and the opportunity to …
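For context on the stack above, here is a minimal sketch of tailing a Kafka topic from Python, the kind of task the monitoring and support duties imply. The broker address, topic name and consumer group are illustrative assumptions, not details from the listing.

```python
from confluent_kafka import Consumer

conf = {
    "bootstrap.servers": "localhost:9092",  # assumed broker address
    "group.id": "trade-support-monitor",    # hypothetical consumer group
    "auto.offset.reset": "earliest",
}

consumer = Consumer(conf)
consumer.subscribe(["trading.fills"])  # hypothetical topic name

try:
    while True:
        msg = consumer.poll(timeout=1.0)  # block up to 1s for a message
        if msg is None:
            continue
        if msg.error():
            # Log and keep consuming; a real support tool would alert here
            print(f"Consumer error: {msg.error()}")
            continue
        print(f"{msg.topic()}[{msg.partition()}]@{msg.offset()}: "
              f"{msg.value().decode('utf-8')}")
finally:
    consumer.close()
```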
Maidenhead, Berkshire, United Kingdom Hybrid / WFH Options
Wireless Logic Group
… opportunity to leave your mark on something transformative.
- Design, develop and maintain user interfaces and web applications (Angular, React)
- Create and maintain RESTful APIs and contribute to event-driven architectures (Node, Kafka)
- Implement authentication, authorization and security measures
- Translate UX designs or wireframes into efficient, reusable code
- Actively participate in CI/CD strategy with the wider team and colleagues
- Propose and …
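The role pairs REST APIs with event-driven architectures on Node and Kafka. As a language-neutral illustration of the pattern (sketched in Python with confluent-kafka, to keep one language across this section), a request handler might publish a domain event after it succeeds; the topic, event shape and broker address below are hypothetical.

```python
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})  # assumed broker

def delivery_report(err, msg):
    # Called once per message to confirm (or report failed) delivery
    if err is not None:
        print(f"Delivery failed: {err}")

def emit_user_registered(user_id: str, email: str) -> None:
    """Publish a hypothetical domain event after a REST handler succeeds."""
    event = {"type": "user.registered", "user_id": user_id, "email": email}
    producer.produce(
        "user-events",  # hypothetical topic
        key=user_id,
        value=json.dumps(event),
        callback=delivery_report,
    )
    producer.flush()  # simple sketch; real services batch rather than flush per event

emit_user_registered("42", "jane@example.com")
```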
… teams to ensure adherence to data privacy regulations (e.g., GDPR) and internal governance standards.
- Lead evaluation and integration of data tools, platforms, and technologies (e.g., Snowflake, Databricks, Azure Synapse, Kafka, dbt, Power BI).
- Oversee data integration strategy across the enterprise, including ETL/ELT pipelines, APIs, and event-driven data streaming.
- Contribute to the development of a Data …
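To make the ETL/ELT distinction above concrete, here is a toy ELT sketch: raw records are loaded first, then transformed inside the warehouse with SQL. It uses sqlite3 as a stand-in warehouse, and every table and field name is hypothetical.

```python
import json
import sqlite3

raw_events = [  # pretend this arrived from an API or event stream
    {"order_id": 1, "amount": "19.99", "country": "GB"},
    {"order_id": 2, "amount": "5.00", "country": "DE"},
]

con = sqlite3.connect(":memory:")

# "L": load the raw payloads untouched
con.execute("CREATE TABLE raw_orders (payload TEXT)")
con.executemany(
    "INSERT INTO raw_orders VALUES (?)",
    [(json.dumps(e),) for e in raw_events],
)

# "T": transform in-warehouse with SQL (assumes an SQLite build with the
# JSON1 functions, the default in modern Python)
con.execute("""
    CREATE TABLE orders AS
    SELECT json_extract(payload, '$.order_id') AS order_id,
           CAST(json_extract(payload, '$.amount') AS REAL) AS amount,
           json_extract(payload, '$.country') AS country
    FROM raw_orders
""")
print(con.execute("SELECT * FROM orders").fetchall())
```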
… roll-up-your-sleeves, get-into-the-terminal opportunity. We're looking for someone who enjoys writing Terraform as much as discussing architecture, and who's comfortable setting up Kafka or Airflow when needed. You'll architect, build, and deploy core services (embedding pipelines, image-to-nutrient models, ML workflows, and real-time APIs), then scale the team around …
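For a flavour of the "setting up Airflow" side of this role, a minimal Airflow DAG might look like the sketch below; the DAG id, schedule and task bodies are placeholders, not details from the ad.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull embedding source data")  # placeholder step

def embed():
    print("run embedding model")         # placeholder step

with DAG(
    dag_id="embedding_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    embed_task = PythonOperator(task_id="embed", python_callable=embed)
    extract_task >> embed_task  # extract runs before embed
```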
… of the industry's brightest minds. Technical skills required: C#, .NET, Python, Blazor, REST APIs, gRPC, GraphQL, MS SQL Server. Any experience of the following will be useful: Docker, Kafka, Sentry, Grafana. Degree education is preferred, but we will consider Software Engineers with the right level of technical ability for the role. We are looking for a C# Software …
- Strong background in data architecture, including data modeling, warehousing, real-time and batch processing, and big data frameworks.
- Proficiency with modern data tools and technologies such as Spark, Databricks, Kafka, or Snowflake (bonus).
- Knowledge of cloud security, networking, and cost optimization as it relates to data platforms.
- Experience in total cost of ownership estimation and managing its impact …
Skills (Bonus Points):
- Proven experience in recommender systems, behavioural AI, and/or reinforcement learning.
- Building data pipelines (real-time or batch) and ensuring data quality using a modern toolchain (e.g., Apache Spark, Kafka, Airflow, dbt).
- PhD in Computer Science, Machine Learning, or a closely related field.
What We Offer:
- Opportunity to build technology that will transform millions of shopping experiences
- Real …
… in a commercial setting.
- MSc in Computer Science, Machine Learning, or a related field.
- Experience building data pipelines (real-time or batch) and ensuring data quality using a modern toolchain (e.g., Apache Spark, Kafka, Airflow, dbt).
- Strong foundational knowledge of machine learning and deep learning algorithms, including deep neural networks, supervised/unsupervised learning, predictive analysis, and forecasting.
- Expert-level proficiency in …
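Both of the listings above pair pipeline-building on Spark with a data-quality step; a minimal PySpark batch sketch of that shape follows. The input path, column names and quality rule are illustrative assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-batch").getOrCreate()

events = spark.read.json("s3://bucket/raw/events/")  # hypothetical path

# Crude data-quality gate: drop rows missing required fields, report the loss
required = events.dropna(subset=["user_id", "event_type"])
dropped = events.count() - required.count()
print(f"dropped {dropped} malformed rows")

# Batch aggregation: daily counts per event type
daily = (
    required
    .groupBy("event_type", F.to_date("ts").alias("day"))
    .count()
)
daily.write.mode("overwrite").parquet("s3://bucket/curated/daily_counts/")
```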
… such as fraud detection, network analysis, and knowledge graphs.
- Optimize performance of graph queries and design for scalability.
- Support ingestion of large-scale datasets into GCP environments using Apache Beam, Spark, or Kafka.
- Implement metadata management, security, and data governance using Data Catalog and IAM.
- Work across functional teams and clients in diverse EMEA time zones and project settings. …
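The ingestion bullet above maps naturally onto an Apache Beam pipeline; a minimal Python sketch is below. The bucket paths and the assumed src/dst fields on each record are hypothetical.

```python
import json

import apache_beam as beam

def parse(line: str) -> dict:
    """Turn a raw JSON line into a record with an explicit edge tuple."""
    record = json.loads(line)
    record["edge"] = (record["src"], record["dst"])  # assumed graph fields
    return record

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/edges/*.json")
        | "Parse" >> beam.Map(parse)
        | "Format" >> beam.Map(json.dumps)
        | "Write" >> beam.io.WriteToText("gs://example-bucket/staged/edges")
    )
```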