Senior Software Engineer to join their team in London on a full-time basis. What You’ll Do: Architect and implement high-performance data processing systems in Rust. Leverage Apache Arrow and Parquet for in-memory and on-disk data efficiency. Integrate and extend systems like DataFusion, ClickHouse, and DuckDB. Design low-latency pipelines for analytical workloads. Collaborate with …
… code. Desired Skills (Bonus Points): Proven experience in recommender systems, behavioural AI, and/or reinforcement learning. Building data pipelines (real-time or batch) and data quality using a modern toolchain (e.g., Apache Spark, Kafka, Airflow, dbt). PhD in Computer Science, Machine Learning, or a closely related field. What We Offer: Opportunity to build technology that will transform millions of shopping …
… code. Experience working on distributed systems. Strong knowledge of Kubernetes and Kafka. Experience with Git and deployment pipelines. Having worked with at least one of the following stacks: Hadoop, Apache Spark, Presto. Experience profiling performance issues in database systems. Ability to learn and/or adapt quickly to complex issues. Happy to collaborate with a wide group of stakeholders …
… science solutions in a commercial setting. MSc in Computer Science, Machine Learning, or a related field. Experience building data pipelines (real-time or batch) and data quality using a modern toolchain (e.g., Apache Spark, Kafka, Airflow, dbt). Strong foundational knowledge of machine learning and deep learning algorithms, including deep neural networks, supervised/unsupervised learning, predictive analysis, and forecasting. Expert-level …
Familiar with observability tools, logging frameworks, and performance monitoring. Desirable Skills: Background in serverless technologies (e.g. Lambda, Step Functions, API Gateway). Experience with data tools like EMR, Glue, or Apache Spark. Understanding of event-driven architecture (EventBridge, SNS, SQS). Knowledge of AWS database offerings including DynamoDB and RDS. Familiarity with multi-region deployments and failover strategies. AWS certifications (Solutions …
… Analysts with efficient and performant SQL. Performance optimisation of data ingestion and query performance for MSSQL (or transferable skills from another RDBMS). Familiar with data processing frameworks such as Apache Spark. Experience of working with terabyte data sets and managing rapid data growth. The benefits at APF: At AllPoints Fibre, we're all about looking after you. We offer …
… data modeling, and database management. Strong understanding of cloud-based data solutions. Experience with the tools we work with, or other equivalent ones (dbt, BigQuery, Stitch, Segment, Fivetran, Metabase, Apache Airflow). Education: A bachelor’s degree in a relevant field such as Computer Science, Information Technology, Data Science, Engineering (e.g., Computer Science Engineering), Mathematics, or Statistics. *** Only successful …
Reading, England, United Kingdom Hybrid / WFH Options
Areti Group | B Corp™
… support for Data Analysts with efficient and performant queries. • Skilled in optimizing data ingestion and query performance for MSSQL or other RDBMS. • Familiar with data processing frameworks such as Apache Spark. • Highly analytical and tenacious in solving complex problems.
… software. Strong programming skills in Rust, C, or C++. Solid understanding of operating systems, file systems, and storage internals. Experience working with modern data formats or infrastructure tools (e.g., Apache Arrow, Parquet, DuckDB, ClickHouse). A passion for infrastructure and performance problems. Willingness and ability to work on-site 5 days/week in London or New York City.
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
… candidates based anywhere in Europe or the UK. Design and build a cloud-native analytical database system optimised for sparse, wide, and multimodal logs. Work with technologies like Rust, Apache Arrow, and DataFusion to build scalable, low-latency systems. Collaborate on every part of the stack, from data ingestion and indexing to query planning and optimisation. Build observability, reliability …
slough, south east england, united kingdom Hybrid / WFH Options
Singular Recruitment
… applications and high-proficiency SQL for complex querying and performance tuning. ETL/ELT Pipelines: Proven experience designing, building, and maintaining production-grade data pipelines using Google Cloud Dataflow (Apache Beam) or similar technologies. GCP Stack: Hands-on expertise with BigQuery, Cloud Storage, Pub/Sub, and orchestrating workflows with Composer or Vertex Pipelines. Data Architecture & Modelling: Ability to …
slough, south east england, united kingdom Hybrid / WFH Options
twentyAI
… agile environment to deliver data solutions that support key firm initiatives. Build scalable and efficient batch and streaming data workflows within the Azure ecosystem. Apply distributed processing techniques using Apache Spark to handle large datasets effectively. Help drive improvements in data quality, implementing validation, cleansing, and monitoring frameworks. Contribute to the firm’s efforts around data security, governance, and …
Bracknell, England, United Kingdom Hybrid / WFH Options
BlckBx
… Senior Developer role). SKILLS & KNOWLEDGE: Experience with modern front-end frameworks: React, Vue.js. Knowledge of database technologies: MySQL, Airtable. Knowledge of AWS or other cloud services. Exposure to Apache, Nginx, Linux (bonus). Familiarity with Git and version control. Strong problem-solving skills and attention to detail. Excellent communication and teamwork skills. Knowledge of AI/ML (advantageous …
… Engineer, £85,000, Remote (travel into London a few times a year). My client is looking for an individual with practical working experience in the following: Python (PySpark), Apache Spark, Kafka (event-driven architecture). **Please note, my client is only looking for candidates based within the UK.** Get in touch now.
… high-level system analysis and design abilities. Strong hands-on experience with React.js, AngularJS, Node.js, TypeScript, HTML5, CSS. Familiar with deploying web applications on web servers like NGINX or Apache. Experience with agile/scrum methodology, including tools like Jira. Experience using design tools such as Figma, Axure, or Sketch. Experience with Kubernetes/OpenShift deployment is a …
Reading, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Product Specialist, Geospatial Solutions. Client: Idox plc. Location: Reading, United Kingdom. Job Category: Other. EU work permit required: Yes. Posted: 10.06.2025.
slough, south east england, united kingdom Hybrid / WFH Options
La Fosse
… production environment. Proficiency in TCP/IP, DNS management, and troubleshooting network issues. Experience with BGP/OSPF and Juniper technologies is highly desirable. Web tech (Nginx/Apache), containerisation (Docker), cloud (AWS a bonus), and basic DB admin (MySQL or Galera is ideal). Bonus: any experience in audio streaming or media platforms. Set-up & culture: Team of …
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
… statistical modeling. Strong hands-on experience with ML frameworks (PyTorch, TensorFlow, Keras). Proficiency in Python and C/C++. Experience with scalable data tools (e.g., PySpark, Kubernetes, Databricks, Apache Arrow). Proven ability to manage GPU-intensive data processing jobs. 4+ years of applied research or industry experience. Creative problem-solver with a bias for action and a …
… for Financial Services, Manufacturing, Life Sciences and Healthcare, Technology and Services, Telecom and Media, Retail and CPG, and Public Services. Consolidated revenues of $13+ billion. Location: London. Skill: Apache Hadoop. We are looking for open-source contributors to Apache projects who have an in-depth understanding of the code behind the Apache ecosystem, should have experience … possess in-depth knowledge of the big-data tech stack. Requirements: Experience of platform engineering along with application engineering (hands-on). Experience in the design of an open-source platform based on the Apache framework for Hadoop. Experience integrating Infra-as-Code in their platform (bespoke implementation from scratch). Experience of design and architecture work for the open-source Apache platform in a hybrid cloud environment. Ability to debug and fix code in the open-source Apache codebase, and should be an individual contributor to open-source projects. Job description: The Apache Hadoop project requires up to 3 individuals with experience in designing and building platforms, and supporting applications both in cloud environments and on-premises. These resources are expected …
… cloud services. We serve clients in various sectors including Financial Services, Manufacturing, Life Sciences, Healthcare, Technology, Telecom, Retail, and Public Services. Our revenue exceeds $13 billion. Location: London. Skills: Apache Hadoop. We seek open-source contributors to Apache projects who have a deep understanding of the Apache ecosystem, experience with Cloudera or similar distributions, and extensive knowledge of … big data technologies. Requirements: Platform engineering and application engineering experience (hands-on). Design experience of open-source platforms based on Apache Hadoop. Experience integrating Infra-as-Code in platforms (from scratch). Design and architecture experience for Apache platforms in hybrid cloud environments. Ability to debug and fix code within the Apache ecosystem; individual contribution to open source … open-source contributors, capable of troubleshooting complex issues, and supporting the migration and debugging of critical applications like RiskFinder. They must be experts in Big Data platform development using Apache Hadoop and supporting Hadoop implementations in various environments.