London, England, United Kingdom Hybrid / WFH Options
Derisk360
in Neo4j such as fraud detection, knowledge graphs, and network analysis. Optimize graph database performance, ensure query scalability, and maintain system efficiency. Manage ingestion of large-scale datasets into GCP environments using Apache Beam, Spark, or Kafka. Implement metadata management, security, and data governance using Data Catalog and IAM. Collaborate with cross-functional teams and clients across diverse …
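Purely as an illustration of the graph workloads mentioned above (fraud detection, network analysis), a minimal sketch using the official neo4j Python driver; the connection details, node labels, and property names are hypothetical, not taken from the posting:

```python
# Minimal sketch: flag accounts that share a device with a known-fraudulent account.
# Connection details, labels, and property names are hypothetical placeholders.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

CYPHER = """
MATCH (flagged:Account {is_fraud: true})-[:USED]->(d:Device)<-[:USED]-(a:Account)
WHERE a <> flagged
RETURN a.id AS suspect_account, count(DISTINCT d) AS shared_devices
ORDER BY shared_devices DESC
LIMIT 25
"""

with driver.session() as session:
    for record in session.run(CYPHER):
        print(record["suspect_account"], record["shared_devices"])

driver.close()
```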
Bristol, England, United Kingdom Hybrid / WFH Options
Lloyds Bank plc
relational and non-relational databases to build data solutions, such as SQL Server/Oracle, experience with relational and dimensional data structures. Experience in using distributed frameworks (Spark, Flink, Beam, Hadoop). Proficiency in infrastructure as code (IaC) using Terraform. Experience with CI/CD pipelines and related tools/frameworks. Containerisation: Good knowledge of containers (Docker, Kubernetes) … AWS, or Azure. Good understanding of cloud storage, networking, and resource provisioning. It would be great if you had... Certification in GCP “Professional Data Engineer”. Certification in Apache Kafka (CCDAK). Proficiency across the data lifecycle. Working for us: Our focus is to ensure we are inclusive every day, building an organisation that reflects modern society and …
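As a rough sketch of the distributed-framework experience listed above, a minimal PySpark job; the storage paths and column names are hypothetical placeholders:

```python
# Minimal PySpark sketch: read a dataset, build a dimensional-style daily aggregate, write it back.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-order-totals").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/orders/")

daily_totals = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "customer_id")
    .agg(F.sum("amount").alias("total_amount"))
)

daily_totals.write.mode("overwrite").parquet("s3://example-bucket/daily_totals/")
spark.stop()
```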
City of London, London, United Kingdom Hybrid / WFH Options
Singular Recruitment
applications and high proficiency in SQL for complex querying and performance tuning. ETL/ELT Pipelines: Proven experience designing, building, and maintaining production-grade data pipelines using Google Cloud Dataflow (Apache Beam) or similar technologies. GCP Stack: Hands-on expertise with BigQuery, Cloud Storage, Pub/Sub, and orchestrating workflows with Composer or Vertex Pipelines. Data Architecture & Modelling: Ability …
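As an illustration of the kind of Dataflow pipeline this role describes, a minimal Apache Beam (Python) sketch that streams Pub/Sub messages into BigQuery; the project, subscription, table, and schema names are hypothetical placeholders, not taken from the posting:

```python
# Minimal sketch: Pub/Sub -> parse JSON -> BigQuery, runnable on the Dataflow runner.
# Project, subscription, and table identifiers are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            subscription="projects/example-project/subscriptions/events-sub")
        | "ParseJson" >> beam.Map(json.loads)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```

The same pipeline runs locally with the DirectRunner, or on Dataflow by passing `--runner=DataflowRunner` plus the usual project, region, and staging options.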
London, England, United Kingdom Hybrid / WFH Options
HR Ways - Hiring Tech Talent
ML/DL libraries like TensorFlow, PyTorch, or JAX. Knowledge of data analytics concepts, including data warehouse technical architectures, ETL and reporting/analytic tools and environments (such as Apache Beam, Hadoop, Spark, Pig, Hive, MapReduce, Flume). Customer-facing experience of discovery, assessment, execution, and operations. Demonstrated excellent communication, presentation, and problem-solving skills. Experience in project …
Leeds, England, United Kingdom Hybrid / WFH Options
Axiom Software Solutions Limited
Kafka: focus on disaster recovery aspects, knowledge of Kafka resiliency and newer features such as KRaft, and experience with real-time technologies such as Spark. Required Skills & Experience: Extensive experience with Apache Kafka and real-time architecture, including event-driven frameworks. Strong knowledge of Kafka Streams, Kafka Connect, Spark Streaming, Schema Registry, Flink, and Beam. Experience with cloud platforms such as …
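As a minimal sketch of the event-driven consumption this role centres on, using the confluent-kafka Python client; the broker address, group id, and topic name are hypothetical placeholders:

```python
# Minimal sketch of an event-driven Kafka consumer (confluent-kafka client).
# Broker, group id, and topic names are hypothetical placeholders.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "broker1:9092",
    "group.id": "payments-consumer",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["payments.events"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            # A production consumer would route errors to monitoring rather than stdout.
            print(f"Consumer error: {msg.error()}")
            continue
        print(f"{msg.topic()}[{msg.partition()}] @ {msg.offset()}: {msg.value().decode('utf-8')}")
finally:
    consumer.close()
```

The resiliency concerns the posting mentions (disaster recovery, KRaft-based clusters) sit mostly in cluster configuration and replication settings rather than in consumer code.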
London, England, United Kingdom Hybrid / WFH Options
So Energy
machine learning purposes. Expertise in the design of data solutions for BigQuery. Expertise in logical and physical data modelling. Hands-on experience using Google Dataflow, GCS, Cloud Functions, BigQuery, Dataproc, and Apache Beam (Python) to design data transformation rules for batch and streaming data. Solid Python programming skills, including Apache Beam (Python). Structure of CI/…
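To illustrate the physical data-modelling side of BigQuery design work, a minimal sketch using the google-cloud-bigquery client; project, dataset, table, and field names are hypothetical placeholders:

```python
# Minimal sketch: define a partitioned, clustered BigQuery table from Python.
# Project, dataset, table, and field names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

table = bigquery.Table(
    "example-project.energy.meter_readings",
    schema=[
        bigquery.SchemaField("meter_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("reading_ts", "TIMESTAMP", mode="REQUIRED"),
        bigquery.SchemaField("consumption_kwh", "FLOAT"),
    ],
)
# Partition by day on the reading timestamp and cluster by meter for cheaper, faster scans.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="reading_ts"
)
table.clustering_fields = ["meter_id"]

client.create_table(table, exists_ok=True)
```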
London, England, United Kingdom Hybrid / WFH Options
Starling Bank
primarily GCP. Experience with some or all of the services below would put you at the top of our list:
- Google Cloud Storage
- Google Data Transfer Service
- Google Dataflow (Apache Beam)
- Google Pub/Sub
- Google Cloud Run
- BigQuery or any RDBMS
- Python
- Debezium/Kafka
- dbt (data build tool)
Interview process: Interviewing is a two-way process, and …
London, England, United Kingdom Hybrid / WFH Options
Scope3
and GraphQL APIs; React w/ Next.js for frontend applications; low-latency, high-throughput Golang API; BigQuery data warehouse; Airflow for batch orchestration; Temporal for event orchestration; Apache Beam (Dataflow runner) for some batch jobs. Most transformations are performed via SQL directly in BigQuery. The Role: We are excited to add a Lead Engineer to …
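Since batch orchestration runs in Airflow and most transformations are plain SQL executed in BigQuery, a minimal sketch of how such a job might be wired; the DAG id, SQL, and table names are hypothetical placeholders, not taken from the actual codebase:

```python
# Minimal Airflow sketch: a daily DAG that runs a SQL transformation directly in BigQuery.
# DAG id, SQL, and table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_rollup",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_daily_totals",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE analytics.daily_totals AS
                    SELECT campaign_id, DATE(event_ts) AS day, COUNT(*) AS events
                    FROM analytics.events
                    GROUP BY campaign_id, day
                """,
                "useLegacySql": False,
            }
        },
    )
```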
London, England, United Kingdom Hybrid / WFH Options
Spotify
constructive feedback to foster accountability, growth, and collaboration within the team. Who You Are: Experienced with Data Processing Frameworks: Skilled with higher-level JVM-based frameworks such as Flink, Beam, Dataflow, or Spark. Comfortable with Ambiguity: Able to work through loosely defined problems and thrive in autonomous team environments. Skilled in Cloud-based Environments: Proficient with large-scale data …
Lexington, Massachusetts, United States Hybrid / WFH Options
Equiliem
Qualifications: • Bachelor's Degree in Computer Science. Recent graduates or candidates without a Bachelor's degree considered with clear evidence of significant outside-of-classroom experience. • Experience with the Apache Maven or Gradle build system. • Ability to understand front-end source code written in React or similar frameworks; provide guidance to less experienced front-end engineers. • General knowledge of … and reinforcement learning concepts, frameworks, and environments, such as Pandas, TensorFlow, and Jupyter Notebook. • Broad knowledge of the general features, capabilities, and trade-offs of common data warehouse (e.g. Apache Hadoop); workflow orchestration (e.g. Apache Beam); data extract, transform, and load (ETL); and stream processing (e.g. Kafka) technologies. Hands-on experience with several of these technologies.