Remote Permanent Apache Beam Jobs

Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
So Energy
…design of data solutions for BigQuery. Expertise in logical and physical data modelling. Hands-on experience using Google Dataflow, GCS, Cloud Functions, BigQuery, Dataproc, and Apache Beam (Python) to design data transformation rules for batch and streaming data. Solid Python programming skills and using Apache Beam (Python) …

Full Stack Developer with Security Clearance

Lexington, Massachusetts, United States
Hybrid / WFH Options
Equiliem
…Computer Science. Recent graduates or candidates without a Bachelor's degree considered with clear evidence of significant outside-of-classroom experience. • Experience with the Apache Maven or Gradle build systems. • Ability to understand front-end source code written in React or similar frameworks. Provide guidance to less experienced front … and environments, such as Pandas, TensorFlow, and Jupyter Notebook. • Broad knowledge of the general features, capabilities, and trade-offs of common data warehouse (e.g. Apache Hadoop); workflow orchestration (e.g. Apache Beam); data extract, transform and load (ETL); and stream processing (e.g. Kafka) technologies. Hands-on experience with …
Employment Type: Permanent
Salary: USD Annual

Data Engineer

City of London, London, United Kingdom
Hybrid / WFH Options
Singular Recruitment
…for complex querying and performance tuning. ETL/ELT Pipelines: Proven experience designing, building, and maintaining production-grade data pipelines using Google Cloud Dataflow (Apache Beam) or similar technologies. GCP Stack: Hands-on expertise with BigQuery, Cloud Storage, Pub/Sub, and orchestrating workflows with Composer or Vertex …

Solution Architect

Leeds, England, United Kingdom
Hybrid / WFH Options
Axiom Software Solutions Limited
…aspects. Knowledge of Kafka resiliency and new features like KRaft. Experience with real-time technologies such as Spark. Required Skills & Experience: Extensive experience with Apache Kafka and real-time architecture, including event-driven frameworks. Strong knowledge of Kafka Streams, Kafka Connect, Spark Streaming, Schema Registry, Flink, and Beam. Experience …

Data Engineer - Data Pipelines

London, England, United Kingdom
Hybrid / WFH Options
Starling Bank
…or all of the services below would put you at the top of our list: Google Cloud Storage. Google Data Transfer Service. Google Dataflow (Apache Beam). Google Pub/Sub. Google Cloud Run. BigQuery or any RDBMS. Python. Debezium/Kafka. dbt (data build tool). Interview process: Interviewing is …

Lead Engineer, Data Platform

London, England, United Kingdom
Hybrid / WFH Options
Scope3
…/Next.js for frontend applications. Low-latency, high-throughput Golang API. BigQuery data warehouse. Airflow for batch orchestration. Temporal for event orchestration. Apache Beam (Dataflow runner) for some batch jobs. Most transformations are performed via SQL directly in BigQuery. The Role: We are excited to …

Data Engineer - Financial Data Platform

London, England, United Kingdom
Hybrid / WFH Options
Spotify
…growth, and collaboration within the team. Who You Are: Experienced with Data Processing Frameworks: Skilled with higher-level JVM-based frameworks such as Flink, Beam, Dataflow, or Spark. Comfortable with Ambiguity: Able to work through loosely defined problems and thrive in autonomous team environments. Skilled in Cloud-based Environments …

Senior Data Engineer (GCP/Kafka)

Bristol, England, United Kingdom
Hybrid / WFH Options
Lloyds Bank plc
…to build data solutions, such as SQL Server/Oracle; experience with relational and dimensional data structures. Experience in using distributed frameworks (Spark, Flink, Beam, Hadoop). Proficiency in infrastructure as code (IaC) using Terraform. Experience with CI/CD pipelines and related tools/frameworks. Containerisation: Good … understanding of cloud storage, networking, and resource provisioning. It would be great if you had: Certification in GCP “Professional Data Engineer”. Certification in Apache Kafka (CCDAK). Proficiency across the data lifecycle. Working for us: Our focus is to ensure we are inclusive every day, building an organisation …

Senior Data Engineer (GCP/Kafka)

London, England, United Kingdom
Hybrid / WFH Options
Lloyds Banking Group
…to build data solutions, such as SQL Server/Oracle; experience with relational and dimensional data structures. Experience in using distributed frameworks (Spark, Flink, Beam, Hadoop). Proficiency in infrastructure as code (IaC) using Terraform. Experience with CI/CD pipelines and related tools/frameworks. Containerisation: Good … understanding of cloud storage, networking, and resource provisioning. It would be great if you had: Certification in GCP “Professional Data Engineer”. Certification in Apache Kafka (CCDAK). Proficiency across the data lifecycle. WORKING FOR US: Our focus is to ensure we are inclusive every day, building an organisation …