Permanent Apache Beam Job Vacancies

1 to 25 of 137 Permanent Apache Beam Jobs

Senior Cloud and Data Architect

City of London, London, United Kingdom
Gazelle Global
…technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs … minimum of 5 years’ experience in a similar role. Ability to lead and mentor architects. Required skills: Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Preferred skills: designing Databricks-based …

Senior Cloud and Data Architect

London Area, United Kingdom
Gazelle Global
…technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs … minimum of 5 years’ experience in a similar role. Ability to lead and mentor architects. Required skills: Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Preferred skills: designing Databricks-based …

Senior Cloud and Data Architect

South East London, England, United Kingdom
Gazelle Global
…technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs … minimum of 5 years’ experience in a similar role. Ability to lead and mentor architects. Required skills: Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Preferred skills: designing Databricks-based …

Senior Cloud and Data Solution Architect

City of London, England, United Kingdom
Coforge
…technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs … skills. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable skills: designing Databricks-based …

Senior Cloud and Data Solution Architect

South East London, England, United Kingdom
Coforge
…technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs … skills. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable skills: designing Databricks-based …

Solutions Architect (Data Analytics)

City of London, London, United Kingdom
Vallum Associates
…technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs … skills. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable skills: designing Databricks-based …

Solutions Architect (Data Analytics)

London, United Kingdom
Vallum
…technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs … skills. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable skills: designing Databricks-based …
Employment Type: Permanent
Salary: GBP Annual

Solutions Architect (Data Analytics)- Presales, RFP creation

London Area, United Kingdom
Vallum Associates
…technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs … skills. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable skills: designing Databricks-based …

Solutions Architect (Data Analytics)- Presales, RFP creation

City of London, London, United Kingdom
Vallum Associates
…technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs … skills. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable skills: designing Databricks-based …

Solutions Architect (Data Analytics)

Slough, England, United Kingdom
JR United Kingdom
…technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs … skills. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Designing Databricks-based solutions for …

Solutions Architect (Data Analytics)- Presales, RFP creation

City of London, England, United Kingdom
JR United Kingdom
…technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs … skills. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Designing Databricks-based solutions for …

Solutions Architect (Data Analytics)- Presales, RFP creation

London, England, United Kingdom
Vallum Associates
…technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs … skills. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Designing Databricks-based solutions for …

Senior Data Platform Engineer

London, England, United Kingdom
easyJet Airline Company PLC
…indexing, partitioning. · Hands-on IaC development experience with Terraform or CloudFormation. · Understanding of the ML development workflow and knowledge of when and how to use dedicated hardware. · Significant experience with Apache Spark or other distributed data programming frameworks (e.g. Flink, Hadoop, Beam). · Familiarity with Databricks as a data and AI platform, or with the Lakehouse architecture. · Experience with data …

Senior Data Platform Engineer

Luton, England, United Kingdom
easyJet
…indexing, partitioning. Hands-on IaC development experience with Terraform or CloudFormation. Understanding of the ML development workflow and knowledge of when and how to use dedicated hardware. Significant experience with Apache Spark or other distributed data programming frameworks (e.g. Flink, Hadoop, Beam). Familiarity with Databricks as a data and AI platform, or with the Lakehouse architecture. Experience with data …

Senior Data Platform Engineer

Watford, England, United Kingdom
JR United Kingdom
…indexing, partitioning. Hands-on IaC development experience with Terraform or CloudFormation. Understanding of the ML development workflow and knowledge of when and how to use dedicated hardware. Significant experience with Apache Spark or other distributed data programming frameworks (e.g. Flink, Hadoop, Beam). Familiarity with Databricks as a data and AI platform, or with the Lakehouse architecture. Experience with data …

Data Platform Engineer

London, England, United Kingdom
easyJet Airline Company PLC
…field. Technical skills required: · Hands-on software development experience with Python, and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). · Experience with Apache Spark or other distributed data programming frameworks. · Comfortable writing efficient SQL and debugging on cloud warehouses like Databricks SQL or Snowflake. · Experience with cloud infrastructure like AWS or …

Data Product Engineer

Luton, England, United Kingdom
easyJet
…field. Technical skills required: Hands-on software development experience with Python, and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). Experience with Apache Spark or other distributed data programming frameworks. Comfortable writing efficient SQL and debugging on cloud warehouses like Databricks SQL or Snowflake. Experience with cloud infrastructure like AWS or …

Data Engineer

Miami, Florida, United States
Royal Caribbean Group
…Python, SQL, and command-line interfaces. 3+ years of experience with streaming technologies (Kafka, Pub/Sub, Kinesis) and log-based architectures, and experience writing batch and stream processing jobs (e.g. Apache Beam, Google Cloud Dataflow, Apache Spark, Apache Storm). 3+ years of experience working with and creating datasets for a data warehouse. Clear understanding of data modeling …
Employment Type: Permanent
Salary: USD Annual
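For readers unfamiliar with the tooling these postings name: the following is a minimal, illustrative Apache Beam batch job in Python of the kind the listing above asks for. It is a sketch only; the file names, and the assumption that a user id is the first comma-separated field, are hypothetical and not taken from any posting.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    # Runs on the local DirectRunner by default; no cloud account needed.
    with beam.Pipeline(options=PipelineOptions()) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("events.csv")  # hypothetical input file
            | "ParseUser" >> beam.Map(lambda line: line.split(",")[0])  # assume user id is field 0
            | "CountPerUser" >> beam.combiners.Count.PerElement()
            | "Format" >> beam.MapTuple(lambda user, n: f"{user},{n}")
            | "Write" >> beam.io.WriteToText("user_counts")  # hypothetical output prefix
        )

if __name__ == "__main__":
    run()

The same pipeline shape can later be submitted unchanged to a distributed runner, which is the portability point that distinguishes Beam from runner-specific frameworks such as Spark or Storm.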

Cloud Data Engineer (based in Spain)

Spain
Hays
…Python and Java programming experience, including design, programming, and unit testing (Pytest, JUnit). Experience using version control (Git), Kafka or Pub/Sub, and Docker. Experience using Terraform. Experience using Apache Beam with Dataflow. Experience with PySpark and Databricks, working in an Azure environment. Knowledge of clean-code principles. Strong verbal and written skills in English. Nice to have …
Employment Type: Permanent
Salary: EUR Annual
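Several listings, like the one above, pair Apache Beam with Google Cloud Dataflow. As a hedged sketch of what that combination looks like in practice, the snippet below submits a Beam word-count pipeline to the Dataflow runner; the project, region, and bucket names are placeholders, not values from the posting.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Setting runner="DataflowRunner" is what moves execution from the local
# machine to Google Cloud Dataflow; all values below are placeholders.
options = PipelineOptions(
    runner="DataflowRunner",
    project="my-gcp-project",
    region="europe-west1",
    temp_location="gs://my-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/*.txt")
        | "Split" >> beam.FlatMap(str.split)
        | "Count" >> beam.combiners.Count.PerElement()
        | "Format" >> beam.MapTuple(lambda word, n: f"{word}: {n}")
        | "Write" >> beam.io.WriteToText("gs://my-bucket/output/counts")
    )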

Senior Data Engineer (GCP/Kafka)

Bristol, England, United Kingdom
Lloyds Banking Group
…relational and non-relational databases to build data solutions, such as SQL Server/Oracle; experience with relational and dimensional data structures. Experience using distributed frameworks (Spark, Flink, Beam, Hadoop). Proficiency in infrastructure as code (IaC) using Terraform. Experience with CI/CD pipelines and related tools/frameworks. Containerisation: good knowledge of containers (Docker, Kubernetes etc.). Cloud … with GCP, AWS or Azure. Good understanding of cloud storage, networking and resource provisioning. It would be great if you had: Certification in GCP “Professional Data Engineer”. Certification in Apache Kafka (CCDAK). Proficiency across the data lifecycle. WORKING FOR US: Our focus is to ensure we are inclusive every day, building an organisation that reflects modern society and celebrates …

Software Engineer - Data

London, England, United Kingdom
Man Group
We implement the systems that require the highest data throughput in Java. Within Data Engineering we use Dataiku, Snowflake, Prometheus, and ArcticDB heavily. We use Kafka for data pipelines, Apache Beam for ETL, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log shipping and monitoring, Docker for containerisation, Kubernetes for …
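The listing above describes Kafka feeding Beam for ETL. A minimal streaming sketch of that pattern follows; the broker address and topic name are invented, and note that Beam’s Python ReadFromKafka is a cross-language transform, so it needs a Java runtime available for its expansion service.

import apache_beam as beam
from apache_beam.io.kafka import ReadFromKafka
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)
with beam.Pipeline(options=options) as p:
    (
        p
        | "FromKafka" >> ReadFromKafka(
            consumer_config={"bootstrap.servers": "broker:9092"},  # placeholder broker
            topics=["trades"],  # placeholder topic
        )
        | "Values" >> beam.Values()  # ReadFromKafka emits (key, value) pairs
        | "Decode" >> beam.Map(lambda payload: payload.decode("utf-8"))
        | "Print" >> beam.Map(print)  # stand-in for a real sink
    )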

Senior Presales Consultant (Cloud Data & AI)

Madrid, Spain
Insight
…OpenAI, Azure Machine Learning Studio, Azure AI Foundry; AWS SageMaker, Amazon Bedrock; Google Vertex AI, TensorFlow, scikit-learn, Hugging Face. Data Engineering & Big Data: Azure Data Factory, Azure Databricks, Apache Spark, Delta Lake; AWS Glue ETL, AWS EMR; Google Dataflow, Apache Beam. Business Intelligence & Analytics: Power BI, Amazon QuickSight, Looker Studio; embedded analytics and interactive dashboarding solutions …
Employment Type: Permanent
Salary: EUR Annual

Data Engineer

New York, United States
S&P Global
…years of experience in Big Data and Data Engineering, building enterprise-level applications in a public cloud, preferably Google Cloud. Experience building cloud-native data pipelines using Python, Apache Airflow, Apache Spark, and Apache Beam. Working knowledge of SQL and data warehousing concepts, including PostgreSQL and BigQuery. Optionally, knowledge of Mongo and AlloyDB …
Employment Type: Permanent
Salary: USD Annual

Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
Derisk360
…in Neo4j, such as fraud detection, knowledge graphs, and network analysis. Optimize graph database performance, ensure query scalability, and maintain system efficiency. Manage ingestion of large-scale datasets using Apache Beam, Spark, or Kafka into GCP environments. Implement metadata management, security, and data governance using Data Catalog and IAM. Collaborate with cross-functional teams and clients across diverse …

Senior Data Engineer (GCP/Kafka)

Bristol, England, United Kingdom
Lloyds Banking Group
…relational and non-relational databases to build data solutions, such as SQL Server/Oracle; experience with relational and dimensional data structures. Experience using distributed frameworks (Spark, Flink, Beam, Hadoop). Proficiency in infrastructure as code (IaC) using Terraform. Experience with CI/CD pipelines and related tools/frameworks. Containerisation: good knowledge of containers (Docker, Kubernetes … AWS or Azure. Good understanding of cloud storage, networking and resource provisioning. It would be great if you had: Certification in GCP “Professional Data Engineer”. Certification in Apache Kafka (CCDAK). Proficiency across the data lifecycle. WORKING FOR US: Our focus is to ensure we are inclusive every day, building an organisation that reflects modern society and …