Apache Beam Jobs in England

76 to 100 of 188 Apache Beam Jobs in England

Solutions Architect (Data Analytics)- Presales, RFP creation

Watford, England, United Kingdom
JR United Kingdom
Databricks. Must have hands-on experience on at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam, or equivalent). In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse/Pub/Sub/Kinesis … years’ experience in a similar role. Ability to lead and mentor architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF …

Solutions Architect (Data Analytics)- Presales, RFP creation

Stoke-on-Trent, England, United Kingdom
JR United Kingdom
Databricks. Must have hands-on experience on at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam, or equivalent). In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse/Pub/Sub/Kinesis … years’ experience in a similar role. Ability to lead and mentor architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF …

Solutions Architect (Data Analytics)- Presales, RFP creation

Hounslow, England, United Kingdom
JR United Kingdom
Databricks. Must have hands-on experience on at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam, or equivalent). In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse/Pub/Sub/Kinesis … years’ experience in a similar role. Ability to lead and mentor architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF …

Solutions Architect (Data Analytics)- Presales, RFP creation

Newcastle upon Tyne, England, United Kingdom
JR United Kingdom
Databricks. Must have hands-on experience on at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam, or equivalent). In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse/Pub/Sub/Kinesis … years’ experience in a similar role. Ability to lead and mentor architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF …

Solutions Architect (Data Analytics)- Presales, RFP creation

London, England, United Kingdom
Vallum Associates
Databricks. Must have hands-on experience on at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam, or equivalent). In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse/Pub/Sub/Kinesis … years’ experience in a similar role. Ability to lead and mentor architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF …

Senior Cloud Data Architect

London, England, United Kingdom
ZipRecruiter
Big Data, and Cloud Technologies. Hands-on expertise in at least 2 cloud platforms (Azure, AWS, GCP, Snowflake, Databricks) and Big Data processing (e.g., Apache Spark, Beam). Proficiency in key technologies like BigQuery, Redshift, Synapse, Pub/Sub, Kinesis, Event Hubs, Kafka, Dataflow, Airflow, and ADF. Strong … with the ability to mentor architects. Mandatory expertise in at least 2 hyperscalers (GCP/AWS/Azure) and Big Data tools (e.g., Spark, Beam). Desirable: experience designing Databricks solutions and familiarity with DevOps tools. Coforge is an equal opportunities employer and welcomes applications from all sections of …

Solutions Architect (Data Analytics)- Presales, RFP creation

Hull, England, United Kingdom
JR United Kingdom
Technologies Implementation and hands-on experience with at least two cloud platforms (Azure, AWS, GCP, Snowflake, Databricks). Expertise in Big Data processing services like Apache Spark, Beam, or equivalents on at least two hyperscalers. Knowledge of key technologies such as BigQuery, Redshift, Synapse, Pub/Sub, Kinesis, Kafka … of relevant experience. Leadership and mentoring abilities. Mandatory skills include experience with at least two hyperscalers and technologies such as GCP, AWS, Azure, Spark, Beam, Kafka, Dataflow, Airflow, ADF, Databricks, Jenkins, Terraform, and StackDriver.

Solutions Architect (Data Analytics)- Presales, RFP creation

Leeds, England, United Kingdom
JR United Kingdom
Technologies Implementation and hands-on experience in Azure, AWS, GCP, Snowflake, and Databricks. Experience with at least 2 hyperscalers and Big Data processing services like Apache Spark and Beam. Knowledge of technologies like BigQuery, Redshift, Synapse, Pub/Sub, Kinesis, Kafka, Dataflow, Airflow, and ADF. Consulting experience and ability to design … communication skills. Minimum 5 years’ experience. Leadership and mentoring abilities. Skills in at least 2 hyperscalers (GCP, AWS, Azure). Experience with Big Data, Spark, Beam, Pub/Sub, Kinesis, Kafka, Dataflow, Airflow, and ADF. Designing solutions with Databricks, Jenkins, Terraform, and StackDriver.

Solutions Architect (Data Analytics)- Presales, RFP creation

Crawley, England, United Kingdom
JR United Kingdom
Cloud Technologies Implementation and hands-on experience in Azure, AWS, GCP, Snowflake, and Databricks. Experience with at least 2 hyperscalers and Big Data processing services (Apache Spark, Beam, etc.). Knowledge of technologies like BigQuery, Redshift, Synapse, Pub/Sub, Kinesis, Kafka, Dataflow, Airflow, and ADF. Consulting experience and ability to …

Python Developer

City of London, England, United Kingdom
VE3
Flask, Django, or FastAPI. Proficiency in Python 3.x and libraries like Pandas, NumPy, and Dask. Experience with data manipulation and processing frameworks (e.g., PySpark, Apache Beam). Strong knowledge of databases, including SQL and NoSQL (e.g., PostgreSQL, MongoDB). Familiarity with ETL processes and tools such as Airflow …
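Listings like the one above pair Python data libraries with ETL tooling (PySpark, Apache Beam, Airflow). As a rough illustration of the extract-transform-load pattern those tools industrialize, here is a stdlib-only Python sketch; the column names and table are invented for the example:

```python
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: cast types and drop malformed records."""
    out = []
    for r in rows:
        try:
            out.append({"name": r["name"].strip(), "amount": float(r["amount"])})
        except (KeyError, ValueError):
            continue  # skip rows that fail validation
    return out

def load(rows, conn):
    """Load: write cleaned rows into a SQL table, return row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (:name, :amount)", rows)
    return conn.execute("SELECT COUNT(*) FROM payments").fetchone()[0]

raw = "name,amount\nalice,10.5\nbob,notanumber\ncarol,3\n"
conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(raw)), conn)  # "bob" is dropped in transform
```

At scale, tools like Airflow schedule and retry each stage, while PySpark or Beam distribute the transform step across workers; the stage boundaries stay the same.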

Python Developer

London, England, United Kingdom
Data Controller, VE Ltd
Flask, Django, or FastAPI. Proficiency in Python 3.x and libraries like Pandas, NumPy, and Dask. Experience with data manipulation and processing frameworks (e.g., PySpark, Apache Beam). Strong knowledge of databases, including SQL and NoSQL (e.g., PostgreSQL, MongoDB). Familiarity with ETL processes and tools such as Airflow …

Python Developer

London, England, United Kingdom
VE3
Flask, Django, or FastAPI. Proficiency in Python 3.x and libraries like Pandas, NumPy, and Dask. Experience with data manipulation and processing frameworks (e.g., PySpark, Apache Beam). Strong knowledge of databases, including SQL and NoSQL (e.g., PostgreSQL, MongoDB). Familiarity with ETL processes and tools such as Airflow …

Software Engineer - Data- Tech-Driven Global Hedge Fund

London, United Kingdom
Quality Control Specialist - Pest Control
highest data throughput are implemented in Java. Within Data Engineering, Dataiku, Snowflake, Prometheus, and ArcticDB are used heavily. Kafka is used for data pipelines, Apache Beam for ETL, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, and ELK for log shipping and monitoring …
Employment Type: Permanent
Salary: GBP Annual
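The stack above uses Apache Beam for ETL. Beam's core primitives (Map, GroupByKey, CombinePerKey) can be sketched in plain Python without the SDK; this is a conceptual stand-in, not Beam's actual API, and the tick data is made up:

```python
from collections import defaultdict

def map_transform(records, fn):
    # Stand-in for Beam's Map: apply fn element-wise over the collection.
    return [fn(r) for r in records]

def combine_per_key(pairs, combiner):
    # Stand-in for Beam's CombinePerKey: group (key, value) pairs by key,
    # then reduce each group with the combiner function.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return {key: combiner(values) for key, values in grouped.items()}

# Illustrative market ticks; a real pipeline would read these from Kafka.
ticks = [{"sym": "AAPL", "qty": 10}, {"sym": "GOOG", "qty": 5}, {"sym": "AAPL", "qty": 7}]
pairs = map_transform(ticks, lambda t: (t["sym"], t["qty"]))
totals = combine_per_key(pairs, sum)
```

Beam's value is that the same two-stage shape (element-wise map, then keyed combine) runs unchanged on a distributed runner such as Dataflow or Flink.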

Data Platform Engineer

London, England, United Kingdom
easyJet Airline Company PLC
on software development experience with Python and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). Experience with Apache Spark or any other distributed data programming framework. Comfortable writing efficient SQL and debugging on cloud warehouses like Databricks SQL or Snowflake. Experience with …

Data Product Engineer

Luton, England, United Kingdom
easyJet
on software development experience with Python and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). Experience with Apache Spark or any other distributed data programming framework. Comfortable writing efficient SQL and debugging on cloud warehouses like Databricks SQL or Snowflake. Experience with …

Data Product Engineer

Bedford, East Anglia, United Kingdom
easyJet
on software development experience with Python and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). Experience with Apache Spark or any other distributed data programming framework. Comfortable writing efficient SQL and debugging on cloud warehouses like Databricks SQL or Snowflake. Experience with …

Data Product Engineer

Luton, Bedfordshire, East Anglia, United Kingdom
easyJet
on software development experience with Python and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). Experience with Apache Spark or any other distributed data programming framework. Comfortable writing efficient SQL and debugging on cloud warehouses like Databricks SQL or Snowflake. Experience with …

Data Product Engineer

Watford, Hertfordshire, East Anglia, United Kingdom
easyJet
on software development experience with Python and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). Experience with Apache Spark or any other distributed data programming framework. Comfortable writing efficient SQL and debugging on cloud warehouses like Databricks SQL or Snowflake. Experience with …

GCP Data Engineer-UK

London, England, United Kingdom
Ampstek
as fraud detection, network analysis, and knowledge graphs. Optimize performance of graph queries and design for scalability. Support ingestion of large-scale datasets using Apache Beam, Spark, or Kafka into GCP environments. Implement metadata management, security, and data governance using Data Catalog and IAM. Work across functional teams …

Cloud Architect (GCP)

London, England, United Kingdom
Ampstek
as fraud detection, network analysis, and knowledge graphs. Optimize performance of graph queries and design for scalability. Support ingestion of large-scale datasets using Apache Beam, Spark, or Kafka into GCP environments. Implement metadata management, security, and data governance using Data Catalog and IAM. Work across functional teams …

Data Engineer

London, England, United Kingdom
Creo Invent
as fraud detection, network analysis, and knowledge graphs. Optimize performance of graph queries and design for scalability. Support ingestion of large-scale datasets using Apache Beam, Spark, or Kafka into GCP environments. Implement metadata management, security, and data governance using Data Catalog and IAM. Work across functional teams …

Senior GCP Data Engineer

Southend-on-Sea, England, United Kingdom
Hybrid / WFH Options
TN United Kingdom
years hands-on experience with Google Cloud Platform. Strong experience with BigQuery, Cloud Storage, Pub/Sub, and Dataflow. Proficient in SQL, Python, and Apache Beam. Familiarity with DevOps and CI/CD pipelines in cloud environments. Experience with Terraform, Cloud Build, or similar tools for infrastructure automation. Understanding of … available) Responsibilities: Design, build, and maintain scalable and reliable data pipelines on Google Cloud Platform (GCP). Develop ETL processes using tools like Cloud Dataflow, Apache Beam, BigQuery, and Cloud Composer. Collaborate with data analysts, scientists, and business stakeholders to understand data requirements. Optimize performance and cost-efficiency of …
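The responsibilities above revolve around Dataflow/Beam pipelines fed from Pub/Sub, where event-time windowing is a central concept. Here is a plain-Python sketch of fixed (tumbling) windows that mirrors the bucketing arithmetic behind Beam's FixedWindows without using the SDK; timestamps and values are illustrative:

```python
from collections import defaultdict

def assign_fixed_windows(events, window_size):
    """Assign each (timestamp, value) event to a fixed (tumbling) window.

    Mirrors the idea of Beam's FixedWindows: the window containing a
    timestamp ts starts at ts - ts % window_size and spans window_size.
    """
    windows = defaultdict(list)
    for ts, value in events:
        start = ts - ts % window_size
        windows[(start, start + window_size)].append(value)
    return dict(windows)

# (timestamp_seconds, bytes_processed) -- made-up sample events
events = [(3, 100), (7, 50), (12, 75), (14, 25)]
per_window = assign_fixed_windows(events, window_size=10)
sums = {window: sum(values) for window, values in per_window.items()}
```

In a real Dataflow job, Beam additionally handles out-of-order arrival via watermarks and triggers; this sketch only shows the window assignment itself.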

Senior GCP Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
ZipRecruiter
years hands-on experience with Google Cloud Platform. Strong experience with BigQuery, Cloud Storage, Pub/Sub, and Dataflow. Proficient in SQL, Python, and Apache Beam. Familiarity with DevOps and CI/CD pipelines in cloud environments. Experience with Terraform, Cloud Build, or similar tools for infrastructure automation. Understanding … Responsibilities: Design, build, and maintain scalable and reliable data pipelines on Google Cloud Platform (GCP). Develop ETL processes using tools like Cloud Dataflow, Apache Beam, BigQuery, and Cloud Composer. Collaborate with data analysts, scientists, and business stakeholders to understand data requirements. Optimize performance and cost-efficiency of …

Senior Data Engineer

Bath, Somerset, United Kingdom
gamigo AG
for purpose. Experience that will put you ahead of the curve: experience using Python on Google Cloud Platform for Big Data projects, including BigQuery, DataFlow (Apache Beam), Cloud Run Functions, Cloud Run, Cloud Workflows, and Cloud Composer. SQL development skills. Experience using Dataform or dbt. Demonstrated strength in data modelling …
Employment Type: Permanent
Salary: GBP Annual

Senior Data Engineer

Bath, England, United Kingdom
Hybrid / WFH Options
Future
purposefulness. Experience that will put you ahead of the curve: experience using Python on Google Cloud Platform for Big Data projects, including BigQuery, DataFlow (Apache Beam), Cloud Run Functions, Cloud Run, Cloud Workflows, and Cloud Composer. SQL development skills. Experience using Dataform or dbt. Strength in data modeling and ETL …