Apache Spark Jobs in the City of London

A selection of the 74 Apache Spark jobs advertised in the City of London

Senior Cloud and Data Architect

City of London, London, United Kingdom
Gazelle Global
Databricks. Must have hands-on experience with at least 2 Hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub/Sub/Kinesis … in a similar role. Ability to lead and mentor the architects. Required Skills: Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF.
Posted:

Solutions Architect (Data Analytics)

City of London, London, United Kingdom
Vallum Associates
Databricks. Must have hands-on experience with at least 2 Hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub/Sub/Kinesis … years’ experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF …
Posted:

Solutions Architect (Data Analytics)- Presales, RFP creation

City of London, London, United Kingdom
Vallum Associates
Databricks. Must have hands-on experience with at least 2 Hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub/Sub/Kinesis … years’ experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF …
Posted:

Senior Engineering Manager – Data Platform

City of London, London, United Kingdom
Stott and May
in consumer-facing or marketplace environments. Strong knowledge of distributed systems and modern data ecosystems, with hands-on experience using technologies such as Databricks, Apache Spark, Apache Kafka, and DBT. Proven success in building and managing data platforms supporting both batch and real-time processing architectures. Deep …
Posted:
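The Stott and May listing above calls for data platforms that support both batch and real-time processing with Spark and Kafka. Purely as an illustrative sketch of that kind of real-time workload (the broker address, topic name, and event schema below are assumptions, not details from the advert), a minimal PySpark Structured Streaming job might look like this:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

# Hypothetical event schema -- not taken from the job posting.
schema = StructType([
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

spark = SparkSession.builder.appName("orders-stream-demo").getOrCreate()

# Read a Kafka topic as a streaming DataFrame (broker and topic are placeholders;
# the Kafka source also requires the spark-sql-kafka connector package on the classpath).
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "orders")
    .load()
)

# Kafka delivers the payload as bytes in the `value` column; parse it as JSON.
events = (
    raw.select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Running 5-minute revenue per user, with a watermark to bound the state kept by Spark.
agg = (
    events
    .withWatermark("event_time", "10 minutes")
    .groupBy(F.window("event_time", "5 minutes"), "user_id")
    .agg(F.sum("amount").alias("revenue"))
)

# Write results to the console; a real platform would target Delta Lake, Kafka, etc.
query = agg.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```

The watermark bounds the aggregation state so the stream can run indefinitely; in practice the sink would be a durable store rather than the console.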

Solutions Architect (Data Analytics)- Presales, RFP creation

City of London, England, United Kingdom
JR United Kingdom
Databricks. Must have hands-on experience with at least 2 Hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub/Sub/Kinesis … years’ experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF …
Posted:

GCP Cloud Architect

City of London, Greater London, UK
Infoplus Technologies UK Limited
advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable. Hands-on experience with orchestration tools like Apache Airflow or Cloud Composer. Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development. Scala is mandated in some cases. Deep understanding of data lakehouse design, event-driven architecture, and hybrid cloud data strategies.
Posted:
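The GCP Cloud Architect role above lists Dataflow (Apache Beam) among its processing services. For orientation only, here is a minimal Apache Beam word-count pipeline in Python, runnable locally on the DirectRunner and portable to Dataflow via pipeline options; the sample data is invented:

```python
import apache_beam as beam

# A tiny Beam pipeline: count words in a handful of in-memory lines.
# Running with no options uses the local DirectRunner; pointing the same
# pipeline at Dataflow is a matter of pipeline options (runner, project, region).
with beam.Pipeline() as p:
    (
        p
        | "Create" >> beam.Create(["spark beam gcp", "beam gcp", "spark"])
        | "SplitWords" >> beam.FlatMap(str.split)
        | "PairWithOne" >> beam.Map(lambda word: (word, 1))
        | "CountPerWord" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )
```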

Data Architect - Senior Manager

City of London, England, United Kingdom
Hybrid / WFH Options
Staging It
modelling (relational, NoSQL) and ETL/ELT processes. Experience with data integration tools (e.g., Kafka, Talend) and APIs. Familiarity with big data technologies (Hadoop, Spark) and real-time streaming. Expertise in cloud security, data governance, and compliance (GDPR, HIPAA). Strong SQL skills and proficiency in at least one …
Posted:

Senior Consultant - Business Intelligence Specialist

City of London, England, United Kingdom
Hybrid / WFH Options
Delta Capita
team-oriented environment. Preferred Skills: Experience with programming languages such as Python or R for data analysis. Knowledge of big data technologies (e.g., Hadoop, Spark) and data warehousing concepts. Familiarity with cloud data platforms (e.g., Azure, AWS, Google Cloud) is a plus. Certification in BI tools, SQL, or related …
Posted:

Head of Data Engineering & Analytics

City of London, Greater London, UK
AlTi Tiedemann Global
Purview, or Informatica, including projects around lineage, cataloging, and quality rules. Strong hands-on development experience in SQL and Python, with working knowledge of Spark or other distributed data processing frameworks. Design, development and implementation of distributed data solutions using API and microservice-based architecture. Deep understanding of ETL …
Posted:

Data Engineering Consultant

City of London, England, United Kingdom
Accenture
platforms (AWS, GCP, or Azure). Experience with: data warehousing and lake architectures; ETL/ELT pipeline development; SQL and NoSQL databases; distributed computing frameworks (Spark, Kinesis, etc.); software development best practices including CI/CD, TDD and version control. Strong understanding of data modelling and system architecture. Excellent problem …
Posted:

Senior Python Data Engineer - AI

City of London, Greater London, UK
Synechron
Skills: 5+ years' experience with Python programming for data engineering tasks. Strong proficiency in SQL and database management. Hands-on experience with Databricks and Apache Spark. Familiarity with the Azure cloud platform and related services. Knowledge of data security best practices and compliance standards. Excellent problem-solving and communication …
Posted:

Data Engineer - London/Hybrid - TWE41666

City of London, London, United Kingdom
Hybrid / WFH Options
twentyAI
solutions that support key firm initiatives. Build scalable and efficient batch and streaming data workflows within the Azure ecosystem. Apply distributed processing techniques using Apache Spark to handle large datasets effectively. Help drive improvements in data quality, implementing validation, cleansing, and monitoring frameworks. Contribute to the firm’s …
Posted:
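The twentyAI role above pairs Spark batch processing with data-quality validation, cleansing, and monitoring. A minimal, hypothetical PySpark batch sketch along those lines (the ADLS paths, column names, and validation rules are illustrative assumptions, not taken from the advert):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-quality-demo").getOrCreate()

# Hypothetical input path on Azure Data Lake Storage -- placeholder only.
trades = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/trades/")

# Simple validation rules: required fields present and amounts non-negative.
validated = trades.withColumn(
    "is_valid",
    F.col("trade_id").isNotNull() & F.col("amount").isNotNull() & (F.col("amount") >= 0),
)

clean = validated.filter("is_valid").drop("is_valid")
rejected = validated.filter(~F.col("is_valid"))

# Basic monitoring metric: share of rejected records in this batch.
total = validated.count()
bad = rejected.count()
print(f"Rejected {bad} of {total} records ({bad / max(total, 1):.2%})")

# Write the cleansed partition back out for downstream consumers.
clean.write.mode("overwrite").parquet(
    "abfss://curated@examplelake.dfs.core.windows.net/trades_clean/"
)
```

A production pipeline would typically route the rejected records to a quarantine location and publish the rejection rate to a monitoring system rather than printing it.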

AWS Data Engineer

City of London, Greater London, UK
Randstad Digital
Python, SQL, and one or more of: R, Java, Scala. Experience with relational/NoSQL databases (e.g., PostgreSQL, MongoDB). Familiarity with big data tools (Hadoop, Spark, Kafka), cloud platforms (Azure, AWS, GCP), and workflow tools (Airflow, Luigi). Bonus: experience with BI tools, API integrations, and graph databases. Why Join Us …
Posted:

AI Engineering Researcher

City of London, Greater London, UK
Trinity Resource Solutions
various Vector Stores and more traditional technologies, e.g. MySQL, PostgreSQL, NoSQL databases). Hands-on experience with data tools and frameworks such as Hadoop, Spark, or Kafka (an advantage). Familiarity with data warehousing solutions and cloud data platforms. Background in building applications wrapped around AI/LLM/mathematical models …
Posted:

Python Developer

City of London, Greater London, UK
Hedge Fund
in commodities markets or broader financial markets. Knowledge of quantitative modeling, risk management, or algorithmic trading. Familiarity with big data technologies like Kafka, Hadoop, Spark, or similar. Why Work With Us? Impactful Work: Directly influence the profitability of the business by building technology that drives trading decisions. Innovative Culture …
Posted:
Apache Spark salary benchmarks in the City of London
25th Percentile: £56,250
Median: £62,500
75th Percentile: £90,000
90th Percentile: £94,500