Apache Spark Jobs in West London

7 of 7 Apache Spark Jobs in West London

Data Engineer Manager

West London, London, United Kingdom
Hybrid / WFH Options
Young's Employment Services Ltd
a Senior Data Engineer, Tech Lead, Data Engineering Manager, etc. Proven success with modern data infrastructure: distributed systems, batch and streaming pipelines. Hands-on knowledge of tools such as Apache Spark, Kafka, Databricks, DBT or similar. Experience building, defining, and owning data models, data lakes, and data warehouses. Programming proficiency in Python, PySpark, Scala or Java. Experience operating …
Employment Type: Permanent, Work From Home
Posted:
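
For candidates weighing the batch-pipeline skills the Young's listings repeatedly name (Apache Spark, data lakes, PySpark), the sketch below is a minimal illustration only; the paths, column names, and table layout are invented for the example and are not taken from the advert.

```python
# Minimal PySpark batch sketch: read raw CSV events, apply a simple
# clean-up, and write a partitioned Parquet table to a data lake.
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-batch-example").getOrCreate()

raw = (
    spark.read
    .option("header", True)
    .csv("s3://example-bucket/raw/orders/")  # hypothetical source path
)

cleaned = (
    raw.dropDuplicates(["order_id"])                        # hypothetical key column
       .withColumn("order_ts", F.to_timestamp("order_ts"))  # normalise timestamps
       .withColumn("order_date", F.to_date("order_ts"))
)

(
    cleaned.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/orders/")  # hypothetical lake location
)

spark.stop()
```

The same pattern runs unchanged on Databricks or a self-managed cluster; only the storage paths and cluster configuration differ.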

Senior Data Engineer

West London, London, United Kingdom
Hybrid / WFH Options
Young's Employment Services Ltd
a Senior Data Engineer, Tech Lead, Data Engineering Manager, etc. Proven success with modern data infrastructure: distributed systems, batch and streaming pipelines. Hands-on knowledge of tools such as Apache Spark, Kafka, Databricks, DBT or similar. Experience building, defining, and owning data models, data lakes, and data warehouses. Programming proficiency in Python, PySpark, Scala or Java. Experience operating …
Employment Type: Permanent, Work From Home
Salary: £95,000
Posted:

Data Engineering Manager

West London, London, England, United Kingdom
Hybrid / WFH Options
Young's Employment Services Ltd
a Senior Data Engineer, Tech Lead, Data Engineering Manager, etc. Proven success with modern data infrastructure: distributed systems, batch and streaming pipelines. Hands-on knowledge of tools such as Apache Spark, Kafka, Databricks, DBT or similar. Experience building, defining, and owning data models, data lakes, and data warehouses. Programming proficiency in Python, PySpark, Scala or Java. Experience operating …
Employment Type: Full-Time
Salary: £85,000 - £95,000 per annum, Negotiable
Posted:

Data Solution Architect

North West London, London, United Kingdom
Hybrid / WFH Options
Anson Mccade
knowledge of Kafka, Confluent, and event-driven architecture. Hands-on experience with Databricks, Unity Catalog, and Lakehouse architectures. Strong architectural understanding across AWS, Azure, GCP, and Snowflake. Familiarity with Apache Spark, SQL/NoSQL databases, and programming (Python, R, Java). Knowledge of data visualisation, DevOps principles, and ML/AI integration into data architectures. Strong grasp of data …
Employment Type: Permanent, Work From Home
Posted:
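
The Kafka and event-driven requirements in this architect role are the territory of Spark Structured Streaming. The following is a rough sketch under assumed broker, topic, schema, and storage names (none are from the listing); running it would also require the spark-sql-kafka connector package on the cluster.

```python
# Minimal Spark Structured Streaming sketch for a Kafka-fed pipeline.
# Broker address, topic, schema, and sink paths are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-stream-example").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "payments")                   # hypothetical topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream
    .outputMode("append")
    .format("parquet")
    .option("path", "s3://example-bucket/stream/payments/")                 # hypothetical sink
    .option("checkpointLocation", "s3://example-bucket/checkpoints/payments/")
    .start()
)

query.awaitTermination()
```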

Data Solutions Architect

South West London, London, United Kingdom
Hybrid / WFH Options
Anson Mccade
on experience with cloud platforms like AWS, Azure, GCP, or Snowflake. Strong knowledge of data governance, compliance, and security standards (GDPR, CCPA). Proficiency in big data technologies like Apache Spark and understanding of data product strategies. Strong leadership and stakeholder management skills in Agile delivery environments. Package: £90,000 - £115,000 base salary, bonus, pension and company …
Employment Type: Permanent, Work From Home
Posted:

Product Engineering Lead (Supply and R&D)

South West London, London, United Kingdom
Mars
priorities aimed at maximizing value through data utilization. Knowledge/Experience: Expertise in Commercial/Procurement Analytics. Experience in SAP (S/4 Hana). Experience with Spark, Databricks, or similar data processing tools. Strong technical proficiency in data modeling, SQL, NoSQL databases, and data warehousing. Hands-on experience with data pipeline development, ETL … processes, and big data technologies (e.g., Hadoop, Spark, Kafka). Proficiency in cloud platforms such as AWS, Azure, or Google Cloud and cloud-based data services (e.g., AWS Redshift, Azure Synapse Analytics, Google BigQuery). Experience with DataOps practices and tools, including CI/CD for data pipelines. …
Employment Type: Permanent
Posted:
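
To illustrate the procurement-analytics and ETL work the Mars listing describes, here is a small hypothetical PySpark roll-up; the spend table and its columns are invented placeholders, not details from the role.

```python
# Illustrative PySpark aggregation of the kind used in procurement/spend
# analytics: roll up line-level spend by supplier and month.
# Table and column names are invented for the example.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("spend-rollup-example").getOrCreate()

lines = spark.read.parquet("s3://example-bucket/curated/purchase_lines/")  # hypothetical

monthly_spend = (
    lines.withColumn("month", F.date_trunc("month", F.col("invoice_date")))
         .groupBy("supplier_id", "month")
         .agg(
             F.sum("net_amount").alias("total_spend"),
             F.countDistinct("po_number").alias("po_count"),
         )
)

# Write as a partitioned table that a warehouse or BI layer can query.
(
    monthly_spend.write
    .mode("overwrite")
    .partitionBy("month")
    .parquet("s3://example-bucket/marts/monthly_supplier_spend/")
)

spark.stop()
```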

Data Engineer

North West London, London, United Kingdom
Searchworks Ltd
Oversee pipeline performance, address issues promptly, and maintain comprehensive data documentation. What You'll Bring: Technical Expertise: Proficiency in Python and SQL; experience with data processing frameworks such as Airflow, Spark, or TensorFlow. Data Engineering Fundamentals: Strong understanding of data architecture, data modelling, and scalable data solutions. Backend Development: Willingness to develop proficiency in backend technologies (e.g., Python with Django … to support data pipeline integrations. Cloud Platforms: Familiarity with AWS or Azure, including services like Apache Airflow, Terraform, or SageMaker. Data Quality Management: Experience with data versioning and quality assurance practices. Automation and CI/CD: Knowledge of build and deployment automation processes. Experience within MLOps. A 1st class Data degree from one of the UK's top 15 Universities …
Employment Type: Permanent
Salary: £80,000
Posted:
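
As a hedged illustration of the Airflow-plus-Spark orchestration this listing mentions, the sketch below schedules a nightly spark-submit run via a BashOperator; it assumes Airflow 2.4+ and a hypothetical job script path.

```python
# Sketch of an Airflow DAG that schedules a nightly spark-submit run.
# DAG id, schedule, and script path are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="nightly_spark_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",   # 02:00 daily
    catchup=False,
) as dag:
    run_spark_job = BashOperator(
        task_id="run_spark_job",
        bash_command=(
            "spark-submit --master yarn "
            "/opt/pipelines/daily_orders_job.py"  # hypothetical job script
        ),
    )
```

In practice the same DAG could call a managed Spark service instead of spark-submit; the BashOperator is used here only because it keeps the sketch self-contained.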