PySpark, and SQL, with the ability to build modular, efficient, and scalable data pipelines. Deep expertise in data modeling for both relational databases and data warehouses, including Star and Snowflake schema designs. Extensive experience working with AWS Redshift and Aurora for data warehousing and transactional workloads. Experience using dbt (Data Build Tool) for building modular, version-controlled, and …
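As a hedged illustration of the "modular pipeline" idea this listing describes (the function and field names below are hypothetical, not from any specific codebase), small single-purpose transforms can be composed into a pipeline — the same principle dbt applies to SQL models:

```python
from functools import reduce

# Hypothetical modular pipeline: each step is a small, testable transform
# that takes and returns a list of record dicts.

def drop_nulls(rows):
    """Remove records missing a customer_id."""
    return [r for r in rows if r.get("customer_id") is not None]

def normalise_amount(rows):
    """Coerce string amounts from the upstream source into floats."""
    return [{**r, "amount": float(r["amount"])} for r in rows]

def run_pipeline(rows, steps):
    """Apply each transform in order -- the 'modular' part."""
    return reduce(lambda acc, step: step(acc), steps, rows)

raw = [
    {"customer_id": 1, "amount": "19.99"},
    {"customer_id": None, "amount": "5.00"},
]
clean = run_pipeline(raw, [drop_nulls, normalise_amount])
print(clean)  # → [{'customer_id': 1, 'amount': 19.99}]
```

Because each step is an independent function, steps can be unit-tested, reordered, and version-controlled separately — the property the listing asks for.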
London (City of London), South East England, United Kingdom
HCLTech
ETL/ELT pipelines using SQL and Python Integrate internal/external data sources via APIs and platform connectors Model and structure data for scalable analytics (e.g., star/snowflake schemas) Administer Microsoft Fabric Lakehouse and Azure services Optimise performance across queries, datasets, and pipelines Apply data validation, cleansing, and standardisation rules Document pipeline logic and contribute to business …
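The star schemas mentioned in both listings keep one central fact table joined to denormalised dimension tables. A minimal sketch using Python's built-in sqlite3 module (all table and column names here are illustrative, not from either role):

```python
import sqlite3

# Minimal star schema: a central fact table (fact_sales) referencing
# denormalised dimension tables (dim_date, dim_product).
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity INTEGER,
    revenue REAL
);
INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware');
INSERT INTO fact_sales VALUES (20240101, 1, 3, 29.97);
""")

# Typical analytical query: aggregate facts, slicing by dimension attributes.
row = cur.execute("""
    SELECT p.category, d.year, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p ON f.product_key = p.product_key
    JOIN dim_date  d ON f.date_key = d.date_key
    GROUP BY p.category, d.year
""").fetchone()
print(row)  # → ('Hardware', 2024, 29.97)
```

A snowflake schema differs only in that the dimensions themselves are further normalised (e.g. `dim_product` referencing a separate `dim_category` table), trading wider joins for less redundancy.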
You’ll be working with: Programming: Python, SQL, Scala/Java Big Data: Spark, Hadoop, Databricks Pipelines: Airflow, Kafka, ETL tools Cloud: AWS, GCP, or Azure (Glue, Redshift, BigQuery, Snowflake) Data Modelling & Warehousing 🔹 What’s on Offer 💷 £80,000pa (Permanent role) 📍 Hybrid – 2 days per week in Central London 🚀 Opportunity to work in a scaling HealthTech company with real …
London (City of London), South East England, United Kingdom
Roc Search