…on knowledge of tools such as Apache Spark, Kafka, Databricks, dbt or similar.
Experience building, defining, and owning data models, data lakes, and data warehouses.
Programming proficiency in Python, PySpark, Scala or Java.
Experience operating in a cloud-native environment (e.g. Fabric, AWS, GCP, or Azure).
Excellent stakeholder management and communication skills.
A strategic mindset, with a practical …
Employment Type: Permanent
Salary: £80000 - £95000/annum Attractive Bonus and Benefits
West London, London, United Kingdom Hybrid / WFH Options
Young's Employment Services Ltd
…on data strategy, risk mitigation, and platform evolution.
Mentor and develop data engineering talent, fostering a high-performance culture.
Requirements:
Extensive hands-on experience with AWS data tools: Glue, PySpark, Athena, Iceberg, Lake Formation.
Strong proficiency in Python and SQL.
Proven leadership in data engineering strategy and execution.
Deep understanding of market data and its business applications.
Experience with …
City of London, London, United Kingdom Hybrid / WFH Options
Publicis Production
…data modeling, data warehousing concepts, and distributed systems.
Excellent problem-solving skills and the ability to independently design, build, and validate output data.
Deep proficiency in Python (including PySpark), SQL, and cloud-based data engineering tools.
Expertise in multiple cloud platforms (AWS, GCP, or Azure) and managing cloud-based data infrastructure.
Strong background in database technologies (SQL Server …
City of London, London, United Kingdom Hybrid / WFH Options
Nigel Frank International
…an impact, get in touch ASAP as interviews are already taking place. Don't miss out!
Key Skills: AWS, Data, Architecture, Data Engineering, Data Warehousing, Data Lakes, Databricks, Glue, PySpark, Athena, Python, SQL, Machine Learning, London
…handling to support monitoring and incident response.
Align implementations with InfoSum's privacy, security, and compliance practices.
Required Skills and Experience:
Proven experience with Apache Spark (Scala, Java, or PySpark), including performance optimization and advanced tuning techniques.
Strong troubleshooting skills in production Spark environments, including diagnosing memory usage, shuffles, skew, and executor behavior.
Experience deploying and managing Spark jobs …
Drive automation and CI/CD practices across the data platform.
Explore new technologies to improve data ingestion and self-service.
Essential Skills:
Azure Databricks: expert in Spark (SQL, PySpark) and Databricks Workflows.
Data Pipeline Design: proven experience in scalable ETL/ELT development.
Azure Services: Data Lake, Blob Storage, Synapse.
Data Governance: Unity Catalog, access control, metadata management.
Performance …
It is entering a phase of hardening and re-engineering to enhance the speed, ease, and quality of ingestion across the portfolio.
Key Requirements:
Strong proficiency in Python and PySpark.
A successful track record in a Data Engineering role, including database management (ideally SQL), data orchestration (ideally Apache Airflow or Dagster), containerisation (ideally Kubernetes and Docker), and data pipelines (big data …
…Microsoft Azure data services (Azure Data Factory, Azure Data Fabric, Azure Synapse Analytics, Azure SQL Database).
Experience building ELT/ETL pipelines and managing data workflows.
Proficiency in PySpark, Python, SQL, or Scala.
Strong data modelling and relational database knowledge.
Solid understanding of GDPR and UK data protection.
Preferred: Power BI experience.
Familiarity with legal industry platforms.
Awareness …
…within Microsoft Azure data tools (Azure Data Factory, Azure Synapse, or Azure SQL).
✅ Dimensional modelling expertise for analytics use cases.
✅ Strong ETL/ELT development skills.
✅ Python/PySpark experience.
✅ Experience with CI/CD methodologies for data platforms.
✅ Deep knowledge of SQL.
✅ Extensive London Markets experience.
Why Join?
🚀 New Projects – Work on a new data platform, shaping …
Experience with the Agile/Scrum framework.
Excellent problem-solving and analytical skills.
Excellent communication skills, both at a deep technical level and at stakeholder level.
Expert experience with Databricks (PySpark).
Experience building and maintaining complex ETL projects end-to-end (ingestion, processing, storage).
Expert knowledge of and experience with data modelling, data access, and data storage techniques.
Experience …
City of London, London, United Kingdom Hybrid / WFH Options
eTeam
Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
Skillset Required:
Experience in Python, PySpark and SQL.
Experience with AWS is a plus.
Strong proficiency in Core Java, including Collections, Concurrency, and Memory Management.
Proficient in version control systems such as Git, GitLab, or …