role in designing, building, and maintaining their data infrastructure while collaborating closely with senior stakeholders across the organisation. Your expertise in Azure, Databricks, Spark, Python, and data modelling will be critical in driving the success of their data initiatives. Key Responsibilities: Lead the complete development cycle of data … of data modelling, data warehousing principles, and the Lakehouse architecture. Exceptional proficiency in ETL methodologies, preferably utilising Azure Databricks or equivalent technologies (Spark, SparkSQL, Python, SQL), including deep insight into ETL/ELT design patterns. Proficient in Databricks, SQL …
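The ETL pattern this listing asks for can be sketched in miniature. This is a hedged illustration only: it uses Python's standard library (`csv`, `sqlite3`) as a lightweight stand-in for Databricks/Spark, and the table name `payments` and the positive-amount rule are invented for the example.

```python
import csv
import io
import sqlite3

# Extract: parse raw CSV (in-memory here; in practice, files in cloud storage).
raw = "id,amount\n1,10.5\n2,-3.0\n3,7.25\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: apply a business rule BEFORE loading (classic ETL ordering;
# in an ELT design the raw rows would be loaded first and transformed in-warehouse).
cleaned = [(int(r["id"]), float(r["amount"])) for r in rows if float(r["amount"]) > 0]

# Load: write the transformed rows into the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO payments VALUES (?, ?)", cleaned)

print(conn.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone())
# (2, 17.75)
```

The same three stages map directly onto a Databricks pipeline, with Spark DataFrames replacing the Python list and a Delta table replacing the SQLite table.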
DB Developer:
* Minimum of 8 years' development experience in MySQL/Oracle, comprehending existing SQL queries and writing medium to complex SQL queries.
* Minimum of 3 years' experience in Unix and shell scripting.
* Minimum of 1 year's experience in investment banking or the financial … sector.
* Performance tuning of Oracle/MySQL/Hive SQL queries and SparkSQL statements.
* Experience working with large, multi-terabyte databases (3+ terabytes).
* Minimum of 5 years' experience in the Big Data space (Hive, Impala, SparkSQL, HDFS …
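The "performance tuning" bullet above is, at its core, about reading query plans and adding the right indexes. A minimal sketch of that workflow, using SQLite's `EXPLAIN QUERY PLAN` as a stand-in for Oracle/MySQL/Spark plan inspection (the table `trades` and index name are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER, symbol TEXT, qty INTEGER)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)",
                 [(i, "SYM" + str(i % 100), i) for i in range(1000)])

query = "SELECT * FROM trades WHERE symbol = 'SYM7'"

# Without an index, the planner must scan the whole table.
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()[0][3])
# e.g. "SCAN trades"

# Adding an index on the filtered column turns the scan into an indexed search.
conn.execute("CREATE INDEX idx_symbol ON trades(symbol)")
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()[0][3])
# e.g. "SEARCH trades USING INDEX idx_symbol (symbol=?)"
```

The exact plan wording differs across engines (Oracle's `EXPLAIN PLAN`, MySQL's `EXPLAIN`, Spark's `df.explain()`), but the scan-versus-indexed-access distinction is the same idea everywhere.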
London, England, United Kingdom Hybrid / WFH Options
Maclean Moore Ltd
develop unit test cases. Help in backlog grooming. Key skills: Extensive experience in developing Big Data pipelines in the cloud using Big Data technologies such as Apache Spark. Expertise in performing complex data transformations using SparkSQL queries. Experience in orchestrating data pipelines using Apache Airflow. Proficiency …
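The "complex data transformation using SparkSQL queries" skill above typically means writing aggregations like the one below. This sketch executes the query with SQLite purely as a lightweight stand-in; the query text is plain ANSI SQL, and the `events` table and its columns are invented for the example. On Databricks, an equivalent string would be passed to `spark.sql(...)` against a registered table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, action TEXT, ms INTEGER)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", [
    (1, "click", 120), (1, "click", 80), (2, "view", 300), (2, "click", 50),
])

# A per-user aggregation of the kind SparkSQL pipelines perform:
# filter, group, and compute several aggregates in one pass.
query = """
SELECT user_id,
       COUNT(*) AS n_events,
       SUM(ms)  AS total_ms,
       AVG(ms)  AS avg_ms
FROM events
WHERE action = 'click'
GROUP BY user_id
ORDER BY user_id
"""
for row in conn.execute(query):
    print(row)
# (1, 2, 200, 100.0)
# (2, 1, 50, 50.0)
```

In an Airflow deployment, a query like this would sit inside one task of a DAG, with upstream tasks landing the raw data and downstream tasks publishing the aggregates.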