- Build ETL/ELT pipelines using SQL and Python
- Integrate internal/external data sources via APIs and platform connectors
- Model and structure data for scalable analytics (e.g., star/snowflake schemas)
- Administer Microsoft Fabric Lakehouse and Azure services
- Optimise performance across queries, datasets, and pipelines
- Apply data validation, cleansing, and standardisation rules
- Document pipeline logic and contribute to business …
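For context on the star-schema modelling mentioned above, here is a minimal sketch using Python's built-in sqlite3: one fact table referencing two dimension tables, plus a typical analytical join. All table and column names (dim_date, dim_product, fact_sales) are hypothetical, chosen only to illustrate the pattern.

```python
import sqlite3

# Illustrative star schema: a central fact table keyed to dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    sale_id     INTEGER PRIMARY KEY,
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL
);
INSERT INTO dim_date VALUES (20240101, '2024-01-01');
INSERT INTO dim_product VALUES (1, 'Widget');
INSERT INTO fact_sales VALUES (1, 20240101, 1, 9.99), (2, 20240101, 1, 5.01);
""")

# Analytical queries join the fact table to its dimensions and aggregate.
cur.execute("""
SELECT p.name, SUM(f.amount)
FROM fact_sales f
JOIN dim_product p ON p.product_key = f.product_key
GROUP BY p.name
""")
result = cur.fetchall()
print(result)  # [('Widget', 15.0)]
conn.close()
```

A snowflake schema extends this by normalising the dimension tables themselves (e.g., splitting product category out of dim_product into its own table).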
London, South East, England, United Kingdom Hybrid / WFH Options
Randstad Technologies
- Design and implement solutions using GCP, with a strong focus on Data Lakehouse Architecture, Master Data Management (MDM), and Dimensional Data Modeling
- Work with modern databases and platforms, including Snowflake, Oracle, SQL Server, and PostgreSQL
- Apply Agile and conventional methodologies to manage development and delivery lifecycles
- Communicate effectively with stakeholders across the business to ensure alignment and engagement

Required Technical Skills:
- GCP Data Architecture
- Data Lakehouse Architecture
- MDM (conceptual)
- Dimensional Data Modeling
- Snowflake, Oracle, SQL Server, PostgreSQL
- Python and Power BI (desirable)
- Knowledge of test automation tools and practices
- Strong understanding of Agile and software development best practices

Ideal Candidate Profile:
- Extensive experience delivering data programmes within the Insurance or Reinsurance sector
- Strong leadership and organisational skills …