within the Azure cloud environment. Proficiency with Azure Data Factory, Synapse Analytics, Databricks, Azure SQL, and Azure Storage. Strong SQL skills and expertise in dimensional modelling (e.g., star/snowflake schemas). Familiarity with Power BI dataflows, DAX, and RLS setup. Hands-on experience with Python, PySpark, or T-SQL for data processing and automation. Understanding of CI/ …
or BI Analyst in mid-to-large data environments. Advanced SQL, with significant hands-on experience using Google BigQuery for analytics at scale. Strong data modelling skills (star/snowflake schema, optimisation, partitioning, clustering). Experience building dashboards using Looker Studio, Looker, Power BI, or Tableau. Proficiency working with large datasets and performance-tuning analytical queries. Strong …
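The star-schema modelling and analytical-query skills named in these listings follow one recurring pattern: a central fact table joined to small dimension tables and aggregated by dimension attributes. A minimal sketch in Python using sqlite3 (standing in for BigQuery/Snowflake; all table and column names are illustrative, not from any posting):

```python
import sqlite3

# Minimal star schema: one fact table, two dimension tables.
# Table and column names are hypothetical examples.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount REAL
);
INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
INSERT INTO fact_sales VALUES (20240101, 1, 100.0), (20240101, 2, 50.0), (20240201, 1, 75.0);
""")

# Typical analytical query: join the fact table to its dimensions
# and aggregate by dimension attributes.
rows = cur.execute("""
    SELECT d.month, p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.month, p.name
    ORDER BY d.month, p.name
""").fetchall()
print(rows)  # [(1, 'Gadget', 50.0), (1, 'Widget', 100.0), (2, 'Widget', 75.0)]
```

In BigQuery specifically, the partitioning and clustering mentioned above would be declared on the fact table (e.g. partition by date, cluster by product key) so that queries like this one scan less data; sqlite has no equivalent, so that part is not shown.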
IronPython automation, document properties. Conduct detailed analysis of Spotfire markings, filters and visual structures to produce functional specification documents for migration. Define and redesign semantic data models (star/snowflake schemas) suitable for Power BI. Collaborate with data engineers and Power BI developers to align source data, dataflows and model transformations. Work with business stakeholders to define functional parity …
This role demands proven, hands-on experience in the following areas: Foundational Modeling: Absolute mastery of OLAP/OLTP Modeling and extensive experience in Dimensional Data Modeling (Star/Snowflake schemas). Architecture Design: Expert in Data Architecture and designing modern Data Lakehouse Architecture. Cloud Platform: Proven architectural experience with GCP is mandatory. Data Governance: Strong conceptual understanding …
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
requirements and support business deliverables.* Collect, transform, and process datasets from various internal and external sources, ensuring data quality, governance, and integrity.* Implement efficient data models and schemas within Snowflake, and use DBT for transformation, orchestration, and workflow management.* Optimise ELT/ETL processes for improved performance, cost efficiency, and scalability.* Troubleshoot and resolve data pipeline issues swiftly and … technologies in data engineering, and continuously improve your skills and knowledge. Profile * Minimum 3 years' experience working as a Data Engineer in a commercial environment.* Strong commercial experience with Snowflake and DBT.* Proficient in SQL and experienced in data modelling within cloud data warehouses.* Familiarity with cloud platforms such as AWS or Azure.* Experience with Python, Databricks, or related …
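The Snowflake/DBT work described above boils down to SQL transformation models layered over raw loaded data, with data-quality filters applied in the model itself. A minimal sketch of that ELT pattern in Python, using sqlite3 in place of Snowflake (all table and column names are hypothetical; a real dbt model would be a SELECT in its own .sql file):

```python
import sqlite3

# "Load" step: raw data lands as-is, typed as text, warts and all.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE raw_orders (order_id TEXT, amount TEXT, status TEXT);
INSERT INTO raw_orders VALUES
    ('A-1', '19.99', 'complete'),
    ('A-2', 'oops',  'complete'),   -- unparseable amount: should be dropped
    ('A-3', '5.00',  'cancelled');  -- cancelled: excluded from the model
""")

# "Transform" step, the way a dbt staging model would express it:
# cast, filter on status, and drop rows whose amount fails the cast
# (in sqlite, CAST of a non-numeric string yields 0.0).
cur.executescript("""
CREATE TABLE stg_orders AS
SELECT order_id, CAST(amount AS REAL) AS amount
FROM raw_orders
WHERE status = 'complete'
  AND CAST(amount AS REAL) > 0;
""")

rows = cur.execute("SELECT order_id, amount FROM stg_orders ORDER BY order_id").fetchall()
print(rows)
```

dbt's contribution on top of plain SQL is managing the dependency graph between such models, materialising them in order, and running declared tests (e.g. not-null, uniqueness) against the results.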