DAX, M, advanced modelling techniques, DirectQuery, and Import. Strong SQL skills, with the ability to write complex queries for analysis and integration. Experience in dimensional modelling, including star and snowflake schema design. Demonstrated capability in time-travel and snapshot reporting methodologies. Practical experience with integrating data from Dataverse and Databricks into reporting workflows. Experience & Knowledge Proven experience as …
IronPython automation, document properties. Conduct detailed analysis of Spotfire markings, filters and visual structures to produce functional specification documents for migration. Define and redesign semantic data models (star/snowflake schemas) suitable for Power BI. Collaborate with data engineers and Power BI developers to align source data, dataflows and model transformations. Work with business stakeholders to define functional parity …
ETL/ELT pipelines using SQL and Python Integrate internal/external data sources via APIs and platform connectors Model and structure data for scalable analytics (e.g., star/snowflake schemas) Administer Microsoft Fabric Lakehouse and Azure services Optimise performance across queries, datasets, and pipelines Apply data validation, cleansing, and standardisation rules Document pipeline logic and contribute to business …
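The "data validation, cleansing, and standardisation rules" this role describes can be sketched as a small Python transformation step. This is a hypothetical illustration, not any employer's actual pipeline; the field names (`id`, `email`) and rules are assumptions for the example.

```python
# Hypothetical cleansing step: reject rows without a key, and
# standardise email values (trim whitespace, lowercase).
def clean_records(rows):
    """Apply simple validation and standardisation rules to raw rows."""
    cleaned = []
    for row in rows:
        if row.get("id") is None:
            continue  # validation rule: drop records missing a key
        out = dict(row)
        if out.get("email"):
            out["email"] = out["email"].strip().lower()  # standardisation rule
        cleaned.append(out)
    return cleaned

raw = [
    {"id": 1, "email": " Alice@Example.com "},
    {"id": None, "email": "bad@example.com"},  # rejected: no id
    {"id": 2, "email": None},
]
print(clean_records(raw))
# [{'id': 1, 'email': 'alice@example.com'}, {'id': 2, 'email': None}]
```

In a production pipeline a step like this would typically sit in an orchestrated task (or be pushed into SQL/dbt tests), with rejected rows routed to a quarantine table rather than silently dropped.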
such as ERWin, Sparx, or RSA Your Skills and Experience 2-4 years of experience in data modelling or data architecture Exposure to relational and dimensional modelling (3NF, star, snowflake) Strong SQL and data warehousing knowledge Experience in financial services or consulting is highly desirable Strong communication and stakeholder engagement skills …
into a more engineering-focused position, someone who enjoys understanding the business context just as much as building the data solutions behind it. You'll work extensively with Python , Snowflake , SQL , and dbt to design, build, and maintain scalable, high-quality data pipelines and models that support decision-making across the business. This is a hands-on, collaborative role …'s confident communicating with data, product, and engineering teams, not a 'heads-down coder' type. Top 4 Core Skills Python - workflow automation, data processing, and ETL/ELT development. Snowflake - scalable data architecture, performance optimisation, and governance. SQL - expert-level query writing and optimisation for analytics and transformations. dbt (Data Build Tool) - modular data modelling, testing, documentation, and version … build, and maintain dbt models and SQL transformations to support analytical and operational use cases. Develop and maintain Python workflows for data ingestion, transformation, and automation. Engineer scalable, performant Snowflake pipelines and data models aligned with business and product needs. Partner closely with analysts, product managers, and engineers to translate complex business requirements into data-driven solutions. Write production …
initiatives Your Skills and Experience 5+ years of experience in data modelling or architecture, ideally within financial services or consulting Strong knowledge of relational and dimensional design (3NF, star, snowflake) Proficient in ERWin, Sparx, or similar tools Experience working with semi-structured data (JSON, XML, Parquet) Excellent communication and client-facing skills …
SQL - writing and optimizing complex queries, joins, window functions, performance tuning. Experience with dbt (or equivalent) - building models, tests, documentation, version control. Understanding of data warehousing concepts (star/snowflake schemas, slowly changing dimensions, partitioning, clustering). Experience working in a modern data stack (e.g. BigQuery, Snowflake, Redshift, Databricks). Comfortable working downstream (with BI/analytics users) and …
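The window functions and slowly-changing-dimension concepts listed above often combine in one pattern: pick the current row per key from a history table. A minimal sketch using Python's bundled SQLite (window functions require SQLite 3.25+; the table and column names are hypothetical):

```python
import sqlite3

# Hypothetical SCD-style history table: one row per change per customer.
conn = sqlite3.connect(":memory:")  # needs SQLite >= 3.25 for window functions
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE customer_history (
        customer_id INTEGER,
        address     TEXT,
        valid_from  TEXT
    );
    INSERT INTO customer_history VALUES
        (1, 'Old Street', '2022-01-01'),
        (1, 'New Street', '2023-06-01'),
        (2, 'High Road',  '2022-03-15');
""")

# ROW_NUMBER() per customer, newest first; rn = 1 is the current record.
cur.execute("""
    SELECT customer_id, address FROM (
        SELECT customer_id, address,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id
                   ORDER BY valid_from DESC
               ) AS rn
        FROM customer_history
    ) WHERE rn = 1
    ORDER BY customer_id
""")
print(cur.fetchall())  # [(1, 'New Street'), (2, 'High Road')]
```

A Type-2 SCD would add `valid_to`/`is_current` columns maintained on load; the `ROW_NUMBER()` pattern is the query-side equivalent when only `valid_from` exists.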
This role demands proven, hands-on experience in the following areas: Foundational Modeling: Absolute mastery of OLAP/OLTP Modeling and extensive experience in Dimensional Data Modeling (Star/Snowflake schemas). Architecture Design: Expert in Data Architecture and designing modern Data Lakehouse Architecture . Cloud Platform: Proven architectural experience with GCP is mandatory. Data Governance: Strong conceptual understanding …
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
requirements and support business deliverables.* Collect, transform, and process datasets from various internal and external sources, ensuring data quality, governance, and integrity.* Implement efficient data models and schemas within Snowflake, and use DBT for transformation, orchestration, and workflow management.* Optimise ELT/ETL processes for improved performance, cost efficiency, and scalability.* Troubleshoot and resolve data pipeline issues swiftly and … technologies in data engineering, and continuously improve your skills and knowledge. Profile * Minimum 3 years' experience working as a Data Engineer in a commercial environment.* Strong commercial experience with Snowflake and DBT.* Proficient in SQL and experienced in data modelling within cloud data warehouses.* Familiarity with cloud platforms such as AWS or Azure.* Experience with Python, Databricks, or related …
London, South East, England, United Kingdom Hybrid/Remote Options
Salt Search
landscapes (MDM, CRM, ERP, Cloud DWH) through pragmatic and scalable architectural blueprints. Cloud Data Platform Leadership Design and implement high-performance cloud data platforms (AWS, Azure, Google Cloud, Databricks, Snowflake), overseeing data modelling, integration, transformation, and DevOps pipelines. Integrated Solution Architecture Design seamless integrations between cloud data platforms, AI/GenAI platforms, and business-critical systems (e.g., MDM, CRM … as a Data Architect , leading design and implementation of complex cloud-based data ecosystems. Solid engineering background with hands-on data platform implementation experience (AWS, Azure, GCP, Databricks, or Snowflake). Proven ability to evaluate data architecture decisions, influence business and IT stakeholders, and define strategic data direction. Strong understanding of coding best practices , code quality tools (e.g., SonarQube …