and performance tuning. Solid experience with Python, ideally using PySpark in Azure Databricks. Hands-on experience with Azure Data Lake Storage Gen2. Understanding of data warehouse concepts, dimensional modelling, and data architecture. Experience working with Delta Lake and large-scale data processing. Experience building ETL pipelines in Azure Data Factory or similar orchestration tools. Familiarity …
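For context on what the PySpark-on-Databricks work above typically looks like, here is a minimal, hedged sketch of an ingest into a Delta table on ADLS Gen2. The storage account, container, and column names are illustrative assumptions, not details from the listing:

```python
# Minimal sketch: write a partitioned Delta table to ADLS Gen2 with PySpark.
# Storage account, container, and column names are placeholder assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales-ingest").getOrCreate()

# Read raw JSON landed in the lake (path is an assumption)
raw = spark.read.json("abfss://landing@mystorageacct.dfs.core.windows.net/sales/")

# Basic cleansing: deduplicate on the business key and derive a date column
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
)

# Append to a date-partitioned Delta table in the curated zone
(cleaned.write
    .format("delta")
    .mode("append")
    .partitionBy("order_date")
    .save("abfss://curated@mystorageacct.dfs.core.windows.net/delta/sales"))
```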
them with robust data pipelines. DESIRABLE LANGUAGES/TOOLS Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks, Spark, Kafka, and Flink. Experience in using modern data architectures …
the mobile/telecoms industry would be a bonus! Key outputs for the role: • Design, build, and maintain scalable and trustworthy data models in dbt, making use of Kimball Dimensional and One Big Table (OBT) methodologies. • Translate business requirements from stakeholders into robust, well-documented and tested dbt models. • Develop and own workflows within Google Cloud Platform environments, primarily … requirements and implement robust, well-documented and performant dbt models that serve as a single source of truth for business reporting. • Implement and champion data quality testing, documentation standards, and modelling best practices within dbt projects. • Troubleshoot and resolve any issues or errors in the data pipelines. • Stay updated with the latest cloud technologies and industry best practices to continuously … to see: • Expert-level proficiency in SQL. • Deep, practical experience of building and architecting data models with dbt. • A strong understanding of data warehousing concepts and data modelling techniques (e.g., Kimball, Dimensional Modelling, One Big Table). • Solid, hands-on experience within the Google Cloud Platform, especially with BigQuery. • Proven experience working directly with business …
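The dbt models this role describes are SQL under the hood; as a hedged Python sketch of the same idea, the snippet below materialises a One Big Table from a Kimball-style star schema via the BigQuery client library. The project, dataset, and column names are assumptions, and in the role itself this logic would live in a dbt model rather than a script:

```python
# Hypothetical sketch: build an OBT from a star schema in BigQuery.
# All object names are assumptions; dbt would normally own this SQL.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # assumed project id

obt_sql = """
CREATE OR REPLACE TABLE analytics.obt_orders AS
SELECT
  f.order_id,
  f.order_amount,
  d.full_date         AS order_date,
  c.customer_name,
  c.customer_segment
FROM analytics.fct_orders AS f
JOIN analytics.dim_date     AS d ON f.date_key = d.date_key
JOIN analytics.dim_customer AS c ON f.customer_key = c.customer_key
"""

client.query(obt_sql).result()  # blocks until the query job completes
print("OBT rebuilt")
```

The trade-off the listing hints at: the star schema stays the governed source of truth, while the OBT denormalises it into one wide table for cheap, simple reads.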
They closely collaborate with data scientists, analysts, and software engineers to ensure efficient data processing, storage, and retrieval for business insights and decision-making. Their expertise in data modelling, ETL (Extract, Transform, Load) processes, and big data technologies makes it possible to develop robust and reliable data solutions. RESPONSIBILITIES Data Pipeline Development: Design, implement, and maintain scalable data … in software engineering a plus. DESIRABLE LANGUAGES/TOOLS Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks, Spark, Kafka, and Flink. Experience in using modern data architectures …
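As a hedged illustration of the Spark and Kafka experience listed, here is a structured-streaming sketch that reads a Kafka topic and lands it as Delta. The broker address, topic, schema, and paths are assumptions, and the Delta sink presumes a Databricks-style environment:

```python
# Illustrative sketch: stream JSON events from Kafka into a Delta bronze table.
# Brokers, topic, schema, and paths are placeholder assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Expected event payload (assumed schema)
schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream
         .format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "events")
         .load()
         # Kafka delivers bytes; cast and parse the JSON value column
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

query = (
    events.writeStream
          .format("delta")
          .option("checkpointLocation", "/checkpoints/events")  # exactly-once bookkeeping
          .start("/lake/bronze/events")
)
```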
Kirkby on Bain, England, United Kingdom Hybrid / WFH Options
ANGLIAN WATER-2
on your location) and home working (expectation is 2 days in the office) Permanent Full time, 37 hours per week Do you have prior experience of Enterprise Data Engineering and dimensional modelling? Do you have experience of working within an Agile environment, or are you keen to work in one? Do you like to always explore, learn, challenge yourself and … of new team members & coach other Data Engineers Actively contribute to drafting and updating our service policies, procedures, work instructions and guidance notes. As a BI Solution Architect/Dimensional Modeller, design robust, secure and supportable corporate data solutions to meet business requirements following dimensional modelling methodology, considering privacy by design and self-service capabilities by default. … be an Enterprise Data Engineer? Previous strong experience in data engineering, ideally using Databricks, Azure Data Factory, Spark, Python, SQL, Power BI Strong data engineering experience of at least 3–5 years Dimensional data modelling Experience in delivering end-to-end BI solutions from requirements and design to delivery Experience of working within an Agile/Scrum environment Experience/understanding of …
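To give a flavour of the dimensional-modelling work in Databricks this role describes, here is a hedged sketch of a Type 1 upsert into a customer dimension using the Delta Lake Python API; the table and column names are assumptions:

```python
# Sketch: Type 1 (overwrite-in-place) upsert into a dimension table with
# Delta Lake's MERGE. Table and column names are placeholder assumptions.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Latest snapshot of customers from the staging layer (assumed table)
updates = spark.read.table("staging.customers")

dim = DeltaTable.forName(spark, "gold.dim_customer")

(dim.alias("d")
    .merge(updates.alias("u"), "d.customer_id = u.customer_id")
    .whenMatchedUpdateAll()      # overwrite changed attributes in place
    .whenNotMatchedInsertAll()   # add customers seen for the first time
    .execute())
```

A Type 2 variant would expire the current row and insert a new version instead of updating in place, preserving attribute history.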
experience working with Fabric-based solutions. Azure Ecosystem: At least 2 years of experience delivering business solutions using: Azure Data Factory/Synapse Power BI Azure SQL Server Data Modelling: Strong knowledge of dimensional modelling (Kimball). Familiarity with other techniques such as Data Vault 2.0. Development Experience: Minimum 3 years in database and/or analytics …
with a few in-person meetings in shared co-working spaces on an ad hoc basis. Role Description We are looking for an SQL Developer (Snowflake), specializing in data modelling, ETL processes, and cloud-based data solutions. This position requires expertise in Snowflake, Azure, Python and Power BI, with a strong focus on building semantic models and supporting analytics. … and translate them into scalable technical solutions. Ensure data quality, consistency, and performance across environments. Monitor and tune Snowflake performance, storage, and compute usage. Implement best practices in data modelling, schema design, and cloud architecture. Collaborate on CI/CD and automation initiatives for data deployments. Maintain technical documentation for processes, pipelines, models, and reports. Skills Required: Proven experience … code, especially within Snowflake stored procedures. Familiarity with modern cloud platforms for data integration, such as Azure Data Factory, Databricks, or similar tools. Experience in Power BI, including data modelling, DAX, and managing semantic models. Good understanding of data warehousing, dimensional modelling, and ELT best practices. Knowledge of version control and Agile development methodologies. Qualifications: Strong experience …
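A hedged sketch of the Snowflake-plus-Python side of this role, using the official connector to rebuild a fact table; every connection parameter and object name is a placeholder assumption:

```python
# Sketch: run a transformation in Snowflake from Python via the official
# connector. All connection details and object names are assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="...",          # in practice, prefer key-pair auth or a secret store
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="MARTS",
)

try:
    cur = conn.cursor()
    cur.execute("""
        CREATE OR REPLACE TABLE fct_sales AS
        SELECT order_id, customer_id, SUM(amount) AS total_amount
        FROM raw.orders
        GROUP BY order_id, customer_id
    """)
    cur.execute("SELECT COUNT(*) FROM fct_sales")
    print(cur.fetchone()[0])  # quick row-count sanity check
finally:
    conn.close()
```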
Catalog for data governance and cataloging. Familiarity with Terraform for infrastructure automation. Solid experience working with GitHub in a collaborative development environment. Deep understanding of data modeling concepts (e.g., dimensional modeling, star/snowflake schemas). Proven experience working with sales or transactional data. Ability to build and maintain clean, performant data layers for Power BI dashboards.
including Azure Databricks, SQL Database, Data Factory, and T-SQL. In-depth understanding of London Market insurance data and Actuarial/Finance business processes. Experience with ETL processes, data modelling, and dimensional modelling techniques. Proficient with Azure DevOps, CI/CD, Git, and deployment strategies. Proven track record in delivering cloud-based enterprise data solutions and supporting …
experience Comprehensive Data Engineering background - proven track record in enterprise data solutions Experience with ETL processes and data transformation, preferably using Databricks Strong foundation in Data Warehousing architectures and dimensional modeling Familiarity with batch processing from relational database sources Communication & Collaboration Skills of the Data Engineer Outstanding stakeholder engagement abilities across technical and business audiences Strong relationship-building skills …
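A minimal sketch of the batch processing from relational sources mentioned above, reading over JDBC into Spark; the connection URL, credentials, and table names are assumptions:

```python
# Sketch: batch-extract a relational table over JDBC into a bronze Delta table.
# Connection URL, credentials, and table names are placeholder assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("batch-extract").getOrCreate()

orders = (
    spark.read.format("jdbc")
         .option("url", "jdbc:sqlserver://dbhost:1433;databaseName=sales")
         .option("dbtable", "dbo.orders")
         .option("user", "etl_user")
         .option("password", "...")      # fetch from a secret scope in practice
         .option("fetchsize", "10000")   # larger fetches cut round trips
         .load()
)

orders.write.format("delta").mode("overwrite").saveAsTable("bronze.orders")
```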
depth knowledge of Snowflake architecture, features, and best practices. · Experience with CI/CD pipelines using Git and GitHub Actions. · Knowledge of various data modeling techniques, including Star Schema, Dimensional models, and Data Vault. · Hands-on experience with: · Developing data pipelines (Snowflake) and writing complex SQL queries. · Building ETL/ELT/data pipelines. · Kubernetes and Linux containers (e.g., Docker …
Crawley, England, United Kingdom Hybrid / WFH Options
McCabe & Barton
design and deliver scalable data products that enable self-service analytics using Snowflake, Power BI, Python, and SQL. What You’ll Do: Build robust ETL/ELT pipelines and dimensional models for analytics Enforce data quality, security, and governance in a Data Mesh setup Enable self-service insights through intuitive data models and training Own data products end-to … continuous improvement Promote innovation and best practices across the organisation What We’re Looking For: Strong skills in SQL and Power BI. Python is beneficial Solid experience with data modelling and ETL pipelines Knowledge of cloud environments – Azure Experience with Snowflake and Databricks Familiarity with data governance in decentralised environments Excellent communication and stakeholder engagement skills A proactive, problem …
including scheduling, monitoring, and alerting. Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows. Champion best practices in data modelling, governance, and DevOps for data engineering (CI/CD, IaC). Serve as a key communicator between technical teams and business stakeholders, translating complex data needs into actionable plans. … Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud platforms (AWS, GCP, or Azure). Familiarity with version control, CI/CD, and Infrastructure-as-Code (Terraform or similar).
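A hedged sketch of the orchestration work described above: a small Airflow DAG with a daily schedule, retries, and failure alerting. The task bodies, DAG id, and alert address are illustrative assumptions:

```python
# Sketch: a minimal Airflow DAG with scheduling, retries, and failure alerts.
# DAG id, schedule, task bodies, and email address are assumptions.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Pull data from the source system (placeholder)."""


def load():
    """Load transformed data into the warehouse (placeholder)."""


default_args = {
    "retries": 2,
    "retry_delay": timedelta(minutes=10),
    "email_on_failure": True,
    "email": ["data-alerts@example.com"],  # assumed alert address
}

with DAG(
    dag_id="daily_warehouse_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```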
and emerging technologies. What You’ll Bring ✅ Extensive hands-on experience with Databricks and Microsoft Azure data tools (must-have: Azure Data Factory, Azure Synapse, or Azure SQL). ✅ Dimensional modelling expertise for analytics use cases. ✅ Strong ETL/ELT development skills. ✅ Python scripting experience for data automation. ✅ Experience with CI/CD methodologies for data platforms. ✅ Knowledge …
SQL Server) Qualifications Desired qualifications and experience: Experience with programming languages such as Python and SQL. Experience building and maintaining data pipelines. Familiarity with medallion architecture. An understanding of dimensional modelling concepts. Proficiency with reporting tools, preferably Power BI. Practical experience working in an Agile Scrum environment. At Solus, we offer a collaborative and innovative work environment where …
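To illustrate the medallion concept named in that list, a minimal PySpark sketch of the bronze-to-silver hop; the paths and columns are assumptions:

```python
# Sketch: medallion-style bronze -> silver refinement in PySpark.
# Lake paths, table subject, and columns are placeholder assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze holds raw, append-only data exactly as ingested
bronze = spark.read.format("delta").load("/lake/bronze/policies")

# Silver applies cleansing rules: dedupe, drop null keys, stamp lineage
silver = (
    bronze.dropDuplicates(["policy_id"])
          .filter(F.col("policy_id").isNotNull())
          .withColumn("ingested_at", F.current_timestamp())
)

silver.write.format("delta").mode("overwrite").save("/lake/silver/policies")
```

Gold tables for reporting (the Power BI layer the listing mentions) would then be dimensional models built on top of silver.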
About the Platform: This greenfield initiative is focused on building a next-gen data ecosystem with a tech stack including: Snowflake for cloud data warehousing, dbt for transformation and modelling, Azure for cloud infrastructure and orchestration, Fivetran for automated data ingestion, and Power BI and other modern BI tools for reporting and visualisation. What You'll Do: Design and implement … BI developers to deliver insightful, high-performance dashboards Work with Data Engineers to optimise data ingestion and orchestration pipelines using Azure Data Factory and Fivetran Apply best practices in dimensional modelling, layered architecture, and data quality What We're Looking For: Strong experience in data modelling and SQL Hands-on experience with dbt and cloud data platforms like Snowflake or Azure Synapse Analytics Solid understanding of modern data stack principles, including layered modelling and data warehouse design Excellent communication skills and the ability to work with stakeholders across technical and non-technical teams Nice to have: Experience with Power BI or similar BI tools Familiarity with CI/CD practices in data environments Exposure to data …
Bachelor’s or master’s degree in a relevant field 5+ years of proven experience as a Data Engineer Strong knowledge of SQL and relational databases Experience with data modelling, data warehouses, data lakes, and ETL Familiarity with cloud-based BI solutions Experience with dimensional modelling Solid analytical and communication skills Fluent in Dutch (written and spoken)
Business Intelligence Engineer Hybrid – London Up to £450 a day Inside IR35 6 Months Key Skills: Business Intelligence (data modelling, data warehousing, dashboarding) SQL & Python AWS (S3, Lambda, Glue, Redshift) The Senior Business Intelligence Engineer occupies a unique role at the intersection of technology, marketing, finance, statistics, data mining, and social science. We provide the key insight into customer … using AWS CDK - Proficiency in ETL/ELT processes and best practices - Experience with data visualization tools (QuickSight) Required Skills: - Strong analytical and problem-solving abilities - Excellent understanding of dimensional modeling and star schema design (facts, dimensions, SCD Type 2) - Experience with agile development methodologies - Strong communication skills and ability to work with cross-functional teams - Background in data …
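To make the star-schema requirement concrete, a hedged sketch of the SCD Type 2 pattern run against Redshift with the redshift_connector package; every table, column, and connection detail is a placeholder assumption:

```python
# Sketch: SCD Type 2 in two statements against Redshift.
# All connection details and object names are placeholder assumptions.
import redshift_connector

conn = redshift_connector.connect(
    host="my-cluster.abc123.eu-west-2.redshift.amazonaws.com",
    database="analytics",
    user="etl_user",
    password="...",  # prefer IAM auth or a secret store in practice
)

# Step 1: expire the current row for any customer whose tracked attribute changed
expire_changed = """
UPDATE dim_customer d
SET is_current = FALSE, valid_to = GETDATE()
FROM staging_customer s
WHERE d.customer_id = s.customer_id
  AND d.is_current
  AND d.segment <> s.segment;
"""

# Step 2: insert a fresh current version for new keys and just-expired ones
insert_new_versions = """
INSERT INTO dim_customer (customer_id, segment, valid_from, valid_to, is_current)
SELECT s.customer_id, s.segment, GETDATE(), NULL, TRUE
FROM staging_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current
WHERE d.customer_id IS NULL;
"""

cur = conn.cursor()
cur.execute(expire_changed)
cur.execute(insert_new_versions)
conn.commit()
conn.close()
```

The ordering matters: expiring changed rows first means the insert's anti-join naturally picks them up alongside brand-new keys.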
City of London, London, England, United Kingdom Hybrid / WFH Options
Avanti
in SQL and Python for data transformation and workflow automation Experience with AWS data tools (e.g. Redshift, Glue, Lambda, S3) and infrastructure tools such as Terraform Understanding of data modelling concepts (e.g. dimensional models, star/snowflake schemas) Knowledge of data quality, access controls, and compliance frameworks Nice to Have Experience with orchestration or pipeline frameworks like Airflow …
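As a sketch of how those AWS pieces often fit together (an assumption about this particular stack, not a detail from the listing), here is a Lambda handler that starts a Glue job when a file lands in S3; the job name and bucket layout are hypothetical:

```python
# Sketch: event-driven ingestion on AWS — an S3 upload triggers this Lambda,
# which kicks off a Glue job. Job name and bucket layout are assumptions.
import boto3

glue = boto3.client("glue")

def handler(event, context):
    # S3 event notifications deliver one or more records per invocation
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        glue.start_job_run(
            JobName="transform-landing-data",           # assumed Glue job
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
    return {"status": "ok"}
```

The S3 notification, Lambda, and Glue job themselves would typically be provisioned with the Terraform experience the listing asks for.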