this field, we'd like to hear from you. Design and implement data pipelines using Databricks, Azure Data Factory, and other relevant tools. Develop and maintain data models, including star schema modeling, to support business intelligence and analytics. Implement and maintain CI/CD pipelines using Azure DevOps. Gather and understand end-user requirements. Maintain and improve Power BI …
responsibilities: Partner with business teams to understand requirements and deliver meaningful, actionable insights. Design and implement data pipelines from APIs and relational sources. Model data effectively using Kimball/Star Schema methodologies. Develop dashboards, reports, and automated integrations in Power BI. Support the onboarding of data from newly acquired businesses. Contribute to data strategy, process improvement, and best … skills and experience: Advanced SQL proficiency (joins, CTEs, window functions). Strong Power BI skills, including semantic modelling, DAX, and report design. Experience building data warehouse solutions (Kimball/Star Schema). Excellent communication and stakeholder engagement skills. Proactive, organised, and adaptable with a genuine team spirit. Desirable: Knowledge of Medallion Architecture and Data Lakehouse concepts. Working knowledge …
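For illustration of the SQL skills this listing names (joins, CTEs, window functions over a Kimball-style star schema), here is a minimal sketch; the fact_sales and dim_date tables and their columns are hypothetical and not taken from the listing.

```sql
-- Hypothetical star schema: fact_sales joined to dim_date.
-- Rank each month's revenue within its year using a CTE and a window function.
WITH monthly_revenue AS (
    SELECT
        d.calendar_year,
        d.calendar_month,
        SUM(f.net_amount) AS revenue
    FROM fact_sales AS f
    JOIN dim_date   AS d ON f.date_key = d.date_key
    GROUP BY d.calendar_year, d.calendar_month
)
SELECT
    calendar_year,
    calendar_month,
    revenue,
    RANK() OVER (PARTITION BY calendar_year ORDER BY revenue DESC) AS month_rank
FROM monthly_revenue
ORDER BY calendar_year, month_rank;
```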
Concepts Foundational understanding of Big Data architecture (Data Lakes, Data Warehouses) and distributed processing concepts (e.g., MapReduce). ETL/ELT Basic knowledge of ETL principles and data modeling (star schema, snowflake schema). Version Control Practical experience with Git (branching, merging, pull requests). Preferred Qualifications (A Plus) Experience with a distributed computing framework like Apache …
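As a rough sketch of the star versus snowflake distinction mentioned above, the DDL below models a hypothetical product dimension both ways; all table and column names are illustrative assumptions.

```sql
-- Star schema: one denormalised dimension holds category attributes directly.
CREATE TABLE dim_product_star (
    product_key    INT PRIMARY KEY,
    product_name   VARCHAR(200),
    category_name  VARCHAR(100),   -- category attributes kept inline
    category_group VARCHAR(100)
);

-- Snowflake schema: the same attributes are normalised into a separate table.
CREATE TABLE dim_category (
    category_key   INT PRIMARY KEY,
    category_name  VARCHAR(100),
    category_group VARCHAR(100)
);

CREATE TABLE dim_product_snowflake (
    product_key  INT PRIMARY KEY,
    product_name VARCHAR(200),
    category_key INT REFERENCES dim_category (category_key)
);
```

The trade-off is the usual one: the star form favours simpler joins and faster reporting queries, while the snowflake form reduces redundancy in the dimension.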
Watford, Hertfordshire, East Anglia, United Kingdom
Akkodis
with tools such as MuleSoft, Boomi, Informatica, Talend, SSIS, or custom scripting languages (Python, PySpark, SQL) for data extraction and transformation. Prior experience with data warehousing and data modelling (Star Schema or Snowflake Schema). Skilled in security frameworks such as GDPR, HIPAA, ISO 27001, NIST, SOX, and PII, with expertise in IAM, KMS, and RBAC implementation. …
Factory, SSIS, and Azure Data Storage solutions. Familiarity with programming languages such as Python or C# for scripting and automation. Proven experience in designing data warehouses, data lakes, and star schema models. Proficient in implementing DevOps practices using Azure DevOps, GitHub, and CI/CD pipelines. Strong understanding of big data architectures, data integration patterns, and best practices. …
London (City of London), South East England, United Kingdom
Sanderson
environments Implement CI/CD processes using Azure DevOps for secure, reliable deployment Technical Skills: Strong expertise in: Power BI and paginated reporting SQL and data architecture Dimensional modelling (star schema, snowflake, denormalised structures, SCD handling) DAX, Visual Studio and data transformation logic Azure Fabric, Azure Data Factory, Synapse, Data Lakes and Lakehouse/Warehouse technologies ETL/ …
Proficiency in ETL/ELT processes and best practices - Experience with data visualization tools (QuickSight) Required Skills: - Strong analytical and problem-solving abilities - Excellent understanding of dimensional modeling and star schema design (facts, dimensions, SCD Type 2) - Experience with agile development methodologies - Strong communication skills and ability to work with cross-functional teams - Background in data governance and …
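For context on the SCD Type 2 pattern named in these requirements, here is a minimal two-step sketch, assuming PostgreSQL-style UPDATE ... FROM syntax and hypothetical dim_customer / staging_customer tables; it is not taken from the listing itself.

```sql
-- Minimal SCD Type 2 sketch for a customer dimension (illustrative names).
-- Step 1: close out the current row when a tracked attribute has changed.
UPDATE dim_customer
SET    valid_to   = CURRENT_DATE,
       is_current = FALSE
FROM   staging_customer AS s
WHERE  dim_customer.customer_id      = s.customer_id
  AND  dim_customer.is_current       = TRUE
  AND  dim_customer.customer_segment <> s.customer_segment;

-- Step 2: insert a new current row for any customer without a current version
-- (covers both brand-new customers and those just expired in step 1).
INSERT INTO dim_customer (customer_id, customer_segment, valid_from, valid_to, is_current)
SELECT s.customer_id, s.customer_segment, CURRENT_DATE, DATE '9999-12-31', TRUE
FROM   staging_customer AS s
WHERE  NOT EXISTS (
         SELECT 1
         FROM   dim_customer AS d
         WHERE  d.customer_id = s.customer_id
           AND  d.is_current  = TRUE
       );
```

Expiring the old row before inserting the new one preserves full history while leaving exactly one row per customer flagged as current.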
City of London, London, United Kingdom Hybrid / WFH Options
Fuse Group
data pipelines using Fabric Pipelines, Azure Data Factory, Notebooks and SSIS. Develop Power BI dashboards and paginated reports tailored to business needs. Model complex datasets into actionable insights using star schemas, snowflakes, and denormalised models. Optimise SQL queries and build semantic models within Fabric. Collaborate with cross-functional teams to translate business requirements into technical solutions. Support CI/ …
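As one illustration of the modelling mentioned above, the sketch below flattens a hypothetical star schema into a denormalised view that a Power BI semantic model could consume; the table and column names are assumptions, and the listing does not prescribe this specific approach.

```sql
-- Flatten a hypothetical star schema into a single denormalised view
-- that a downstream semantic model or report can consume directly.
CREATE VIEW sales_flat AS
SELECT
    d.calendar_date,
    d.calendar_year,
    p.product_name,
    p.category_name,
    c.customer_segment,
    f.quantity,
    f.net_amount
FROM fact_sales   AS f
JOIN dim_date     AS d ON f.date_key     = d.date_key
JOIN dim_product  AS p ON f.product_key  = p.product_key
JOIN dim_customer AS c ON f.customer_key = c.customer_key;
```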