You would be responsible for designing, testing, and troubleshooting the company's data storage system before it goes live. Knowledge of dimensional modeling and data warehouse concepts such as star schemas, snowflake schemas, dimensions, and facts. Define scope and execution plans, and coordinate test activities. Ensure all sign-offs on deliverables (overall test strategy, test plan, test cases) and that testing meets …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
design and delivery at scale (high level and detailed) and various architectural strategies Solid information architecture skills/experience: Data ingestion, and Data Warehouse and Data Lake modelling experience (Star Schema, Snowflake Schema, Data Vault etc.) Past hands-on development experience in at least one enterprise analytics database like Oracle/Teradata/SQL Server/Snowflake …
Employment Type: Permanent, Part Time, Work From Home
for recurring tasks in line with documentation standards. Essential Skills & Experience Proficient in working with UK mortgage data, including portfolio onboarding, new lending, and securitisations. Experience with Kimball-based star schema data warehouses. Strong capabilities in reporting and data visualisation using: Microsoft SQL Server T-SQL Power BI SSRS Microsoft Excel Azure DevOps Performance tuning techniques Solid understanding …
Proficiency in ETL/ELT processes and best practices - Experience with data visualization tools (QuickSight) Required Skills: - Strong analytical and problem-solving abilities - Excellent understanding of dimensional modeling and star schema design (facts, dimensions, SCD Type 2) - Experience with agile development methodologies - Strong communication skills and ability to work with cross-functional teams - Background in data governance and …
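Several of these roles ask for SCD Type 2 experience. As a minimal sketch of the pattern, in which an attribute change closes out the current dimension row and appends a new one rather than overwriting it, here is a plain-Python illustration; the column names (`customer_id`, `valid_from`, `is_current`) are illustrative, not taken from any specific employer's system:

```python
from datetime import date

def scd2_upsert(dim_rows, business_key, new_attrs, effective_date):
    """Slowly Changing Dimension Type 2: close the current row for
    business_key (if its attributes changed) and append a new current row."""
    for row in dim_rows:
        if row["customer_id"] == business_key and row["is_current"]:
            if all(row[k] == v for k, v in new_attrs.items()):
                return dim_rows  # nothing changed, keep the row as-is
            row["is_current"] = False
            row["valid_to"] = effective_date  # close out the old version
    dim_rows.append({
        "customer_id": business_key,
        **new_attrs,
        "valid_from": effective_date,
        "valid_to": None,
        "is_current": True,
    })
    return dim_rows

# Illustrative dimension with one current row.
dim_customer = [{
    "customer_id": 42, "city": "Bristol",
    "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True,
}]

# A change arrives: the Bristol row is closed, a London row is appended,
# so the full history of the attribute is preserved.
scd2_upsert(dim_customer, 42, {"city": "London"}, date(2024, 6, 1))
```

In a real warehouse the same logic is usually expressed as a MERGE or an ETL-tool transform, but the row lifecycle is the same.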
London, England, United Kingdom Hybrid / WFH Options
Native Instruments
If you're looking to contribute, grow, and work on diverse data challenges, join us! Your Contribution Design and develop efficient and scalable data models and data marts (e.g. star schema, snowflake schema) using best practices for data warehousing and business intelligence that are optimized for self-service analytics tools Collaborate with business stakeholders (e.g. finance, marketing, operations) to …
design and rollout. Hands on experience with business intelligence tools, data modelling, data staging, and data extraction processes, including data warehouse and cloud infrastructure. Experience with multi-dimensional design, star schemas, facts and dimensions. Experience and demonstrated competencies in ETL development techniques. Experience in data warehouse performance optimization. Experience on projects across a variety of industry sectors an advantage.
Bristol, England, United Kingdom Hybrid / WFH Options
SR2 | Socially Responsible Recruitment | Certified B Corporation™
and comfortable working across both technical and business domains. ✅ Key technical skills: Strong SQL and ELT/data pipeline development experience Expertise in Data Warehouse & Data Lake design (including Star Schema, Snowflake Schema, Data Vault) Hands-on experience with enterprise databases: Oracle, Snowflake, Teradata, or SQL Server Solid understanding of AWS (S3, Lambda, IAM, etc.) Proficiency in …
other cloud-native compute engines Establish coding standards, reusable components, and naming conventions using Prophecy's visual designer and metadata-driven approach Implement scalable and efficient data models (e.g. star schema, SCD Type 2) for data marts and analytics layer Integrate Prophecy pipelines with orchestration tools like Airflow, data catalog tools for lineage Implement version control, automated testing and …
when needed Excellent requirements, Epics, User Stories and Acceptance Criteria specification Managing agile product backlogs in Scrum Process and requirements modelling Data Modelling of financial data and CUBE/Star Schema Data Analysis Reconciliation Defect triage and classification of bug prioritisation and severity Financial modelling and statutory reporting platforms Knowledge and Experience 7+ years of direct experience of …
databases, REST APIs, Kafka streams and other sources. Apply data cleansing rules to ensure high data quality standards. Model data into a single source of truth using Kimball methodology (star schema, snowflake schema, etc.). Develop high-quality code following DevOps and software engineering best practices, including testing and CI/CD. Monitor and maintain business-critical pipelines, reacting …
Bristol, England, United Kingdom Hybrid / WFH Options
Harding
through robust, scalable, and innovative data solutions. In this position, you'll be responsible for designing, building, and optimizing data warehouses using best practices in dimensional modelling, Kimball methodologies, star schema design, and Data Vault principles, ensuring adherence to sound data architecture practices. Based in Avonmouth on a hybrid working basis – 2 days in office per week Compensation … premises SQL Server data warehouses, focusing on stored procedures, indexing, partitioning, and load performance. A proven track record in designing and building data warehouses using Kimball methodologies, dimensional modeling, star schema design, and Data Vault techniques is essential. You should be well-versed in applying data architecture principles to create robust, scalable solutions Azure Cloud Proficiency: Deep understanding …
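Data Vault, named alongside Kimball in several of these listings, takes a different approach: hubs store only business keys, and satellites accumulate descriptive history as append-only rows. A minimal hypothetical sketch in SQLite; all names are illustrative:

```python
import sqlite3

# Data Vault sketch: hub (business key only) plus satellite (attributes
# with load timestamps). Changes arrive as new satellite rows, so the
# hub is never rewritten and history accumulates naturally.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE hub_customer (
    customer_hk TEXT PRIMARY KEY,   -- hash key derived from the business key
    customer_id TEXT,               -- the business key itself
    load_dts TEXT
);
CREATE TABLE sat_customer_details (
    customer_hk TEXT REFERENCES hub_customer(customer_hk),
    load_dts TEXT,
    city TEXT,
    PRIMARY KEY (customer_hk, load_dts)
);
INSERT INTO hub_customer VALUES ('hk42', 'CUST-42', '2024-01-01');
INSERT INTO sat_customer_details VALUES
    ('hk42', '2024-01-01', 'Bristol'),
    ('hk42', '2024-06-01', 'London');   -- a change lands as a new row
""")

# Current view: most recent satellite row per hub key.
row = cur.execute("""
    SELECT h.customer_id, s.city
    FROM hub_customer h
    JOIN sat_customer_details s ON s.customer_hk = h.customer_hk
    ORDER BY s.load_dts DESC LIMIT 1
""").fetchone()
print(row)  # ('CUST-42', 'London')
```

Raw Vault tables like these are typically then projected into Kimball-style star schemas for reporting.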
training to empower end users. PRINCIPAL MISSIONS Design and implement data pipelines using Azure Data Factory or Microsoft Fabric. Develop and maintain SQL-based transformations and data models (e.g., star schema, snowflake schema) in SQL Server, Fabric Data Warehouse/Lakehouse. Build and optimize Power BI dashboards and reports to support business decision-making. Collaborate with stakeholders to gather requirements …
sources. Solid experience with data engineering tools and practices, including Python, SQL, and ETL/ELT frameworks (e.g., Azure Data Factory, dbt). Experience designing and implementing dimensional and star-schema data models for reporting and analytics. Familiarity with data governance, metadata management, and data quality principles. Awareness of data security and compliance best practices (e.g., Secure by …
Data Pipeline Orchestration: Experience with workflow orchestration tools such as Apache Airflow or Prefect to manage and schedule data pipelines. Data Modelling: Strong understanding of data modelling concepts (e.g., star schema, snowflake schema) and best practices for designing efficient and scalable data architectures. Data Quality and Governance: Knowledge of data quality principles and experience implementing data governance …
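The orchestration tools named above (Airflow, Prefect) are built around one core idea: tasks form a directed acyclic graph, and each task runs only after its upstream dependencies complete. The Python standard library's `graphlib` can illustrate the dependency-ordering part; the task names here are made up for the example:

```python
from graphlib import TopologicalSorter

# Toy pipeline DAG: each key maps to the set of tasks it depends on.
# This mirrors how Airflow/Prefect schedule work, minus retries,
# sensors, and distributed execution.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform"},
}

# A valid execution order: every task appears after its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Both extracts always precede "transform", which always precedes "load_warehouse"; the relative order of the two extracts is unconstrained, which is exactly what lets an orchestrator run them in parallel.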
culture of continuous improvement and innovation Build data integrations from multiple sources, including CRM, digital, and social platforms Design, implement and optimize data models with a medallion architecture using Star Schema and Snowflake Schema techniques to enhance query performance and support analytical workloads Ensure data quality, consistency, and reliability across all marketing datasets Collaborate with analysts and data scientists …
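The medallion architecture referred to above layers data as bronze (raw as ingested), silver (cleansed and typed), and gold (aggregated for analytics). A toy plain-Python sketch of the flow, with made-up field names; in practice each layer would be a table in a lakehouse, not an in-memory list:

```python
# Bronze: raw records exactly as ingested, duplicates and bad rows included.
bronze = [
    {"campaign": "spring", "clicks": "10"},
    {"campaign": "spring", "clicks": "10"},   # duplicate
    {"campaign": "summer", "clicks": None},   # fails cleansing rules
    {"campaign": "summer", "clicks": "5"},
]

# Silver: deduplicate and enforce types, dropping rows that fail the rules.
seen, silver = set(), []
for rec in bronze:
    key = (rec["campaign"], rec["clicks"])
    if rec["clicks"] is not None and key not in seen:
        seen.add(key)
        silver.append({"campaign": rec["campaign"], "clicks": int(rec["clicks"])})

# Gold: aggregate per campaign into an analytics-ready shape.
gold = {}
for rec in silver:
    gold[rec["campaign"]] = gold.get(rec["campaign"], 0) + rec["clicks"]

print(gold)  # {'spring': 10, 'summer': 5}
```

The point of the layering is that each stage is reproducible from the one below it, so cleansing rules can change without re-ingesting source data.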
London, England, United Kingdom Hybrid / WFH Options
Mirai Talent
data from various sources including APIs, Excel, CSV, JSON, and databases. Manage data lakes, warehouses, and lakehouses within Azure cloud environments. Apply data modelling techniques such as Kimball methodologies, star schemas, and data warehouse design principles. Build and support ETL workflows using tools like Azure Data Factory, Synapse, Delta Live Tables, dbt, SSIS, etc. Automate infrastructure deployment with Terraform … engineering experience. Has strong skills in Python, SQL, and PySpark. Experienced working with data lakes, warehouses, lakehouses, and cloud platforms, preferably Azure. Knowledgeable in data modelling, including Kimball and star schemas. Familiar with ETL tools such as Azure Data Factory, Synapse, Delta Live Tables, dbt, SSIS. Experienced with Infrastructure as Code (Terraform, ARM, Bicep). Skilled in Power BI …
the data-driven solutions within our products and across the organisation. You should have in-depth knowledge of SQL databases (Microsoft or Postgres), and be familiar with dimensional modelling, star schema design and ETL processes. You will have the opportunity to be part of our strategic plan to incorporate machine learning and artificial intelligence into our product suite … or similar) in an Agile environment ◦ Experience of SQL database systems and data warehousing, focusing on stored procedures, indexing, partitioning and load performance ◦ Experience of ETL, dimensional modelling and star schema solutions ◦ Technical expertise with data models, data mining, and segmentation techniques ◦ Have a strong understanding of data security and multi-tenancy ◦ Excellent communication skills with both technical …
languages (Spark, MPP, Python, Delta, Parquet). Experience with Azure DevOps & CI/CD processes, software development lifecycle including infrastructure as code (Terraform). Understand data warehousing concepts, including dimensional modelling, star schema, data aggregation, and best practices for designing efficient and scalable data models. Excellent documentation and diagramming skills. Technical Cloud Certification preferred e.g., Azure, Amazon Web Services, or …
Required Skills and Experience 5+ years of experience in ETL development using Informatica (PowerCenter and/or IICS). Strong understanding of data warehousing concepts, data modelling (dimensional/star schema), and data integration patterns. Hands-on experience with SQL and relational databases (e.g., Oracle, SQL Server, PostgreSQL). Familiarity with AWS data services such as S3, Redshift …
knowledge of Azure Synapse Analytics, Azure SQL Database, Azure Data Factory, Azure Data Lake, and related Azure data services. Expertise in designing and optimizing relational and dimensional data models (star schema, snowflake schema). Experience with ETL tools and data integration techniques. Proficient in SQL and other data querying languages. Understanding of data governance practices, data security …
The Person Confident in SQL and Python for data analysis and transformation. Experience with Azure Synapse Analytics and Microsoft Fabric is highly desirable. Understanding of data modeling concepts (e.g., star schema, snowflake, normalized and denormalized structures). Experience working in Change Control or Agile methodologies. Financial services or other regulated entity experience is desirable. The Opportunity Shawbrook is …
Manchester, England, United Kingdom Hybrid / WFH Options
Capgemini
ability to communicate issues and solutions to a variety of stakeholders including the facilitation of client workshops. An understanding of key data modelling concepts (e.g., fact and dimension tables, star schemas and snowflake schemas, denormalised tables, and views). Experience with data handling, e.g. data querying, data manipulation or data wrangling to transform raw data into the desired format …