London, South East, England, United Kingdom Hybrid / WFH Options
Circle Recruitment
and maintaining data pipelines using modern, cloud-based tools and practices Proficiency in Python and SQL for data engineering tasks Experience with DBT and a good understanding of data modelling approaches (e.g. star schema, dimensional modelling) Familiarity with Airflow or similar orchestration tools Comfortable working with AWS services such as Glue and S3, or equivalent cloud infrastructure … a better way for us to communicate, please do let us know. Data, Database, Engineer, Lead, Manager, Data Science, Data Architect, Business Intelligence, Python, SQL, DBT, Data Model, Data Modelling, AWS, Security Check, Sc Level, Sc Cleared, Sc Clearance, Security Cleared, Security Clearance, Security Vetting Clearance, Active SC, SC Vetted, Cleared To A High Government Standard, Dv Cleared, Dv …
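The star schema and dimensional modelling skills asked for in listings like the one above can be illustrated with a minimal sketch: one fact table keyed to dimension tables, queried by aggregating the fact grain over a dimension attribute. All table, column, and product names below are invented for illustration, not taken from any employer's model.

```python
import sqlite3

# Minimal star schema: one fact table referencing two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    revenue     REAL
);
INSERT INTO dim_date VALUES (20240101, '2024-01-01'), (20240102, '2024-01-02');
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO fact_sales VALUES
    (20240101, 1, 3, 30.0),
    (20240101, 2, 1, 25.0),
    (20240102, 1, 2, 20.0);
""")

# Typical dimensional query: aggregate the fact grain by a dimension attribute.
rows = cur.execute("""
    SELECT p.name, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 25.0), ('widget', 50.0)]
```

The same shape is what tools like DBT typically materialise in a warehouse: dimensions carry descriptive attributes, the fact table carries measures at a declared grain.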
and migration strategies Maintain enterprise data models, metadata repositories, and data lineage documentation Optimise data products for performance, scalability, and reliability Skills & Experience: 10+ years in data architecture and modelling Expertise in dimensional modelling, 3NF, and Data Vault 2.0 Experience with real-time/streaming data solutions Proficiency in Python, SQL, and data modelling tools (e.g. … services (Lambda, SNS, S3, EKS, API Gateway) Familiarity with Snowflake, Spark, Airflow, DBT, and data governance frameworks Preferred: Certifications in cloud/data technologies Experience with API/interface modelling and CI/CD (e.g. GitHub Actions) Knowledge of Atlan and Iceberg tables Reference: AMC/SCU/SDA/3007 Postcode: SW1 #secu…
London, South East, England, United Kingdom Hybrid / WFH Options
Randstad Technologies
Define scalable data models and implement architectural patterns such as lakehouse and medallion Lead technical solution design during client engagements, from discovery to delivery Establish and enforce data governance, modelling, and lifecycle standards Support engineering and DevOps teams with guidance on best practices, CI/CD, and infrastructure-as-code Requirements 7+ years in data architecture or senior engineering … roles Strong hands-on experience with Azure Databricks and Azure Data Factory Proficient in SQL, Python, and Spark Expertise in data modelling and architectural patterns for analytics (e.g., lakehouse, medallion, dimensional modelling) Solid understanding of cloud security, private networking, GDPR, and PII compliance Excellent communication skills with a strong consulting mindset Desirable Experience with Microsoft Purview, Power …
City of London, London, United Kingdom Hybrid / WFH Options
McCabe & Barton
Snowflake, Power BI, Python, and SQL. Your work will enable self-service analytics and support data governance across the business. Key Responsibilities: Develop robust ETL/ELT pipelines and dimensional models for BI tools Define and implement data quality, ownership, and security standards Empower business teams with intuitive, self-serve data models Own data products end-to-end, from … design to continuous improvement Promote innovation and best practices in data engineering About You: Strong experience with SQL, Python, and BI tools (e.g., Power BI) Solid understanding of dimensional modelling and data architecture Experience working in governed, decentralised data environments Excellent communication and stakeholder engagement skills Analytical mindset with a focus on delivering business value If you are …
Woking, Surrey, United Kingdom Hybrid / WFH Options
Michael Page (UK)
should have: Experience with Azure Data Factory, Azure Synapse, Azure SQL, or Azure Data Lake. Hands-on knowledge of the ETL process and working with large datasets. Understanding of dimensional modelling and data warehousing principles. Familiarity with CI/CD pipelines or monitoring tools for data processes. Solid skills in SQL and basic knowledge of Python scripting. Exposure …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
need all of these. Experience of building a data warehouse using an ETL/ELT tool, preferably Oracle ODI Significant database experience in Snowflake or Oracle Star schema/dimensional modelling. Excellent SQL skills Good knowledge of standard data formats (XML, JSON, CSV, etc.) Proven experience of delivering BI solutions for business requirements Experience of developing using an Agile …
Rochdale, Greater Manchester, North West, United Kingdom Hybrid / WFH Options
Footasylum Ltd
with finance/financial systems and concepts Azure Databricks Azure Data Factory Excellent SQL skills Good Python/Spark/PySpark skills Experience of Kimball Methodology and star schemas (dimensional model). Experience of working with enterprise data warehouse solutions. Experience of working with structured and unstructured data Experience of a retail environment preferred A good understanding of cloud …
with the potential for extension. This role offers a hybrid working arrangement, requiring 1-2 days per week onsite at Heathrow, Hounslow, with on-site parking available. Responsibilities: Data Modelling: Design and optimize star schema data models tailored to our client's business needs for streamlined analytics and reporting. Collaboration: Work closely with data architects, BI developers, and business … with business goals. Data Quality & Governance: Establish data quality checks and governance practices to ensure accuracy and integrity within data models. Skills/Must have: Proven experience in data modelling using Kimball methodology, with a focus on dimensional modelling and star schemas. Strong proficiency in SQL and experience with data modelling tools like ER Studio, Power …
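The "data quality checks" responsibility above commonly includes a referential-integrity check in a Kimball design: every foreign key in a fact table should resolve to a row in its dimension, so "orphan facts" are flagged before they reach reports. A hedged sketch, with invented keys and an invented helper name:

```python
# Kimball-style data-quality check: find fact rows whose foreign key has no
# matching dimension row. All keys and values below are illustrative.

dim_customer_keys = {1, 2, 3}
fact_rows = [
    {"customer_key": 1, "amount": 10.0},
    {"customer_key": 2, "amount": 5.0},
    {"customer_key": 9, "amount": 7.5},  # orphan: no such dimension row
]

def orphan_keys(facts, dim_keys, fk="customer_key"):
    """Return the set of fact foreign keys missing from the dimension."""
    return {row[fk] for row in facts} - dim_keys

print(orphan_keys(fact_rows, dim_customer_keys))  # {9}
```

In production this check would typically run as SQL (an anti-join from fact to dimension) scheduled by the orchestration layer, with failures surfaced to the governance process.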
Slough, Berkshire, United Kingdom Hybrid / WFH Options
Halton Housing
Here at Halton Housing, we are looking for an experienced Data Developer to work across our vibrant organisation. What You'll Do: Coding DAX Measures and Dimensional Models Developing & delivering visually compelling Power BI Dashboards & Reports to specification Developing and maintaining SSRS reports Developing & maintaining ETL pipeline solutions in Azure Data Factory and SSIS, utilising Azure Data Lake & Dev …
Wallington, Surrey, England, United Kingdom Hybrid / WFH Options
Newmarket Holidays
priorities. • Excellent analytical and problem-solving skills, with attention to detail. • Clear and confident communication skills, both written and verbal. Nice to Have • Understanding of data warehouse architecture and dimensional modeling. • Experience working in the travel or leisure industry is a plus. What we can offer you A changing and multi-cultural team-spirited environment with opportunities to learn …
analytics engineering, data engineering, or a related field. Advanced SQL skills and experience architecting solutions on modern data warehouses (e.g., Snowflake, BigQuery, Redshift). Hands-on experience with advanced modelling techniques in dbt. A deep understanding of ETL/ELT processes and tools (e.g., Fivetran, Airbyte, Stitch). Experience with data visualisation tools (e.g., Mode, Looker, Tableau, Power BI … and designing robust BI semantic layers. Exceptional understanding of data warehousing principles, dimensional modeling, and analytics best practices. Proficiency in Git-based workflows with a strong background in implementing and managing CI/CD pipelines for data transformations. Outstanding communication, collaboration, and stakeholder management skills, with a demonstrated ability to influence and lead cross-functional initiatives. Nice to Have …
be responsible for: Technical Delivery & Leadership: Leading the design and implementation of Oracle Analytics solutions (FDI, OAC, ODI) to meet client requirements. Architecting end-to-end solutions encompassing data modelling, ingestion, transformation, visualization, and predictive modelling. Designing and implementing data pipelines and integrations for diverse data sources. Developing and deploying machine learning models using OML, Oracle Data Science, and … Analytics Cloud (OAC) components, including Data Visualisation, Essbase, Data Preparation, and Data Flows. Proven experience in implementing Oracle Analytics or a similar role. Strong experience in data warehousing concepts, dimensional modelling, and ETL processes. Ability to translate and present technical information to a non-technical audience in a clear, concise, appropriate manner Ability to translate business requirements into … analytical and problem-solving abilities. Oracle certifications in relevant technologies are highly desirable Qualified/Part-Qualified ACA/CIMA/ACCA (or equivalent) is advantageous Experience in Data Modelling Security Clearance or at least eligible to support activities in Public Sector Connect to your business - Technology and Transformation Distinctive thinking, deep expertise, innovation and collaborative working. That's …
Bromsgrove, Worcestershire, United Kingdom Hybrid / WFH Options
Reed Technology
business Technical skills Cloud data platforms - Azure, AWS, or GCP (Azure preferred) Snowflake - Deep knowledge and hands-on experience Matillion - Expertise in ETL orchestration Data warehousing and advanced analytics Dimensional modelling and Data Vault methodologies Stakeholder engagement and cross-functional collaboration Flexible hybrid working - 1 day per week onsite in Worcestershire (Tuesday) This is a great opportunity for …
quality dimensions and integrate metrics with centralized tools that measure data products' quality and reliability in the organization Qualifications Understanding of data engineering (including SQL, Python, Data Warehousing, ETL, Dimensional Modelling, Analytics) Understanding of cloud data infrastructure elements, and ideally AWS (Redshift, Glue, Athena, S3) and understanding of existing governance frameworks of data quality and their dimensions (DAMA …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Starling Bank
work across teams do great things here at Starling, to continue changing banking for good. Responsibilities: Translate data requirements from across the organisation into robust and reusable data models, modelling both within the data warehouse and exposing via Looker, with a particular focus on Customer Data Draw insights and use appropriate analytical methods to analyse large datasets to identify … with SQL Experience with Python Strong experience with Looker or a similar Business Intelligence (BI) tool Good understanding and experience of DBT and applying data architecture principles such as dimensional modelling, to translate raw data into a structured format, or a willingness to learn Self-starter with the ability to think outside the box and evolve projects. Take …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Starling Bank Limited
focus on driving warehouse efficiencies and optimisation to reduce complexity and cost Strong experience with SQL Strong understanding and experience of DBT and applying data architecture principles such as dimensional modelling, to translate raw data into a structured format Strong experience with Looker or a similar Business Intelligence (BI) tool Self-starter with the ability to think outside …
on driving warehouse efficiencies and optimisation to reduce complexity and cost Requirements Strong experience with SQL Strong understanding and experience of DBT and applying data architecture principles such as dimensional modelling, to translate raw data into a structured format Strong experience with Looker or a similar Business Intelligence (BI) tool Self-starter with the ability to think outside …
the right time. Essentially, to ensure you succeed in this role you're going to need Deep, hands-on experience designing and building data warehouses with strong command of dimensional modeling (e.g., Kimball methodology) Expertise in Google Cloud Platform, especially BigQuery architecture, optimization, and cost management Advanced SQL skills and production-level experience using dbt (or similar tools) to …
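A core Kimball technique a warehouse role like the one above would involve is the Type 2 slowly changing dimension: when a tracked attribute changes, the current dimension row is closed off and a new version is opened, preserving history. A minimal sketch with invented customer data and an invented helper name:

```python
from datetime import date

# Kimball Type 2 slowly changing dimension update: close the current row and
# open a new one when a tracked attribute changes. Values are illustrative.

dim = [
    {"customer_id": 42, "city": "Leeds", "valid_from": date(2023, 1, 1),
     "valid_to": None, "is_current": True},
]

def apply_scd2(dim_rows, customer_id, new_city, as_of):
    """Apply a Type 2 change for one customer as of the given date."""
    for row in dim_rows:
        if row["customer_id"] == customer_id and row["is_current"]:
            if row["city"] == new_city:
                return dim_rows          # attribute unchanged, nothing to do
            row["valid_to"] = as_of      # close the old version
            row["is_current"] = False
    dim_rows.append({"customer_id": customer_id, "city": new_city,
                     "valid_from": as_of, "valid_to": None, "is_current": True})
    return dim_rows

apply_scd2(dim, 42, "York", date(2024, 6, 1))
print([r["city"] for r in dim if r["is_current"]])  # ['York']
print(len(dim))  # 2 rows: the closed Leeds version plus the current York one
```

In BigQuery or dbt this pattern is usually expressed as a `MERGE` statement or a dbt snapshot rather than hand-written code; the row-versioning logic is the same.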
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
BIOMETRIC TALENT
aligned with their strategic goals. How you'll spend your day Architect and evolve enterprise-wide data models across key domains like Product, Customer, Supply, Finance, and Location Apply Kimball dimensional modelling for consumption layer design and Data Vault methodology for raw and business layer integration Define and maintain data architecture principles, patterns, and standards Collaborate with data platform … teams on governance, quality, lineage, and optimisation Consult with squads on physical data modelling Contribute to metadata and data catalog initiatives to improve discoverability and reuse Support our shift to a Data Product Ownership model, enabling domain-aligned teams What you'll bring to this role Proven experience in enterprise data architecture in a complex organisation Expertise in Kimball and … Data Vault 2.0 methodologies Strong grasp of data modelling, metadata, and governance Hands-on experience with modern data platforms (Databricks, Delta Lake, Unity Catalog, Azure) Ability to define and drive architecture principles, patterns, and best practices Excellent communication and stakeholder management skills Retail industry experience is a bonus but not a deal breaker Perks & Benefits: Discretionary company bonus …
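The Data Vault 2.0 methodology named in the listing above splits the raw layer into hubs (one insert-only row per business key, identified by a hash key) and satellites (descriptive attributes versioned over time). A hedged sketch of that loading pattern, with invented entity names and an MD5 hash standing in for whatever hash function a real implementation standardises on:

```python
import hashlib

# Data Vault 2.0 raw-vault loading sketch: a hub holds the business key plus
# a hash key; a satellite holds descriptive attributes over time.
# Names, keys, and the choice of MD5 are illustrative assumptions.

def hash_key(business_key: str) -> str:
    return hashlib.md5(business_key.encode()).hexdigest()

hub_product = {}   # hash_key -> hub row (insert-only, one row per key)
sat_product = []   # attribute history, one row per load

def load_product(business_key, attrs, load_ts):
    hk = hash_key(business_key)
    if hk not in hub_product:  # hubs never update, only insert
        hub_product[hk] = {"product_hk": hk, "product_id": business_key,
                           "load_ts": load_ts}
    sat_product.append({"product_hk": hk, "load_ts": load_ts, **attrs})

load_product("SKU-1", {"name": "widget", "price": 9.99}, "2024-01-01")
load_product("SKU-1", {"name": "widget", "price": 10.99}, "2024-02-01")

print(len(hub_product))  # 1 hub row for the one business key
print(len(sat_product))  # 2 satellite rows: one per attribute version
```

This is why the listing pairs Data Vault with Kimball: the vault keeps an auditable, insert-only history in the raw and business layers, and dimensional models are then derived from it for the consumption layer.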
Southampton, Hampshire, United Kingdom Hybrid / WFH Options
gen2fund.com
The position requires at least 2 years of experience using QlikView version 11 or higher, with proven expertise in the following areas: Good knowledge of SQL, relational databases, and Dimensional Modeling Experience working with large data sets and complex data models involving more than 10 tables Integrating data from multiple sources into QlikView Data Models, including social media content … and API extensions Use of complex QlikView functions and developing optimal scripts for solutions Optimizing Dimensional data models for performance Primary Responsibilities: Creating and providing reporting and dashboard applications using QlikView and NPrinting to facilitate better decision-making Collaborating with stakeholders to gather requirements, and translating these into system and functional specifications Creating prototypes and conducting proofs of concept …
together! The Role As a Senior Principal Data Scientist in the Multimodal Data & Analytics group you will be responsible for the discussion and implementation of data science and high-dimensional modeling methodologies applied to patient-level data (including various biomarker, clinical and outcomes data) across clinical development. You will combine your data science and AI skills and your scientific … identify opportunities for influencing internal decision making as well as discussions on white papers/regulatory policy. You will perform hands-on analysis of integrated clinical, outcomes and high-dimensional, patient-level biomarker data from clinical trials and the real world (genomics, transcriptomics, proteomics, flow cytometry etc.) to generate fit-for-purpose evidence that is applied to decision making … selection methods (e.g., lasso, elastic net, random forest), design of clinical trials. Familiarity with statistical and analytical methods for genetics and -omics data analysis and working knowledge of high dimensional biomarker platforms (e.g., next generation sequencing, transcriptomics, proteomics, flow cytometry, etc.). Strong programming skills in R and Python. Demonstrated knowledge of data visualization, exploratory analysis, and predictive modeling.
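The variable-selection methods named in the listing above (lasso, elastic net) work by shrinking regression coefficients toward zero and dropping the small ones entirely, which is what makes them useful on high-dimensional biomarker data. In the special case of an orthonormal design matrix the lasso solution has a closed form: soft-threshold each ordinary-least-squares coefficient by the penalty. A minimal sketch using toy coefficients, not clinical data:

```python
def soft_threshold(beta_ols: float, lam: float) -> float:
    """Lasso solution for one coefficient under an orthonormal design:
    shrink the OLS estimate toward zero by lam, zeroing it if |beta| <= lam."""
    if beta_ols > lam:
        return beta_ols - lam
    if beta_ols < -lam:
        return beta_ols + lam
    return 0.0

# Toy OLS coefficients; with penalty lam = 1.0 the small ones drop out,
# which is how the lasso performs variable selection.
betas = [3.0, -0.4, 1.5, 0.2]
selected = [soft_threshold(b, 1.0) for b in betas]
print(selected)  # [2.0, 0.0, 0.5, 0.0]
```

For general (non-orthonormal) designs there is no closed form and solvers iterate this same soft-thresholding step coordinate by coordinate; the elastic net adds a ridge term that shrinks further but keeps correlated predictors together.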