owners to capture data requirements. Support the design and oversight of the data infrastructure and technical environments. Ensure architecture compliance, documentation standards, and technical integrity. Skills & Experience: Expertise in dimensional modelling and a strong understanding of Third Normal Form data models. Demonstrated experience across conceptual, logical, and physical data architectures. Familiarity with data integration, ETL, and data transformation patterns.
Hands-on expertise in the Azure ecosystem, including components such as Azure Data Factory, Azure Data Lake Storage, Azure SQL, Azure Databricks, HDInsight, and ML Service. Expertise in relational and dimensional modelling, including big data technologies. Experience in Azure infrastructure and Azure DevOps will be a strong plus. In terms of business responsibilities: Client Engagement & Business Growth with …
the design and implementation of the data architecture for our cutting-edge Azure Databricks platform focused on economic data. This platform is crucial for our Monetary Analysis, Forecasting, and Modelling efforts. The Data Architect will be responsible for defining the overall data strategy, data models, data governance framework, and data integration patterns. This role requires a deep understanding of … themes. Define the target data architecture and roadmap, considering scalability, performance, security, and cost-effectiveness. Stay abreast of industry trends and emerging technologies in data management and analytics. Data Modelling & Design: Design and implement logical and physical data models that support the analytical and modelling requirements of the platform. Define data dictionaries, data lineage, and metadata management processes. … e.g., APIs, databases, financial data providers). Work closely with data engineers to implement and optimise data pipelines within the Azure Databricks environment. Ensure data is readily available for modelling runtimes (Python, R, MATLAB). Data Governance & Quality: Establish and enforce data governance policies, standards, and procedures. Define data quality metrics and implement data quality monitoring processes. Ensure compliance
of a modern Data platform solution to handle large and complex data sets. Key tasks: Design Data Lake and Data Warehouse solutions. Design data models using Data Vault and dimensional modelling methods. Implement automated, reusable and efficient batch and streaming data pipelines. Work closely with Governance and Quality teams to ensure metadata, catalogue, lineage and known
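For readers less familiar with the Data Vault pattern this role asks for, here is a minimal, purely illustrative sketch: hubs hold business keys, links hold relationships between hubs, and satellites (omitted here) hold descriptive attributes. All table and key names are hypothetical, not taken from the advert.

```python
# Illustrative Data Vault sketch (hubs and links only). Surrogate keys
# are deterministic hashes of business keys, a common Data Vault choice
# because loads become repeatable and parallelisable.
import hashlib
from datetime import datetime, timezone

def hash_key(*parts: str) -> str:
    """Deterministic surrogate key derived from one or more business keys."""
    return hashlib.md5("|".join(parts).encode()).hexdigest()

def hub_row(business_key: str, source: str) -> dict:
    """A hub row: one immutable entry per distinct business key."""
    return {
        "hub_key": hash_key(business_key),
        "business_key": business_key,
        "record_source": source,
        "load_ts": datetime.now(timezone.utc).isoformat(),
    }

def link_row(key_a: str, key_b: str, source: str) -> dict:
    """A link row: records a relationship between two hub keys."""
    return {
        "link_key": hash_key(key_a, key_b),
        "hub_key_a": hash_key(key_a),
        "hub_key_b": hash_key(key_b),
        "record_source": source,
        "load_ts": datetime.now(timezone.utc).isoformat(),
    }

customer = hub_row("CUST-001", "crm")
order_link = link_row("CUST-001", "ORD-123", "orders")
```

Because the keys are pure hashes of the business keys, the same input always yields the same surrogate, so re-running a batch load cannot create mismatched keys.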
London, South East, England, United Kingdom Hybrid / WFH Options
Circle Recruitment
and maintaining data pipelines using modern, cloud-based tools and practices. Proficiency in Python and SQL for data engineering tasks. Experience with DBT and a good understanding of data modelling approaches (e.g. star schema, dimensional modelling). Familiarity with Airflow or similar orchestration tools. Comfortable working with AWS services such as Glue and S3, or equivalent cloud infrastructure … a better way for us to communicate, please do let us know.
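As a quick illustration of the star-schema approach this role mentions, the sketch below builds a toy fact table keyed to two dimension tables and runs a typical BI aggregation. Table and column names are invented for the example; an in-memory SQLite database stands in for the real warehouse.

```python
# Hypothetical Kimball-style star schema: one fact table referencing
# two dimensions, queried with the usual join-and-aggregate pattern.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount REAL
);
INSERT INTO dim_date VALUES (20240101, '2024-01-01');
INSERT INTO dim_product VALUES (1, 'Widget');
INSERT INTO fact_sales VALUES (20240101, 1, 10.0), (20240101, 1, 5.0);
""")

# A typical report: aggregate the fact measure, slice by dimension attributes.
row = con.execute("""
    SELECT p.name, d.iso_date, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY p.name, d.iso_date
""").fetchone()
print(row)  # ('Widget', '2024-01-01', 15.0)
```

The same shape is what a dbt project would materialise as `dim_*` and `fact_*` models; the join-heavy but denormalised layout is what makes star schemas fast for slicing and dicing.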
Define scalable data models and implement architectural patterns such as lakehouse and medallion. Lead technical solution design during client engagements, from discovery to delivery. Establish and enforce data governance, modelling, and lifecycle standards. Support engineering and DevOps teams with guidance on best practices, CI/CD, and infrastructure-as-code. Requirements: 7+ years in data architecture or senior engineering … roles. Strong hands-on experience with Azure Databricks and Azure Data Factory. Proficient in SQL, Python, and Spark. Expertise in data modelling and architectural patterns for analytics (e.g., lakehouse, medallion, dimensional modelling). Solid understanding of cloud security, private networking, GDPR, and PII compliance. Excellent communication skills with a strong consulting mindset. Desirable: Experience with Microsoft Purview, Power
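The medallion pattern named above layers data as bronze (raw), silver (cleaned), and gold (business aggregates). The sketch below shows the idea with plain Python structures; in the Azure Databricks setting these postings describe, each layer would typically be a Delta table, and the field names here are invented.

```python
# Illustrative medallion-architecture sketch: raw records land in
# "bronze", are cleaned into "silver", then aggregated into "gold".
bronze = [
    {"id": "1", "amount": "10.5", "country": "uk"},
    {"id": "2", "amount": "bad", "country": "UK"},   # malformed row
    {"id": "3", "amount": "4.5", "country": "uk"},
]

def to_silver(rows):
    """Clean and standardise: drop unparseable rows, normalise casing."""
    out = []
    for r in rows:
        try:
            out.append({"id": r["id"], "amount": float(r["amount"]),
                        "country": r["country"].upper()})
        except ValueError:
            continue  # a real pipeline would quarantine the bad row
    return out

def to_gold(rows):
    """Business-level aggregate: total amount per country."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'UK': 15.0}
```

The value of the layering is that each hop has one job: ingestion never cleans, cleaning never aggregates, so failures are easy to localise and each layer can be rebuilt from the one below it.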
and data analysis, ideally including both raw and aggregated data, with the ability to review transformation logic. Solid understanding of the full Software Development Lifecycle (SDLC). Appreciation of data modelling techniques (e.g. dimensional modelling, data vault). Strong knowledge of BCBS239 regulations and their practical applications. Experience in analysing business processes and delivering regulatory solutions. Proficiency in creating
City of London, London, United Kingdom Hybrid / WFH Options
McCabe & Barton
Snowflake, Power BI, Python, and SQL. Your work will enable self-service analytics and support data governance across the business. Key Responsibilities: Develop robust ETL/ELT pipelines and dimensional models for BI tools. Define and implement data quality, ownership, and security standards. Empower business teams with intuitive, self-serve data models. Own data products end-to-end, from … design to continuous improvement. Promote innovation and best practices in data engineering. About You: Strong experience with SQL, Python, and BI tools (e.g., Power BI). Solid understanding of dimensional modelling and data architecture. Experience working in governed, decentralised data environments. Excellent communication and stakeholder engagement skills. Analytical mindset with a focus on delivering business value. If you are
VAR, CE/PE & PFE. · Experience working with large-scale, multi-terabyte data warehouses, including performance tuning, query optimization and execution plan analysis · Advanced knowledge of data warehousing principles, dimensional modelling and star schema design · Hands-on experience with SQL Server and Snowflake, including their architecture, features and best practices · Familiarity with data integration tools (SSIS, ADF) and … cyber security best practices · Effective communicator who is able to build and maintain relationships with senior stakeholders · Strong analytical and problem-solving capabilities Preferred · Experience with data transformation and modelling · Familiarity with Agile methodologies and DevOps practices Job Title: Head Of Risk Location: London, UK Rate/Salary: 700.00 - 1100.00 GBP Daily Job Type: Contract Trading as TEKsystems. Allegis
with the potential for extension. This role offers a hybrid working arrangement, requiring 1-2 days per week onsite at Heathrow, Hounslow, with on-site parking available. Responsibilities: Data Modelling: Design and optimize star schema data models tailored to our client's business needs for streamlined analytics and reporting. Collaboration: Work closely with data architects, BI developers, and business … with business goals. Data Quality & Governance: Establish data quality checks and governance practices to ensure accuracy and integrity within data models. Skills/Must have: Proven experience in data modelling using the Kimball methodology, with a focus on dimensional modelling and star schemas. Strong proficiency in SQL and experience with data modelling tools like ER Studio, Power
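A staple of the Kimball dimensional modelling this advert calls for is the Type-2 slowly changing dimension: when a tracked attribute changes, the current row is expired and a new row is inserted, so history is preserved rather than overwritten. The sketch below shows the mechanism with an in-memory list standing in for the dimension table; all names are hypothetical.

```python
# Sketch of a Kimball Type-2 SCD update against an in-memory
# dimension table (a list of dicts). Each business key has at most
# one row with is_current=True; older rows keep their validity window.
from datetime import date

def scd2_upsert(dim, business_key, attrs, today):
    """Apply a Type-2 change: expire the current row, append a new one."""
    current = next((r for r in dim
                    if r["business_key"] == business_key and r["is_current"]),
                   None)
    if current and all(current[k] == v for k, v in attrs.items()):
        return dim  # nothing changed; keep history as-is
    if current:
        current["is_current"] = False
        current["valid_to"] = today
    dim.append({"business_key": business_key, **attrs,
                "valid_from": today, "valid_to": None, "is_current": True})
    return dim

dim_customer = []
scd2_upsert(dim_customer, "C1", {"city": "London"}, date(2024, 1, 1))
scd2_upsert(dim_customer, "C1", {"city": "Leeds"}, date(2024, 6, 1))
```

After the second call the dimension holds two rows for customer C1: the London row closed on 2024-06-01, and a current Leeds row, which is exactly what lets fact rows join to the version of the customer that was true at the time of the transaction.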
experience. Strong background in System Integration, Application Development, or Data Warehouse projects across enterprise technologies. Experience with object-oriented languages (e.g., Python, PySpark) and frameworks. Expertise in relational and dimensional modeling, including big data technologies. Proficiency in Microsoft Azure components such as Azure Data Factory, Data Lake, SQL, Databricks, HDInsight, ML Service. Good knowledge of Python and Spark. Experience
analytics engineering, data engineering, or a related field. Advanced SQL skills and experience architecting solutions on modern data warehouses (e.g., Snowflake, BigQuery, Redshift). Hands-on experience with advanced modelling techniques in dbt. A deep understanding of ETL/ELT processes and tools (e.g., Fivetran, Airbyte, Stitch). Experience with data visualisation tools (e.g., Mode, Looker, Tableau, Power BI … and designing robust BI semantic layers. Exceptional understanding of data warehousing principles, dimensional modeling, and analytics best practices. Proficiency in Git-based workflows with a strong background in implementing and managing CI/CD pipelines for data transformations. Outstanding communication, collaboration, and stakeholder management skills, with a demonstrated ability to influence and lead cross-functional initiatives. Nice to Have
be responsible for: Technical Delivery & Leadership: Leading the design and implementation of Oracle Analytics solutions (FDI, OAC, ODI) to meet client requirements. Architecting end-to-end solutions encompassing data modelling, ingestion, transformation, visualization, and predictive modelling. Designing and implementing data pipelines and integrations for diverse data sources. Developing and deploying machine learning models using OML, Oracle Data Science, and … Analytics Cloud (OAC) components, including Data Visualisation, Essbase, Data Preparation, and Data Flows. Proven experience in implementing Oracle Analytics or a similar role. Strong experience in data warehousing concepts, dimensional modelling, and ETL processes. Ability to translate and present technical information to a non-technical audience in a clear, concise, appropriate manner. Ability to translate business requirements into … analytical and problem-solving abilities. Oracle certifications in relevant technologies are highly desirable. Qualified/Part-Qualified ACA/CIMA/ACCA (or equivalent) is advantageous. Experience in Data Modelling. Security Clearance, or at least eligibility to support activities in the Public Sector. Connect to your business - Technology and Transformation: Distinctive thinking, deep expertise, innovation and collaborative working. That's
They closely collaborate with data scientists, analysts, and software engineers to ensure efficient data processing, storage, and retrieval for business insights and decision-making. Their expertise in data modelling, ETL (Extract, Transform, Load) processes, and big data technologies makes it possible to develop robust and reliable data solutions. RESPONSIBILITIES Data Pipeline Development: Design, implement, and maintain scalable data … in software engineering a plus. DESIRABLE LANGUAGES/TOOLS Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks, Spark, Kafka, and Flink. Experience in using modern data architectures
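The extract/transform/load split described above can be sketched in a few lines. This is a minimal batch illustration, not a production pipeline: in-memory stand-ins replace the real sources and targets (databases, APIs, warehouses), and all field names are invented.

```python
# Minimal batch ETL sketch: each stage has one responsibility and a
# clean interface, which is what makes real pipelines testable.
import json

def extract(raw_json: str) -> list:
    """Pull records from a source system (here: a JSON payload)."""
    return json.loads(raw_json)

def transform(records: list) -> list:
    """Apply business rules: drop invalid rows, derive a field."""
    return [
        {**r, "total": r["qty"] * r["unit_price"]}
        for r in records
        if r.get("qty", 0) > 0
    ]

def load(records: list, target: list) -> int:
    """Write to the target store; return the row count for monitoring."""
    target.extend(records)
    return len(records)

warehouse = []
payload = '[{"qty": 2, "unit_price": 3.0}, {"qty": 0, "unit_price": 9.9}]'
loaded = load(transform(extract(payload)), warehouse)
print(loaded, warehouse[0]["total"])  # 1 6.0
```

Returning a row count from `load` mirrors the monitoring hooks the responsibilities above mention: every run emits a number that alerting can compare against expectations.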
quality dimensions and integrate metrics with centralized tools that measure data products' quality and reliability in the organization. Qualifications: Understanding of data engineering (including SQL, Python, Data Warehousing, ETL, Dimensional Modelling, Analytics). Understanding of cloud data infrastructure elements, ideally AWS (Redshift, Glue, Athena, S3), and understanding of existing governance frameworks of data quality and their dimensions (DAMA
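Two of the data-quality dimensions commonly listed in DAMA-style frameworks, completeness and uniqueness, reduce to simple ratios. The sketch below is a hypothetical illustration of how such metrics might be computed before being shipped to a central quality dashboard; field names are invented.

```python
# Simple data-quality metrics along two common dimensions.
def completeness(rows, field):
    """Share of rows where the field is present and non-null."""
    if not rows:
        return 1.0
    return sum(1 for r in rows if r.get(field) is not None) / len(rows)

def uniqueness(rows, field):
    """Share of non-null values that are distinct."""
    values = [r[field] for r in rows if r.get(field) is not None]
    if not values:
        return 1.0
    return len(set(values)) / len(values)

rows = [{"id": 1}, {"id": 2}, {"id": 2}, {"id": None}]
print(completeness(rows, "id"))  # 0.75 (3 of 4 rows populated)
print(uniqueness(rows, "id"))    # 2/3 (two distinct among three values)
```

In practice each metric would be evaluated per data product on a schedule, with thresholds agreed with the data owners rather than hard-coded.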
performance-optimized transformations at scale. Expertise with modern cloud-based data platforms (e.g., Snowflake, BigQuery, Redshift) and experience managing large, distributed data environments. Strong understanding of data architecture and dimensional modeling principles, with a track record of designing enterprise-grade schemas. Proven ability to lead projects through agile software development lifecycles and mentor junior team members. Proficiency with version
platform, ML & analytics. Governance models & data dictionaries adopted org-wide, improving trust. Clear growth and upskilling within the data engineering team. Deep experience in data engineering & architecture (Data Lakes, dimensional models, MySQL/PostgreSQL, MongoDB). Familiar with orchestration, CI/CD for data, and data quality/observability tools. Hands-on leader balancing strategic planning with code/
the right time. Essentially, to ensure you succeed in this role you're going to need: Deep, hands-on experience designing and building data warehouses with a strong command of dimensional modeling (e.g., Kimball methodology). Expertise in Google Cloud Platform, especially BigQuery architecture, optimization, and cost management. Advanced SQL skills and production-level experience using dbt (or similar tools) to
Women in Data We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in the full data modelling life cycle, e.g. the design, implementation, and maintenance of complex data models that align with organisational goals and industry standards. This role requires a deep understanding of data architecture, data modelling … and data lineage documentation. Ensure data models comply with organisational policies and regulatory requirements. Optimise data products and their components for performance, scalability, and reliability. Evaluate and recommend data modelling tools and technologies. Stay updated on industry trends and emerging technologies in data architecture. Identify and resolve data inconsistencies, redundancies, and performance issues. Provide technical leadership in addressing complex … ll bring: Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 10+ years of experience in data architecture and modelling. Proven experience in data modelling, data architecture, and data product design. Proven expertise in data modelling standards and techniques (e.g. dimensional model, 3NF, Vault 2.0). Familiarity with both analytical and real
impact at scale, we'd love to hear from you. What you'll do: Work with customers daily to transform their existing solutions, spreadsheets, and business problems into sophisticated multi-dimensional models by: Understanding business requirements and documenting them. Designing and building the corresponding Pigment applications. Training and co-building with the customer so they are self-sufficient in the solution. Participating
day per week) Duration: 1 Month Availability: Immediate joiner preferred. Overview of the role - BI Development & Reporting: We need BI professionals familiar with OLAP concepts, such as: Dimensional modeling (facts, dimensions, hierarchies) Analytical KPIs and business logic Data modelling best practices Integration & Interoperability * Enable interoperability with third-party tools like Tableau and Amazon QuickSight * Manage secure integrations
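The OLAP notions listed above (facts, dimensions, hierarchies) boil down to aggregating a measure up a dimension hierarchy, a "roll-up". A tiny, hypothetical illustration with a city-to-country hierarchy:

```python
# Tiny OLAP-style roll-up: aggregate a fact measure from the leaf
# level of a dimension hierarchy (city) to its parent level (country).
from collections import defaultdict

hierarchy = {"London": "UK", "Leeds": "UK", "Paris": "FR"}  # city -> country

facts = [  # (city, sales) fact rows
    ("London", 100), ("Leeds", 50), ("Paris", 70),
]

def roll_up(facts, level_map):
    """Sum the measure at the parent level of the hierarchy."""
    totals = defaultdict(int)
    for city, sales in facts:
        totals[level_map[city]] += sales
    return dict(totals)

print(roll_up(facts, hierarchy))  # {'UK': 150, 'FR': 70}
```

The inverse operation, drill-down, just means querying at the finer level again; OLAP engines precompute these aggregates so BI tools like Tableau or QuickSight can switch levels interactively.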
together! The Role As a Senior Principal Data Scientist in the Multimodal Data & Analytics group you will be responsible for the discussion and implementation of data science and high-dimensional modeling methodologies applied to patient-level data (including various biomarker, clinical and outcomes data) across clinical development. You will combine your data science and AI skills and your scientific … identify opportunities for influencing internal decision making as well as discussions on white papers/regulatory policy. You will perform hands-on analysis of integrated clinical, outcomes and high-dimensional, patient-level biomarker data from clinical trials and the real world (genomics, transcriptomics, proteomics, flow cytometry etc.) to generate fit-for-purpose evidence that is applied to decision making … selection methods (e.g., lasso, elastic net, random forest), design of clinical trials. Familiarity with statistical and analytical methods for genetics and -omics data analysis and working knowledge of high dimensional biomarker platforms (e.g., next generation sequencing, transcriptomics, proteomics, flow cytometry, etc.). Strong programming skills in R and Python. Demonstrated knowledge of data visualization, exploratory analysis, and predictive modeling.