Hands-on expertise in the Azure ecosystem, including components such as Azure Data Factory, Azure Data Lake Storage, Azure SQL, Azure Databricks, HDInsight, ML Service, etc. Expertise in relational and dimensional modelling, including big data technologies. Experience in Azure infrastructure and Azure DevOps will be a strong plus. In terms of business responsibilities: Client Engagement & Business Growth with a heavy …
s strategic objectives. Provide expert input into data platform design and help evaluate new technologies and tools as the organisation scales. Act as a subject matter expert on data modelling and architectural frameworks across the organisation. About You / Requirements: Minimum of 5 years' experience in a Data Architect role. Proven experience designing and implementing enterprise-scale data architecture in … complex environments. Deep understanding of data modelling techniques, including conceptual, logical, and physical modelling. Strong expertise in Kimball methodology and dimensional modelling (e.g. star schema design). Experience with modern cloud data platforms, ideally including Microsoft Azure, Databricks, and associated tools (e.g. Azure Data Factory, Azure SQL, Synapse). Familiarity with modern data engineering practices, including the use …
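The star-schema design these listings emphasise can be illustrated with a minimal, self-contained sketch. This uses an in-memory SQLite database, and all table and column names are invented for illustration — not any client's actual model:

```python
# Minimal star-schema sketch: one dimension table, one fact table, and the
# characteristic query pattern (join fact to dimension, then aggregate).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per product, carrying descriptive attributes.
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT)")
# Fact table: one row per sale, holding measures plus a foreign key to the dimension.
cur.execute("CREATE TABLE fact_sales (product_key INTEGER, quantity INTEGER, amount REAL)")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 2, 20.0), (1, 1, 10.0), (2, 3, 45.0)])

# A typical star-schema query: revenue by product name.
rows = cur.execute("""
    SELECT d.name, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.name
    ORDER BY d.name
""").fetchall()
print(rows)  # [('Gadget', 45.0), ('Widget', 30.0)]
```

The point of the shape is that measures live in the fact table and descriptive attributes in the dimensions, so most analytical questions become a join plus a GROUP BY.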
of a modern data platform solution to handle large and complex data sets. Key tasks: Design Data Lake and Data Warehouse solutions. Design data models using Data Vault and dimensional modelling methods. Implement automated, reusable and efficient batch data pipelines and streaming data pipelines. Work closely with Governance and Quality teams to ensure metadata, catalogue, lineage and known …
London, South East, England, United Kingdom Hybrid / WFH Options
Circle Recruitment
and maintaining data pipelines using modern, cloud-based tools and practices Proficiency in Python and SQL for data engineering tasks Experience with DBT and a good understanding of data modelling approaches (e.g. star schema, dimensional modelling) Familiarity with Airflow or similar orchestration tools Comfortable working with AWS services such as Glue and S3, or equivalent cloud infrastructure … a better way for us to communicate, please do let us know. Data, Database, Engineer, Lead, Manager, Data Science, Data Architect, Business Intelligence, Python, SQL, DBT, Data Model, Data Modelling, AWS, Security Check, SC Level, SC Cleared, SC Clearance, Security Cleared, Security Clearance, Security Vetting Clearance, Active SC, SC Vetted, Cleared To A High Government Standard, DV Cleared, DV …
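Orchestration tools such as Airflow, named in the listing above, fundamentally resolve task dependencies into a valid execution order. A toy pure-Python illustration of that idea follows — the task names are invented, and this is the standard-library `graphlib`, not the Airflow API:

```python
# What an orchestrator does at its core: topologically sort a DAG of tasks
# so every task runs only after its upstream dependencies.
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on (hypothetical pipeline)
dag = {
    "extract": set(),
    "load_raw": {"extract"},
    "dbt_run": {"load_raw"},
    "publish_dashboard": {"dbt_run"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'load_raw', 'dbt_run', 'publish_dashboard']
```

Real orchestrators add scheduling, retries, and state tracking on top, but the dependency graph is the common foundation.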
London, South East, England, United Kingdom Hybrid / WFH Options
Randstad Technologies
Define scalable data models and implement architectural patterns such as lakehouse and medallion. Lead technical solution design during client engagements, from discovery to delivery. Establish and enforce data governance, modelling, and lifecycle standards. Support engineering and DevOps teams with guidance on best practices, CI/CD, and infrastructure-as-code. Requirements: 7+ years in data architecture or senior engineering … roles. Strong hands-on experience with Azure Databricks and Azure Data Factory. Proficient in SQL, Python, and Spark. Expertise in data modelling and architectural patterns for analytics (e.g. lakehouse, medallion, dimensional modelling). Solid understanding of cloud security, private networking, GDPR, and PII compliance. Excellent communication skills with a strong consulting mindset. Desirable: Experience with Microsoft Purview, Power …
and data analysis, ideally including both raw and aggregated data, with the ability to review transformation logic. Solid understanding of the full Software Development Lifecycle (SDLC). Appreciation of data modelling techniques (e.g. dimensional modelling, Data Vault). Strong knowledge of BCBS 239 regulations and their practical applications. Experience in analysing business processes and delivering regulatory solutions. Proficiency in creating …
orchestration and scheduling. Proficiency in Python for data manipulation, scripting, and automation. Experience with cloud platforms (e.g., AWS, Azure, GCP) and relevant data services. Understanding of data warehousing concepts, dimensional modeling, and ELT principles. Familiarity with data quality, governance, and security best practices. Excellent problem-solving, analytical, and communication skills. Ability to work independently and collaboratively in a fast …
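The ELT principle this listing asks for (load raw data first, then transform inside the warehouse with SQL, rather than transforming before loading as in ETL) can be sketched with SQLite standing in for the warehouse; the table names and data are invented:

```python
# ELT sketch: land raw, untyped records as-is, then clean and cast them
# with SQL inside the "warehouse" (an in-memory SQLite database here).
import sqlite3

conn = sqlite3.connect(":memory:")

# Extract & Load: store the source rows without cleaning them first.
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(1, "10.5"), (2, "3.0"), (3, None)])

# Transform: type-cast and filter in SQL, the defining ELT step.
conn.execute("""
    CREATE TABLE orders AS
    SELECT id, CAST(amount AS REAL) AS amount
    FROM raw_orders
    WHERE amount IS NOT NULL
""")
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 13.5
```

Keeping the raw table around is the practical payoff: transformations can be rerun or revised without re-extracting from the source.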
City of London, London, United Kingdom Hybrid / WFH Options
McCabe & Barton
Snowflake, Power BI, Python, and SQL. Your work will enable self-service analytics and support data governance across the business. Key Responsibilities: Develop robust ETL/ELT pipelines and dimensional models for BI tools. Define and implement data quality, ownership, and security standards. Empower business teams with intuitive, self-serve data models. Own data products end-to-end, from … design to continuous improvement. Promote innovation and best practices in data engineering. About You: Strong experience with SQL, Python, and BI tools (e.g. Power BI). Solid understanding of dimensional modelling and data architecture. Experience working in governed, decentralised data environments. Excellent communication and stakeholder engagement skills. Analytical mindset with a focus on delivering business value. If you are …
VAR, CE/PE & PFE. · Experience working with large-scale, multi-terabyte data warehouses, including performance tuning, query optimisation and execution plan analysis · Advanced knowledge of data warehousing principles, dimensional modelling and star schema design · Hands-on experience with SQL Server and Snowflake, including their architecture, features and best practices · Familiarity with data integration tools (SSIS, ADF) and … cyber security best practices · Effective communicator, able to build and maintain relationships with senior stakeholders · Strong analytical and problem-solving capabilities. Preferred: · Experience with data transformation and modelling · Familiarity with Agile methodologies and DevOps practices. Job Title: Head Of Risk. Location: London, UK. Rate/Salary: 700.00 - 1100.00 GBP Daily. Job Type: Contract. Trading as TEKsystems. Allegis …
with the potential for extension. This role offers a hybrid working arrangement, requiring 1-2 days per week onsite at Heathrow, Hounslow, with on-site parking available. Responsibilities: Data Modelling: Design and optimise star schema data models tailored to our client's business needs for streamlined analytics and reporting. Collaboration: Work closely with data architects, BI developers, and business … with business goals. Data Quality & Governance: Establish data quality checks and governance practices to ensure accuracy and integrity within data models. Skills/Must have: Proven experience in data modelling using the Kimball methodology, with a focus on dimensional modelling and star schemas. Strong proficiency in SQL and experience with data modelling tools like ER Studio, Power …
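A core Kimball technique behind star schemas like the ones described above is the Type 2 slowly changing dimension (SCD2): when an attribute changes, the current dimension row is expired and a new versioned row is appended, preserving history. A minimal sketch with invented data and field names:

```python
# SCD Type 2 sketch: never overwrite a changed attribute; expire the old
# row and append a new current one. Dates and records are made up.
from datetime import date

dim_customer = [
    {"key": 1, "customer_id": "C1", "city": "Leeds",
     "valid_from": date(2023, 1, 1), "valid_to": None, "current": True},
]

def scd2_update(dim, customer_id, new_city, as_of):
    """Expire the current row for customer_id and append the new version."""
    for row in dim:
        if row["customer_id"] == customer_id and row["current"]:
            row["valid_to"] = as_of   # close out the old version
            row["current"] = False
    dim.append({"key": max(r["key"] for r in dim) + 1,
                "customer_id": customer_id, "city": new_city,
                "valid_from": as_of, "valid_to": None, "current": True})

scd2_update(dim_customer, "C1", "London", date(2024, 6, 1))
current = [r for r in dim_customer if r["current"]]
print(len(dim_customer), current[0]["city"])  # 2 London
```

Facts loaded after the change reference the new surrogate key, so historical facts still join to the attribute values that were true at the time.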
experience. Strong background in System Integration, Application Development, or Data Warehouse projects across enterprise technologies. Experience with object-oriented languages (e.g. Python, PySpark) and frameworks. Expertise in relational and dimensional modeling, including big data technologies. Proficiency in Microsoft Azure components such as Azure Data Factory, Data Lake, SQL, Databricks, HDInsight, ML Service. Good knowledge of Python and Spark. Experience …
them with robust data pipelines. DESIRABLE LANGUAGES/TOOLS Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks, Spark, Kafka, and Flink. Experience in using modern data architectures …
analytics engineering, data engineering, or a related field. Advanced SQL skills and experience architecting solutions on modern data warehouses (e.g., Snowflake, BigQuery, Redshift). Hands-on experience with advanced modelling techniques in dbt. A deep understanding of ETL/ELT processes and tools (e.g., Fivetran, Airbyte, Stitch). Experience with data visualisation tools (e.g., Mode, Looker, Tableau, Power BI … and designing robust BI semantic layers. Exceptional understanding of data warehousing principles, dimensional modeling, and analytics best practices. Proficiency in Git-based workflows with a strong background in implementing and managing CI/CD pipelines for data transformations. Outstanding communication, collaboration, and stakeholder management skills, with a demonstrated ability to influence and lead cross-functional initiatives. Nice to Have …
be responsible for: Technical Delivery & Leadership: Leading the design and implementation of Oracle Analytics solutions (FDI, OAC, ODI) to meet client requirements. Architecting end-to-end solutions encompassing data modelling, ingestion, transformation, visualisation, and predictive modelling. Designing and implementing data pipelines and integrations for diverse data sources. Developing and deploying machine learning models using OML, Oracle Data Science, and … Analytics Cloud (OAC) components, including Data Visualisation, Essbase, Data Preparation, and Data Flows. Proven experience in implementing Oracle Analytics or a similar role. Strong experience in data warehousing concepts, dimensional modelling, and ETL processes. Ability to translate and present technical information to a non-technical audience in a clear, concise, appropriate manner. Ability to translate business requirements into … analytical and problem-solving abilities. Oracle certifications in relevant technologies are highly desirable. Qualified/Part-Qualified ACA/CIMA/ACCA (or equivalent) is advantageous. Experience in Data Modelling. Security Clearance, or at least eligibility to support activities in the Public Sector. Connect to your business - Technology and Transformation. Distinctive thinking, deep expertise, innovation and collaborative working. That's …
They closely collaborate with data scientists, analysts, and software engineers to ensure efficient data processing, storage, and retrieval for business insights and decision-making. Their expertise in data modelling, ETL (Extract, Transform, Load) processes, and big data technologies makes it possible to develop robust and reliable data solutions. RESPONSIBILITIES Data Pipeline Development: Design, implement, and maintain scalable data … in software engineering a plus. DESIRABLE LANGUAGES/TOOLS Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks, Spark, Kafka, and Flink. Experience in using modern data architectures …
quality dimensions and integrate metrics with centralized tools that measure data products' quality and reliability in the organization. Qualifications: Understanding of data engineering (including SQL, Python, Data Warehousing, ETL, Dimensional Modelling, Analytics). Understanding of cloud data infrastructure elements, ideally AWS (Redshift, Glue, Athena, S3), and understanding of existing governance frameworks of data quality and their dimensions (DAMA …
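The quality dimensions this listing refers to (in the DAMA sense) are typically operationalised as simple metrics over a dataset. A hedged sketch of two common ones, completeness and uniqueness, computed over invented records:

```python
# Two common data-quality metrics: completeness (fraction of populated
# values) and uniqueness (fraction of distinct values). Toy records only.
records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "b@example.com"},  # duplicate id
]

def completeness(rows, field):
    """Fraction of rows where `field` is populated (not NULL)."""
    return sum(r[field] is not None for r in rows) / len(rows)

def uniqueness(rows, field):
    """Fraction of rows carrying a distinct value of `field`."""
    return len({r[field] for r in rows}) / len(rows)

print(round(completeness(records, "email"), 2))  # 0.67
print(round(uniqueness(records, "id"), 2))       # 0.67
```

In practice such metrics would run per data product and feed a central dashboard, with thresholds turning them into pass/fail checks.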
platform, ML & analytics. Governance models & data dictionaries adopted org-wide, improving trust. Clear growth and upskilling within the data engineering team. Deep experience in data engineering & architecture (Data Lakes, dimensional models, MySQL/PostgreSQL, MongoDB). Familiar with orchestration, CI/CD for data, and data quality/observability tools. Hands-on leader balancing strategic planning with code/…
the right time. Essentially, to ensure you succeed in this role you're going to need Deep, hands-on experience designing and building data warehouses with strong command of dimensional modeling (e.g., Kimball methodology) Expertise in Google Cloud Platform, especially BigQuery architecture, optimization, and cost management Advanced SQL skills and production-level experience using dbt (or similar tools) to …
at scale, we'd love to hear from you. What you'll do: Work with customers daily to transform their existing solutions, spreadsheets, and business problems into sophisticated multi-dimensional models by: understanding business requirements and documenting them; designing and building the corresponding Pigment applications; training/co-building with the customer so they are self-sufficient in the solution; participating …
day per week). Duration: 1 Month. Availability: Immediate joiner preferred. Overview of the role: BI Development & Reporting. We need BI professionals familiar with OLAP concepts, such as: dimensional modeling (facts, dimensions, hierarchies); analytical KPIs and business logic; data modelling best practices. Integration & Interoperability: * Enable interoperability with third-party tools like Tableau and Amazon QuickSight * Manage secure integrations …
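The OLAP notions this listing names — facts, dimensions, and hierarchies — come together in the roll-up operation: aggregating the same measure at different levels of a dimension hierarchy. A few lines of Python with invented data illustrate it:

```python
# OLAP roll-up sketch: the same sales measure aggregated at two levels of
# a geographic hierarchy (city -> country). Data is made up.
from collections import defaultdict

facts = [
    {"city": "London", "country": "UK", "sales": 100},
    {"city": "Leeds",  "country": "UK", "sales": 50},
    {"city": "Paris",  "country": "FR", "sales": 70},
]

def rollup(rows, level, measure):
    """Sum `measure` grouped by the given hierarchy level."""
    out = defaultdict(int)
    for r in rows:
        out[r[level]] += r[measure]
    return dict(out)

print(rollup(facts, "city", "sales"))     # {'London': 100, 'Leeds': 50, 'Paris': 70}
print(rollup(facts, "country", "sales"))  # {'UK': 150, 'FR': 70}
```

Drill-down is the inverse movement, back from country to city; BI tools such as Tableau perform exactly these aggregations against the cube or semantic model.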
together! The Role: As a Senior Principal Data Scientist in the Multimodal Data & Analytics group, you will be responsible for the discussion and implementation of data science and high-dimensional modeling methodologies applied to patient-level data (including various biomarker, clinical and outcomes data) across clinical development. You will combine your data science and AI skills and your scientific … identify opportunities for influencing internal decision making as well as discussions on white papers/regulatory policy. You will perform hands-on analysis of integrated clinical, outcomes and high-dimensional, patient-level biomarker data from clinical trials and the real world (genomics, transcriptomics, proteomics, flow cytometry, etc.) to generate fit-for-purpose evidence that is applied to decision making … selection methods (e.g. lasso, elastic net, random forest), design of clinical trials. Familiarity with statistical and analytical methods for genetics and -omics data analysis and working knowledge of high-dimensional biomarker platforms (e.g. next-generation sequencing, transcriptomics, proteomics, flow cytometry, etc.). Strong programming skills in R and Python. Demonstrated knowledge of data visualization, exploratory analysis, and predictive modeling. …
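As a rough illustration of the lasso feature-selection method named above, here is a minimal pure-Python coordinate-descent sketch. The toy data and penalty are invented, and a real analysis would use glmnet in R or scikit-learn in Python; the point is to show why the L1 penalty zeroes out irrelevant coefficients:

```python
# Lasso via cyclic coordinate descent with soft-thresholding.
# y depends only on the first feature; the second is pure noise, and the
# L1 penalty shrinks its coefficient exactly to zero.

def soft_threshold(z, g):
    """Shrink z toward zero by g; values within [-g, g] become exactly 0."""
    return max(z - g, 0.0) if z > 0 else min(z + g, 0.0)

def lasso(X, y, lam, iters=200):
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # partial residual: y minus the fit from all features except j
            r = [y[i] - sum(beta[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            norm = sum(X[i][j] ** 2 for i in range(n))
            beta[j] = soft_threshold(rho, lam) / norm
    return beta

X = [[1, 0.1], [2, -0.2], [3, 0.05], [4, -0.1]]
y = [2.0, 4.0, 6.0, 8.0]  # exactly 2 * first feature
beta = lasso(X, y, lam=0.5)
print([round(b, 2) for b in beta])  # [1.98, 0.0]
```

The noise feature's coefficient lands at exactly 0.0 rather than merely small, which is what makes the lasso usable as a feature-selection method; elastic net adds an L2 term to the same machinery.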