of data pipelines. Demonstrated ability to design and implement data integration and conversion pipelines using ETL/ELT tools, accelerators, and frameworks such as Azure Data Factory, Azure Synapse, Snowflake (Cloud), SSIS, or custom scripts. Skilled in developing reusable ETL frameworks for data processing. Proficient in at least one programming language commonly used for data manipulation and scripting, including … as Code. Advanced SQL skills and experience working with relational databases such as PostgreSQL, SQL Server, Oracle, and MySQL. Experience implementing solutions on cloud-based data platforms such as Azure, Snowflake, Google Cloud, and related accelerators. Experience developing and deploying containerised microservices architectures. Understanding of data modelling techniques, including star schema, snowflake schema, and third normal form …
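As an illustration of the star-schema modelling this listing asks for, here is a minimal sketch using SQLite: one fact table referencing two dimensions, then the typical fact-to-dimension join. All table and column names are hypothetical, chosen only for the example.

```python
import sqlite3

# Minimal star schema: one fact table referencing two dimension tables.
# Names are illustrative, not from any specific employer's model.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_name TEXT
);
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,
    calendar_date TEXT
);
CREATE TABLE fact_sales (
    sales_key INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    amount REAL
);
INSERT INTO dim_customer VALUES (1, 'Acme Ltd');
INSERT INTO dim_date VALUES (20240101, '2024-01-01');
INSERT INTO fact_sales VALUES (1, 1, 20240101, 99.50), (2, 1, 20240101, 0.50);
""")

# Typical star-schema query: join the fact to its dimensions and aggregate.
row = conn.execute("""
    SELECT c.customer_name, d.calendar_date, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON f.customer_key = c.customer_key
    JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY c.customer_name, d.calendar_date
""").fetchone()
print(row)  # ('Acme Ltd', '2024-01-01', 100.0)
```

A snowflake schema differs only in that the dimensions themselves are further normalised into sub-dimension tables.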
enterprise data management and governance principles. Proven experience delivering Business Intelligence (BI) solutions and dashboards to enable data-driven decisions. Experience designing relational and dimensional data models (e.g. star schema, snowflake, etc.). Proficient in ETL and data warehousing, including handling slowly changing dimensions. Excellent communication and interpersonal skills, with the ability to liaise confidently between technical and …
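The "slowly changing dimensions" requirement above can be sketched in plain Python. This is an illustrative Type 2 update (the field names and record shape are assumptions, not any firm's actual implementation): when a tracked attribute changes, the current row is expired and a new version appended, preserving history.

```python
from datetime import date

def scd2_upsert(dimension, incoming, today):
    """Apply a Type 2 slowly-changing-dimension update: close the
    current row and append a new version when an attribute changes,
    so the full history is preserved. Field names are illustrative."""
    for row in dimension:
        if row["customer_id"] == incoming["customer_id"] and row["is_current"]:
            if row["city"] == incoming["city"]:
                return dimension          # no change, nothing to do
            row["is_current"] = False     # expire the old version
            row["valid_to"] = today
    dimension.append({
        "customer_id": incoming["customer_id"],
        "city": incoming["city"],
        "valid_from": today,
        "valid_to": None,
        "is_current": True,
    })
    return dimension

dim = [{"customer_id": 1, "city": "Leeds",
        "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True}]
dim = scd2_upsert(dim, {"customer_id": 1, "city": "London"}, date(2024, 6, 1))
print(len(dim))                               # 2: history preserved
print(dim[0]["is_current"], dim[1]["city"])   # False London
```

In a warehouse the same logic is usually expressed as a `MERGE` statement or a dbt snapshot rather than row-by-row Python.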
Job Title: Snowflake Centre of Excellence Lead Location: Central London (Hybrid - 2 to 3 days on site per week) Employment Type: Permanent Salary: up to £120,000 per annum + benefits About the Role: We are working with a prestigious client based in London who are seeking a Snowflake Lead to play a pivotal role in establishing and … scaling their Snowflake capability. This is a unique opportunity for a seasoned Snowflake professional to build a Centre of Excellence from the ground up within a fast-paced, high-impact environment. As the Snowflake CoE Lead, you will be instrumental in shaping the organisation's Snowflake strategy, architecture, and delivery model. You'll bring your deep … technical expertise, leadership experience, and direct engagement with Snowflake to build a best-in-class data platform offering. Key Responsibilities: Lead the design, setup, and growth of a Snowflake practice, including establishing a Centre of Excellence. Architect, implement, and maintain scalable data solutions using Snowflake. Collaborate closely with stakeholders across the organisation and with Snowflake directly to …
Skills and Experience 5+ years of hands-on experience designing and managing modern data warehouses (ideally in Redshift) Advanced SQL skills and strong understanding of data modelling (star/snowflake schemas) Deep expertise in dbt - including documentation, testing, and CI/CD Proficiency with Python or Bash for automation and orchestration Familiarity with pipeline orchestration tools (e.g., Airflow) Knowledge …
practices, system development life cycle management, IT services management, agile and lean methodologies, infrastructure and operations, and EA and ITIL frameworks. Proficiency with data warehousing solutions (e.g., Google BigQuery, Snowflake). Expertise in data modeling tools and techniques (e.g., SAP PowerDesigner, Sparx EA). Strong knowledge of SQL and NoSQL databases (e.g., MongoDB, Cassandra). Familiarity with cloud platforms …
full data lifecycle, from design and development to deployment and optimisation, is highly desirable. Strong knowledge and experience of data pipelines, data modelling, database design, and data warehousing. Proficiency in Snowflake, Power BI, and medallion architecture. Ideal candidates will have been accountable for building out a data platform capability on GCP BigQuery or Snowflake, implementing a medallion model …
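The "medallion model" this listing refers to layers data as raw (bronze), cleaned (silver), and aggregated (gold). A toy sketch of that flow in plain Python, where the record shapes and cleaning rules are assumptions for illustration only:

```python
# Toy medallion flow: bronze keeps raw records exactly as ingested,
# silver cleans and deduplicates them, gold aggregates for reporting.
bronze = [
    {"order_id": "1", "amount": "10.0", "country": "uk"},
    {"order_id": "1", "amount": "10.0", "country": "uk"},   # duplicate ingest
    {"order_id": "2", "amount": "5.5",  "country": "UK"},
]

def to_silver(rows):
    seen, out = set(), []
    for r in rows:
        if r["order_id"] in seen:
            continue                              # drop duplicate ingests
        seen.add(r["order_id"])
        out.append({"order_id": int(r["order_id"]),
                    "amount": float(r["amount"]),
                    "country": r["country"].upper()})  # standardise types/case
    return out

def to_gold(rows):
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'UK': 15.5}
```

On Snowflake or BigQuery each layer would typically be a schema or dataset, with dbt models (rather than Python functions) performing the bronze-to-silver and silver-to-gold transformations.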
improving a system that handles a high volume of traffic so experience with cloud and data warehouse infrastructure will help you in this role (we use AWS, Cloudflare and Snowflake). Familiarity with infrastructure as code will also help when updating our cloud architecture (we use Terraform). We place a large focus on data quality so you'll … not just the code, but the architecture of our platforms and everything that enables the business to thrive. Gain expertise over our tools and services: Python, Docker, GitHub Actions, Snowflake, AWS Participate in all team ceremonies and have direct input in the team's ways of working. This is a high-trust, supportive, and collaborative environment where you will … experience - we want to hire people to grow into the role and beyond. About the team: Python is our bread and butter. The wider data platform team uses dbt, Snowflake, and Looker to model, transform, and expose data for analytics and reporting across the business. We use Docker and Kubernetes to manage our production services. We use GitHub Actions …
data and analytics for a Fortune 100 company. As a key member of this team, you will play a pivotal role in designing and implementing data warehousing solutions using Snowflake and AWS. You will help drive the evolution of our data architecture as we move our North Star from Redshift to Snowflake. At Liberty, we deliver our customers peace … at least one programming language, with Python strongly preferred for data processing, automation, and pipeline development • Strong acumen for application health through performance monitoring, logging, and debugging • AWS or Snowflake certifications are a plus About Liberty Specialty Markets (LSM) Liberty Specialty Markets is part of Global Risk Solutions and the broader Liberty Mutual Insurance Group, which is a leading …
ETL code at scale. Modern Data Pipelines: Experience with batch and streaming frameworks (e.g., Apache Spark, Flink, Kafka Streams, Beam), including orchestration via Airflow, Prefect or Dagster. Data Modeling & Schema Management: Demonstrated expertise in designing, evolving, and documenting schemas (OLAP/OLTP, dimensional, star/snowflake, CDC), data contracts, and data cataloguing. API & Integration Fluency: Building data ingestion …
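The "data contracts" item above can be illustrated with a minimal schema check applied at ingestion time. The contract fields below are hypothetical; real contracts also cover nullability, semantics, and versioning, and are often enforced with dedicated tooling rather than hand-rolled checks.

```python
# A data contract pins the fields and types a producer promises to emit.
# The field set below is illustrative only.
CONTRACT = {"event_id": str, "user_id": int, "amount": float}

def validate(record, contract=CONTRACT):
    """Return a list of contract violations for one record."""
    errors = []
    for field, expected in contract.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected):
            errors.append(f"{field}: expected {expected.__name__}, "
                          f"got {type(record[field]).__name__}")
    return errors

good = {"event_id": "e1", "user_id": 42, "amount": 9.99}
bad = {"event_id": "e2", "user_id": "42"}
print(validate(good))  # []
print(validate(bad))   # ['user_id: expected int, got str', 'missing field: amount']
```

Running such checks at the pipeline boundary turns silent schema drift into an explicit, reviewable failure.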
function and work closely with senior stakeholders, including programme sponsors and business leads. Key responsibilities - Platform Engineer Support the build and enhancement of a cloud-based data platform using Snowflake on Azure, working within an established technical framework. Develop infrastructure components and manage deployment through infrastructure-as-code practices. Collaborate with internal stakeholders to ensure the data platform design … assurance processes. Contribute to platform documentation and technical standards. There is potential for future responsibility in mentoring or overseeing junior team members. Required experience Hands-on experience working with Snowflake, including schema development and data pipeline integration. Familiarity with Azure services, including resource provisioning and identity setup. Experience using Terraform for infrastructure deployment (ideally with cloud data services …
a unified, scalable architecture to enable improved analytics, data governance, and operational insights. Technical requirements: Substantial experience designing and implementing data solutions on Microsoft Azure Hands-on expertise with Snowflake, including data modelling, performance optimisation, and secure data sharing practices. Proficiency in dbt (Data Build Tool), with a strong understanding of modular pipeline development, testing, and version control. Familiarity … both enterprise reporting and self-service analytics. Candidates must demonstrate experience working in Agile environments, delivering in iterative cycles aligned to business value. Tech Stack: Azure Power BI dbt Snowflake About us esynergy is a technology consultancy and we build products, platforms and services to accelerate value for our clients. We drive measurable impact that is tightly aligned to …
Design and implement solutions using GCP , with a strong focus on Data Lakehouse Architecture , Master Data Management (MDM) , and Dimensional Data Modeling Work with modern databases and platforms, including Snowflake , Oracle , SQL Server , and PostgreSQL Apply Agile and conventional methodologies to manage development and delivery lifecycles Communicate effectively with stakeholders across the business to ensure alignment and engagement Required … Technical Skills: GCP Data Architecture Data Lakehouse Architecture MDM (Conceptual) Dimensional Data Modeling Snowflake, Oracle, SQL Server, PostgreSQL Python and Power BI (desirable) Knowledge of test automation tools and practices Strong understanding of Agile and software development best practices Ideal Candidate Profile: Extensive experience in delivering data programmes within the Insurance or Reinsurance sector Strong leadership and organisational skills …
London, South East, England, United Kingdom Hybrid / WFH Options
Randstad Technologies
the role is fully remote. For location-specific details, please connect with our recruiting team. What You Will Do: Model usage-based economics - Build and maintain dbt models in Snowflake that translate raw product telemetry into ARR, margin, and unit economics views. Forecast revenue & costs - Design Python-driven Monte Carlo and statistical models to project usage, revenue, and gross … testing, reconciliation, and data quality SLAs so Finance can audit every revenue and cost metric back to source. Mentor analysts & engineers - Provide best-practice guidance on dbt design patterns, Snowflake optimization, and Python analytical tooling. About You: 7+ years in analytics or data science roles focused on Finance (SaaS or usage-based preferred). Deep hands-on expertise with dbt …
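The Monte Carlo revenue forecasting described in this listing can be sketched with the standard library alone: compound a random monthly growth rate over a horizon, repeat many times, and read percentiles off the resulting distribution. The starting ARR and growth parameters below are made-up illustrations, not figures from the role.

```python
import random
import statistics

def simulate_arr(start_arr, months, mean_growth, sd_growth, runs, seed=0):
    """Monte Carlo ARR forecast: compound a normally distributed monthly
    growth rate over `months`, repeated `runs` times. All parameter
    values used below are illustrative assumptions."""
    rng = random.Random(seed)  # seeded for reproducibility
    outcomes = []
    for _ in range(runs):
        arr = start_arr
        for _ in range(months):
            arr *= 1 + rng.gauss(mean_growth, sd_growth)
        outcomes.append(arr)
    return outcomes

outcomes = simulate_arr(start_arr=1_000_000, months=12,
                        mean_growth=0.02, sd_growth=0.01, runs=5_000)
ranked = sorted(outcomes)
median = statistics.median(outcomes)
p10, p90 = ranked[500], ranked[4500]
print(f"median {median:,.0f}, P10 {p10:,.0f}, P90 {p90:,.0f}")
```

Percentile bands (P10/P90) are usually what Finance consumes from such a model, since they express the forecast as a range rather than a single point.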
Where Data Does More. Join the Snowflake team. Commercial Counsel There is only one Data Cloud. Snowflake's founders started from scratch and designed a data platform built for the cloud that is effective, affordable, and accessible to all data users. But it didn't stop there. They engineered Snowflake to power the Data Cloud, where thousands … Develop and deliver training on legal issues and contract processes • Support continuous improvements of forms, policies and processes to help streamline, simplify and automate the contracting processes A successful Snowflake Commercial Counsel will be able to forge strong relationships with Snowflake's customers, and also with cross-functional groups within Snowflake, including sales, professional services and alliances. … The right candidate will be flexible, fun, hard-working and have the desire to be impactful, as your contributions will be a key driver to Snowflake's continued growth and success. A fully qualified German lawyer with 3-7 years of work experience Experience working on enterprise Cloud/SaaS contracts, including AI Experience advising on GDPR data processing …
modelled, documented, and served across the business, working with a modern stack and high-impact teams. Key responsibilities in the role Build and maintain robust dbt models within the Snowflake data warehouse Design scalable solutions that translate business questions into reliable datasets Collaborate with teams across product, ops, marketing, and finance Improve data quality, documentation, and platform reliability Integrate … third-party sources Support self-serve analytics through Looker What They’re Looking For Previous experience in an analytics engineering role (or similar) Strong experience with SQL, dbt, and Snowflake Someone just as confident building models as they are shaping best practices Comfortable in ambiguity and proactive about making things better Bonus if you’ve worked in insurance or …