Foundational understanding of Big Data architecture (Data Lakes, Data Warehouses) and distributed processing concepts (e.g., MapReduce). ETL/ELT: Basic knowledge of ETL principles and data modeling (star schema, snowflake schema). Version Control: Practical experience with Git (branching, merging, pull requests). Preferred Qualifications (A Plus): Experience with a distributed computing framework such as Apache Spark …
data engineering, data wrangling and pipeline development. Cloud Platforms: Hands-on experience working with Azure; AWS experience is considered, however Azure exposure is essential. Data Warehousing: Proven expertise with Snowflake – schema design, performance tuning, data ingestion, and security. Workflow Orchestration: Production experience with Apache Airflow (Prefect, Dagster or similar), including authoring DAGs, scheduling workloads and monitoring pipeline execution. … Data Modeling: Strong skills in dbt, including writing modular SQL transformations, building data models, and maintaining dbt projects. SQL Databases: Extensive experience with PostgreSQL, MySQL (or similar), including schema design, optimization, and complex query development. Infrastructure as Code: Production experience with declarative infrastructure definition – e.g. Terraform, Pulumi or similar. Version Control and CI/CD: Familiarity with Git-based …
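The orchestration requirement above centres on expressing pipelines as DAGs in Apache Airflow. As a language-level sketch of the underlying idea (not Airflow's actual API), Python's standard-library `graphlib` resolves task dependencies into a valid execution order; the task names here are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# mirroring how an orchestrator like Airflow schedules a DAG.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_join"},
}

# static_order() yields tasks so that every dependency runs first.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

In a real Airflow DAG the same dependency structure would be declared with operators and `>>` chaining; the topological guarantee is the same.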
City of London, London, United Kingdom Hybrid/Remote Options
TrueNorth®
we are unable to offer sponsorship for this position. What You’ll Bring: Strong Python, SQL, and dbt experience in production settings. Knowledge of cloud data warehouses (BigQuery, Redshift, Snowflake) and denormalised data models. Experience developing ETL/ELT pipelines across varied data sources. Experience deploying and managing data infrastructure in a cloud environment, e.g. GCP, AWS or Azure. …
City of London, London, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
are continuing to expand their data engineering practice. They are now on the lookout for an experienced Senior Data Engineer to join the team, bringing technical expertise in Snowflake, AWS, dbt, and Terraform. Salary and Benefits: Competitive salary of up to £95k (DOE); Up to 10% Bonus; 14% Pension Contribution; Hybrid working from London office (2 in office … Days Annual Leave + Bank Holidays; Free Company Shares; Private medical care; and many more. Role and Responsibilities: Design, develop, and maintain scalable, high-performance data pipelines using Snowflake on AWS. Build and optimise Snowflake environments: warehouses, schemas, roles, security policies; Snowpipe, Streams & Tasks; partitioning, clustering, and performance tuning. Develop robust ELT pipelines using dbt for modelling and … provide well-modelled, documented datasets. Establish and maintain CI/CD pipelines for data engineering using Git-based workflows. Drive best practices in data governance, cataloguing, and lineage. Optimise Snowflake usage and AWS resource costs across environments. Requirements: Strong Snowflake experience; proficiency across an AWS tech stack; dbt expertise; Terraform experience; expert SQL and Python; data modelling; data …
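The Snowflake Streams & Tasks responsibility above refers to incremental, change-driven processing: a stream records rows changed since the last read, and a task periodically merges them into a target. The snippet below is only an illustrative stand-in using SQLite (not Snowflake's actual objects); table and column names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Source table, plus a change-log table standing in for a Snowflake stream.
cur.execute("CREATE TABLE raw_orders (id INTEGER PRIMARY KEY, amount REAL)")
cur.execute("CREATE TABLE orders_stream (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE orders_target (id INTEGER PRIMARY KEY, amount REAL)")

def ingest(rows):
    # New rows land in both the source and the change log.
    cur.executemany("INSERT OR REPLACE INTO raw_orders VALUES (?, ?)", rows)
    cur.executemany("INSERT INTO orders_stream VALUES (?, ?)", rows)

def run_task():
    # The "task": merge pending changes into the target, then clear the
    # stream, mirroring how consuming a Snowflake stream advances its offset.
    cur.execute(
        "INSERT OR REPLACE INTO orders_target SELECT id, amount FROM orders_stream"
    )
    cur.execute("DELETE FROM orders_stream")

ingest([(1, 10.0), (2, 20.0)])
run_task()
ingest([(2, 25.0), (3, 30.0)])  # id 2 updated, id 3 new
run_task()

rows = cur.execute("SELECT id, amount FROM orders_target ORDER BY id").fetchall()
print(rows)  # [(1, 10.0), (2, 25.0), (3, 30.0)]
```

In Snowflake itself this pattern is `CREATE STREAM ... ON TABLE` plus a scheduled `CREATE TASK` running a `MERGE`; the point here is only the incremental shape of the workload.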
impactful dashboards. Advanced SQL (T-SQL preferred) for managing and querying relational databases. Experience with ETL tools (SSIS, Azure Data Factory, or similar). Strong data modelling skills (star/snowflake schema). Familiarity with Azure SQL, Synapse Analytics, or other cloud platforms is a plus. Domain Knowledge: Solid experience in the Lloyd’s/London Market insurance environment. Strong …
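Star and snowflake schemas recur across these requirements. A minimal, hypothetical star schema (one fact table of measures with foreign keys into dimension tables) can be sketched with SQLite, including the typical join-and-aggregate query pattern:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables hold descriptive attributes.
cur.execute(
    "CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER)"
)
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT)")
# The fact table holds measures plus foreign keys to the dimensions.
cur.execute("""CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount REAL)""")

cur.execute("INSERT INTO dim_date VALUES (20240101, 2024, 1)")
cur.execute("INSERT INTO dim_product VALUES (1, 'widget')")
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(20240101, 1, 10.5), (20240101, 1, 4.5)])

# Typical star-schema query: join facts to dimensions, then aggregate.
total = cur.execute("""
    SELECT d.year, p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.year, p.name
""").fetchone()
print(total)  # (2024, 'widget', 15.0)
```

A snowflake schema differs only in that the dimensions are further normalised (e.g. `dim_product` referencing a separate `dim_category` table).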
standards for clients across financial services and energy, helping to shape data strategies and improve architecture maturity. ROLE AND RESPONSIBILITIES: Design and deliver relational (3NF) and dimensional (star/snowflake) data models. Define modelling frameworks and best practices across client engagements. Work closely with architecture teams to ensure alignment with data strategies. Support metadata management, governance, and documentation processes. …
City of London, London, United Kingdom Hybrid/Remote Options
Formula Recruitment
data-driven opportunities and solve real problems. Collect, analyse, and interpret large datasets to support strategic decisions and operational excellence. Build and maintain scalable data models in dbt and Snowflake. Develop impactful dashboards and self-service tools in Power BI, Tableau, or similar platforms. Promote data literacy across the company by communicating insights clearly and confidently. What You …
into a more engineering-focused position, someone who enjoys understanding the business context just as much as building the data solutions behind it. You'll work extensively with Python, Snowflake, SQL, and dbt to design, build, and maintain scalable, high-quality data pipelines and models that support decision-making across the business. This is a hands-on, collaborative role … s confident communicating with data, product, and engineering teams, not a 'heads-down coder' type. Top 4 Core Skills: Python - workflow automation, data processing, and ETL/ELT development. Snowflake - scalable data architecture, performance optimisation, and governance. SQL - expert-level query writing and optimisation for analytics and transformations. dbt (Data Build Tool) - modular data modelling, testing, documentation, and version … build, and maintain dbt models and SQL transformations to support analytical and operational use cases. Develop and maintain Python workflows for data ingestion, transformation, and automation. Engineer scalable, performant Snowflake pipelines and data models aligned with business and product needs. Partner closely with analysts, product managers, and engineers to translate complex business requirements into data-driven solutions. Write production …
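Several of these roles ask for dbt's core technique: modular SQL models that reference each other via `ref()`. The sketch below imitates that mechanism in plain Python so it can run anywhere (it is not dbt's API); each "model" is a SQL string, and `ref()` calls are resolved into inline subqueries:

```python
import re
import sqlite3

# Hypothetical "models", in the spirit of dbt's modular SQL files.
models = {
    "stg_orders": "SELECT id, amount FROM raw_orders WHERE amount > 0",
    "order_totals": (
        "SELECT COUNT(*) AS n, SUM(amount) AS total "
        "FROM {{ ref('stg_orders') }}"
    ),
}

def compile_model(name: str) -> str:
    # Resolve {{ ref('...') }} into an inline subquery. Real dbt resolves
    # refs to materialised relations; a subquery keeps this sketch
    # self-contained while preserving the dependency idea.
    return re.sub(
        r"\{\{\s*ref\('(\w+)'\)\s*\}\}",
        lambda m: f"({compile_model(m.group(1))})",
        models[name],
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(1, 10.0), (2, -5.0), (3, 20.0)])

result = conn.execute(compile_model("order_totals")).fetchone()
print(result)  # (2, 30.0)
```

The staging model filters out the negative-amount row, so the downstream model counts two orders; dbt adds testing, documentation, and materialisation on top of exactly this dependency graph.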
City of London, London, United Kingdom Hybrid/Remote Options
Harnham
while engaging with senior stakeholders to deliver impactful solutions. Specifically, you can expect to be involved in the following: Designing and implementing both relational (3NF) and dimensional (star/snowflake) data models for large-scale financial systems and analytical platforms. Working hands-on with ETL design, metadata management, and data cataloguing, ensuring robust data lineage and governance. Building standards …
IRIS Software Group is one of the UK’s largest privately held software companies. Its purpose is to be the most trusted provider of mission-critical software and services, ensuring customers get it right first time, every time. IRIS has …
Role – Technology Lead
Technology – Snowflake, SQL Server, PostgreSQL, Data Warehousing, Data Lake, Medallion, Data Vault, Data Fabric, Semantic modelling
Location – UK
Business Unit – DNAINS
Compensation – Competitive (including bonus)
Job Description: We are seeking an experienced Data Modeler who can analyse and understand the existing customer data and analytics models on SQL Server, and design efficient data models for Snowflake. … data models to support scalable, high-performance analytics and reporting solutions. You will analyse existing data models built on SQL Server and guide the development of optimized structures on Snowflake, ensuring alignment with business requirements and enterprise standards. You will collaborate with Business Analysts and cross-functional teams to translate complex reporting and analytics needs into efficient, well-governed … data requirements. Ensure data models align with industry standards and regulatory guidelines. Required: 5+ years of experience in data modeling and database design. Strong expertise in SQL Server and Snowflake data modeling. Proficiency in SQL for data analysis and validation. Experience in data architecture principles and data governance. Solid understanding of Insurance domain concepts (Specialized Insurance, London Market, Regulatory …
understanding of model selection, fine-tuning, prompt engineering, and evaluation methods. Python, SQL, dbt, Airflow. Understand how to frame prompts, evaluate outputs of LLMs, and manage prompts with LangChain. Platforms: Snowflake Cortex, AWS Bedrock. Evaluate cost/performance advantages of models on Bedrock vs Snowflake, the size of the model, fine-tune smaller models. Create logic for summarising calls … transcripts, validate QA checklist. Build evaluation framework. Integrate LLM APIs to process text at scale via Snowflake Cortex or AWS Bedrock. Build APIs on AWS to connect to other systems (Slack, Excel, Kraken). Implement data security and governance practices to work with PII data. If this sounds like an opportunity you are interested in, apply now for an immediate …
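The last listing asks for logic that validates LLM-generated call summaries against a QA checklist. Without access to Bedrock or Cortex, the validation step itself can be sketched in plain Python; the checklist items, keywords, and summary text below are invented for illustration:

```python
# A hypothetical QA checklist: each item maps to keywords, at least one of
# which should appear in the model's summary of a call transcript.
CHECKLIST = {
    "greeting_confirmed": {"greeted", "greeting", "welcomed"},
    "issue_identified": {"issue", "problem", "complaint"},
    "resolution_offered": {"resolved", "refund", "callback"},
}

def validate_summary(summary: str) -> dict:
    """Return a pass/fail flag per checklist item for an LLM-produced summary."""
    words = set(summary.lower().split())
    return {item: bool(keywords & words) for item, keywords in CHECKLIST.items()}

summary = "Agent greeted the customer, identified a billing issue and offered a refund"
report = validate_summary(summary)
print(report)
```

In the real pipeline the `summary` string would come from an LLM call via Snowflake Cortex or AWS Bedrock, and an evaluation framework would aggregate these per-item flags across many transcripts; keyword matching is only the simplest possible check and would likely be replaced by an LLM-as-judge or embedding-based comparison.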