About The Role Hippo is recruiting for a Principal Data Engineer to join our Hippo Herd. Principal Data Engineers work in multi-disciplinary teams that build, support and maintain user-centred digital solutions that offer real value and work for everyone.
testing across data ingestion and transformation pipelines (ETL/ELT). Validate analytical data models to ensure accuracy and reliability. Utilize cloud platforms (AWS, GCP) and tools such as Snowflake, dbt, Fivetran, and Tableau for testing purposes. Thought Leadership and Collaboration: Act as a thought leader for testing within the data analytics space, advocating for modern testing methodologies and technologies like … data analytics or related function. Proficiency in testing data ingestion and transformation pipelines (ETL/ELT). Hands-on experience with cloud platforms (AWS, GCP) and tools such as Snowflake, dbt, Fivetran, and Tableau (or similar). Strong knowledge of SQL and experience in testing databases and data warehouses with dbt (e.g., Snowflake – preferred, Redshift, BigQuery). Strong knowledge of workload automation platforms like Apache Airflow and dbt (Data Build Tool). Familiarity with CI/CD tools (e.g. Azure DevOps – preferred, Jenkins) and experience integrating automated tests into pipelines. Experience with cloud platforms (AWS – preferred, GCP, Azure) for testing and
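The listing above centres on automated testing of ETL/ELT outputs and validating analytical models. A minimal, illustrative sketch of such checks is below; an in-memory SQLite database stands in for a warehouse like Snowflake, and every table and column name is a hypothetical assumption, not anything from the listing:

```python
import sqlite3

# In-memory SQLite as a stand-in for a cloud warehouse; schema is illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, amount REAL);
    INSERT INTO raw_orders VALUES (1, 10.0), (2, 25.5), (3, NULL);

    -- Transformed model: filter out rows with missing amounts.
    CREATE TABLE stg_orders AS
    SELECT order_id, amount FROM raw_orders WHERE amount IS NOT NULL;
""")

def check_not_null(table: str, column: str) -> bool:
    """Pass only if no row in `column` is NULL (a dbt-style not_null test)."""
    bad = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]
    return bad == 0

def check_unique(table: str, column: str) -> bool:
    """Pass only if `column` contains no duplicate values (a unique test)."""
    dupes = conn.execute(
        f"SELECT COUNT(*) FROM (SELECT {column} FROM {table} "
        f"GROUP BY {column} HAVING COUNT(*) > 1)"
    ).fetchone()[0]
    return dupes == 0

results = {
    "stg_orders.amount not_null": check_not_null("stg_orders", "amount"),
    "stg_orders.order_id unique": check_unique("stg_orders", "order_id"),
}
print(results)
```

In a real pipeline these assertions would typically be declared as dbt schema tests and wired into a CI/CD tool such as Azure DevOps or Jenkins, failing the build when a transformation regresses.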
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
are now looking for a hands-on Data Engineer to join their talented delivery team. This is a fantastic opportunity to work with a modern data stack - including Snowflake, dbt, AWS, Python, and SQL - on impactful projects that power reporting, automation, predictive analytics and Artificial Intelligence. This role is fully remote, and is therefore open to candidates across the UK. … a self-starter mentality, who wants to be part of a company that is growing and maturing whilst continually learning. Key responsibilities: Designing and building robust data pipelines using SQL, dbt, and Python within Snowflake Developing scalable, testable, and maintainable code Collaborating with analytics, product, and client teams to deliver high-quality data solutions Supporting the development of data products like … tooling Contributing to internal best practices and agile delivery processes Experience required: Proven experience as a Data Engineer or Analytics Engineer Strong experience with Snowflake Hands-on experience with dbt Proficiency in Python and SQL Solid understanding of Git and development lifecycle best practices Experience integrating APIs or working with event/log data streams Ability to manage multiple priorities
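The responsibilities above include writing scalable, testable, maintainable transformation code in Python alongside SQL and dbt. A small sketch of what "testable" means in this context: a pure function with no warehouse dependency, so it can be unit-tested in isolation. The record shapes and field names here are illustrative assumptions:

```python
from datetime import date

def latest_record_per_key(records, key="customer_id", updated="updated_at"):
    """Keep only the most recent record per key - a common deduplication
    step before loading data into a curated layer. Pure function: easy to
    unit-test without any warehouse connection."""
    latest = {}
    for rec in records:
        k = rec[key]
        if k not in latest or rec[updated] > latest[k][updated]:
            latest[k] = rec
    return sorted(latest.values(), key=lambda r: r[key])

# Hypothetical input: two versions of customer 1, one of customer 2.
rows = [
    {"customer_id": 1, "updated_at": date(2024, 1, 1), "tier": "bronze"},
    {"customer_id": 1, "updated_at": date(2024, 3, 1), "tier": "silver"},
    {"customer_id": 2, "updated_at": date(2024, 2, 1), "tier": "gold"},
]
print(latest_record_per_key(rows))
```

The same logic could equally be expressed as a SQL window function (`ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY updated_at DESC)`) inside a dbt model; keeping it as a plain function is one way to make the behaviour unit-testable.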
of experience as a data engineer with experience of GCP and good Cloud understanding Passion for software and data engineering, adopting the mindset of a curious engineer Experience of dbt, SQL, Python, Java, SAS or other open-source technologies used for analytics Ability to understand business requirements and create business ready solutions Show well-developed interpersonal, communication and influencing skills
Leeds, England, United Kingdom Hybrid / WFH Options
Jet2
demonstrate passion for data to inspire your team. Requirements: Experience in developing data pipelines from various sources, including databases, flat files, APIs, and event-driven data feeds; experience with dbt is preferred. Experience with data warehouse design and modelling, including conceptual, logical, physical, and dimensional modelling techniques. Knowledge of cloud data warehouse technologies such as Snowflake (preferred), Google BigQuery, AWS
Leeds, England, United Kingdom Hybrid / WFH Options
Jet2
As our new Senior Data and Analytics Engineer, you'll work as part of a multi-disciplinary, agile, data delivery team focused on the delivery of innovative, robust, and efficient data
Experience working with Cloud technologies, preferably GCP Experience of building full ETL/ELT pipelines Building data pipelines for streaming, batch and event-driven solutions Strong understanding of Python, dbt and SQL for data transformation Leading others within a data environment, using a data-driven approach with strong data literacy and analysis Experience mentoring and coaching engineers to build a high
data solution as well as advancing it to the next level. We have created an initial gem of a Data Lake and Lakehouse (Azure Data Lake, ADF, Databricks, Airflow, dbt) to enable Business Intelligence and Data Analytics (Superset, RStudio Connect). Our Data Lake is fully metadata driven, cost efficient, documented, and reproducible. We need our one-source-of-truth … data (PySpark) (Required) Experience building deployment pipelines (e.g. Azure Pipelines) (Required) Deployment of web apps using Kubernetes (preferably ArgoCD & Helm) (Preferred) Experience working on Analytics and Data Science enablement (dbt, DS deployments) (Preferred) Experience in MDM, Data Cataloguing and Lineage optimisation A strong preference for simplicity and transparency over complexity Best-practice and detail focussed to enable scalability in future
Leeds, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
a growing data team at the early stages of its evolution, where you'll have real ownership, shape foundational data assets, and work with cutting-edge technologies like Snowflake, dbt, Microsoft Azure, Power BI and next-gen BI platforms. Responsibilities Design and maintain clean, modular data models in Snowflake, applying analytics engineering best practices using dbt. Build curated data … data roles with a strong focus on data modelling and transformation Proficiency in SQL and cloud data platforms such as Snowflake or Azure Synapse Analytics Hands-on experience with dbt for developing, testing, and documenting transformations Understanding of modern data stack principles, including layered modelling, modular SQL, and Git-based workflows Familiarity with dimensional modelling, OBT, activity schemas & data warehouse
or alternatively, give me a call on (phone number removed). Keywords: Data Engineering, Data Engineer, Snowflake, ETL, ELT, ADF, Data Factory, Synapse Analytics, SSIS, Migration, Pipeline, Python, Spark, dbt, Azure, SQL, Leeds
Azure, Fabric, Dataverse, Synapse, Data Lake, Purview. Deep expertise in data engineering tools and practices, including Python, SQL, and modern ETL/ELT frameworks (e.g., Azure Data Factory, Talend, dbt). Experience designing and implementing scalable data pipelines and integration patterns across structured and unstructured data sources (e.g., Azure SQL, MySQL, MongoDB). Familiarity with data governance, metadata management, and
to build robust platforms and create intelligent solutions. If you thrive in fast-paced environments, enjoy variety in your work, and want to build meaningful data products using Snowflake, dbt, Python, and SQL, this role is for you. Requirements: Experience designing and building scalable data pipelines using Snowflake, dbt, and Python Proven experience as a Data Engineer or Analytics Engineer
key role in designing and implementing a brand-new Snowflake Data Warehouse hosted on Microsoft Azure, with a modern tech stack that includes: Snowflake for scalable cloud data warehousing dbt for data transformation and modelling Azure Data Factory for orchestration Power BI for data visualisation and reporting. Design, build, and maintain robust data pipelines and data models in Snowflake Integrate … or Data Development role Strong knowledge of Azure services, especially Azure Data Factory Hands-on experience (or strong understanding) of Snowflake, including pipeline development and data modelling Familiarity with dbt or similar transformation tools Experience working with both structured and unstructured data Understanding of data governance, security best practices, and compliance Excellent communication skills and stakeholder engagement Snowflake or Power
Leeds, England, United Kingdom Hybrid / WFH Options
Financial Conduct Authority
Solid understanding of data lifecycle management, including retention policies and governance Desirable Knowledge of industry trends, standards and best practices in data engineering and data management Certifications such as dbt Certified Developer, AWS Certified Data Engineer – Associate, Azure Data Engineer Associate, Google Professional Data Engineer, CDMP, etc Experience with cloud data platforms, governance and observability to enhance reliability and performance
analysts, and business stakeholders. About the Platform: This greenfield initiative is focused on building a next-gen data ecosystem with a tech stack including: Snowflake for cloud data warehousing dbt for transformation and modelling Azure for cloud infrastructure and orchestration Fivetran for automated data ingestion Power BI and other modern BI tools for reporting and visualisation What You’ll Do … Design and implement scalable, well-documented data models in Snowflake using dbt. Build curated, reusable data layers that support consistent KPIs and enable self-service analytics Collaborate with Power BI developers to deliver insightful, high-performance dashboards Work with Data Engineers to optimise data ingestion and orchestration pipelines using Azure Data Factory and Fivetran Apply best practices in dimensional modelling … layered architecture, and data quality What We’re Looking For: Strong experience in data modelling and SQL Hands-on experience with dbt and cloud data platforms like Snowflake or Azure Synapse Analytics Solid understanding of modern data stack principles, including layered modelling and data warehouse design Excellent communication skills and the ability to work with stakeholders across technical and non
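The layered-modelling pattern this listing describes (raw ingestion, a staging layer for light cleaning, then curated marts behind consistent KPIs) can be sketched in miniature. SQLite stands in for Snowflake, and the tables, columns, and layer names below are hypothetical; in practice each layer would be a dbt model:

```python
import sqlite3

# In-memory SQLite as a stand-in for Snowflake; schema is illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Raw layer: data as landed (e.g. by a tool like Fivetran), messy casing.
    CREATE TABLE raw_sales (id INTEGER, region TEXT, amount REAL);
    INSERT INTO raw_sales VALUES
        (1, ' north ', 100.0), (2, 'SOUTH', 50.0), (3, 'north', 25.0);

    -- Staging layer: light cleaning and standardisation only.
    CREATE VIEW stg_sales AS
    SELECT id, LOWER(TRIM(region)) AS region, amount FROM raw_sales;

    -- Curated mart layer: a reusable aggregate behind a consistent KPI,
    -- ready for self-service BI tools such as Power BI.
    CREATE VIEW mart_sales_by_region AS
    SELECT region, SUM(amount) AS total_amount
    FROM stg_sales
    GROUP BY region;
""")

rows_out = conn.execute(
    "SELECT region, total_amount FROM mart_sales_by_region ORDER BY region"
).fetchall()
print(rows_out)  # [('north', 125.0), ('south', 50.0)]
```

The design point is that BI dashboards read only the mart layer, so a KPI like "total sales by region" is defined once and stays consistent across consumers.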
Leeds, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
this will be from data engineers, governance and risk analysts through to our CFO to deliver actionable reporting. This would primarily service the finance team, working with tools including dbt, Snowflake, Power BI and no doubt a little Excel too. You will also act as an analytics champion - working with the wider analyst community to enable and upskill them to deliver … to build efficient joins, aggregations, and know your views from your tables. You’ll use Snowflake here but all SQL backgrounds are welcome. If you are already familiar with dbt that’s great too, but not essential – the more exposure you’ve had to it the better. Familiarity with Power BI or equivalent tools is essential. Stakeholder management, business analysis and
Data Warehouse. They are building a modern, AI-ready data platform, which allows for better decision-making for both themselves and their customers, with a tech stack including Snowflake, dbt, Microsoft Azure and Power BI. You'll join their newly established Data Team, based in their modern office space in Leeds - you'll spend 4 days per week in the
join their expanding data team. They are looking for someone who can work in their Leeds office 3 days per week. You will work with cutting-edge technologies Snowflake, dbt, Azure and Power BI and have the chance to take ownership of projects and shape data assets. As an Analytics Engineer, you'll collaborate with teams to design and deliver … BI/data roles with a focus on data modelling and transformation. Proficiency in SQL and cloud data platforms like Snowflake or Azure Synapse Analytics. Hands-on experience with dbt for developing, testing, and documenting transformations. Strong communication skills Ability to build scalable, high-quality systems. Please note: this is a role for UK residents only. This role does not offer
solutions that unlock business value Experience in promoting and maximising the value of agile ways of working Preferred: Basic knowledge of other tooling such as Tableau, Atlan, GCP and dbt What's in it for you Our goal is to celebrate our people, their lives and everything in-between. We aim to create a culture that empowers everyone to bring
techniques and experience with different frameworks, as well as extensive experience in handling and developing high-volume heterogeneous data (both batch and stream), preferably with hands-on development in dbt, and a solid understanding of cloud data storage technologies and data quality Demonstrated ability to align and analyse requirements, drive design solutions, break down work into stories and tasks, support
Role: Data Architect Job Type: Contract Skills: GCP, Data Products, Data Mesh, ETL, EDW Composer, BigQuery, DataProc, dbt More about the role: Able to define Data Product Definition, Architecture, and Roadmap. Define Target Model backlog in collaboration with client stakeholders and lead the delivery of a complete Architectural Design. Ensure designs and guidance are aligned with relevant strategies, roadmaps, and