Response Informatics is hiring for Data Architect Role: Technical/Data Architect Location: United Kingdom Responsibilities: Data Architecture & Modeling: Design end-to-end data architecture for loyalty platforms, ensuring seamless integration with CRMs, CDPs, DMPs, analytics platforms, and MarTech stacks.
a modern data platform, including hiring a data manager to oversee governance and integrations, a data engineer, and most recently, a data scientist. With dbt introduced as part of a broader tech overhaul, the team is now looking for a dedicated Analytics Engineer to take ownership of the dbt layer … a reliable and scalable production environment on top of a growing set of raw data sources. Tech Stack and Environment Data Warehouse: Redshift Modelling: dbt (recently onboarded, still early-stage) ETL & Integration: Fivetran, with plans to scale Architecture: Medallion Dashboarding & Visualisation: Mode Analytics (SQL-based) Ideal Background: Strong SQL skills … with hands-on experience in dbt and modern cloud data warehouses (e.g. Redshift, Snowflake, BigQuery) Familiarity with ELT tools (e.g. Fivetran), BI tools (e.g. Looker, Tableau, Power BI), and Git-based workflows Solid understanding of data modelling, warehousing principles, and analytics best practices Experience working cross-functionally with strong communication skills
and deployment. Key Responsibilities Design, build and maintain cloud-based data platform infrastructure (AWS, Azure, or GCP) Deploy and manage modern data tools (e.g. DBT, Airflow, Snowflake) Implement Infrastructure as Code using Terraform Automate deployment pipelines using CI/CD tools, preferably Azure Pipelines Ensure platform stability, scalability, and performance … GCP) Infrastructure as Code expertise using Terraform CI/CD experience (Azure Pipelines preferred) Docker and Linux tooling Exposure to modern data tools (e.g. DBT, Airflow, Snowflake, Redshift) Agile delivery environment experience
the development of test automation frameworks that support data ingestion, transformation (ETL/ELT), and analytical models. Work hands-on with tools like Snowflake, dbt, Fivetran, and Tableau, alongside SQL and Python. Design and implement scalable, agnostic testing frameworks for use across agile delivery teams. Promote best practices including Test … and modelling pipelines. Strong SQL and Python skills – essential for building and validating test cases. Proven experience with Snowflake (or similar cloud data platforms), dbt, Fivetran, and Airflow. Knowledge of automation frameworks such as Cucumber, Gherkin, TestNG. Experience integrating test automation into large-scale delivery functions. Experience: Proven track record
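The listing above centres on framework-style data validation in SQL and Python. As a minimal, hedged sketch of the kind of reusable check such a framework might expose (all names, rules, and sample rows here are hypothetical, not taken from the listing):

```python
# Illustrative data-quality check builder; names are hypothetical.
from dataclasses import dataclass
from typing import Callable

@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str

def not_null(column: str) -> Callable[[list[dict]], CheckResult]:
    """Build a check asserting no row has a NULL in `column`."""
    def check(rows: list[dict]) -> CheckResult:
        bad = sum(1 for r in rows if r.get(column) is None)
        return CheckResult(f"not_null:{column}", bad == 0, f"{bad} null rows")
    return check

def run_checks(rows: list[dict], checks) -> list[CheckResult]:
    # Apply each check to the same result set and collect outcomes.
    return [c(rows) for c in checks]

# Toy result set standing in for rows fetched from a warehouse query.
rows = [
    {"order_id": 1, "amount": 9.99},
    {"order_id": 2, "amount": None},
]
results = run_checks(rows, [not_null("order_id"), not_null("amount")])
for r in results:
    print(r.name, r.passed)
```

In a real framework the rows would come from a warehouse query and the checks would be registered per table; the pattern of composable, named checks is the point.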
Greater Leeds Area, United Kingdom Hybrid / WFH Options
Corecom Consulting
including requirements impact analysis, platform selection, technical architecture design, application design and development, testing, and deployment. Leveraging your proficiency in tools such as Snowflake, DBT, Glue, and Airflow, you will help define the technical strategy and ensure scalable, high-performing data architecture. Your Profile Essential skills/knowledge/experience … Extensive experience as Solution Designer. Currently working on Data Lake (Big Data) based projects. Experience in Snowflake, DBT, Glue, Airflow. Ability to demonstrate structured consideration of multiple options, comparing competing technologies, and different functional design approaches e.g., Produce a KDD. Ability to present and discuss competing options and make recommendations
Hiring: Analytics Engineer Hybrid 80% remote Hey data wizards! Right now, we're on the lookout for an Analytics Engineer with strong knowledge of DBT to help us shape the future of data for one of our top clients. What You'll Be Doing: Building scalable, documented, modular DBT models … in Snowflake) that power decision-making. Owning the DBT project structure - from staging layers to marts, with testing and performance optimization baked in. Driving SQL performance, metadata practices, and clean data pipelines with tools like Airflow. Collaborating with analysts, engineers, and business stakeholders to keep data aligned with real … world needs. Helping shape and improve data governance practices that truly scale. What You Bring: 3+ years building end-to-end DBT projects (macros, testing, documentation - you know the drill). Expert-level SQL + hands-on experience optimizing queries in Snowflake. Familiarity with orchestration tools like Airflow, AWS
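The staging-to-marts layering this listing describes is a common dbt convention: staging models clean raw source tables, and mart models aggregate them for consumption. A minimal illustrative sketch of that structure follows; the file, model, and column names are hypothetical, not from the listing, and dbt model files are templated fragments rather than standalone runnable SQL:

```sql
-- models/staging/stg_orders.sql — hypothetical staging model:
-- rename and type-cast columns from the raw source table.
with source as (
    select * from {{ source('shop', 'orders') }}
)
select
    id                             as order_id,
    customer_id,
    cast(created_at as date)       as order_date,
    cast(amount as decimal(12, 2)) as order_amount
from source

-- models/marts/fct_daily_revenue.sql — hypothetical mart model:
-- aggregate the staging layer for reporting.
select
    order_date,
    count(*)          as order_count,
    sum(order_amount) as revenue
from {{ ref('stg_orders') }}
group by 1
```

In a real project each model lives in its own file, with schema tests (e.g. `not_null`, `unique`) declared alongside in YAML, which is what "testing baked in" usually means in a dbt context.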
Data/Analytics Engineer. You’ll transform raw data into meaningful, high-quality datasets that power applications across the company. You’ll build scalable dbt models on top of Databricks and PostgreSQL, partnering with data and business stakeholders to define metrics, track performance, and ensure data quality. This role would … years professional experience as a Data Engineer or Analytics Engineer Strong proficiency in SQL, with proven experience writing complex, performant queries Experience working with DBT in production Experience working with Databricks and/or PostgreSQL Solid understanding of data testing, observability, and data quality assurance Familiarity with Git and modern
ecosystem: We primarily work in Python, Go, and SQL. Our code (tracked via GitHub) deploys to GCP (BigQuery, Airflow/Composer, GKE, Cloud Run), dbt Cloud and Azure via Terraform. We recognise this is a broad list; if you're not deeply familiar with some of these tools, we'd like you … otherwise), particularly for applications in support of customer personalisation Surfacing analytical products into data visualisation platforms (we use PowerBI) via semantic models (ideally within dbt) Improving engineering excellence (enforcement via CI/CD pipelines, KEDBs, SOPs, best practices) Why apply? We believe that diversity fuels innovation. At Pets at Home
BI solution, including requirements impact analysis, platform selection, technical architecture design, application design and development, testing, and deployment. With proficiency in tools such as Snowflake, DBT, Glue, and Airflow, you will help define the technical strategy and ensure scalable, high-performing data architecture. Your Profile Essential skills/knowledge/experience … Extensive experience as Solution Designer. Currently working on Data Lake (Big Data) based projects. Experience in Snowflake, DBT, Glue, Airflow. Ability to demonstrate structured consideration of multiple options, comparing competing technologies, and different functional design approaches e.g., Produce a KDD. Ability to present and discuss competing options and make recommendations
focus on ensuring the technical integrity of our Looker data model (LookML development), managing our Looker instance, and collaborating closely with analytics engineers (using dbt when required). You will play a crucial role in defining and maintaining the company's data model and ensuring its scalability. You will receive … experience using data to provide insights to executive leadership and other departments. Extra Credit - Experience in Looker (LookML development). Extra Credit - Experience with dbt modeling. Extra Credit - Experience with data analysis and building data pipelines with Python. Why CoverWallet At CoverWallet you will have the opportunity to be
platform. This role requires a deep understanding of modern data architectures, cloud technologies, and AI/ML frameworks with hands-on expertise in Snowflake, dbt, and Azure AI/ML services. Key responsibilities: Architect & Design: Define and evolve the Data platform architecture, ensuring scalability, security, and cost-effectiveness Data Strategy …/ML Initiatives: Lead Azure AI/ML adoption, integrating advanced analytics and machine learning capabilities Modern Data Stack: Oversee the implementation of Snowflake, dbt, and other data processing frameworks for optimal performance Stakeholder Collaboration: Engage with senior leadership, data engineering teams, and business stakeholders to align technology with business … best practices Your Profile Essential skills/knowledge/experience: Experience in Enterprise Architecture, Data Engineering, or Cloud Data Platforms Proven expertise in Snowflake, dbt, and Azure AI/ML Strong background in data modelling, ETL/ELT pipelines, and data warehousing Experience designing scalable, event-driven, and serverless architectures
you'll be working on: Building and optimising streaming data pipelines for real-time IoT and time series data Developing scalable solutions using Snowflake, DBT, and vector databases Architecting high-performance data workflows for advanced analytics and machine learning Collaborating with data scientists, analysts, and engineers to unlock new manufacturing … environments. Nice to Have: Manufacturing industry experience (pharma not required - automotive, chemicals, or IoT-heavy industries are a good fit). Familiarity with Snowflake, DBT, or cloud platforms (AWS preferred) - experience with any of these technologies is a plus. This is a high-impact role in a team that's
Senior Analytics Engineer Location: Remote (United Kingdom) About The Company: We have partnered with a company that empowers underwriters to serve their insureds more effectively. They are using advanced data intelligence tools to rebuild the way that underwriters share and …
next level. As Senior Analytics Engineer, you will have sole ownership of analytics engineering at Omaze. You will use industry standard tools and platforms (dbt, Snowflake, ThoughtSpot) to amplify the effectiveness and impact of our (growing) analytics team. You will provide clean, tested, well-documented models, and work with our … data engineer to give the team access to new data sources. 🔧 What You’ll Be Doing Fully own our dbt project, building and maintaining data models in our Snowflake data warehouse, blending and modelling data from multiple sources Work with analysts and engineers to collaboratively design and build new data … experience at a high growth company, ideally with hands-on experience of product/BI analytics A mastery of Postgres SQL Extensive knowledge of dbt, and experience using its non-standard functionality that can elevate the performance and efficacy of dbt projects Excellent communication skills, and experience working with cross-functional teams
London, England, United Kingdom Hybrid / WFH Options
Harnham
pipelines for summarising customer conversations and surfacing insights in CRM Building and deploying production-ready Python code Developing ETL workflows and model pipelines using DBT and Airflow Supporting internal reps with smart tools directly embedded in their workflows Required Experience: Strong hands-on Python (production code, not just prototyping) Experience … with DBT, Airflow, and general ETL/data engineering Exposure to RAG systems, NLP, and GenAI tooling Ability to build ML solutions that scale across large user bases (100k+) Demonstrated commercial impact (e.g., reducing churn, increasing MRR) Experience in fast-paced SaaS/GTM environments Contract Details: Length: 12 months … Rate: £500-£800/day (depending on experience) Location: Fully remote Start Date: ASAP If interested, please send your CV Desired Skills and Experience: DBT, Apache Airflow, CI/CD, RAG (Retrieval-Augmented Generation), ETL Pipelines, SQL
scalable dashboards, models, and reports that help teams, from marketing to manufacturing, make smarter, faster decisions. You'll work with tools like Apache Superset, DBT, and any SQL-compatible storage to deliver trusted analytics solutions across the company. This is a great opportunity for someone with an analytical mindset, strong business … Build reusable, insightful dashboards in Apache Superset that track performance, reveal trends, and support decision-making. Develop and maintain clean, reliable data models in DBT for Redshift. Analyze user behavior, campaign results, customer interactions, and financial flows to identify actionable insights. Contribute to a strong data culture by advocating … practices in data quality, governance, and storytelling. Use Git and follow Git Flow best practices for version control and collaboration on analytics code and DBT models. What You'll Be Working On: Attraction: Campaign analytics, website funnel optimization, performance of landing pages and social media activities. Patient Onboarding & Customer Care
launch a new data-driven platform that will transform their operations. They're now looking for someone to help shape the platform using Python, DBT, Terraform, and Snowflake, building scalable solutions that will enable their data strategy to grow. Please apply if the below sounds exciting to you, but only … if you have strong experience with Snowflake, Python, DBT, and Terraform. Key Responsibilities Build and optimize scalable data pipelines on Snowflake Develop ETL processes using Python and DBT to automate data flows Set up and manage Snowflake environments in the cloud using Terraform Collaborate with the team to establish repeatable … and scalability Work alongside data leaders to establish best practices for cloud data engineering Your Experience & Skills Proven hands-on experience with Snowflake, Python, DBT, and Terraform - please only apply if you have recent experience with these technologies Strong understanding of building and optimizing data pipelines at scale Experience working
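For illustration of the "manage Snowflake environments using Terraform" responsibility this listing names: a minimal configuration sketch, assuming the community Snowflake Terraform provider (resource names and values are hypothetical, not from the listing):

```hcl
# Illustrative only — declares a database and a transform warehouse.
terraform {
  required_providers {
    snowflake = {
      source = "Snowflake-Labs/snowflake"
    }
  }
}

resource "snowflake_database" "analytics" {
  name = "ANALYTICS"
}

resource "snowflake_warehouse" "transforming" {
  name           = "TRANSFORMING"
  warehouse_size = "XSMALL"
  auto_suspend   = 60 # pause after a minute idle to control cost
}
```

Keeping warehouses, databases, and grants in code like this is what makes environments "repeatable": the same configuration can be applied per environment rather than clicked together in the console.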
Analytics Engineer - £70,000 - dbt & Snowflake - London Overview: We are seeking an Analytics Engineer to help drive forward a fascinating greenfield project aiming to reinvent and build out a data platform from scratch. This is a demanding but rewarding project to be part of a team of data experts in … data platform while building data layers within dbt. Requirements: Proven experience in Analytics Engineering/advanced BI/Data roles Expert SQL Hands-on dbt experience Beneficial: Snowflake experience Azure experience Keywords: DBT, MS Business Intelligence, Snowflake, Azure, Analytics, Transformation, ELT, Data
to optimising data pipelines and warehouse performance. You'll contribute to the design and implementation of a new enterprise data platform using Snowflake and DBT, while also supporting and maintaining existing legacy systems. This is a business-facing role, offering exposure to high-impact projects and cross-functional collaboration. Key … deep understanding of its architecture and integration across data ecosystems. Strong proficiency in data modelling (Kimball) and building scalable, production-grade ELT pipelines using DBT to support complex analytical needs. Hands-on expertise in designing and maintaining CI/CD pipelines with tools like GitLab, ensuring robust, automated deployment processes.
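As a sketch of the GitLab CI/CD pipelines this listing describes for a dbt-on-Snowflake project (job names, stages, targets, and the `dbt-snowflake` adapter are assumptions, not taken from the listing):

```yaml
# Hypothetical .gitlab-ci.yml fragment: test on every push,
# deploy models only from the main branch.
stages:
  - test
  - deploy

dbt_test:
  stage: test
  image: python:3.11
  script:
    - pip install dbt-snowflake
    - dbt deps
    - dbt build --target ci   # runs models and their tests

dbt_deploy:
  stage: deploy
  image: python:3.11
  script:
    - pip install dbt-snowflake
    - dbt deps
    - dbt run --target prod
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
```

Warehouse credentials would come from masked CI/CD variables rather than being committed; the branch rule is what makes the deployment "automated" yet gated.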
Data Scientist Salary: £65,000-£75,000 + Benefits Location: West London - Hybrid (3 days p/w in-office) Tech: AWS, Snowflake, Airflow, DBT The Company: Immersum are supporting the growth of a leading PropTech company on a mission to revolutionise how the property sector understands people, places, and data. By … models Strong analytical background with applied stats, EDA, and model validation techniques Confidence working with structured data pipelines and modern tooling (AWS, Snowflake, Airflow, DBT) Curiosity for emerging techniques and an eagerness to learn and innovate Excellent communication skills, especially when simplifying complex findings for non-technical teams Why Join