Person: The ideal candidate has 1-2 years' experience in ecommerce or marketing analytics, strong SQL skills, and familiarity with metrics like CAC, LTV, ROAS, and attribution. Exposure to dbt, Snowflake, Shopify, or GA4 is a plus. This role suits someone who is:
• Hands-on and proactive, with strong commercial awareness.
• Comfortable juggling multiple projects in a fast-paced environment.
London, South East, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
…and performance
• Ensure compliance with data governance policies, including RBAC, encryption, and audit logging
• Collaborate with data engineering teams to streamline provisioning and reduce overhead

Stakeholder Engagement:
• Liaise with DBT's platform, data, and finance teams to align technical decisions with business objectives
• Present cost-saving recommendations and technical roadmaps to senior stakeholders
• Document architecture decisions and maintain technical artefacts
Responsibilities:
• Design, develop, and maintain robust and scalable ETL/ELT pipelines using Snowflake as the data warehouse.
• Implement data transformations and build analytical data models using dbt (data build tool).
• Develop and optimize complex SQL queries for data extraction, transformation, and loading in Snowflake.
• Orchestrate and schedule data workflows using Apache Airflow, including developing custom Python operators and … Git. (A minimal operator sketch follows this listing.)
• Monitor data pipelines, troubleshoot issues, and optimize performance for efficiency and cost-effectiveness.

Qualifications:
• Strong proficiency in SQL, particularly with Snowflake's features and functionalities.
• Extensive experience with dbt for data modeling, transformations, testing, and documentation.
• Solid experience with Apache Airflow for workflow orchestration and scheduling.
• Proficiency in Python for data manipulation, scripting, and automation.
• Experience with cloud platforms …
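To make the Airflow bullet above concrete, here is a minimal sketch of a daily DAG with a custom Python operator. This is illustrative only, not taken from the posting: the DAG id, table, and stage path are hypothetical, and the load logic is stubbed where a real operator would delegate to a Snowflake hook or connector.

```python
# Hedged sketch: a daily Airflow DAG with a hypothetical custom operator.
from datetime import datetime

from airflow import DAG
from airflow.models.baseoperator import BaseOperator


class SnowflakeLoadOperator(BaseOperator):
    """Hypothetical operator that would copy staged files into a Snowflake table."""

    def __init__(self, table: str, stage_path: str, **kwargs):
        super().__init__(**kwargs)
        self.table = table
        self.stage_path = stage_path

    def execute(self, context):
        # A real implementation would issue COPY INTO via a Snowflake hook.
        self.log.info("Loading %s into %s", self.stage_path, self.table)


with DAG(
    dag_id="orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_orders = SnowflakeLoadOperator(
        task_id="load_orders",
        table="raw.orders",
        stage_path="@raw_stage/orders/",
    )
```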
• Writing papers, briefings and responses to inquiries and consultations
• Preparing written summaries and analysis of new government policy announcements, new standards and regulations
• Engaging regularly with officials in DESNZ, DBT, Defra and regulators, as well as other external bodies
• Representing techUK and the tech industry at events, meetings and conferences across the UK and internationally
• Supporting members …
…Compose
Nice-to-haves (a mix of the following, but not all essential):
* Experience working with Google Cloud products for ML, including Vertex AI Pipelines & BigQuery
* Experience with dbt (data build tool)
* Previous experience working with advertising data
* Experience with FastAPI or other Python API frameworks
* Experience with dashboarding tools such as Looker Studio, Tableau or PowerBI
* Experience working on a …
…role where you'll be at the heart of data-driven decisions. Day to day you will be transforming raw data into clean, reliable and performant data models using dbt within Google Cloud Platform (GCP). Your primary goal is to empower our clients' business users by making data accessible, understandable and ready for reporting, dashboarding and ad-hoc analysis. We … and business intelligence. Experience in the mobile/telecoms industry would be a bonus!

Key outputs for the role
• Design, build, and maintain scalable and trustworthy data models in dbt, making use of Kimball Dimensional and One Big Table (OBT) methodologies.
• Translate business requirements from stakeholders into robust, well-documented and tested dbt models.
• Develop and own workflows within Google … quality, optimised SQL for data transformation and analysis.
• Develop and maintain scalable data pipelines within the Google Cloud Platform, ensuring efficient and cost-effective data transformations.
• Take ownership of dbt project health, monitoring daily runs and proactively resolving any data, model, or scheduling issues in collaboration with other project owners. (A monitoring sketch follows this listing.)
• Use version control (Git) and CI/CD to manage …
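The project-health bullet above lends itself to a short example. The sketch below is an assumption rather than anything from the ad: it uses dbt's programmatic entry point (available in dbt-core 1.5+) to run a daily `dbt build` and fail loudly when models or tests break; the project directory and alerting behaviour are placeholders.

```python
# Hedged sketch: run `dbt build` programmatically as a daily health check.
# Requires dbt-core >= 1.5; the project path and alerting are hypothetical.
from dbt.cli.main import dbtRunner

runner = dbtRunner()
res = runner.invoke(["build", "--project-dir", "/opt/analytics"])

if not res.success:
    # A real check would alert the project owners (Slack, email, etc.)
    # before exiting non-zero for the scheduler to pick up.
    raise SystemExit("dbt build failed: inspect target/run_results.json")

print("dbt build succeeded")
```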
…will build, and lead the development of scalable data pipelines and platforms on AWS. The ideal candidate will have deep expertise in PySpark, Glue, Athena, AWS Lake Formation, data modelling, dbt, Airflow and Docker, and will be responsible for driving best practices in data engineering, governance, and DevOps.

Key Responsibilities:
• Lead the design and implementation of scalable, secure, and high-performance data … related field.
• 10+ years of experience in data engineering.
• Strong hands-on experience with AWS services: S3, Glue, Lake Formation, Athena, Redshift, Lambda, IAM, CloudWatch.
• Proficiency in PySpark, Python, dbt, Airflow, Docker and SQL. (A PySpark sketch follows this listing.)
• Deep understanding of data modelling techniques and best practices.
• Experience with CI/CD tools and version control systems like Git.
• Familiarity with data governance, security …
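As a hedged illustration of the PySpark work described above (not taken from the posting): read raw JSON from S3, derive a daily aggregate, and write partitioned Parquet that Athena can query. Bucket names, columns, and the transformation are hypothetical; the same script shape runs under AWS Glue with a Glue-provided SparkSession.

```python
# Illustrative PySpark job: raw S3 JSON -> partitioned Parquet for Athena.
# All paths and columns are placeholders, not details from the role.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily").getOrCreate()

# Read the raw event data landed in S3.
orders = spark.read.json("s3://raw-bucket/orders/")

# Derive a daily revenue aggregate per country.
daily = (
    orders
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date", "country")
    .agg(F.sum("amount").alias("revenue"))
)

# Write partitioned Parquet so Athena can prune by date.
(daily.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://curated-bucket/orders_daily/"))
```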
Mandatory Skills Required:
• Data Architecture
• Data Migration
• Data Modeling
• Snowflake Designer/Developer
• dbt (Data Build Tool)
• ETL Design
• AWS services, including S3, ETL/EMR, Security, Lambda, etc.
• StreamSets
• Python Programming
• Leadership and Team Handling
• Strong Communication and Collaboration Skills

Role: Snowflake Architect
Location: Basildon, UK / Dublin, Ireland (work from client office 5 days weekly)

Responsibilities:
• Design, develop … science, Engineering, or a related field.
• 5+ years of experience in data engineering, with a strong focus on Snowflake and AWS.
• Proficiency in SQL, Python, and ETL tools (StreamSets, dbt, etc.)
• Hands-on experience with Oracle RDBMS
• Data migration experience to Snowflake (a load sketch follows this listing)
• Experience with AWS services such as S3, Lambda, Redshift, and Glue.
• Strong understanding of data warehousing concepts and … as availability, scalability, operability, and maintainability

Evaluation will be done according to the below (Area of Assessment: Priority):
• Data Architect: Must Have
• Data Migration: Must Have
• Data Modeling: Must Have
• dbt Knowledge: Should Have
• ETL Design: Must Have
• Snowflake Designer/Developer: Must Have
• AWS (S3, ETL/EMR, Security, Lambda, etc.): Must Have
• Leadership/Team Handling: Must Have
• Communication: …
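To ground the Snowflake migration work this listing mentions, here is a hedged sketch of a bulk load with the Snowflake Python connector: files exported from Oracle and already PUT to a named stage are copied into a raw table. The account details, object names, and file format are placeholders, not details from the role.

```python
# Hedged sketch: COPY staged migration files into Snowflake.
# Credentials and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myaccount",
    user="etl_user",
    password="***",  # prefer key-pair auth or a secrets manager in practice
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

with conn.cursor() as cur:
    # Files were previously exported from Oracle and PUT to this stage.
    cur.execute("""
        COPY INTO raw.customers
        FROM @migration_stage/customers/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)

conn.close()
```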
Manchester, North West, United Kingdom Hybrid / WFH Options
Birchwell Associates Ltd
…DataOps culture of automation, reliability, and agility.

Key Responsibilities
• Design, build, and optimise data pipelines across a modern data platform.
• Ingest, clean, and transform data using tools such as dbt, Snowflake, and Airflow.
• Collaborate with cross-functional teams to deliver data products aligned to business priorities.
• Develop scalable data models that support BI and analytics platforms including Tableau and … including ISO 27001, BS 10012, ISO 50001, and ISO 22301.

Skills & Experience
• Strong SQL expertise, with the ability to write and optimise complex queries.
• Hands-on experience with dbt (including testing and layered modelling).
• Practical knowledge of Snowflake for loading, transforming, and exporting datasets.
• Experience building and managing Airflow DAGs for pipeline orchestration. (An orchestration sketch follows this listing.)
• Understanding of BI tool requirements …
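The dbt and Airflow requirements above pair naturally; the sketch below shows one hedged way a layered dbt project might be orchestrated as an Airflow DAG, running and testing staging models before marts. The DAG id, selectors, and project path are assumptions rather than anything from the ad.

```python
# Hedged sketch: orchestrate a layered dbt project (staging -> marts).
# Selectors and paths are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_layered_build",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    staging = BashOperator(
        task_id="dbt_staging",
        bash_command="dbt build --select staging --project-dir /opt/dbt",
    )
    marts = BashOperator(
        task_id="dbt_marts",
        bash_command="dbt build --select marts --project-dir /opt/dbt",
    )

    # Marts depend on the staging layer having built and tested cleanly.
    staging >> marts
```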
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
Role: Principal Data Engineer (dbt lean)
Salary: £70,000 - £90,000 (dependent on experience); contracting and permanent hires considered
Location: all locations considered; we have offices in Liverpool and Newbury (UK) for hybrid working, and Sibenik (Croatia)

Let us introduce ourselves. We're Intuita, a new kind of data partner. We're a collective of data-driven people ready … developing scalable cloud solutions in Azure, GCP or AWS. Data modelling using Kimball, 3NF or dimensional methodologies. Analytics engineering lean, with experience in BigQuery (GCP); data modelling in dbt and mobile/telecoms industry experience would be beneficial, with deep dbt experience being highly beneficial. Depth of knowledge and understanding across core orchestration tools and CI/CD pipelines …
…standards and best practices. Key responsibilities include:
• Build and maintain ELT pipelines
• Take full ownership of data ingestion
• Support data modelling and architecture within Snowflake
• Own and evolve the dbt layer, including governance and access controls
• Collaborate across analytics, product, and engineering teams
• Contribute to platform improvements, automation, and optimisation

YOUR SKILLS AND EXPERIENCE: A successful Senior Data Engineer will … bring:
• Strong SQL skills
• Experience with dbt in a production environment
• Snowflake experience is desirable
• Exposure to AWS
• Confident mentoring peers and contributing to a collaborative, high-impact team
• Experience working in fast-paced, agile environments with modern data workflows

THE BENEFITS: You will receive a salary of up to £55,000 depending on experience, along with a comprehensive benefits …
…and optimise existing data pipelines, reducing manual effort and improving reliability
• Lead on structuring data to align with performance, delivery, and leadership metrics
• Work with cloud-native tooling (AWS, dbt, Airflow) to build scalable data infrastructure
• Collaborate with digital, performance, and portfolio teams to drive cross-functional delivery
• Champion data quality, completeness, and usability at every stage of the data … Copilot and ChatGPT to enhance delivery

Skills & Experience - Senior Data Engineer
• Strong hands-on experience building and maintaining automated data pipelines
• Advanced proficiency in Python and SQL
• Experience with dbt and strong understanding of data modelling (e.g., star schema)
• Proficient with orchestration tools such as Airflow
• Comfortable working with AWS services (Glue, S3, etc.) or similar cloud platforms
• Experience with …
…environments, managing multiple priorities and meeting deadlines.
• Proficiency in SQL (BigQuery), Python, Git/GitHub, and preferably Looker (Tableau or PowerBI are acceptable as well)
• Above-average knowledge of dbt, Docker, GCP, and Airflow
• Experience in the cryptocurrency industry, fintech sector, or platform-type businesses is preferred but not required

Personal Attributes
• Analytical mindset with a passion for data-driven … principles thinking to drive efficient solutions
• Highly ambitious, with a results-oriented attitude and a continuous-improvement mindset

Technologies you will work with
• Python
• SQL (BigQuery)
• GCP
• EPPO for experimentation
• dbt, Docker, Cloud Run/Kubernetes, and Airflow for data orchestration and data pipelines
• Looker for data visualization
• Git and GitHub for code collaboration
• Ability to leverage AI tools such as Cursor …
…assessment, product development, customer growth, and operational efficiency through data-driven insights, supporting our expansion efforts.
• Build & Scale: Own and evolve our modern data stack (centred on Snowflake and dbt), ensuring our analytics capabilities scale with Wayflyer's rapid growth and diversification.
• Lead & Mentor: Develop a high-performing team, fostering a culture of technical excellence, collaboration, and continuous learning within … prioritising initiatives that deliver maximum value to Wayflyer.
• Technical Oversight & Data Platform: Oversee the architecture, development, and maintenance of our analytics infrastructure, including our Snowflake data warehouse and our dbt-centric transformation layer. Ensure data quality, governance, scalability, and performance.
• Insight Generation & Delivery: Partner closely with stakeholders across Product, Risk, Sales, Marketing, and Operations to understand their needs, translate them …
…TO THE TEAM
You bring a deep understanding of how to transform raw data into clean, reliable and analysis-ready datasets. You're fluent in SQL and experienced with dbt, with a strong appreciation for scalable, well-structured models that encode business logic clearly and consistently. You think analytically and work methodically, ensuring your transformations enable reporting, analytics and activation … use cases with confidence.

YOUR NEXT CHALLENGE
As an Analytics Engineer at Medialab, you will:
• Design, build and maintain data models within our dbt codebase, transforming raw data into clean, tested, and documented datasets ready for analytics and activation.
• Onboard new data feeds, working closely with business and technical stakeholders to understand requirements and ensure smooth integration into our pipelines.
…
• Support users and team members, sharing knowledge and solving problems effectively.

Must Have Skills
• SQL – proficient; confident in writing performant, well-structured SQL with a full understanding of ANSI standards.
• dbt – strong working knowledge; experienced in building, testing, and documenting models as the central tool for transformation and semantic modelling.
• Python – some experience; able to use and adapt scripts and tooling …