City of London, London, United Kingdom Hybrid / WFH Options
Hera
digestible visualizations and dashboards. Excellent communication and presentation skills, with the ability to convey insights effectively to non-data colleagues. Working knowledge of ELT/data modelling tools like dbt (bonus).
Benefits: Generous stock options scheme. Enhanced maternity, paternity, adoption, and shared parental leave packages. Time off for IVF/fertility appointments and a compassionate leave policy. 30 days annual …
strategy. What's important to us: Well-connected - You have a strong network in the public sector, particularly in one or more of the central government departments DSIT, DBT, Defra, MoD, and MoJ. You actively expand and nurture your network to uncover early opportunities and convert them into business impact for our clients and Zühlke. Industry-savvy …
strategy, roadmap, and prioritization for a specific digital product, or products, within an engagement. You will collaborate with clients within the Consumer Products industry undertaking major Digital Business Transformation (DBT) engagements to understand their business and end-customer needs, then continually deliver value in fast increment cycles. Your Impact: • Partner with client(s), strategists, experience leads, and enterprise architects, to …
environment. Strong SQL and comfortable working with large datasets in a cloud data warehouse. Data visualisation and storytelling. Strong attention to detail. Familiarity with tools such as Looker Studio, dbt, Git.
team. Nice to haves: Experience with Go, C++, Java, or another systems language. Experience with Docker, Kubernetes, and ML production infrastructure. PyTorch/TensorFlow deep learning experience. Experience using dbt.
Benefits: Flexible Working Hours & Remote-First Environment - Work when and where you're most productive, with flexibility and support. Comprehensive BUPA Health Insurance - Stay covered with top-tier medical care …
a sophisticated & modern tech stack; you’ll be well-versed in SQL and data visualization (Looker preferred but not essential), ideally with prior experience of data modelling (dbt) and/or data warehousing (Snowflake preferred but again not essential). You’ll be responsible for building and maintaining reporting for a range of subsidiary brands in the group.
Person: The ideal candidate has 1-2 years' experience in ecommerce or marketing analytics, strong SQL skills, and familiarity with metrics like CAC, LTV, ROAS, and attribution. Exposure to dbt, Snowflake, Shopify, or GA4 is a plus. This role suits someone who is: Hands-on and proactive with strong commercial awareness. Comfortable juggling multiple projects in a fast-paced …
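The metrics named above (CAC, LTV, ROAS) have standard textbook definitions. A minimal sketch of the arithmetic follows; all input figures are invented for illustration and the LTV formula is one of several common simplifications:

```python
# Toy calculations for common ecommerce marketing metrics.
# All figures are invented for illustration only.

def cac(marketing_spend: float, new_customers: int) -> float:
    """Customer Acquisition Cost: spend divided by customers acquired."""
    return marketing_spend / new_customers

def ltv(avg_order_value: float, orders_per_year: float, years_retained: float) -> float:
    """A simple Lifetime Value estimate: revenue per customer over their lifetime."""
    return avg_order_value * orders_per_year * years_retained

def roas(attributed_revenue: float, ad_spend: float) -> float:
    """Return On Ad Spend: attributed revenue per unit of ad spend."""
    return attributed_revenue / ad_spend

print(cac(10_000, 250))      # 40.0 per customer
print(ltv(60.0, 4, 2.5))     # 600.0
print(roas(30_000, 10_000))  # 3.0
```

A healthy unit economics check is typically LTV well above CAC; here 600.0 vs 40.0.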
Responsibilities: Design, develop, and maintain robust and scalable ETL/ELT pipelines using Snowflake as the data warehouse. Implement data transformations and build analytical data models using dbt (data build tool). Develop and optimize complex SQL queries for data extraction, transformation, and loading in Snowflake. Orchestrate and schedule data workflows using Apache Airflow, including developing custom Python operators and … Git. Monitor data pipelines, troubleshoot issues, and optimize performance for efficiency and cost-effectiveness. Qualifications: Strong proficiency in SQL, particularly with Snowflake's features and functionalities. Extensive experience with dbt for data modeling, transformations, testing, and documentation. Solid experience with Apache Airflow for workflow orchestration and scheduling. Proficiency in Python for data manipulation, scripting, and automation. Experience with cloud platforms …
london (city of london), south east england, united kingdom
HCLTech
Responsibilities: Design, develop, and maintain robust and scalable ETL/ELT pipelines using Snowflake as the data warehouse. Implement data transformations and build analytical data models using dbt (data build tool). Develop and optimize complex SQL queries for data extraction, transformation, and loading in Snowflake. Orchestrate and schedule data workflows using Apache Airflow, including developing custom Python operators and … Git. Monitor data pipelines, troubleshoot issues, and optimize performance for efficiency and cost-effectiveness. Qualifications: Strong proficiency in SQL, particularly with Snowflake's features and functionalities. Extensive experience with dbt for data modeling, transformations, testing, and documentation. Solid experience with Apache Airflow for workflow orchestration and scheduling. Proficiency in Python for data manipulation, scripting, and automation. Experience with cloud platforms …
Horsforth, Leeds, West Yorkshire, England, United Kingdom
TPP (The Phoenix Partnership)
product and service becomes and remains fully compliant with current standards, policies and regulations. • Utilising your broad knowledge of health and social care processes to provide support for the DBT approach to new software production. Applicants should be clinically qualified and able to show a breadth of knowledge in relation to the role, as well as a demonstrable interest in clinical safety …
Writing papers, briefings, and responses to inquiries and consultations. Preparing written summaries and analysis of new government policy announcements, new standards, and regulations. Engaging regularly with officials in DESNZ, DBT, Defra and regulators, as well as other external bodies. Representing techUK and the tech industry at events, meetings, and conferences across the UK and internationally. Supporting members …
Job duties: Write and debug parts of our cloud database system, which incorporates client code, web-based code, the service layers, and the database engine. Debug memory and performance issues in C++, Java, and Python. Examine use cases of customers …
role where you'll be at the heart of data-driven decisions. Day to day you will be transforming raw data into clean, reliable and performant data models using dbt within Google Cloud Platform (GCP). Your primary goal is to empower our clients' business users by making data accessible, understandable and ready for reporting, dashboarding and ad-hoc analysis. We … and business intelligence. Experience in the mobile/telecoms industry would be a bonus!
Key outputs for the role:
• Design, build, and maintain scalable and trustworthy data models in dbt, making use of Kimball Dimensional and One Big Table (OBT) methodologies.
• Translate business requirements from stakeholders into robust, well-documented and tested dbt models.
• Develop and own workflows within Google … quality, optimised SQL for data transformation and analysis.
• Develop and maintain scalable data pipelines within the Google Cloud Platform, ensuring efficient and cost-effective data transformations.
• Take ownership of dbt project health, monitoring daily runs and proactively resolving any data, model, or scheduling issues in collaboration with other project owners.
• Use version control (Git) and CI/CD to manage …
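On the One Big Table (OBT) methodology mentioned above: an OBT model denormalises dimension attributes onto the fact grain so analysts can query a single wide table. A sketch using sqlite3 as a stand-in for the GCP warehouse; the schema and names are invented for illustration:

```python
# One Big Table (OBT) modelling: join dimension attributes onto the
# fact table so downstream users query one wide table. sqlite3 stands
# in for BigQuery here; all names and values are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE fct_sales (sale_id INTEGER, product_id INTEGER, qty INTEGER);
    CREATE TABLE dim_product (product_id INTEGER, name TEXT, category TEXT);
    INSERT INTO fct_sales VALUES (1, 10, 2), (2, 11, 1);
    INSERT INTO dim_product VALUES (10, 'SIM-only plan', 'mobile'),
                                   (11, 'Handset', 'hardware');
""")

# In dbt this SELECT would be a mart model, e.g. models/marts/obt_sales.sql
obt = conn.execute("""
    SELECT s.sale_id, s.qty, p.name, p.category
    FROM fct_sales s
    JOIN dim_product p USING (product_id)
    ORDER BY s.sale_id
""").fetchall()
print(obt)
```

The trade-off versus a Kimball star schema is storage and refresh cost in exchange for simpler, join-free queries for analysts.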
will build, and lead the development of scalable data pipelines and platforms on AWS. The ideal candidate will have deep expertise in PySpark, Glue, Athena, AWS Lake Formation, data modelling, dbt, Airflow, and Docker, and will be responsible for driving best practices in data engineering, governance, and DevOps.
Key Responsibilities:
• Lead the design and implementation of scalable, secure, and high-performance data … related field.
• 10+ years of experience in data engineering.
• Strong hands-on experience with AWS services: S3, Glue, Lake Formation, Athena, Redshift, Lambda, IAM, CloudWatch.
• Proficiency in PySpark, Python, dbt, Airflow, Docker and SQL.
• Deep understanding of data modeling techniques and best practices.
• Experience with CI/CD tools and version control systems like Git.
• Familiarity with data governance, security …
london (city of london), south east england, united kingdom
HCLTech
will build, and lead the development of scalable data pipelines and platforms on AWS. The ideal candidate will have deep expertise in PySpark, Glue, Athena, AWS Lake Formation, data modelling, dbt, Airflow, and Docker, and will be responsible for driving best practices in data engineering, governance, and DevOps.
Key Responsibilities:
• Lead the design and implementation of scalable, secure, and high-performance data … related field.
• 10+ years of experience in data engineering.
• Strong hands-on experience with AWS services: S3, Glue, Lake Formation, Athena, Redshift, Lambda, IAM, CloudWatch.
• Proficiency in PySpark, Python, dbt, Airflow, Docker and SQL.
• Deep understanding of data modeling techniques and best practices.
• Experience with CI/CD tools and version control systems like Git.
• Familiarity with data governance, security …
Manchester, North West, United Kingdom Hybrid / WFH Options
Birchwell Associates Ltd
DataOps culture of automation, reliability, and agility.
Key Responsibilities: Design, build, and optimise data pipelines across a modern data platform. Ingest, clean, and transform data using tools such as dbt, Snowflake, and Airflow. Collaborate with cross-functional teams to deliver data products aligned to business priorities. Develop scalable data models that support BI and analytics platforms including Tableau and … including ISO 27001, BS 10012, ISO 50001, and ISO 22301.
Skills & Experience: Strong SQL expertise, with the ability to write and optimise complex queries. Hands-on experience with dbt (including testing and layered modelling). Practical knowledge of Snowflake for loading, transforming, and exporting datasets. Experience building and managing Airflow DAGs for pipeline orchestration. Understanding of BI tool requirements …
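On the Airflow DAGs mentioned above: an Airflow DAG is a set of tasks plus dependency edges, executed in topological order. The toy below shows the same ordering idea using only Python's standard-library `graphlib`, so it runs without an Airflow install; the task names are invented for illustration:

```python
# An Airflow DAG is tasks plus dependency edges, run in topological
# order. graphlib (stdlib) demonstrates the ordering idea without
# Airflow; task names are invented for illustration.
from graphlib import TopologicalSorter

# Ingestion must run before the dbt transform, which must run before
# the export to the BI layer.
deps = {
    "transform_dbt": {"ingest_raw"},
    "export_to_bi":  {"transform_dbt"},
}
order = list(TopologicalSorter(deps).static_order())
print(order)  # ['ingest_raw', 'transform_dbt', 'export_to_bi']
```

In real Airflow the same dependencies would be expressed with operators and the `>>` bit-shift syntax (e.g. `ingest >> transform >> export`) inside a `DAG` context, with the scheduler handling retries and backfills.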