Analytics Engineer to join our Data Platform chapter and central Data Platform team. We record nearly 100 million lines of data daily across 600 dbt models, and need you to help the business make sense of it by providing the capabilities that will help answer the product and commercial questions … the gap between data creation and data consumption Supporting internal stakeholders with reports and queries on data definitions and relationships Our stack includes: Snowflake, dbt, AWS, Looker, Monte Carlo, Terraform, Snowpipe, GitHub. Experience in analysing large, complex data sets, identifying issues, and ensuring data quality A great communicator, comfortable working … implementation Ability to prioritise your problems in order to focus on the biggest issues Ability to spot opportunities to automate repetitive tasks Experience with dbt or a BI tool (Looker, Omni, Lightdash or equivalent) Experience with git, Terraform, Python, JIRA would be a great bonus Familiarity with cloud computing environments …
Join a team at the heart of the global economy! The Department for Business and Trade ("DBT") and Inspire People are partnering together to bring you an exciting opportunity for Senior Developers to build and maintain software to deliver a complete service for users. Salary from £59,634 to … and Edinburgh. About the role As a Senior Software Developer in the Digital, Data and Technology (DDaT) function in the Department for Business and Trade (DBT), you will build and maintain software to deliver a complete service for users. By understanding the whole system, you will plan and lead development on … experience of the following. Developed applications or platforms using the Python Django framework Frontend development using JavaScript, preferably the React framework This role requires SC clearance. DBT's requirement for SC clearance is to have been present in the UK for at least 3 of the last 5 years. Failure to meet …
Greater Leeds Area, United Kingdom Hybrid / WFH Options
Corecom Consulting
the development of test automation frameworks that support data ingestion, transformation (ETL/ELT), and analytical models. Work hands-on with tools like Snowflake, dbt, Fivetran, and Tableau, alongside SQL and Python. Design and implement scalable, agnostic testing frameworks for use across agile delivery teams. Promote best practices including Test … and modelling pipelines. Strong SQL and Python skills – essential for building and validating test cases. Proven experience with Snowflake (or similar cloud data platforms), dbt, Fivetran, and Airflow. Knowledge of automation frameworks such as Cucumber, Gherkin, TestNG. Experience integrating test automation into large-scale delivery functions. Proven track …
want to work on meaningful challenges where data turns into action, this role is for you. Key Responsibilities Develop and maintain data models using dbt and SQL that are efficient, scalable, and reliable. Encourage self-service analytics and consistent, high-quality metrics by building and maintaining semantic metric definitions and … About You Experience with SQL and (optionally) with Python. Experience building scalable, high-quality data models that serve complex business use cases. Knowledge of dbt and experience with cloud data warehouses (BigQuery, Snowflake, Databricks, Azure Synapse, etc.). Proficiency in building BI dashboards and self-service capabilities using tools like …
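As a sketch of the kind of dbt model this role describes (all table and column names here are hypothetical, not taken from the listing):

```sql
-- models/marts/fct_orders.sql — hypothetical dbt mart model that rolls
-- staged order data up into a single analytics-ready table, the kind of
-- model that consistent metric definitions can then be layered on.
with orders as (
    select * from {{ ref('stg_orders') }}
),
customers as (
    select * from {{ ref('stg_customers') }}
)
select
    orders.order_id,
    orders.ordered_at,
    customers.customer_id,
    customers.customer_segment,
    orders.amount_gbp
from orders
left join customers using (customer_id)
```

With marts like this in place, semantic metric definitions (e.g. revenue per segment) reference one governed model rather than being re-derived in each dashboard.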
also exploring how we can leverage AI to deliver an even better experience for our customers. Key Responsibilities Develop and maintain data models using dbt and SQL that are efficient, scalable, and reliable. Encourage self-service analytics and consistent, high-quality metrics by building and maintaining semantic metric definitions and … About You Advanced SQL and (optionally) experience with Python. Experience building scalable, high-quality data models that serve complex business use cases. Knowledge of dbt and experience with cloud data warehouses (BigQuery, Snowflake, Databricks, Azure Synapse, etc.). Proficiency in building BI dashboards and self-service capabilities using tools like …
Warrington, Cheshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
person will build data platforms, integrate them with clients, and move towards getting clients AI- and automation-ready. You will work with Snowflake, dbt, Python, and SQL. Requirements: Snowflake Python dbt SQL Communication Skills Nice to Have: AWS AI Exposure Interviews ongoing – don't miss your chance to secure … the future of your career! Contact me @ (url removed) or on (phone number removed). Snowflake, AWS, Data, dbt, Data Engineer, ETL, Consulting, Pipelines, Consult, SQL, Python
Architect & Implement BigQuery Data Transformations • Working experience handling migration of ETL tools and on-premises relational DBs to a cloud DWH (IBM DataStage to dbt and Oracle to GCP BigQuery preferred) • Design, implement, optimize & automate scalable data pipelines using GCP-native tools to process data from Bronze (raw … analytics-ready). • Optimize performance with BigQuery partitioning, clustering, materialized views, and optimized SQL transformations. • Automate and schedule ETL/ELT workflows with dbt and Airflow workflows. • Develop real-time and batch data pipelines using Dataflow, Apache Beam, and dbt for streaming and structured data ingestion. • Define and enforce …
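The BigQuery optimizations named above (partitioning, clustering, materialized views) might look like the following sketch; the project, dataset, and column names are hypothetical:

```sql
-- Hypothetical BigQuery DDL: partitioning by date plus clustering on a
-- frequently filtered column reduces bytes scanned per query.
create table `project.analytics.events`
partition by date(event_ts)
cluster by customer_id
as
select * from `project.raw.events_landing`;

-- Materialized view precomputing a common aggregate so dashboards
-- read a small summary instead of the full events table.
create materialized view `project.analytics.daily_event_counts` as
select
    date(event_ts) as event_date,
    count(*) as events
from `project.analytics.events`
group by event_date;
```

Queries that filter on `date(event_ts)` and `customer_id` then prune partitions and clusters automatically, which is where most of the cost savings come from.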
Hiring: Analytics Engineer Hybrid 80% remote Hey data wizards! Right now, we're on the lookout for an Analytics Engineer with strong knowledge of dbt to help us shape the future of data for one of our top clients. What You'll Be Doing: Building scalable, documented, modular dbt models … in Snowflake) that power decision-making. Owning the dbt project structure – from staging layers to marts, with testing and performance optimization baked in. Driving SQL performance, metadata practices, and clean data pipelines with tools like Airflow. Collaborating with analysts, engineers, and business stakeholders to keep data aligned with real-world needs. Helping shape and improve data governance practices that truly scale. What You Bring: 3+ years building end-to-end dbt projects (macros, testing, documentation – you know the drill). Expert-level SQL + hands-on experience optimizing queries in Snowflake. Familiarity with orchestration tools like Airflow, AWS …
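The "staging layers to marts" structure mentioned above typically starts with thin staging models like this sketch (source and column names are hypothetical):

```sql
-- models/staging/stg_payments.sql — hypothetical dbt staging model:
-- rename and type-cast raw source columns, and nothing else. Marts
-- downstream then join and aggregate these cleaned inputs.
select
    id as payment_id,
    order_id,
    cast(amount_pence as numeric) / 100 as amount_gbp,
    created_at as paid_at
from {{ source('app_db', 'payments') }}
```

Uniqueness and not-null tests on `payment_id` would then be declared in the model's accompanying `.yml` file, so `dbt build` fails fast when upstream data breaks.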
Profile Essential skills/knowledge/experience: Design, develop, and maintain scalable ETL pipelines using AWS Glue (PySpark). Strong hands-on experience with dbt (Cloud or Core). Implement and manage dbt models for data transformation and modeling in a modern data stack. Proficiency in SQL, Python, and PySpark … to the development of best practices and standards for data engineering and transformation. Desirable skills/knowledge/experience: Working knowledge of AWS Glue, dbt, ETL. Rewards & Benefits TCS is consistently voted a Top Employer in the UK and globally. Our competitive salary packages feature pension, health care, life assurance …
team and work on high-impact projects in a fast-paced environment. The role will primarily focus on building and maintaining data pipelines in dbt using SQL, with opportunities to get hands-on experience with Dagster, Synq, AWS, and Python. Key responsibilities Develop and maintain SQL-based data solutions (PostgreSQL …/MySQL). Work with dbt for data transformation and modeling. Support the scheduling and execution of data pipelines using Dagster. Assist in managing and optimizing AWS Lambda-based data infrastructure. Collaborate with the team to identify opportunities to improve the overall data architecture and data pipelines within the Data … experience). 3-5 years' experience as a SQL Engineer, Data Engineer, or similar role. Proficiency in SQL (PostgreSQL/MySQL). Experience with dbt for data transformation. Understanding of Dagster or similar workflow orchestration tools. Familiarity with AWS, Lambda, Python, Tableau, Metabase (nice to have, but not essential).
Data/Analytics Engineer. You’ll transform raw data into meaningful, high-quality datasets that power applications across the company. You’ll build scalable dbt models on top of Databricks and PostgreSQL, partnering with data and business stakeholders to define metrics, track performance, and ensure data quality. This role would … years' professional experience as a Data Engineer or Analytics Engineer Strong proficiency in SQL, with proven experience writing complex, performant queries Experience working with dbt in production Experience working with Databricks and/or PostgreSQL Solid understanding of data testing, observability, and data quality assurance Familiarity with Git and modern …
move the needle, you'll love it here. What You'll Do Design models that hold up under pressure: Own and develop analytics-ready dbt models that transform raw data into clean, documented, and trusted sources of truth. Get the right data flowing: Use Fivetran and custom pipelines to ingest … environments where you've had to balance speed, quality, and scale. Proven ability to write clean, efficient SQL and Python, and to build robust dbt models that support scalable data workflows in production. Comfortable working across modern data stacks, including ELT tools, cloud warehouses, and BI platforms – with the ability … well. Current Stack We work with a modern data stack, but we're open to evolving as we grow. Currently, that includes: Fivetran BigQuery dbt Lightdash Hex Heap Benefits 28 days holiday per annum + Bank holidays, with the option to roll up to 5 days per annum. Employee Share …
SQL, Python and Airflow; Experience in Kubernetes, Docker, Django, Spark and related monitoring tools for DevOps is a big plus (e.g. Grafana, Prometheus); Experience with dbt for pipeline modeling also beneficial; Skilled at shaping needs into a solid set of requirements and designing scalable solutions to meet them; Able to quickly … for a more effective platform; Open to traveling to Octopus offices across Europe and the US. Our Data Stack: SQL-based pipelines built with dbt on Databricks Analysis via Python Jupyter notebooks PySpark in Databricks workflows for heavy lifting Streamlit and Python for dashboarding Airflow DAGs with Python for ETL
that clients are invoiced correctly. We are looking for someone who ideally has experience of working in a Data Team and is proficient with dbt, BigQuery, and Looker. Key Responsibilities: Building new processes to map data to billable events and KPIs, working with Data Engineers to ensure this. Data Mapping … managing the definition and business logic for KPIs across the company's Product, Operations, Commercial, and Customer Success teams, using dbt and LookML. Will be in charge of dashboarding and will be the owner of Looker, the company's BI tool. Billing infrastructure & systems, ensuring all clients …
of £55k - £65k (DOE) Competitive annual leave package Fully remote working And many more Role and Responsibilities Design and deliver data pipelines using SQL, dbt, and Python within Snowflake to transform and model data for a variety of use cases. Build robust, testable, and maintainable code that integrates data from … facing or consultancy environments. Strong expertise in Snowflake (SQL, performance tuning, security, warehouse design). At least 1 year of hands-on experience using dbt for data transformation and modelling. Proficiency in Python, especially for data manipulation, orchestration, and scripting workflows. Solid understanding of version control (Git) and development lifecycle
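A common pattern for Snowflake pipelines built with dbt, as described above, is an incremental model; this sketch uses hypothetical source and column names:

```sql
-- Hypothetical incremental dbt model on Snowflake: only rows newer than
-- the latest already-loaded event are processed on each run, which keeps
-- warehouse compute costs down as the source table grows.
{{ config(materialized='incremental', unique_key='event_id') }}

select
    event_id,
    user_id,
    event_ts,
    payload
from {{ source('raw', 'events') }}
{% if is_incremental() %}
-- {{ this }} resolves to the target table built by previous runs.
where event_ts > (select max(event_ts) from {{ this }})
{% endif %}
```

On the first run dbt builds the full table; subsequent runs merge in only new rows keyed by `event_id`.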