pipelines and infrastructure to ensure efficient data flow across the business.
Key Responsibilities:
• Develop, support and optimise robust data solutions using tools like Snowflake, dbt, Fivetran, and Azure Cloud services
• Collaborate with cross-functional teams to translate business needs into actionable data architecture
• Design and manage data pipelines and integration
efficient code and comfortable undertaking system optimisation and performance tuning tasks
• Comfortable working with relational databases such as Oracle, PostgreSQL, MySQL
• Has exposure to DBT and data quality test frameworks
• Has awareness of Infrastructure as Code tools such as Terraform and Ansible
BENEFITS: Competitive Salary. Company Laptop supplied. Bonus Scheme.
Cambridge, Cambridgeshire, UK Hybrid / WFH Options
Intellect Group
Nice to Have:
• Experience working within a consultancy or client-facing environment
• Familiarity with tools and frameworks such as Databricks, PySpark, Pandas, Airflow or dbt
• Experience deploying solutions using cloud-native services (e.g., BigQuery, AWS Glue, S3, Lambda)
What’s On Offer: Fully remote working with the flexibility to work
a related field, with a focus on building scalable data systems and platforms. Strong expertise with modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Hargreaves Lansdown Asset Management Limited
low latency data pipeline with the following skills.
Data Engineering Skills:
• Modelling
• Orchestration using Apache Airflow
• Cloud-native streaming pipelines using Flink, Beam etc.
• DBT
• Snowflake
Infrastructure Skills:
• Terraform
DevOps Skills:
• Experienced in developing CI/CD pipelines
Integration Skills:
• REST and Graph APIs
• (Desirable) Serverless API development (e.g. Lambda
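The orchestration skill listed above comes down to expressing a pipeline as a DAG of dependent tasks that a scheduler runs in order. As a minimal sketch of that idea — using the standard library's `graphlib` rather than Airflow itself, with hypothetical task names — the core mechanism looks like this:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on,
# mirroring how an Airflow DAG wires operators together.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_join"},
}

# static_order() yields an execution order that respects every dependency,
# which is exactly the constraint an orchestrator enforces when scheduling.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

In a real Airflow deployment the same dependency structure would be declared with operators and `>>` chaining, but the scheduling constraint being satisfied is the one shown here.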
Excellent communication and stakeholder management skills. Desirable: Experience working with large-scale retail datasets (e.g., POS, CRM, supply chain). Familiarity with tools like dbt, Airflow, or MLflow. Master’s or PhD in Data Science, Statistics, Computer Science, or related field. Benefits: Competitive salary and performance bonuses; flexible working options
SR2 | Socially Responsible Recruitment | Certified B Corporation™
databases (structured, non-relational, graph etc.)
• CI/CD experience
• Python or Java experience is preferred
• GIS experience is desirable
• ETL experience (ideally with DBT, but not a hard requirement)
Benefits:
• Hybrid working – 2 days per week on site
• Up to £65,000
• Private Medical
• 4x DIS
• 5% employer pension
data/business catalogs). Knowledge of at least one of the following technologies/methodologies will be an additional advantage: Python, Streamlit, Matillion, DBT, Atlan, Terraform, Kubernetes, Data Vault, Data Mesh. Ability to engage with principal data architects of client stakeholders. Excellent presentation and communication skills. This role will
communicator able to interface confidently with both technical and non-technical audiences
Bonus Experience:
• Familiarity with IaC frameworks (CloudFormation, Terraform, SAM)
• Exposure to Snowflake, DBT, Airflow, or cost analytics/data pipeline tools
• Knowledge of FinOps practices or cost intelligence platforms
• Experience contributing to open-source platforms or cloud-native
field.
• 5+ years of experience in data engineering, with a strong focus on Snowflake and AWS
• Proficiency in SQL, Python, and ETL tools (StreamSets, DBT etc.)
• Hands-on experience with Oracle RDBMS
• Data migration experience to Snowflake
• Experience with AWS services such as S3, Lambda, Redshift, and Glue
• Strong understanding
experience in writing complex queries to manipulate and extract insights from large datasets. Experience with ETL, rETL, IR tools and frameworks such as Airflow, dbt, Fivetran, Coalesce, Hightouch, RudderStack, Snowplow, or similar. Strong analytical mindset with a focus on data accuracy, troubleshooting, and resolving complex data issues. Ability to communicate
Altrincham, Bolton, Leigh and Ashton-Under-Lyne, Greater Manchester, UK Hybrid / WFH Options
Realtime Recruitment
platform standards for smooth deployments, ensuring platform stability and scalability through upgrades, and managing the data analytics platform.
Responsibilities:
• Utilize modern data platform tools (dbt Core, Airflow, Airbyte, Snowflake, etc.)
• Collaborate with DevOps, Data Engineering, Infrastructure, and InfoSec for seamless application integration
• Design, implement, and maintain scalable cloud data
ETL processes
• Fluent in SQL with the ability to write complex queries for large-scale datasets
• Hands-on experience with ETL tools like Airflow or dbt
• Comfortable with data visualisation using BI tools (e.g., Looker, Tableau, Power BI)
If you are interested, please reach out for a private conversation - daniel.wexler@source
the early stages of its evolution, where you’ll have real ownership, shape foundational data assets, and work with cutting-edge technologies like Snowflake, DBT, Microsoft Azure, and Power BI and next-gen BI platforms.
Responsibilities:
Data Modelling & Transformation – Design and maintain clean, modular data models in Snowflake, applying analytics …
… focus on data modelling and transformation
• Proficiency in SQL and cloud data platforms such as Snowflake or Azure Synapse Analytics
• Hands-on experience with DBT for developing, testing, and documenting transformations
• Understanding of modern data stack principles, including layered modelling, modular SQL, and Git-based workflows
• Familiarity with dimensional modelling
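The layered modelling the listing describes (raw data landing first, a staging layer that only renames and retypes, then business-facing marts — the discipline dbt popularised) can be sketched outside any particular warehouse. A minimal illustration using SQLite in place of Snowflake, with hypothetical table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Raw layer: data exactly as it lands from the source system.
cur.execute("CREATE TABLE raw_orders (id INTEGER, cust TEXT, amt_pence INTEGER)")
cur.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                [(1, "alice", 1250), (2, "bob", 830), (3, "alice", 400)])

# Staging layer: one view per source table that only renames and retypes,
# the same narrow job a dbt staging model does.
cur.execute("""
    CREATE VIEW stg_orders AS
    SELECT id AS order_id,
           cust AS customer_name,
           amt_pence / 100.0 AS amount
    FROM raw_orders
""")

# Mart layer: a business-facing aggregate built only on staging objects,
# never directly on raw tables.
cur.execute("""
    CREATE VIEW mart_customer_revenue AS
    SELECT customer_name, SUM(amount) AS total_revenue
    FROM stg_orders
    GROUP BY customer_name
""")

rows = cur.execute(
    "SELECT customer_name, total_revenue FROM mart_customer_revenue "
    "ORDER BY customer_name"
).fetchall()
print(rows)  # [('alice', 16.5), ('bob', 8.3)]
```

The payoff of the layering is that a rename in the source system touches only the staging view; every mart built on it is unaffected.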
• Collaborate with Marketing, e-commerce, and Buying & Merchandising teams on data needs
• Work with Data Engineers to transform source system data using tools like DBT
• Develop dimensional data models and apply best practices in data warehousing
• Deliver advanced analytics projects such as customer segmentation, forecasting, and LTV analysis
• Maintain code …
… skills (Google BigQuery or similar cloud platforms)
• Proficiency in Tableau or other data visualisation tools
• Understanding of data modelling (Kimball methodology preferred)
• Experience with DBT/Airflow and orchestration tools
• Strong communication and stakeholder management skills
• Familiarity with GitHub for version control and documentation
Bonus Points For:
• Python skills
• Knowledge
a great opportunity to push the company forward in this role.
Role Summary: You would be working closely with our two data engineers across DBT, Snowflake and Power BI. You would work with our analysts and the wider analyst community to enable them to deliver insights to the business through a combination …
… and know your views from your tables. You’ll use Snowflake here, but all SQL backgrounds are welcome. If you are already familiar with DBT that’s great too, but not essential – the more exposure you’ve had to it the better. Familiarity with CI/CD pipelines, version control
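The listing's quip about knowing "your views from your tables" comes down to this: a table stores rows, while a view stores only a query that re-runs on every read. A quick sketch using SQLite with hypothetical names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (amount INTEGER)")
cur.execute("INSERT INTO sales VALUES (100)")

# A view stores no data of its own, only its defining query.
cur.execute("CREATE VIEW big_sales AS SELECT amount FROM sales WHERE amount > 50")

# Rows inserted into the table after the view was created still appear
# in the view, because the query re-executes on each read.
cur.execute("INSERT INTO sales VALUES (200)")
rows = cur.execute("SELECT amount FROM big_sales ORDER BY amount").fetchall()
print(rows)  # [(100,), (200,)]
```

In warehouses like Snowflake the same distinction drives cost and freshness trade-offs: views stay current for free but recompute on every query, while tables must be rebuilt to pick up new data.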
analytical and problem-solving skills
• Excellent communication and interpersonal skills
Education and Experience:
• Proficiency in SQL and Python
• Experience with modern data testing frameworks (dbt, Great Expectations)
• Proficiency in writing and maintaining dbt tests for data transformations
• Experience in CI/CD practices for data pipelines
aPriori Offers: Competitive compensation
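The dbt tests mentioned above (and Great Expectations checks) boil down to SQL assertions that must return zero offending rows. This is not either framework's API — just a sketch of the equivalent checks expressed directly against SQLite, with hypothetical table names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER)")
cur.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10), (2, 11), (3, 10)])

def not_null(table, column):
    # Equivalent of dbt's not_null test: count offending rows; 0 means pass.
    sql = f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    return cur.execute(sql).fetchone()[0] == 0

def unique(table, column):
    # Equivalent of dbt's unique test: any value appearing twice fails.
    sql = (f"SELECT COUNT(*) FROM (SELECT {column} FROM {table} "
           f"GROUP BY {column} HAVING COUNT(*) > 1)")
    return cur.execute(sql).fetchone()[0] == 0

results = {
    "order_id not_null": not_null("orders", "order_id"),
    "order_id unique": unique("orders", "order_id"),
    "customer_id unique": unique("orders", "customer_id"),  # 10 repeats, fails
}
print(results)
```

In dbt itself the first two checks would be declared in a model's YAML (`tests: [unique, not_null]` under the column) rather than hand-written, which is what makes them cheap to maintain in CI/CD.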
data governance
• You will liaise with different teams and communicate to ensure stakeholders’ needs are met
• You will support the creation and maintenance of DBT models
Skills and Experience:
Essential to have expertise with: Stakeholder management, DBT, Kubernetes, Databricks
Desirable to have experience with: Azure
Additional Benefits: Private medical insurance