data engineering or a related field, with a focus on building scalable data systems and platforms. Strong expertise in modern data tools and frameworks such as Kafka, Spark, Databricks, dbt, Airflow, and cloud-native services (AWS, GCP, or Azure). Deep understanding of streaming architectures, distributed systems, ETL/ELT pipelines, and data modeling. Proficiency in at least one object-oriented …
Edinburgh, Midlothian, United Kingdom Hybrid/Remote Options
Aberdeen
systems. Good understanding of security, compliance, and agile delivery practices. Passionate about learning new technologies and collaborating with teams, including mentoring junior engineers. Nice-to-have: experience with Snowflake, dbt, Terraform, and Azure DevOps CI/CD pipelines. We are proud to be a Disability Confident Committed employer. If you have a disability and would like to apply to one …
Edinburgh, City of Edinburgh, United Kingdom Hybrid/Remote Options
Cathcart Technology
overall data strategy. The ideal person for this role will have a strong background in data engineering, with experience building modern data solutions using technologies like Kafka, Spark, Databricks, dbt, and Airflow. You'll know your way around cloud platforms (AWS, GCP, or Azure) and be confident coding in Python, Java, or Scala. Most importantly, you'll understand …
Employment Type: Permanent
Salary: £80000 - £100000/annum Bonus, Pension and Shares
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid/Remote Options
Cathcart Technology
foster a collaborative, forward-thinking culture where engineers work on large-scale datasets, distributed systems, and modern cloud platforms. Teams leverage cutting-edge tools such as Spark, Airflow, Kafka, dbt, and Databricks to build resilient, scalable, and high-quality data solutions. Why this role? Lead a talented team of data engineers delivering high-quality, reliable data systems. Shape the architecture …
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid/Remote Options
IO Associates
reliable, accessible, and actionable. This role is 'remote first' and falls OUTSIDE IR35. Key Responsibilities: Data Pipeline Development: design, build, and maintain robust, scalable, and efficient data pipelines using dbt, SQL, Python, and ETL frameworks. Data Modelling: develop and maintain logical and physical data models to support reporting and analysis needs, implementing best practices for data warehouse design. Data Transformation …
Edinburgh, Midlothian, United Kingdom Hybrid/Remote Options
Aberdeen
CD, automation). Familiarity with Data/ETL tools such as Microsoft Azure and Microsoft Fabric. Knowledge of cloud-native development: Azure, Snowflake, ADF (Azure Data Factory), and dbt (Data Build Tool) for modular SQL development. Autosys scheduling and Ab Initio for ETL and batch processing, as well as Markit EDM. Understanding of MQ (IBM MQ, ACE) for message-based integrations …
analysts and 3 analytics engineers, you will turn CRM, finance and marketing data into decision-grade insights that move revenue, retention and unit economics metrics. Design & own core datasets (dbt → BigQuery). Author and maintain dbt models, tests, docs and macros that produce production tables for accounts, opportunities, bookings, revenue recognition, customer events and marketing touchpoints. Commercial performance & forecasting. Lead … thrives on turning complex commercial problems into reliable data products. Extensive experience in an analytical role supporting Sales, Finance, Marketing or GTM teams, ideally in a B2B SaaS environment. dbt & BigQuery: commercial experience developing dbt models on top of a modern data warehouse such as BigQuery or Snowflake. Expert SQL: advanced query design, optimisation, and building reusable analytical datasets. Modern …
Edinburgh, Midlothian, United Kingdom Hybrid/Remote Options
Aberdeen
About the Role As a Data Platform Engineer at Aberdeen, you will be at the heart of building and evolving a modern, cloud-based data platform using Azure, Snowflake, dbt, and Microsoft Fabric. Working within a Data Mesh architecture, you will enable decentralized data ownership and empower domain teams through self-service data capabilities. Your expertise will drive the design … About the Candidate Design and implement a Data Mesh architecture, enabling decentralized data ownership and collaboration across domain teams. Develop and maintain scalable cloud data infrastructure using Azure, Snowflake, dbt, and Microsoft Fabric, ensuring performance and automation. Build self-service data platforms and automate data pipelines to empower domain teams to manage their own data products independently. Collaborate with teams … networking, Data Factory, Key Vault, and Azure Fabric. Proven expertise in Snowflake administration, including database/user management, warehouses, role-based access control, and data sharing. Solid experience with dbt for data transformation, including best practices for modular, testable, and version-controlled data models. Proficiency in infrastructure-as-code deployments, especially using Terraform for Azure, Snowflake, and related platforms. Experience …