public cloud technologies. Expertise with data orchestration tools such as Apache Airflow or Dagster. Proficiency in big data storage and processing technologies such as dbt, Spark, SQL, Athena/Trino, Redshift, Snowflake, PostgreSQL/MySQL. Strong knowledge of event-driven architectures and streaming technologies such as Apache Kafka, Kafka Streams
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
Jet2.com
e.g. dimensional modelling, data vault, etc) Data Pipeline - Experience working with a wide variety of data sources and data transformation techniques. Any experience with dbt would be an advantage, as would experience using Python to build custom transformation logic. CI/CD & Automation - We rely heavily on Infrastructure as Code
processes and data pipelines Ensure system performance and reliability Provide leadership to junior team members Requirements: 5+ years of experience Expertise in Python, Snowflake, DBT, Airflow, Kubernetes, Terraform, and GitHub CI/CD Strong background in cloud architecture (Azure, AWS, GCP) Experience with real-time streaming technologies Excellent communication and
Python/JavaScript/C# Familiarity with statistical/machine learning/AI concepts and techniques Understanding of data pipeline/orchestration tools e.g. dbt, Dataform Appreciation of GCP’s serverless technologies e.g. Cloud Run/Workflows Understanding of Google’s marketing stack, Google Analytics, Google Tag Manager, Google Ads
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Sonovate
still require hands-on assistance from time-to-time: Expert-level proficiency in SQL and Python, with strong experience in modern data platforms (Fivetran, dbt, Snowflake) Deep knowledge of cloud-based data architectures, particularly in Azure (Functions, LogicApps, etc.), experience in containerization (Docker) and Infrastructure-as-Code (Terraform, ARM templates
resource. KEY SKILLS REQUIRED Worked with a combination of cloud-based big data technologies (e.g. HDFS, Blob Storage, Spark, Kafka, Delta, Hive, Airflow and dbt) and OLTP and data warehousing within SQL Server or other RDBMSs. Familiarity with Lakehouse architecture and Databricks. Extensive data modelling
creating reliable data models. Deploy and manage data-related tools on Kubernetes, including Airflow, Superset, and RStudio Connect. Support Data Analytics workflows by managing dbt, implementing deployment rules, and handling DevOps processes. Skills & Experience: Proven Data Engineering experience, particularly with Azure, Databricks, and Delta Lake. Expertise in Kubernetes, Infrastructure as
with DevOps and Data Engineering to integrate scalable, reliable data pipelines. - Implement Infrastructure as Code (IaC) using Terraform. - Support analysts in deploying and maintaining dbt models. - Explore AI and BI tools to enhance platform analytics. YOUR SKILLS AND EXPERIENCE - Experience with Azure, Kubernetes, Databricks, and PySpark. - Strong proficiency with Infrastructure
develop, maintain and draw insights from our business intelligence solutions consisting of Google BigQuery (data warehouse), Domo/Superset (visualisations), SnapLogic (ELT) and dbt (modelling). Your days will be varied, including challenging the status quo, preparing ad-hoc analysis, supporting business users in self-serve operations mode and
data pipelines and extract, load, transform (ETL) processes to ensure efficient and accurate data flow. Implementing and optimising ETL processes with dbt (the data build tool), transforming raw data into actionable insights and ensuring data quality and accuracy. Developing and maintaining data models and data architectures within Azure Synapse to support … disruption. Maintaining and documenting data engineering processes, workflows, and data models, ensuring transparency and facilitating knowledge sharing, leveraging the lineage and documentation functionality of dbt docs. Using Visual Studio Code for code development, debugging, and version control, ensuring efficient and effective coding practices in Structured Query Language (SQL)/Python … cross-functional teams, ensuring data solutions meet business needs. Designing and implementing automated data validation testing. To be successful: Proficiency in Microsoft Azure Synapse, dbt, and Visual Studio Code. Strong skills in SQL, Python, or similar languages used for data manipulation and processing. Experience with data integration tools and techniques
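For context, the kind of dbt transformation work described above typically looks like the following minimal sketch of a staging model. The model, source, and column names here are purely illustrative assumptions, not taken from any of these postings:

```sql
-- models/staging/stg_orders.sql
-- Hypothetical dbt model: casts and cleans a raw orders table
-- into a typed staging view. All names are illustrative only.

with source as (

    -- {{ source(...) }} resolves to the raw table declared in sources.yml
    select * from {{ source('raw', 'orders') }}

),

cleaned as (

    select
        cast(order_id as integer)           as order_id,
        cast(order_date as date)            as order_date,
        lower(trim(customer_email))         as customer_email,
        cast(order_total as numeric(10, 2)) as order_total
    from source
    where order_id is not null  -- basic data-quality filter

)

select * from cleaned
```

Running `dbt run --select stg_orders` would build this model, and `dbt docs generate` produces the lineage graph and documentation site that adverts like the one above refer to.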
Data Engineer Snowflake/SQL/Python/DBT £60,000 - £65,000 + 10% Bonus + 10% pension Milton Keynes – 1 day per month This Data Engineering opportunity is to join a rapidly growing financial services organisation which is going through a data transformation journey. This organisation is investing … creating warehouses and data lakes, coding in various languages, and much more. Skills we are looking for: Strong experience working in a cloud environment Snowflake DBT SQL Python ETL pipeline development Desirable: Financial services/Insurance experience As previously mentioned, they take data seriously, it is not the type of business
work to equip the customers with the very latest technology. In this role you will use cutting-edge technology on GCP BigQuery and dbt to build world-class data models to surface data to the company. You will work in a team of Analytics Engineers where you will be … transformation and data quality processes. Work collaboratively with data engineers on infrastructure issues, such as permissions, accounts, setup and architecture. Build/maintain environments on dbt for all other teams. Work within a continual improvement process. Help maintain product catalogues, documentation and lineage. Work with visualization developers to align with the … data quality to the highest standard and consistency. Your Profile Essential skills/knowledge/experience: Strong SQL knowledge GCP BigQuery dbt (Data Build Tool) GitLab Good communication skills Agile way of working Desirable skills/knowledge/experience: Knowledge of visualization tools like Tableau. Worked in a high
Senior Data Engineer Snowflake/DBT/DataOps Up to £80,000 + 15% Bonus + 10% Pension Milton Keynes – 1 day per month This Senior Data Engineering opportunity is to join a rapidly growing financial services organisation, who are going through a data transformation journey. This organisation is investing … looking for: Very experienced in Snowflake (for example, previously working on a Snowflake implementation and can help with best practices and ways of working) DBT Airflow SQL Power BI Desirable: Financial services/Insurance experience Our client is looking for an enthusiastic data professional who will improve what good looks
Wellingborough, England, United Kingdom Hybrid / WFH Options
The ONE Group Ltd
Handling: Comfortable working with datasets too large for Excel. Visualization & Reporting: Experience with BI tools like Power BI or Tableau. Desirable Skills: Familiarity with DAX, dbt, and experience working with Azure Data Factory, Azure SQL. Hands-on experience with ETL tools like Fivetran, Data Warehousing, and Snowflake. Python skills and familiarity … unfortunately we can only consider applicants with full, unrestricted right to work in the UK with no limitations, time restrictions, or sponsorship requirements. Keywords: DBT, DAX, ETL, Data Analyst, Data Engineer, Analytics Engineer
Hampshire, England, United Kingdom Hybrid / WFH Options
Nigel Frank International
strong experience with cloud data tools and be comfortable working with GCP (though other cloud platforms are acceptable). Familiarity with Apache Airflow and dbt is a plus. Key Responsibilities: Lead the development of a data strategy and architecture from scratch Design, build, and maintain scalable ETL pipelines Collaborate with … in cloud-based data tools Hands-on experience with cloud ETL tools (GCP experience is preferred, but not required) Knowledge of Apache Airflow and dbt is beneficial Ability to work autonomously and take ownership of large-scale projects Flexibility in recommending and implementing new technologies to improve the data ecosystem
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Inspire People
Join a team at the heart of the global economy! The Department for Business and Trade ('DBT') and Inspire People are partnering to bring you an exciting opportunity for a Senior Platform Engineer to help build and scale their global product platform using cutting-edge technologies and impact DBT's upcoming … and other excellent Civil Service benefits. Flexible, hybrid working from London, Cardiff, Darlington, Belfast, Birmingham, Salford and Edinburgh. The Department for Business and Trade (DBT) is the department for economic growth. The Government Digital and Data (GDaD) directorate develops and operates tools and services to support businesses to invest, grow … and export, creating jobs and opportunities across the country. Job Description DBT are on a mission to build a new cutting-edge developer platform in AWS and migrate existing services from GOV.UK PaaS in the process. Are you up to the challenge of building something new? We need Platform Engineers
data governance You will liaise with different teams and communicate to ensure stakeholders' needs are met You will support the creation and maintenance of dbt models Skills and Experience: Essential to have expertise with: Stakeholder management dbt IaC - Terraform Databricks Desirable to have experience with: Azure Kubernetes Additional Benefits: Private