… Experience deploying and maintaining cloud infrastructure (e.g. AWS, GCP, or Azure). Familiarity with data modeling and warehousing concepts, and dimensional modeling techniques. dbt knowledge is preferable. Comfortable working with CI/CD tools, version control, and containers (e.g. Git, Jenkins, Docker). Understanding of data governance, security best …
… working autonomously. It would be a real bonus, but not a requirement, if: You've worked in a start-up environment. You've got dbt experience. You're familiar with MLOps principles and practices and their application in a production setting. Interview Process: You'll have a 20-minute conversation …
R. Working knowledge of message queuing and stream processing. Experience with Apache Spark or similar technologies. Experience with Agile and Scrum methodologies. Familiarity with dbt and Airflow is an advantage. Experience working in a start-up or scale-up environment. Experience working in the fields of financial technology, traditional financial …
data best practices across teams. Champion data quality, governance, and documentation. Key Requirements: Strong experience with Python, SQL, and modern ETL tools (e.g., Airflow, dbt). Solid grasp of cloud platforms (AWS/GCP/Azure) and data warehouses (e.g., BigQuery, Snowflake). Familiarity with streaming technologies (Kafka, Kinesis, etc.). Passion for …
Have hands-on experience with cloud infrastructure (GCP/AWS/Azure), infrastructure-as-code (Terraform), containerisation (Docker/k8s) and data pipelines (SQL, dbt, Airbyte). Love automation, process improvement and finding ways to help others work efficiently. Are comfortable working autonomously and taking responsibility for the delivery of large …
Snowflake/Databricks/Redshift/BigQuery, including performance tuning and optimisation. Understanding of best practices for designing scalable and efficient data models, leveraging dbt for transformations. Familiarity with CircleCI, Terraform or similar tools for deployment and infrastructure as code. In this role, you will be responsible for: Shipping and …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
as well as 30 engineers in the business's data arm. Requirements: 3+ years' data engineering experience; Snowflake experience; proficiency across an AWS tech stack; dbt expertise; Terraform experience. Nice to Have: Data Modelling, Data Vault, Apache Airflow. Benefits: Up to 10% Bonus, Up to 14% Pension Contribution, 29 Days Annual …
Tools). Experience with one or more of the following is a plus: Kubernetes, Prometheus, Argo Workflows, GitHub Actions, Elasticsearch/OpenSearch, PostgreSQL, BigQuery, dbt data pipelines, Fastly, Storybook, Contentful, Deno, Bun. Benefits: We want to give you a great work environment and contribute back to both your personal and professional …
the Data space. This role will also allow the successful individual to cross-train into modern data engineering tools and technologies such as Airflow, dbt and Snowflake, as well as further develop their skills in Python, SQL and Market Data platforms. The firm works on a hybrid working schedule (three days per week …
Manchester Area, United Kingdom Hybrid / WFH Options
Searchability®
data warehouse technologies (such as Amazon Redshift, Google BigQuery, or Snowflake). Hands-on experience with ETL tools and frameworks, including Apache Airflow, Talend, or dbt. Strong programming ability in Python or another data-focused language. Knowledgeable about data management best practices, including governance, security, and compliance standards. Familiar with cloud …
and coaching others to succeed. Have a strong background in building and managing data infrastructure at scale, with expertise in Python, SQL, BigQuery, AWS, dbt and Airflow. Have a strong background in data modelling and building scalable data pipelines. Are naturally curious and enthusiastic about experimenting with new tech to …
data warehousing (BigQuery preferred). Hands-on experience with orchestration tools (preferably Airflow). Proficient in SQL and data modelling best practices. Experience with dbt or other modern data transformation frameworks. Ability to use a version control system (e.g. git) for code management and collaboration. Proficiency in efficiently extracting data …
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
Lloyds Bank plc
and integrity. Applying test data management tools for crafting, managing, and maintaining test data sets. Developing and executing data transformation tests using dbt (Data Build Tool). Performing ETL testing to validate data extraction, transformation, and loading processes. Collaborating with data engineers, analysts, and other stakeholders to identify and resolve … Required Qualifications: Proven experience in defining and implementing data testing strategies. Hands-on experience with test data management tools. Proficiency in dbt (Data Build Tool) for data transformation and testing. Strong understanding of ETL processes and experience in ETL testing. Excellent problem-solving skills and attention to detail. Experience …
Northern Ireland, United Kingdom Hybrid / WFH Options
Ocho
communicator able to interface confidently with both technical and non-technical audiences Bonus Experience • Familiarity with IaC frameworks (CloudFormation, Terraform, SAM) • Exposure to Snowflake, dbt, Airflow, or cost analytics/data pipeline tools • Knowledge of FinOps practices or cost intelligence platforms • Experience contributing to open-source platforms or cloud-native …
analysis. Knowledge of dashboard design and data visualization best practices. Experience with cloud-based data infrastructure (AWS). Familiarity with modern data stack tools (Airflow, dbt, etc.). Why This Role Matters: Judge.me is at an inflection point. As the market leader in Shopify reviews, we've chosen to build our future …
Greater London, England, United Kingdom Hybrid / WFH Options
Bounce Digital
… /logging, and support architecture decisions. What You Bring: Strong SQL & Python (PySpark); hands-on with GCP or AWS. Experience with modern ETL tools (dbt, Airflow, Fivetran). BI experience (Looker, Power BI, Metabase); Git and basic CI/CD exposure. Background in a quantitative field; AI/ML interest a …
Barnsley, South Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Experis
understanding of ETL concepts, data integration, and data warehousing best practices. Familiarity with version control systems (e.g., Git) and workflow orchestration tools (e.g., Airflow, dbt) is a plus. Excellent problem-solving skills and attention to detail. Strong communication and collaboration abilities. Preferred Qualifications: Experience working in agile or cross-functional …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Hargreaves Lansdown
low-latency data pipeline with the following skills. Data Engineering Skills: Modelling; Orchestration using Apache Airflow; Cloud-native streaming pipelines using Flink, Beam, etc.; dbt; Snowflake. Infrastructure Skills: Terraform. DevOps Skills: Experienced in developing CI/CD pipelines. Integration Skills: REST and Graph APIs. (Desirable) Serverless API development (e.g. Lambda …
skills for building and optimising data pipelines. Experience working with cloud platforms (e.g., AWS, GCP, or Azure). Familiarity with modern data stack tools (e.g., dbt, Airflow, Snowflake, Redshift, or BigQuery). Understanding of data modelling and warehousing principles. Experience working with large datasets and distributed systems. What's in it for …
Bristol, England, United Kingdom Hybrid / WFH Options
SR2 | Socially Responsible Recruitment | Certified B Corporation™
databases (structured, non-relational, graph, etc.). CI/CD experience. Python or Java experience is preferred. GIS experience is desirable. ETL experience (ideally with dbt, but not a hard requirement). Benefits: Hybrid working – 2 days per week on site. Up to £65,000. Private Medical. 4x DIS. 5% employer pension …
Greater Manchester, England, United Kingdom Hybrid / WFH Options
ECOM
Excellent communication and stakeholder management skills. Desirable: Experience working with large-scale retail datasets (e.g., POS, CRM, supply chain). Familiarity with tools like dbt, Airflow, or MLflow. Master's or PhD in Data Science, Statistics, Computer Science, or related field. Benefits: Competitive salary and performance bonuses. Flexible working options …
Manchester, North West, United Kingdom Hybrid / WFH Options
InterQuest Group (UK) Limited
Excellent communication and stakeholder management skills. Desirable: Experience working with large-scale retail datasets (e.g., POS, CRM, supply chain). Familiarity with tools like dbt, Airflow, or MLflow. Master's or PhD in Data Science, Statistics, Computer Science, or related field. Benefits: Competitive salary and performance bonuses. Flexible working options …
or management experience in a technical role, with responsibility for people and projects. Hands-on expertise in Python and SQL; experience with Spark and dbt is a plus. Familiarity with AWS services such as Lambda, S3, Redshift, and Glue; Databricks experience is advantageous. Experience designing and implementing scalable data platforms …