Liverpool, Leeds, Newcastle upon Tyne, or Birmingham, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
S3). Deep experience with Snowflake as a data warehouse. Proficiency in Python or Scala for data processing. Excellent communication and stakeholder management skills. Preferably some experience with Terraform and dbt (although these are not essential). Benefits: competitive salary; performance bonus scheme; 100% remote working (UK only); 26 days holiday + bank holidays + birthday off; opportunity for growth and development.
systems following best practices in data modelling and storage. Writing clean, efficient, maintainable Python code (TypeScript is a plus!). Automating infrastructure deployments with AWS CDK and/or CloudFormation. dbt experience. Collaborating with cross-functional teams in an Agile/Scrum environment using tools like Jira. ✅ We’re Looking for Someone Who: Has hands-on experience with AWS data services …
/analytics platforms. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
data pipelines and systems. Expertise in cloud technologies (AWS) and integrating cloud services into data platforms. Strong proficiency in SQL, Python, and modern data engineering tools such as Snowflake, dbt, and Tableau. Experience with data warehousing, data lakes, and data modelling. In-depth knowledge of best practices for data quality, security, and performance optimisation. Ability to work closely with both …
data platform, ensuring scalability, reliability, and security. Drive modernisation by transitioning from legacy systems to a lean, scalable platform. Act as a lead expert for technologies such as AWS, dbt, Airflow, and Databricks. Establish best practices for data modelling, ingestion, storage, streaming, and APIs. Governance & Standards: Ensure all technical decisions are well-justified, documented, and aligned with business needs. Lead … Expertise in data engineering and cloud engineering, including data ingestion, transformation, and storage. Significant hands-on experience with AWS and its data services. Expert-level skills in SQL, Python, dbt, Airflow, and Redshift. Confidence in coding, scripting, configuring, versioning, debugging, testing, and deploying. Ability to guide and mentor others in technical best practices. A product mindset, focusing on user needs …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
and naturally curious. Comfortable working across multiple business areas with varied responsibilities. Nice-to-haves: exposure to tools like Prefect, Airflow, or Dagster; familiarity with Azure SQL, Snowflake, or dbt. Tech stack/tools: Python, SQL (on-prem + Azure SQL Data Warehouse), Git. Benefits: £35,000 - £40,000 starting salary (up to £45,000 for the right candidate), discretionary …
in your work. You support colleagues and customers, contributing to a positive impact. Required technical skills & knowledge: PL/SQL - Python - Microsoft Fabric - Microsoft Synapse - Microsoft SQL Server - Snowflake - dbt - Agile - Power BI - AI - Databricks - ML. Your profile: You have a bachelor's and/or master's degree in computer science or equivalent. You have 2 to 5 years …
a modern, cloud-based BI ecosystem, including: Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4. Data Lake & Storage: Databricks Delta Lake, Amazon S3. Data Transformation: dbt Cloud. Data Warehouse: Snowflake. Analytics & Reporting: Power BI, Excel, Snowflake SQL REST API. Advanced Analytics: Databricks (AI & Machine Learning). Governance & Infrastructure: Centralised Data Catalogue & Access Control (Okta). Job Scheduling & Monitoring: …
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
Lloyds Bank plc
strategies to ensure data quality and integrity. Applying test data management tools for crafting, managing, and maintaining test data sets. Developing and executing data transformation tests using dbt (Data Build Tool). Performing ETL testing to validate data extraction, transformation, and loading processes. Collaborating with data engineers, analysts, and other stakeholders to identify and resolve data quality issues. Automating data … Including you! What you'll need. Required Qualifications: Proven experience in defining and implementing data testing strategies. Hands-on experience with test data management tools. Proficiency in dbt (Data Build Tool) for data transformation and testing. Strong understanding of ETL processes and experience in ETL testing. Excellent problem-solving skills and attention to detail. Experience with data integration tools and …
etc). A drive to solve problems using data. Experience in a management role. What would be a bonus: familiarity with Git; a data visualization tool (Tableau, Looker, Power BI, or equivalent); dbt; 2-5 years' experience of consumer credit risk or collections in the financial services, utilities, or telecommunications industries. Why else you'll love it here: Wondering what the salary for …
capabilities. Understands modern software delivery methodologies and project management tools and uses them to drive successful outcomes. Technical requirements: Cloud Data Warehouse (BigQuery, Snowflake, Redshift, etc.); Advanced SQL; dbt; Airflow (or similar tool); ELT; Looker (or similar tool). Perks of Working at Viator: Competitive compensation packages (routinely benchmarked against the latest industry data), including base salary and annual bonuses …
help clients make smarter, data-driven decisions. The Role: As an Analytics Engineer, you’ll design and build scalable, production-ready data pipelines and analytics infrastructure using tools like dbt, SQL, Python, and cloud data warehouses. You’ll work end to end, from scoping to delivery, on projects that directly impact clients’ strategic decisions. The role is highly technical and … Key attributes of the suitable Analytics Engineer include: A few years of experience in analytics or data engineering, with strong SQL and Python skills, and hands-on experience with dbt, Airflow, and cloud platforms (AWS, GCP, or Azure). You should be confident designing ELT/ETL pipelines and working across varied technical stacks. Strong communication skills and experience working …
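To make the dbt-plus-Airflow workflow these listings keep naming concrete, here is a minimal, hedged sketch of the kind of ELT orchestration described above: an Airflow DAG (Airflow 2.4+ assumed) that builds dbt models and then runs dbt tests. The DAG id, schedule, and project path are illustrative assumptions, not details from any of the postings.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Minimal sketch of a dbt-on-Airflow ELT pipeline (assumes Airflow 2.4+).
# The dag_id, schedule, and --project-dir path are hypothetical.
with DAG(
    dag_id="elt_dbt_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/analytics/dbt",  # hypothetical path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/analytics/dbt",
    )
    # Tests only run once the models have built successfully.
    dbt_run >> dbt_test
```

A production DAG would add retries, alerting, and per-environment dbt profiles; this only shows the shape of the workflow.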
of data from any source — whether databases, applications, or files — into lakehouses like Snowflake, Databricks, and Redshift. With pipelines that just work and features like advanced data transformation using dbt Core and end-to-end pipeline observability, we’re focused on making robust data pipelines accessible to everyone. We are looking to add senior engineers to our core engineering team …
metrics and dashboards they're presented with - Consumers can self-serve if needed - Consumers can retrieve data via automation. What you'll do: Build, document and consult on: using dbt to model analytics for business domains using a data mesh framework; charts and dashboards using modern BI tools; comprehensive testing frameworks using tools like dbt_expectations and elementary to ensure … models. Build monitoring and alerting tools for gathering insight on AE performance. This role requires experience in the following languages: SQL (Advanced), Jinja (Advanced). Experience with the following tools: dbt Core, Git, and a cloud warehouse provider (e.g. Databricks, GCP, Snowflake). The following would be nice to have: experience in Python; experience with GitHub and Lightdash …
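As a hedged illustration of the testing workflow this listing describes (dbt models exercised by dbt_expectations-style tests), dbt-core 1.5+ exposes a programmatic Python entry point for running those tests; the model selector below is a hypothetical name, not one taken from the posting.

```python
from dbt.cli.main import dbtRunner, dbtRunnerResult

# Programmatic dbt invocation (dbt-core 1.5+): run the tests attached to a
# single model. "orders" is a hypothetical model name used for illustration.
runner = dbtRunner()
result: dbtRunnerResult = runner.invoke(["test", "--select", "orders"])

if not result.success:
    # A real observability setup (e.g. elementary) would report this failure;
    # here it simply stops the process.
    raise SystemExit("dbt tests failed")
```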
and APIs (RESTful/GraphQL), with solid experience in microservices and databases (SQL/NoSQL). You know your way around big data tools (Spark, Dask) and orchestration (Airflow, dbt). You understand NLP and have experience working with Large Language Models. You're cloud-savvy (AWS, GCP, or Azure) and comfortable with containerization (Docker, Kubernetes). You have strong …
City of London, London, United Kingdom Hybrid / WFH Options
MRK Associates
step/complex analytics is essential for this role. Experience in cloud platforms (GCP BigQuery, Azure Synapse, Snowflake) and exposure to data science tools/languages such as Python, dbt, D3, GitHub, GCP/AWS would be advantageous. (This is not an advanced technical data science role, so advanced experience in these will not be relevant.) Experience in the …
LinkedIn, Hackajob, Welcome to The Jungle) and ATS platforms (Screenloop experience a plus). Solid understanding of tech stacks including Python, React.js, AWS/Azure, and data tools like dbt, Airflow, Snowflake. Ability to conduct structured interviews and technical assessments. Familiarity with software development practices, agile methodologies, DevOps culture, and AI/ML concepts. Exceptional communication and stakeholder management skills …
data platform, including data pipelines, orchestration and modelling. Lead the team in building and maintaining robust data pipelines, data models, and infrastructure using tools such as Airflow, AWS Redshift, dbt, and Looker, ensuring the team follows agile methodologies to improve delivery cadence and responsiveness. Contribute to hands-on coding, particularly in areas requiring architectural input, prototyping, or critical delivery support. Support …