Edinburgh, Midlothian, United Kingdom Hybrid/Remote Options
Aberdeen
join our Data & Analytics team. This role is instrumental in delivering clean, modern, and efficient data solutions across cloud-native platforms. Key Responsibilities Develop solutions across Snowflake, Azure, and DBT platforms. Lead migration and optimisation of applications using Azure cloud-native services. Write clean, testable, and maintainable code following industry standards. Implement CI/CD pipelines and test automation using …
enterprise environments. 3+ years of experience with AI/ML and AI-enabled analytics (LLMs, RAG, agents). Strong hands-on coding skills with Spark (PySpark/Scala), SQL, dbt, and modern data platforms (Databricks, Snowflake, BigQuery). Experience with cloud platforms (AWS preferred). Proven expertise in BI and semantic modeling using tools such as Power BI, Tableau, Looker …
make customer-centric work possible - providing foundational tooling like address lookup and consent capture, and offering more advanced capabilities like our internal event stream platform and a fully-managed DBT environment. Our services help teams track behavioural signals, enrich customer profiles, and transform raw data into decision-ready insight - all while maintaining a strong focus on data protection. We primarily …
South East London, London, United Kingdom Hybrid/Remote Options
Stepstone UK
and in event-driven and eventual consistency systems using Kafka, .Net, Java, REST APIs, AWS, Terraform and DevOps Nice to have: experience in data pipelining and modelling using SQL, DBT, ETL, Data Warehousing, Redshift and Python, and ecommerce and mobile applications background Additional Information We're a community here that cares as much about your life outside work as how you …
analytics solutions using Databricks within a secure environment Critical Skills Extensive experience with Databricks (Spark, Delta Lake, and MLflow). Proficiency in ETL/ELT development and orchestration tools (DBT, Airflow, or similar). Hands-on experience with cloud platforms (AWS, Azure, or GCP). Solid understanding of SQL, Python, and PySpark for data processing. Familiarity with CI/CD …
quality and governance. Working directly with PMs, analysts and tech to deliver data-driven solutions. Tech you'll use: PowerShell, SQL, Python, Snowflake, Power BI, APIs, GitHub, ETL tools (dbt/Matillion), Bloomberg/SimCorp data. Perfect for someone who enjoys both engineering and the business impact of their work.
Engineering. What You Bring Advanced experience in Python and SQL Strong experience building ETL/ELT pipelines and data transformations Hands-on experience with orchestration frameworks (Dagster, Airflow, Prefect, dbt) Ability to analyse and structure complex datasets Strong communication skills with both technical and non-technical colleagues Desirable Experience with Spark, Dask, or Polars Experience with containerisation (Docker, Swarm, Kubernetes …
Edinburgh, Scotland, United Kingdom Hybrid/Remote Options
Cathcart Technology
foster a collaborative, forward-thinking culture where engineers work on large-scale datasets, distributed systems, and modern cloud platforms. Teams leverage cutting-edge tools such as Spark, Airflow, Kafka, dbt, and Databricks to build resilient, scalable, and high-quality data solutions. Why this role? ** Lead a talented team of data engineers delivering high-quality, reliable data systems. ** Shape the architecture …
London, South East, England, United Kingdom Hybrid/Remote Options
Searchability
giving express consent for us to process (subject to required skills) your application to our client in conjunction with this vacancy only. KEY SKILLS: GCP | Python | SQL | MongoDB | Airflow | dbt | Terraform | Docker | ETL | AI | Machine Learning
similar role, preferably within UK consumer lending, fintech, or financial services Hands-on experience with building data pipelines, use of SQL, Python, and relevant data engineering tools (e.g., Snowflake, dbt, third-party ingestion tools, Dagster). Comfortable working with transactional data, credit bureau data, and open banking APIs. Understand the importance of and implement solutions for data quality, lineage, and …
as a Data Engineer (or similar), ideally in consumer lending, fintech, or financial services. Strong hands-on skills in SQL, Python, and modern data engineering tools such as Snowflake, dbt, and Dagster. Experience handling transactional data, credit bureau data, or open banking APIs. Understanding of data quality, lineage, and governance in regulated environments. Comfortable working cross-functionally and turning raw …
in data engineering roles, preferably for a customer facing data product Expertise in designing and implementing large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar Strong programming skills in languages such as SQL, Python, Go or Scala Demonstrable use and an understanding of effective use of AI tooling in your development …
City of London, London, United Kingdom Hybrid/Remote Options
Identify Solutions
Tech stack and requirements: Python – must-have, with production-grade engineering expertise FastAPI – nice to have Google Cloud Platform (GCP) – must-have, i.e. BigQuery, Cloud Run, and Cloud Storage dbt – strong advantage, with scope to shape best practice Airflow/Composer – highly desirable for orchestration CI/CD – GitHub Actions (or similar) Salary: £75,000 - £85,000 + Equity, PMI …
Stroud, England, United Kingdom Hybrid/Remote Options
Ecotricity
person that actively looks for continual improvement opportunities. Knowledge and skills Experience as a Data Engineer or Analyst Databricks/Apache Spark SQL/Python Bitbucket/GitHub. Advantageous: dbt, AWS, Azure DevOps, Terraform, Atlassian (Jira, Confluence) About Us What's in it for you... Healthcare plan, life assurance and generous pension contribution Volunteering day Hybrid working Various company discounts …
of experience in data engineering or data science - You’re very well-versed in SQL and Python - You have experience working with data systems and architecture such as AWS, DBT, or similar - You're confident both getting stuck in with data processes and tasks but also thinking strategically about how data can help drive businesses forward - You’re well organised …
WC2H 0AA, Leicester Square, Greater London, United Kingdom Hybrid/Remote Options
Youngs Employment Services
/Fabric Semantic Models Ability to work with/alongside stakeholders with their own operational pressures Able to follow best practices and adapt, even without established iteration management Desirable DBT Previous experience of working with D365 data API Integration (OData ideally) Basic accounting knowledge Snowflake This assignment is signed off and approved with the hope of getting someone started ASAP …
a data or analytics transformation context. Excellent communication and stakeholder management skills, with the ability to translate technical concepts into business outcomes. Familiarity with Google Cloud ecosystem (BigQuery, Dataflow, dbt, Airflow) is a strong advantage. Experience in consumer brands, retail, or eCommerce is preferred. Ideal Candidate You are an experienced data delivery lead who thrives on enabling analytics transformation. You …
and cloud infrastructure (Azure/GCP) Requirements: 3+ years in data engineering or MLOps Strong Python and exposure to ML/AI libraries (e.g. PyTorch, HuggingFace) Proficient with Airflow, DBT, Docker, Kubernetes Cloud deployment experience (Azure preferred) Backend/API development experience with client-facing exposure Interested? Please apply below.
City of London, London, United Kingdom Hybrid/Remote Options
Tembo
About you Proven experience building a data platform from scratch in a fast-growing tech or fintech environment Strong technical background in SQL, Python, and modern data tooling (e.g. dbt, Airflow, Snowflake, BigQuery, Looker/Tableau) Comfortable joining disparate data sources into a coherent schema and single source of truth Analytical depth — able to design experiments, measure impact, and translate …