London, South East England, United Kingdom Hybrid/Remote Options
LocalStack
… or changes. Build abstractions that make it easy to plug in new service behaviour or data models. Ensure emulators work seamlessly with orchestration and infrastructure-as-code tools (e.g., dbt, Terraform, Airflow, CDKs). Gather and act on feedback from internal and external teams to prioritize high-impact integrations. Build usage analytics and telemetry to understand adoption patterns and developer …
… if you have experience with any of the following:
- Workflow orchestration tooling (e.g. Prefect, Airflow, Dagster)
- Cloud data warehouses (e.g. BigQuery, Snowflake, Redshift)
- Data transformation tools (e.g. dbt) and data quality frameworks (e.g. Great Expectations)
- Backend Python frameworks (e.g. Django, FastAPI, Flask) for API development
- Modern data processing libraries (e.g. Polars, DuckDB)
- Infrastructure-as-Code (e.g. Terraform, Pulumi) …
Sheffield, South Yorkshire, England, United Kingdom Hybrid/Remote Options
DCS Recruitment
… data tools such as Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (e.g., Kafka, Spark Streaming, Flink) is an advantage. Experience with orchestration and infrastructure tools such as Airflow, dbt, Prefect, CI/CD pipelines, and Terraform. What you get in return: up to £60,000 per annum + benefits; hybrid working (three days in the office); the opportunity to lead and mentor …
Manchester, Lancashire, England, United Kingdom Hybrid/Remote Options
AJ Bell
… policies. Maintain the data dictionary. Maintain the business-level data model. Recommend and introduce new technology where needed. Core:
- Cloud data platforms (e.g. Snowflake, BigQuery, Redshift)
- Data transformation technology such as dbt
- Visual Studio Code
- Python
- CI automation systems such as Jenkins
- A git-based source control system such as Bitbucket
- Data warehouse/Kimball methodology
- Data replication technology such as Fivetran …
Edinburgh, Midlothian, United Kingdom Hybrid/Remote Options
Aberdeen
… systems. Good understanding of security, compliance, and agile delivery practices. Passionate about learning new technologies and collaborating with teams, including mentoring junior engineers. Nice-to-have experience with Snowflake, dbt, Terraform, and Azure DevOps CI/CD pipelines. We are proud to be a Disability Confident Committed employer. If you have a disability and would like to apply to one …
… heavily on the following tools and technologies (note: we do not expect applicants to have prior experience of all of them):
- Google Cloud Platform for all of our analytics infrastructure
- dbt and BigQuery SQL for our data modelling and warehousing
- Python for data science
- Go to write our application code
- AWS for most of our backend infrastructure
You should apply if …
… Advanced proficiency in Python and SQL, with experience in spatial SQL and PostGIS. Familiarity with spatial tools and libraries (GeoPandas, QGIS) and feature engineering concepts. Experience with data modelling, dbt, and version control (Git). Knowledge of spatial datasets (MasterMap, AddressBase, Land Registry). Desired: experience with WMS/WFS services, graph theory (NetworkX), GDAL, and Snowflake. Nice to have …
… SQL and relational databases (e.g., PostgreSQL, DuckDB). Experience with the modern data stack: building data ingestion pipelines and working with ETL and orchestration tools (e.g., Airflow, Luigi, Argo, dbt), big data technologies (Spark, Kafka, Parquet), and web frameworks for model serving (e.g. Flask or FastAPI). Data science: familiarity or experience with classical NLP techniques (BERT, topic modelling, summarisation …
… SQL and dimensional data modelling (SCDs, fact/dim tables, conformed dimensions). Experience with PostgreSQL optimisation. Advanced Python skills. ETL/ELT pipelines: hands-on experience building pipelines using SSIS, dbt, Airflow, or similar. Strong understanding of enterprise ETL frameworks, lineage, and data quality. Cloud & infrastructure: experience designing and supporting AWS-based analytical infrastructure. Skilled in working with S3 and integrating …
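The posting above asks for slowly changing dimensions (SCDs) without naming a type; as a minimal illustration, the commonly used Type 2 pattern preserves history by closing the current row and appending a new one rather than overwriting. Everything below (table shape, column names, the change being tracked) is an illustrative sketch, not taken from the job ad.

```python
from datetime import date

# Minimal Type 2 SCD sketch: when a tracked attribute changes, expire
# the current dimension row and append a new current row, so history
# is queryable by validity date range. All names are hypothetical.

def scd2_apply(dimension, incoming, today):
    """Apply one incoming record to a list of dimension rows (dicts)."""
    for row in dimension:
        if row["customer_id"] == incoming["customer_id"] and row["is_current"]:
            if row["city"] == incoming["city"]:
                return dimension  # no change, nothing to do
            # expire the current version of this customer
            row["is_current"] = False
            row["valid_to"] = today
            break
    # insert the new current version
    dimension.append({
        "customer_id": incoming["customer_id"],
        "city": incoming["city"],
        "valid_from": today,
        "valid_to": None,
        "is_current": True,
    })
    return dimension

dim = [{"customer_id": 1, "city": "Leeds",
        "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True}]
dim = scd2_apply(dim, {"customer_id": 1, "city": "York"}, date(2024, 6, 1))
current = [r for r in dim if r["is_current"]]
print(len(dim), current[0]["city"])  # 2 York
```

In a warehouse this would typically be a MERGE statement or a dbt snapshot rather than Python, but the row lifecycle is the same.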
Edinburgh, City of Edinburgh, United Kingdom Hybrid/Remote Options
Cathcart Technology
… overall data strategy. The ideal person for this role will have a strong background in data engineering, with experience building modern data solutions using technologies like Kafka, Spark, Databricks, dbt, and Airflow. You'll know your way around cloud platforms (AWS, GCP, or Azure) and be confident coding in Python, Java, or Scala. Most importantly, you'll understand …
Employment Type: Permanent
Salary: £80000 - £100000/annum Bonus, Pension and Shares
… data-driven and less data-busy. Key responsibilities. Client delivery: design, validate, and optimise Data Vault 2.0 architectures across Snowflake, Databricks, and BigQuery environments. Provide best-practice guidance on dbt modelling, testing frameworks, and macros. Define governance and metadata standards (naming, access, lineage, compliance) suited to regulated industries like higher education and retail. Recommend orchestration and deployment strategies using Azure …
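Data Vault 2.0, named above, conventionally keys hub and link tables on deterministic hashes of business keys rather than platform sequence numbers, so the same key hashes identically across Snowflake, Databricks, and BigQuery. The sketch below shows that idea only; the trim/upper-case normalisation, delimiter, and MD5 choice are common community conventions, not details from this posting.

```python
import hashlib

# Data Vault style hash key sketch: normalise each business key part
# (trim + upper-case is a widespread convention), join with a fixed
# delimiter, then hash. Deterministic, so loads on different platforms
# produce the same key. Illustrative only.

def hub_hash_key(*business_key_parts, delimiter="||"):
    normalised = delimiter.join(str(p).strip().upper() for p in business_key_parts)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

hk1 = hub_hash_key(" cust-001 ")
hk2 = hub_hash_key("CUST-001")   # differs only in whitespace/case
hk3 = hub_hash_key("CUST-002")
print(hk1 == hk2, hk1 == hk3)  # True False
```

In practice the same normalisation would be generated as SQL (often via dbt macros, as the posting hints) so Python and warehouse loads agree on the key.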
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
… business deliverables.
* Collect, transform, and process datasets from various internal and external sources, ensuring data quality, governance, and integrity.
* Implement efficient data models and schemas within Snowflake, and use dbt for transformation, orchestration, and workflow management.
* Optimise ELT/ETL processes for improved performance, cost efficiency, and scalability.
* Troubleshoot and resolve data pipeline issues swiftly and effectively across the data …
… and data governance (metadata/business catalogues). 4. Knowledge of at least one of the following technologies/methodologies will be an additional advantage: Python, Streamlit, Matillion, dbt, Atlan, Terraform, Kubernetes, Data Vault, Data Mesh. 5. Ability to engage with principal data architects of client stakeholders. 6. Excellent presentation and communication skills. This role will require regular …
… experience in developing architectural strategies and blueprints for hybrid and cloud-native solutions. ELT/ETL Frameworks & Pipelines. Essential: develop robust ELT/ETL pipelines using tools like Apache Airflow, dbt, AWS Glue, Azure Data Factory, or Kafka Connect. Desirable: optimise data transformations for performance, reusability, and modular design (e.g., using SQL/Scala/Python). Disclosure and Barring Service …
South East London, London, United Kingdom Hybrid/Remote Options
Stepstone UK
… and in event-driven and eventual-consistency systems using Kafka, .NET, Java, REST APIs, AWS, Terraform and DevOps. Nice to have: experience in data pipelining and modelling using SQL, dbt, ETL, data warehousing, Redshift and Python, and an ecommerce and mobile applications background. Additional information: we're a community here that cares as much about your life outside work as how you …
Bristol, Avon, South West, United Kingdom Hybrid/Remote Options
IO Associates
… specialists to deliver efficient, high-quality solutions. Critical skills:
- Extensive experience with Databricks (including Spark, Delta Lake, and MLflow)
- Proficiency in ETL/ELT development and orchestration tools (dbt, Airflow, or similar)
- Hands-on experience with cloud platforms (AWS, Azure, or GCP)
- Strong knowledge of SQL, Python, and PySpark for data processing
- Familiarity with CI/CD …
… this knowledge to develop data platform solutions for Recursion. Excitement to learn parts of our tech stack that you might not already know. Our current tech stack includes: Python, dbt, Prefect, BigQuery, Datastream, Fivetran, PostgreSQL, GCS, Kubernetes, CI/CD, Infrastructure as Code. Our cloud services are provided by Google Cloud Platform. Experience working collaboratively on projects with significant ambiguity …
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid/Remote Options
Cathcart Technology
… foster a collaborative, forward-thinking culture where engineers work on large-scale datasets, distributed systems, and modern cloud platforms. Teams leverage cutting-edge tools such as Spark, Airflow, Kafka, dbt, and Databricks to build resilient, scalable, and high-quality data solutions. Why this role?
** Lead a talented team of data engineers delivering high-quality, reliable data systems.
** Shape the architecture …
London, South East, England, United Kingdom Hybrid/Remote Options
Searchability
… giving express consent for us to process (subject to required skills) your application to our client in conjunction with this vacancy only. KEY SKILLS: GCP | Python | SQL | MongoDB | Airflow | dbt | Terraform | Docker | ETL | AI | Machine Learning …
Birmingham, West Midlands, England, United Kingdom
Harnham - Data & Analytics Recruitment
… Work within an Azure-based environment (preferred). Provide input into best practices across the data function and help "keep the lights on." Tech stack. Core: Python, SQL, Airflow, dbt, Terraform, CI/CD, Power BI. Nice to have: Kimball methodology. Cloud: Azure (preferred). What they're looking for: 2-4 years' experience in a Data Engineering role. Strong SQL …
WC2H 0AA, Leicester Square, Greater London, United Kingdom Hybrid/Remote Options
Youngs Employment Services
… /Fabric semantic models. Ability to work with/alongside stakeholders with their own operational pressures. Able to follow best practices and adapt, even without established iteration management. Desirable: dbt; previous experience of working with D365 data; API integration (ideally OData); basic accounting knowledge; Snowflake. This assignment is signed off and approved with the hope of getting someone started ASAP …
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid/Remote Options
IO Associates
… reliable, accessible, and actionable. This role is 'remote first' and falls OUTSIDE IR35. Key responsibilities. Data pipeline development: design, build, and maintain robust, scalable, and efficient data pipelines using dbt, SQL, Python, and ETL frameworks. Data modelling: develop and maintain logical and physical data models to support reporting and analysis needs, implementing best practices for data warehouse design. Data transformation …
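The pipeline responsibilities above follow the familiar extract/validate/transform/load split. As a minimal pure-Python sketch of that shape (dbt and any real ETL framework are replaced by plain functions; the data, column names, and quality rule are all made up for illustration):

```python
# Minimal ETL pipeline sketch: extract rows, reject ones that fail a
# quality check (a stand-in for the kind of tests dbt or a data-quality
# framework would run), transform, then "load" into an in-memory store.
# Entirely illustrative; nothing here comes from the job posting.

def extract():
    return [
        {"order_id": "A1", "amount": "19.99"},
        {"order_id": "A2", "amount": "5.00"},
        {"order_id": None, "amount": "3.50"},  # bad row: missing key
    ]

def validate(rows):
    good = [r for r in rows if r["order_id"] is not None]
    return good, len(rows) - len(good)  # rows kept, rows rejected

def transform(rows):
    # convert string amounts to integer pence for safe arithmetic
    return [{"order_id": r["order_id"],
             "amount_pence": round(float(r["amount"]) * 100)}
            for r in rows]

def load(rows, warehouse):
    for r in rows:
        warehouse[r["order_id"]] = r
    return warehouse

warehouse = {}
good, rejected = validate(extract())
load(transform(good), warehouse)
print(rejected, warehouse["A1"]["amount_pence"])  # 1 1999
```

In a real stack each stage maps onto a tool from the posting: an orchestrator schedules extract, dbt models own transform and the tests, and load targets the warehouse.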
South East London, London, United Kingdom Hybrid/Remote Options
Stepstone UK
… tracking code. Implement TMS (Tealium iQ, Adobe Analytics, GTM and Adobe Dynamic Tag Manager) changes. Integrate data sources via web and REST APIs. Data pipelining and modelling using SQL, dbt, Airflow, ETL, data warehousing, Redshift and Python. Transfer knowledge of the business processes and requirements to the development teams. Collaborate with Product, Marketing and Development teams to collect business requirements …
London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
… the ongoing evolution of the platform, including real-time and streaming capabilities. YOUR SKILLS AND EXPERIENCE. You'll need: strong experience with SQL and Python. Hands-on experience with dbt or similar transformation frameworks. Familiarity with Git, CI/CD, and automated testing for data quality. Excellent collaboration and communication skills with both technical and non-technical stakeholders. A proactive …