London, South East England, United Kingdom Hybrid/Remote Options
LocalStack
or changes. Build abstractions that make it easy to plug in new service behaviour or data models. Ensure emulators work seamlessly with orchestration and infrastructure-as-code tools (e.g., dbt, Terraform, Airflow, CDKs). Gather and act on feedback from internal and external teams to prioritise high-impact integrations. Build usage analytics and telemetry to understand adoption patterns and developer …
if you have experience with any of the following: workflow orchestration tooling (e.g. Prefect, Airflow, Dagster); cloud data warehouses (e.g. BigQuery, Snowflake, Redshift); data transformation tools (e.g. dbt) and data quality frameworks (e.g. Great Expectations); backend Python frameworks (e.g. Django, FastAPI, Flask) for API development; modern data processing libraries (e.g. Polars, DuckDB); Infrastructure-as-Code (e.g. Terraform, Pulumi …
heavily on the following tools and technologies (note we do not expect applicants to have prior experience of all of them): Google Cloud Platform for all of our analytics infrastructure; dbt and BigQuery SQL for our data modelling and warehousing; Python for data science; Go to write our application code; AWS for most of our backend infrastructure. You should apply if …
Advanced proficiency in Python and SQL, with experience in spatial SQL and PostGIS. Familiarity with spatial tools and libraries (GeoPandas, QGIS) and feature engineering concepts. Experience with data modelling, dbt, and version control (Git). Knowledge of spatial datasets (MasterMap, AddressBase, Land Registry). Desired: experience with WMS/WFS services, graph theory (NetworkX), GDAL, and Snowflake. Nice to have …
SQL and relational databases (e.g., PostgreSQL, DuckDB). Experience with the modern data stack, building data ingestion pipelines and working with ETL and orchestration tools (e.g., Airflow, Luigi, Argo, dbt), big data technologies (Spark, Kafka, Parquet), and web frameworks for model serving (e.g. Flask or FastAPI). Data Science: familiarity or experience with classical NLP techniques (BERT, topic modelling, summarisation …
data-driven and less data-busy. Key Responsibilities: Client Delivery. Design, validate, and optimise Data Vault 2.0 architectures across Snowflake, Databricks, and BigQuery environments. Provide best-practice guidance on dbt modelling, testing frameworks, and macros. Define governance and metadata standards (naming, access, lineage, compliance) suited to regulated industries like higher education and retail. Recommend orchestration and deployment strategies using Azure …
and data governance (metadata/business catalogs). 4. Knowledge of at least one of the following technologies/methodologies will be an additional advantage: Python, Streamlit, Matillion, DBT, Atlan, Terraform, Kubernetes, Data Vault, Data Mesh. 5. Ability to engage with principal data architects of client stakeholders. 6. Excellent presentation and communication skills. This role will require regular/ …
experience in developing architectural strategies and blueprints for hybrid and cloud-native solutions. ELT/ETL Frameworks & Pipelines. Essential: develop robust ELT/ETL pipelines using tools like Apache Airflow, DBT, AWS Glue, Azure Data Factory, or Kafka Connect. Desirable: optimise data transformations for performance, reusability, and modular design (e.g., using SQL/Scala/Python). Disclosure and Barring Service …
this knowledge to develop data platform solutions for Recursion. Excitement to learn parts of our tech stack that you might not already know. Our current tech stack includes: Python, dbt, Prefect, BigQuery, Datastream, Fivetran, PostgreSQL, GCS, Kubernetes, CI/CD, Infrastructure as Code. Our cloud services are provided by Google Cloud Platform. Experience working collaboratively on projects with significant ambiguity …
South East London, London, United Kingdom Hybrid/Remote Options
Stepstone UK
tracking code. Implement TMS (Tealium iQ, Adobe Analytics, GTM and Adobe Dynamic Tag Manager) changes. Integrate data sources via web and REST APIs. Data pipelining and modelling using SQL, DBT, Airflow, ETL, Data Warehousing, Redshift and Python. Transfer knowledge of the business processes and requirements to the development teams. Collaborate with Product, Marketing and Development teams to collect business requirements …
high-quality data products. What you'll bring: 3-5 years' experience as a Data Engineer. Strong SQL and Python skills. Proven experience building modern ETL/ELT pipelines (dbt experience ideal). Experience with data orchestration tools (Prefect preferred). Understanding of data modelling, especially event-driven architectures. Knowledge of modern data engineering development practices. Nice to have: background in InsurTech …
London, South East, England, United Kingdom Hybrid/Remote Options
Oscar Technology
that influence business performance. Ability to operate effectively in a fast-paced, scaling organisation. Excellent communication skills and a collaborative approach. Desirable Skills: experience with analytics engineering tools (e.g., dbt, Airflow). Familiarity with experimentation frameworks and A/B testing platforms. Exposure to cloud-based data environments (e.g., Google Cloud Platform). Experience within digital consumer businesses or online …
to how data is used to drive commercial decisions - particularly around pricing, revenue, and customer insight. Key responsibilities include: manage and maintain the company's data warehouse (Python, Airflow, DBT, Kimball). Ensure data pipelines are robust, accurate, and performant. Maintain and develop cloud infrastructure using Infrastructure as Code (Terraform). Identify opportunities to improve data processes, architecture, and efficiency. Support the …
business transitions to the cloud. The company is adopting Salesforce as its SaaS platform and Snowflake as its data platform, supported by a modern data stack including Fivetran, Atlan, DBT, and Airflow. Reporting directly to the Chief Data Officer, you'll lead a team of three, covering Data Quality, Data Governance, and Data Literacy. You'll also act as a …
London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
LTV, retention, and experimentation. Experience with a data visualisation or BI tool (they use Mode, but are open to others). Familiarity with tools such as Amplitude, Python, R, or dbt is a plus but not essential. A confident communicator able to translate data into stories and influence stakeholders. A proactive self-starter who thrives in a fast-paced, scaling environment. WHY APPLY …
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
to generative AI platforms or a passion for building AI-powered solutions. Ability to lead client delivery in dynamic, fast-paced environments. Familiarity with tools like Airflow, Databricks or DBT is a plus. What's on offer: salary up to £75,000; bonus and equity options; hybrid working (3 days in a vibrant central London office, 2 days remote). This …
including ERPs and CRMs. You'll collaborate closely with client stakeholders to translate ambiguous requirements into clean, maintainable solutions that drive real impact. Familiarity with tools such as Airflow, DBT, Databricks, dashboarding frameworks, and TypeScript is a strong plus as you help deliver end-to-end production-ready systems. Interview Process: Teams conversation (introductory chat); technical take-home exercise; presentation …
London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
/retention analysis. Background in subscription-based, marketplace or tech-driven environments. A confident communicator and self-starter, able to influence non-technical stakeholders. Experience with tools like Mode, Amplitude, DBT, Python or R is beneficial (not essential). Why apply? Equity with an upcoming exit, with real potential for payout in 1-2 years. Fully remote with a collaborative, friendly, pet-loving …
London, South East England, United Kingdom Hybrid/Remote Options
Wise
of stats and statistical modelling. Data visualisation and storytelling ability. Ability to self-organise and manage stakeholders. Demonstration of impact/going above and beyond basic role requirements. Desirable: DBT; data modelling in a warehouse context. Understanding of testing and experimental design. Legally authorised to work in the UK. Some important stuff we would like you to know: to meet …
championing a unified approach to data. YOUR SKILLS AND EXPERIENCE: Significant experience working with data in an analytical role, ideally in a consumer/tech company. Proficiency in SQL, dbt, Python, R and BI tools such as Looker, Lightdash or Tableau. Ability to translate business problems into analytical tasks and communicate insights clearly at all levels. A/B testing/experimentation …
and SQL fundamentals. The ability to get to grips with new technologies quickly. What's Nice to Have: experience in dashboarding tools, TypeScript and API development. Familiarity with Airflow, DBT, Databricks. Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. Benefits: become part of a category-defining startup which is transforming how industrial companies operate and compete …
London, South East England, United Kingdom Hybrid/Remote Options
Mention Me
that move us towards our vision of scaling up through product-led growth. This role will be focused on our backend system (Symfony, PHP) and our data products (BigQuery, DBT, Airflow), but there will be opportunities to work across the platform including agentic AI (Python, LangChain), frontend (React, TypeScript), the APIs (GraphQL, REST), our integration tool of choice ) and all … within a production-strength SaaS platform based on best-practice architectures, providing high availability (last year we had 99.97% uptime). Experience building and managing data transformations using DBT (Data Build Tool). Ability to work across our stack - experience in JavaScript, React, TypeScript would be a plus - the platform we use to build and scale our suite of integrations. Experience integrating …
in developing machine learning models using advanced techniques. Expert proficiency in SQL and databases, with the ability to write structured and efficient queries on large data sets. Experience with dbt, Python or R is a plus. Development experience with BI platforms such as Looker, Tableau, or Power BI. Benefits included for Data Scientist - Category Management: comprehensive medical and dental insurance. …
I'm looking for someone who is excited to work in a product-focused, consumer-facing environment, and who brings: strong SQL; Python/R experience preferable; experience with dbt and modern data stacks a bonus; a proven ability to turn data into actionable insights and communicate them clearly to stakeholders; experience in a B2C or consumer tech environment, ideally …