… tool, providing reliable data access to users throughout Trustpilot. Design, build, maintain, and rigorously monitor robust data pipelines and transformation models using our modern data stack, including GCP, Airflow, dbt, and potentially emerging technologies like real-time streaming platforms. Develop and manage reverse ETL processes to seamlessly integrate data with our commercial systems, ensuring operational efficiency. Maintain and optimise our …
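The pipeline work described above typically pairs an orchestrator with dbt. As a minimal, hedged sketch of that pattern (not this employer's actual setup): an Airflow 2.x DAG that runs a daily dbt build via the dbt CLI, with the DAG name, schedule, and project path all hypothetical.

```python
# Hypothetical Airflow DAG orchestrating a daily dbt build.
# Assumes Airflow 2.x and the dbt CLI installed on the worker;
# the project path and target name are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_build",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_deps = BashOperator(
        task_id="dbt_deps",
        bash_command="dbt deps --project-dir /opt/dbt/project",
    )
    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command="dbt build --project-dir /opt/dbt/project --target prod",
    )
    # Install packages first, then run models and tests together
    dbt_deps >> dbt_build
```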
autonomy to shape tools, processes, and outcomes. Join a values-led team that champions innovation, collaboration, and curiosity. Responsibilities: Design, build and maintain robust data models and transformations using dbt, Snowflake, and Looker. Own integrations from multiple third-party sources (Salesforce, Segment, Amplitude, Google Ads). Translate business requirements into well-structured metrics and scalable data solutions. Enable self-serve analytics … pipelines, and data shares. Requirements: Solid experience in an Analytics Engineer or similar role within BI or Data. Advanced SQL skills and experience working with modern data stacks (Snowflake, dbt). Hands-on experience with Looker or similar BI tools (Tableau, Power BI). Strong commercial acumen - you understand both data and the business problems it's solving. Familiarity with version control …
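For the Snowflake-plus-dbt stack mentioned in this listing, a common companion task is sanity-checking a dbt-built model from Python. A small sketch, assuming the snowflake-connector-python package and a hypothetical fct_orders model; credentials and object names are placeholders.

```python
# Sketch of a row-count sanity check on a dbt-built Snowflake model.
# All connection parameters and the table name are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",   # placeholder account identifier
    user="ANALYTICS_SVC",
    password="***",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="MARTS",
)
try:
    cur = conn.cursor()
    # Check that yesterday's load landed in the (hypothetical) fct_orders model
    cur.execute(
        "select count(*) from fct_orders "
        "where order_date >= dateadd(day, -1, current_date)"
    )
    (recent_rows,) = cur.fetchone()
    print(f"fct_orders rows loaded in the last day: {recent_rows}")
finally:
    conn.close()
```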
experiments, conducting research initiatives, and improving/automating reports that the team depends on to monitor performance. Our current MarTech stack includes: Mode (our BI tool), Segment (our CDP), dbt, Redshift/BigQuery, Google Ads, Meta, Braze and more. We are looking for someone who is ready to take a business from a funnel-like, last-click attributed view of … Organic Social, Affiliate and Partnership channels. Advanced SQL capabilities for transforming, modelling and interrogating data inside databases (we use BigQuery and Redshift); experience with transformation-layer technologies (we use dbt) is a big plus. Extensive experience working with BI tools (we use Mode) to analyse and visualise data, including the ability to produce, automate and QA high-quality dashboards. An …
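Roles like this usually involve pulling channel-level marketing data straight from the warehouse. A brief sketch using the google-cloud-bigquery client; the project, dataset, and column names are invented for illustration and are not taken from the posting.

```python
# Illustrative channel-performance query against BigQuery.
# Assumes application-default credentials; schema names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    select
        channel,
        sum(spend)       as total_spend,
        sum(conversions) as total_conversions,
        safe_divide(sum(spend), sum(conversions)) as cpa
    from `my-project.marketing.daily_channel_performance`
    where event_date >= date_sub(current_date(), interval 28 day)
    group by channel
    order by total_spend desc
"""

# Run the query and print one line per channel
for row in client.query(sql).result():
    print(row.channel, row.total_spend, row.cpa)
```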
across all Octopus international regions. The role leverages a modern tech stack including SQL, Python, Airflow, Kubernetes, and various other cutting-edge technologies. You'll work with tools like dbt on Databricks, PySpark, Streamlit, and Django, ensuring robust data infrastructure that powers business-critical operations. What makes this role particularly exciting is the combination of technical depth and business impact. … Share knowledge and upskill team members. Requirements for Data Engineer: Strong aptitude with SQL, Python and Airflow. Experience in Kubernetes, Docker, Django, Spark and related monitoring tools. Experience with dbt for pipeline modelling. Ability to shape needs into requirements and design scalable solutions. Quick understanding of new domain areas and data visualisation. Team player with a project-ownership mindset. Passion for …
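To illustrate the "dbt on Databricks, PySpark" side of this role, here is a minimal PySpark aggregation sketch; the meter-reading tables and columns are hypothetical and not the employer's actual schema.

```python
# Illustrative PySpark transformation: roll raw meter readings up to a
# daily consumption table. Table and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_consumption").getOrCreate()

readings = spark.table("raw.meter_readings")  # hypothetical source table

daily = (
    readings
    .withColumn("reading_date", F.to_date("reading_ts"))
    .groupBy("account_id", "reading_date")
    .agg(F.sum("kwh").alias("kwh_consumed"))
)

# Overwrite the downstream table each run for a simple idempotent load
daily.write.mode("overwrite").saveAsTable("analytics.daily_consumption")
```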
scalable data models, streamlining reporting pipelines, and consolidating multiple data sources into a unified operational reporting layer. What You'll Do: Own MI infrastructure. Design and maintain robust dbt models and SQL pipelines to transform raw data into accurate, timely, and usable reporting layers. Ensure consistent metric definitions and a single source of truth across all operational reporting. … Must-Haves: 3+ years working in BI/MI or analytics roles, with experience in SQL-heavy, reporting-centric environments. Expert-level SQL and hands-on experience with dbt and Python (non-negotiable). Experience building and maintaining data models and reporting infrastructure. Strong understanding of data integrity, version control, and metric standardisation. Excellent communication skills for working with … with Superset/Preset or similar modern dashboarding tools. Experience in regulated environments or with regulatory reporting. Interview Process: 1. Intro Call (30 mins) 2. Take-home SQL + dbt Task (2 hours) 3. Task Debrief (45 mins to 1 hour) 4. Final Chat with COO & CRO (1 hour). The opportunity to scale up one of the world's most successful …
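The "consistent metric definitions and a single source of truth" requirement is often enforced with automated checks. Below is a hedged pandas sketch of one such check; the file names, the SLA metric, and the tolerance are all assumed for illustration.

```python
# Sketch of a metric-consistency check: recompute a headline metric from
# raw data and compare it with the value exposed by the reporting layer,
# flagging drift in metric definitions. Inputs are illustrative only.
import pandas as pd

raw = pd.read_csv("raw_cases.csv")            # placeholder raw extract
reporting = pd.read_csv("reporting_kpis.csv") # placeholder reporting extract

# Independent recalculation: share of cases resolved within SLA (%)
recomputed = (raw["resolved_within_sla"].sum() / len(raw)) * 100
published = reporting.loc[
    reporting["metric"] == "sla_resolution_rate", "value"
].iloc[0]

tolerance = 0.1  # percentage points, chosen arbitrarily for the sketch
if abs(recomputed - published) > tolerance:
    raise ValueError(
        f"SLA resolution rate drifted: recomputed {recomputed:.2f} "
        f"vs published {published:.2f}"
    )
```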
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
by the wider team. Core Requirements: Strong SQL skills for data modelling and transformation. Proven experience building and managing ETL pipelines in production. Hands-on experience with Snowflake and dbt. Proven experience as an Analytics Engineer. Experience working in a cross-functional environment with stakeholders. Excellent communication skills. What Will Make You Stand Out: Experience with ThoughtSpot, Looker, or similar …
wide variety of technical and executive audiences, both written and verbal. Preferred (but not required) to have: Hands-on experience with Python. Experience working with modern data technology (e.g. dbt, spark, containers, devops tooling, orchestration tools, git, etc.). Experience with data science and machine learning technology. People want to buy from people who understand them. Our Solutions Engineers build connections …
both written and verbal. Excellent language skills in Hebrew and English. Preferred (but not required) to have: Hands-on experience with Python. Experience working with modern data technology (e.g. dbt, spark, containers, devops tooling, orchestration tools, git, etc.). Experience with data science and machine learning technology. People want to buy from people who understand them. Our Solution Engineers build connections …
hood. In this role, you'll partner closely with teams across the business to deliver high-impact analysis and insights - while also owning the creation and maintenance of the dbt models and metrics that power those insights. This is a hybrid role that blends data analysis, stakeholder enablement, and hands-on data modelling, ideal for someone who wants to drive … business to understand their goals and turn questions into data-driven insights. Lead complex analyses and help teams make better decisions with clear, actionable recommendations. Own and evolve the dbt models that feed our reporting layer - ensuring data is reliable, well-structured, and documented. Build and maintain dashboards and visualisations in tools like Tableau. Identify and resolve data quality issues … help shape our data modelling standards and best practices. Technology: We're pragmatic about our technology choices. These are some of the things we use at the moment: Python, dbt, Tableau; PostgreSQL, BigQuery, MySQL; pytest; AWS, GCP; Docker, Terraform, GitHub, Git. How we expect you to work: Collaborate - We work in cross-functional, mission-driven, autonomous squads that gel over …
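Given the Python/pytest/dbt stack listed above, a representative task is unit-testing transformation helpers that sit upstream of the dbt and reporting layers. A small pytest sketch with a hypothetical revenue-normalisation function follows.

```python
# Minimal pytest-style unit tests for a hypothetical transformation helper
# used upstream of the dbt/reporting layer.
import pandas as pd
import pytest


def normalise_revenue(df: pd.DataFrame) -> pd.DataFrame:
    """Convert pence to pounds and drop rows with negative net revenue."""
    out = df.copy()
    out["revenue_gbp"] = out["revenue_pence"] / 100
    return out[out["revenue_gbp"] >= 0].reset_index(drop=True)


def test_normalise_revenue_drops_negative_rows():
    df = pd.DataFrame({"revenue_pence": [1250, -300, 0]})
    result = normalise_revenue(df)
    assert result["revenue_gbp"].tolist() == [12.50, 0.0]


def test_normalise_revenue_converts_pence_to_pounds():
    df = pd.DataFrame({"revenue_pence": [199]})
    assert normalise_revenue(df)["revenue_gbp"].iloc[0] == pytest.approx(1.99)
```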
The Role: Ffern is hiring a mid or senior Analytics Engineer to own projects within our analytics pipeline. The role will focus on building clean, scalable dbt models, improving pipeline reliability and helping stakeholders across the business access trusted, high-quality data. You'll report directly into our Head of Data and work closely with cross-functional teams to maintain … platform, and make improvements where needed. Your Profile: 3+ years in an Analytics Engineering or similar data role (e.g. Analyst or Data Engineer). Strong SQL and experience building with dbt. Solid understanding of transforming raw data into clean, business-ready datasets. Ability to communicate clearly with both technical and non-technical stakeholders. Experience with BI tools such as Lightdash, Looker …
Crawley, Sussex, United Kingdom Hybrid / WFH Options
Rentokil Initial plc
experience in data collection, preprocessing, and integration from various sources, ensuring accuracy, consistency, and handling missing values or outliers. Proficient in designing and implementing ELT pipelines using tools like dbt, with strong knowledge of data warehousing, data lake concepts, and data pipeline optimisation. Skilled in SQL for data manipulation, analysis, query optimisation, and database design. Artificial Intelligence and Machine Learning …
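The missing-value and outlier handling mentioned in this listing might look like the following pandas sketch; the column name, median imputation, and winsorisation thresholds are illustrative assumptions rather than the employer's actual rules.

```python
# Illustrative preprocessing step: impute missing values and cap extreme
# outliers before loading data downstream. Names and thresholds are
# assumptions for the sketch.
import pandas as pd


def clean_visit_durations(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    # Fill missing durations with the median rather than dropping rows
    median_duration = out["visit_minutes"].median()
    out["visit_minutes"] = out["visit_minutes"].fillna(median_duration)
    # Winsorise extreme outliers to the 1st/99th percentiles
    lower, upper = out["visit_minutes"].quantile([0.01, 0.99])
    out["visit_minutes"] = out["visit_minutes"].clip(lower, upper)
    return out
```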
managing the delivery of technical solutions, including exposure to agile product development methodologies. Experience building underlying data pipelines and ETL, particularly useful if done using Amazon Web Services, Adverity, dbt, etc. Knowledge and experience using other programming and/or statistical languages (e.g. Python, R, etc.). Life at WPP Media & Benefits: Our passion for shaping the next era of media …
atmosphere. What are we looking for in the ideal candidate? Proficient in data modelling, data warehousing, and advanced SQL query crafting. Skilled in ETL processes, preferably with tools like dbt and Sigma. Knowledgeable in e-commerce, digital and marketing analytics, including GA4 and BigQuery. Experienced with cloud platforms and database technologies, particularly GCP. Python proficiency is desirable, but not mandatory. … Here’s what your role will entail: Extract and integrate data from diverse sources, aligning with stakeholder needs. Consolidate data on platforms (using dbt, Sigma, Census, Gravity) for optimal stakeholder visualisation. Manage the entire ETL cycle within our tech stack. Define and monitor KPIs, maintain documentation, and uphold data governance standards. Collaborate with the Analytics Team, including the Head of …
City of London, London, United Kingdom Hybrid / WFH Options
Pulse Recruit
or analytics. Experience leading cross-functional teams and setting product vision. A commercially minded approach, always focused on business impact. Solid understanding of tools like Python, SQL, Power BI, dbt, and Google BigQuery. A desire to work in a collaborative, high-ownership environment. Curiosity about customer behaviour and a drive to uncover insights through data. 📩 Sound like you? Lauren.stuart@pulserecruit.co.uk
areas, supplementing these with top analytical tools that allow for a deep-dive. Our tech stack includes some of the best in the industry: SQL (currently on Amazon Redshift), dbt, Tableau and Python. A day in the life: Play a key role in building, optimising and innovating existing processes, setting the standard for MI and data manipulation across the business. … levels across product, credit strategy and operations, including the senior leadership team, to understand their data needs and deliver the best data solution to address them using tools such as SQL, dbt (opportunity to learn this, if you haven't already!), Python and Tableau. Collaborate with the wider analytics community across the business, such as product analytics and credit strategy, to deliver …
going above and beyond basic role requirements. Python/R. Understanding of stats and statistical modelling (measures of central tendency, understanding of variance and testing, some regression modelling). Desirable: dbt; data modelling in a warehouse context; understanding of testing and experimental design. Legally authorised to work in the UK. Some important stuff we would like you to know: To meet …
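For the "some regression modelling" requirement above, a minimal statsmodels OLS example on synthetic data shows the kind of output involved (coefficients, p-values, R-squared); the spend/signups variables are invented for the sketch.

```python
# Small OLS regression sketch on synthetic data with statsmodels.
# The spend/signups relationship is fabricated purely for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
spend = rng.uniform(100, 1000, size=200)            # e.g. weekly ad spend
signups = 5 + 0.04 * spend + rng.normal(0, 5, 200)  # synthetic outcome

X = sm.add_constant(spend)       # intercept + single predictor
model = sm.OLS(signups, X).fit()

print(model.params)    # intercept and slope estimates
print(model.pvalues)   # significance of each coefficient
print(model.rsquared)  # share of variance explained
```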
this isn't set in stone! Familiarity with digital advertising platforms (Google Ads, Facebook Ads, LinkedIn Ads, TikTok, etc.). We use Google Analytics 4, BigQuery, Fivetran, Amplitude, Metabase and dbt, amongst other things, for our data stack, so experience here would be beneficial; however, experience in similar tools (if not these ones) is a must-have. Being a SQL guru …
with demonstrable experience designing and implementing robust, performant data pipelines from a variety of sources such as databases, APIs, SFTP, etc. Experience building ELT pipelines using tools such as dbt and Airflow. Proven experience managing software engineering teams, including mentoring juniors and seniors, and promoting professional development. Demonstrated experience in project management, with the ability to manage multiple projects and …
Key Technologies & Tools: Salesforce platform (Admin, Developer, and Deployment tools); Snowflake Data Cloud; Git, Bitbucket, GitHub; Jenkins, Azure DevOps, or GitLab CI/CD; Jira, Confluence; DataOps tools (e.g., dbt, Airflow) - desirable. The Ideal Candidate: Has previously built environments for large-scale IT transformation programmes. Brings hands-on experience with automation and process optimisation. Is highly proficient in JIRA. Has …