Contribute to centralised documentation and knowledge sharing to promote transparency and collaboration. Your Skills: Significant experience working with data in a commercial or digitally native environment. Proficiency in SQL, dbt, Python, or R, with experience using BI tools such as Looker, Tableau, or Lightdash. Strong ability to translate complex business problems into analytical tasks and communicate insights effectively. Confident data …
City of London, London, United Kingdom Hybrid / WFH Options
Wave Talent
platform and data-driven culture, shape the foundations of data models, empower colleagues with clean and reliable datasets, and directly impact decision-making across the business. ✅ Must-have requirements: dbt, SQL, Python, and a strong engineering mindset with a focus on reproducibility, reliability, optimisation and model performance. 👍 Bonus points for experience with: Snowflake, Looker, AWS, product analytics, subscription models, experimentation & A/B testing.
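The listing names its stack (dbt, SQL, Python) without describing it, but as a rough illustration of the reproducibility and reliability mindset it asks for, here is a minimal Python/pandas sketch of a deterministic cleaning step with dbt-style checks. The events table, columns, and values are all hypothetical.

```python
import pandas as pd

# Hypothetical raw subscription events; in the real stack this would be a dbt
# source or staging model rather than an inline frame.
raw = pd.DataFrame({
    "event_id": [1, 1, 2, 3],
    "user_id": ["a", "a", "b", None],
    "amount_gbp": [9.99, 9.99, 19.99, 4.99],
})

def clean_subscription_events(df: pd.DataFrame) -> pd.DataFrame:
    """Deduplicate and drop rows missing a user, keeping the logic deterministic."""
    out = (
        df.drop_duplicates(subset="event_id")
          .dropna(subset=["user_id"])
          .reset_index(drop=True)
    )
    # Lightweight reliability checks, analogous to dbt's unique / not_null tests.
    assert out["event_id"].is_unique, "event_id must be unique after cleaning"
    assert out["user_id"].notna().all(), "user_id must be populated"
    return out

print(clean_subscription_events(raw))
```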
experience in designing enterprise data platforms, with at least 5 years in Snowflake. Strong expertise in SQL and data warehousing. Hands-on experience working in insurance. 3+ years of experience in dbt for data transformation. Deep understanding of Agile methodologies in a data environment. Familiarity with Power BI.
London (City of London), South East England, United Kingdom
Harrington Starr
for performance data. What You’ll Need: 5+ years’ experience in data analytics, ideally with a marketing or growth focus. Strong SQL skills and experience building robust data pipelines (dbt, Airflow). Confident using tools like Looker, Amplitude, GA, and Optimizely. A/B testing expertise and a deep understanding of marketing KPIs and attribution models. Great communication skills: able to turn …
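This is a requirements list rather than a walkthrough, but since it calls out A/B testing expertise, here is a minimal, self-contained Python sketch of the standard two-proportion z-test often used to read a conversion experiment. The traffic and conversion numbers are invented for illustration.

```python
from math import sqrt, erfc

def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference in conversion rates between variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value from the normal tail
    return z, p_value

# Hypothetical campaign numbers: control converts at 4.0%, variant at 4.6%.
z, p = two_proportion_ztest(conv_a=400, n_a=10_000, conv_b=460, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p below 0.05 would suggest a real uplift
```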
with strong command of dimensional modeling (e.g., Kimball methodology). Expertise in Google Cloud Platform, especially BigQuery architecture, optimization, and cost management. Advanced SQL skills and production-level experience using dbt (or similar tools) to build modular, testable transformation pipelines. Practical mastery of LookML and semantic layer design within Looker, including Explores, joins, derived tables, and scalability best practices. It will …
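The dimensional-modeling requirement is stated without detail; purely as an illustration of the Kimball-style star schema it refers to, here is a small Python/pandas sketch that joins a hypothetical fact table to its dimensions and aggregates, the kind of query a Looker Explore would typically surface. All table and column names are made up.

```python
import pandas as pd

# Hypothetical Kimball-style star schema: one fact table keyed to two dimensions.
fact_orders = pd.DataFrame({
    "order_id": [1, 2, 3],
    "date_key": [20240101, 20240101, 20240102],
    "customer_key": [10, 11, 10],
    "revenue": [120.0, 80.0, 50.0],
})
dim_date = pd.DataFrame({"date_key": [20240101, 20240102], "month": ["2024-01", "2024-01"]})
dim_customer = pd.DataFrame({"customer_key": [10, 11], "segment": ["retail", "wholesale"]})

# Conform the fact to its dimensions, then aggregate revenue by month and segment.
report = (
    fact_orders
    .merge(dim_date, on="date_key")
    .merge(dim_customer, on="customer_key")
    .groupby(["month", "segment"], as_index=False)["revenue"].sum()
)
print(report)
```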
in developing machine learning models using advanced techniques. Expert proficiency in SQL and databases, with the ability to write structured and efficient queries on large data sets. Experience with dbt, Python, or R is a plus. Development experience with BI platforms such as Looker, Tableau, or Power BI. Benefits included for Data Scientist - Category Management: Comprehensive medical and dental insurance.
London (City of London), South East England, United Kingdom
Gain Theory
to work with a variety of datasets from multiple sources, familiarity with standard data processing tools/concepts (e.g. SQL, ETL), and experience driving robust QA processes. Familiarity with dbt is highly valued. In-depth experience of the advertising ecosystem (e.g. ad trafficking, ad servers, DSPs, media strategy and activation, etc.) and a working knowledge of appropriate metrics, measurement, and …
backend architecture, build data pipelines, and integrate AI/LLM components into production systems. Role Breakdown: 50% Backend Engineering: FastAPI, Flask, Node.js, CI/CD. 30% Data Engineering: ETL, dbt, Airflow. 20% AI/LLM Integration: LangChain, RAG pipelines, orchestration. Key Responsibilities: Design and build backend services to support AI agent deployment. Develop scalable data pipelines and integration layers. Implement …
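The role breakdown names the building blocks (FastAPI, LangChain, RAG) without describing them, so here is a minimal, hedged Python/FastAPI sketch of how a backend endpoint might wrap a RAG-style flow. The retrieval and generation functions are deliberately stubbed placeholders rather than LangChain calls, and the route and model names are hypothetical.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Question(BaseModel):
    text: str

def retrieve_context(question: str) -> list[str]:
    # Placeholder retrieval step; a real pipeline would query a vector store
    # (the listing mentions LangChain / RAG pipelines, not a specific store).
    return ["example passage relevant to: " + question]

def generate_answer(question: str, context: list[str]) -> str:
    # Placeholder generation step standing in for an LLM call.
    return f"Answer to '{question}' using {len(context)} retrieved passage(s)."

@app.post("/ask")
def ask(q: Question) -> dict:
    context = retrieve_context(q.text)
    return {"answer": generate_answer(q.text, context)}
```

Run with, for example, `uvicorn module_name:app` (module name hypothetical) and POST a JSON body such as {"text": "..."} to /ask.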