EXPERIENCE: The ideal Head of Data Platform will have:
- Extensive experience with Google Cloud Platform (GCP), particularly BigQuery
- Proficiency with a modern data tech stack, including SQL, Python, Airflow, dbt, Dataform and Terraform
- Experience in a mid-to-large-sized company within a regulated industry, with a strong understanding of data governance
- A strategic mindset, leadership skills, and a hands-on approach
About Airwallex Airwallex is the only unified payments and financial platform for global businesses. Powered by our unique combination of proprietary infrastructure and software, we empower over 150,000 businesses worldwide - including Brex, Rippling, Navan, Qantas, SHEIN and many more.
best practices for performance optimisation, scalability, and cost management, empowering teams to access and utilise data seamlessly.
- Streamline Data Pipelines: Lead the development and optimisation of data pipelines using dbt, enabling faster and more reliable data flows.
- Enhance Data Governance and Quality: Design and implement robust data governance frameworks, ensuring high data quality, compliance, and consistency.
- Develop Scalable Data Models
tool, providing reliable data access to users throughout Trustpilot Design, build, maintain, and rigorously monitor robust data pipelines and transformative models using our modern data stack, including GCP, Airflow, dbt, and potentially emerging technologies like real-time streaming platforms Develop and manage reverse ETL processes to seamlessly integrate data with our commercial systems, ensuring operational efficiency Maintain and optimise our
paced project environments Understanding of data governance and compliance (e.g., GDPR) Experience in industries with large consumer marketplaces (e.g., programmatic media, travel, financial services) is a plus Our Tech Stack: AWS Redshift, Postgres, DBT, Power BI Note: While our current environment is AWS, we are transitioning to Azure for new development. Why Join Us? At Kantar, you'll be part of a global leader
directly to a Lead Analytics Consultant. Your day-to-day will include helping clients build out efficient, scalable analytics stacks, delivering insights, and consulting on modern data tools like dbt, BigQuery, and Looker. As an Analytics Consultant, you'll be instrumental in enabling data-led decision-making and making a significant impact on business strategy across diverse industries. The Opportunity … you'll: Work directly with clients to understand their data goals and guide them in building modern, cloud-based analytics solutions. Design and implement ELT pipelines using tools like dbt and Fivetran. Develop dashboards and reporting solutions using Looker or similar BI tools. Participate in discovery sessions, stakeholder workshops, and training engagements. Collaborate with a team that thrives on knowledge … Pension scheme and private health cover Skills and Experience Must-Have: 2+ years' experience in an analytics or data consultancy role Proficiency in SQL and data modelling (preferably with dbt) Hands-on experience with cloud data warehouses (BigQuery, Snowflake, Redshift) Familiarity with BI tools (Looker, Power BI, Tableau, etc.) Excellent communication skills - able to simplify technical concepts for non-technical
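The ELT pattern these consultancy roles describe (land raw data in the warehouse first, then transform it in place with SQL, the way dbt does) can be sketched in miniature. This is an illustrative sketch only: it uses Python's built-in sqlite3 as a stand-in for a cloud warehouse, and the table and column names are invented, not taken from any listing.

```python
import sqlite3

# Hypothetical raw rows, as an extract/load tool such as Fivetran might land them
raw_orders = [
    ("2024-01-05", "GBP", "129.99"),
    ("2024-01-06", "GBP", "54.50"),
    ("2024-01-06", "USD", "200.00"),
]

conn = sqlite3.connect(":memory:")  # stand-in for BigQuery / Snowflake / Redshift
conn.execute("CREATE TABLE raw_orders (order_date TEXT, currency TEXT, amount TEXT)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", raw_orders)

# The "T" of ELT: a dbt-style staging model, expressed as SQL run inside the warehouse
conn.execute("""
    CREATE TABLE stg_orders AS
    SELECT order_date,
           currency,
           CAST(amount AS REAL) AS amount  -- type the raw string column
    FROM raw_orders
""")

# A downstream mart model built on the staging layer
daily = conn.execute("""
    SELECT order_date, currency, SUM(amount) AS revenue
    FROM stg_orders
    GROUP BY order_date, currency
    ORDER BY order_date, currency
""").fetchall()
print(daily)
# → [('2024-01-05', 'GBP', 129.99), ('2024-01-06', 'GBP', 54.5), ('2024-01-06', 'USD', 200.0)]
```

In a real dbt project each SELECT would live in its own model file and dbt would manage the CREATE statements, dependencies, and tests; the point here is only the layering of raw → staging → mart.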
Comfortable partnering with analytics teams to define success metrics, measure impact, and learn from outcomes High-level understanding of a modern data team's stack - we use Fivetran, Snowflake, dbt, Mixpanel and Omni An interest in infrastructure and observability, especially in complex, event-driven systems (we run on AWS) Familiarity with MongoDB or experience working with healthcare data systems and … working full stack. Using CI/CD we automate deployment onto Serverless architecture. Our platform data is stored in MongoDB and we have extensive analytics tooling using Fivetran, Snowflake, dbt, Mixpanel and Omni to enable data-driven decisions across the business. We have robust monitoring, logging and reporting using AWS CloudWatch and Sentry, and collaborate using Git, Slack and Notion.
modern AI tools and models across the entire data and engineering stack. Tech Stack AI Engineering: Python, LangChain, LlamaIndex, LangGraph AI Models: OpenAI, Anthropic, Gemini, Custom Models Data Tools: DBT, BigQuery Databases: PostgreSQL, OpenSearch, Redis Product Engineering: TypeScript (Full-stack) Infrastructure: Serverless, AWS, Google Cloud, GitHub Actions Requirements Strong academic background in a relevant field Proven track record of delivering
TensorFlow, Hugging Face) Proven MLOps, big data, and backend/API development experience Deep understanding of NLP and LLMs Proficient with cloud platforms (AWS/GCP/Azure), Airflow, DBT, Docker/Kubernetes Strong collaboration, problem-solving, and coding best practices Nice to have: LLM fine-tuning, streaming data, big data warehousing, open-source contributions.
mentor some of the brightest minds and friendliest people in the industry! - You work with some of the best tech in the industry, such as AWS SageMaker for ML, DBT for data processing along with standard analytical tools like Python, Redshift, Tableau etc. - All our decisioning systems are built in-house, meaning that we have nearly unlimited flexibility in what
Nottingham, Nottinghamshire, England, United Kingdom Hybrid / WFH Options
Avanti
Data Engineer Azure - DBT - Midlands - Rapid Growth Scale-Up About the Company Join a fast-growing Data Analytics software house that is shaking up the analytics market. In just 5 years, they've grown from a startup to £11M turnover and offices across the UK and internationally. Helping clients rapidly achieve actionable insights from their data, they have outpaced the … data engineering solutions that help clients become self-sufficient with their data. What makes this role special: Direct client impact - see your work drive real business decisions Cutting-edge DBT implementation (key differentiator for the business) Variety of projects across different industries and use cases Opportunity to work with the latest Microsoft Fabric technology Essential Skills: DBT experience (required) Strong … creates advancement opportunities Exposure to latest data technologies and methodologies International expansion opportunities Work with cutting-edge AI implementations If you are an Azure Data Engineer with experience of DBT and would like to work in an exciting, fast-paced data company then this could be a great role for you. APPLY TODAY FOR IMMEDIATE CONSIDERATION AND INTERVIEW IN
Substantial experience designing and implementing data solutions on Microsoft Azure Hands-on expertise with Snowflake, including data modelling, performance optimisation, and secure data sharing practices. Proficiency in DBT (Data Build Tool), with a strong understanding of modular pipeline development, testing, and version control. Familiarity with Power BI, particularly in integrating data models to support both enterprise reporting and self-service … analytics. Candidates must demonstrate experience working in Agile environments, delivering in iterative cycles aligned to business value. Tech Stack: Azure, Power BI, DBT, Snowflake About us esynergy is a technology consultancy and we build products, platforms and services to accelerate value for our clients. We drive measurable impact that is tightly aligned to our clients' business objectives. Put in practice
scripting languages like Python, particularly in data engineering tasks like data processing, automation, and integrating with external systems (e.g., APIs, databases).
o Hands-on experience with DBT (Data Build Tool) for building, testing, and documenting analytical data models.
o Strong understanding of data modelling concepts (star/snowflake schema, slowly changing dimensions).
o Experience integrating DBT with CI
join their growing data function. They're looking for a Power BI SME who has a real passion for driving analytics and insights. They have a stack of Snowflake, dbt, Python & Power BI. They're looking for someone who has end-to-end analytics experience and a genuine drive for data analytics and the impact it has on business performance. … coding ability using either normalisation, SQL, or Python). Desirable: Experience working in a data warehouse or lake environment, e.g. Snowflake, Redshift, Databricks, and with ELT and data pipelines, e.g. dbt Familiar with predictive analytics techniques Please apply if this sounds like you.
communication capabilities, fostering collaboration and engagement across teams. Key Technology (awareness of) Azure Databricks, Data Factory, Storage, Key Vault Experience with source control systems, such as Git dbt (Data Build Tool) for transforming and modelling data SQL (Spark SQL) & Python (PySpark) Certifications (Ideal) SAFe POPM or Scrum PSPO Microsoft Certified: Azure Fundamentals (AZ-900) Microsoft Certified: Azure Data Fundamentals (DP
As part of the technical team, this role offers hands-on experience across a range of leading technologies and platforms. Full training will be provided in SQL, Looker, and DBT for reporting, dashboarding, and data pipeline development. You’ll also get exposure to Python, Databricks, and Azure as you grow into the role. There is also future potential to gain … machine learning projects as the team evolves. Day-to-Day Responsibilities: Build and maintain SQL queries and Looker dashboards for reporting and visualisation Develop and maintain data pipelines using DBT Ingest and process data from various sources across the business Collaborate with business stakeholders to understand reporting needs Contribute to large-scale European data integration projects Support internal projects such … technology A proactive, entrepreneurial mindset with a desire to learn and solve problems Excellent communication skills and confidence working with business stakeholders Tech Stack You’ll Learn: SQL, Looker, DBT, Python, Databricks, Azure
operations Advanced proficiency in SQL and experience working with large, complex, and sometimes messy datasets Experience designing, building, and maintaining ETL pipelines or data models, ideally using tools like dbt Proficiency in Python for data analysis, including data manipulation, visualisation, and basic modelling Strong data storytelling and communication skills: you can translate complex data into clear, actionable recommendations for both
data applications Airflow purely for job scheduling and tracking CircleCI for continuous deployment Parquet and Delta file formats on S3 for data lake storage Spark for data processing DBT for data modelling SparkSQL for analytics Why else you'll love it here Wondering what the salary for this role is? Just ask us! On a call with one of
science, storytelling and slightly unpredictable humans. Experienced working in fast-paced tech scale-ups focused on growth. Fluent in Python or R, SQL, and handy with the usual stack (dbt, Snowflake, Looker/Tableau). Experienced in marketing mix modelling, attribution, and statistical testing. Comfortable working in a fast-paced, ever-changing environment - and willing to roll with the odd
performance data pipelines and APIs Developing ETL processes to support analytics, reporting, and business operations Assembling complex datasets from a wide variety of sources using tools like SQL, Python, dbt, and Azure Supporting and improving data quality, data infrastructure, and performance Defining and documenting cloud-based data architecture and technical solutions Collaborating across teams and contributing to architectural decisions Troubleshooting … What we’re looking for: Strong experience with Azure cloud technologies , particularly around data services Proficient in SQL and experienced with Python for data transformation Hands-on experience with dbt , ETL development , and data warehousing best practices Comfortable with deploying infrastructure as code and building CI/CD pipelines (e.g., using GitHub, Azure DevOps) Ability to manage large, unstructured datasets
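The "supporting and improving data quality" duty mentioned in roles like this often reduces to running dbt-style generic tests (not_null, unique, accepted ranges) over incoming rows. The sketch below illustrates the idea in plain Python; the dataset, column names, and rules are hypothetical, invented purely for the example.

```python
# Hypothetical extracted rows; in practice these might come from an API or a database
rows = [
    {"id": 1, "email": "a@example.com", "amount": 10.0},
    {"id": 2, "email": None,            "amount": 25.5},
    {"id": 2, "email": "c@example.com", "amount": -3.0},
]

def check_rows(rows):
    """Apply dbt-style tests (not_null, unique, value range) in plain Python.

    Returns a list of (row_index, reason) pairs for every failed check.
    """
    failures = []
    seen_ids = set()
    for i, row in enumerate(rows):
        if row["email"] is None:              # not_null test on email
            failures.append((i, "email is null"))
        if row["id"] in seen_ids:             # unique test on id
            failures.append((i, "duplicate id"))
        seen_ids.add(row["id"])
        if row["amount"] < 0:                 # accepted-range test on amount
            failures.append((i, "negative amount"))
    return failures

failures = check_rows(rows)
print(failures)
# → [(1, 'email is null'), (2, 'duplicate id'), (2, 'negative amount')]
```

In a dbt project these same three checks would be declared in a model's YAML schema file and run by `dbt test`; expressing them in code just makes the underlying logic visible.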
Leicester, Leicestershire, England, United Kingdom
CPS Group
across complex systems
What You'll Bring:
* Background within data warehouse testing (ETL, pipelines, cloud-native platforms like Snowflake)
* Experience building and maintaining test harnesses using tools like dbt (Data Build Tool) for automating data validation across ETL processes and ensuring test integration with CI/CD pipelines
* Proven test leadership and mentoring experience
* History of hands-on experience with data
autonomy to shape tools, processes, and outcomes Join a values-led team that champions innovation, collaboration, and curiosity Responsibilities: Design, build and maintain robust data models and transformations using dbt, Snowflake, and Looker Own integrations from multiple third-party sources (Salesforce, Segment, Amplitude, Google Ads) Translate business requirements into well-structured metrics and scalable data solutions Enable self-serve analytics … pipelines, and data shares Requirements: Solid experience in an Analytics Engineer or similar role within BI or Data Advanced SQL skills and experience working with modern data stacks (Snowflake, dbt) Hands-on experience with Looker or similar BI tools (Tableau, Power BI) Strong commercial acumen - you understand both data and the business problems it's solving Familiarity with version control