reliability, availability and scalability of all data systems Requirements Significant experience as a Data Science Engineer (or similar senior data role) Expertise in ETL tooling and pipeline development (e.g. dbt, Metabase) Proficiency in Python or R for data modelling and analysis Experience working with cloud platforms (AWS, GCP or Azure) Track record of deploying and maintaining reliable data systems at …
or Scala, SQL, and Apache Spark. You're highly proficient with Databricks and Databricks Asset Bundles, and have a strong understanding of data transformation best practices. Experience with GitHub, dbt, and handling large structured and unstructured datasets is essential. You're detail-oriented, self-motivated, and able to communicate complex technical concepts clearly. Experience with media/social platform data …
and statistical techniques to healthcare data. Ability to manage a technical product lifecycle, including through end-user feedback and testing Desirable Experience in R and in JavaScript Proficiency using dbt +/- Snowflake +/- Azure Personal Behaviours Essential Proven experience in the ability to interact with colleagues at all levels, both clinical and non-clinical, within healthcare Ability to …
a team of Data Engineers Experience with Data modelling, Data warehousing, and building ETL pipelines Experience with AWS (S3, EKS, EC2, RDS) or similar cloud services, Snowflake, Fivetran, Airbyte, dbt, Docker, Argo Experience in SQL, Python, and Terraform Experience with building Data pipelines and applications to stream and process datasets Robust understanding of DevOps principles is required Experience managing cloud …
complex operational businesses Experience using LLMs or AI tools to structure and extract meaning from unstructured data Experience automating workflows and deploying model pipelines (e.g. MLflow, GCP Vertex, Airflow, dbt or similar) Exposure to business planning, pricing, or commercial decision-making Familiarity with geospatial data Experience in fast-scaling startups or operational teams We're flexible on experience - if you …
a leading Media & Entertainment provider. They're looking for multiple experienced Data Engineers to join on an initial 6 month contract basis, with scope for further extensions. -Python, AWS, dbt, SQL -Principal Data Engineer -6 Month Contract -Inside IR35 -£500 - 600/Day -Remote (Occasional office visits may be required ad-hoc) If you're interested in this role & meet …
EXPERIENCE: The ideal Head of Data Platform will have: Extensive experience with Google Cloud Platform (GCP), particularly BigQuery Proficiency with a modern data tech stack, including SQL, Python, Airflow, dbt, Dataform, Terraform Experience in a mid-large sized company within a regulated industry, with a strong understanding of data governance. A strategic mindset, leadership skills, and a hands-on approach.
About Airwallex Airwallex is the only unified payments and financial platform for global businesses. Powered by our unique combination of proprietary infrastructure and software, we empower over 150,000 businesses worldwide - including Brex, Rippling, Navan, Qantas, SHEIN and many more …
best practices for performance optimisation, scalability, and cost management, empowering teams to access and utilise data seamlessly. Streamline Data Pipelines: Lead the development and optimisation of data pipelines using dbt, enabling faster and more reliable data flows. Enhance Data Governance and Quality: Design and implement robust data governance frameworks, ensuring high data quality, compliance, and consistency. Develop Scalable Data Models …
tool, providing reliable data access to users throughout Trustpilot Design, build, maintain, and rigorously monitor robust data pipelines and transformative models using our modern data stack, including GCP, Airflow, dbt, and potentially emerging technologies like real-time streaming platforms Develop and manage reverse ETL processes to seamlessly integrate data with our commercial systems, ensuring operational efficiency Maintain and optimise our …
paced project environments Understanding of data governance and compliance (e.g., GDPR) Experience in industries with large consumer marketplaces (e.g., programmatic media, travel, financial services) is a plus Our Tech Stack AWS Redshift Postgres DBT Power BI Note: While our current environment is AWS, we are transitioning to Azure for new development. Why Join Us? At Kantar, you'll be part of a global leader …
directly to a Lead Analytics Consultant. Your day-to-day will include helping clients build out efficient, scalable analytics stacks, delivering insights, and consulting on modern data tools like dbt, BigQuery, and Looker. As an Analytics Consultant, you'll be instrumental in enabling data-led decision-making and making a significant impact on business strategy across diverse industries. The Opportunity … you'll: Work directly with clients to understand their data goals and guide them in building modern, cloud-based analytics solutions. Design and implement ELT pipelines using tools like dbt and Fivetran. Develop dashboards and reporting solutions using Looker or similar BI tools. Participate in discovery sessions, stakeholder workshops, and training engagements. Collaborate with a team that thrives on knowledge … Pension scheme and private health cover Skills and Experience Must-Have: 2+ years' experience in an analytics or data consultancy role Proficiency in SQL and data modelling (preferably with dbt) Hands-on experience with cloud data warehouses (BigQuery, Snowflake, Redshift) Familiarity with BI tools (Looker, Power BI, Tableau, etc.) Excellent communication skills - able to simplify technical concepts for non-technical …
Brighton, Sussex, United Kingdom Hybrid / WFH Options
MJR Analytics
own or as the lead in a small team. Successful candidates will bring experience with Google Cloud, our strategic technology partner, along with modern data stack technologies such as dbt, Looker, Fivetran, Segment and Cube. You'll already have the Google Cloud Data Engineer certification and the Looker developer certification or be committed to obtaining that certification within your first … provided to you using tools such as Google Cloud, Google BigQuery and Google Cloud SQL Transforming and modelling data into consumable information and business logic using tools such as dbt, Dataform and Cube Developing dashboards, explorations and data visualisations using tools such as Looker, Preset, Power BI, Superset Testing data to ensure it is of high quality Using project management … definitions Designing data flows and data models Mentoring: Mentoring and guiding junior analytics and data engineers. Knowledge: Obtaining, if you do not have so already, developer certifications for Looker, dbt, Segment and other modern data stack partners Working towards Google Cloud Data Engineer certification Staying updated with the latest data engineering technologies and methodologies, and implementing them as appropriate. Collaboration …
Comfortable partnering with analytics teams to define success metrics, measure impact, and learn from outcomes High level understanding of a modern data team's stack - we use Fivetran, Snowflake, dbt, Mixpanel and Omni An interest in infrastructure and observability, especially in complex, event-driven systems (we run on AWS) Familiarity with MongoDB or experience working with healthcare data systems and … working full stack. Using CI/CD we automate deployment onto Serverless architecture. Our platform data is stored in MongoDB and we have extensive analytics tooling using Fivetran, Snowflake, dbt, Mixpanel and Omni to enable data-driven decisions across the business. We have robust monitoring, logging and reporting using AWS CloudWatch and Sentry, and collaborate using Git, Slack and Notion.
modern AI tools and models across the entire data and engineering stack. Tech Stack AI Engineering: Python, LangChain, LlamaIndex, LangGraph AI Models: OpenAI, Anthropic, Gemini, Custom Models Data Tools: dbt, BigQuery Databases: PostgreSQL, OpenSearch, Redis Product Engineering: TypeScript (Full-stack) Infrastructure: Serverless, AWS, Google Cloud, GitHub Actions Requirements Strong academic background in a relevant field Proven track record of delivering …
TensorFlow, Hugging Face) Proven MLOps, big data, and backend/API development experience Deep understanding of NLP and LLMs Proficient with cloud platforms (AWS/GCP/Azure), Airflow, dbt, Docker/Kubernetes Strong collaboration, problem-solving, and coding best practices Nice to have: LLM fine-tuning, streaming data, big data warehousing, open-source contributions.
mentor some of the brightest minds and friendliest people in the industry! - You work with some of the best tech in the industry, such as AWS SageMaker for ML, dbt for data processing along with standard analytical tools like Python, Redshift, Tableau etc. - All our decisioning systems are built in-house, meaning that we have nearly unlimited flexibility in what …
Nottingham, Nottinghamshire, England, United Kingdom Hybrid / WFH Options
Avanti
Data Engineer Azure - dbt - Midlands - Rapid Growth Scale-Up About the Company Join a fast-growing Data Analytics software house that is shaking up the analytics market. In just 5 years, they've grown from a startup to £11M turnover and offices across the UK and internationally. Helping clients rapidly achieve actionable insights from their data, they have outpaced the … data engineering solutions that help clients become self-sufficient with their data. What makes this role special: Direct client impact - see your work drive real business decisions Cutting-edge dbt implementation (key differentiator for the business) Variety of projects across different industries and use cases Opportunity to work with the latest Microsoft Fabric technology Essential Skills: dbt experience (required) Strong … creates advancement opportunities Exposure to latest data technologies and methodologies International expansion opportunities Work with cutting-edge AI implementations If you are an Azure Data Engineer with experience of dbt and would like to work in an exciting, fast-paced Data company then this could be a great role for you. APPLY TODAY FOR IMMEDIATE CONSIDERATION AND INTERVIEW IN …
Substantial experience designing and implementing data solutions on Microsoft Azure Hands-on expertise with Snowflake, including data modelling, performance optimisation, and secure data sharing practices. Proficiency in dbt (Data Build Tool), with a strong understanding of modular pipeline development, testing, and version control. Familiarity with Power BI, particularly in integrating data models to support both enterprise reporting and self-service … analytics. Candidates must demonstrate experience working in Agile environments, delivering in iterative cycles aligned to business value. Tech Stack: Azure Power BI dbt Snowflake About us esynergy is a technology consultancy and we build products, platforms and services to accelerate value for our clients. We drive measurable impact that is tightly aligned to our clients' business objectives. Put in practice …
scripting languages like Python, particularly in data engineering tasks like data processing, automation, and integrating with external systems (e.g., APIs, databases). Hands-on experience with dbt (Data Build Tool) for building, testing, and documenting analytical data models. Strong understanding of data modelling concepts (star/snowflake schema, slowly changing dimensions). Experience integrating dbt with CI …
across Product, Engineering, Finance and Operations Experience working in or transitioning to a matrixed or squad-based organisational model Technical fluency with modern data stacks and tools such as dbt, Snowflake, Airflow, and Looker or Tableau Deep understanding of data privacy, governance and compliance, including GDPR Excellent communication skills, with the ability to influence at senior levels and drive alignment …
join their growing data function. They're looking for a Power BI SME who has a real passion for driving analytics and insights. They have a stack of Snowflake, dbt, Python & Power BI. They're looking for someone who has end-to-end analytics experience and a genuine drive for data analytics and the impact it has on business performance. … coding ability using either normalisation, SQL, or Python). Desirable: Experience working in Data warehouse or lake environments e.g. Snowflake, Redshift, Databricks, and ELT and data pipelines e.g. dbt Familiar with predictive analytics techniques Please apply if this sounds like you …