London, South East, England, United Kingdom (Hybrid / WFH Options)
Tenth Revolution Group
… with architects, analysts, and business stakeholders to unlock insights and enable innovation. What You'll Be Doing: Design and build robust, automated data pipelines using Azure Data Factory, Synapse, dbt, and Databricks. Integrate data from enterprise systems (e.g. Dynamics, iTrent, Unit4) into a unified data platform. Cleanse, transform, and model data to support BI tools (e.g., Power BI) and AI …
South West London, London, England, United Kingdom (Hybrid / WFH Options)
Tenth Revolution Group
… with architects, analysts, and business stakeholders to unlock insights and enable innovation. What You'll Be Doing: Design and build robust, automated data pipelines using Azure Data Factory, Synapse, dbt, and Databricks. Integrate data from enterprise systems (e.g. Dynamics, iTrent, Unit4) into a unified data platform. Cleanse, transform, and model data to support BI tools (e.g., Power BI) and AI …
… across both engineering and analytics, and is excited about building internal tools that directly improve product and customer experiences. You'll be working with a mature stack (Python, BigQuery, dbt, FastAPI, Metabase), and your day-to-day will include both writing production-level code and making data actually useful for decision-makers. Main responsibilities: Build, maintain, and optimize data pipelines … using Python and dbt Own and evolve the backend codebase (FastAPI, Docker) Ensure pipeline reliability, code quality, and proper testing/documentation Maintain and extend data models and the BI layer (Metabase) Collaborate closely with product, data science, and leadership on strategic data tools Design and deliver internal tools - potentially leveraging LLMs and OpenAI APIs Write clean, production-grade code … with version control (GitLab) Experience Required: 5+ years of Python, including writing production-level APIs Strong SQL and DBT for data transformation and modeling Experience with modern data stack components: BigQuery, GCS, Docker, FastAPI Solid understanding of data warehousing principles Proven ability to work cross-functionally with both technical and non-technical stakeholders Comfortable maintaining and optimizing BI dashboards (Metabase) …
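By way of illustration only (not taken from the listing above): a minimal sketch of the kind of "production-level API" this role describes, using FastAPI from the stack it names. Every identifier here (the route, fetch_active_users, the stand-in value) is invented for the example.

```python
# Hypothetical sketch: a tiny FastAPI service exposing one metric for a
# BI layer. A real service would query the warehouse (e.g. BigQuery via
# the google-cloud-bigquery client) instead of returning a constant.
from fastapi import FastAPI

app = FastAPI()

def fetch_active_users() -> int:
    # Stand-in for a warehouse query, so the sketch runs without credentials.
    return 42

@app.get("/metrics/active-users")
def active_users() -> dict:
    # Serve a single metric that a dashboard tool such as Metabase could read.
    return {"active_users": fetch_active_users()}
```

Saved as main.py, this runs locally with `uvicorn main:app --reload`.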
New Malden, Surrey, England, United Kingdom (Hybrid / WFH Options)
Harnham - Data & Analytics Recruitment
Strong hands-on background in data engineering, with 5+ years working on modern data platforms Experience leading cloud data migrations - GCP and BigQuery strongly preferred Proficiency in SQL, Python, dbt, Airflow, Terraform and other modern tooling Excellent understanding of data architecture, governance, and DevOps best practices Proven leadership or team management experience within a regulated or mid-to-large tech …
… take the initiative, and identify creative solutions to deliver outcomes in the face of obstacles. Knowledge of common data science tools around SQL-based data warehousing (e.g. Snowflake, Databricks, DBT), BI tools (e.g. Tableau, Looker), workflow orchestration, and MLOps. Excellent spoken and written English skills. Fluency with scripting in Python. Ability to work effectively across time zones. Teammates will …
… reliability, availability and scalability of all data systems Requirements Significant experience as a Data Science Engineer (or similar senior data role) Expertise in ETL tooling and pipeline development (e.g. dbt, Metabase) Proficiency in Python or R for data modelling and analysis Experience working with cloud platforms (AWS, GCP or Azure) Track record of deploying and maintaining reliable data systems at …
… or Scala, SQL, and Apache Spark. You're highly proficient with Databricks and Databricks Asset Bundles, and have a strong understanding of data transformation best practices. Experience with GitHub, DBT, and handling large structured and unstructured datasets is essential. You're detail-oriented, self-motivated, and able to communicate complex technical concepts clearly. Experience with media/social platform data …
… and statistical techniques to healthcare data. Ability to manage a technical product lifecycle, including through end-user feedback and testing. Desirable: Experience in R and in JavaScript; proficiency using DBT +/- Snowflake +/- Azure. Personal Behaviours (Essential): Proven experience in the ability to interact with colleagues at all levels, both clinical and non-clinical, within healthcare. Ability to …
… a team of Data Engineers Experience with Data modelling, Data warehousing, and building ETL pipelines Experience with AWS (S3, EKS, EC2, RDS) or similar cloud services, Snowflake, Fivetran, Airbyte, dbt, Docker, Argo Experience in SQL, Python, and Terraform Experience with building Data pipelines and applications to stream and process datasets Robust understanding of DevOps principles is required Experience managing cloud …
… complex operational businesses Experience using LLMs or AI tools to structure and extract meaning from unstructured data Experience automating workflows and deploying model pipelines (e.g. MLflow, GCP Vertex, Airflow, dbt or similar) Exposure to business planning, pricing, or commercial decision-making Familiarity with geospatial data Experience in fast-scaling startups or operational teams We're flexible on experience - if you …
… a leading Media & Entertainment provider. They're looking for multiple experienced Data Engineers to join on an initial 6 month contract basis, with scope for further extensions.
- Python, AWS, dbt, SQL
- Principal Data Engineer
- 6 Month Contract
- Inside IR35
- £500 - 600/Day
- Remote (Occasional office visits may be required ad-hoc)
If you're interested in this role & meet …
EXPERIENCE: The ideal Head of Data Platform will have: Extensive experience with Google Cloud Platform (GCP), particularly BigQuery Proficiency with a modern data tech stack, including SQL, Python, Airflow, dbt, Dataform, Terraform Experience in a mid-large sized company within a regulated industry, with a strong understanding of data governance. A strategic mindset, leadership skills, and a hands-on approach.
About Airwallex Airwallex is the only unified payments and financial platform for global businesses. Powered by our unique combination of proprietary infrastructure and software, we empower over 150,000 businesses worldwide - including Brex, Rippling, Navan, Qantas, SHEIN and many more …
… best practices for performance optimisation, scalability, and cost management, empowering teams to access and utilise data seamlessly. Streamline Data Pipelines: Lead the development and optimisation of data pipelines using DBT, enabling faster and more reliable data flows. Enhance Data Governance and Quality: Design and implement robust data governance frameworks, ensuring high data quality, compliance, and consistency. Develop Scalable Data Models …
Measurelab is looking for a skilled Data & Insights Analyst to join its dynamic team in a rewarding remote role. If you have the qualifications and a passion for turning data into actionable insights, this is your chance to work with …
… tool, providing reliable data access to users throughout Trustpilot Design, build, maintain, and rigorously monitor robust data pipelines and transformative models using our modern data stack, including GCP, Airflow, dbt, and potentially emerging technologies like real-time streaming platforms Develop and manage reverse ETL processes to seamlessly integrate data with our commercial systems, ensuring operational efficiency Maintain and optimise our …
… paced project environments Understanding of data governance and compliance (e.g., GDPR) Experience in industries with large consumer marketplaces (e.g., programmatic media, travel, financial services) is a plus Our Tech Stack: AWS Redshift, Postgres, DBT, Power BI Note: While our current environment is AWS, we are transitioning to Azure for new development. Why Join Us? At Kantar, you'll be part of a global leader …
… directly to a Lead Analytics Consultant. Your day-to-day will include helping clients build out efficient, scalable analytics stacks, delivering insights, and consulting on modern data tools like dbt, BigQuery, and Looker. As an Analytics Consultant, you'll be instrumental in enabling data-led decision-making and making a significant impact on business strategy across diverse industries. The Opportunity … you'll: Work directly with clients to understand their data goals and guide them in building modern, cloud-based analytics solutions. Design and implement ELT pipelines using tools like dbt and Fivetran. Develop dashboards and reporting solutions using Looker or similar BI tools. Participate in discovery sessions, stakeholder workshops, and training engagements. Collaborate with a team that thrives on knowledge … Pension scheme and private health cover Skills and Experience Must-Have: 2+ years' experience in an analytics or data consultancy role Proficiency in SQL and data modelling (preferably with dbt) Hands-on experience with cloud data warehouses (BigQuery, Snowflake, Redshift) Familiarity with BI tools (Looker, Power BI, Tableau, etc.) Excellent communication skills - able to simplify technical concepts for non-technical …
Comfortable partnering with analytics teams to define success metrics, measure impact, and learn from outcomes High-level understanding of a modern data team's stack - we use Fivetran, Snowflake, dbt, Mixpanel and Omni An interest in infrastructure and observability, especially in complex, event-driven systems (we run on AWS) Familiarity with MongoDB or experience working with healthcare data systems and … working full stack. Using CI/CD we automate deployment onto serverless architecture. Our platform data is stored in MongoDB and we have extensive analytics tooling using Fivetran, Snowflake, DBT, Mixpanel and Omni to enable data-driven decisions across the business. We have robust monitoring, logging and reporting using AWS CloudWatch and Sentry, and collaborate using Git, Slack and Notion.
… modern AI tools and models across the entire data and engineering stack.
Tech Stack
AI Engineering: Python, LangChain, LlamaIndex, LangGraph
AI Models: OpenAI, Anthropic, Gemini, Custom Models
Data Tools: DBT, BigQuery
Databases: PostgreSQL, OpenSearch, Redis
Product Engineering: TypeScript (Full-stack)
Infrastructure: Serverless, AWS, Google Cloud, GitHub Actions
Requirements
Strong academic background in a relevant field
Proven track record of delivering …
… TensorFlow, Hugging Face) Proven MLOps, big data, and backend/API development experience Deep understanding of NLP and LLMs Proficient with cloud platforms (AWS/GCP/Azure), Airflow, DBT, Docker/Kubernetes Strong collaboration, problem-solving, and coding best practices Nice to have: LLM fine-tuning, streaming data, big data warehousing, open-source contributions.