London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
with architects, analysts, and business stakeholders to unlock insights and enable innovation. What You'll Be Doing: Design and build robust, automated data pipelines using Azure Data Factory, Synapse, dbt, and Databricks. Integrate data from enterprise systems (e.g. Dynamics, iTrent, Unit4) into a unified data platform. Cleanse, transform, and model data to support BI tools (e.g., Power BI) and AI …
South West London, London, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
with architects, analysts, and business stakeholders to unlock insights and enable innovation. What You'll Be Doing: Design and build robust, automated data pipelines using Azure Data Factory, Synapse, dbt, and Databricks. Integrate data from enterprise systems (e.g. Dynamics, iTrent, Unit4) into a unified data platform. Cleanse, transform, and model data to support BI tools (e.g., Power BI) and AI …
New Malden, Surrey, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
Strong hands-on background in data engineering, with 5+ years working on modern data platforms Experience leading cloud data migrations - GCP and BigQuery strongly preferred Proficiency in SQL, Python, dbt, Airflow, Terraform and other modern tooling Excellent understanding of data architecture, governance, and DevOps best practices Proven leadership or team management experience within a regulated or mid-to-large tech …
take the initiative, and identify creative solutions to deliver outcomes in the face of obstacles. Knowledge of common data science tools around SQL-based data warehousing (e.g. Snowflake, Databricks, DBT), BI tools (e.g. Tableau, Looker), workflow orchestration, and ML Ops. Excellent spoken and written English skills. Fluency with scripting in Python. Ability to work effectively across time zones. Teammates will …
reliability, availability and scalability of all data systems Requirements Significant experience as a Data Science Engineer (or similar senior data role) Expertise in ETL tooling and pipeline development (e.g. dbt, Metabase) Proficiency in Python or R for data modelling and analysis Experience working with cloud platforms (AWS, GCP or Azure) Track record of deploying and maintaining reliable data systems at …
or Scala, SQL, and Apache Spark. You're highly proficient with Databricks and Databricks Asset Bundles, and have a strong understanding of data transformation best practices. Experience with GitHub, DBT, and handling large structured and unstructured datasets is essential. You're detail-oriented, self-motivated, and able to communicate complex technical concepts clearly. Experience with media/social platform data …
and statistical techniques to healthcare data. Ability to manage a technical product lifecycle, including through end-user feedback and testing. Desirable: Experience in R and in JavaScript; proficiency using DBT +/- Snowflake +/- Azure. Personal Behaviours - Essential: Proven ability to interact with colleagues at all levels, both clinical and non-clinical, within healthcare; ability to …
a team of Data Engineers Experience with Data modelling, Data warehousing, and building ETL pipelines Experience with AWS (S3, EKS, EC2, RDS) or similar cloud services, Snowflake, Fivetran, Airbyte, dbt, Docker, Argo Experience in SQL, Python, and Terraform Experience with building Data pipelines and applications to stream and process datasets Robust understanding of DevOps principles is required Experience managing cloud …
complex operational businesses Experience using LLMs or AI tools to structure and extract meaning from unstructured data Experience automating workflows and deploying model pipelines (e.g. MLflow, GCP Vertex, Airflow, dbt or similar) Exposure to business planning, pricing, or commercial decision-making Familiarity with geospatial data Experience in fast-scaling startups or operational teams We're flexible on experience - if you …
a leading Media & Entertainment provider. They're looking for multiple experienced Data Engineers to join on an initial 6-month contract basis, with scope for further extensions. -Python, AWS, dbt, SQL, -Principal Data Engineer -6 Month Contract -Inside IR35 -£500 - 600/Day -Remote (Occasional office visits may be required ad-hoc) If you're interested in this role & meet …
a leading Media & Entertainment provider. They're looking for multiple experienced Data Engineers to join on an initial 6-month contract basis, with scope for further extensions. -Python, AWS, dbt, SQL, -Principal Data Engineer -6 Month Contract -Inside IR35 -£(Apply online only)/Day -Remote (Occasional office visits may be required ad-hoc) If you're interested in this role …
best practices for performance optimisation, scalability, and cost management, empowering teams to access and utilise data seamlessly. Streamline Data Pipelines: Lead the development and optimisation of data pipelines using DBT, enabling faster and more reliable data flows. Enhance Data Governance and Quality: Design and implement robust data governance frameworks, ensuring high data quality, compliance, and consistency. Develop Scalable Data Models …
tool, providing reliable data access to users throughout Trustpilot Design, build, maintain, and rigorously monitor robust data pipelines and transformative models using our modern data stack, including GCP, Airflow, dbt, and potentially emerging technologies like real-time streaming platforms Develop and manage reverse ETL processes to seamlessly integrate data with our commercial systems, ensuring operational efficiency Maintain and optimise our …
modern AI tools and models across the entire data and engineering stack. Tech Stack - AI Engineering: Python, LangChain, LlamaIndex, LangGraph; AI Models: OpenAI, Anthropic, Gemini, Custom Models; Data Tools: DBT, BigQuery; Databases: PostgreSQL, OpenSearch, Redis; Product Engineering: TypeScript (Full-stack); Infrastructure: Serverless, AWS, Google Cloud, GitHub Actions. Requirements: Strong academic background in a relevant field; proven track record of delivering …
TensorFlow, Hugging Face) Proven MLOps, big data, and backend/API development experience Deep understanding of NLP and LLMs Proficient with cloud platforms (AWS/GCP/Azure), Airflow, DBT, Docker/Kubernetes Strong collaboration, problem-solving, and coding best practices Nice to have: LLM fine-tuning, streaming data, big data warehousing, open-source contributions.
mentor some of the brightest minds and friendliest people in the industry! - You work with some of the best tech in the industry, such as AWS SageMaker for ML, DBT for data processing, along with standard analytical tools like Python, Redshift, Tableau, etc. - All our decisioning systems are built in-house, meaning that we have nearly unlimited flexibility in what …
across Product, Engineering, Finance and Operations Experience working in or transitioning to a matrixed or squad-based organisational model Technical fluency with modern data stacks and tools such as dbt, Snowflake, Airflow, and Looker or Tableau Deep understanding of data privacy, governance and compliance, including GDPR Excellent communication skills, with the ability to influence at senior levels and drive alignment …
and its data science ecosystem (e.g., pandas, scikit-learn, TensorFlow/PyTorch), Statistical methods and machine learning (e.g., A/B testing, model validation), Data pipelining tools like SQL, dbt, BigQuery, or Spark, A strong communicator with the ability to explain technical concepts in layman's terms for a non-technical audience, You're not afraid to challenge the status …
data science ecosystem (e.g., pandas, scikit-learn, TensorFlow/PyTorch). Statistical methods and machine learning (e.g., A/B testing, model validation). Data pipelining tools like SQL, dbt, BigQuery, or Spark. A strong communicator with the ability to explain technical concepts in layman's terms for a non-technical audience. You're not afraid to challenge the status …
operations Advanced proficiency in SQL and experience working with large, complex, and sometimes messy datasets Experience designing, building, and maintaining ETL pipelines or data models, ideally using tools like dbt Proficiency in Python for data analysis, including data manipulation, visualisation, and basic modelling Strong data storytelling and communication skills: you can translate complex data into clear, actionable recommendations for both …
science, storytelling and slightly unpredictable humans. Experienced working in fast-paced tech scale-ups focused on growth. Fluent in Python or R, SQL, and handy with the usual stack (dbt, Snowflake, Looker/Tableau). Experienced in marketing mix modelling, attribution, and statistical testing. Comfortable working in a fast-paced, ever-changing environment - and willing to roll with the odd …
company that takes data seriously, where your work has direct impact on product quality, decision making, and customer outcomes. Skills and Experience: Strong Python; SQL; experience working with GCP; dbt. If you are looking for a new challenge, then please submit your CV for initial screening and more details. Senior Data Engineer …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
by the wider team. Core Requirements: Strong SQL skills for data modelling and transformation Proven experience building and managing ETL pipelines in production Hands-on experience with Snowflake and dbt Proven experience as an Analytics Engineer Experience working in a cross-functional environment with stakeholders Excellent communication skills What Will Make You Stand Out: Experience with ThoughtSpot, Looker, or similar …
wide variety of technical and executive audiences, both written and verbal Preferred (but not required) to have: Hands-on experience with Python Experience working with modern data technology (e.g. dbt, Spark, containers, DevOps tooling, orchestration tools, Git, etc.) Experience with data science and machine learning technology People want to buy from people who understand them. Our Solutions Engineers build connections …