and scientists working across Africa, Europe, the UK and the US. ABOUT THE ROLE Sand Technologies focuses on cutting-edge cloud-based data projects, leveraging tools such as Databricks, DBT, Docker, Python, SQL, and PySpark to name a few. We work across a variety of data architectures such as Data Mesh, lakehouse, data vault and data warehouses. Our data engineers …
from web analytics, content management systems (CMS), subscription platforms, ad tech, and social media. Ability to automate and optimise data workflows, using modern ETL/ELT tools (e.g., Airflow, dbt, Apache Spark) to ensure timely and reliable delivery of data. Experience building robust data models and reporting layers to support performance dashboards, user engagement analytics, ad revenue tracking, and A …
experience building production data pipelines Advanced Python skills (NumPy, Pandas, SQLAlchemy) and expert-level SQL across multiple database platforms Hands-on experience with modern data stack tools including dbt, Airflow, and cloud data warehouses (Snowflake, BigQuery, Redshift) Strong understanding of data modelling, schema design, and building maintainable ELT/ETL pipelines Experience with cloud platforms (AWS, Azure, GCP) and …
Tech Stack: Cloud Data Warehouse - Snowflake AWS Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda Data Governance & Quality - Collate & Monte Carlo Infrastructure as Code - Terraform Data Integration & Transformation - Python, DBT, Fivetran, Airflow CI/CD - GitHub Actions/Jenkins Business Intelligence - Looker Skills & Attributes We'd Like To See: Extensive experience in data engineering, including designing and maintaining robust data …
understood diagrams and visuals for both technical and non-technical people alike. AWS cloud products (Lambda functions, Redshift, S3, AmazonMQ, Kinesis, EMR, RDS (Postgres)). Apache Airflow for orchestration. DBT for data transformations. Machine Learning for product insights and recommendations. Experience with microservices using technologies like Docker for local development. Apply engineering best practices to your work, e.g. unit tests …
data modelling. Experience with CI/CD GitHub Actions (or similar) AWS fundamentals (e.g., AWS Certified Data Engineer) Knowledge of Snowflake/SQL Knowledge of Apache Airflow Knowledge of DBT Familiarity with Atlan for data catalog and metadata management Understanding of Iceberg tables Who we are: We're a business with a global reach that empowers local teams, and we undertake …
data models. Experience with Interface/API data modelling. Experience with CI/CD GitHub Actions (or similar) Knowledge of Snowflake/SQL Knowledge of Apache Airflow Knowledge of DBT Familiarity with Atlan for data catalog and metadata management Understanding of Iceberg tables Who we are: We're a business with a global reach that empowers local teams, and we undertake …
South West London, London, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
needs Optimise data storage and retrieval for performance Work with batch and real-time processing frameworks Implement and manage ETL processes Use tools like Python, SQL, Spark, Airflow, Kafka, dbt, Snowflake, Redshift, BigQuery Requirements Bachelor's degree in Computer Science, Engineering, or a related field 1+ year in a Data Engineer or similar role Proficiency in SQL and Python (or another …
data platform evolution Has experience (or strong interest) in building real-time or event-driven architectures Modern Data Stack Includes: Python , SQL Snowflake , Postgres AWS (S3, ECS, Terraform) Airflow , dbt , Docker Apache Spark , Iceberg What they're looking for: Solid experience as a Senior/Lead/Principal Data Engineer, ideally with some line management or mentoring Proven ability to …
London, Victoria, United Kingdom Hybrid / WFH Options
Boston Hale
Key Responsibilities: Design and maintain scalable data pipelines across diverse sources including CMS, analytics, ad tech, and social platforms. Lead engineering efforts to automate workflows using tools like Airflow, dbt, and Spark. Build robust data models to support dashboards, A/B testing, and revenue analytics. Collaborate with cross-functional teams to deliver actionable insights and support strategic initiatives. Drive …
Our products are recognised by industry leaders like Gartner's Magic Quadrant, Forrester Wave and Frost Radar. Our tech stack: Superset and similar data visualisation tools. ETL tools: Airflow, DBT, Airbyte, Flink, etc. Data warehousing and storage solutions: ClickHouse, Trino, S3. AWS Cloud, Kubernetes, Helm. Relevant programming languages for data engineering tasks: SQL, Python, Java, etc. What you will be …
data pipelines, and ETL processes Hands-on experience with cloud platforms (GCP, AWS, Azure) and their data-specific services Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, DBT) Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.) Strong understanding of data modeling, data governance, and data quality principles Excellent communication skills with the ability to translate complex …
detail and care about the features they implement. What we need from you: At least 3 years of relevant data engineering experience Strong Python and SQL skills Experience with dbt Experience with AWS Experience working with a columnar database such as Redshift Strong experience with ETL/ELT and the management of data pipelines Familiarity with Snowplow Experience with Data …
reliable data-focused backend services Have hands-on experience with cloud infrastructure (GCP/AWS/Azure), infrastructure-as-code (Terraform), containerisation (Docker/k8s) and data pipelines (SQL, dbt, Airbyte) Love automation, process improvement and finding ways to help others work efficiently Are comfortable working autonomously and taking responsibility for the delivery of large technical projects Are eager to …
They are also proficient in Azure Event Hubs and Stream Analytics, Managed Streaming for Apache Kafka, Azure Databricks with Spark, and other open-source technologies like Apache Airflow and dbt, Spark/Python, or Spark/Scala. Required technical and professional expertise Commercial experience as a Data Engineer or similar role, with a strong emphasis on Azure technologies. Proficiency in …
Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure) Exposure to orchestration tools such as Kubeflow Pipelines or Airflow Familiarity with DBT or similar tools for modelling data in data warehouses Desire to build interpretable and explainable ML models (using techniques such as SHAP) Desire to quantify the level of fairness and …
London, Victoria, United Kingdom Hybrid / WFH Options
Boston Hale
for a Data Engineer to join their London-based team. Key Responsibilities: Design and maintain scalable data pipelines across diverse sources. Automate and optimise workflows using tools like Airflow, dbt, and Spark. Support data modelling for analytics, dashboards, and A/B testing. Collaborate with cross-functional teams to deliver data-driven insights. Work with cloud platforms (GCP preferred) and …