variety of databases. Working knowledge of one or more of the cloud platforms (AWS, Azure, GCP). Experience building ETL/ELT pipelines, specifically using dbt, for structured and semi-structured datasets. Experience with orchestration tooling such as Airflow, Dagster, Azure Data Factory, Fivetran, etc. Nice to have: Software …
data solutions using API- and microservice-based architecture. Deep understanding of ETL/ELT architecture, streaming, and event-driven processing; familiarity with tools like dbt, Airflow, Kafka, or equivalents. Familiarity with mid-sized firm tech stacks, especially in financial services, including systems such as NetSuite, Salesforce, and Addepar. Experience with Atlassian …
platform roles, including at least 3 years in a leadership position. Deep hands-on expertise in modern data architecture, pipelines, and tooling (e.g., Airflow, dbt, Kafka, Spark, Python, SQL). Strong understanding of cloud infrastructure (AWS, GCP, or Azure) and scalable data systems. Familiarity with analytics and ML workflows, including …
ML infrastructure, particularly GCP (Vertex AI, BigQuery) or equivalent (e.g., AWS, Azure). Exposure to orchestration tools such as Kubeflow Pipelines or Airflow. Familiarity with dbt or similar tools for modelling data in data warehouses. Desire to build interpretable and explainable ML models (using techniques such as SHAP). Desire to quantify …
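As an illustrative aside on the SHAP technique named in the listing above, the sketch below shows one common usage pattern: explaining a scikit-learn tree ensemble with shap.TreeExplainer. The model, synthetic data, and feature names are all assumptions made for the example, not details from the ad.

```python
# Minimal sketch, assuming shap and scikit-learn are installed; the
# data and model below are synthetic stand-ins for illustration only.
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor

# Synthetic features and target purely for the example.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = X[:, 0] * 2.0 + X[:, 1] ** 2 + rng.normal(scale=0.1, size=500)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# TreeExplainer computes per-feature contributions for tree ensembles.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Global view of which features drive the model's predictions.
shap.summary_plot(shap_values, X, feature_names=[f"f{i}" for i in range(4)])
```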
solutions, while also driving innovation in analytics engineering practices. Key Responsibilities: Technical Leadership: Design and implement robust, scalable data pipelines using tools such as dbt and Airflow. Stakeholder Collaboration: Work closely with stakeholders to understand business needs and deliver tailored, data-driven solutions. Data Transformation: Convert raw data into clean …
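Several of these listings pair dbt with Airflow for orchestrating transformation pipelines. Purely as a hedged illustration of how the two tools are commonly combined (not taken from any specific ad), the sketch below defines a minimal Airflow DAG that runs and then tests a dbt project; the DAG id, schedule, and project paths are hypothetical placeholders.

```python
# Minimal sketch, assuming a recent Airflow 2.x and a dbt project
# located at /opt/dbt; every name and path here is a placeholder.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_transformations",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build the dbt models in the warehouse.
    run_dbt = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )

    # Run dbt tests so bad data never reaches downstream consumers.
    test_dbt = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )

    run_dbt >> test_dbt
```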
languages commonly used for data work (e.g., Python, Java, Scala). Deep understanding of ETL/ELT tools and workflow orchestration platforms (e.g., Airflow, Fivetran, dbt). Proficiency with SQL and a solid grounding in data modeling concepts. Familiarity with cloud services and architectures (AWS, GCP, or Azure). Proven experience managing or mentoring …
preferred). Experience with CI/CD pipelines and version control. Proficiency in data visualisation tools (e.g., Tableau, Power BI). Exposure to tools like dbt, Apache Airflow, and Docker. Experience working with large-scale datasets (terabyte-level or higher). Excellent problem-solving capabilities. Strong communication and collaboration skills. Proficiency in …
London, South East England, United Kingdom (Hybrid / WFH options)
Owen Thomas | Pending B Corp™
either AWS or GCP. Experience with Terraform to define and manage cloud infrastructure through code. Desirables: Experience in SQL-based transformation workflows, particularly using dbt in BigQuery. Experience with containerisation technologies (Docker, Kubernetes). Familiarity with streaming data ingestion technologies (Kafka, Debezium). Exposure to data management and Linux administration …
data best practices across teams. Champion data quality, governance, and documentation. Key Requirements: Strong experience with Python, SQL, and modern ETL tools (e.g., Airflow, dbt). Solid grasp of cloud platforms (AWS/GCP/Azure) and data warehouses (e.g., BigQuery, Snowflake). Familiarity with streaming technologies (Kafka, Kinesis, etc.). Passion for …
London, South East England, United Kingdom (Hybrid / WFH options)
Block MB
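The requirement above also mentions streaming technologies such as Kafka. As a small, hedged sketch of what that typically involves on the ingestion side, the snippet below consumes JSON events with the kafka-python client; the topic, broker address, and consumer group are hypothetical placeholders, not details from the listing.

```python
# Minimal sketch, assuming the kafka-python package and a reachable
# broker; all connection details below are hypothetical placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events.raw",                        # hypothetical topic
    bootstrap_servers="localhost:9092",  # hypothetical broker
    group_id="analytics-ingest",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # A real pipeline would land these events in a warehouse or lake;
    # here we simply print them to show the consumption loop.
    print(message.topic, message.offset, event)
```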
cloud-based database services (Snowflake). Knowledge of data warehousing, orchestration, and pipeline technologies (Apache Airflow/Kafka, Azure Data Factory, etc.). Experience with dbt for modelling. Server administration and networking fundamentals. …
London, South East England, United Kingdom (Hybrid / WFH options)
Kolayo