Efficiency, and Drive. What will I be doing? Design, build, and maintain scalable and reliable data pipelines. Manage Zeelo's serverless, centralized data architecture (Fivetran, BigQuery, dbt, and other tools) that supports analytical functions across the business. Design, build, and maintain ETL, ELT, and other data pipelines for purposes to …
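For context on a stack like this, raw data typically lands in BigQuery via Fivetran and is modelled with dbt, after which analysts query the modelled tables. Below is a minimal sketch of reading one such table from Python with the google-cloud-bigquery client; the project, dataset, and table names are illustrative assumptions, not Zeelo's actual schema.

```python
# Minimal sketch: query a dbt-modelled table in BigQuery from Python.
# Project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-analytics-project")  # placeholder project id

query = """
    SELECT ride_date, COUNT(*) AS rides
    FROM `example-analytics-project.analytics.fct_rides`
    GROUP BY ride_date
    ORDER BY ride_date DESC
    LIMIT 7
"""

for row in client.query(query).result():
    print(row.ride_date, row.rides)
```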
AWS Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda; Data Governance & Quality - Collate & Monte Carlo; Infrastructure as Code - Terraform; Data Integration & Transformation - Python, dbt, Fivetran, Airflow; CI/CD - GitHub Actions/Jenkins. Nice to Have Experience: Understanding of various data architecture paradigms (e.g., Data Lakehouse, Data Warehouse, Data Mesh …
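To make the orchestration side of such a stack concrete, here is a minimal sketch of an Airflow DAG (Airflow 2.4+ assumed) that runs a dbt build after a placeholder ingestion check; the DAG id, schedule, paths, and commands are illustrative assumptions rather than any specific employer's pipeline.

```python
# Minimal sketch of a daily ELT DAG: placeholder ingestion check, then dbt.
# All names and paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_elt",                  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                   # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    # Stand-in for a Fivetran sync trigger or landing-zone sensor.
    check_ingestion = BashOperator(
        task_id="check_ingestion",
        bash_command="echo 'assume raw data has landed'",
    )

    run_dbt = BashOperator(
        task_id="run_dbt",
        bash_command="dbt build --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )

    check_ingestion >> run_dbt
```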
Redshift, Snowflake, or BigQuery. Strong command of SQL and programming languages like Python, Scala, or Java. Familiarity with ETL/ELT tools (e.g., Airflow, Fivetran, dbt) and cloud data stacks (AWS/GCP/Azure). A deep understanding of data modelling, access controls, and infrastructure performance tuning. A background …
data processing, analysis and automation. Proficiency in building and maintaining batch and streaming ETL/ELT pipelines at scale, employing tools such as Airflow, Fivetran, Kafka, Iceberg, Parquet, Spark, and Glue to develop end-to-end data orchestration, leveraging AWS services to ingest, transform, and process large volumes of structured …
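As an illustration of the batch side of such a pipeline, the sketch below is a minimal PySpark job that reads raw JSON from S3 and writes partitioned Parquet back out; the bucket names, paths, and columns are assumptions for illustration only.

```python
# Minimal sketch of a batch ETL step: raw JSON in S3 -> partitioned Parquet.
# Bucket names, paths, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_batch_etl").getOrCreate()

raw = spark.read.json("s3a://example-raw-bucket/orders/")  # placeholder source

cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("created_at"))
       .filter(F.col("amount") > 0)
)

(
    cleaned.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://example-curated-bucket/orders/")  # placeholder sink
)

spark.stop()
```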
to engage technical and non-technical stakeholders alike. The Head of BI will work within a modern, cloud-based BI ecosystem, including: Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4; Data Lake & Storage: Databricks Delta Lake, Amazon S3; Data Transformation: dbt Cloud; Data Warehouse: Snowflake; Analytics …
in languages commonly used for data work (e.g., Python, Java, Scala). Deep understanding of ETL/ELT tools and workflow orchestration platforms (e.g., Airflow, Fivetran, dbt). Proficiency with SQL and solid grounding in data modeling concepts. Familiarity with cloud services and architectures (AWS, GCP, or Azure). Proven experience managing or …
building ETL/ELT pipelines, specifically using dbt, for structured and semi-structured datasets; any orchestration tooling such as Airflow, Dagster, Azure Data Factory, Fivetran, etc. It would be nice to have: a software engineering background; exposure to building or deploying AI/ML models into a production environment; previously used …
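Since Dagster is one of the orchestration options mentioned, here is a minimal sketch of a Dagster asset that flattens a semi-structured JSON payload into tabular rows before it would be handed to dbt; the asset name, payload, and fields are hypothetical.

```python
# Minimal sketch of a Dagster asset flattening semi-structured JSON events.
# Asset name, payload, and fields are hypothetical.
import json

from dagster import Definitions, asset


@asset
def flattened_events() -> list[dict]:
    # In practice this payload would come from an upstream ingestion step;
    # it is hard-coded here purely for illustration.
    raw = '[{"id": 1, "user": {"country": "GB"}, "props": {"plan": "pro"}}]'
    events = json.loads(raw)
    return [
        {
            "event_id": e["id"],
            "country": e.get("user", {}).get("country"),
            "plan": e.get("props", {}).get("plan"),
        }
        for e in events
    ]


defs = Definitions(assets=[flattened_events])
```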
Greater London, England, United Kingdom Hybrid / WFH Options
Finatal
to improve reliability, data quality and consistency. Requirements: Strong experience working across a modern cloud environment, a GCP stack including BigQuery or a Snowflake DWH, Fivetran, and transformation in dbt. Strong skills in building dashboards and using visualisation tools such as Tableau (beneficial), Power BI, or Looker. Experience in database design …
Knowledge of data governance and metadata management • Experience with data cataloguing tools • Experience mentoring junior team members • Familiarity with modern data stack tools (Airbyte, Fivetran, etc.) EEO Statement: Viasat is proud to be an equal opportunity employer, seeking to create a welcoming and diverse environment. All qualified applicants will receive …
at least one BI/visualisation tool (e.g. Looker, Power BI, Tableau, etc.). Knowledge of ETL/ELT tools (e.g. dbt, Fivetran, etc.). An understanding of Data Governance principles, and the importance of maintaining data quality and providing accurate data and sound insights. An agile mindset - striving …
hands-on experience in the Lloyd's of London or Specialty Insurance Market. Snowflake (expert level); ETL/data integration tools such as Talend, Fivetran; familiarity with MS SQL Server, Liquibase, GitLab; data visualization tools (e.g. Power BI, SAP BO). Data Solutions Architect …
definitions, and accelerate self-serve reporting. You'll also work with Data Engineers to optimise data ingestion and orchestration pipelines via Azure Data Factory, Fivetran, and related tools. This role would be well-suited to a data professional with core competencies in data modelling, and who is excited by the …
AWS Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda; Data Governance & Quality - Collate & Monte Carlo; Infrastructure as Code - Terraform; Data Integration & Transformation - Python, dbt, Fivetran, Airflow; CI/CD - GitHub Actions/Jenkins. What You Bring to the Party: Solid track record of building and leading high-performance data and …
value above having the 'perfect' solution. Cares deeply about data privacy and security. Here's the data stack in place today: Extraction/Load: Fivetran, Meltano; Storage: AWS S3, AWS RDS, Snowflake; Data streaming: Kafka Connect (via Strimzi); Transformation: dbt; Product Analytics & event collection: Amplitude & Segment; Commercial Tooling …
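To ground the warehouse piece of a stack like this, below is a minimal sketch of querying Snowflake from Python with snowflake-connector-python; the account, credentials, warehouse, and table names are placeholders, not the company's real configuration.

```python
# Minimal sketch: run a query against Snowflake from Python.
# Account, credentials, and object names are placeholders.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",   # hypothetical warehouse
    database="ANALYTICS",       # hypothetical database
    schema="MARTS",             # hypothetical schema
)

try:
    with conn.cursor() as cur:
        cur.execute("SELECT event_date, COUNT(*) FROM fct_events GROUP BY event_date")
        for event_date, n in cur.fetchall():
            print(event_date, n)
finally:
    conn.close()
```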
improvement • Manage stakeholder communications and drive commercial thinking around data 👥 About You: • Expertise in Power BI and SQL is a must • Familiarity with dbt, Fivetran, and Microsoft Fabric is highly desirable • Strong stakeholder management and communication skills • Ability to roll your sleeves up and get hands-on when needed • Experience managing …
a requirement to have worked with every tool we use, but the more the better! BigQuery as our data warehouse; Metabase for data visualization; Fivetran to pipe raw data from third-party tools (e.g. Stripe, which we use for billing) into our data warehouse; dbt hosted on GitHub Actions for …
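For a sense of what running dbt from GitHub Actions can look like at the script level, here is a minimal sketch of a Python entry point a CI step might invoke to run dbt and fail the build on errors; the project directory and target name are assumptions.

```python
# Minimal sketch of a CI entry point that runs dbt and propagates failures.
# Project directory and target name are hypothetical.
import subprocess
import sys


def run_dbt(project_dir: str = "analytics", target: str = "ci") -> int:
    result = subprocess.run(
        ["dbt", "build", "--project-dir", project_dir, "--target", target],
        check=False,
    )
    return result.returncode


if __name__ == "__main__":
    sys.exit(run_dbt())
```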
Staines, Middlesex, United Kingdom Hybrid / WFH Options
Industrial and Financial Systems
Experienced in orchestrating data workflows and Kubernetes clusters on AKS using Airflow, Kubeflow, Argo, Dagster, or similar. Skilled with data ingestion tools like Airbyte, Fivetran, etc. for diverse data sources. Expert in large-scale data processing with Spark or Dask. Strong in Python, Scala, C#, or Java, cloud SDKs, and …
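As a small illustration of the Dask option for large-scale processing, the sketch below aggregates a partitioned Parquet dataset with dask.dataframe; the path and column names are hypothetical.

```python
# Minimal sketch: aggregate a partitioned Parquet dataset with Dask.
# Path and column names are hypothetical (reading s3:// paths assumes s3fs).
import dask.dataframe as dd

events = dd.read_parquet("s3://example-curated-bucket/events/")

daily_counts = (
    events.groupby("event_date")["event_id"]
          .count()
          .compute()   # triggers the parallel computation
)

print(daily_counts.head())
```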
practices, ensuring a highly available, resilient, and self-service platform for data engineers and consumers. Ensure seamless data ingestion and processing pipelines, leveraging Kafka, Fivetran, Snowflake, and Data Lakes. Define and enforce best practices for Data Governance, security, access control, and observability. Establish CI/CD pipelines and …
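To make the Kafka ingestion piece concrete, here is a minimal sketch of a kafka-python consumer reading JSON events from a topic; the broker address, topic, and group id are placeholders, and a real pipeline would write the events to the lake or warehouse rather than print them.

```python
# Minimal sketch: consume JSON events from a Kafka topic with kafka-python.
# Broker, topic, and group id are placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "raw-events",                          # hypothetical topic
    bootstrap_servers=["localhost:9092"],  # placeholder broker
    group_id="ingestion-demo",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # A real pipeline would land this in S3/Snowflake; print for illustration.
    print(event.get("event_id"), event.get("event_type"))
```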