variety of databases Working knowledge of one or more of the cloud platforms (AWS, Azure, GCP) Experience building ETL/ELT pipelines, specifically using dbt, for structured and semi-structured datasets Any orchestration tools such as Airflow, Dagster, Azure Data Factory, Fivetran, etc. Nice to have: Software
data solutions using API and microservice-based architecture. Deep understanding of ETL/ELT architecture, streaming, and event-driven processing; familiarity with tools like dbt, Airflow, Kafka, or equivalents. Familiarity with mid-sized firm tech stacks, especially in financial services, including systems such as NetSuite, Salesforce, and Addepar. Experience with Atlassian
platform roles, including at least 3 years in a leadership position. Deep hands-on expertise in modern data architecture, pipelines, and tooling (e.g., Airflow, dbt, Kafka, Spark, Python, SQL). Strong understanding of cloud infrastructure (AWS, GCP, or Azure) and scalable data systems. Familiarity with analytics and ML workflows, including
ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure) Exposure to orchestration tools such as Kubeflow Pipelines or Airflow Familiarity with dbt or similar tools for modelling data in data warehouses Desire to build interpretable and explainable ML models (using techniques such as SHAP) Desire to quantify
solutions, while also driving innovation in analytics engineering practices. Key Responsibilities Technical Leadership: Design and implement robust, scalable data pipelines using tools such as dbt and Airflow. Stakeholder Collaboration: Work closely with stakeholders to understand business needs and deliver tailored, data-driven solutions. Data Transformation: Convert raw data into clean
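The pipelines described in postings like this one center on SQL transformations of the kind dbt materialises as models. A minimal, self-contained sketch of such a transformation step, using Python's built-in sqlite3 in place of a real warehouse; table and column names here are hypothetical, not from any real project:

```python
import sqlite3

# Illustrative raw data: untrimmed emails, string amounts, one duplicate row.
RAW_ROWS = [
    ("o-1", "  alice@example.com ", "12.50"),
    ("o-2", "BOB@EXAMPLE.COM", "8.00"),
    ("o-2", "BOB@EXAMPLE.COM", "8.00"),  # duplicate to be removed
]

# The kind of SELECT a dbt model would wrap: dedupe, normalise, cast.
MODEL_SQL = """
CREATE TABLE stg_orders AS
SELECT DISTINCT
    order_id,
    LOWER(TRIM(customer_email)) AS customer_email,
    CAST(amount AS REAL)        AS amount
FROM raw_orders;
"""

def build_model(conn: sqlite3.Connection) -> int:
    """Load raw rows, materialise the staging model, return its row count."""
    conn.execute(
        "CREATE TABLE raw_orders (order_id TEXT, customer_email TEXT, amount TEXT)"
    )
    conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", RAW_ROWS)
    conn.executescript(MODEL_SQL)
    return conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]

if __name__ == "__main__":
    with sqlite3.connect(":memory:") as conn:
        print(build_model(conn))  # 2 rows after dedup and cleaning
```

In a real stack, Airflow would schedule `dbt run` against the warehouse rather than executing SQL inline like this.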
languages commonly used for data work (e.g., Python, Java, Scala) Deep understanding of ETL/ELT tools and workflow orchestration platforms (e.g., Airflow, Fivetran, dbt) Proficiency with SQL and solid grounding in data modeling concepts Familiarity with cloud services and architectures (AWS, GCP, or Azure) Proven experience managing or mentoring
either AWS or GCP. Experience with Terraform to define and manage cloud infrastructure through code. Desirables: Experience in SQL-based transformation workflows, particularly using dbt in BigQuery Experience with containerisation technologies (Docker, Kubernetes). Familiarity with streaming data ingestion technologies (Kafka, Debezium). Exposure to data management and Linux administration
data best practices across teams Champion data quality, governance, and documentation Key Requirements: Strong experience with Python, SQL, and modern ETL tools (e.g., Airflow, dbt) Solid grasp of cloud platforms (AWS/GCP/Azure) and data warehouses (e.g., BigQuery, Snowflake) Familiarity with streaming technologies (Kafka, Kinesis, etc.) Passion for
cloud-based database services (Snowflake). Knowledge of data warehousing, orchestration, and pipeline technologies (Apache Airflow, Kafka, Azure Data Factory, etc.). Experience with dbt for modelling Server administration and networking fundamentals
Herndon, Virginia, United States Hybrid / WFH Options
Maxar Technologies Holdings Inc
data. Experience building data-streaming processes. Experience using PostGIS. Experience with any of the following: Apache Hive, Trino, Presto, Starburst, OpenMetadata, Apache Superset, Terraform, dbt, Tableau, Fivetran, Airflow. Experience implementing resilient, scalable, and supportable systems in AWS. Experience using a wide variety of open-source technologies and cloud services. Experience
San Diego, California, United States Hybrid / WFH Options
Avidity Biosciences
of experience (BA/BS); or a Master's degree with 8+ years of experience with modern data engineering using SQL & Python Mastery of dbt for modular, scalable, and testable data transformations Solid expertise in BI and visualization tools (e.g., Looker, Tableau, Mode) and their data modeling layers. Experience in
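"Testable data transformations" in the dbt sense usually means schema tests such as dbt's built-in `unique` and `not_null` generic tests. A hedged sketch of what those two checks assert, reimplemented in plain Python; the rows and column names are illustrative only:

```python
from collections import Counter

def not_null(rows, column):
    """Rows where the column is NULL -- an empty result means the test passes."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """Values appearing more than once -- an empty result means the test passes."""
    counts = Counter(r.get(column) for r in rows)
    return [value for value, n in counts.items() if n > 1]

# Illustrative staging-model output with one NULL email and a duplicated id.
rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "c@x.com"},
]

print(not_null(rows, "email"))  # [{'id': 2, 'email': None}]
print(unique(rows, "id"))       # [2]
```

In dbt itself these checks are declared in a model's YAML schema file and run with `dbt test`, returning the failing rows much like the functions above.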
governance frameworks. Excellent problem-solving, communication, and leadership skills. Preferred Qualifications: Snowflake and/or Databricks certifications or similar Experience with tools like dbt, Airflow, or Terraform. Background in machine learning or advanced analytics is a plus.
the Data space. This role will also allow the successful individual to cross-train into modern Data Engineering tools and technologies such as Airflow, dbt, and Snowflake, as well as further develop their skills in Python, SQL, and Market Data platforms. The firm works on a hybrid working schedule (three days per week
Education & Experience: Bachelor's in Computer Science or related field; 2-3 years in data engineering. Technical Skills: Strong SQL & Python, experience with Airflow, dbt, cloud platforms (AWS), and warehousing tools (Snowflake, Redshift, BigQuery). Bonus: Familiarity with APIs, Docker, and automation tools. Soft Skills: Team player, proactive, strong communicator
a leader within the business over time. The current tech stack is varied, made up of TypeScript, Python, PostgreSQL, Redis, and dbt on AWS. You’ll be encouraged to take ownership of products and you’ll be given this autonomy from the co-founders. The products handle
understanding of ETL processes, data modelling, and relational databases. Ability to collaborate across teams and independently solve complex data problems. Bonus: Experience with Airflow, dbt, or cloud-based data platforms such as BigQuery, Snowflake, or AWS. Why Join Us? Work on real-world FinTech products with data at their core.
Manchester Area, United Kingdom Hybrid / WFH Options
Searchability®
data warehouse technologies (such as Amazon Redshift, Google BigQuery, or Snowflake) Hands-on experience with ETL tools and frameworks, including Apache Airflow, Talend, or dbt Strong programming ability in Python or another data-focused language Knowledgeable about data management best practices, including governance, security, and compliance standards Familiar with cloud
a related field, with a focus on building scalable data systems and platforms. Strong expertise with modern data tools and frameworks such as Spark, dbt, Airflow or Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT
preferred) or similar Programming languages: SQL, Python, REST APIs Data Ingestion/ETL tools: Airflow, Snowpipe, or similar Data Transformation tools (SQL or Python based): dbt or similar Data Management & Governance platforms: Collate/OpenMetadata, Collibra, InfoSphere, Monte Carlo, Informatica, or similar Cloud Data Visualization Tools: Tableau (preferred) or similar Git