relational, dimensional, semantic) Proven experience in data warehouse design, implementation, and maintenance (Snowflake) Hands-on with dbt for modular, testable transformations Experience with orchestration and ingestion tools: Airflow, Prefect, Airbyte, Fivetran, Kafka Familiar with ELT, schema-on-read, DAGs, and performance optimization Cloud & Infrastructure Experience with AWS (S3, RDS, Redshift, etc.) Familiar with Terraform, Docker, and containerized workflows (bonus) Skilled …
for cloud infrastructure as code and automated deployments Hands-on with dbt for analytics engineering and transformation-in-warehouse Familiarity with modern data ingestion tools like dlt, Sling, Fivetran, Airbyte, or Stitch Apache Spark experience, especially useful for working with large-scale batch data or bridging into heavier data science workflows Exposure to real-time/event-driven architectures, including …
engineering solutions, ideally with Snowflake. Strong proficiency in SQL, Python, and dbt for transformations and pipeline automation. Practical experience with Snowflake features and RBAC management. Familiarity with ingestion tools (Airbyte, Fivetran, Hevo) and cloud services (AWS preferred). Solid understanding of data modelling, governance principles, and BI enablement (Power BI). Knowledge of CI/CD and version-controlled development …
but we care more about general engineering expertise and problem-solving than specific language knowledge. Strong interest in data and modern data infrastructure technologies such as BigQuery, dbt, Airbyte, Meltano, Airflow/Dagster/Prefect, and others. Familiarity with the full cycle of software development, from design and implementation to testing and deployment. Excellent communication skills and the ability …
Deploying containerized AI workloads using Docker, K3s, and Istio across Zero Trust micro-segmented environments following NSA Kubernetes STIG guidance. Designing data acquisition and preprocessing workflows with Apache Kafka, Airbyte, and Apache NiFi for ingesting telemetry, EO/IR, and SIGINT sources into LLM/LVM training pipelines. Using Elastic Stack, OpenSearch, and Neo4j to store, search, and graph-represent …
in SQL, dbt, and Python for ELT workflows. Hands-on experience with Snowflake features: schema design, micro-partitioning, warehouses, tasks, streams, and Snowpark. Familiarity with replication and transformation tools (Airbyte, dbt) and cloud ecosystems (AWS preferred). Strong understanding of data governance, metadata, and lineage frameworks. Experience enabling BI tools (e.g., Power BI) through Snowflake data models. Desirable: Experience in …
City of London, London, United Kingdom Hybrid/Remote Options
Clarity
infrastructure. Experience with Kubernetes (K8S) is a plus. - Cloud Expertise: Familiarity with AI infrastructure on AWS and GCP, including Sagemaker, Vertex, Triton, and GPU computing. - Bonus Points: Experience with Airbyte is a significant advantage. Perks and Benefits: - Hybrid/Remote Option: Freedom to work from anywhere in the world with flexible core working hours. 🏠 - In-person Meetups and Regular Team …
supply chain data structures (work orders, BOMs, routings, inventory, purchase orders) Leadership capabilities and ability to deliver independently and communicate with executives Familiarity with data pipelines and ETL tools (Airbyte, dbt, Fivetran, Airflow, or similar) is a plus Benefits & Compensation Salary Range: $110,000 - $165,000 Health, dental, and vision insurance 401(k) with company match Paid time off and …
replication, backups, DR, change management and on-call experience. Communication: Strong documentation, runbook writing and stakeholder collaboration. Desirable NoSQL & Caching: Couchbase operations and Memorystore/Redis experience. Tooling: DataStream, Airbyte, ORM, Laravel. Containers & orchestration: Kubernetes basics where DB tooling/operators are used. Agile and Jira: experience in Agile teams (Scrum/Kanban); create/refine user stories, estimate and …
in a role where your work is seen, heard, and used - we'd love to hear from you. Our Data/Analytics Tech Stack: Domo, SQL, dbt, Looker, Snowflake, Airbyte, Stitch, Python, Docker, MS SQL Server, AWS (EC2, S3, etc.) What you'll do: Work on a wide range of different Analytical projects across different teams in the company Enhance …
and audit logging. Proven operational experience with high availability, backups, disaster recovery, and change management. Excellent documentation and stakeholder communication skills. Bonus points for experience with Couchbase, Redis, Kubernetes, Airbyte, Laravel, and Agile methodologies. GCP certifications (Data Engineer or Cloud Architect) are also desirable. What you'll get in return Salary: £65,000-£75,000 depending on experience Company equity …
such as Jira, Salesforce, NetSuite, Greenhouse, GSuite, and a few dozen others. To achieve this goal, we leverage Python and Go in combination with open source tools, such as Airbyte and Temporal. Our work helps to make data-driven decisions, and increases the efficiency of business operations. Location: This role can be held anywhere in EMEA time zones. The role …
with both digital and offline marketing, a plus. Proficiency in SQL is essential. Knowledge of Python or R, a plus. Hands-on experience with modern data analytics stack (e.g. Airbyte, BigQuery, dbt, Omni, Hex, or equivalents) and data modeling best practices. Strong communication and storytelling skills Deeply curious with a drive to learn. Pragmatic. Looking for real world business impact …
LangChain and LlamaIndex. He knows this space better than anyone. Dream Team : We've assembled authentication, integrations, distributed systems, and AI experts from Okta, Redis, Microsoft, Splunk, Ngrok, Google, Airbyte, Disney, and HPE who've built and founded multiple successful developer platforms. Perfect Timing : We're at the inflection point of AI adoption. The biggest problem isn't better models …
San Francisco, California, United States Hybrid/Remote Options
Plum Inc
PLUM is a fintech company empowering financial institutions to grow their business through a cutting-edge suite of AI-driven software, purpose-built for lenders and their partners across the financial ecosystem. We are a boutique firm, where each person …
Austin, Texas, United States Hybrid/Remote Options
Plum Inc
Atlanta, Georgia, United States Hybrid/Remote Options
Plum Inc
Rochester, New York, United States Hybrid/Remote Options
Mindex
Founded in 1994 and celebrating 30 years in business, Mindex is a software development company with a rich history of demonstrated software and product development success. We specialize in agile software development, cloud professional services, and creating our own innovative …
with are learning management systems, student enrollment, and academic operations on web and mobile platforms. What You'll Do Design, implement, and maintain scalable data pipelines using Snowflake, Coalesce.io, Airbyte, and SQL Server/SSIS, with some use of Azure Data Factory Build and maintain dimensional data models to ensure high-quality, structured data for analytics and reporting Implement Medallion … reliable ETL processes, data transformations, and data integration workflows Help improve data modeling practices and address weaknesses in dimensional modeling What You Bring Hands-on experience with Snowflake, Coalesce.io, Airbyte, SQL Server/SSIS, and Azure Data Factory Strong understanding of Medallion architecture and dimensional data modeling Practical experience in building ETL pipelines and transforming data for analytics Familiarity with …
Raleigh, North Carolina, United States Hybrid/Remote Options
Buildops
experience who enjoys building clean datasets from raw business data. Responsibilities Build and maintain data pipelines from internal and external data sources into Snowflake using technologies such as dbt, Airbyte, Airflow, and AWS. Develop clean, well-modeled datasets from internal data sources to support reporting and analytics. Own Snowflake administration, including network controls, personal access tokens, role-based access, performance … Qualifications 3+ years of experience in data engineering or a related field. Strong SQL and Python skills and hands-on experience with Snowflake administration. Experience building pipelines with dbt, Airbyte, Airflow, and AWS. Proficiency in dimensional data modeling (e.g., star/snowflake schemas). Experience leveraging AI-enabled tools to streamline engineering workflows Strong problem-solving skills and ability to …
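Several of the postings above ask for dimensional ("star/snowflake schema") modeling. As a minimal illustrative sketch only — the table names, columns, and data are invented here, and a real warehouse would use Snowflake with dbt rather than stdlib sqlite3 — a star schema pairs a fact table of measures with foreign keys into descriptive dimension tables:

```python
# Hypothetical star schema: one fact table (measures + foreign keys)
# surrounded by dimension tables (descriptive attributes).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_orders (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES dim_customer(customer_id),
    date_id INTEGER REFERENCES dim_date(date_id),
    amount REAL
);
INSERT INTO dim_customer VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO dim_date VALUES (20240101, '2024-01-01');
INSERT INTO fact_orders VALUES (100, 1, 20240101, 250.0),
                               (101, 2, 20240101, 100.0);
""")

# A typical analytics query: join the fact table to a dimension
# and aggregate the measure.
cur.execute("""
SELECT c.name, SUM(f.amount)
FROM fact_orders f JOIN dim_customer c USING (customer_id)
GROUP BY c.name ORDER BY c.name
""")
print(cur.fetchall())  # [('Acme', 250.0), ('Globex', 100.0)]
```

In practice the dimension and fact tables would be built as dbt models on top of raw tables landed by an ingestion tool such as Airbyte or Fivetran.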
Manchester, England, United Kingdom Hybrid/Remote Options
KDR Talent Solutions
in an ambitious, people-focused culture. The Role Lead, mentor, and develop a small team of two Data Engineers. Design, build, and optimise data pipelines and integrations (using dbt, Airbyte, OpenFlow, APIs, and Python) to deliver reliable and scalable data. Support the development of a new cloud-based data platform, driving modernisation from legacy Microsoft systems. Oversee data architecture, modelling … Python. Experience with data modelling (3NF, Kimball) and modern data integration tools. Exposure to cloud-based platforms such as Snowflake, BigQuery, or similar is advantageous. Experience with dbt, Airbyte, or APIs is desirable. Strong communicator who can link data initiatives to commercial outcomes. Enjoys mentoring, providing direction, and developing others’ technical growth. Comfortable working in a dynamic, evolving environment …
Airbyte is the open-source standard for Data Movement. We enable data teams to move data from applications, APIs, unstructured sources, and databases to data warehouses, lakes, AI applications and LLMs. With our approach we are finally solving the need for extensibility and control that every company needs with data. So far, our customers, users, and ourselves have built over … in product-led growth, where we build something awesome that all our users love. We're committed to providing as much context to our current employees and candidates. The Airbyte company handbook is open to all. If you find this role exciting, we encourage you to apply even if you think you don't meet all requirements. Opportunity You'll … be joining a senior team of frontend engineers who own the Airbyte frontend end to end. We have big plans for expanding our data product to include orchestration, data observability, lineage, and streaming. You will design and build products that will allow Airbyte to be the single pane of glass for data teams and individuals to manage their entire data …