Herndon, Virginia, United States Hybrid / WFH Options
Maxar Technologies Holdings Inc
data. Experience building data-streaming processes. Experience using PostGIS. Experience with any of the following: Apache Hive, Trino, Presto, Starburst, OpenMetadata, Apache Superset, Terraform, dbt, Tableau, Fivetran, Airflow. Experience implementing resilient, scalable, and supportable systems in AWS. Experience using a wide variety of open-source technologies and cloud services. Experience …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
as well as 30 engineers in the business's data arm. Requirements: 3+ years' data engineering experience Snowflake experience Proficiency across an AWS tech stack DBT Expertise Terraform Experience Nice to Have: Data Modelling Data Vault Apache Airflow Benefits: Up to 10% Bonus Up to 14% Pension Contribution 29 Days Annual …
or a related field, with a focus on building scalable data systems and platforms. Expertise in modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure) Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures Proficiency …
San Diego, California, United States Hybrid / WFH Options
Avidity Biosciences
of experience (BA/BS); or a Master's degree with 8+ years of experience with modern data engineering using SQL & Python Mastery of dbt for modular, scalable, and testable data transformations Solid expertise in BI and visualization tools (e.g., Looker, Tableau, Mode) and their data modeling layers. Experience in …
Tools). Experience with one or more of the following is a plus: Kubernetes, Prometheus, Argo Workflows, GitHub Actions, Elasticsearch/OpenSearch, PostgreSQL, BigQuery, dbt data pipelines, Fastly, Storybook, Contentful, Deno, Bun. Benefits We want to give you a great work environment; contribute back to both your personal and professional …
governance frameworks. Excellent problem-solving, communication, and leadership skills. Preferred Qualifications: Snowflake and/or Databricks certifications or similar Experience with tools like dbt, Airflow, or Terraform. Background in machine learning or advanced analytics is a plus.
the Data space. This role will also allow the successful individual to cross-train into modern Data Engineering tools and technologies such as Airflow, dbt and Snowflake, as well as further develop their skills in Python, SQL and Market Data platforms. The firm works on a hybrid working schedule (three days per week …)
Education & Experience: Bachelor's in Computer Science or related field; 2-3 years in data engineering. Technical Skills: Strong SQL & Python, experience with Airflow, DBT, cloud platforms (AWS), and warehousing tools (Snowflake, Redshift, BigQuery). Bonus: Familiarity with APIs, Docker, and automation tools. Soft Skills: Team player, proactive, strong communicator …
a leader within the business over time. The current tech stack is varied, and it’s currently made up of TypeScript, Python, PostgreSQL, Redis, DBT on AWS. You’ll be encouraged to take ownership of products and you’ll be given this autonomy from the co-founders. The products handle …
understanding of ETL processes, data modelling, and relational databases. Ability to collaborate across teams and independently solve complex data problems. Bonus: Experience with Airflow, dbt, or cloud-based data platforms such as BigQuery, Snowflake, or AWS. Why Join Us? Work on real-world FinTech products with data at their core.
Manchester Area, United Kingdom Hybrid / WFH Options
Searchability®
data warehouse technologies (such as Amazon Redshift, Google BigQuery, or Snowflake) Hands-on experience with ETL tools and frameworks, including Apache Airflow, Talend, or dbt Strong programming ability in Python or another data-focused language Knowledgeable about data management best practices, including governance, security, and compliance standards Familiar with cloud …
Team The Data team is a cross-functional team of experienced and passionate data enthusiasts. We use and own modern data tools (Fivetran, Snowflake, dbt, Looker) and cover a diverse range of data problems and stakeholders. What we're offering you: Flexible hours and summer hours Competitive holiday benefits …
a related field, with a focus on building scalable data systems and platforms. Strong expertise with modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/…
preferred) or similar Programming languages: SQL, Python, REST APIs Data Ingestion/ETL tools: Airflow, Snowpipe, or similar Data Transformation tools (SQL or Python based): dbt or similar. Data management & Governance platforms: Collate/OpenMetadata, Collibra, Infosphere, Monte Carlo, Informatica or similar Cloud Data Visualization Tools: Tableau (preferred) or similar Git …
and coaching others to succeed. Have a strong background in building and managing data infrastructure at scale, with expertise in Python, SQL, BigQuery, AWS, dbt and Airflow. Have a strong background in data modelling and building scalable data pipelines. Are naturally curious and enthusiastic about experimenting with new tech to …
Desired Skills & Experience: Core Technical Skills Expert in Python, SQL, and modern data science toolkits (e.g. scikit-learn, XGBoost, statsmodels). Solid grasp of dbt for data transformation. Experience with modern cloud data stacks - Snowflake, BigQuery, Redshift, etc. Comfortable working in agile environments with tools like Git, Jupyter, Airflow. Domain …
features they implement. What we need from you: At least 3 years of relevant data engineering experience Strong Python and SQL skills Experience with dbt Experience with AWS Experience working with a columnar database such as Redshift Strong experience with ETL/ELT and the management of data pipelines Familiarity …
timelines and quality standards are met. Required Skills & Experience: 5+ years' experience as a Data Analyst Strong skills in Python, SQL and tools like dbt, Snowflake, AWS S3 and SQL Server. Solid understanding of financial instruments such as Equities, Futures, Forwards, CDS, IRS and ETFs with deep knowledge in at …
a related field, with a focus on building scalable data systems and platforms. Strong expertise in modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure) Deep understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures …
Role Job Description: AWS Services: Glue, Lambda, IAM, Service Catalog, CloudFormation, Lake Formation, SNS, SQS, EventBridge Language & Scripting: Python and Spark ETL: DBT Good to Have: Airflow, Snowflake, Big Data (Hadoop), and Teradata Responsibilities: Serve as the primary point of contact for all AWS-related data initiatives and …
Analyst, especially with experience in a systematic Hedge Fund or similar Quantitative Trading environment. Strong technical skills in Python, SQL and tools such as dbt, Snowflake, AWS S3, KDB and SQL Server. Solid understanding of financial instruments such as Equities, Futures, Forwards, CDS, IRS and ETFs with deep knowledge in …