… good but not necessary) Agile. The following is desirable, not essential: AWS or GCP; buy-side experience; data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio; Fixed Income performance, risk or attribution; TypeScript and Node. Role: Python Developer (Software Engineer, Programmer, Developer, Python, Fixed Income, JavaScript, Node, Fixed … times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio. This has been described as the only corporate environment with a start-up/fintech attitude towards technology. Hours are …
A typical day will look like this: Connect with the team at standup to catch up on the latest. Build data pipelines with Spark or DBT on Starburst. Use SQL to transform data into meaningful insights. Build and deploy infrastructure with Terraform. Implement DDL and DML with Iceberg. Do a code review …
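For illustration, a minimal sketch of the day-to-day work this listing describes, assuming a Spark session configured with an Iceberg catalog; the catalog, schema, and table names are hypothetical, not taken from the listing:

```python
# A minimal sketch of the pipeline tasks described above, assuming a Spark
# session already configured with an Iceberg catalog named "lake". The
# catalog, schema, and table names are illustrative only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("daily-pipeline").getOrCreate()

# DDL with Iceberg: create the target table if it does not exist.
spark.sql("""
    CREATE TABLE IF NOT EXISTS lake.analytics.trades (
        trade_id BIGINT,
        symbol   STRING,
        notional DOUBLE,
        trade_dt DATE
    ) USING iceberg
    PARTITIONED BY (trade_dt)
""")

# SQL transform: shape raw data into something meaningful.
daily = spark.sql("""
    SELECT trade_id, symbol, notional, trade_dt
    FROM lake.raw.trades_staging
    WHERE trade_dt = current_date()
""")

# DML: append today's rows into the Iceberg table.
daily.writeTo("lake.analytics.trades").append()
```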
… Experience deploying and maintaining cloud infrastructure (e.g. AWS, GCP, or Azure). Familiarity with data modeling and warehousing concepts, and dimensional modeling techniques. dbt knowledge is preferable. Comfortable working with CI/CD tools, version control, and containers (e.g. Git, Jenkins, Docker). Understanding of data governance, security best …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Duel
… with Snowflake. You understand event-driven architectures and real-time data processing. You have experience implementing and maintaining scalable data pipelines using tools like dbt, Apache Airflow, or similar. You have no issue working with either structured or semi-structured data. You are comfortable working with data engineering and scripting …
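For illustration, a sketch of the event-driven, real-time processing this listing asks about. The listing names no broker or topics, so this assumes Kafka via the kafka-python package, with a hypothetical topic and a local broker:

```python
# An illustrative event-driven consumer; broker address and topic name are
# hypothetical. Handles JSON (semi-structured) payloads as they arrive.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "product.events",                      # hypothetical topic name
    bootstrap_servers=["localhost:9092"],  # assumed local broker
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # Real-time processing step: dispatch each event as it arrives.
    print(event.get("event_type"), event)
```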
Stack: Python and Scala; Starburst and Athena; Kafka and Kinesis; DataHub; MLflow and Airflow; Docker and Terraform; Kafka, Spark, Kafka Streams and KSQL; DBT; AWS, S3, Iceberg, Parquet, Glue and EMR for our Data Lake; Elasticsearch and DynamoDB. More information: Enjoy fantastic perks like private healthcare & dental insurance, a …
Nottinghamshire, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (Python, Data Modelling, ETL/ELT, Apache Airflow, DBT, AWS) Enterprise-scale tech firm Up to £70,000 plus benefits - FULLY REMOTE UK. Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly …
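For illustration, a sketch of an Airflow DAG orchestrating an ETL/ELT flow of the kind this listing describes; the schedule, project path, and task names are assumptions, and dbt is invoked through its standard `dbt run` CLI:

```python
# Illustrative Airflow DAG: extract with Python, then transform with dbt.
# Paths, schedule, and task names are assumptions, not from the listing.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_raw_data():
    """Placeholder extraction step (e.g. pull from an API into S3)."""
    pass


with DAG(
    dag_id="elt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_raw_data)

    # dbt's real CLI entry point; the project directory is hypothetical.
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    extract >> transform
```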
Birmingham, Staffordshire, United Kingdom Hybrid / WFH Options
Yelp USA
… technologies: Python, AWS Redshift, AWS Athena/Apache Presto, Big Data technologies (e.g. S3, Hadoop, Hive, Spark, Flink, Kafka, etc.), NoSQL systems like Cassandra; DBT is nice to have. What you'll get: Full responsibility for projects from day one, a collaborative team, and a dynamic work environment. Competitive salary …
… R. Working knowledge of message queuing and stream processing. Experience with Apache Spark or similar technologies. Experience with Agile and Scrum methodologies. Familiarity with dbt and Airflow is an advantage. Experience working in a start-up or scale-up environment. Experience working in the fields of financial technology, traditional financial …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
… as well as 30 engineers in the business's data arm. Requirements: 3+ years' data engineering experience; Snowflake experience; proficiency across an AWS tech stack; DBT expertise; Terraform experience. Nice to have: Data Modelling; Data Vault; Apache Airflow. Benefits: up to 10% bonus; up to 14% pension contribution; 29 days' annual leave …
… (e.g., Snowflake). Proficiency in SQL and at least one scripting language (e.g., Python, Bash). Familiarity with data pipeline orchestration tools (e.g., Airflow, dbt). Strong understanding of data quality principles and best practices. Excellent communication and collaboration skills. Working with AWS, Twilio Segment and Google Analytics. Hands-on …
… to other engineering teams in a Flask API. Our current technology stack primarily includes Linux, Docker, Kubernetes, Jenkins, Kafka, Python, Flask, Postgres, AWS Redshift, dbt, Google BigQuery, Prometheus, Grafana, Elasticsearch, Kibana, Sisense. Your responsibilities: As a member of the data team, your responsibilities will include contributions to: Developing and …
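For illustration, a minimal sketch of exposing data to other engineering teams through a Flask API, as this listing mentions; the route, payload shape, and data source are hypothetical stand-ins:

```python
# A minimal Flask API exposing data to other teams; the route, payload,
# and data source are hypothetical stand-ins.
from flask import Flask, jsonify

app = Flask(__name__)


def load_latest_metrics():
    """Placeholder: in practice this might query Postgres or Redshift."""
    return [{"metric": "daily_active_users", "value": 1234}]


@app.route("/metrics/latest")
def latest_metrics():
    return jsonify(load_latest_metrics())


if __name__ == "__main__":
    app.run(port=5000)
```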
… either AWS or GCP. Experience with Terraform to define and manage cloud infrastructure through code. Desirables: Experience in SQL-based transformation workflows, particularly using DBT in BigQuery. Experience with containerisation technologies (Docker, Kubernetes). Familiarity with streaming data ingestion technologies (Kafka, Debezium). Exposure to data management and Linux administration …
… while collaborating with technical and non-technical stakeholders. What you'll do: Crafting robust Business Intelligence (BI) solutions using cutting-edge tools like Alteryx, DBT, Snowflake, Azure, Tableau, Power BI, and ThoughtSpot. Collaborating with stakeholders to gather business requirements, designing innovative solutions, and creating intuitive reports and dashboards. You'll …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
… efficient code, and comfortable undertaking system optimisation and performance tuning tasks. Comfortable working with relational databases such as Oracle, PostgreSQL, MySQL. Has exposure to DBT and data quality test frameworks. Has awareness of Infrastructure as Code such as Terraform and Ansible. BENEFITS: Competitive salary. Company laptop supplied. Bonus scheme.
… working autonomously. It would be a real bonus, but not a requirement, if: You've worked in a start-up environment. You've got DBT experience. You're familiar with MLOps principles and practices and their application in a production setting. Interview process: You'll have a 20-minute conversation …
… Snowflake/Databricks/Redshift/BigQuery, including performance tuning and optimisation. Understanding of best practices for designing scalable and efficient data models, leveraging dbt for transformations. Familiarity with CircleCI, Terraform or similar tools for deployment and infrastructure as code. In this role, you will be responsible for: Shipping and …
San Diego, California, United States Hybrid / WFH Options
Avidity Biosciences
… of experience (BA/BS), or a Master's degree with 8+ years of experience, with modern data engineering using SQL & Python. Mastery of dbt for modular, scalable, and testable data transformations. Solid expertise in BI and visualization tools (e.g., Looker, Tableau, Mode) and their data modeling layers. Experience in …
… analysis. Knowledge of dashboard design and data visualization best practices. Experience with cloud-based data infrastructure (AWS). Familiarity with modern data stack tools (Airflow, dbt, etc.). Why This Role Matters: Judge.me is at an inflection point. As the market leader in Shopify reviews, we've chosen to build our future …
… and coaching others to succeed. Have a strong background in building and managing data infrastructure at scale, with expertise in Python, SQL, BigQuery, AWS, dbt and Airflow. Have a strong background in data modelling and building scalable data pipelines. Are naturally curious and enthusiastic about experimenting with new tech to …
Have hands-on experience with cloud infrastructure (GCP/AWS/Azure), infrastructure-as-code (Terraform), containerisation (Docker/k8s), and data pipelines (SQL, dbt, Airbyte). Love automation, process improvement, and finding ways to help others work efficiently. Are comfortable working autonomously and taking responsibility for the delivery of …
… about our Minimum Standards of Product Development here. The Tech: SQL & Python are your native languages, with a dash of Scala when needed. DBT, data modeling, and analytics are your go-to tools; Airflow is your daily companion. BigQuery/GCP hold no secrets for you, and AWS is …
Experience with at least one BI/visualisation tool (e.g. Looker, Power BI, Tableau, etc.). Knowledge of ETL/ELT tools (e.g. dbt, Fivetran, etc.). An understanding of Data Governance principles, and the importance of maintaining data quality and providing accurate data and sound insights. An agile …
… Tools). Experience with one or more of the following is a plus: Kubernetes, Prometheus, Argo Workflows, GitHub Actions, Elasticsearch/OpenSearch, PostgreSQL, BigQuery, DBT data pipelines, Fastly, Storybook, Contentful, Deno, Bun. Benefits: We want to give you a great work environment; contribute back to both your personal and professional …