engineering, or similar roles. Hands-on expertise with Python (NumPy/pandas) and SQL. Proven experience designing and building robust ETL/ELT pipelines (dbt, Airflow). Strong knowledge of data pipelining, schema design, and cloud platforms (e.g., Snowflake, AWS). Excellent communication skills and the ability to translate technical …
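Several of the listings on this page centre on orchestrated ETL/ELT work with Airflow and dbt. As a rough illustration of what that looks like in practice, here is a minimal Airflow DAG that runs an extract step and then triggers dbt; the DAG id, project path, and selector are invented for the sketch, not taken from any listing.

```python
# Minimal sketch of an Airflow DAG orchestrating an extract step followed by
# a dbt run. Names (dag id, paths, selectors) are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_orders() -> None:
    # Placeholder extract step; a real task would pull from an API or database.
    print("extracting orders...")


with DAG(
    dag_id="orders_elt",              # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)

    # Run the dbt models in the staging layer; path and selector are assumptions.
    dbt_run = BashOperator(
        task_id="dbt_run_staging",
        bash_command="cd /opt/dbt/project && dbt run --select staging",
    )

    extract >> dbt_run
```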
… Experience deploying and maintaining cloud infrastructure (e.g. AWS, GCP, or Azure). Familiarity with data modeling and warehousing concepts, and dimensional modeling techniques. dbt knowledge is preferable. Comfortable working with CI/CD tools, version control, and containers (e.g. Git, Jenkins, Docker). Understanding of data governance, security best …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Lloyds Bank plc
Agile environment. Deep technical expertise in software and data engineering, programming languages (Python, Java, etc.). Understanding of orchestration (Composer, DAGs), data processing (Dataflow, dbt), and database capabilities (e.g. BigQuery, Cloud SQL, Bigtable). Knowledge of container technologies (Docker, Kubernetes), IaC (Terraform) and experience with cloud platforms such as GCP. Detailed …
working autonomously. It would be a real bonus, but not a requirement, if: You've worked in a start-up environment. You've got dbt experience. You're familiar with MLOps principles and practices and their application in a production setting. Interview Process: You'll have a 20-minute conversation …
R. Working knowledge of message queuing and stream processing. Experience with Apache Spark or similar technologies. Experience with Agile and Scrum methodologies. Familiarity with dbt and Airflow is an advantage. Experience working in a start-up or scale-up environment. Experience working in the fields of financial technology, traditional financial …
ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure). Exposure to orchestration tools such as Kubeflow Pipelines or Airflow. Familiarity with dbt or similar tools for modelling data in data warehouses. Desire to build interpretable and explainable ML models (using techniques such as SHAP). Desire to quantify …
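Where a listing asks for interpretable models "using techniques such as SHAP", the core workflow is small. A minimal sketch follows; the dataset and model are stand-ins for a real training pipeline, not anything specified by the role.

```python
# Minimal sketch: explaining a tree-based model's predictions with SHAP.
# The dataset and model here are stand-ins for a real training pipeline.
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# TreeExplainer computes per-feature contributions for each prediction.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Global view: which features drive the model's output overall.
shap.summary_plot(shap_values, X)
```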
either AWS or GCP. Experience with Terraform to define and manage cloud infrastructure through code. Desirables: Experience in SQL-based transformation workflows, particularly using dbt in BigQuery. Experience with containerisation technologies (Docker, Kubernetes). Familiarity with streaming data ingestion technologies (Kafka, Debezium). Exposure to data management and Linux administration …
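As a concrete illustration of the "SQL-based transformation workflows in BigQuery" item above, running a query from Python with the official client might look like the following; the project, dataset, and SQL are invented for the example (the listing itself refers to dbt-driven workflows, which layer on top of queries like this).

```python
# Minimal sketch: running a SQL aggregation in BigQuery from Python.
# Project, dataset, table and query are invented placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

sql = """
    SELECT order_date, SUM(amount) AS revenue
    FROM `example_project.example_dataset.orders`
    GROUP BY order_date
    ORDER BY order_date
"""

# result() blocks until the query job finishes, then yields rows.
for row in client.query(sql).result():
    print(row["order_date"], row["revenue"])
```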
data best practices across teams. Champion data quality, governance, and documentation. Key Requirements: Strong experience with Python, SQL, and modern ETL tools (e.g., Airflow, dbt). Solid grasp of cloud platforms (AWS/GCP/Azure) and data warehouses (e.g., BigQuery, Snowflake). Familiarity with streaming technologies (Kafka, Kinesis, etc.). Passion for …
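For the streaming requirement (Kafka, Kinesis), a bare-bones Kafka consumer in Python is a reasonable picture of the day-to-day work. The topic name, broker address, and message shape below are assumptions made for the sketch.

```python
# Bare-bones Kafka consumer sketch using the kafka-python library. The topic
# name, broker address and message payload shape are illustrative assumptions.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",                            # hypothetical topic
    bootstrap_servers="localhost:9092",  # hypothetical broker
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    # Each message carries the deserialized event payload in message.value.
    print(message.topic, message.offset, message.value)
```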
cloud-based database services (Snowflake). Knowledge of data warehousing, orchestration and pipeline technologies (Apache Airflow, Kafka, Azure Data Factory, etc.). Experience with dbt for modelling. Server administration and networking fundamentals …
Have hands-on experience with cloud infrastructure (GCP/AWS/Azure), infrastructure-as-code (Terraform), containerisation (Docker/k8s) and data pipelines (SQL, dbt, Airbyte). Love automation, process improvement and finding ways to help others work efficiently. Are comfortable working autonomously and taking responsibility for the delivery of large …
Snowflake/Databricks/Redshift/BigQuery, including performance tuning and optimisation. Understanding of best practices for designing scalable and efficient data models, leveraging dbt for transformations. Familiarity with CircleCI, Terraform or similar tools for deployment and infrastructure as code. In this role, you will be responsible for: Shipping and …
Herndon, Virginia, United States Hybrid / WFH Options
Maxar Technologies Holdings Inc
data. Experience building data-streaming processes. Experience using PostGIS. Experience with any of the following: Apache Hive, Trino, Presto, Starburst, OpenMetadata, Apache Superset, Terraform, dbt, Tableau, Fivetran, Airflow. Experience implementing resilient, scalable, and supportable systems in AWS. Experience using a wide variety of open-source technologies and cloud services. Experience …
San Diego, California, United States Hybrid / WFH Options
Avidity Biosciences
of experience (BA/BS); or a Master's degree with 8+ years of experience with modern data engineering using SQL & Python. Mastery of dbt for modular, scalable, and testable data transformations. Solid expertise in BI and visualization tools (e.g., Looker, Tableau, Mode) and their data modeling layers. Experience in …
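"Mastery of dbt for modular, scalable, and testable data transformations" in practice often includes driving dbt from Python. A small sketch, assuming dbt-core 1.5 or later (which exposes a programmatic runner) and a hypothetical project directory and selector:

```python
# Sketch: invoking dbt programmatically from Python (requires dbt-core >= 1.5).
# The project directory and model selector are hypothetical.
from dbt.cli.main import dbtRunner

runner = dbtRunner()

# "build" runs models and their tests together, which is what makes dbt
# transformations both modular and testable in CI.
result = runner.invoke(
    ["build", "--select", "staging", "--project-dir", "/opt/dbt/project"]
)
print("success:", result.success)
```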
Tools). Experience with one or more of the following is a plus: Kubernetes, Prometheus, Argo Workflows, GitHub Actions, Elasticsearch/OpenSearch, PostgreSQL, BigQuery, dbt data pipelines, Fastly, Storybook, Contentful, Deno, Bun. Benefits: We want to give you a great work environment and contribute back to both your personal and professional …
the Data space. This role will also allow the successful individual to cross-train into modern data engineering tools and technologies such as Airflow, dbt and Snowflake, as well as further develop their skills in Python, SQL and market data platforms. The firm works on a hybrid working schedule (three days per week …
Manchester Area, United Kingdom Hybrid / WFH Options
Searchability®
data warehouse technologies (such as Amazon Redshift, Google BigQuery, or Snowflake). Hands-on experience with ETL tools and frameworks, including Apache Airflow, Talend, or dbt. Strong programming ability in Python or another data-focused language. Knowledgeable about data management best practices, including governance, security, and compliance standards. Familiar with cloud …
and coaching others to succeed. Have a strong background in building and managing data infrastructure at scale, with expertise in Python, SQL, BigQuery, AWS, dbt and Airflow. Have a strong background in data modelling and building scalable data pipelines. Are naturally curious and enthusiastic about experimenting with new tech to …
data warehousing (BigQuery preferred). Hands-on experience with orchestration tools (preferably Airflow). Proficient in SQL and data modelling best practices. Experience with dbt or other modern data transformation frameworks. Ability to use a version control system (e.g. git) for code management and collaboration. Proficiency in efficiently extracting data …
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
Lloyds Bank plc
and integrity. Applying test data management tools for crafting, managing, and maintaining test data sets. Developing and executing data transformation tests using dbt (Data Build Tool). Performing ETL testing to validate data extraction, transformation, and loading processes. Collaborating with data engineers, analysts, and other stakeholders to identify and resolve … need Required Qualifications: Proven experience in defining and implementing data testing strategies. Hands-on experience with test data management tools. Proficiency in dbt (Data Build Tool) for data transformation and testing. Strong understanding of ETL processes and experience in ETL testing. Excellent problem-solving skills and attention to detail. Experience …
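The testing duties in the listing above (validating dbt transformations and ETL loads) largely reduce to assertions over transformed data. A small sketch of such checks in Python with pandas, mirroring the spirit of dbt's built-in tests; the table and column names are made up for illustration.

```python
# Sketch of data-transformation tests in the spirit of dbt's built-in
# checks (not_null, unique, accepted_values), written with pandas.
# Table and column names are made up for illustration.
import pandas as pd

transformed = pd.DataFrame(
    {"order_id": [1, 2, 3], "status": ["paid", "shipped", "paid"]}
)

# not_null: the key column must have no missing values.
assert transformed["order_id"].notna().all()

# unique: the key column must not contain duplicates.
assert transformed["order_id"].is_unique

# accepted_values: status must stay within the expected set.
assert transformed["status"].isin({"paid", "shipped", "refunded"}).all()
```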
Northern Ireland, United Kingdom Hybrid / WFH Options
Ocho
communicator able to interface confidently with both technical and non-technical audiences. Bonus Experience: • Familiarity with IaC frameworks (CloudFormation, Terraform, SAM) • Exposure to Snowflake, dbt, Airflow, or cost analytics/data pipeline tools • Knowledge of FinOps practices or cost intelligence platforms • Experience contributing to open-source platforms or cloud-native …
analysis. Knowledge of dashboard design and data visualization best practices. Experience with cloud-based data infrastructure (AWS). Familiarity with modern data stack tools (Airflow, dbt, etc.). Why This Role Matters: Judge.me is at an inflection point. As the market leader in Shopify reviews, we've chosen to build our future …
Greater London, England, United Kingdom Hybrid / WFH Options
Bounce Digital
/logging, and support architecture decisions. What You Bring: Strong SQL & Python (PySpark); hands-on with GCP or AWS. Experience with modern ETL tools (dbt, Airflow, Fivetran). BI experience (Looker, Power BI, Metabase); Git and basic CI/CD exposure. Background in a quantitative field; AI/ML interest a plus …
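To make the "Strong SQL & Python (PySpark)" requirement concrete, here is a small PySpark aggregation of the kind such roles involve daily; the input path and column names are placeholders, not references to a real dataset.

```python
# Small PySpark sketch: read, transform, aggregate. The file path and column
# names are placeholders invented for the example.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/orders/")  # hypothetical path

daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("created_at"))  # assumed timestamp column
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
    .orderBy("order_date")
)

daily_revenue.show()
```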