The team you'll be working with: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in the full data modeling lifecycle, including designing, implementing, and …
Xenith Solutions is a small, family-focused business that takes care of our employees and customers equally. We serve Federal/Civilian, Defense and Intelligence organizations with superior service. If you want to be …
insight. Your role is to improve their interaction with these tools, whether they are internally or externally developed. Some examples of this type of work: improving our in-house dbt CLI wrapper to make it more user-friendly and optimise runtimes; monitoring tooling interaction with tools like Sentry or Datadog to identify areas for improvement; developing our internal BI tooling … experience; we have plenty of incredible developers at Octopus Energy who are willing to teach if you're willing to learn! Required experience: Python, Git. Nice to have: SQL, dbt, GitHub, CircleCI, Airflow, Kubernetes, Terraform, a cloud warehouse provider e.g. Databricks, GCP, Snowflake, AWS. We aren't necessarily looking for someone who is "10-out-of-10" in all these …
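For illustration only, a minimal sketch of the kind of dbt CLI wrapper such a role might touch. This is not Octopus Energy's actual tooling; the script name and behaviour are assumptions. It shells out to dbt and times each invocation so slow runs can be spotted:

```python
# Hypothetical sketch of a thin dbt CLI wrapper: runs any dbt subcommand,
# streams its output, and reports how long the invocation took.
import subprocess
import sys
import time


def run_dbt(args: list[str]) -> int:
    """Run a dbt subcommand, timing it; dbt's own output streams through."""
    start = time.monotonic()
    proc = subprocess.run(["dbt", *args])  # inherits stdout/stderr
    elapsed = time.monotonic() - start
    print(f"dbt {' '.join(args)} finished in {elapsed:.1f}s", file=sys.stderr)
    return proc.returncode


if __name__ == "__main__":
    # e.g. `python wrap_dbt.py run --select my_model`
    sys.exit(run_dbt(sys.argv[1:]))
```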
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
week in the office on average. In this role you will be working within a new, greenfield division of the business, using a brand-new technology stack including Snowflake, dbt, Airflow and AWS. This function provides data for Machine Learning and Artificial Intelligence capabilities, helping them to provide the best possible service offering to their customers. You'll work on … with strong financial backing, and the chance to make a real impact! We're looking for the following experience: extensive hands-on experience with Snowflake; extensive experience with dbt, Airflow, AWS and Terraform; excellent scripting skills in SQL; experience developing solutions entirely from scratch; great communication skills, with the ability to understand and translate complex requirements into technical solutions …
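As a hedged illustration of how a Snowflake, dbt and Airflow stack like this is commonly wired together (not this employer's code; the DAG name, paths and schedule are assumptions), a minimal Airflow DAG that runs and then tests a dbt project:

```python
# Illustrative Airflow 2.x DAG: build dbt models daily, then run dbt tests.
# Assumes a dbt project and Snowflake profile are already configured on the worker.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_build",          # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    dbt_run >> dbt_test  # build the models first, then test them
```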
reconciliation, and integration verification activities. Core skills and experience: Proven experience designing scalable data architectures in cloud and hybrid environments. Expertise in data modelling, SQL, and platforms like Snowflake, dbt, Power BI, and Databricks. Fluency in Python and knowledge of multiple cloud providers (AWS, Azure, GCP). Understanding of security principles including role-based access control. Experience with legacy-to …
top of our list: Google Cloud Storage, Google Data Transfer Service, Google Dataflow (Apache Beam), Google Pub/Sub, Google Cloud Run, BigQuery or any RDBMS, Python, Debezium/Kafka, dbt (Data Build Tool). Interview process: interviewing is a two-way process and we want you to have the time and opportunity to get to know us, as much as we are getting …
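For a flavour of the Dataflow piece of that stack, a minimal Apache Beam streaming sketch (purely illustrative; the subscription, table and schema are hypothetical placeholders) that reads events from Pub/Sub and appends them to BigQuery:

```python
# Illustrative streaming pipeline: Pub/Sub -> parse JSON -> BigQuery.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/events-sub"  # hypothetical
        )
        | "Parse" >> beam.Map(json.loads)  # Pub/Sub delivers bytes; json.loads accepts them
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",          # hypothetical table
            schema="event_id:STRING,ts:TIMESTAMP",  # hypothetical schema
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```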
data platform evolution; has experience (or strong interest) in building real-time or event-driven architectures. Modern Data Stack includes: Python, SQL; Snowflake, Postgres; AWS (S3, ECS, Terraform); Airflow, dbt, Docker; Apache Spark, Iceberg. What they're looking for: solid experience as a Senior/Lead/Principal Data Engineer, ideally with some line management or mentoring; proven ability to …
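As a sketch of how the Spark and Iceberg pieces of such a stack fit together (illustrative only; the catalog name, warehouse path and table are assumptions, and the Iceberg runtime jar must be on the Spark classpath), creating and querying an Iceberg table from PySpark:

```python
# Illustrative PySpark session configured with an Iceberg catalog.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("iceberg-demo")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "s3://my-bucket/warehouse")  # hypothetical
    .getOrCreate()
)

# Create, populate, and read back a small Iceberg table.
spark.sql("CREATE TABLE IF NOT EXISTS lake.db.events (id BIGINT, ts TIMESTAMP) USING iceberg")
spark.sql("INSERT INTO lake.db.events VALUES (1, current_timestamp())")
spark.table("lake.db.events").show()
```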
… Proficiency with Docker, Linux, and bash. Ability to document code, architectures, and experiments. Preferred Qualifications: Experience with databases and data warehousing (Hive, Iceberg). Data transformation skills (SQL, DBT). Experience with orchestration platforms (Airflow, Argo). Knowledge of data catalogs, metadata management, vector databases, relational/object databases. Experience with Kubernetes. Understanding of computational geometry (meshes, boundary representations …
with containerization and CI/CD tools (e.g., Docker, GitHub Actions). Knowledge of networking and cloud infrastructure (e.g., AWS, Azure). Experience with modern data processing frameworks (e.g., dbt, Apache Airflow, Spark, or similar). Requirements: A strong focus on system observability and data quality. Emphasis on rapid scalability of solutions (consider market ramp-up when entering a new …
love to talk to you if: you've led technical delivery of data engineering projects in a consultancy or client-facing environment; you're experienced with Python, SQL, .NET, dbt, Airflow and cloud-native data tools (AWS, GCP or Azure); you have strong knowledge of data architecture patterns, including Lakehouse and modern warehouse design (e.g. Snowflake, BigQuery, Databricks); you know …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Dept Holding B.V.
data pipelines, and ETL processes; hands-on experience with cloud platforms (GCP, AWS, Azure) and their data-specific services; proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, DBT); experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.); strong understanding of data modeling, data governance, and data quality principles; excellent communication skills with the ability to translate complex …
Our products are recognised by industry analysts in Gartner's Magic Quadrant, the Forrester Wave and the Frost Radar. Our tech stack: Superset and similar data visualisation tools. ETL tools: Airflow, DBT, Airbyte, Flink, etc. Data warehousing and storage solutions: ClickHouse, Trino, S3. AWS Cloud, Kubernetes, Helm. Relevant programming languages for data engineering tasks: SQL, Python, Java, etc. What you will be …
detail and care about the features they implement. What we need from you: at least 3 years of relevant data engineering experience; strong Python and SQL skills; experience with dbt; experience with AWS; experience working with a columnar database such as Redshift; strong experience with ETL/ELT and the management of data pipelines; familiarity with Snowplow; experience with Data …
Herndon, Virginia, United States Hybrid / WFH Options
Maxar Technologies Holdings Inc
software development. Experience with geospatial data. Experience building data-streaming processes. Experience using PostGIS. Experience with any of the following: Apache Hive, Trino, Presto, Starburst, OpenMetadata, Apache Superset, Terraform, dbt, Tableau, Fivetran, Airflow. Experience implementing resilient, scalable, and supportable systems in AWS. Experience using a wide variety of open-source technologies and cloud services. Experience developing multi-step ETLs including …
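By way of illustration of the PostGIS experience mentioned (a hypothetical sketch, not Maxar's code; the connection details, table and column names are assumptions), a small radius query using psycopg2:

```python
# Illustrative PostGIS query: find features within 1 km of a point.
import psycopg2

conn = psycopg2.connect("dbname=geo user=analyst")  # hypothetical DSN
with conn, conn.cursor() as cur:
    # Casting to geography makes ST_DWithin measure in metres.
    cur.execute(
        """
        SELECT id, ST_AsGeoJSON(geom)
        FROM buildings
        WHERE ST_DWithin(geom::geography,
                         ST_MakePoint(%s, %s)::geography,
                         1000)
        """,
        (-77.38, 38.96),  # lon, lat near Herndon, VA
    )
    for feature_id, geojson in cur.fetchall():
        print(feature_id, geojson)
```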
Cloud Run, Cloud Monitoring & Logging, Dataplex, Beam, Tentacles and Pub/Sub; fluent Python and SQL skills with real-life project experience; experience with orchestration tools such as Airflow and DBT; experience with one of the major analytical DWHs is a plus: BigQuery, Redshift, Snowflake, Databricks, Synapse; work experience with the following technologies is noteworthy and might be seen as a bonus: AWS …
Desirables: experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure); exposure to orchestration tools such as Kubeflow Pipelines or Airflow; familiarity with DBT or similar tools for modelling data in data warehouses; desire to build interpretable and explainable ML models (using techniques such as SHAP); desire to quantify the level of fairness and …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Starling Bank Limited
Desirables: experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure); exposure to orchestration tools such as Kubeflow Pipelines or Airflow; familiarity with DBT or similar tools for modelling data in data warehouses; desire to build interpretable and explainable ML models (using techniques such as SHAP); desire to quantify the level of fairness and …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
Starling Bank Limited
Desirables: experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure); exposure to orchestration tools such as Kubeflow Pipelines or Airflow; familiarity with DBT or similar tools for modelling data in data warehouses; desire to build interpretable and explainable ML models (using techniques such as SHAP); desire to quantify the level of fairness and …
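A minimal sketch of the SHAP technique these listings mention, against a hypothetical scikit-learn model on a bundled dataset; nothing here comes from the listings themselves:

```python
# Illustrative SHAP usage: explain which features drive a tree model's predictions.
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import GradientBoostingRegressor

X, y = load_diabetes(return_X_y=True, as_frame=True)
model = GradientBoostingRegressor(random_state=0).fit(X, y)

# shap.Explainer picks an efficient explainer (TreeExplainer here) automatically.
explainer = shap.Explainer(model, X)
shap_values = explainer(X.iloc[:100])  # explain the first 100 rows

# Beeswarm plot: per-feature contribution, positive or negative, per prediction.
shap.plots.beeswarm(shap_values)
```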
pipelines; maintain and develop data warehouses; provide guidance and mentorship to junior team members. To be successful in this role you will have: extensive experience with Snowflake; experience with DBT, Airflow or Python; Cloud Data Engineering experience with AWS/Azure/GCP. This is a hybrid role based from the company's London office, with some of the benefits including …
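Purely as an illustration of Python-to-Snowflake work like this (the account, credentials and objects are placeholders, not the employer's configuration), a minimal query with the official connector:

```python
# Illustrative Snowflake connection and query via snowflake-connector-python.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",    # hypothetical
    user="etl_user",         # hypothetical
    password="***",          # use a secrets manager in practice
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT CURRENT_VERSION()")
    print(cur.fetchone()[0])
finally:
    conn.close()
```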
pipelines Maintain and develop data warehouses Provide guidance and mentorship to junior team members To be successful in this role you will have. Extensive experience with Snowflake Experience with DBT, Airflow or Python Cloud Data Engineering experience with AWS/Azure/GCP This is a hybrid role based from the companies London office with some of the benefits including More ❯
There are opportunities for professional development, such as training programs, certifications, and career advancement paths. KEY RESPONSIBILITIES: Design, develop, and maintain scalable data pipelines (SQL, Azure ADF, Azure Functions, DBT); collaborate with analysts and stakeholders to understand their data needs, scoping and implementing solutions; optimise and clean the data warehouse, cleaning the existing codebase and creating documentation; monitor and troubleshoot data …
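A hedged sketch of how an Azure Functions timer trigger might kick off a dbt build in a pipeline like the one described (Python v1 programming model, where the schedule lives in function.json; the paths and project layout are assumptions):

```python
# Illustrative Azure Functions timer trigger that shells out to dbt.
import logging
import subprocess

import azure.functions as func


def main(mytimer: func.TimerRequest) -> None:
    # Fires on the CRON schedule in function.json, e.g. "0 0 2 * * *" (02:00 daily).
    if mytimer.past_due:
        logging.warning("Timer is past due; running anyway.")

    result = subprocess.run(
        ["dbt", "build", "--project-dir", "/home/site/dbt"],  # hypothetical path
        capture_output=True,
        text=True,
    )
    logging.info(result.stdout)
    if result.returncode != 0:
        logging.error(result.stderr)
        raise RuntimeError("dbt build failed")
```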