understand data needs and provide insights through presentations and reports. Deliver data-driven recommendations to support business objectives. Build and optimize data pipelines using dbt, ensuring clean and accessible data. Monitor data quality and implement validation processes in collaboration with data engineers. Create scalable data models in Snowflake using dbt … related field. Proven experience in a Data Analyst/Analytics Engineer role, preferably in the payments industry with issuer processors. Proven experience in SQL, dbt and Snowflake. Proficiency in building and managing data transformations with dbt, with experience in optimizing complex transformations and documentation. Hands-on experience with Snowflake as More ❯
The team you'll be working with: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in the full data modeling lifecycle, including designing, implementing, and More ❯
providing meaningful, human support as well as fast, hassle-free processes to deliver an unbeatable customer experience. The role Building and owning high-quality dbt models, ensuring they are performant, well-documented and thoroughly tested Collaborating with analysts and data scientists to understand their needs and build models that drive … in-class Analytics Engineering What we're looking for Strong SQL skills, with experience writing efficient and well-structured queries Hands-on experience with dbt, building and maintaining modular, scalable data models that follow best practices Strong understanding of dimensional modelling Familiarity with AWS data services (S3, Athena, Glue) Experience More ❯
Proficiency in SQL and Python for data processing and automation Experience working with data modeling tools and practices (e.g., dimensional, star/snowflake schema, dbt) Solid understanding of data governance, metadata, and quality frameworks Strong collaboration and communication skills, with the ability to work cross-functionally in an Agile environment … Exposure to data product management principles (SLAs, contracts, ownership models) Familiarity with orchestration tools and observability platforms (Airflow, dbt, Monte Carlo, etc.) Exposure to real-time/streaming pipelines Understanding of information security best practices Familiarity with BI tools (QuickSight, Power BI, Tableau, Looker, etc.) Interest or experience in building More ❯
and best practices, including data modeling, observable ETL/ELT processes, data warehousing, and data governance. Proficiency in data manipulation languages (e.g., SQL/dbt) and programming languages relevant to data engineering (e.g., Python). Experience with a variety of data processing frameworks and technologies, including cloud-based data services. … Snowflake AWS Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda Data Governance & Quality - Collate & Monte Carlo Infrastructure as Code - Terraform Data Integration & Transformation - Python, dbt, Fivetran, Airflow CI/CD - GitHub Actions/Jenkins Nice to Have Experience Understanding of various data architecture paradigms (e.g., Data Lakehouse, Data Warehouse, Data … More ❯
of Data Analytics, Software Development, and Data Engineering. You'll play a pivotal role in designing complex data pipelines, managing data warehouses using tools like dbt and Airflow, and enabling predictive analytics solutions for our diverse group of clients. As part of a consultancy, you'll have the opportunity to work … Work in a dynamic consultancy environment where creativity, autonomy, and teamwork are valued. Cutting-Edge Tools: Gain hands-on experience with leading tools like dbt, Airflow, Snowflake, and cloud platforms. Key Responsibilities Design and manage data warehouses using SQL, NoSQL, and cloud platforms. Develop ETL/ELT pipelines using Airflow More ❯
Job Title: Contract Analytics Engineer (dbt & Snowflake Specialist) Location: Hybrid (at least 1 day per week in London) Contract … Length: 6 Months Rate: Up to £550 per day (Outside IR35) We are seeking an experienced Analytics Engineer with deep expertise in dbt (Data Build Tool) Core and Snowflake to join a fast-moving data team on a 6-month contract. This is an exciting opportunity to work on … leveraging your skills in data modelling, data warehousing, and Snowflake to drive actionable insights for a dynamic organisation. Key Responsibilities: Design, develop, and optimise dbt models to transform raw data into valuable analytics-ready datasets. Utilise your in-depth knowledge of dbt Core to upskill, train and mentor the team More ❯
company Experience building or maintaining third-party or in-house data quality and cataloguing solutions Experience with documentation of system architecture Pandas, Jupyter, Plotly, dbt, Kafka BI tools such as Tableau, Metabase and Superset The current tech stack: Airflow, Clickhouse, dbt, Python, MongoDB, PostgreSQL, MariaDB, Kafka, K8s, AWS FXC Intelligence More ❯
the latest tools and trends, especially in AWS and open-source data tech. What you bring: Expert-level SQL and strong scripting (Python/dbt). Deep knowledge of ETL/ELT processes, CI/CD, and data performance tuning. Strong communication skills, with a knack for translating complex tech … mentoring experience. Nice to have: Hands-on experience with Snowflake, Airflow, AWS Glue, Spark, and S3. Familiarity with open-source data libraries (e.g., Pandas, dbt). Experience with modern data stacks and AWS cloud services. This is your chance to shape the future of our data ecosystem from the ground More ❯
Drive. What will I be doing? Design, build, and maintain scalable and reliable data pipelines. Manage Zeelo's serverless centralized data architecture (Fivetran, BigQuery, dbt, and other tools) that supports analytical functions across the business. Design, build, and maintain ETL, ELT and other data pipelines to support analytics … degrees are a plus. Min 3+ years data engineering experience in a commercial environment. Proficiency in SQL. Experience building SQL-based transformation flows in dbt or similar tools. Good understanding of cloud platforms such as GCP, AWS or Azure. Experience configuring orchestration of SQL and Python via Airflow or similar More ❯
the Design and Evolution of Data Models: Architecting and overseeing the development of sophisticated, high-performance data models within our cloud data warehouse (BigQuery, dbt), ensuring they are scalable, maintainable, and effectively serve the complex analytical and operational needs of the entire organisation. You will be instrumental in defining and … performant data models in enterprise-grade cloud environments (e.g., BigQuery, Snowflake, Redshift), consistently optimising for scale and cost-efficiency. Demonstrate mastery of SQL and dbt, exhibiting expertise in advanced SQL techniques and extensive experience in developing, optimising, and troubleshooting data transformations within dbt, including more advanced features such as macros … materialisation configurations. Have strong proficiency in ELT processes and data orchestration, evidenced by a thorough understanding of methodologies and significant hands-on experience using dbt, complemented by practical experience with orchestration tools such as Dagster or Airflow. Have a proven track record of successful semantic layer implementation, showcasing experience in More ❯
Modelling & Engineering: Gather requirements from stakeholders and design scalable data solutions Build and maintain robust data models and exposures in the data warehouse using dbt, Snowflake, and Looker Document architectural decisions, modelling challenges, and outcomes in a clear and structured way Wrangle and integrate data from multiple third-party sources … You will have proven experience in designing and maintaining data models for warehousing and business intelligence You will have advanced SQL skills; experience with dbt and/or Looker is strongly preferred You will be proficient with modern data platforms (e.g., Snowflake, dbt, AWS, GCP, Looker, Tableau, Airflow) You will More ❯
london, south east england, United Kingdom Hybrid / WFH Options
twentyAI
CD pipelines for Snowflake data workflows Automate AWS infrastructure with Terraform Deploy containers via Docker & Kubernetes (EKS) Ensure observability, performance & security Integrate tooling like dbt, Airflow, and support platform evolution Requirements 3+ years in DevOps/cloud/SRE roles Expert in AWS (EC2, IAM, Lambda, etc.) Strong in Terraform … in Python, Bash, or similar Strong team collaboration & Agile mindset Nice to Have Background in commodity trading, finance, or energy Experience with data governance, dbt, or Airflow More ❯
maintain ETL/ELT pipelines to transform raw data into actionable insights Build and optimize data models for scalability and performance in tools like dbt Collaborate with analysts and product teams to deliver reliable datasets for reporting and analysis Monitor and improve data quality using validation frameworks. Proactively identify data … alignment across teams Experience Experience in a similar analytics engineering role Strong SQL skills and experience with data warehouses (e.g., Redshift, Snowflake, BigQuery) Proficiency with dbt or similar data transformation tools Experience with BI tools (e.g., Tableau, Looker) for creating reports and dashboards Knowledge of data governance and best practices for More ❯
CD, DevOps, and data reliability engineering (DRE) best practices across Databricks environments. Integrate with cloud-native services and orchestrate workflows using tools such as dbt, Airflow, and Databricks Workflows. Drive performance tuning, cost optimisation, and monitoring across data workloads. Mentor engineering teams and support architectural decisions as a recognised Databricks … Catalog. Excellent communication, leadership, and problem-solving skills. Desirable: Databricks certifications (e.g., Data Engineer Associate/Professional or Solutions Architect). Familiarity with MLflow, dbt, and BI tools such as Power BI or Tableau. Exposure to MLOps practices and deploying ML models within Databricks. Experience working within Agile and DevOps More ❯
About Hakkoda Hakkoda is a modern data consultancy that empowers data driven organizations to realize the full value of the Snowflake Data Cloud. We provide consulting and managed services in data architecture, data engineering, analytics and data science. We are More ❯
Position Overview: We are seeking a talented and experienced Senior Python Engineer to join our growing AI team. As a Senior Python Engineer at Keystone Education Group, you will play a crucial role in building and optimizing our AI driven More ❯
Orama are delighted to be partnering with a $56.7 million series B next gen CDI startup. Responsibilities: Partner with Account Executives to qualify opportunities and identify use cases across key verticals (retail, media, digital products/services). Leverage your More ❯
stoke-on-trent, midlands, United Kingdom Hybrid / WFH Options
bet365
skills and experience with data warehousing solutions such as BigQuery and Snowflake. Proficiency in data modelling and transformation tools such as Data Build Tool (dbt). Familiarity with data visualisation tools such as Power BI, Looker and Tableau. Excellent analytical skills with a strong attention to detail. Ability to communicate More ❯