pipelines. Proficiency in JVM-based languages (Java, Kotlin), ideally combined with Python, and experience in Spring Boot. Solid understanding of data engineering tools and frameworks such as Spark, Flink, Kafka, dbt, Trino, and Airflow. Hands-on experience with cloud environments (AWS, GCP, or Azure), infrastructure-as-code practices, and ideally container orchestration with Kubernetes. Familiarity with SQL and NoSQL databases (Cassandra …
ingestion, storage, transformation and distribution of tick, timeseries, reference and alternative datasets. The technology stack spans both on-premise and cloud infrastructure, with technologies and tooling such as Python, DBT, KDB+, Snowflake and AWS. This is an exciting time for you to join the team as they modernise their technology estate, revamp how they process and filter data, and overhaul …
orchestration tools (Airflow, Dagster, Prefect) Understanding of distributed systems, batch and streaming data (Kafka, Kinesis) Knowledge of IaC (Terraform, CloudFormation) and containerisation (Docker, Kubernetes) Nice to have: Experience with dbt, feature stores, or ML pipeline tooling Familiarity with Elasticsearch or real-time analytics (Flink, Materialize) Exposure to eCommerce, marketplace, or transactional environments …
of the AWS services like Redshift, Lambda, S3, Step Functions, Batch, CloudFormation, Lake Formation, CodeBuild, CI/CD, GitHub, IAM, SQS, SNS, Aurora DB. Good experience with DBT, Apache Iceberg, Docker, Microsoft BI stack (nice to have). Experience in data warehouse design (Kimball and lakehouse, medallion and data vault) is a definite preference, as is knowledge of …
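Since the listing above calls out lakehouse and medallion-style warehouse design, here is a minimal sketch of a bronze-to-silver step in that layout. The bucket paths, column names, and output format are illustrative assumptions, not details from the role.

```python
# Sketch of a bronze-to-silver step in a medallion-style lakehouse layout.
# Paths, schema, and column names are illustrative assumptions only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Bronze: raw landed files, kept as-is for replayability.
bronze = spark.read.json("s3://example-lake/bronze/orders/")

# Silver: de-duplicated, typed records ready for downstream modelling (e.g. in dbt).
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .filter(F.col("order_id").isNotNull())
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
)

silver.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-lake/silver/orders/"
)
```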
field ● Proficiency in writing SQL queries and knowledge of cloud-based databases like Snowflake, Redshift, BigQuery or other big data solutions ● Experience in data modelling and tools such as dbt, ETL processes, and data warehousing ● Experience with at least one programming language such as Python, C++, or Java ● Experience with version control and code review tools such as Git …
with JSON and XML Strong understanding of cloud computing concepts and services (AWS preferably) Experience with Git or equivalent version control systems and CI/CD pipelines. Familiarity with dbt is a plus. Highly analytical with strong problem-solving skills: the ability to apply solutions going forward, not just complete the task at hand. Ability to investigate, analyze and solve problems as well …
City of London, London, United Kingdom Hybrid/Remote Options
Az-Tec Talent
Strong communication and problem-solving skills. Ability to work both independently and collaboratively within client teams. Desirable: Consulting experience or client-facing delivery background. Familiarity with tools such as dbt, Fivetran, Matillion, or similar. Programming skills in Python, Java, or Scala. Cloud certifications (SnowPro, Databricks Certified, AWS/Azure/GCP). Knowledge of DataOps, CI/CD, and …
London, South East, England, United Kingdom Hybrid/Remote Options
Involved Solutions
lakes or lakehouses. Excellent communication skills with the ability to explain complex technical concepts clearly and persuasively. Desirable Skills for the Senior Data Engineer: Experience with event sourcing, dbt, or related data transformation tools. Familiarity with PostgreSQL and cloud-native data services (Azure Event Hub, Redshift, Kinesis, S3, Blob Storage, OneLake, or Microsoft Fabric). Understanding of machine learning model …
While in this position, your duties may include, but are not limited to: Support the design, development, and maintenance of scalable data pipelines using tools such as Apache Airflow, dbt, or Azure Data Factory. Learn how to ingest, transform, and load data from a variety of sources, including APIs, databases, and flat files. Assist in the setup and optimisation of …
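To illustrate the kind of pipeline such duties involve, here is a minimal sketch of an Airflow DAG that ingests from an API, lands the data to a flat file, and hands transformation off to dbt. It is written against Airflow 2.x conventions; the endpoint, file paths, schedule, and dbt project location are hypothetical placeholders, not details from the listing.

```python
# Minimal Airflow DAG sketch: ingest from an API, land to a flat file, run dbt.
# The URL, paths, and schedule are illustrative placeholders.
import json
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def ingest_orders(**_):
    # Pull raw records from a hypothetical REST endpoint and land them as JSON.
    response = requests.get("https://api.example.com/orders", timeout=30)
    response.raise_for_status()
    with open("/tmp/raw_orders.json", "w") as fh:
        json.dump(response.json(), fh)


with DAG(
    dag_id="orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_orders", python_callable=ingest_orders)

    # Hand transformation off to dbt; assumes a dbt project lives at /opt/dbt/orders.
    transform = BashOperator(
        task_id="run_dbt_models",
        bash_command="cd /opt/dbt/orders && dbt run --select staging+",
    )

    ingest >> transform
```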
abilities and analytical thinking. Highly Desirable: Consulting experience, particularly in client-facing delivery roles. Data architecture and solution design experience. Hands-on experience with modern data tools such as dbt, Fivetran, Matillion, or similar data integration platforms. Programming skills in Python, Java, or Scala. Relevant cloud certifications (SnowPro, Databricks Certified, AWS/Azure/GCP Data Engineering certifications). Experience with …
City of London, London, United Kingdom Hybrid/Remote Options
Billigence
engineering disciplines: Cloud Engineering; Data Engineering (not building pipelines but designing and building the framework); DevOps; MLOps/LLMOps. Often work with the following technologies: Azure, AWS, GCP; Airflow, dbt, Databricks, Snowflake, etc.; GitHub, Azure DevOps and related developer tooling and CI/CD platforms; Terraform or other Infra-as-Code; MLflow, AzureML or similar for MLOps; LangSmith, Langfuse and …
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
DBT Data Engineer (Snowflake, Azure) | Insurance | London (Hybrid) Robert Half International (an S&P 500 global staffing provider) is supporting a global consulting firm in sourcing an experienced DBT Data Engineer to join a major insurance client engagement. The role focuses on scaling a Snowflake Data Warehouse and expanding its DBT Cloud modelling capabilities to support new analytics and … s NI & Tax deducted at source, unlike umbrella companies and no umbrella company admin fees) Role Overview You'll be part of a growing data engineering function focused on DBT model development, Snowflake optimisation, and data governance across multiple data domains. This role suits a technically strong engineer with proven DBT Cloud experience who can take ownership of data pipelines and … drive best practices in transformation, testing, and automation. Key Skills & Experience Deep DBT Cloud expertise, including models, macros, tests, documentation, and CI/CD integration. Hands-on experience developing and optimising in Snowflake Cloud Data Warehouse (schemas, warehouses, security, time travel, performance tuning). Familiarity with Snowflake cost monitoring, governance, replication, and environment management. Strong understanding of data modelling (star …
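As a rough illustration of the CI/CD integration mentioned in this listing, the sketch below triggers a dbt Cloud job run over its REST API from a pipeline step and polls for completion. The account ID, job ID, token, and status codes are assumptions to verify against current dbt Cloud API documentation, not details from the role.

```python
# Hypothetical CI step: trigger a dbt Cloud job and wait for it to finish.
# Account ID, job ID, and API token are placeholders, not values from the listing.
import os
import time

import requests

ACCOUNT_ID = 12345  # placeholder dbt Cloud account
JOB_ID = 67890      # placeholder deployment job
BASE = f"https://cloud.getdbt.com/api/v2/accounts/{ACCOUNT_ID}"
HEADERS = {"Authorization": f"Token {os.environ['DBT_CLOUD_TOKEN']}"}


def trigger_and_wait() -> None:
    # Kick off the job; the 'cause' field appears in the dbt Cloud run history.
    run = requests.post(
        f"{BASE}/jobs/{JOB_ID}/run/",
        headers=HEADERS,
        json={"cause": "Triggered from CI"},
        timeout=30,
    ).json()["data"]

    # Poll until the run reaches a terminal status
    # (assumed codes: 10 success, 20 error, 30 cancelled).
    while True:
        status = requests.get(
            f"{BASE}/runs/{run['id']}/", headers=HEADERS, timeout=30
        ).json()["data"]["status"]
        if status in (10, 20, 30):
            break
        time.sleep(30)

    if status != 10:
        raise RuntimeError(f"dbt Cloud run {run['id']} finished with status {status}")


if __name__ == "__main__":
    trigger_and_wait()
```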
City of London, London, United Kingdom Hybrid/Remote Options
Harrington Starr
experience with Snowflake or other cloud data warehouses. Proven experience building dashboards and reports in Power BI (DAX, Power Query). Hands-on experience with ETL/ELT tools (dbt, Matillion, Informatica, etc.). Familiarity with APIs and data integration from market data providers. Knowledge of financial data and systems (Bloomberg, SimCorp Dimension) preferred. Understanding of investment management workflows (portfolio …
Sheffield, South Yorkshire, England, United Kingdom Hybrid/Remote Options
Vivedia Ltd
Experience with cloud platforms (AWS, Azure, GCP) and tools like Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (Kafka, Spark Streaming, Flink) is a plus. Tools & Frameworks: Airflow, dbt, Prefect, CI/CD pipelines, Terraform. Mindset: Curious, data-obsessed, and driven to create meaningful business impact. Soft Skills: Excellent communication and collaboration, translating complex technical ideas into business insight …
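For the streaming technologies this listing mentions, here is a minimal consumer sketch using the kafka-python client. The broker address, topic name, consumer group, and message shape are assumptions for illustration, not details from the role.

```python
# Minimal streaming-consumer sketch using kafka-python; broker, topic, and
# message fields are assumptions, not details from the listing.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                                   # hypothetical topic
    bootstrap_servers="localhost:9092",
    group_id="analytics-loader",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # In a real pipeline this is where the event would be validated and handed
    # to a sink (warehouse staging table, object storage, etc.).
    print(event.get("order_id"), event.get("status"))
```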
Proven experience in a leadership or technical lead role, with official line management responsibility. Strong experience with modern data stack technologies, including Python, Snowflake, AWS (S3, EC2, Terraform), Airflow, dbt, Apache Spark, Apache Iceberg, and Postgres. Skilled in balancing technical excellence with business priorities in a fast-paced environment. Strong communication and stakeholder management skills, able to translate technical concepts …
Proven track record building and maintaining scalable data platforms in production, enabling advanced users such as ML and analytics engineers. Hands-on experience with modern data stack tools - Airflow, DBT, Databricks, and data catalogue/observability solutions like Monte Carlo, Atlan, or DataHub. Solid understanding of cloud environments (AWS or GCP), including IAM, S3, ECS, RDS, or equivalent services. Experience …