ingestion, storage, transformation and distribution of tick, timeseries, reference and alternative datasets. The technology stack spans both on-premise and cloud infrastructure, with technologies and tooling such as Python, dbt, KDB+, Snowflake and AWS. This is an exciting time to join the team as they modernise their technology estate, revamp how they process and filter data, and overhaul …
- Orchestration tools (Airflow, Dagster, Prefect)
- Understanding of distributed systems, batch and streaming data (Kafka, Kinesis)
- Knowledge of IaC (Terraform, CloudFormation) and containerisation (Docker, Kubernetes)
Nice to have:
- Experience with dbt, feature stores, or ML pipeline tooling
- Familiarity with Elasticsearch or real-time analytics (Flink, Materialize)
- Exposure to eCommerce, marketplace, or transactional environments …
- with JSON and XML
- Strong understanding of cloud computing concepts and services (AWS preferably)
- Experience with Git or equivalent version control systems and CI/CD pipelines
- Familiarity with dbt a plus
- Highly analytical with strong problem-solving skills: ability to apply solutions forward, not just completing the task at hand
- Ability to investigate, analyze and solve problems, as well …
- Strong communication and problem-solving skills
- Ability to work both independently and collaboratively within client teams
Desirable:
- Consulting experience or client-facing delivery background
- Familiarity with tools such as dbt, Fivetran, Matillion, or similar
- Programming skills in Python, Java, or Scala
- Cloud certifications (SnowPro, Databricks Certified, AWS/Azure/GCP)
- Knowledge of DataOps, CI/CD, and …
- abilities and analytical thinking
Highly Desirable:
- Consulting experience, particularly in client-facing delivery roles
- Data architecture and solution design experience
- Hands-on experience with modern data tools such as dbt, Fivetran, Matillion, or similar data integration platforms
- Programming skills in Python, Java, or Scala
- Relevant cloud certifications (SnowPro, Databricks Certified, AWS/Azure/GCP Data Engineering certifications)
- Experience with …
engineering disciplines:
- Cloud Engineering
- Data Engineering (not building pipelines, but designing and building the framework)
- DevOps
- MLOps/LLMOps
Often work with the following technologies:
- Azure, AWS, GCP
- Airflow, dbt, Databricks, Snowflake, etc.
- GitHub, Azure DevOps and related developer tooling and CI/CD platforms
- Terraform or other Infra-as-Code
- MLflow, AzureML or similar for MLOps; LangSmith, Langfuse and …
- experience with Snowflake or other cloud data warehouses
- Proven experience building dashboards and reports in Power BI (DAX, Power Query)
- Hands-on experience with ETL/ELT tools (dbt, Matillion, Informatica, etc.)
- Familiarity with APIs and data integration from market data providers
- Knowledge of financial data and systems (Bloomberg, SimCorp Dimension) preferred
- Understanding of investment management workflows (portfolio …
- Proven experience in a leadership or technical lead role, with official line management responsibility
- Strong experience with modern data stack technologies, including Python, Snowflake, AWS (S3, EC2), Terraform, Airflow, dbt, Apache Spark, Apache Iceberg, and Postgres
- Skilled in balancing technical excellence with business priorities in a fast-paced environment
- Strong communication and stakeholder management skills, able to translate technical concepts …
- Proven track record building and maintaining scalable data platforms in production, enabling advanced users such as ML and analytics engineers
- Hands-on experience with modern data stack tools: Airflow, dbt, Databricks, and data catalogue/observability solutions like Monte Carlo, Atlan, or DataHub
- Solid understanding of cloud environments (AWS or GCP), including IAM, S3, ECS, RDS, or equivalent services
- Experience …
- data platforms (AWS, Azure, or GCP)
- Proficiency in Python and SQL for data manipulation and transformation
- Experience with ETL/ELT development and orchestration tools (e.g. Apache Airflow, dbt, Prefect)
- Knowledge of data modelling, data warehousing, and lakehouse architectures
- Familiarity with DevOps practices, CI/CD pipelines, and infrastructure-as-code
- Strong problem-solving skills and the ability …
- data architectures (e.g. Databricks, Snowflake)
- Collaborating with multidisciplinary teams to deliver real business value
What we’re looking for:
- Strong experience with Python, SQL, and pipeline tools such as dbt or Airflow
- Proven background in data modelling, warehousing, and performance optimisation
- Hands-on experience with cloud data services (Glue, Lambda, Synapse, BigQuery, etc.)
- A consultancy mindset: adaptable, collaborative, and delivery …
- Data Engineer in a fast-scaling tech or data-driven environment
- Strong proficiency in Python (or Scala/Java) and SQL
- Deep experience with data pipeline orchestration tools (Airflow, dbt, Dagster, Prefect)
- Strong knowledge of cloud data platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery, Redshift)
- Hands-on experience with streaming technologies (Kafka, Kinesis, or similar)
- Solid understanding …
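The streaming requirement above (Kafka, Kinesis) usually comes down to aggregating events over time windows. A minimal sketch in plain Python, with hypothetical click events and no broker client, of the tumbling-window aggregation a Kafka Streams or Flink job would perform:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    """Group (timestamp, key) events into fixed, non-overlapping time
    windows and count occurrences per key in each window -- a toy
    version of a tumbling-window streaming aggregation."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs  # align to window boundary
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Hypothetical click events: (unix_timestamp, page)
events = [(100, "home"), (102, "home"), (130, "cart"), (161, "home")]
print(tumbling_window_counts(events, 60))
# → {60: {'home': 2}, 120: {'cart': 1, 'home': 1}}
```

In a real pipeline the events would arrive from a consumer loop and windows would be flushed incrementally rather than held in memory; the grouping logic is the same.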
London, England, United Kingdom Hybrid / WFH Options
Harnham
- testing, CI/CD, automation)
- Proven track record of designing, building, and scaling data platforms in production environments
- Hands-on experience with big data technologies such as Airflow, dbt, Databricks, and data catalogue/observability tools (e.g. Monte Carlo, Atlan, DataHub)
- Knowledge of cloud infrastructure (AWS or GCP), including services such as S3, RDS, EMR, ECS, IAM
- Experience …
- definition, clean code, CI/CD, path to production
- Worked with AWS as a cloud platform
- Extensive hands-on experience with modern data technologies: ETL tools (e.g. Kafka, Flink, dbt), data storage (e.g. Snowflake, Redshift) and IaC (e.g. Terraform, CloudFormation)
- Software development experience with one or more languages (e.g. Python, Java, Scala, Go)
- Pragmatic approach to solving …
optimise the code to ensure processes perform optimally
Tech Stack:
- Cloud: AWS
- Infrastructure/access management: Terraform
- Data platform: Snowflake
- Data integration: Fivetran
- Data transformation: dbt Core
- Data orchestration: MWAA (Airflow managed by AWS)
- CI/CD pipelines: GitHub Actions
- Programming languages: SQL, Python and Terraform
For more information on how we process your personal …
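The stack above chains ingestion (Fivetran), transformation (dbt) and downstream publishing together under MWAA. At its core, an orchestrator like Airflow just executes tasks in dependency order. A toy sketch of that idea in plain Python (the task names are illustrative, not the actual pipeline):

```python
def run_in_order(tasks, deps):
    """Execute callables respecting a dependency map {task: [upstream tasks]}
    -- a toy version of what an Airflow DAG run does (Kahn-style
    topological execution), with cycle detection."""
    done, order = set(), []
    remaining = set(tasks)
    while remaining:
        ready = [t for t in remaining if all(d in done for d in deps.get(t, []))]
        if not ready:
            raise ValueError("cycle detected in DAG")
        for t in sorted(ready):  # deterministic order for ties
            tasks[t]()
            done.add(t)
            order.append(t)
            remaining.discard(t)
    return order

# Illustrative tasks mirroring the stack: load, transform, publish
log = []
tasks = {
    "fivetran_sync": lambda: log.append("loaded"),
    "dbt_run": lambda: log.append("transformed"),
    "publish": lambda: log.append("published"),
}
deps = {"dbt_run": ["fivetran_sync"], "publish": ["dbt_run"]}
print(run_in_order(tasks, deps))
# → ['fivetran_sync', 'dbt_run', 'publish']
```

Airflow adds scheduling, retries, logging and backfills on top, but the dependency-ordered execution is the essential contract.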
- Experience with version control (Git), testing, and code review practices
- Familiarity with cloud-based data environments (e.g. AWS, GCP, or Azure)
- Exposure to modern data tools such as Airflow, dbt, or Snowflake
- Experience or strong interest in streaming technologies like Apache Kafka
- Interest in MLOps and modern data engineering best practices
Why join: You’ll be part of a company …
- scalable data pipelines with cloud-native tools in AWS or similar
- Extensive experience working with a data warehousing solution, preferably Snowflake
- Excellent data modelling and analytics techniques, preferably dbt (for transforming raw data into meaningful insights within modern data warehouses)
- Strong Python and SQL coding skills; JavaScript is a plus but not required
- Experience implementing development best practices, including …
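The dbt transformations mentioned above are SQL SELECTs that turn raw tables into curated ones (roughly: SELECT order_date, SUM(amount) … GROUP BY 1). The same shape of raw-to-curated step, sketched in plain Python with made-up order data:

```python
from collections import defaultdict

def orders_to_daily_revenue(raw_orders):
    """Aggregate raw order rows into a per-day revenue summary --
    the kind of transformation a dbt model expresses in SQL.
    Rows with a null amount are filtered out, as a model's
    WHERE clause or a dbt test might enforce."""
    revenue = defaultdict(float)
    for row in raw_orders:
        if row["amount"] is None:
            continue  # drop bad records instead of corrupting the sum
        revenue[row["order_date"]] += row["amount"]
    return [{"order_date": d, "revenue": r} for d, r in sorted(revenue.items())]

# Hypothetical raw orders as they might land from an ingestion tool
raw = [
    {"order_date": "2024-01-01", "amount": 10.0},
    {"order_date": "2024-01-01", "amount": 5.5},
    {"order_date": "2024-01-02", "amount": None},  # bad record
    {"order_date": "2024-01-02", "amount": 7.0},
]
print(orders_to_daily_revenue(raw))
# → [{'order_date': '2024-01-01', 'revenue': 15.5}, {'order_date': '2024-01-02', 'revenue': 7.0}]
```

In a warehouse this would run as set-based SQL inside Snowflake rather than row-by-row in Python; the point is only the shape of the raw-to-insight transformation.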
- data engineering type!
- Significant hands-on experience with AWS services focused on data flow, pipelines, data transformation, storage and streaming
- Excellent data engineering skills, for example with SQL, Python, dbt and Airflow
- Good understanding of service-oriented architecture; experience of exposing and consuming data via APIs, streams and webhooks
- Experience designing scalable data models, warehouse/lakehouse architectures, and dimensional …
City of London, London, United Kingdom Hybrid / WFH Options
Travelex
- and a strong commitment to data integrity
- Excellent problem-solving skills and ability to work in a fast-paced environment
Preferred Qualifications:
- Experience with Databricks or Ray
- Familiarity with dbt and Airflow
- Experience with data quality frameworks
- Understanding of ML requirements and experience working with ML teams
- Experience in robotics or a related field
- Familiarity with cloud-based data storage …
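The data quality frameworks mentioned above (dbt tests, Great Expectations and similar) boil down to declarative assertions over a dataset: not-null columns, unique keys, accepted values. A minimal hand-rolled sketch with hypothetical column names:

```python
def check_dataset(rows, required, unique_key):
    """Run two common data-quality checks -- no nulls in required
    columns, uniqueness of a key column -- returning a list of
    failure messages (empty list means the dataset passed)."""
    failures = []
    seen = set()
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) is None:
                failures.append(f"row {i}: null in required column '{col}'")
        key = row.get(unique_key)
        if key in seen:
            failures.append(f"row {i}: duplicate {unique_key}={key!r}")
        seen.add(key)
    return failures

rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},             # fails the not-null check
    {"id": 1, "email": "c@example.com"},  # fails the unique check
]
print(check_dataset(rows, required=["email"], unique_key="id"))
# → ["row 1: null in required column 'email'", "row 2: duplicate id=1"]
```

Frameworks add scheduling, reporting and warehouse-native execution on top, but the contract is the same: named expectations that either pass or surface a failure.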