Remote (mostly remote, with occasional travel to Hemel Hempstead)
Contract: Outside IR35
Day rate: Up to £550 per day
Duration: 6 months
Start date: ASAP
Key skills: Snowflake, dbt, SQL, Python, AWS and Kimball
The client is in the process of migrating to Snowflake and therefore requires extra support. As a result, they are looking for someone with a strong SQL … Python development background and excellent working knowledge of Data Build Tool (dbt). You will be undertaking aspects of the development lifecycle and should be experienced in data modelling, process design, development, and testing. Whilst this company is going through a large-scale migration, this will present you with an opportunity to be at the cutting edge of data engineering. … have experience in the following:
- Advanced SQL knowledge
- Snowflake (ideally certified)
- Python development
- AWS cloud experience essential, relating to data tooling and development
- Working knowledge of Data Build Tool (dbt):
  o Develop staging, intermediate and mart models in dbt to meet analytics requirements
  o Optimise existing models to make them more reusable by following dbt best practices
  o Spot opportunities More ❯
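As a rough illustration of the staging-to-marts layering this advert describes, here is a minimal sketch of a dbt staging model feeding a mart model. The `raw.orders` source, the column names and the model filenames are hypothetical assumptions for illustration only, not details taken from the role.

```sql
-- models/staging/stg_orders.sql
-- Staging layer: light cleanup and renaming over a hypothetical raw source.
with source as (
    select * from {{ source('raw', 'orders') }}
)
select
    id                              as order_id,
    customer_id,
    cast(amount as numeric(12, 2))  as order_amount,
    cast(created_at as date)        as order_date
from source
```

```sql
-- models/marts/fct_daily_orders.sql
-- Mart layer: aggregates the staging model for analytics consumption.
select
    order_date,
    count(*)          as order_count,
    sum(order_amount) as total_order_amount
from {{ ref('stg_orders') }}
group by order_date
```

In a real project the staging model would typically also carry schema tests and documentation, in line with dbt best practices.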
london (city of london), south east england, united kingdom
Winston Fox
engineering the ingestion, storage, transformation and distribution of tick, timeseries, reference and alternative datasets. The technology stack spans on-premise and cloud infrastructure, with technologies and tooling such as Python, dbt, KDB+, Snowflake and SQL, interfacing with market data vendors such as Bloomberg, Refinitiv, FactSet and Morningstar. This is an exciting time to join the team as they consolidate their technology More ❯
pipelines. Proficiency in JVM-based languages (Java, Kotlin), ideally combined with Python and experience in Spring Boot. Solid understanding of data engineering tools and frameworks such as Spark, Flink, Kafka, dbt, Trino, and Airflow. Hands-on experience with cloud environments (AWS, GCP, or Azure), infrastructure-as-code practices, and ideally container orchestration with Kubernetes. Familiarity with SQL and NoSQL databases (Cassandra More ❯
While in this position your duties may include, but are not limited to: Support the design, development, and maintenance of scalable data pipelines using tools such as Apache Airflow, dbt, or Azure Data Factory. Learn how to ingest, transform, and load data from a variety of sources, including APIs, databases, and flat files. Assist in the setup and optimisation of More ❯
field ● Proficiency in writing SQL queries and knowledge of cloud-based databases such as Snowflake, Redshift, BigQuery or other big data solutions ● Experience in data modelling and tools such as dbt, ETL processes, and data warehousing ● Experience with at least one programming language such as Python, C++ or Java ● Experience with version control and code review tools such as Git More ❯
North London, London, England, United Kingdom Hybrid / WFH Options
Lynx Recruitment Ltd
or GCP experience desirable). Solid understanding of data modelling and performance optimisation. Knowledge of data governance, compliance, and security frameworks. Desirable Skills: Familiarity with orchestration tools like Airflow, dbt, or Prefect. Experience with CI/CD and DevOps practices for data pipelines. Exposure to Power BI, Tableau, or other visualisation tools. Background in financial services or other regulated More ❯
of the AWS services like Redshift, Lambda, S3, Step Functions, Batch, CloudFormation, Lake Formation, CodeBuild, CI/CD, GitHub, IAM, SQS, SNS, Aurora DB - Good experience with dbt, Apache Iceberg, Docker, Microsoft BI stack (nice to have) - Experience in data warehouse design (Kimball and lakehouse, medallion and data vault) is a definite preference as is knowledge of More ❯
full stack data development (from ingestion to visualization). Strong expertise in Snowflake, including data modeling, warehousing, and performance optimization. Hands-on experience with ETL tools (e.g., Apache Airflow, dbt, Fivetran) and integrating data from ERP systems like NetSuite. Proficiency in SQL, Python, and/or other scripting languages for data processing and automation. Familiarity with LLM integrations (e.g., for More ❯
bristol, south west england, united kingdom Hybrid / WFH Options
Lloyds Banking Group
an Agile environment. Technical Proficiency: Deep technical expertise in software and data engineering, programming languages (Python, Java, etc.). Understanding of orchestration (Composer, DAGs), data processing (Kafka, Flink, Dataflow, dbt), and database capabilities (e.g. BigQuery, CloudSQL, Bigtable). Container technologies (Docker, Kubernetes), IaC (Terraform) and experience with cloud platforms such as GCP. CI/CD: Detailed understanding of working automated More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Intellect Group
/ELT processes and database fundamentals. A motivated, proactive mindset with a strong desire to learn and grow. Nice to Have: Exposure to data pipeline tools (e.g. Apache Airflow, dbt). Experience with containerisation tools like Docker. Familiarity with version control and CI/CD practices. Knowledge of BI tools (Looker, Tableau, or similar). If you’re excited to More ❯
Sheffield, South Yorkshire, England, United Kingdom Hybrid / WFH Options
Vivedia Ltd
Experience with cloud platforms (AWS, Azure, GCP) and tools like Snowflake, Databricks, or BigQuery . Familiarity with streaming technologies (Kafka, Spark Streaming, Flink) is a plus. Tools & Frameworks: Airflow, dbt, Prefect, CI/CD pipelines, Terraform. Mindset: Curious, data-obsessed, and driven to create meaningful business impact. Soft Skills: Excellent communication and collaboration — translating complex technical ideas into business insight More ❯
Proven experience in a leadership or technical lead role, with official line management responsibility. Strong experience with modern data stack technologies, including Python, Snowflake, AWS (S3, EC2, Terraform), Airflow, dbt, Apache Spark, Apache Iceberg, and Postgres. Skilled in balancing technical excellence with business priorities in a fast-paced environment. Strong communication and stakeholder management skills, able to translate technical concepts More ❯
Proven track record building and maintaining scalable data platforms in production, enabling advanced users such as ML and analytics engineers. Hands-on experience with modern data stack tools - Airflow, DBT, Databricks, and data catalogue/observability solutions like Monte Carlo, Atlan, or DataHub. Solid understanding of cloud environments (AWS or GCP), including IAM, S3, ECS, RDS, or equivalent services. Experience More ❯
handle sensitive and confidential information. Experience working with non-data stakeholders to translate their needs and generate useful results presented in an understandable way. Familiarity with orchestration tools (Airflow, dbt) and data warehouse modelling. Managing other data engineers. Experience with customer and commercial datasets, especially in retail or FMCG. A love of pets! About Jollyes Pets: Jollyes are an award More ❯
modern data architectures (e.g. Databricks, Snowflake). Collaborating with multidisciplinary teams to deliver real business value. What we're looking for: Strong experience with Python, SQL, and pipeline tools such as dbt or Airflow. Proven background in data modelling, warehousing, and performance optimisation. Hands-on experience with cloud data services (Glue, Lambda, Synapse, BigQuery, etc.). A consultancy mindset: adaptable, collaborative, and delivery More ❯