big data technologies (e.g., Hadoop, Spark, Kafka). Familiarity with AWS and its data services (e.g., S3, Athena, AWS Glue). Familiarity with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake). Knowledge of containerization and orchestration tools (e.g., Docker, ECS, Kubernetes). Familiarity with data orchestration tools (e.g., Prefect, Apache Airflow). Familiarity with CI/CD pipelines and DevOps …
be great if you have: Experience of relevant cloud services within AWS, Azure or GCP. Experience working in an Agile environment. Experience working with common vendor products such as Snowflake or Databricks. Experience working with CI/CD tooling. What you'll get in return is: 25 days' annual leave, rising to 30 days with each year of service. …
Job Title: Snowflake Centre of Excellence Lead
Location: Central London (Hybrid - 2 to 3 days on site per week)
Employment Type: Permanent
Salary: up to £120,000 per annum + benefits
About the Role: We are working with a prestigious client based in London who are seeking a Snowflake Lead to play a pivotal role in establishing and scaling their … Snowflake capability. This is a unique opportunity for a seasoned Snowflake professional to build a Centre of Excellence from the ground up within a fast-paced, high-impact environment. As the Snowflake CoE Lead, you will be instrumental in shaping the organisation's Snowflake strategy, architecture, and delivery model. You'll bring your deep technical expertise, leadership experience, and direct engagement with Snowflake to build a best-in-class data platform offering. Key Responsibilities: Lead the design, setup, and growth of a Snowflake practice, including establishing a Centre of Excellence. Architect, implement, and maintain scalable data solutions using Snowflake. Collaborate closely with stakeholders across the organisation and with Snowflake directly to influence strategic direction. Mentor and lead a team of …
DI, SAS Viya). An ability to write complex SQL queries. Project experience using one or more of the following technologies: Tableau, Python, Power BI, Cloud (Azure, AWS, GCP, Snowflake, Databricks). Project lifecycle experience, having played a leading role in the delivery of end-to-end projects, as well as a familiarity with different development …
HTTP/S REST APIs. Experience with time-series databases (e.g., InfluxDB, kdb+, TimescaleDB) and real-time data processing. Familiarity with distributed computing and data warehousing technologies (e.g., Spark, Snowflake, Delta Lake). Strong understanding of data governance, master data management, and data quality frameworks. Solid grasp of web technologies and APIs (REST, JSON, XML, authentication protocols). Experience with …
Familiarity with database replication & CDC technologies such as Debezium. Familiarity with message- and event-driven architecture, including tools like Amazon MQ and Kafka. Exposure to cloud database services (e.g., AWS RDS, Snowflake). Benefits: 25 days of holiday; bonus; pension contribution; private medical, dental, and vision coverage; life assurance; critical illness cover; wellness contribution program with access to ClassPass …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
problem-solving skills and attention to detail. Excellent communication skills and a collaborative mindset. Nice to Have: Experience with Python for data transformation, scripting, or automation tasks. Familiarity with Snowflake or other modern cloud data warehouses. Exposure to investment management, financial services, or regulated environments. Understanding of data modelling, governance, and best practices in data architecture. Company: Market-leading financial …
like Tableau. Big data – Hadoop ecosystem, distributions like Cloudera/Hortonworks, Pig and Hive. Data processing frameworks – Spark & Spark Streaming. Hands-on experience with multiple databases like PostgreSQL, Snowflake, Oracle, MS SQL Server, NoSQL (HBase/Cassandra, MongoDB). Experience in the cloud data ecosystem – AWS, Azure or GCP – in the data engineering space, with at least a few complex & high …
with cloud platforms (GCP, AWS, Azure) and their data-specific services. Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, dbt). Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.). Strong understanding of data modeling, data governance, and data quality principles. Excellent communication skills with the ability to translate complex technical concepts for business stakeholders. Strategic thinking with …
City of London, London, Coleman Street, United Kingdom
Deerfoot Recruitment Solutions Limited
orchestration tools and Agile/DevOps practices. Data Analytics Lead | Data Engineering Lead | Risk Reporting Lead | Risk Data Engineering | SQL Expert | Data Warehouse | Financial Risk Analytics | Risk Data Management | Snowflake | SQL Server | SSIS | Power BI | Regulatory Compliance | Market Risk | Credit Risk | Data Team Manager | Data Platform Lead | Data Transformation | Financial Institution | International Data Team | Data Platform Architecture Deerfoot Recruitment Solutions …
Employment Type: Permanent
Salary: £135,000 per annum + bonus + good benefits package
on building scalable data solutions. Experience with data pipeline orchestration tools such as Dagster or similar. Familiarity with cloud platforms (e.g., AWS) and their data services (e.g., S3, Redshift, Snowflake). Understanding of data warehousing concepts and experience with modern warehousing solutions. Experience with GitHub Actions (or similar) and implementing CI/CD pipelines for data workflows and version-controlled …
formats (CSV, JSON, XML, Parquet). Experience with time-series databases (e.g., InfluxDB, kdb+, TimescaleDB) and real-time data processing. Familiarity with distributed computing and data warehousing technologies (e.g., Spark, Snowflake, Delta Lake). Strong understanding of data governance, master data management, and data quality frameworks. Excellent communication and stakeholder management skills. Ability to mentor junior engineers and foster a collaborative …
making across the data and ML engineering domain on technical approaches to balance delivering near-term commercial impact and building long-term foundations. Our Tech Stack: Cloud Data Warehouse - Snowflake; AWS Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda; Data Governance & Quality - Collate & Monte Carlo; Infrastructure as Code - Terraform; Data Integration & Transformation - Python, dbt, Fivetran, Airflow; CI/CD - GitHub …
of-breed vendor tools. We deploy team members from on-shore, near-shore and off-shore teams, and often work alongside our major alliance partners, such as Microsoft, IBM, Snowflake, Moody's, ServiceNow and Pega, to deploy solutions. Increasingly we collaborate with FinTech firms too. We are passionate about keeping pace with the latest emerging technology. We have recently …
use cases. Implement CI/CD workflows. Ensure GDPR compliance and secure data handling. Requirements: 5+ years in data engineering, or strong experience with Azure, Databricks, Microsoft Fabric, and Snowflake. Proficiency in SQL, Python, and tools like dbt and Airflow. Familiarity with DevOps practices in a data context. Benefits: Work on impactful, enterprise-wide data projects. Collaborate with architects, analysts …
skills in SQL and Python, with hands-on experience in relational databases across cloud and on-prem environments. Familiarity with modern data technologies such as Apache Spark, Kafka, or Snowflake. A comprehensive understanding of the data engineering lifecycle, including Agile delivery, DevOps, Git, APIs, containers, microservices, and pipeline orchestration. Nice to have: DP-203 Azure Data Engineering/…
client requirements into solution blueprints and supporting proposal development. Key Responsibilities: Architect and oversee the delivery of enterprise-scale data platforms (data lakes, lakehouses, warehouses) using tools like Databricks, Snowflake, Synapse, and Azure Fabric. Define and execute cloud migration strategies, leveraging CI/CD pipelines, Terraform, Azure DevOps, and GitHub. Support RFP/RFI responses, conduct client workshops, and shape …
documentation, data warehousing, and data modeling. Experience with Python for interaction with web services (e.g., REST and Postman). Experience with using and developing data APIs. Experience using AWS, Snowflake, or other comparable large-scale analytics platforms. Experience monitoring and managing databases (we use Elasticsearch/MongoDB/PostgreSQL). Experience with SQL. Experience with data versioning tools. Experience developing …
and query optimization. 2+ years using Python or similar scripting languages for data science. Experience with data processes and building ETL pipelines. Experience with cloud data warehouses such as Snowflake, Azure Data Warehouse, Amazon Redshift, or Google BigQuery. Proficiency in creating visualizations using Power BI or Tableau. Experience designing ETL/ELT solutions with tools like SSIS, Alteryx, AWS Glue …
• Prior experience delivering and owning a reporting cycle (monthly or quarterly), with evidence of improving processes while maintaining high quality • Strong SQL skills and experience with relational databases (e.g. Postgres, Snowflake, MS SQL) • Familiarity with modern data stack tools including dbt, Prefect and other cloud platforms • Comfortable working with large, complex datasets in a fast-paced, regulated environment • Excellent communication and …
City of London, London, United Kingdom Hybrid / WFH Options
McCabe & Barton
/remote working and a salary between £50,000 and £70,000. As part of the data engineering team, you'll design and deliver scalable data products using technologies like Snowflake, Power BI, Python, and SQL. Your work will enable self-service analytics and support data governance across the business. Key Responsibilities: Develop robust ETL/ELT pipelines and dimensional models …