Brighton, Sussex, United Kingdom Hybrid / WFH Options
MJR Analytics
goals.

What We're Looking For

Must Haves
Real-world experience with Google Cloud and modern data stack tools such as dbt, Looker, Fivetran, Google BigQuery and/or Snowflake
Great analytical, problem-solving and technical skills, a head for numbers and an attention to detail
An excellent grasp of SQL, data modelling, cloud data warehousing and database design
Awareness …
to teach if you're willing to learn!

Required experience: Python, Git
Nice to have: SQL, dbt, GitHub, CircleCI, Airflow, Kubernetes, Terraform, a cloud warehouse provider (e.g. Databricks, GCP, Snowflake, AWS)

We aren't necessarily looking for someone who is "10-out-of-10" in all these areas, but rather someone who has good experience in most of them, combined …
and Python (Pandas, NumPy preferred)
Knowledge of statistical testing methodologies
Experience with BI tools (Tableau, Power BI preferred)
Experience with cloud computing services & solutions (AWS, Azure, GCP, Amazon Marketing Cloud, Snowflake)
Experience working with very large datasets and distributed data processing technologies (Spark, DuckDB, etc.)

We are only able to consider applicants with an existing right to live and work in …
Helm, Terraform, Vault, Grafana, ELK Stack, New Relic
Relevant experience in the maintenance of data APIs and data lake architectures, including experience with Apache Iceberg, Trino/Presto, ClickHouse, Snowflake, BigQuery.
Master's degree in Computer Science or Engineering-related field

Get to know us better
YouGov is a global online research company, offering insight into what the world thinks. …
Data Engineer (Snowflake)

Position Description
If you're looking for a challenge that stretches your talents and want to make a real difference in how modern businesses harness cloud-native data solutions, come and help us grow our Data Engineering capability at CGI. We need a skilled Data Engineer with a focus on Snowflake to help us build scalable, impactful … travel in the London area. All applicants must have the right to live and work in the UK.

Your future duties and responsibilities
As a Data Engineer specialising in Snowflake, you'll contribute to the design, development, and optimisation of cloud data platforms, often working with a wide array of cloud services and tools. You'll play a hands-on … delivering data solutions that help clients extract insight and business value, while also promoting engineering best practices. Key responsibilities will include:
- Designing and implementing scalable data warehouse solutions using Snowflake
- Building efficient ELT/ETL pipelines using dbt and other modern tooling
- Writing and optimising complex SQL queries for large datasets
- Applying software engineering principles to data systems, including version …
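The posting itself contains no code, but the "complex SQL for large datasets" responsibility above typically looks like the following illustrative sketch: deduplicating raw event rows with a window function, a common ELT transformation step. SQLite stands in for Snowflake here purely so the example is self-contained; the table and column names (`raw_events`, `user_id`, `event_ts`, `payload`) are hypothetical, not from the posting.

```python
import sqlite3

# Hypothetical raw landing table (names invented for illustration).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE raw_events (user_id INT, event_ts TEXT, payload TEXT);
INSERT INTO raw_events VALUES
  (1, '2024-01-01', 'a'),
  (1, '2024-01-02', 'b'),  -- later row for user 1: this one should win
  (2, '2024-01-01', 'c');
""")

# Keep only the latest event per user: rank rows within each user_id
# partition by descending timestamp, then filter to rank 1.
rows = conn.execute("""
SELECT user_id, event_ts, payload FROM (
  SELECT user_id, event_ts, payload,
         ROW_NUMBER() OVER (PARTITION BY user_id
                            ORDER BY event_ts DESC) AS rn
  FROM raw_events
)
WHERE rn = 1
ORDER BY user_id
""").fetchall()

print(rows)  # → [(1, '2024-01-02', 'b'), (2, '2024-01-01', 'c')]
```

The same `ROW_NUMBER() OVER (PARTITION BY … ORDER BY …)` pattern runs unchanged on Snowflake, where it would usually live inside a dbt model rather than a script.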
Engage stakeholders and oversee technical delivery across Agile projects

Skills & Experience:
Proven experience as a Lead Data Solution Architect in consulting environments
Expertise in cloud platforms (AWS, Azure, GCP, Snowflake)
Strong knowledge of big data technologies (Spark, Hadoop), ETL/ELT, and data modelling
Familiarity with Python, R, Java, SQL, NoSQL, and data visualisation tools
Understanding of machine learning and …
data leadership – ideally within fintech, SaaS, or regulated tech environments.
Technical depth across data engineering, analytics, or data science.
Hands-on familiarity with modern data stacks – SQL, dbt, Airflow, Snowflake, Looker/Power BI.
Understanding of the AI/ML lifecycle – including tooling (Python, MLflow) and best-practice MLOps.
Comfortable working across finance, risk, and commercial functions.
Experience operating in …
Visualise insights via tools like Power BI or Tableau
Support integration testing and third-party specifications

What you’ll bring:
Strong background in data architecture and cloud platforms (e.g. Snowflake, Databricks)
Cloud experience with AWS, Azure or GCP
Excellent SQL and data modelling skills (data lakes, warehousing, OLTP/OLAP)
Experience with access control, data migration and data governance
Proficient …
Warwick, Warwickshire, United Kingdom Hybrid / WFH Options
Pontoon
Expert knowledge of leading ETL/ELT tools such as ODI, Informatica, Matillion, or SSIS.
Proficient in SQL programming and relational database concepts.
Experience with cloud-based platforms like Snowflake and complex data integration solutions.

Nice to Have:
Domain knowledge in finance data is preferred.
Experience with SAP Systems and Databases.
Familiarity with data visualisation tools such as Power BI or …
Manchester, North West, United Kingdom Hybrid / WFH Options
Anson Mccade
Visualise insights via tools like Power BI or Tableau
Support integration testing and third-party specifications

What you'll bring:
Strong background in data architecture and cloud platforms (e.g. Snowflake, Databricks)
Cloud experience with AWS, Azure or GCP
Excellent SQL and data modelling skills (data lakes, warehousing, OLTP/OLAP)
Experience with access control, data migration and data governance
Proficient …
City of London, London, United Kingdom Hybrid / WFH Options
Client Server
of data engineering principles and best practices, including data modelling, observable ETL/ELT processes, data warehousing and data governance
You have experience with AWS and tools such as Snowflake
You're collaborative and pragmatic with excellent communication skills

What's in it for you: As a Principal Data Engineer you will earn a competitive package including: Salary to £120k …
Milton Keynes, England, United Kingdom Hybrid / WFH Options
Focused Futures Consultancy LTD
technical depth and client-facing credibility.

Key Responsibilities:
Lead the design and delivery of enterprise-scale data platforms using cloud technologies (Azure, AWS, GCP) and tools like Databricks, Snowflake, Synapse.
Shape cloud migration and modernization strategies with a strong focus on DevOps practices.
Architect scalable data models and robust ETL/ELT pipelines using industry-standard frameworks.
Implement data …
North West London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
consulting engagements
Deep knowledge of Kafka, Confluent, and event-driven architecture
Hands-on experience with Databricks, Unity Catalog, and Lakehouse architectures
Strong architectural understanding across AWS, Azure, GCP, and Snowflake
Familiarity with Apache Spark, SQL/NoSQL databases, and programming (Python, R, Java)
Knowledge of data visualisation, DevOps principles, and ML/AI integration into data architectures
Strong grasp of …
Southampton, Hampshire, United Kingdom Hybrid / WFH Options
gen2fund.com
manipulate data.
Experience with object-oriented language development is beneficial, including integrations with SOAP- and REST-based APIs.
Familiarity with reporting and analytics tools: Qlik, Microsoft Excel, Power BI, Synapse, Snowflake or similar.
Self-motivated and proactive, with a strong sense of ownership, initiative, and problem-solving abilities.
Excellent communication skills, with the ability to translate complex technical concepts to non-technical …
building internal and external risk metric reports using SQL and data visualization tools like Tableau
Web development skills for risk management UI applications
Development experience with databases such as Snowflake, Sybase IQ, and distributed systems like HDFS
Interaction with business users to resolve application issues
Design and support of batch processes using scheduling tools for data calculation and distribution
Leadership …
Total experience in DWBI, Big Data and Cloud Technologies. Implementation and hands-on experience in at least two of the following cloud technologies: Azure, AWS, GCP, Snowflake, Databricks.
Must have: hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth …
management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching.
Good knowledge of Databricks, Snowflake, Azure/AWS/Oracle cloud, R, Python.

Company Description
Version 1 has celebrated over 28 years in Technology Services and continues to be trusted by global brands to deliver …
leadership position.
Proven track record of building and scaling data teams and capabilities in a global context.
Deep understanding of data architecture, data warehousing, and modern analytics platforms (e.g., Snowflake, Power BI, Tableau, Databricks).
Hands-on experience with Microsoft Data Factory, Azure Data Lake, and Microsoft Fabric.
Strong knowledge of data governance, privacy regulations (e.g., GDPR), and data lifecycle …
visualization tools, such as Tableau.
Utilize web development technologies to build front-end UIs used for risk management actions.
Develop software for calculations using databases like Snowflake, Sybase IQ and distributed HDFS systems.
Interact with business users to resolve issues with applications.
Design and support batch processes using scheduling infrastructure for calculating and distributing data to other …
technologies essential for automating models and advancing our engineering practices.
You're familiar with cloud technologies. You have experience working with data in a cloud data warehouse (Redshift, Snowflake, Databricks, or BigQuery)
Experience with a modern data modeling technology (dbt)
You document and communicate clearly. Some experience with technical content writing would be a plus
You are excited …
NumPy) and deep expertise in SQL for building robust data extraction, transformation, and analysis pipelines.
Hands-on experience with big data processing frameworks such as Apache Spark, Databricks, or Snowflake, with a focus on scalability and performance optimization.

PREFERRED QUALIFICATIONS:
Solid understanding of cloud infrastructure, particularly AWS, with practical experience using Docker, Kubernetes, and implementing CI/CD pipelines for …
and data analytics languages (SQL, Python).
Familiarity with Salesforce, Dynamics 365, or similar enterprise systems.
Excellent communication, collaboration, and stakeholder management skills.

Nice-to-Haves
Knowledge of Kafka, Snowflake, or Databricks.
Experience with AI in data (e.g., real-time forecasting, visualisation).
Background in advisory or consulting roles within data strategy.
Ability to thrive in ambiguous, fast-paced environments. …