to teach if you're willing to learn! Required experience: Python, Git. Nice to have: SQL, dbt, GitHub, CircleCI, Airflow, Kubernetes, Terraform, a cloud warehouse provider (e.g. Databricks, GCP, Snowflake, AWS). We aren't necessarily looking for someone who is "10-out-of-10" in all these areas, but rather someone who has good experience in most of them, combined …
and Python (Pandas, NumPy preferred) Knowledge of statistical testing methodologies Experience with BI tools (Tableau, Power BI preferred) Experience with cloud computing services & solutions (AWS, Azure, GCP, Amazon Marketing Cloud, Snowflake) Experience working with very large datasets and distributed data processing technologies (Spark, DuckDB, etc.) We are only able to consider applicants with an existing right to live and work in …
Helm, Terraform, Vault, Grafana, ELK Stack, New Relic Relevant experience in the maintenance of data APIs and data lake architectures, including experience with Apache Iceberg, Trino/Presto, ClickHouse, Snowflake, BigQuery. Master's degree in Computer Science or Engineering-related field Get to know us better YouGov is a global online research company, offering insight into what the world thinks. …
Data Engineer (Snowflake) Position Description If you're looking for a challenge that stretches your talents and want to make a real difference in how modern businesses harness cloud-native data solutions, come and help us grow our Data Engineering capability at CGI. We need a skilled Data Engineer with a focus on Snowflake to help us build scalable, impactful … travel in the London area. All applicants must have the right to live and work in the UK. Your future duties and responsibilities As a Data Engineer specialising in Snowflake, you'll contribute to the design, development, and optimisation of cloud data platforms, often working with a wide array of cloud services and tools. You'll play a hands-on … delivering data solutions that help clients extract insight and business value, while also promoting engineering best practices. Key responsibilities will include: - Designing and implementing scalable data warehouse solutions using Snowflake - Building efficient ELT/ETL pipelines using DBT and other modern tooling - Writing and optimising complex SQL queries for large datasets - Applying software engineering principles to data systems, including version …
data leadership – ideally within fintech, SaaS, or regulated tech environments. Technical depth across data engineering, analytics, or data science. Hands-on familiarity with modern data stacks – SQL, dbt, Airflow, Snowflake, Looker/Power BI. Understanding of the AI/ML lifecycle – including tooling (Python, MLflow) and best-practice MLOps. Comfortable working across finance, risk, and commercial functions. Experience operating in …
City of London, London, United Kingdom Hybrid / WFH Options
Client Server
of data engineering principles and best practices, including data modelling, observable ETL/ELT processes, data warehousing and data governance You have experience with AWS and tools such as Snowflake You're collaborative and pragmatic with excellent communication skills What's in it for you: As a Principal Data Engineer you will earn a competitive package including: Salary to £120k …
North West London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
consulting engagements Deep knowledge of Kafka , Confluent , and event-driven architecture Hands-on experience with Databricks , Unity Catalog , and Lakehouse architectures Strong architectural understanding across AWS, Azure, GCP , and Snowflake Familiarity with Apache Spark, SQL/NoSQL databases, and programming (Python, R, Java) Knowledge of data visualisation, DevOps principles, and ML/AI integration into data architectures Strong grasp of …
Total experience in DWBI, Big Data and Cloud Technologies. Implementation and hands-on experience in at least 2 of the Cloud technologies: Azure, AWS, GCP, Snowflake, Databricks. Must Have: Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth …
management best practices including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching, Good knowledge of Databricks, Snowflake, Azure/AWS/Oracle cloud, R, Python. Company Description Version 1 has celebrated over 28 years in Technology Services and continues to be trusted by global brands to deliver …
leadership position. Proven track record of building and scaling data teams and capabilities in a global context. Deep understanding of data architecture, data warehousing, and modern analytics platforms (e.g., Snowflake, Power BI, Tableau, Databricks). Hands-on experience with Microsoft Data Factory, Azure Data Lake, and Microsoft Fabric. Strong knowledge of data governance, privacy regulations (e.g., GDPR), and data lifecycle …
technologies essential for automating models and advancing our engineering practices. You're familiar with cloud technologies. You have experience working with data in a cloud data warehouse (Redshift, Snowflake, Databricks, or BigQuery) Experience with a modern data modeling technology (DBT) You document and communicate clearly. Some experience with technical content writing would be a plus You are excited …
NumPy) and deep expertise in SQL for building robust data extraction, transformation, and analysis pipelines. Hands-on experience with big data processing frameworks such as Apache Spark, Databricks, or Snowflake, with a focus on scalability and performance optimization. PREFERRED QUALIFICATIONS: Solid understanding of cloud infrastructure, particularly AWS, with practical experience using Docker, Kubernetes, and implementing CI/CD pipelines for …
and data analytics languages (SQL, Python). Familiarity with Salesforce, Dynamics 365, or similar enterprise systems. Excellent communication, collaboration, and stakeholder management skills. Nice-to-Haves Knowledge of Kafka, Snowflake, or Databricks. Experience with AI in data (e.g., real-time forecasting, visualisation). Background in advisory or consulting roles within data strategy. Ability to thrive in ambiguous, fast-paced environments.
for quantitative trading and analytics, with demonstrated expertise in: The Python data engineering stack (Polars, Parquet, FastAPI, Jupyter, Airflow, Streamlit, Ray) High-performance data stores and query engines (Starburst, Snowflake) Real-time streaming analytics technologies (Kafka, Flink) Cloud container technologies (AWS, Azure, GCP, Docker, Kubernetes) Proven success in enhancing developer experience that reduces friction in coding, building and deploying APIs …
classical statistics (hypothesis testing, regression, significance, p-value pitfalls) Ability to translate data into plain-English insights and present to C-level audiences Experience working in cloud data warehouses (Snowflake, BigQuery, Redshift) and version control (Git) What we can offer Bonus Hybrid working; meaning you'll be in our Farringdon office Tuesdays to Thursdays 25 days annual leave, plus the …
concepts to a wide variety of audiences. For candidates applying for the Senior Consultant role, we additionally require: Working experience with at least one Cloud Platform (AWS, Azure, GCP, Snowflake, Databricks etc.) and exposure to Cloud Architecture principles. Demonstrated experience in people management, product owner or workstream management. Experience supporting and participating in the commercial cycle, including defining project scope …
a related role Proficient in SQL, Python or R, and experienced with BI tools such as Tableau. Strong analytical, structured, and critical thinking skills. Familiar with data warehousing concepts (Snowflake, AWS). Excellent attention to detail and proficiency in MS Office Suite. A proactive self-starter with strong time and task management skills. Eagerness to learn and adapt to new …
Essential Skills & Experience: Minimum 5 years of experience in a Data Engineering role or equivalent Strong Python and SQL proficiency Proven experience with a modern data warehousing solution (e.g. Snowflake, Azure Data Factory) Experience with modern Python-based orchestration tools (e.g. Prefect) Familiarity with Git-based development workflows and CI/CD pipelines Package: Salary up to £85,000 (depending …
build and manage data pipelines. Our no-code/low-code ETL platform allows seamless integration of data from any source whether databases, applications, or files into lakehouses like Snowflake, Databricks, and Redshift. With pipelines that just work and features like advanced data transformation using dbt Core and end-to-end pipeline observability, we're focused on making robust data pipelines …
end-to-end AI/ML projects. Nice to Have: Exposure to LLMs (Large Language Models), generative AI , or transformer architectures . Experience with data engineering tools (Spark, Airflow, Snowflake). Prior experience in fintech, healthtech, or similar domains is a plus.
including Lambda, ECS/EC2, S3 and RDS. Deep experience with Terraform and infrastructure-as-code practices. Familiarity with tools like Airflow or DBT, and data platforms such as Snowflake or Databricks. Solid experience with CI/CD, observability, and platform reliability practices in cloud-native environments. Understanding of distributed computing concepts, and experience designing systems for scale, security …
and with external clients. Strong hands-on experience using SQL for multi-step/complex analytics is essential for this role. Experience in cloud platforms (GCP – BigQuery, Azure – Synapse, Snowflake) and exposure to data science tools/languages such as Python, dbt, D3, GitHub, GCP/AWS would be advantageous. (This is not a technical advanced data science role, so …