Have These: Exposure to, and ideally experience with, modern data architectures (e.g. data lake, lakehouse, data mesh) and accompanying technologies (e.g. Azure Synapse, Snowflake, Amazon Redshift) Awareness of data governance and the surrounding legislation Interest in security and how to handle PII data Excellent written and verbal communication skills
of expertise. As a natural influencer with strong communication skills, you love sharing your knowledge with others and helping them grow. Our Technology Stack Snowflake Salesforce CDP AWS AWS Lake Formation AWS Kinesis AWS EventBridge Glue/Glue DataBrew AppFlow NoSQL databases e.g. DynamoDB SQL databases e.g. MySQL
Manchester, North West, United Kingdom Hybrid / WFH Options
N Brown Group
principles and practices and tools like Jira and Confluence. What technical skills you will have Experience with general Cloud products (Cloud SQL, BigQuery, Redshift, Snowflake, Apache Beam, Spark) or similar products. Experience with open-source data-stack tools such as Airflow, Airbyte, dbt, Kafka etc. Awareness of data visualisation tools
processes, and technologies. Proficiency in SQL, ideally with Azure SQL, and experience working with relational databases. Programming experience in Python or Scala. Experience with Snowflake and its tooling (Snowpark, Snowpipe, etc.). Familiarity with Fivetran, dbt, TensorFlow, PyTorch, and other modern data stack components. Knowledge of data integration and ETL
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
industry standard processing patterns SKILLS AND QUALIFICATIONS * Experience within a commercial environment creating ETL pipelines in Python * Exposure to analytical data warehouses such as Snowflake, Redshift or BigQuery (Redshift preferred) * Experience with data orchestrators such as Airflow, AWS Step Functions, AWS Batch * Knowledge of Agile development methodologies * Knowledge of automated
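Several of these listings ask for experience building ETL pipelines in Python. As a minimal sketch of what that means in practice, the example below runs an extract-transform-load pass entirely in memory; the CSV feed, column names, and list-based "warehouse" target are illustrative assumptions standing in for a real source and a warehouse table, not details from any of the roles above.

```python
import csv
import io

# Hypothetical feed: in a real pipeline this would come from an API, file
# drop, or database extract rather than an inline string.
RAW_CSV = """order_id,amount,currency
1001,25.50,GBP
1002,13.00,gbp
1003,,GBP
"""

def extract(raw: str) -> list[dict]:
    """Parse the raw feed into row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Normalise currency codes, cast types, and drop incomplete rows."""
    out = []
    for row in rows:
        if not row["amount"]:
            continue  # skip (or quarantine) records with missing amounts
        out.append({
            "order_id": int(row["order_id"]),
            "amount": float(row["amount"]),
            "currency": row["currency"].upper(),
        })
    return out

def load(rows: list[dict], target: list) -> int:
    """Append cleaned rows to the target 'table'; return rows written."""
    target.extend(rows)
    return len(rows)

warehouse: list[dict] = []
loaded = load(transform(extract(RAW_CSV)), warehouse)
```

In a production setting each stage would typically be a task in an orchestrator such as Airflow or Step Functions (also named in the ad), with the load step replaced by a warehouse-specific bulk operation like a Redshift COPY.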
tools, including Azure Data Factory and dbt, developing a wide variety of integration solutions incorporating APIs, files, databases, etc. Experience with database platforms, including Snowflake, Azure SQL Database and MS SQL Server, along with excellent SQL skills. Experience with performance tuning, data migration strategies both on-prem and cloud, and
with data orchestration tools: e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies: e.g. dbt, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL). Knowledge of event-driven architectures and streaming technologies: e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments
Greater London, England, United Kingdom Hybrid / WFH Options
InterEx Group
definition of Big Data architecture with different tools and environments: Cloud (AWS, Azure and GCP), Cloudera, NoSQL databases (Cassandra, MongoDB), ELK, Kafka, Snowflake, etc. Past experience in Data Engineering and data quality tools (Informatica, Talend, etc.) Previous involvement in working in a multilingual and multicultural environment Proactive, tech
Manchester Area, United Kingdom Hybrid / WFH Options
Airtime Rewards
equivalent experience. Experience designing cloud data warehouse solutions, data modelling, and building ETL/ELT processes, preferably on GCP or equivalent platforms (AWS, Azure, Snowflake). Proficiency with SQL, Python, Docker, and Terraform (or similar IaC tools). Strong knowledge of security best practices, data privacy, and GDPR compliance. Proven
we’re looking for: Experience of creating advanced visualisations in Tableau. Experience in SQL and Python. Experience of AWS, Airflow, S3 and working with Snowflake in a large complex organisation is advantageous. Experience of establishing processes to identify and manage issues, in data or technology, and then take ownership through
Title: Snowflake Data Engineer - 6-Month Contract (Inside IR35) Remote Pay rate: (Apply online only) per day dependent on skill set Position Overview: As a Snowflake Data Engineer, you will play a pivotal role in designing, developing, and maintaining our Snowflake data platform. Your expertise will be critical in ensuring … technical solutions within Snowflake. Design and implement scalable data pipelines for ingestion, transformation, and loading of data into Snowflake. Optimize performance and efficiency of the Snowflake data warehouse through tuning, clustering, and partitioning strategies. Develop and maintain documentation, including data models, ETL processes, and system configurations. Troubleshoot issues related to data … quality, pipeline failures, and performance bottlenecks. Stay updated on industry best practices and emerging trends in Snowflake and data engineering. Requirements: Proven experience as a Data Engineer with a focus on the Snowflake data platform. Strong proficiency in SQL and experience with Snowflake SQL extensions. Hands-on experience with Snowflake architecture
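The contract above centres on building ingestion pipelines into Snowflake, where incremental loads are usually made idempotent with a `MERGE INTO target USING staging ...` statement keyed on a business identifier. The sketch below mimics that upsert semantics in plain Python so the logic is visible without a warehouse connection; the table shape and `id` key are illustrative assumptions, not details from the role.

```python
# Hypothetical sketch of MERGE/upsert semantics for incremental loads.
# A dict keyed on `id` stands in for the target table; in Snowflake the
# same logic is one MERGE statement with WHEN MATCHED / WHEN NOT MATCHED.

def merge(target: dict[int, dict], batch: list[dict]) -> tuple[int, int]:
    """Upsert batch rows into target; return (inserted, updated) counts."""
    inserted = updated = 0
    for row in batch:
        key = row["id"]
        if key in target:
            target[key].update(row)   # WHEN MATCHED THEN UPDATE
            updated += 1
        else:
            target[key] = dict(row)   # WHEN NOT MATCHED THEN INSERT
            inserted += 1
    return inserted, updated

table: dict[int, dict] = {1: {"id": 1, "status": "new"}}
ins, upd = merge(table, [{"id": 1, "status": "shipped"},
                         {"id": 2, "status": "new"}])
```

Because re-running the same batch updates rather than duplicates rows, a pipeline built on this pattern can safely retry failed loads, which is one of the troubleshooting concerns the listing mentions.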
Python, Spark, SQL Experience of developing enterprise-grade ETL/ELT data pipelines. NoSQL Databases. DynamoDB/Neo4j/Elastic, Google Cloud Datastore. Snowflake Data Warehouse/Platform Experience of working with CI/CD technologies, Git, Jenkins, Spinnaker, GCP Cloud Build, Ansible etc Experience building and deploying solutions
London, England, United Kingdom Hybrid / WFH Options
Austin Fraser
with the following skills and qualifications: Data Expertise: Proficiency in cloud-based data pipelines and SQL. Cloud Savvy: Experience with cloud-based platforms like Snowflake and BigQuery. Development Prowess: Thriving in a CI/CD, TDD, and Agile development environment. Data Visualisation: Navigating data visualisation tools like Tableau and Looker
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
First Derivative
designs, database development standards, implementation and management of data warehouses and data analytics systems. What experience will you need? Apache Spark Azure Databricks ETL Snowflake BigQuery What's in it for you? You will embark upon a career with life-long learning at its core, facilitating rapid professional and
Good working knowledge of programming techniques and languages like Python, Scala, JavaScript etc. Good working knowledge of Cloud data warehousing platforms such as Snowflake, Redshift or BigQuery. Good understanding of the HE Regulatory Framework Benefits: Private medical (health) insurance with BUPA Annual leave (5.6 weeks), including bank holidays. Workplace
languages or toolsets: AutoSys, Azure Function App, Azure GIT, Azure Portal, C#, Databricks, GraphQL/Graph API, Informatica CDI, Informatica Power Center, Java, JavaScript, Snowflake, Power BI, PyRecs, Python, Selenium, Spark, and SQL Nice to have skills Ability to propose and estimate the financial impact of architectural alternatives Existing knowledge of
implement automated data quality checks and monitoring processes. KEY SKILLS Proficiency in SQL and data querying for validation and testing purposes. Hands-on experience with Snowflake, Airflow, or dbt. Familiarity with data integration, ETL processes, and data governance frameworks. Solid understanding of data structures, relational databases and data modelling concepts.
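This role asks for automated data quality checks. Frameworks like dbt tests or Great Expectations formalise the idea, but the core is simple rule-based assertions over rows; the hand-rolled sketch below illustrates that, with the `age` column and its valid range being my own examples rather than anything from the listing.

```python
# Minimal rule-based data quality checks. Each check returns the indexes
# of failing rows so a monitoring process can report or quarantine them.

def check_not_null(rows: list[dict], column: str) -> list[int]:
    """Indexes of rows where `column` is missing or None."""
    return [i for i, r in enumerate(rows) if r.get(column) is None]

def check_in_range(rows: list[dict], column: str, lo, hi) -> list[int]:
    """Indexes of rows where a present `column` value falls outside [lo, hi]."""
    return [i for i, r in enumerate(rows)
            if r.get(column) is not None and not (lo <= r[column] <= hi)]

rows = [
    {"id": 1, "age": 34},
    {"id": 2, "age": None},   # fails the not-null check
    {"id": 3, "age": 212},    # fails the range check
]
null_failures = check_not_null(rows, "age")
range_failures = check_in_range(rows, "age", 0, 120)
```

Wiring checks like these into an orchestrator run (e.g. as a post-load Airflow task) is what turns one-off validation queries into the monitoring process the ad describes.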
Trade finance, etc.) Demonstrable experience in the architecture and delivery of real-time analytics platforms Python/Java experience Cloud computing platforms (Azure preferable) Snowflake, Power BI, Kafka, SQL will be utilised Relocation to the UAE is a must This is a great role for an experienced data engineer
With Data at the core, they are looking for someone to really help drive forward the data strategy for the business. Having recently implemented Snowflake, this role will be well suited to those comfortable with creating data pipelines, using various ETL tools, testing data and writing queries. Please take a look
Coventry, West Midlands, West Midlands (County), United Kingdom
Investigo
lifecycles. Transform technical data into actionable insights for stakeholders. The key skills we are looking for are: Expertise with Python and PySpark Strong Snowflake experience Experience with AWS Experience with data visualisation Experience developing datasets and data pipelines Strong SQL skills Familiarity with the Power BI platform If this sounds like
best practices. A team player with excellent communication & collaboration skills. Desirable: Experience developing and maintaining data warehouses/lakes using big data solutions (e.g., Snowflake, Databricks). Familiarity with SQL Server, PostgreSQL or NoSQL database technologies. Previous experience within banking or finance desirable, but by no means necessary. If you
professional experience. Knowledge of Dropwizard or similar frameworks is a plus. • A solid understanding of modern lakehouse architectures and corresponding technologies, such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. • Experience with AWS services. Familiarity with S3, ECS, and EC2/Fargate would
Oxford, Oxfordshire, South East, United Kingdom Hybrid / WFH Options
Job Heron
common information security management frameworks and regulatory requirements and applicable standards such as ISO 27001 Experience in using modern data warehouse platforms such as Snowflake and Iceberg Certifications in cloud platforms and/or data engineering (e.g., AWS Certified Solutions Architect, Google Professional Data Engineer) are a plus Benefits