Have These: Exposure to, and ideally experience with, modern data architectures (e.g. data lake, lakehouse, data mesh) and accompanying technologies (e.g. Azure Synapse, Snowflake, Amazon Redshift). Awareness of data governance and all surrounding legislation. Interest in security and how to handle PII data. Excellent written and verbal communication skills …
of expertise. A natural influencer with strong communication skills, you love sharing your knowledge with others and helping them grow. Our Technology Stack: Snowflake, Salesforce CDP, AWS, AWS Lake Formation, AWS Kinesis, AWS EventBridge, Glue/Glue DataBrew, AppFlow, NoSQL databases (e.g. DynamoDB), SQL databases (e.g. MySQL) …
Manchester Area, United Kingdom Hybrid / WFH Options
Airtime Rewards
equivalent experience. Experience designing cloud data warehouse solutions, data modelling, and building ETL/ELT processes, preferably on GCP or equivalent platforms (AWS, Azure, Snowflake). Proficiency with SQL, Python, Docker, and Terraform (or similar IaC tools). Strong knowledge of security best practices, data privacy, and GDPR compliance. Proven …
with data orchestration tools, e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies, e.g. dbt, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL). Knowledge of event-driven architectures and streaming technologies, e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments …
processes, and technologies. Strong SQL skills (ideally with Azure SQL), experience working with relational databases, and programming experience in Python or Scala. Experience with Snowflake and its tooling (Snowpark, Snowpipe, etc.). Familiarity with Fivetran, dbt, TensorFlow, PyTorch, and other modern data stack components. Knowledge of data integration and ETL …
GCP DataProc or GCP Cloud Data Fusion. * NoSQL databases: DynamoDB/Neo4j/Elastic, Google Cloud Datastore. * BigQuery and Data Studio/Looker. * Snowflake Data Warehouse/Platform. * Streaming technologies and processing engines: Kinesis, Kafka, Pub/Sub and Spark Streaming. * Experience of working with CI/CD technologies, Git …
varying data proficiency. Thorough understanding of data lake and data warehousing principles, and full project involvement in one or more major technology platforms, e.g. Snowflake, Databricks. Proven experience with one or more cloud services providers, e.g. AWS, Azure or Google Cloud Platform. Good understanding of role-based access control, its …
London (city), London, England Hybrid / WFH Options
T Rowe Price
Python, Java 11+ or similar, with 6+ years of professional experience. A good understanding of modern lakehouse architectures and corresponding technologies, such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt and Airflow/Dagster. Experience with cloud providers. Familiarity with AWS S3, ECS and EC2/Fargate …
CD. Professional experience with SQL and data transformation, ideally with dbt or similar. Experience with at least one of these cloud technologies: AWS, Microsoft Azure, Snowflake, GCP. Apply to the Role: roles like these are snapped up very quickly, so act now if you do not want to miss out! Reply …
for achieving project success. Key Responsibilities: Software Development: Write high-quality, maintainable code using languages such as Python and SQL. Establish data tools like Snowflake and Azure Data Lake Services (ADLS) Gen 2. Utilize PowerBI, Tableau, or similar tools to design and create interactive and visually appealing dashboards and reports. …
automation, data visualization tools, DevOps practices, machine learning frameworks, performance tuning, and data governance tools. Technical proficiency in Microsoft Azure SQL (PaaS & IaaS), CosmosDB, Snowflake Data Warehouse, Power Apps, Reporting Services, Tableau, T-SQL, Python programming, and Azure Purview. If you're ready to join a dynamic team and drive …
Coventry, England, United Kingdom Hybrid / WFH Options
WEG Tech
or equivalent experience. Strong experience working as a Data Engineer, preferably in a cloud-based environment. Experience with cloud-based data storage platforms, preferably Snowflake, Azure SQL Data Warehouse or Amazon Redshift. Solid practical experience and understanding of Data Vault and Kimball-style data warehousing methodologies. Proficient in SQL and data …
experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching. Good knowledge of Databricks, Snowflake, Azure/AWS/Oracle cloud, R, Python. Additional Information: Location: This role can be delivered in a hybrid nature from one of these offices …
to complex business requirements spanning a number of systems. At least 10 years of relevant experience. Hands-on, in-depth experience in the following: Snowflake/dbt/Airflow. Background/working experience in the following: Azure, Power BI/DAX, traditional SQL (SQL Server, MySQL, Postgres), JIRA, Confluence, (GitHub/ …
for 5 or more consecutive years. Demonstrated experience in data architecture or a similar role. Practical experience across a variety of platforms and languages, e.g. Databricks, Snowflake, Azure, AWS, Oracle Cloud, R, Python or similar. Understanding of data management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing …