London (city), London, England Hybrid / WFH Options
T Rowe Price
… of professional experience. A good understanding of modern lakehouse architectures and corresponding technologies, such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt and Airflow/Dagster. Experience with cloud providers. Familiarity with AWS S3, ECS and EC2/Fargate would be considered particularly beneficial. Extensive experience with …
Required Skills and Experience:
Extensive experience in:
- Data Warehousing
- Data Engineering (overall)
- Data Analytics
- Data Visualisation
Proficiency in:
- Google Cloud (GCP)
- GCP BigQuery
- Python
- DBT or similar
- FastAPI or similar
- Airflow or similar
Desirable:
- Google Apigee (as an application developer)
- Exposure to Machine Learning projects
- Exposure to DataOps
- Exposure to …
… insights. Experience needed:
- Snowflake expertise, demonstrated by performance optimisation, cost management, and utilisation of advanced features, validated by Snowflake certification (e.g. SnowPro Core).
- DBT expertise coupled with advanced Python skills, specifically in data pipeline development and data task automation.
- Mastery of SQL (PostgreSQL, MySQL) for intricate queries and performance …
… of IR35. Required experience will include: Expertise in designing and implementing data pipelines using Azure services such as Azure Data Factory, Azure Databricks and DBT. Hands-on experience with SQL database design. Experience of product lifecycle management principles and tools (e.g. DevOps, Terraform) and relational database manipulation and interrogation. …
Job Title: Senior Data Engineer (SAS)
Responsibilities:
- Rewrite existing data pipelines (SAS, Info, Perl, shell scripts) to Python and DBT transformation-based solutions.
- Convert SAS-based modules to Python-based pipelines.
- Load transformed data to Snowflake on Azure.
Requirements:
- Strong experience with SAS programming (primary skill).
- Proficiency in Perl and shell scripting (secondary skills).
- Expertise in Python and DBT for data transformation.
- Experience in converting SAS-based modules to Python-based solutions.
- Familiarity with Snowflake for data management.
- Experience with Airflow or similar technologies is a plus.
Desired:
- Experience with DBT and Snowflake is advantageous.
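To illustrate the kind of SAS-to-Python rewrite this role describes, here is a minimal sketch. The SAS snippet, dataset name, column names, and threshold are all invented for the example; a real migration would map each DATA step's logic into a dbt model or a Python transformation like this one.

```python
# Hypothetical sketch of rewriting a simple SAS DATA step in plain Python.
# The imagined SAS original:
#   data flagged;
#       set transactions;
#       if amount > 100 then flag = 1;
#       else flag = 0;
#   run;

def flag_large_transactions(rows, threshold=100):
    """Return a new list of rows with a derived 'flag' column.

    `rows` is a list of dicts, one per record -- the Python analogue
    of the SAS dataset read by SET. The input is left unmodified.
    """
    flagged = []
    for row in rows:
        out = dict(row)  # copy so the source "dataset" is untouched
        out["flag"] = 1 if row["amount"] > threshold else 0
        flagged.append(out)
    return flagged

# Example usage:
transactions = [{"id": 1, "amount": 250}, {"id": 2, "amount": 40}]
result = flag_large_transactions(transactions)
```

In a dbt-based target architecture, the same derivation would typically live in a SQL model instead, with Python reserved for transformations that SQL expresses poorly.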