knowledge of key technologies such as BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, and Dataflow/Airflow/ADF. Strong, proven knowledge of Kimball/dimensional data modelling and/or Data Vault.
trends and best practices. Qualifications: Expertise in Java and Python development (Essential). Experience with Spark or Hadoop (Essential). Knowledge of Trino or Airflow (Desirable). Proven ability to design and implement scalable, secure solutions. Excellent communication and collaboration skills.
London (city), London, England Hybrid / WFH Options
T Rowe Price
experience. A good understanding of modern lakehouse architectures and corresponding technologies, such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. Experience with cloud providers; familiarity with AWS S3, ECS, and EC2/Fargate would be considered particularly beneficial. Extensive experience with database …
Python and dbt for data transformation. Experience converting SAS-based modules to Python-based solutions. Familiarity with Snowflake for data management. Experience with Airflow or similar technologies is a plus. Desired: experience with dbt and Snowflake is advantageous.
data and AI models. Data Engineer Required Experience: data engineering experience (2+ years); cloud platform proficiency (e.g., AWS, Azure, GCP); data pipeline development (e.g., Airflow, Apache Spark); SQL proficiency and database design; knowledge of visualization tools (e.g., Tableau, Power BI, Looker). Data Engineer Application Process: This is a 1-year contract …