London (city), London, England Hybrid / WFH Options
T Rowe Price
years of professional experience. A good understanding of modern lakehouse architectures and corresponding technologies, such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt and Airflow/Dagster. Experience with cloud providers. Familiarity with AWS S3, ECS and EC2/Fargate would be considered particularly beneficial. Extensive …
Apache Iceberg & Spark. Exposure to Apache Airflow, Prefect, Dagster, dbt. Expertise in data analysis, with exposure to data services (such as Glue, Lake Formation, EMR, EventBridge, Athena, etc.) and metadata management tools (such as Amundsen, Atlas, DataHub, OpenDataDiscovery, Marquez, etc.). Experience with an RDBMS such as PostgreSQL would be a plus. Experience …
The role is a 2-year fixed-term contract. GTTS builds products that help Amazon run the world's largest transportation network, using cutting-edge technologies and machine learning, all running on AWS. We are looking for someone who is passionate about technology, loves solving customer problems, and delivers … Amazon. You help establish technical standards and drive Amazon's overall technical architecture, engineering practices, and methodologies. You think globally when building systems, ensuring Amazon builds high-performing, scalable systems that work well together. You are hands-on, producing both detailed technical work and high-level designs. In GTTS … critical issues arise with the team's products. Key job responsibilities: working with AWS technologies such as Lambda, ECS Fargate, API Gateway, RDS, DynamoDB, EMR; building customer-facing applications and APIs; building data pipelines using Spark + Scala that process TB of data per day; working with customers to …