processes, and technologies. Strong SQL skills (ideally with Azure SQL), experience working with relational databases, and programming experience in Python or Scala. Experience with Snowflake and its tooling (Snowpark, Snowpipe, etc.). Familiarity with Fivetran, dbt, TensorFlow, PyTorch, and other modern data stack components. Knowledge of data integration and ETL …
GCP DataProc or GCP Cloud Data Fusion. * NoSQL databases: DynamoDB/Neo4j/Elastic, Google Cloud Datastore. * BigQuery and Data Studio/Looker. * Snowflake Data Warehouse/Platform. * Streaming technologies and processing engines: Kinesis, Kafka, Pub/Sub and Spark Streaming. * Experience working with CI/CD technologies, Git …
London (city), London, England Hybrid / WFH Options
T Rowe Price
Python, Java 11+ or similar, with 6+ years of professional experience. A good understanding of modern lakehouse architectures and corresponding technologies, such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt and Airflow/Dagster. Experience with cloud providers. Familiarity with AWS S3, ECS and EC2/Fargate …
solutions, including the choice of data sources and ETL approach. Familiarity with engineering processes for developing APIs. Understanding of the principles of building solutions using Snowflake, open-source frameworks and multi-cloud infrastructure. This is a contract position.
CD. Professional experience with SQL and data transformation, ideally with dbt or similar. Experience with at least one of these cloud technologies: AWS, Microsoft Azure, Snowflake, GCP. Apply to the role: roles like these are snapped up very quickly, so act now if you do not want to miss out! Reply …
for achieving project success. Key Responsibilities: Software Development: Write high-quality, maintainable code using languages such as Python and SQL. Establish data tools like Snowflake and Azure Data Lake Storage (ADLS) Gen 2. Utilize Power BI, Tableau, or similar tools to design and create interactive and visually appealing dashboards and reports. …
automation, data visualization tools, DevOps practices, machine learning frameworks, performance tuning, and data governance tools. Technical proficiency in Microsoft Azure SQL (PaaS & IaaS), Cosmos DB, Snowflake Data Warehouse, Power Apps, Reporting Services, Tableau, T-SQL, Python programming, and Azure Purview. If you're ready to join a dynamic team and drive …
experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching. Good knowledge of Databricks, Snowflake, Azure/AWS/Oracle Cloud, R, Python. Additional Information: Location: this role can be delivered in a hybrid nature from one of these offices …
for 5 or more consecutive years. Demonstrated experience in data architecture or a similar role. Practical experience across a variety of platforms and languages, e.g. Databricks, Snowflake, Azure, AWS, Oracle Cloud, R, Python or similar. Understanding of data management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing …
an extremely fast-paced environment. Within this role, you will be responsible for building data pipelines for a cloud-based warehouse using Azure and Snowflake, enhancing data capabilities for analytics and science. What you need: 3+ years of hands-on experience as a Data Engineer, building ETL pipelines and managing …
Knowledge of Scala or R is a plus. Experienced in SQL. Familiarity with various relational database platforms is a plus (SQL Server, MySQL, PostgreSQL, Oracle, Snowflake, Vertica, etc.). Ability to write efficient and robust queries. Familiarity with DevOps processes for model deployment and unit testing. Experience working in cloud …
experience developing ML or statistical models related to pricing. Strong familiarity with data visualization software (e.g., Tableau, Power BI) and data management tools (e.g., SQL, Snowflake). Bonus points for: experience implementing machine learning models and familiarity with large language models. Knowledge of cloud-based solutions on major providers (Azure, GCP …
data and information systems design. Experience with data management, database management systems, data services (including APIs), and enterprise data platform technologies (e.g. Fabric, Databricks, Snowflake, etc.). Experience architecting data-centric solutions at a conceptual, logical and physical level. Experience in architecting, designing and implementing solutions on AWS and Microsoft Azure …
London, England, United Kingdom Hybrid / WFH Options
Anson McCade
SSIS, Talend or Pentaho • Data governance and data management tools such as Informatica MDM, Informatica AXON, Informatica EDC, and Collibra • MySQL, SQL Server, Oracle, Snowflake, PostgreSQL and NoSQL databases • Programming languages such as Spark or Python • Amazon Web Services, Microsoft Azure or Google Cloud and distributed processing technologies such as …