Role: Python Software Engineer Team Lead, required by … Experience leading small teams on the delivery of projects. Required: AWS, Glue, Dremio, Agile. The following is DESIRABLE, not essential: Snowflake; Spark, Airflow, Apache Iceberg, Arrow, DBT; Trading, Front Office finance; some appreciation of asset classes such as Fixed Income, equities, FX or commodities. more »
… good but not necessary). Agile. The following is DESIRABLE, not essential: AWS or GCP; buy-side experience; data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio; Fixed Income performance, risk or attribution; TypeScript and Node. Role: Python Developer (Fixed Income). In the office … times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio. This is an environment that has been described as the only corporate environment with a start-up/fintech attitude towards technology. Hours are more »
London (city), London, England Hybrid / WFH Options
T Rowe Price
of professional experience. A good understanding of modern lakehouse architectures and corresponding technologies, such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt and Airflow/Dagster. Experience with Cloud providers. Familiarity with AWS S3, ECS and EC2/Fargate would be considered particularly beneficial. Extensive experience with more »
help them further grow their already exciting business. Within this role, you will be responsible for maintaining, supporting and expanding existing data pipelines using DBT, Snowflake and S3. You will also be tasked with implementing standardised data ingress/egress pipelines, coupled with onboarding new, disparate data sets, sourced from more »
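As a rough illustration of what a standardised ingress step like the one above might involve, here is a minimal, hedged sketch in plain Python: disparate source files are renamed onto one shared schema before any downstream load (to S3/Snowflake via dbt, say). The vendor names, column maps and target schema are invented for illustration, not taken from the listing.

```python
# Hedged sketch: standardising disparate CSV data sets onto one target schema.
# Vendor names, column maps and the target schema are illustrative assumptions.
import csv
import io

TARGET_COLUMNS = ["trade_date", "instrument", "quantity"]

# Each source names the same fields differently; map them to the target schema.
COLUMN_MAPS = {
    "vendor_a": {"Date": "trade_date", "Ticker": "instrument", "Qty": "quantity"},
    "vendor_b": {"dt": "trade_date", "symbol": "instrument", "size": "quantity"},
}

def standardise(source: str, raw_csv: str) -> list[dict]:
    """Rename source-specific columns to the shared target schema."""
    mapping = COLUMN_MAPS[source]
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        rows.append({mapping[k]: v for k, v in row.items() if k in mapping})
    return rows

a = standardise("vendor_a", "Date,Ticker,Qty\n2024-01-02,VOD.L,100\n")
b = standardise("vendor_b", "dt,symbol,size\n2024-01-02,VOD.L,250\n")
```

Onboarding a new, disparate data set then reduces to adding one entry to the column map rather than writing a bespoke pipeline.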
Experience with building and optimizing data pipelines for large-scale datasets. Solid understanding of data modeling concepts and ETL processes. Experience with DBT (Data Build Tool). Desirable Skills: Experience with FastAPI. Familiarity with Python UI framework packages. Knowledge of Apigee (as an more »
and tracking the technology innovations applicable to the solutions. Proven experience of working effectively with senior business stakeholders. Experience of using tools including Snowflake, DBT, ADF and Azure Synapse. Ability to lead teams and projects towards a common architecture approach and language. Strong communication and collaboration skills. Additional Information: Legal more »
South East London, England, United Kingdom Hybrid / WFH Options
Burns Sheehan
or warehouse. Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow and DBT. Must haves: A team player, happy to work with several teams; this is key as you will be reporting directly to the CTO. 2+ years more »
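The extract-transform-load shape this kind of pipeline automates can be sketched in plain Python. In the stack the listing names, Airflow would schedule each step as a task and dbt would own the SQL transformations; the functions below are illustrative stand-ins with invented field names, not the employer's actual pipeline.

```python
# Hedged sketch of the extract-transform-load shape an orchestrated pipeline
# automates. All record shapes here are invented for illustration.
def extract() -> list[dict]:
    # Stand-in for reading from an upstream source (API, S3 object, etc.).
    return [
        {"symbol": "VOD.L", "price": "101.5"},
        {"symbol": "BARC.L", "price": "not-a-number"},
    ]

def transform(rows: list[dict]) -> list[dict]:
    # Cast types and drop records that fail validation.
    clean = []
    for row in rows:
        try:
            clean.append({"symbol": row["symbol"], "price": float(row["price"])})
        except ValueError:
            continue  # in production this record would be logged or dead-lettered
    return clean

def load(rows: list[dict], warehouse: list) -> None:
    # Stand-in for a write to the warehouse.
    warehouse.extend(rows)

warehouse: list[dict] = []
load(transform(extract()), warehouse)
```

Keeping each stage a pure, separately testable function is what makes the pipeline easy to lift into Airflow tasks later.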
value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. 🛠Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt 🌳Environment: Agile ✍️Process: 3 stages No CV? No problem. Email me at athomas@trg-uk.com, and let’s arrange a call more »
A world-leading hedge fund seeks a Senior Data Engineer with exceptional Python, SQL, and DBT skills for its Algorithmic Trading team. We seek a Financial Markets Data Engineer with front-office trading experience and experience building and managing large (TB+) quantitative research data pipelines. Ideally, you will more »
value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt TC: £85,000 + bonus + up to 22% pension Process: 2 stages No CV? No problem. Email me at athomas@trg-uk.com, and let more »
years of demonstrated commercial experience as a Data Engineer or similar role within large-scale environments dealing with large data sets. Expertise in SQL & dbt, and ideally Kafka. Significant Python coding skills. Containerisation experience (Docker, Kubernetes). Cloud computing experience (GCP/AWS/Azure). Strong preference for a Snowflake background more »
capabilities. Expert data engineering and database practitioner; Experience of designing and building AWS based data solutions including pipelines, data warehouse/lake. Experience of DBT or equivalent data modelling tool. Experience of administrating SQL databases and highly proficient in SQL query optimisation. Broad experience of data analytics and reporting platforms more »
working with relational databases, and programming experience in Python or Scala. Experience with Snowflake and its tooling (Snowpark, Snowpipe, etc.). Familiarity with Fivetran, DBT, TensorFlow, PyTorch, and other modern data stack components. Knowledge of data integration and ETL frameworks and tools. Understanding of DataOps, data mining, and data visualization more »
with AWS, especially AWS services focused on data flow, pipelines, data transformation, storage and streaming. Excellent data engineering skills, for example with SQL, Python, DBT and Airflow. Good understanding of service-oriented architecture; experience of exposing and consuming data via APIs, streams and webhooks. Confidence in coding, scripting, configuring, versioning more »
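A minimal sketch of the "consuming data via APIs, streams and webhooks" side of this skill set, combined with the SQL proficiency the listing asks for: an event handler applies incoming JSON payloads idempotently with an upsert. The stdlib sqlite3 module stands in for the warehouse here, and the table and payload shapes are assumptions for illustration only.

```python
# Hedged sketch: consuming event payloads (as a webhook or stream consumer
# might) and applying them idempotently with SQL. sqlite3 stands in for the
# warehouse; the table and payload shapes are illustrative assumptions.
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE positions (account TEXT PRIMARY KEY, quantity INTEGER)")

def handle_event(payload: str) -> None:
    """Upsert a position from a JSON event; replaying an event never duplicates rows."""
    event = json.loads(payload)
    conn.execute(
        "INSERT INTO positions (account, quantity) VALUES (?, ?) "
        "ON CONFLICT(account) DO UPDATE SET quantity = excluded.quantity",
        (event["account"], event["quantity"]),
    )

handle_event('{"account": "ACC-1", "quantity": 100}')
handle_event('{"account": "ACC-1", "quantity": 150}')  # update, still one row
```

The upsert makes the consumer safe to replay, which matters whenever webhooks or streams can deliver the same event more than once.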
smooth operations and reliability. YOUR EXPERIENCE From a technical perspective, you will require: Proficiency in Tableau, SQL and cloud data warehouses. Experience with BigQuery, DBT, or Python/R is beneficial. Your background should include: Demonstrated commercial acumen and adaptability in dynamic environments. Proven ability to translate data insights into more »
smooth operations and reliability. YOUR EXPERIENCE From a technical perspective, you will require: Proficiency in Tableau, SQL and cloud data warehouses. Experience with BigQuery, DBT, or Python/R is beneficial. Your background should include: Experience leading BI capabilities as an individual contributor. Demonstrated commercial acumen and adaptability in dynamic more »