delta one, store of value, and/or FICC options trading. Experience with Linux-based, concurrent, high-throughput, low-latency software systems. Experience with pipeline orchestration frameworks (e.g. Airflow, Dagster). Experience with streaming platforms (e.g. Kafka), data lake platforms (e.g. Delta Lake, Apache Iceberg), and relational databases. Have a Bachelor's or advanced degree in Computer Science, Mathematics, Statistics, or Physics.
Data Analysis. SQL & Python: schema design, transformations, query optimisation, automation, testing. Track record of building ETL/ELT pipelines into modern warehouses (BigQuery, Snowflake, Redshift). Familiar with tools like Dagster, Airflow, Prefect, dbt, Dataform, SQLMesh. Cloud experience (we're on GCP) + containerisation (Docker, Kubernetes). Strong sense of ownership over data standards, security, and roadmap. A collaborator at …
Python and SQL; strong experience with data engineering libraries (e.g., Pandas, PySpark, Dask). Deep knowledge of ETL/ELT frameworks and orchestration tools (e.g., Airflow, Azure Data Factory, Dagster). Proficient in cloud platforms (preferably Azure) and services such as Data Lake, Synapse, Event Hubs, and Functions. Authoring reports and dashboards with either open source or commercial products.