London, South East England, United Kingdom (Hybrid / WFH Options)
Harnham - Data & Analytics Recruitment
for modern data and platform technologies. Nice to Have: Experience implementing data governance and observability stacks (lineage, data contracts, quality monitoring). Knowledge of data lake formats (Delta Lake, Parquet, Iceberg, Hudi). Familiarity with containerisation and streaming technologies (Docker, Kubernetes, Kafka, Flink). Exposure to lakehouse or medallion architectures within Databricks.
cloud and on-prem environments. Required Skills & Experience Strong proficiency in Python, including libraries such as Pandas, NumPy, and PySpark. Experience with data engineering tools (e.g., Airflow, Kafka, SQL, Parquet). Solid understanding of commodities markets, trading workflows, and financial instruments. Familiarity with cloud platforms (AWS, Azure, or GCP) and containerization (Docker, Kubernetes). Proven ability to work in …
London (City of London), South East England, United Kingdom
Vertus Partners
meet the needs of platform tools. Work with product managers to capture requirements, wireframe solutions, and design user experiences. Work with big data technologies such as Kafka, Iceberg, and Parquet, and managed database technologies including PostgreSQL and Oracle vector databases. Ensure applications are secure. Operate, monitor, and maintain associated Oracle Cloud infrastructure to ensure platform tools are highly available …
in peer reviews. Collaborate with stakeholders and researchers to support analytics and product development. Integrate data from APIs, S3 buckets, and structured/unstructured sources (JSON, CSV, Excel, PDF, Parquet). Join geospatial datasets with external data sources and apply complex transformations. Define validated data schemas and create clear documentation for partners and teams. Explore and evaluate new data …
other Financial instruments Strong SQL experience Strong Python experience (PySpark, Pandas, Jupyter Notebooks etc.) Airflow/Algo for workflow management Git Any exposure to compressed file formats such as Parquet or HDF5 highly advantageous Linux/Bash skills highly desirable Please apply ASAP for more information.
London (City of London), South East England, United Kingdom
Hunter Bond
to quants and researchers What they are looking for: Experience in distributed data engineering at scale Strong coding skills in Python, C++ or Java Familiarity with kdb+, ClickHouse, Kafka, Parquet/Arrow Previous experience with tick-level financial data is a strong advantage
London (City of London), South East England, United Kingdom
Harrington Starr
with 2+ years in Microsoft Fabric or related Microsoft Data Stack. Expertise in Power BI Datasets, Semantic Models, and Direct Lake optimization. Strong understanding of Lakehouse architecture, Delta/Parquet formats, and data governance tools. Experience integrating Fabric with Azure-native services (e.g., Azure AI Foundry, Purview, Entra ID). Familiarity with Generative AI, RAG-based architecture, and Fabric …
London (City of London), South East England, United Kingdom
Capgemini
Database, and PostgreSQL. Supporting innovation efforts by exploring new technologies such as vector databases to enable search and AI use cases. Using big data technologies like Kafka, Iceberg, and Parquet, along with managed databases including PostgreSQL and Oracle vector databases. Operating, monitoring, and maintaining Oracle Cloud infrastructure to ensure backend services are highly available, scalable, and secure. Collaborating with … NodeJS, Django, and FastAPI. Experience building flexible APIs using GraphQL. Expertise in at least one cloud platform and its managed data services. Familiarity with big data technologies such as Parquet, Iceberg, and streaming platforms like Kafka. Strong knowledge of database systems, SQL data model design, and query optimization. Experience with containerization using Kubernetes and Docker. Proven ability to deliver …
ll Do Build and maintain high-performance tick data pipelines for ingesting, processing, and storing large volumes of market data. Work with time-series databases (e.g., KDB, OneTick) and Parquet-based file storage to optimize data access and retrieval. Design scalable cloud-native solutions (AWS preferred) for market data ingestion and distribution. (Bonus) Integrate Apache Iceberg for large-scale … focus on market data systems. Strong Python skills and familiarity with cloud platforms (AWS, GCP, or Azure). Experience with tick data and building tick data pipelines. Proficiency with Parquet-based file storage; Iceberg experience is a plus. Familiarity with Kubernetes, containerization, and modern orchestration tools. Experience with time-series databases (KDB, OneTick) and C++ is a plus. Strong …
London (City of London), South East England, United Kingdom
Selby Jennings