Parquet Jobs in the South East

9 of 9 Parquet Jobs in the South East

Databricks Data Engineer Contract

London, South East, England, United Kingdom
Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
… for modern data and platform technologies. Nice to Have: Experience implementing data governance and observability stacks (lineage, data contracts, quality monitoring). Knowledge of data lake formats (Delta Lake, Parquet, Iceberg, Hudi). Familiarity with containerisation and streaming technologies (Docker, Kubernetes, Kafka, Flink). Exposure to lakehouse or medallion architectures within Databricks. …
Employment Type: Contractor
Rate: £550 - £600 per day

Data Software Engineer

London, South East, England, United Kingdom
Cobalt Recruitment
… in peer reviews. Collaborate with stakeholders and researchers to support analytics and product development. Integrate data from APIs, S3 buckets, and structured/unstructured sources (JSON, CSV, Excel, PDF, Parquet). Join geospatial datasets with external data sources and apply complex transformations. Define validated data schemas and create clear documentation for partners and teams. Explore and evaluate new data …
Employment Type: Full-Time
Salary: £70,000 - £90,000 per annum

Tick Data Engineer

London, South East England, United Kingdom
Harrington Starr
… to quants and researchers. What they are looking for: Experience in distributed data engineering at scale. Strong coding skills in Python, C++ or Java. Familiarity with kdb+, ClickHouse, Kafka, Parquet/Arrow. Previous experience with tick-level financial data is a strong advantage. …

Tick Data Engineer

London (City of London), South East England, United Kingdom
Harrington Starr
… to quants and researchers. What they are looking for: Experience in distributed data engineering at scale. Strong coding skills in Python, C++ or Java. Familiarity with kdb+, ClickHouse, Kafka, Parquet/Arrow. Previous experience with tick-level financial data is a strong advantage. …

Tick Data Engineer

Slough, South East England, United Kingdom
Harrington Starr
… to quants and researchers. What they are looking for: Experience in distributed data engineering at scale. Strong coding skills in Python, C++ or Java. Familiarity with kdb+, ClickHouse, Kafka, Parquet/Arrow. Previous experience with tick-level financial data is a strong advantage. …

Microsoft Fabric Architect

Slough, South East England, United Kingdom
Capgemini
… with 2+ years in Microsoft Fabric or related Microsoft Data Stack. Expertise in Power BI Datasets, Semantic Models, and Direct Lake optimization. Strong understanding of Lakehouse architecture, Delta/Parquet formats, and data governance tools. Experience integrating Fabric with Azure-native services (e.g., Azure AI Foundry, Purview, Entra ID). Familiarity with Generative AI, RAG-based architecture, and Fabric …

Microsoft Fabric Architect

London, South East England, United Kingdom
Capgemini
… with 2+ years in Microsoft Fabric or related Microsoft Data Stack. Expertise in Power BI Datasets, Semantic Models, and Direct Lake optimization. Strong understanding of Lakehouse architecture, Delta/Parquet formats, and data governance tools. Experience integrating Fabric with Azure-native services (e.g., Azure AI Foundry, Purview, Entra ID). Familiarity with Generative AI, RAG-based architecture, and Fabric …

Microsoft Fabric Architect

London (City of London), South East England, United Kingdom
Capgemini
… with 2+ years in Microsoft Fabric or related Microsoft Data Stack. Expertise in Power BI Datasets, Semantic Models, and Direct Lake optimization. Strong understanding of Lakehouse architecture, Delta/Parquet formats, and data governance tools. Experience integrating Fabric with Azure-native services (e.g., Azure AI Foundry, Purview, Entra ID). Familiarity with Generative AI, RAG-based architecture, and Fabric …

Senior Backend Software Engineer - Pathogen

Oxford, Oxfordshire, United Kingdom
Ellison Institute, LLC
… Database, and PostgreSQL. Supporting innovation efforts by exploring new technologies such as vector databases to enable search and AI use cases. Using big data technologies like Kafka, Iceberg, and Parquet, along with managed databases including PostgreSQL and Oracle vector databases. Operating, monitoring, and maintaining Oracle Cloud infrastructure to ensure backend services are highly available, scalable, and secure. Collaborating with … NodeJS, Django, and FastAPI. Experience building flexible APIs using GraphQL. Expertise in at least one cloud platform and its managed data services. Familiarity with big data technologies such as Parquet, Iceberg, and streaming platforms like Kafka. Strong knowledge of database systems, SQL data model design, and query optimization. Experience with containerization using Kubernetes and Docker. Proven ability to deliver …
Employment Type: Permanent
Salary: GBP Annual
Parquet salaries in the South East
25th Percentile: £42,500
Median: £45,000
75th Percentile: £47,500