London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
… for modern data and platform technologies. Nice to Have: Experience implementing data governance and observability stacks (lineage, data contracts, quality monitoring). Knowledge of data lake formats (Delta Lake, Parquet, Iceberg, Hudi). Familiarity with containerisation and streaming technologies (Docker, Kubernetes, Kafka, Flink). Exposure to lakehouse or medallion architectures within Databricks.
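A minimal, illustrative sketch of the bronze-to-silver "medallion" step this listing refers to, assuming a Databricks-style environment where a SparkSession (`spark`) and Delta Lake are already configured; every path and column name below is a hypothetical placeholder, not a detail from the role.

```python
# Bronze -> silver medallion sketch (illustrative only).
# Assumes a Databricks-like runtime where `spark` is predefined and Delta is enabled.
from pyspark.sql import functions as F

# Bronze: land raw JSON events as-is in a Delta table.
raw = spark.read.json("/mnt/landing/events/")              # hypothetical source path
raw.write.format("delta").mode("append").save("/mnt/bronze/events")

# Silver: de-duplicate and apply light typing before analytics use.
bronze = spark.read.format("delta").load("/mnt/bronze/events")
silver = (
    bronze.dropDuplicates(["event_id"])                    # hypothetical business key
          .withColumn("event_ts", F.to_timestamp("event_ts"))
)
silver.write.format("delta").mode("overwrite").save("/mnt/silver/events")
```

The same layered approach works with Iceberg or Hudi tables; the medallion idea (raw, cleansed, curated layers) is independent of the specific table format.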
Northern Ireland, United Kingdom Hybrid / WFH Options
Ocho
… /Sub)
* Observability stacks (OpenTelemetry, Prometheus, Grafana) (sketched below)
* IaC (Terraform), security-by-design, OAuth/OIDC, secrets management
* Batch & streaming data processing (Spark/Flink/Beam) or columnar analytics (Parquet/Arrow)
Career Paths (we're hiring multiple profiles):
* Go Backend (API-first): Own product APIs, performance, reliability.
* Go Platform (ML-first): Focus on data lake, pipelines, model serving.
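The observability bullet above names OpenTelemetry. As a rough, non-authoritative illustration (written in Python to keep the examples on this page in one language, even though the roles here are Go), this is a minimal tracer setup that prints spans to the console; the service, span, and attribute names are invented.

```python
# Minimal OpenTelemetry tracing setup, illustrative only.
# Uses the opentelemetry-sdk console exporter; all names are hypothetical.
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

provider = TracerProvider(resource=Resource.create({"service.name": "example-api"}))
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("example-api")

with tracer.start_as_current_span("handle_request") as span:
    span.set_attribute("http.route", "/v1/items")  # hypothetical attribute
    pass  # request handling would go here
```

The shape is the same in Go: configure a provider and exporter once, then open spans around units of work.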
Warrington, Cheshire, North West England, United Kingdom
RAAS LAB
… or experience working with text-to-code language models. Knowledge or experience with big data analytics platforms (Databricks being a plus). Knowledge of data processing file formats such as Parquet, Avro, CSV. Who You Are: Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience. Proven experience as a Full Stack Software Engineer or …
Familiarity with C++ is a plus. Bonus++: Experience working with quants, data scientists, or in trading environments. Tech Stack: Python 3.11+ (fast, modern, typed); Dask, pandas, PyArrow, NumPy; PostgreSQL, Parquet, S3; Airflow, Docker, Kubernetes, GitLab CI; internal frameworks built for scale and speed. Why Join: Engineers own projects end-to-end, from design to deployment to impact. Work side …
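To make the Dask/pandas/PyArrow plus Parquet and S3 combination above concrete, here is a small sketch (illustrative only; the bucket, columns, and filter values are hypothetical) that scans a partitioned Parquet dataset on S3 with PyArrow and hands a filtered slice to pandas.

```python
# Illustrative columnar read path: PyArrow dataset scan with predicate and
# column pruning, then conversion to pandas. All names are hypothetical.
import pyarrow.dataset as ds

dataset = ds.dataset("s3://example-market-data/trades/", format="parquet")

# Only matching partitions/row groups and the listed columns are read.
table = dataset.to_table(
    filter=(ds.field("symbol") == "ABC") & (ds.field("trade_date") == "2024-06-03"),
    columns=["symbol", "trade_date", "price", "size"],
)

df = table.to_pandas()
print(df.head())
```

For data too large for one machine, `dask.dataframe.read_parquet` over the same S3 prefix follows the same pattern with the work split across partitions.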
… in peer reviews. Collaborate with stakeholders and researchers to support analytics and product development. Integrate data from APIs, S3 buckets, and structured/unstructured sources (JSON, CSV, Excel, PDF, Parquet). Join geospatial datasets with external data sources and apply complex transformations. Define validated data schemas and create clear documentation for partners and teams. Explore and evaluate new data …
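As a sketch of the "validated data schemas" point (an assumption about how it might look, not this team's actual approach), the snippet below loads a file by extension with pandas and casts it against an explicit PyArrow schema; PDF extraction is deliberately left out, and the columns are invented examples.

```python
# Hedged sketch: extension-based loading plus schema validation via a cast.
# Column names and types are hypothetical; PDF handling is omitted.
import pandas as pd
import pyarrow as pa

EXPECTED = pa.schema([
    ("site_id", pa.string()),
    ("latitude", pa.float64()),
    ("longitude", pa.float64()),
    ("observed_at", pa.timestamp("us")),
])

def load_any(path: str) -> pd.DataFrame:
    """Load JSON, CSV, Excel, or Parquet into a DataFrame based on extension."""
    if path.endswith(".parquet"):
        return pd.read_parquet(path)
    if path.endswith(".csv"):
        return pd.read_csv(path)
    if path.endswith((".xls", ".xlsx")):
        return pd.read_excel(path)
    return pd.read_json(path)

def validate(df: pd.DataFrame) -> pa.Table:
    """Reorder to the expected columns and cast; raises if a column is missing
    or a value cannot be converted to the declared type."""
    ordered = df[[field.name for field in EXPECTED]]
    return pa.Table.from_pandas(ordered, preserve_index=False).cast(EXPECTED)
```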
… while also developing robust data pipelines and storage solutions optimized for high throughput, performance, and cost. You'll leverage technologies such as time-series databases, columnar storage formats (e.g., Parquet), and distributed data processing frameworks to advance the platform's capabilities. Collaboration with cross-functional teams is critical, as you'll integrate observability into Roku's cloud-native stack …
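A brief sketch of the columnar-storage idea mentioned above (illustrative; the metric names and layout are invented, not Roku's): batch time-series samples into a DataFrame and write them as Parquet partitioned by day, so time-bounded scans only touch the relevant partitions.

```python
# Illustrative only: daily-partitioned Parquet for observability time series.
import pandas as pd

samples = pd.DataFrame({
    "ts": pd.to_datetime(["2024-06-03T10:00:00Z", "2024-06-03T10:00:10Z"]),
    "service": ["playback-api", "playback-api"],      # hypothetical service name
    "metric": ["p99_latency_ms", "p99_latency_ms"],
    "value": [182.0, 177.5],
})
samples["day"] = samples["ts"].dt.date.astype(str)

# Produces a layout like metrics/day=2024-06-03/part-*.parquet
samples.to_parquet("metrics/", partition_cols=["day"], index=False)
```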
London (City of London), South East England, United Kingdom
Capgemini
… with 2+ years in Microsoft Fabric or related Microsoft Data Stack. Expertise in Power BI Datasets, Semantic Models, and Direct Lake optimization. Strong understanding of Lakehouse architecture, Delta/Parquet formats, and data governance tools. Experience integrating Fabric with Azure-native services (e.g., Azure AI Foundry, Purview, Entra ID). Familiarity with Generative AI, RAG-based architecture, and Fabric …
… Database, and PostgreSQL. Supporting innovation efforts by exploring new technologies such as vector databases to enable search and AI use cases. Using big data technologies like Kafka, Iceberg, and Parquet, along with managed databases including PostgreSQL and Oracle vector databases. Operating, monitoring, and maintaining Oracle Cloud infrastructure to ensure backend services are highly available, scalable, and secure. Collaborating with … NodeJS, Django, and FastAPI. Experience building flexible APIs using GraphQL. Expertise in at least one cloud platform and its managed data services. Familiarity with big data technologies such as Parquet, Iceberg, and streaming platforms like Kafka. Strong knowledge of database systems, SQL data model design, and query optimization. Experience with containerization using Kubernetes and Docker. Proven ability to deliver …
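To ground the Kafka/Parquet combination in this listing, here is a rough sketch (an assumption about one possible shape, not this team's pipeline) using kafka-python: consume JSON events and flush fixed-size micro-batches to Parquet files that a table format such as Iceberg could later organise. The topic, broker, and paths are hypothetical.

```python
# Hedged sketch: Kafka -> Parquet micro-batches. Topic/broker names are made up.
import json
import os

import pandas as pd
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "backend-events",                                   # hypothetical topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

os.makedirs("events", exist_ok=True)
batch, batch_size, part = [], 1_000, 0
for message in consumer:
    batch.append(message.value)
    if len(batch) >= batch_size:
        # Each flush becomes one immutable Parquet file.
        pd.DataFrame(batch).to_parquet(f"events/part-{part:05d}.parquet", index=False)
        batch, part = [], part + 1
```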
Welwyn Garden City, Hertfordshire, England, United Kingdom
Pontoon
Pontoon is an employment consultancy. We put expertise, energy, and enthusiasm into improving everyone's chance of being part of the workplace. We respect and appreciate people of all ethnicities, generations, religious beliefs, sexual orientations, gender identities, and more. We …