London, England, United Kingdom Hybrid / WFH Options
Harnham
… for modern data and platform technologies. Nice to Have: Experience implementing data governance and observability stacks (lineage, data contracts, quality monitoring). Knowledge of data lake formats (Delta Lake, Parquet, Iceberg, Hudi). Familiarity with containerisation and streaming technologies (Docker, Kubernetes, Kafka, Flink). Exposure to lakehouse or medallion architectures within Databricks. …
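By way of illustration for the Delta Lake and medallion-architecture skills this listing names, here is a minimal PySpark sketch of a bronze-to-silver hop; the paths, column names, and session configuration are assumptions for the example, not details from the listing, and the delta-spark package is assumed to be installed.

```python
# A minimal bronze-to-silver medallion sketch, assuming delta-spark is
# installed; all paths and column names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder.appName("medallion-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Bronze: raw Parquet landed by an upstream ingest job (hypothetical path).
bronze = spark.read.parquet("/lake/bronze/trades")

# Silver: de-duplicated, typed, and quality-filtered.
silver = (
    bronze.dropDuplicates(["trade_id"])
    .withColumn("trade_ts", F.to_timestamp("trade_ts"))
    .filter(F.col("notional") > 0)
)

silver.write.format("delta").mode("overwrite").save("/lake/silver/trades")
```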
London (City of London), South East England, United Kingdom
Vertus Partners
… cloud and on-prem environments. Required Skills & Experience: Strong proficiency in Python, including libraries such as Pandas, NumPy, and PySpark. Experience with data engineering tools (e.g., Airflow, Kafka, SQL, Parquet). Solid understanding of commodities markets, trading workflows, and financial instruments. Familiarity with cloud platforms (AWS, Azure, or GCP) and containerization (Docker, Kubernetes). Proven ability to work in …
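The Airflow, Pandas, and Parquet skills above might look like the following minimal daily-ingest sketch, assuming Airflow 2.4+ with the TaskFlow API and pyarrow installed for the Parquet writes; the symbols, paths, and transformation are hypothetical, not from the listing.

```python
# A minimal Airflow DAG sketch for a daily commodities ingest, assuming
# Airflow 2.4+ (TaskFlow API); all paths and values are hypothetical.
from datetime import datetime
import pandas as pd
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def commodities_ingest():
    @task
    def extract() -> str:
        # Pull raw prices (stubbed here) and land them as Parquet.
        df = pd.DataFrame({"symbol": ["BRN", "WTI"], "close": [82.1, 78.4]})
        path = "/tmp/prices.parquet"
        df.to_parquet(path, index=False)  # requires pyarrow or fastparquet
        return path

    @task
    def transform(path: str) -> None:
        df = pd.read_parquet(path)
        df["close_usd"] = df["close"].round(2)  # placeholder transformation
        df.to_parquet("/tmp/prices_clean.parquet", index=False)

    transform(extract())

commodities_ingest()
```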
City of Westminster, London, United Kingdom Hybrid / WFH Options
Additional Resources
… Kubernetes, Docker, and cloud-native data ecosystems. Demonstrable experience with Infrastructure as Code tools (Terraform, Ansible). Hands-on experience with PostgreSQL and familiarity with lakehouse technologies (e.g. Apache Parquet, Delta Tables). Exposure to Spark, Databricks, and data lake/lakehouse environments. Understanding of Agile development methods, CI/CD pipelines, GitHub, and automated testing. Practical experience monitoring …
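As a sketch of the PostgreSQL-to-lakehouse work this listing implies, the snippet below exports a table to Parquet in chunks with pandas and SQLAlchemy; the connection string, table name, and output paths are hypothetical.

```python
# A minimal sketch of moving a PostgreSQL table into Parquet with pandas
# and SQLAlchemy; connection string, table, and paths are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:pass@localhost:5432/warehouse")

# Read in chunks so large tables do not have to fit in memory at once.
chunks = pd.read_sql_query("SELECT * FROM events", engine, chunksize=50_000)
for i, chunk in enumerate(chunks):
    chunk.to_parquet(f"/lake/raw/events/part-{i:05d}.parquet", index=False)
```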
… observability. Experience with logging/monitoring. Exposure to data governance, cataloguing, and lineage tools. Ability to work with a range of structured, semi-structured and unstructured file formats including Parquet, JSON, CSV, XML, PDF, and JPG. Tools and methods to develop comprehensive data reliability and active metadata solutions. Ability to work with and develop APIs (including data transformations). Ability …
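A minimal sketch of the multi-format handling asked for above, assuming pandas with pyarrow and lxml available for the Parquet and XML readers; the dispatch-table approach and paths are illustrative only, and binary formats such as PDF and JPG would need dedicated libraries beyond this sketch.

```python
# A minimal sketch for dispatching mixed file formats into DataFrames,
# assuming pandas with pyarrow (Parquet) and lxml (XML); paths hypothetical.
from pathlib import Path
import pandas as pd

READERS = {
    ".parquet": pd.read_parquet,
    ".json": pd.read_json,
    ".csv": pd.read_csv,
    ".xml": pd.read_xml,
}

def load_any(path: str) -> pd.DataFrame:
    suffix = Path(path).suffix.lower()
    try:
        return READERS[suffix](path)
    except KeyError:
        raise ValueError(f"No reader registered for {suffix!r}")
```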
Northern Ireland, United Kingdom Hybrid / WFH Options
Ocho
…/Sub). * Observability stacks (OpenTelemetry, Prometheus, Grafana). * IaC (Terraform), security-by-design, OAuth/OIDC, secrets management. * Batch & streaming data processing (Spark/Flink/Beam) or columnar analytics (Parquet/Arrow). Career Paths (we're hiring multiple profiles): * Go Backend (API-first): Own product APIs, performance, reliability. * Go Platform (ML-first): Focus on data lake, pipelines, model serving. …
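Although this listing targets Go engineers, the columnar analytics it mentions can be sketched compactly with PyArrow; the data, column names, and file path below are made up for the example.

```python
# A minimal sketch of columnar analytics with PyArrow; the table contents
# and output path are hypothetical.
import pyarrow as pa
import pyarrow.compute as pc
import pyarrow.parquet as pq

table = pa.table({
    "service": ["api", "api", "ml"],
    "latency_ms": [12.5, 48.0, 231.7],
})
pq.write_table(table, "/tmp/latency.parquet")

# Columnar filter + aggregate without materialising Python objects per row.
loaded = pq.read_table("/tmp/latency.parquet")
slow = loaded.filter(pc.greater(loaded["latency_ms"], 40.0))
print(pc.mean(slow["latency_ms"]).as_py())
```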
… Log Analytics, Serverless Architecture, ARM Templates. Strong proficiency in Spark, SQL, and Python/Scala/Java. Experience in building Lakehouse architecture using open-source table formats like Delta and Parquet, and tools like Jupyter notebooks. Strong notions of security best practices (e.g., using Azure Key Vault, IAM, RBAC, Monitor, etc.). Proficient in integrating, transforming, and consolidating data from …
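For the Azure Key Vault practice mentioned above, a minimal sketch using the azure-identity and azure-keyvault-secrets packages might look like this; the vault URL and secret name are hypothetical.

```python
# A minimal sketch of pulling a credential from Azure Key Vault, assuming
# azure-identity and azure-keyvault-secrets are installed; names hypothetical.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

client = SecretClient(
    vault_url="https://example-vault.vault.azure.net",
    credential=DefaultAzureCredential(),
)
jdbc_password = client.get_secret("warehouse-jdbc-password").value
```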
… will create secure, efficient, and scalable environments for our data platforms. You will leverage cloud-native technologies and AWS tools such as AWS S3, EKS, Glue, Airflow, Trino, and Parquet, while preparing to adopt Apache Iceberg for even greater performance and flexibility. You'll tackle high-performance data workloads, ensuring seamless execution of massive queries, including 600+ billion-row …
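A small boto3 sketch of inspecting a date-partitioned Parquet layout on S3, of the kind this listing describes; the bucket, prefix, and credential setup are assumptions (credentials are taken from the environment).

```python
# A minimal sketch of listing a date-partitioned Parquet prefix on S3,
# assuming AWS credentials in the environment; bucket/prefix hypothetical.
import boto3

s3 = boto3.client("s3")
resp = s3.list_objects_v2(Bucket="example-lake", Prefix="trades/dt=2024-01-01/")
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```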
Warrington, Cheshire, North West England, United Kingdom
RiskPod
… Actions, Jenkins, or AWS CodePipeline). Proven ability to secure and scale production systems. Monitoring and observability tools (CloudWatch, Grafana, OpenTelemetry). Familiar with data exchange formats (JSON, YAML, Parquet) and API design. Leadership & Delivery: 4–8 years in software development and/or DevOps, including 2+ in a management or team lead role. Experience in defence, aerospace, or …
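For the JSON/YAML data-exchange formats above, here is a minimal sketch of loading and sanity-checking a YAML deployment config with PyYAML, then echoing it as JSON; the keys shown are hypothetical.

```python
# A minimal sketch of validating a YAML config and echoing it as JSON,
# assuming PyYAML is installed; the config keys are hypothetical.
import json
import yaml

raw = """
service: risk-engine
replicas: 3
"""
cfg = yaml.safe_load(raw)
assert isinstance(cfg.get("replicas"), int) and cfg["replicas"] > 0
print(json.dumps(cfg, indent=2))
```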
… or experience working with text-to-code language models. Knowledge of or experience with big data analytics platforms (Databricks being a plus). Knowledge of data processing file formats (such as Parquet, Avro, CSV). Who You Are: Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience. Proven experience as a Full Stack Software Engineer or …
… meet the needs of platform tools. Work with product managers to capture requirements, wireframe solutions, and design user experiences. Work with big data technologies such as Kafka, Iceberg, and Parquet, and managed database technologies including PostgreSQL and Oracle vector databases. Ensure applications are secure. Operate, monitor, and maintain associated Oracle Cloud infrastructure to ensure platform tools are highly available …
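The Kafka work mentioned above could start from something like this minimal producer sketch, assuming the kafka-python package and a broker on localhost; the topic name and payload are illustrative only.

```python
# A minimal Kafka producer sketch, assuming kafka-python and a local
# broker; the topic and payload are hypothetical.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("platform-events", value={"tool": "catalog", "status": "healthy"})
producer.flush()
```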
… and/or Rust. Experience with distributed data processing frameworks such as PySpark. Experience with agentic learning models. Experience using MLOps frameworks and components (e.g. DVC, Horovod, Spark, ONNX, Parquet). Familiarity with SQL and modern database technologies (e.g., MinIO, Yugabyte). Understanding of secure software development practices and/or experience working in classified environments. Ability to build and manage …
… optimising performance across SQL Server, PostgreSQL, and cloud databases. Proven track record with complex data migration projects (terabyte+ datasets, multiple legacy source systems, structured and unstructured data). Proficiency with Parquet/Delta Lake or other modern data storage formats. Experience with streaming architectures using Kafka, Event Hubs, or Kinesis for real-time data processing. Knowledge of data architectures supporting …
… in peer reviews. Collaborate with stakeholders and researchers to support analytics and product development. Integrate data from APIs, S3 buckets, and structured/unstructured sources (JSON, CSV, Excel, PDF, Parquet). Join geospatial datasets with external data sources and apply complex transformations. Define validated data schemas and create clear documentation for partners and teams. Explore and evaluate new data …
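A minimal sketch of the geospatial join described above, assuming GeoPandas and Shapely are installed; the file paths and column names (site_id, region_name) are hypothetical.

```python
# A minimal sketch of joining point data to regions with GeoPandas,
# assuming geopandas/shapely; paths and columns are hypothetical.
import geopandas as gpd

points = gpd.read_file("/data/sites.geojson")
regions = gpd.read_file("/data/regions.geojson")

# Spatial join: attach region attributes to each point it falls within,
# reprojecting the regions to the points' CRS first.
joined = gpd.sjoin(points, regions.to_crs(points.crs), predicate="within")
print(joined[["site_id", "region_name"]].head())
```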