Ecosystem: Experience with logging/monitoring. Exposure to data governance, cataloguing, and lineage tools. Ability to work with a range of structured, semi-structured and unstructured file formats including Parquet, JSON, CSV, XML, PDF, JPG. Tools and methods to develop comprehensive data reliability and active metadata solutions. Ability to work with and develop APIs (including data transformations).
London, England, United Kingdom Hybrid / WFH Options
Harnham
for modern data and platform technologies. Nice to Have: Experience implementing data governance and observability stacks (lineage, data contracts, quality monitoring). Knowledge of data lake formats (Delta Lake, Parquet, Iceberg, Hudi). Familiarity with containerisation and streaming technologies (Docker, Kubernetes, Kafka, Flink). Exposure to lakehouse or medallion architectures within Databricks.
cloud and on-prem environments. Required Skills & Experience Strong proficiency in Python, including libraries such as Pandas, NumPy, and PySpark. Experience with data engineering tools (e.g., Airflow, Kafka, SQL, Parquet). Solid understanding of commodities markets, trading workflows, and financial instruments. Familiarity with cloud platforms (AWS, Azure, or GCP) and containerization (Docker, Kubernetes). Proven ability to work in
Northern Ireland, United Kingdom Hybrid / WFH Options
Ocho
/Sub) * Observability stacks (OpenTelemetry, Prometheus, Grafana) * IaC (Terraform), security-by-design, OAuth/OIDC, secrets management * Batch & streaming data processing (Spark/Flink/Beam) or columnar analytics (Parquet/Arrow) Career Paths (we're hiring multiple profiles): * Go Backend (API-first): Own product APIs, performance, reliability. * Go Platform (ML-first): Focus on data lake, pipelines, model serving.
Actions, Jenkins, or AWS CodePipeline). Proven ability to secure and scale production systems. Monitoring and observability tools (CloudWatch, Grafana, OpenTelemetry). Familiar with data exchange formats (JSON, YAML, Parquet) and API design. Leadership & Delivery 4-8 years in software development and/or DevOps, including 2+ in a management or team lead role. Experience in defence, aerospace, or
or experience working with text-to-code language models Knowledge or experience with big data analytics platforms (Databricks being a plus) Knowledge of data processing file formats such as Parquet, Avro, CSV Who You Are Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience. Proven experience as a Full Stack Software Engineer or
meet the needs of platform tools. Work with product managers to capture requirements, wireframe solutions, and design user experiences. Work with big data technologies such as Kafka, Iceberg, and Parquet, and managed database technologies including PostgreSQL and Oracle vector databases. Ensure applications are secure. Operate, monitor, and maintain associated Oracle Cloud infrastructure to ensure platform tools are highly available
optimising performance across SQL Server, PostgreSQL, and cloud databases Proven track record with complex data migration projects (terabyte+ datasets, multiple legacy source systems, structured and unstructured data) Proficiency with Parquet/Delta Lake or other modern data storage formats Experience with streaming architectures using Kafka, Event Hubs, or Kinesis for real-time data processing Knowledge of data architectures supporting
in peer reviews. Collaborate with stakeholders and researchers to support analytics and product development. Integrate data from APIs, S3 buckets, and structured/unstructured sources (JSON, CSV, Excel, PDF, Parquet). Join geospatial datasets with external data sources and apply complex transformations. Define validated data schemas and create clear documentation for partners and teams. Explore and evaluate new data
Belfast, County Antrim, Northern Ireland, United Kingdom
Experis
Analyse and reconcile data, with a strong focus on SQL and Postgres JSON/JSONB functions. Perform testing activities across ETL processes and data validation tasks. Validate JSON and Parquet data, ensuring accuracy and adherence to schema standards. Use Postman for API testing and PowerShell to support automation. Contribute to continuous improvement within the Test and Validation team. Skills
other Financial instruments Strong SQL experience Strong Python experience (PySpark, Pandas, Jupyter Notebooks etc.) Airflow/Algo for workflow management Git Any exposure to compressed file formats such as Parquet or HDF5 highly advantageous Linux/Bash skills highly desirable Please apply ASAP for more information.