Ecosystem: Experience with logging/monitoring. Exposure to data governance, cataloguing, and lineage tools. Ability to work with a range of structured, semi-structured and unstructured file formats including Parquet, JSON, CSV, XML, PDF, and JPG. Tools and methods to develop comprehensive data reliability and active metadata solutions. Ability to work with and develop APIs (including data transformations). Ability…
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
…for modern data and platform technologies. Nice to Have: Experience implementing data governance and observability stacks (lineage, data contracts, quality monitoring). Knowledge of data lake formats (Delta Lake, Parquet, Iceberg, Hudi). Familiarity with containerisation and streaming technologies (Docker, Kubernetes, Kafka, Flink). Exposure to lakehouse or medallion architectures within Databricks.
…cloud and on-prem environments. Required Skills & Experience: Strong proficiency in Python, including libraries such as Pandas, NumPy, and PySpark. Experience with data engineering tools (e.g., Airflow, Kafka, SQL, Parquet). Solid understanding of commodities markets, trading workflows, and financial instruments. Familiarity with cloud platforms (AWS, Azure, or GCP) and containerization (Docker, Kubernetes). Proven ability to work in…
London (City of London), South East England, United Kingdom
Vertus Partners
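As a purely illustrative aside (not taken from any of these listings), the Python/PySpark/Parquet combination the roles above keep asking for tends to look something like the sketch below; the bucket path and column names are invented placeholders, not details from a real trading system:

    # Illustrative sketch only: read a Parquet extract with PySpark and run a
    # simple aggregation. Paths and column names are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("commodities-example").getOrCreate()

    # Parquet files carry their own column types, so no explicit schema is needed here.
    trades = spark.read.parquet("s3a://example-bucket/trades/")

    daily_volume = (
        trades
        .groupBy("trade_date", "commodity")
        .agg(F.sum("quantity").alias("total_quantity"))
    )

    daily_volume.show()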
London, South East, England, United Kingdom Hybrid / WFH Options
Additional Resources Ltd
…Kubernetes, Docker, and cloud-native data ecosystems. Demonstrable experience with Infrastructure as Code tools (Terraform, Ansible). Hands-on experience with PostgreSQL and familiarity with lakehouse technologies (e.g. Apache Parquet, Delta Tables). Exposure to Spark, Databricks, and data lake/lakehouse environments. Understanding of Agile development methods, CI/CD pipelines, GitHub, and automated testing. Practical experience monitoring…
Northern Ireland, United Kingdom Hybrid / WFH Options
Ocho
…/Sub)
* Observability stacks (OpenTelemetry, Prometheus, Grafana)
* IaC (Terraform), security-by-design, OAuth/OIDC, secrets management
* Batch & streaming data processing (Spark/Flink/Beam) or columnar analytics (Parquet/Arrow)
Career Paths (we're hiring multiple profiles):
* Go Backend (API-first): Own product APIs, performance, reliability.
* Go Platform (ML-first): Focus on data lake, pipelines, model serving.
…Actions, Jenkins, or AWS CodePipeline). Proven ability to secure and scale production systems. Monitoring and observability tools (CloudWatch, Grafana, OpenTelemetry). Familiar with data exchange formats (JSON, YAML, Parquet) and API design. Leadership & Delivery: 4–8 years in software development and/or DevOps, including 2+ in a management or team lead role. Experience in defence, aerospace, or…
Bolton, Greater Manchester, North West England, United Kingdom
RiskPod
Warrington, Cheshire, North West England, United Kingdom
RiskPod
…meet the needs of platform tools. Work with product managers to capture requirements, wireframe solutions, and design user experiences. Work with big data technologies such as Kafka, Iceberg, and Parquet, and managed database technologies including PostgreSQL and Oracle vector databases. Ensure applications are secure. Operate, monitor, and maintain associated Oracle Cloud infrastructure to ensure platform tools are highly available…
…in peer reviews. Collaborate with stakeholders and researchers to support analytics and product development. Integrate data from APIs, S3 buckets, and structured/unstructured sources (JSON, CSV, Excel, PDF, Parquet). Join geospatial datasets with external data sources and apply complex transformations. Define validated data schemas and create clear documentation for partners and teams. Explore and evaluate new data…
Belfast, County Antrim, Northern Ireland, United Kingdom
Experis
Analyse and reconcile data, with a strong focus on SQL and Postgres JSON/JSONB functions. Perform testing activities across ETL processes and data validation tasks. Validate JSON and Parquet data, ensuring accuracy and adherence to schema standards. Use Postman for API testing and PowerShell to support automation. Contribute to continuous improvement within the Test and Validation team. Skills…
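Purely as an illustration of the kind of schema check described above (the listing does not specify the validation tooling), a Parquet schema check can be written with pyarrow; the field names and file path below are invented for the example:

    # Hypothetical example: confirm a Parquet file's schema matches an expected layout.
    import pyarrow as pa
    import pyarrow.parquet as pq

    # Expected schema for the feed (illustrative field names only).
    expected = pa.schema([
        ("record_id", pa.int64()),
        ("payload", pa.string()),
        ("amount", pa.float64()),
    ])

    def parquet_matches_schema(path: str) -> bool:
        # read_schema() inspects the file footer without loading the data;
        # remove_metadata() drops writer metadata so only names and types are compared.
        actual = pq.read_schema(path)
        return actual.remove_metadata().equals(expected)

    print(parquet_matches_schema("sample.parquet"))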
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
Overview: All our office locations considered: Newbury & Liverpool (UK); Šibenik, Croatia (considered). We're on the hunt for builders: not in construction, but designers and builders of systems for all things data-related, helping us conquer the Data World.
…other financial instruments. Strong SQL experience. Strong Python experience (PySpark, Pandas, Jupyter Notebooks, etc.). Airflow/Algo for workflow management. Git. Any exposure to compressed file formats such as Parquet or HDF5 highly advantageous. Linux/Bash skills highly desirable. Please apply ASAP for more information.
London (City of London), South East England, United Kingdom
Hunter Bond
…with 2+ years in Microsoft Fabric or related Microsoft Data Stack. Expertise in Power BI Datasets, Semantic Models, and Direct Lake optimization. Strong understanding of Lakehouse architecture, Delta/Parquet formats, and data governance tools. Experience integrating Fabric with Azure-native services (e.g., Azure AI Foundry, Purview, Entra ID). Familiarity with Generative AI, RAG-based architecture, and Fabric…
London (City of London), South East England, United Kingdom
Capgemini
Bristol, Avon, England, United Kingdom Hybrid / WFH Options
Tank Recruitment
…on experience with Azure cloud technologies: Synapse, Fabric, AzureML, ADX, ADF, Azure Data Lake Storage, Event Hubs. Experience with visualisation tools such as Power BI and Streamlit. Familiarity with Parquet and Delta Parquet formats. Strong data modelling and architecture knowledge across SQL and NoSQL databases. Understanding of the software development lifecycle and ML-Ops. Knowledge of advanced statistical…
…this is preferred. Back End: Node/Python – Node is used for the application back end whereas Python is used for Data, APIs, Workflows etc. Database – Postgres/DuckDB/Parquet. Public cloud experience. Please note: As they scale the business and build out their engineering function, they are looking for people to be onsite regularly to bring the team…
…this is preferred. Back End: Node/Python – Node is used for the application back end whereas Python is used for Data, APIs, Workflows etc. Database – Postgres/DuckDB/Parquet. Public cloud experience. Please note: As they scale the business, they are looking for this person to be a Leader within the business. Therefore, they want someone to be visible…