Must Have Technical Skills: Level 4 - 6+ years (not expected for Apache Iceberg, as it is a newer technology) • Open Source Development • Open Source Software • Data Formats: Apache Iceberg, Parquet, ORC • Catalogs: JDBC, Nessie, Polaris • Programming Languages: Python, Scala, Java • Data Processing Frameworks: Spark, Trino, Flink • Databases: SQL, NoSQL (e.g., Cassandra, DynamoDB) • Workflow Orchestration: Apache Airflow, Apache Kafka
Belfast, County Antrim, Northern Ireland, United Kingdom
Experis
Analyse and reconcile data, with a strong focus on SQL and Postgres JSON/JSONB functions. Perform testing activities across ETL processes and data validation tasks. Validate JSON and Parquet data, ensuring accuracy and adherence to schema standards. Use Postman for API testing and PowerShell to support automation. Contribute to continuous improvement within the Test and Validation team.
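The validation work described here, checking that JSON records conform to a schema, can be sketched as a minimal stdlib-only Python check. The schema and field names below are hypothetical; in practice a role like this would more likely lean on Postgres JSONB operators or a library such as `jsonschema`:

```python
import json

# Hypothetical schema: required field names mapped to their expected Python types.
SCHEMA = {"id": int, "name": str, "amount": float}

def validate_record(raw: str) -> list[str]:
    """Return a list of schema violations for one JSON record (empty list = valid)."""
    errors = []
    try:
        record = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    for field, expected in SCHEMA.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected):
            errors.append(f"wrong type for {field}: {type(record[field]).__name__}")
    return errors

print(validate_record('{"id": 1, "name": "widget", "amount": 9.99}'))  # []
print(validate_record('{"id": 1, "name": "widget"}'))  # ['missing field: amount']
```

The same shape of check scales to Parquet by validating the file's declared schema against an expected column/type mapping instead of per-record parsing.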
analytic datasets. Ideally you have experience with Python and its common data processing packages, and you will have used these to process data from databases or other sources, for example, Parquet data stored in a data lake. You're passionate about the tools you use to make development as efficient as possible; for example, you may have experience with git
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
Overview All our office locations considered: Newbury & Liverpool (UK); Šibenik, Croatia. We're on the hunt for builders. Not in construction, but designers and builders of systems for all things data related, helping us conquer the Data World.
other Financial instruments • Strong SQL experience • Strong Python experience (PySpark, Pandas, Jupyter Notebooks etc.) • Airflow/Algo for workflow management • Git • Any exposure to compressed file formats such as Parquet or HDF5 highly advantageous • Linux/Bash skills highly desirable. Please apply ASAP for more information.
London (City of London), South East England, United Kingdom
Hunter Bond
to quants and researchers. What they are looking for: • Experience in distributed data engineering at scale • Strong coding skills in Python, C++ or Java • Familiarity with kdb+, ClickHouse, Kafka, Parquet/Arrow • Previous experience with tick-level financial data is a strong advantage.
with 2+ years in Microsoft Fabric or related Microsoft Data Stack. Expertise in Power BI Datasets, Semantic Models, and Direct Lake optimization. Strong understanding of Lakehouse architecture, Delta/Parquet formats, and data governance tools. Experience integrating Fabric with Azure-native services (e.g., Azure AI Foundry, Purview, Entra ID). Familiarity with Generative AI, RAG-based architecture, and Fabric
London (City of London), South East England, United Kingdom
Capgemini
Bristol, Avon, England, United Kingdom Hybrid / WFH Options
Tank Recruitment
on experience with Azure cloud technologies: Synapse, Fabric, AzureML, ADX, ADF, Azure Data Lake Storage, Event Hubs. Experience with visualisation tools such as Power BI and Streamlit. Familiarity with Parquet and Delta Parquet formats. Strong data modelling and architecture knowledge across SQL and NoSQL databases. Understanding of the software development lifecycle and ML-Ops. Knowledge of advanced statistical
this is preferred. Back End: Node/Python – Node is used for application back end whereas Python is used for Data, APIs, Workflows etc. Database – Postgres/DuckDB/Parquet. Public cloud experience. Please note: As they scale the business and build out their engineering function, they are looking for people to be onsite regularly to bring the team
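The Postgres/DuckDB/Parquet stack mentioned above reflects a common embedded-analytics pattern: run SQL directly inside the application process over local data. A minimal sketch of that pattern, using the stdlib sqlite3 module as a stand-in (DuckDB exposes a very similar connect/execute API shape and adds direct Parquet reads, which sqlite3 lacks; the table and data here are invented for illustration):

```python
import sqlite3

# Embedded database: no server process, just a connection in this process.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER, price REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("ABC", 100, 9.5), ("ABC", 50, 10.0), ("XYZ", 75, 20.0)],
)

# Aggregate with plain SQL; with DuckDB the FROM clause could instead be
# read_parquet('trades.parquet') to query a Parquet file in place.
for symbol, notional in conn.execute(
    "SELECT symbol, SUM(qty * price) FROM trades GROUP BY symbol ORDER BY symbol"
):
    print(symbol, notional)  # ABC 1450.0, then XYZ 1500.0
conn.close()
```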
this is preferred. Back End: Node/Python – Node is used for application back end whereas Python is used for Data, APIs, Workflows etc. Database – Postgres/DuckDB/Parquet. As they scale the business, they are looking for this person to be a Leader within the business. Therefore, want someone to be visible and in the office. Especially