new technologies and frameworks. Nice to have: knowledge of databases and SQL; familiarity with Boost ASIO; familiarity with data serialization formats such as Apache Arrow/Parquet, Google Protocol Buffers and FlatBuffers; experience with gRPC, HTTP/REST and WebSocket protocols; experience with Google Cloud/AWS and/or containerization …
Good understanding of cloud environments (ideally Azure), distributed computing, and scaling workflows and pipelines. Understanding of common data transformation and storage formats, e.g. Apache Parquet. Awareness of data standards such as GA4GH and FAIR. Exposure to genotyping and imputation is highly advantageous. Benefits: competitive base salary, generous pension …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
RVU Co UK
e.g. DuckDB, Polars, Daft). Familiarity with eventing technologies (Event Hubs, Kafka, etc.). Deep understanding of file formats and their behaviour, such as Parquet, Delta and Iceberg. What we offer: we want to give you a great work environment and to contribute back to both your personal and professional development …
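The listing above leans on engines that query Parquet files in place rather than loading them into a database first. Purely as an illustrative sketch (the file name and column names are hypothetical, not taken from the listing), this is roughly how DuckDB and Polars can both read the same Parquet file:

```python
import duckdb
import polars as pl

# DuckDB runs SQL directly over a Parquet file, no import step needed
con = duckdb.connect()
summary = con.execute(
    "SELECT category, COUNT(*) AS n "
    "FROM read_parquet('events.parquet') GROUP BY category"
).fetch_df()

# Polars scans the same file lazily, so the filter is pushed down to the Parquet reader
filtered = (
    pl.scan_parquet("events.parquet")
    .filter(pl.col("amount") > 100)
    .select(["category", "amount"])
    .collect()
)
```

The point of both snippets is the same: columnar formats like Parquet let the engine read only the columns and row groups it needs, which is why format behaviour matters for roles like this one.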
and Lambda. IAM - experience handling IAM resource permissions. Networking - fundamental understanding of VPC, subnet routing and gateways. Storage - strong understanding of S3, EBS and Parquet. Databases - RDS, DynamoDB. Experience doing cost estimation in Cost Explorer and planning efficiency changes. Terraform and containerisation experience. Understanding of a broad range of …
particularly within the Kafka ecosystem. What Gives You an Edge: Extensive experience with modern data architectures, including Data Warehouses, Lakehouses, data formats (e.g., Avro, Parquet), and cloud-native platforms (AWS, GCP, Azure). Expertise in integrating Kafka-based solutions with cloud services and enterprise data ecosystems. Demonstrated success designing …
and delivering customer proposals aligned with Analytics Solutions. Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, Zookeeper, HCatalog, Solr, Avro, Parquet, Iceberg, Hudi). Experience developing software and data engineering code in one or more programming languages (Java, Python, PySpark, Node, etc.). AWS and …
solid decision-making and documentation abilities. Desirable: background in finance or trading, or a strong interest in the domain. Familiarity with AWS infrastructure (Athena, Parquet, Lakehouse architecture, etc.). Experience with Postgres, ClickHouse, and Apache NiFi. Knowledge of containerization and building Docker-based solutions. Experience with monitoring tools like …
/ML to extract, format, and expose in indexed search tools relevant content such as raw text, multimedia (audio, image, video, document), tabular (CSV, Parquet, Avro) or nested (JSON, JSONL, XML), and other structured/unstructured data types. Data is expected to be of varying formats, schemas, and structures.
Leatherhead, South East England, United Kingdom Hybrid / WFH Options
JCW
Logic Apps, ADF, Service Bus, Functions). Comfortable working with Git, Azure DevOps, and unit testing practices. Knowledge of common data formats: CSV, JSON, XML, Parquet. Ability to lead integration designs with minimal rework required. 🧾 Preferred Qualifications 🎓 Certification in SSIS or relevant Microsoft technologies 💡 Proven track record of delivering robust …
Pipelines: Design and optimize data pipelines using Azure Data Factory and Databricks. Medallion Architecture: Implement Bronze, Silver, and Gold layers using formats like Delta, Parquet, and JSON for data transformation. Data Modeling: Develop and optimize data models using star schema and slowly changing dimensions for analytics and operations. Data … Proficiency with Azure Data Factory, Databricks and Azure Storage. Strong skills in SQL, Python, and data modeling techniques. Familiarity with data formats like Parquet and JSON. Experience with AI/ML model management on Azure Databricks. Education: Bachelor's degree in IT, Computer Science, or a related …
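The Medallion architecture named in the listing above layers a lakehouse into raw (Bronze), cleaned (Silver), and analytics-ready (Gold) tables. As a rough sketch only, here is what the Bronze-to-Silver step might look like in PySpark with Delta on Databricks; the paths, column names, and source format are hypothetical, not taken from the listing:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks a session already exists

# Bronze: land the raw JSON as-is, keeping only ingestion metadata
raw = spark.read.json("/mnt/landing/orders/")
(raw.withColumn("_ingested_at", F.current_timestamp())
    .write.format("delta").mode("append").save("/mnt/bronze/orders"))

# Silver: clean and conform the Bronze data (typed columns, deduplication, null filtering)
bronze = spark.read.format("delta").load("/mnt/bronze/orders")
silver = (bronze
          .dropDuplicates(["order_id"])
          .withColumn("order_ts", F.to_timestamp("order_ts"))
          .filter(F.col("order_id").isNotNull()))
silver.write.format("delta").mode("overwrite").save("/mnt/silver/orders")
```

Gold tables would typically be built the same way, aggregating Silver data into the star-schema fact and dimension tables the listing mentions.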
ETL workflows and cloud data warehouse development. Experience participating in RFI/RFP processes and crafting technical proposals. Strong familiarity with data formats (JSON, Parquet, etc.) and RDBMS. Eligible for SC clearance (public sector experience is a strong plus). Nice-to-Have: Experience with big data tools (e.g. …
Factory, Databricks, and Apache Spark to handle large-scale, real-time data. Medallion Architecture: Implement Bronze, Silver, and Gold layers using formats like Delta, Parquet, and JSON for data transformation. Data Modeling: Develop and optimize data models using star schema and slowly changing dimensions for analytics and operations. Data … Azure Data Factory, Databricks, Synapse Analytics, and Azure Storage. Strong skills in SQL, Python, and data modeling techniques. Familiarity with data formats like Parquet and JSON. Experience with AI/ML model management on Azure Databricks. Education: Bachelor's degree in IT, Computer Science, or a related …
Functions, Datastore, and Cloud Spanner. Experience with message queues (e.g., RabbitMQ) and event-driven patterns. Hands-on experience with data serialization formats (e.g., Avro, Parquet, JSON) and schema registries. Strong understanding of DevOps and CI/CD pipelines for data streaming solutions. Familiarity with containerization and orchestration tools. Excellent …
and others. • Proficiency in technologies in the Apache Hadoop ecosystem, especially Hive, Impala and Ranger • Experience working with open file and table formats such as Parquet, Avro, ORC, Iceberg and Delta Lake • Extensive knowledge of automation and software development tools and methodologies. • Excellent working knowledge of Linux. Good working networking …
London, South East England, United Kingdom Hybrid / WFH Options
Arthur Recruitment
Market Insurance industry experience. Using data modelling tools such as Erwin. Experience with modelling methodologies including Kimball etc. Usage of Data Lake formats such as Parquet and Delta Lake. Strong SQL skills. Rate: £600 - £700 P/D Outside IR35. Contract Duration: 6 months. Location: London/WFH hybrid. Start …
to write clear, documented code. • Experience with Python and C code • Experience with storing, retrieving and processing data in scientific databases (SQL, Pandas, HDF5, Parquet) • Good reporting skills • Fluent in written and spoken English (junior or senior profiles can apply for the position) Desired Skills • Statistical modelling and error …
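For the storage formats listed in that item, a minimal Pandas sketch gives a sense of what storing and retrieving tabular scientific data looks like; the DataFrame contents and file names below are made up for illustration:

```python
import pandas as pd

# Hypothetical measurement table
df = pd.DataFrame({"sensor": ["a", "a", "b"], "value": [0.12, 0.15, 0.98]})

# Columnar Parquet file (needs the pyarrow or fastparquet package installed)
df.to_parquet("measurements.parquet", index=False)
back_parquet = pd.read_parquet("measurements.parquet")

# HDF5 store (needs the 'tables' package); often used for large numeric datasets
df.to_hdf("measurements.h5", key="measurements", mode="w", format="table")
back_hdf = pd.read_hdf("measurements.h5", key="measurements")
```

The same DataFrame could equally be written to a SQL database with `df.to_sql`, which is the other storage route the listing names.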
multi-threading, concurrency, etc. Fluency in C++ and/or Java. Experience working with text or semi-structured data (e.g. JSON, XML, ORC, Avro, Parquet). BS in Computer Science or a related field; Masters or PhD preferred. Snowflake is growing fast, and we're scaling our team …
We are looking for a Data Engineer to join our growing data engineering team at Our Future Health. The Data Engineer will bring an in-depth knowledge of NHS data and data solutions to help solve some of the key …
Integration Engineer (SSIS) - Leatherhead (Hybrid) - £40,000-£50,000 My client is a UK-based construction company. They are searching for an experienced Integration Engineer to support the Integration Manager in managing, maintaining, and developing our application integration solutions. This …