…and Lambda
• IAM - Experience handling IAM resource permissions
• Networking - fundamental understanding of VPC, subnet routing and gateways
• Storage - strong understanding of S3, EBS and Parquet
• Databases - RDS, DynamoDB
• Experience doing cost estimation in Cost Explorer and planning efficiency changes
• Terraform and containerisation experience
• Understanding of a broad range of …
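For context on the Cost Explorer work this listing mentions, a minimal sketch of pulling month-by-month spend per service with boto3; the date range is a hypothetical example and AWS credentials are assumed to be configured:

```python
import boto3

# Cost Explorer client; assumes AWS credentials are already configured.
ce = boto3.client("ce")

# Monthly unblended cost, grouped by service (dates are hypothetical).
resp = ce.get_cost_and_usage(
    TimePeriod={"Start": "2024-01-01", "End": "2024-04-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

for period in resp["ResultsByTime"]:
    for group in period["Groups"]:
        service = group["Keys"][0]
        amount = group["Metrics"]["UnblendedCost"]["Amount"]
        print(period["TimePeriod"]["Start"], service, amount)
```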
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
RVU Co UK
…e.g. DuckDB, Polars, Daft). Familiarity with eventing technologies (Event Hubs, Kafka, etc.). Deep understanding of file formats and their behaviour, such as Parquet, Delta and Iceberg. What we offer: We want to give you a great work environment and contribute back to both your personal and professional development …
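For context on the single-node columnar engines this listing names, a minimal sketch of querying a Parquet file with DuckDB and Polars; the file path and column names are hypothetical:

```python
import duckdb
import polars as pl

# Query a Parquet file directly with DuckDB's in-process SQL engine
# (events.parquet and its columns are hypothetical examples).
top_users = duckdb.sql(
    "SELECT user_id, count(*) AS n "
    "FROM 'events.parquet' GROUP BY user_id ORDER BY n DESC LIMIT 10"
).df()

# The same aggregation with Polars' lazy API, which scans the file
# rather than loading it fully into memory.
lazy = (
    pl.scan_parquet("events.parquet")
    .group_by("user_id")
    .agg(pl.len().alias("n"))
    .sort("n", descending=True)
    .limit(10)
)
print(lazy.collect())
```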
…Java. Experience with full-lifecycle agile software development projects. Desired skills: Experience with Python. Experience building data products in Apache Avro and/or Parquet. On-the-job experience with Java software development. Experience deploying the complete DevOps lifecycle, including integration of build pipelines, automated deployments, and compliance scanning …
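As an illustration of the two formats named above, a minimal sketch of writing the same records to Avro and Parquet in Python, assuming the fastavro and pyarrow packages; the schema and field names are hypothetical:

```python
import pyarrow as pa
import pyarrow.parquet as pq
from fastavro import parse_schema, writer

records = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]  # hypothetical data

# Avro: row-oriented, and the schema travels with the file.
schema = parse_schema({
    "name": "Record",
    "type": "record",
    "fields": [
        {"name": "id", "type": "long"},
        {"name": "name", "type": "string"},
    ],
})
with open("records.avro", "wb") as fo:
    writer(fo, schema, records)

# Parquet: columnar, suited to analytical scans.
pq.write_table(pa.Table.from_pylist(records), "records.parquet")
```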
…new technologies and frameworks. Nice to have: Knowledge of databases, SQL. Familiarity with Boost ASIO. Familiarity with data serialization formats such as Apache Arrow/Parquet, Google Protocol Buffers, Flatbuffers Contra. Experience with gRPC, HTTP/REST and WebSocket protocols. Experience with Google Cloud/AWS and/or containerization …
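To ground the serialization formats mentioned, a minimal sketch of round-tripping a table through the Arrow IPC stream format in Python (pyarrow assumed; the table contents are hypothetical):

```python
import pyarrow as pa

# Build a small in-memory table (hypothetical data).
table = pa.table({"symbol": ["AAA", "BBB"], "price": [1.25, 3.50]})

# Serialize to the Arrow IPC stream format -- the wire representation
# used when shipping Arrow data between processes.
sink = pa.BufferOutputStream()
with pa.ipc.new_stream(sink, table.schema) as w:
    w.write_table(table)
buf = sink.getvalue()

# Deserialize on the receiving side.
received = pa.ipc.open_stream(buf).read_all()
assert received.equals(table)
```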
…/data structures in resilient SQL and NoSQL databases (e.g. CockroachDB). Experience with gRPC. Knowledge of data serialisation formats (e.g. Google Protocol Buffers and Parquet). Experience with caching technologies, e.g. Redis. Experience with Infrastructure as Code software, e.g. Terraform. Good knowledge of the Unix/Linux operating system and …
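For the caching requirement, a minimal sketch of the read-through cache pattern with Redis in Python, assuming the redis-py package; the key scheme, TTL and database loader are hypothetical:

```python
import json
import redis

r = redis.Redis(host="localhost", port=6379)  # hypothetical local instance

def load_user_from_db(user_id: int) -> dict:
    # Placeholder for the real database query.
    return {"id": user_id, "name": "example"}

def get_user(user_id: int) -> dict:
    """Read-through cache: serve from Redis, fall back to the database."""
    key = f"user:{user_id}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)
    user = load_user_from_db(user_id)
    r.setex(key, 300, json.dumps(user))  # cache for 5 minutes
    return user
```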
…/ML to extract, format, and expose relevant content in indexed search tools, such as raw text, multimedia (audio, image, video, document), tabular (CSV, Parquet, Avro) or nested (JSON, JSONL, XML) data, and other structured/unstructured data types. Data is expected to be of varying formats, schemas, and structures. …
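As a small illustration of handling the varying tabular and nested formats named above, a sketch of normalising them into one frame before indexing, assuming pandas with pyarrow; the file names and shared fields are hypothetical:

```python
import pandas as pd

# Hypothetical inputs in three of the formats named above.
frames = [
    pd.read_csv("records.csv"),
    pd.read_parquet("records.parquet"),          # uses pyarrow under the hood
    pd.read_json("records.jsonl", lines=True),   # one JSON object per line
]

# Align on a common schema before indexing: same columns, same order.
common = ["id", "text"]  # hypothetical shared fields
unified = pd.concat([f[common] for f in frames], ignore_index=True)
print(unified.dtypes)
```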
Leatherhead, South East England, United Kingdom Hybrid / WFH Options
JCW
…Logic Apps, ADF, Service Bus, Functions). Comfortable working with Git, Azure DevOps, and unit testing practices. Knowledge of common data formats: CSV, JSON, XML, Parquet. Ability to lead integration designs with minimal rework required. 🧾 Preferred Qualifications 🎓 Certification in SSIS or relevant Microsoft technologies 💡 Proven track record of delivering robust …
Guildford, South East England, United Kingdom Hybrid / WFH Options
JCW
…Logic Apps, ADF, Service Bus, Functions). Comfortable working with Git, Azure DevOps, and unit testing practices. Knowledge of common data formats: CSV, JSON, XML, Parquet. Ability to lead integration designs with minimal rework required. 🧾 Preferred Qualifications 🎓 Certification in SSIS or relevant Microsoft technologies 💡 Proven track record of delivering robust …
…Pipelines: Design and optimize data pipelines using Azure Data Factory and Databricks.
• Medallion Architecture: Implement Bronze, Silver, and Gold layers using formats like Delta, Parquet, and JSON for data transformation.
• Data Modeling: Develop and optimize data models using star schema and slowly changing dimensions for analytics and operations.
• Data …
Proficiency with Azure Data Factory, Databricks and Azure Storage. Strong skills in SQL, Python, and data modeling techniques. Familiarity with data formats like Parquet and JSON. Experience with AI/ML model management on Azure Databricks. Education: Bachelor's degree in IT, Computer Science, or a related …
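For context on the medallion layout this listing describes, a minimal sketch of one Bronze-to-Silver step in PySpark, assuming the Delta Lake package is configured on the cluster; the lake paths and columns are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: raw JSON landed as-is (hypothetical path).
bronze = spark.read.json("/lake/bronze/orders/")

# Silver: deduplicated, typed, and filtered, written out as Delta.
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("amount") > 0)
)
silver.write.format("delta").mode("overwrite").save("/lake/silver/orders/")
```

The same pattern repeats for the Gold layer, which would aggregate Silver tables into the star-schema models the listing mentions.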
…store data effectively. The ideal candidate will have experience with Postgres (UDFs, SQL Triggers), Spark, Ray, Elasticsearch, Python, AWS S3, and structured data formats (Parquet, Avro, JSON Schema). This hybrid role requires candidates to be local to Maryland, Northern Virginia, or the DC Metro area for occasional on… including UDFs and Triggers. Expertise in ETL processes, data modeling, and data transformation. Proficiency in working with structured and semi-structured data formats (Parquet, Avro, JSON Schema) along with various schema description formats. Skilled in Python for scripting and data manipulation. Familiarity with other languages such as Java and …
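To illustrate the Postgres UDF-plus-trigger pattern this listing names, a minimal sketch issued from Python via psycopg2; the connection string, table and function names are hypothetical:

```python
import psycopg2

# Hypothetical connection; a trigger keeps an updated_at column current.
conn = psycopg2.connect("dbname=demo user=demo")
with conn, conn.cursor() as cur:
    cur.execute("""
        CREATE TABLE IF NOT EXISTS items (
            id serial PRIMARY KEY,
            name text,
            updated_at timestamptz DEFAULT now()
        );

        -- PL/pgSQL UDF invoked by the trigger below.
        CREATE OR REPLACE FUNCTION touch_updated_at() RETURNS trigger AS $$
        BEGIN
            NEW.updated_at := now();
            RETURN NEW;
        END;
        $$ LANGUAGE plpgsql;

        DROP TRIGGER IF EXISTS items_touch ON items;
        CREATE TRIGGER items_touch
            BEFORE UPDATE ON items
            FOR EACH ROW EXECUTE FUNCTION touch_updated_at();
    """)
```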
…and others.
• Proficiency in technologies in the Apache Hadoop ecosystem, especially Hive, Impala and Ranger
• Experience working with open file and table formats such as Parquet, Avro, ORC, Iceberg and Delta Lake
• Extensive knowledge of automation and software development tools and methodologies
• Excellent working knowledge of Linux. Good working networking …
Useful: Experience engineering systems for the cloud-based storage and processing of large datasets or using frameworks such as Spark and open standards like Parquet and Arrow. Experience working with other cloud and related technologies, such as Docker and Kubernetes. Experience working as a software developer in the quantitative …
…to write clear, documented code.
• Experience with Python and C code
• Experience with storing, retrieving and processing data in scientific databases (SQL, Pandas, HDF5, Parquet)
• Good reporting skills
• Fluent in written and spoken English (junior or senior profiles can apply for the position)
Desired Skills
• Statistical modelling and error …
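For the scientific-storage bullet above, a minimal sketch of round-tripping a dataset through HDF5 and Parquet with pandas (the PyTables package is assumed for HDF5 support; the measurement data is hypothetical):

```python
import numpy as np
import pandas as pd

# Hypothetical measurement data.
df = pd.DataFrame({
    "t": np.arange(0.0, 1.0, 0.1),
    "signal": np.random.default_rng(0).normal(size=10),
})

# HDF5: hierarchical store, common for mixed scientific payloads
# (requires the PyTables package).
df.to_hdf("run.h5", key="measurements", mode="w")
back_h5 = pd.read_hdf("run.h5", key="measurements")

# Parquet: columnar, compressed, interoperable across engines.
df.to_parquet("run.parquet")
back_pq = pd.read_parquet("run.parquet")

assert back_h5.equals(df) and back_pq.equals(df)
```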
We are looking for a Data Engineer to join our growing data engineering team at Our Future Health. The Data Engineer will bring an in-depth knowledge of NHS data and data solutions to help solve some of the key …