Looker, etc.) Interest or experience in building internal data communities or enablement programs Working with diverse data sources (APIs, CRMs, SFTP, databases) and formats (Parquet, JSON, XML, CSV) Exposure to machine learning models or AI agents Why Join Us Help shape the future of data in an organization that …
Herndon, Virginia, United States Hybrid / WFH Options
Maxar Technologies Holdings Inc
with Python. Demonstrated experience building & orchestrating automated, production-level data pipelines and solutions (ETL/ELT). Experience with file-based data storage, including Parquet or Iceberg. Experience with data catalogs (e.g. Hive, AWS Glue). General understanding of key AWS services (e.g. EC2, S3, EKS, IAM, Lambda). …
serverless technologies*: Lambda, Glue, SQS, Step Functions, API Gateway, CloudFormation, S3, etc. * Expertise in *data integration* and working with varied data formats (JSON, XML, Parquet, delimited, etc.). * Experience with *cloud data warehouses* (Redshift, Azure, Snowflake). * Strong knowledge of *NoSQL* (e.g., MongoDB) and *Relational* databases (e.g., Oracle). …
processing applications, secure data access tools. Experience integrating data-driven applications with different data sources, for example: SQL Databases Document Databases (MongoDB, CosmosDB, etc.) Parquet Experience of taking different business applications and use cases and supporting their needs (query patterns, etc.) within appropriate data solutions, whilst maintaining data integrity …
data, etc. Personal skills and experience Solid experience with Python. Able to propose and design Big Data ETLs. Knowledge of Spark, Hadoop, etc. using the Parquet file format Mastery of SQL queries and data models Hands-on AWS experience, with a focus on data & analytics Infrastructure automation for both cloud-based …
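The listing above asks for Python-based Big Data ETL design. A minimal sketch of the extract-transform-load flow it describes, in pure Python — in a real pipeline the extract step would read Parquet via Spark or pyarrow, and all field names (user_id, amount, country) are illustrative assumptions:

```python
# Minimal ETL sketch. In production, extract() would be something like
# spark.read.parquet("s3://bucket/events/"); an in-memory list stands in here.
# Field names are invented for illustration.

def extract():
    # Stand-in for reading raw Parquet files from a data lake.
    return [
        {"user_id": 1, "amount": "12.50", "country": "GB"},
        {"user_id": 2, "amount": "7.00", "country": "US"},
        {"user_id": 1, "amount": "3.25", "country": "GB"},
    ]

def transform(rows):
    # Cast string amounts to floats and aggregate spend per user,
    # a typical transform step.
    totals = {}
    for row in rows:
        totals[row["user_id"]] = totals.get(row["user_id"], 0.0) + float(row["amount"])
    return totals

def load(totals, sink):
    # Stand-in for writing results to a warehouse table.
    sink.update(totals)

warehouse = {}
load(transform(extract()), warehouse)
print(warehouse)  # {1: 15.75, 2: 7.0}
```

The same three-stage shape carries over directly to Spark: the lists become DataFrames and the aggregation becomes a `groupBy`.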
and automation. Proficiency in building and maintaining batch and streaming ETL/ELT pipelines at scale, employing tools such as Airflow, Fivetran, Kafka, Iceberg, Parquet, Spark, Glue for developing end-to-end data orchestration, leveraging AWS services to ingest, transform and process large volumes of structured and unstructured …
Starburst and Athena Kafka and Kinesis DataHub ML Flow and Airflow Docker and Terraform Kafka, Spark, Kafka Streams and KSQL DBT AWS, S3, Iceberg, Parquet, Glue and EMR for our Data Lake Elasticsearch and DynamoDB More information: Enjoy fantastic perks like private healthcare & dental insurance, a generous work from …
Experience in data modelling and design patterns; in-depth knowledge of relational databases (PostgreSQL) and familiarity with data lakehouse formats (storage formats, e.g. Apache Parquet, Delta tables). Experience with Spark, Databricks, data lakes/lakehouses. Experience working with external data suppliers (defining requirements for suppliers, defining Service Level …
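The lakehouse formats named above (Parquet, Delta) store data column-wise rather than row-wise, which is what makes analytical scans and compression efficient. A conceptual pure-Python sketch of that row-to-columnar pivot — not the actual Parquet encoding, which adds row groups, encodings, compression, and footer metadata:

```python
# Illustrates the row-to-columnar pivot at the heart of formats like Parquet.
# Conceptual sketch only; real Parquet adds row groups, encodings,
# compression, and footer metadata.

def to_columnar(rows):
    # Pivot a list of uniform dicts into one list per column.
    columns = {key: [] for key in rows[0]}
    for row in rows:
        for key, value in row.items():
            columns[key].append(value)
    return columns

rows = [
    {"id": 1, "city": "London"},
    {"id": 2, "city": "Cardiff"},
    {"id": 3, "city": "London"},
]
cols = to_columnar(rows)
# A query touching only "city" now reads one contiguous list
# instead of scanning every full row.
print(cols["city"])  # ['London', 'Cardiff', 'London']
```

That per-column layout is also why low-cardinality columns (like `city` here) compress so well under dictionary encoding.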
Good understanding of cloud environments (ideally Azure), distributed computing and scaling workflows and pipelines Understanding of common data transformation and storage formats, e.g. Apache Parquet Awareness of data standards such as GA4GH and FAIR. Exposure to genotyping and imputation is highly advantageous Benefits: Competitive base salary Generous Pension …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
RVU Co UK
e.g. duckdb, polars, daft). Familiarity with eventing technologies (Event Hubs, Kafka, etc.). Deep understanding of file formats and their behaviour, such as Parquet, Delta and Iceberg. What we offer We want to give you a great work environment; contribute back to both your personal and professional development …
solid decision-making and documentation abilities Desirable: Background in finance or trading, or a strong interest in the domain. Familiarity with AWS infrastructure (Athena, Parquet, Lakehouse architecture, etc.). Experience with Postgres, ClickHouse, and Apache NiFi. Knowledge of containerization and building Docker-based solutions. Experience with monitoring tools like …
leatherhead, south east england, United Kingdom Hybrid / WFH Options
JCW
Logic Apps, ADF, Service Bus, Functions) Comfortable working with Git , Azure DevOps , and unit testing practices Knowledge of common data formats: CSV, JSON, XML, Parquet Ability to lead integration designs with minimal rework required 🧾 Preferred Qualifications 🎓 Certification in SSIS or relevant Microsoft technologies 💡 Proven track record of delivering robust …
Pipelines: Design and optimize data pipelines using Azure Data Factory and Databricks Medallion Architecture: Implement Bronze, Silver, and Gold layers using formats like Delta, Parquet, and JSON for data transformation. Data Modeling: Develop and optimize data models using star schema and slowly changing dimensions for analytics and operations. Data … Proficiency with Azure Data Factory, Databricks and Azure Storage. Strong skills in SQL, Python, and data modeling techniques. Familiarity with data formats like Parquet and JSON. Experience with AI/ML model management on Azure Databricks. Education: Bachelor's degree in IT, Computer Science, or a related …
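The Bronze/Silver/Gold split described above can be sketched as three successive transformations. A conceptual pure-Python version — in Databricks each layer would be a Delta table rather than a Python list, and all field names here are invented for illustration:

```python
# Medallion architecture sketch: Bronze holds raw ingested records,
# Silver holds validated/typed records, Gold holds business aggregates.
# In Databricks each layer would be a Delta table; field names are invented.

bronze = [  # raw ingest, as-landed (note the unparseable record)
    {"order_id": "1", "total": "10.0", "region": "north"},
    {"order_id": "2", "total": "oops", "region": "north"},
    {"order_id": "3", "total": "5.5", "region": "south"},
]

def to_silver(records):
    # Validate and type-cast; drop records that fail parsing.
    out = []
    for r in records:
        try:
            out.append({"order_id": int(r["order_id"]),
                        "total": float(r["total"]),
                        "region": r["region"]})
        except ValueError:
            continue  # a real pipeline would quarantine these for review
    return out

def to_gold(records):
    # Business-level aggregate: revenue per region.
    agg = {}
    for r in records:
        agg[r["region"]] = agg.get(r["region"], 0.0) + r["total"]
    return agg

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'north': 10.0, 'south': 5.5}
```

The key property the layering buys is that each stage only ever reads the one before it, so bad raw data is contained at the Bronze-to-Silver boundary.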
Factory, Databricks, and Apache Spark to handle large-scale, real-time data. Medallion Architecture: Implement Bronze, Silver, and Gold layers using formats like Delta, Parquet, and JSON for data transformation. Data Modeling: Develop and optimize data models using star schema and slowly changing dimensions for analytics and operations. Data … Azure Data Factory, Databricks, Synapse Analytics, and Azure Storage. Strong skills in SQL, Python, and data modeling techniques. Familiarity with data formats like Parquet and JSON. Experience with AI/ML model management on Azure Databricks. Education: Bachelor's degree in IT, Computer Science, or a related …
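The "slowly changing dimensions" mentioned above usually means Type 2: instead of overwriting a dimension attribute, the current row is closed out and a new current row is appended, preserving history. A pure-Python sketch with a hypothetical customer dimension (column names and dates are illustrative):

```python
# Type 2 slowly changing dimension sketch: close the open row, append a
# new current one. In a warehouse this would be a MERGE against a Delta
# or SQL table; the layout and column names here are illustrative.

def scd2_update(dim, key, new_attrs, effective_date):
    # Close the currently-open row for this key, if any.
    for row in dim:
        if row["key"] == key and row["end_date"] is None:
            row["end_date"] = effective_date
    # Append the new current row (end_date=None marks "current").
    dim.append({"key": key, **new_attrs,
                "start_date": effective_date, "end_date": None})

dim = [{"key": "C1", "city": "Leeds",
        "start_date": "2023-01-01", "end_date": None}]
scd2_update(dim, "C1", {"city": "York"}, "2024-06-01")

current = [r for r in dim if r["end_date"] is None]
print(len(dim), current[0]["city"])  # 2 York
```

Facts joined to this dimension before 2024-06-01 still resolve to the Leeds row, which is the point of keeping history.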
london, south east england, United Kingdom Hybrid / WFH Options
Arthur Recruitment
Market Insurance industry experience Using Data Modelling such as Erwin Experience with modelling methodologies including Kimball, etc. Usage of Data Lake formats such as Parquet and Delta Lake Strong SQL skills Rate: £600 - £700 P/D Outside IR35 Contract Duration: 6 months Location: London/WFH hybrid Start …
We are looking for a Data Engineer to join our growing data engineering team at Our Future Health. The Data Engineer will bring an in-depth knowledge of NHS data and data solutions to help solve some of the key …