as part of the development life cycle. Nice to have: Databricks, Snowflake. Good to have: experience with diverse data sources and data formats (XML, JSON, YAML, Parquet, Avro, Delta) and their respective use cases. Good to have: knowledge of version control systems such as Git, and of CI/CD workflows and practices.
Experience working with continuous integration. Good understanding of data modelling (medallion and Kimball architectures), data schemas (Avro Schema, JSON Schema) and data serialisation (Parquet, ORC, Avro). Azure services knowledge: Azure Data Factory: advanced knowledge of Azure Data Factory for orchestrating and automating data workflows. Azure SQL Database: expertise …
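The schema skills listed above (Avro Schema, JSON Schema) can be illustrated with a minimal sketch: a JSON Schema-style document for a hypothetical "order event" record, checked by a hand-rolled validator. This uses only the standard library; a real pipeline would use a library such as jsonschema, and the field names here are invented for illustration.

```python
import json

# A minimal JSON Schema-style definition for a hypothetical order record.
ORDER_SCHEMA = {
    "type": "object",
    "required": ["order_id", "amount"],
    "properties": {
        "order_id": {"type": "string"},
        "amount": {"type": "number"},
        "currency": {"type": "string"},
    },
}

# Map JSON Schema primitive type names to Python types (subset, for the sketch).
TYPE_MAP = {"string": str, "number": (int, float), "object": dict}

def validate(record: dict, schema: dict) -> list[str]:
    """Tiny illustrative validator: required keys and primitive types only."""
    errors = []
    for key in schema.get("required", []):
        if key not in record:
            errors.append(f"missing required field: {key}")
    for key, rule in schema.get("properties", {}).items():
        if key in record and not isinstance(record[key], TYPE_MAP[rule["type"]]):
            errors.append(f"wrong type for {key}")
    return errors

good = json.loads('{"order_id": "A-1", "amount": 9.99, "currency": "GBP"}')
bad = json.loads('{"amount": "nine"}')
print(validate(good, ORDER_SCHEMA))  # []
print(validate(bad, ORDER_SCHEMA))   # missing order_id, wrong type for amount
```

The same record could then be serialised to Parquet, ORC, or Avro; the schema contract is what keeps those files consistent across producers.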
years) Experience in: data orchestration technologies, specifically Airflow and/or dbt. Experience with streaming data architectures, specifically Kafka. Knowledge of semi-structured data: Parquet, Avro, JSON. A deep understanding of AWS cloud data technologies (RDS, DynamoDB, Aurora). Knowledge and experience with Snowflake and other databases (PostgreSQL, MS SQL Server, …)
Ensure systems meet business requirements and industry practices for data integrity and quality. Manage ETL and ELT pipelines across many data sources (CSV/Parquet files, API endpoints, etc.). Design and build data models for business end users. Write complex SQL queries for standard as well as ad hoc …
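A pipeline spanning mixed sources, as in the listing above, can be sketched with the standard library alone: one extractor per source feeding a common transform. The column names and payloads are invented for illustration; a production job would use real connectors and an orchestration framework.

```python
import csv
import io
import json

# Hypothetical CSV export and API JSON payload standing in for real sources.
CSV_SOURCE = "customer_id,spend\n1,10.50\n2,3.25\n"
API_SOURCE = '[{"customer_id": 2, "spend": 7.75}, {"customer_id": 3, "spend": 1.00}]'

def extract_csv(text: str):
    """Yield normalised records from a CSV source."""
    for row in csv.DictReader(io.StringIO(text)):
        yield {"customer_id": int(row["customer_id"]), "spend": float(row["spend"])}

def extract_api(payload: str):
    """Yield normalised records from a JSON API response."""
    for row in json.loads(payload):
        yield {"customer_id": int(row["customer_id"]), "spend": float(row["spend"])}

def transform(records):
    """Aggregate spend per customer across all sources (a simple conformance step)."""
    totals: dict[int, float] = {}
    for rec in records:
        totals[rec["customer_id"]] = totals.get(rec["customer_id"], 0.0) + rec["spend"]
    return totals

records = list(extract_csv(CSV_SOURCE)) + list(extract_api(API_SOURCE))
totals = transform(records)
print(totals)  # customer 2 appears in both sources and is summed once
```

The key design point is that every extractor emits the same normalised record shape, so downstream transforms never need to know which source a row came from.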
BI tools (Tableau, Power BI, etc.). Expertise with federated query tools such as Presto/Trino. Experience with data lake file formats such as Avro, Parquet, ORC. Experience in solution integration and operability. Experience working with infrastructure technologies and teams. Experience using ServiceNow or similar.
BI tools (Tableau, Power BI, etc.). Expertise with federated query tools such as Presto/Trino. Experience with data lake file formats such as Avro, Parquet, ORC. [Required] Experience in extracting and developing technical requirements from business goals and needs. [Required] Experience in solution integration and operability. [Required] Experience working …
NoSQL databases. BI tools (Tableau, Power BI, etc.). Federated query tools such as Presto/Trino. Data lake file formats such as Avro, Parquet, ORC. Responsibilities: Design the data architecture of the organization to support a data-driven vision. Create the design and blueprint of the data capabilities for the organization …
self-service ad hoc analysis. * Develop and manage data pipelines using Azure Synapse Analytics or Azure Data Factory. * Work with columnar storage formats such as Parquet and Delta to optimize data storage and retrieval. * Design, develop, and maintain Power BI reports and dashboards to meet business needs. * Implement DevOps … (very important!), with the ability to collaborate effectively with cross-functional teams and customers. Other skills: * Experience or knowledge of columnar storage formats, especially Parquet and Delta. * Familiarity with DevOps practices, particularly source control using Git. * Strong analytical and problem-solving abilities. * Ability to prioritize and manage multiple tasks …
Employment Type: Permanent
Salary: £65000 - £70000/annum Hybrid, Health, Dental, Extra Hols
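The advantage of the columnar formats named in the listing above (Parquet, Delta) can be shown with a toy comparison: reading one field from a row-oriented layout touches every whole record, while a columnar layout lets a query scan only the column it needs. This is a pure-Python sketch with invented data; real formats add compression, encodings, and row groups on top of the same idea.

```python
# Row-oriented layout: each record stored together, as in CSV or JSON lines.
rows = [
    {"id": 1, "amount": 10.0, "country": "GB"},
    {"id": 2, "amount": 20.0, "country": "DE"},
    {"id": 3, "amount": 30.0, "country": "GB"},
]

# Columnar layout: each field stored contiguously, as in Parquet or ORC.
columns = {
    "id": [1, 2, 3],
    "amount": [10.0, 20.0, 30.0],
    "country": ["GB", "DE", "GB"],
}

# Summing one field from the row layout forces a pass over every full record...
row_total = sum(rec["amount"] for rec in rows)

# ...while the columnar layout reads only the "amount" column (column pruning).
col_total = sum(columns["amount"])

print(row_total, col_total)  # same answer; the columnar scan skipped two fields
```

This column-pruning property is why analytical engines such as Presto/Trino and Synapse read far less data from Parquet than from row-oriented files for typical aggregate queries.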
Entrepreneurial spirit and previous experience in early-stage startups. Experience with Scala. Experience with query processing optimization. Understanding of Big Data file formats (e.g. Parquet, ORC, Avro). Since we are a remote company, we are open to candidates from overseas. No recruitment agencies, please.
of data modelling (particularly star schema) and can produce, maintain, and update relevant data models for specific business needs, including bespoke serialization techniques (e.g. Parquet) and table formats (Delta, etc.). This is an incredibly exciting role that rarely becomes available. You will own a well-designed data lake serving …
Chicago, Illinois, United States Hybrid / WFH Options
Request Technology - Robyn Honquest
Data messaging design; data science; data analytics; Kafka and Protocol Buffers; SQL and NoSQL; Tableau; Power BI; Presto/Trino; data lakes; Avro, Parquet, ORC; infrastructure technologies; ServiceNow or similar. 10 years as a senior data architect/data engineer/DBA lead; logical and conceptual data models; data modelling … BI tools (Tableau, Power BI, etc.). Expertise with federated query tools such as Presto/Trino. Experience with data lake file formats such as Avro, Parquet, ORC. [Required] Experience in extracting and developing technical requirements from business goals and needs. [Required] Experience in solution integration and operability. [Required] Experience working …