as part of the development life cycle. Nice to have: Databricks, Snowflake. Good to have: experience with diverse data sources and data formats (XML, JSON, YAML, Parquet, Avro, Delta) and their respective use cases. Good to have: knowledge of version control systems such as Git, and of CI/CD workflows and practices.
of data modelling (particularly star schema) and can produce, maintain, and update relevant data models for specific business needs, including bespoke serialization techniques (e.g. Parquet) and table formats (e.g. Delta). This is an incredibly exciting role that rarely becomes available. You will own a well-designed data lake serving …
years). Experience in: data orchestration technologies, specifically Airflow and/or dbt. Experience with streaming data architectures, specifically Kafka. Knowledge of semi-structured data formats: Parquet, Avro, JSON. A deep understanding of AWS cloud data technologies (RDS, DynamoDB, Aurora). Knowledge and experience with Snowflake and other databases (PostgreSQL, MS SQL Server, …)
Ensure systems meet business requirements and industry practices for data integrity and quality. Manage ETL and ELT pipelines across many data sources (CSV/Parquet files, API endpoints, etc.). Design and build data models for the business end users. Write complex SQL queries for standard as well as ad-hoc …
Entrepreneurial spirit and previous experience in early-stage startups. Experience with Scala. Experience with query processing optimization. Understanding of Big Data file formats (e.g. Parquet, ORC, Avro). Since we are a remote company, we are open to candidates from overseas. No recruitment agencies, please.
Live. Experience working with continuous integration. Good understanding of data modelling (medallion architecture, Kimball architecture), data schemas (Avro Schema, JSON Schema) and data serialisation (Parquet, ORC, Avro). Azure Services Knowledge: Azure Data Factory: advanced knowledge of Azure Data Factory for orchestrating and automating data workflows. Azure SQL Database: expertise …
service ad-hoc analysis.
* Develop and manage data pipelines using Azure Synapse Analytics or Azure Data Factory.
* Work with columnar storage formats such as Parquet and Delta to optimize data storage and retrieval processes.
* Design, develop, and maintain Power BI reports and dashboards to meet business needs.
* Implement DevOps … very important!), with the ability to effectively collaborate with cross-functional teams and customers.
Other skills:
* Experience or knowledge of columnar storage formats, especially Parquet and Delta.
* Familiarity with DevOps practices, particularly source control using Git.
* Strong analytical and problem-solving abilities.
* Ability to prioritize and manage multiple tasks …
Employment Type: Permanent
Salary: £65000 - £70000/annum Hybrid, Health, Dental, Extra Hols