Ensure systems meet business requirements and industry practices for data integrity and quality. Manage ETL and ELT pipelines across many data sources (CSV/Parquet files, API endpoints, etc.). Design and build data models for business end users. Write complex SQL queries for standard as well as ad-hoc analysis.
* Develop and manage data pipelines using Azure Synapse Analytics or Azure Data Factory.
* Work with columnar storage formats such as Parquet and Delta to optimize data storage and retrieval.
* Design, develop, and maintain Power BI reports and dashboards to meet business needs.
* Implement DevOps … very important!), with the ability to collaborate effectively with cross-functional teams and customers.
Other skills:
* Experience with or knowledge of columnar storage formats, especially Parquet and Delta.
* Familiarity with DevOps practices, particularly source control using Git.
* Strong analytical and problem-solving abilities.
* Ability to prioritize and manage multiple tasks.
Employment Type: Permanent
Salary: £65000 - £70000/annum Hybrid, Health, Dental, Extra Hols
new technologies and frameworks
Nice to have:
Knowledge of databases, SQL
Familiarity with Boost ASIO
Familiarity with data serialization formats such as Apache Arrow/Parquet, Google Protocol Buffers, FlatBuffers
Experience with gRPC, HTTP/REST and WebSocket protocols
Experience with Google Cloud/AWS and/or containerization
of data modelling (particularly star schema) and can produce, maintain, and update relevant data models for specific business needs, including bespoke serialization techniques (e.g. Parquet) and table formats (Delta, etc.). This is an incredibly exciting role that rarely becomes available. You will own a well-designed data lake serving …
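The listing above emphasises star-schema data modelling. A minimal sketch of the pattern with pandas, where a central fact table joins out to small dimension tables (all table and column names here are hypothetical, not from the listing):

```python
# Hypothetical star schema: one fact table keyed to a dimension table.
import pandas as pd

# Dimension table: descriptive attributes, one row per entity.
dim_product = pd.DataFrame({
    "product_id": [1, 2],
    "product_name": ["Widget", "Gadget"],
})

# Fact table: narrow, numeric measures keyed by dimension IDs.
fact_sales = pd.DataFrame({
    "product_id": [1, 1, 2],
    "qty": [3, 1, 5],
})

# Analytical queries join facts to dimensions, then aggregate on
# dimension attributes.
report = (
    fact_sales.merge(dim_product, on="product_id")
              .groupby("product_name", as_index=False)["qty"].sum()
)
print(report)
```

Keeping measures in the fact table and attributes in dimensions is what lets BI tools such as Power BI slice the same facts by many different attributes.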