models in Snowflake to support analytics and reporting needs.
Architecture Implementation: Apply defined data architecture standards to ingestion, transformation, storage, and optimisation processes.
Pipeline Development: Develop robust ELT/ETL workflows using dbt and orchestration tools, ensuring reliability and maintainability.
Performance & Cost Optimisation: Configure Snowflake warehouses and implement query optimisation techniques for efficiency.
Data Quality & Governance: Apply data quality checks …
making. In this role, you will be responsible for:
Building and managing data pipelines using Azure Data Factory and related services.
Building and maintaining data lakes, data warehouses, and ETL/ELT processes.
Designing scalable data solutions and models for reporting in Power BI.
Supporting data migration from legacy systems into the new platform.
Ensuring data models are optimised for …
Banbury, Oxfordshire, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
into Features, Epics, and Stories and setting engineering standards.
Collaborate with the Data Architect to design and implement data architecture and build plans.
Build and maintain scalable data pipelines, ETL/ELT processes, and large-scale data workflows.
Optimise data systems for performance, reliability, and scalability.
Implement data quality processes and maintain data models, schemas, and documentation.
Operate CI/ …
Hook Norton, Oxfordshire, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
into Features, Epics, and Stories and setting engineering standards.
Collaborate with the Data Architect to design and implement data architecture and build plans.
Build and maintain scalable data pipelines, ETL/ELT processes, and large-scale data workflows.
Optimise data systems for performance, reliability, and scalability.
Implement data quality processes and maintain data models, schemas, and documentation.
Operate CI/ …
closely with the Data Architect to collaborate on the design of our data architecture and interpret it into a build plan.
Lead the build and maintenance of scalable data pipelines and ETL processes to support data integration and analytics from a diverse range of data sources: cloud storage, databases and APIs.
Deliver large-scale data processing workflows (ingestion, cleansing, transformation, validation, storage …
Proficiency in Python, with a strong grasp of Python best practices.
Experience using Git for version control and collaboration.
Experience using ML, NLP, and foundation models in ETL pipelines.
Knowledge of secure coding principles.
Familiarity with geospatial libraries such as GeoPandas, Shapely, and GDAL.
Knowledge of PostgreSQL/PostGIS for spatial data management.
Familiarity with machine learning frameworks …