implement error-handling strategies.
• Optional scripting skills for creating custom NiFi processors.

Programming & Data Technologies:
• Proficiency in Java and SQL.
• Experience with C# and Scala is a plus.
• Experience with ETL tools and big data platforms.
• Knowledge of data modeling, replication, and query optimization.
• Hands-on experience with SQL and NoSQL databases is desirable.
• Familiarity with data warehousing solutions (e.g., Snowflake …)
closely with the Data Architect to collaborate on the design of our data architecture and interpret it into a build plan
• Lead the build and maintenance of scalable data pipelines and ETL processes to support data integration and analytics from a diverse range of data sources, cloud storage, databases and APIs
• Deliver large-scale data processing workflows (ingestion, cleansing, transformation, validation, storage …)
Oxford, Oxfordshire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
forward-thinking team that values innovation, quality, and continuous improvement.

In this role, you will be responsible for:
• Designing and delivering data solutions using Microsoft technologies
• Building and maintaining ETL pipelines across complex data environments
• Collaborating with Architects, Engineers, and cross-functional teams to develop solutions that meet business needs
• Participating in Agile ceremonies and contributing to code reviews
• Creating …

To be successful in this role, you will have:
• Experience working on the Microsoft Platform, e.g. SQL Server, SSIS, SSRS
• Strong data warehousing and data modelling experience
• Experience creating ETL pipelines (primarily using SSIS)
• Experience working in regulated environments
• Power BI experience for reporting would be beneficial

Some of the package details include:
• Salary of up to £60,000
• Fully …
team) Outside IR35

Experienced Integration Engineer needed to help design and develop data integration solutions using Informatica iPaaS and a Snowflake data lakehouse.

You will:
• Design, develop and implement ELT & ETL pipelines and workflows using Informatica PowerCenter and Informatica Cloud.
• Build connectors from multiple sources including cloud platforms, bespoke and out-of-the-box APIs, databases, and flat files.
• Develop and … maintain data mappings, transformations, data pipelines and integration workflows.
• Implement real-time and batch data integration solutions to support business requirements.
• Ensure any integration pipelines or ELT/ETL workflows are built with error and exception handling built into the pipeline/workflow (see the sketch after this listing).

Your experience:
• Proven experience in a similar role so you can provide advice and guidance to the … team.
• Excellent experience in Informatica ELT & ETL development and integration solutions using Informatica as an iPaaS.
• SQL and relational databases (e.g., Oracle, SQL Server, PostgreSQL).
• Cloud-based data storage solutions such as Snowflake and data lakehouses.
• Experience with cloud platforms (e.g., AWS, Azure, GCP) and cloud-based data integration.
• Strong understanding of data warehousing concepts, dimensional modelling and medallion …
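As a rough illustration of the kind of "error and exception handling built into the pipeline/workflow" the listing above asks for, here is a minimal sketch in Python. The listing does not prescribe a language or framework, and every name here (the extract/transform/load helpers, "orders.csv", the column names) is invented purely for the example.

```python
# Hypothetical sketch: a small ETL step with error handling built in.
# File name, column names and helpers are placeholders, not from any listing.
import csv
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("etl_sketch")

def extract(path):
    """Read raw rows from a flat-file source."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(row):
    """Validate and convert one row; raises on malformed data."""
    return {"order_id": int(row["order_id"]), "amount": float(row["amount"])}

def run_pipeline(path):
    loaded, rejected = [], []
    try:
        rows = extract(path)
    except OSError as exc:  # source unavailable: log and fail the whole run
        logger.error("Extract failed for %s: %s", path, exc)
        raise
    for row in rows:
        try:
            loaded.append(transform(row))
        except (KeyError, TypeError, ValueError) as exc:
            # Bad record: quarantine it and keep the batch moving
            logger.warning("Rejected row %r: %s", row, exc)
            rejected.append(row)
    logger.info("Loaded %d rows, rejected %d", len(loaded), len(rejected))
    return loaded, rejected

if __name__ == "__main__":
    run_pipeline("orders.csv")  # placeholder path for the example
```

The design point is simply that failures are split into two classes: a source-level failure aborts the run loudly, while a bad record is logged and routed to a reject list rather than silently dropped or allowed to crash the batch.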