using modern data architectures, such as the lakehouse. Experience with CI/CD pipelines and version control systems like Git. Knowledge of ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Knowledge of data governance and best practices in data management. Familiarity with cloud platforms and services such as AWS, Azure, or GCP for deploying and managing data …
A proactive awareness of industry standards, regulations, and developments. Ideally, you'll also have: Experience with relational databases and data warehousing concepts. Experience with enterprise ETL tools such as Informatica, Talend, DataStage, or Alteryx. Project experience using any of the following technologies: Hadoop, Spark, Scala, Oracle, Pega, Salesforce. Cross- and multi-platform experience. Team building and leading. You …
using tools like ERwin, ER/Studio, or PowerDesigner, ensuring scalability, performance, and maintainability. ETL/ELT Frameworks: Design and build robust data pipelines with Cloud Composer, Dataproc, Dataflow, Informatica, or IBM DataStage, supporting both batch and streaming data ingestion. Data Governance & Quality: Implement data governance frameworks, metadata management, and data quality controls using Unity Catalog, Profisee, Alation, DQ …
from complex and disparate data sets and communicate clearly with stakeholders. Hands-on experience with cloud platforms such as AWS, Azure, or GCP. Familiarity with traditional ETL tools (e.g., Informatica, Talend, Pentaho, DataStage) and data warehousing concepts. Strong understanding of data security, compliance, and governance best practices. Experience leading or influencing cross-functional teams in a product or platform …
years' experience with reporting tools: Power BI, Business Objects, Tableau, or OBI. Understanding of the Master Data Management technology landscape, processes, and design principles. Minimum 3 years of experience with Informatica MDM or any other MDM tool (both customer and product domains). Understanding of established data management and reporting technologies, and some knowledge of columnar and NoSQL databases …
re Looking For: Deep Understanding of Data Management: Proven expertise in data quality, governance, security, and metadata management. Proficiency in Data Management Tools: Strong technical skills in tools like Informatica, Collibra, Talend, and Erwin. Data Modelling and Architecture: Ability to design and implement complex data models and architectures. Analytical and Problem-Solving Skills: Proficiency in data analysis, problem-solving …
creation. Strong delivery orientation, adaptability, and comfort with ambiguity. Multicultural awareness and professional integrity. Familiarity with enterprise architecture tools (e.g., LeanIX). Experience with data governance tools like Collibra, Informatica Axon/EDC. Knowledge of advanced data architecture concepts (e.g., data mesh, data fabric, domain-oriented design). Familiarity with data science and AI/ML platforms and their …
Required Qualifications: Bachelor's degree with at least 5 years of experience, or equivalent. In-depth knowledge and expertise in data engineering, including: Snowflake (data warehousing and performance tuning); Informatica (ETL/ELT development and orchestration) - nice to have; Python (data processing and scripting) - required; AWS (data services such as S3, Glue, Redshift, Lambda) - required; cloud data practices and …
using the Azure D&A stack, Databricks, and Azure OpenAI solutions. Proficiency in coding (Python, PL/SQL, shell scripting), relational and non-relational databases, ETL tooling (such as Informatica), and scalable data platforms. Proficiency in the Azure Data and Analytics stack; working knowledge of AWS and GCP data solutions. Good understanding of deploying AI solutions in Azure OpenAI, GCP …
Experience with one or more scripting languages (e.g., Python, KornShell). PREFERRED QUALIFICATIONS - Experience with big data technologies such as Hadoop, Hive, Spark, EMR - Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, DataStage, etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or …
knowledge of SQL and NoSQL databases (e.g., MongoDB, Cassandra). Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud). Experience with data integration and ETL tools (e.g., Talend, Informatica). Excellent analytical and technical skills. Excellent planning and organizational skills. Knowledge of all components of holistic enterprise architecture. What we offer: Colt is a growing business that is …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
the role) Advise senior client stakeholders (CDOs, CIOs, Heads of Data) on data strategy, governance, and platform modernisation. Design robust ETL/ELT frameworks with Azure Data Factory, SSIS, Informatica, or IBM DataStage. Lead data modelling using ERwin, ER/Studio, or PowerDesigner. Implement data governance and quality frameworks using Unity Catalog, Profisee, Alation, or similar platforms. Provide leadership …
S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions - Experience writing and optimizing SQL queries with large-scale, complex datasets - Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, DataStage, etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or …
tasks. Strong understanding of data design principles and dimensional data modeling. Advanced SQL skills and understanding of query optimization strategies. Preferred skills and experience across the following: ETL Tools – Informatica IICS, Unix/Linux Shell Scripting, SQL Server & Stored Procedures (SSMS), SSMA (SQL Server Migration Assistant), GitHub, API Integration, Alteryx, Data Visualization, Automation & Scheduling, Documentation & Communication. What's Next …
Employment Type: Permanent
Salary: £60000 - £67000/annum Hybrid, Great Benefits
Experience with one or more scripting languages (e.g., Python, KornShell). PREFERRED QUALIFICATIONS - Experience with big data technologies such as Hadoop, Hive, Spark, EMR - Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, DataStage, etc. - Knowledge of cloud services such as AWS or equivalent. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you …
in regulated financial services (wealth/pensions a plus). Hands-on design of lakehouse or cloud data platforms (Snowflake, BigQuery, Redshift). Proven lineage & quality frameworks (Collibra, Atlan, Informatica, or equivalent). Deep grasp of UK/EU retail-investor regulations (CASS, Consumer Duty, PRIIPs, GDPR). Track record steering £20-100m change budgets and influencing C…
professionalism, integrity, and discretion when handling sensitive information. Excellent written and spoken English. Strong ability to manage multiple tasks and priorities at once. Strong experience with Talend, SAP Data Services, Informatica, or a similar ETL tool. Bachelor's degree in Business, Engineering, Computer Science, or another related analytical or technical discipline, or at least four (4) years of related experience.
inclusive world. YOUR ROLE Assist in designing and developing ETL processes to extract, transform, and load data from various sources into data warehouses or data marts. Very good Informatica development and setup skills, including IDMC cloud migration. Strong SQL skills, including joining tables and comparing data between tables. Collaborate with team members to understand data requirements and translate them … in Computer Science, Information Technology, or a related field. Basic understanding of ETL concepts and processes. Familiarity with SQL and database concepts. Knowledge of any ETL tool (such as Informatica) is a plus. Strong analytical and problem-solving skills. Good communication and teamwork skills. Eagerness to learn and adapt to new technologies and methodologies. ABOUT CAPGEMINI Capgemini is a …
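The SQL table-comparison task described in the role above (joining two tables and reconciling their data) can be sketched roughly as follows. This is a minimal illustration only: the tables, columns, and values are hypothetical, and an in-memory SQLite database stands in for whatever warehouse the role actually uses.

```python
import sqlite3

# Hypothetical example: reconcile a "source" table against a "target" table
# using a LEFT JOIN, flagging rows that differ or are missing entirely.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE source (id INTEGER PRIMARY KEY, amount REAL)")
cur.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, amount REAL)")
cur.executemany("INSERT INTO source VALUES (?, ?)", [(1, 10.0), (2, 20.0), (3, 30.0)])
cur.executemany("INSERT INTO target VALUES (?, ?)", [(1, 10.0), (2, 25.0)])

# Rows whose values differ, plus rows present in source but missing from target
# (a missing row shows up with a NULL/None target amount).
mismatches = cur.execute("""
    SELECT s.id, s.amount AS source_amount, t.amount AS target_amount
    FROM source s
    LEFT JOIN target t ON s.id = t.id
    WHERE t.id IS NULL OR s.amount <> t.amount
    ORDER BY s.id
""").fetchall()
print(mismatches)  # [(2, 20.0, 25.0), (3, 30.0, None)]
conn.close()
```

The same join pattern (optionally unioned with the reverse direction to catch rows only in the target) is a common building block for the kind of data-validation work an ETL role like this involves.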