London, England, United Kingdom Hybrid / WFH Options
Nadara
cloud-native designs. Experience with Inmon, Data Vault 2.0, Kimball, and dimensional modelling. Knowledge of integration patterns, ETL/ELT processes, and tools (e.g., Apache Airflow, Azure Data Factory, Informatica, Talend) to orchestrate data workflows. Familiarity with DevOps/MLOps principles, CI/CD pipelines, and infrastructure as code (e.g., Terraform, CloudFormation). Basic understanding of data security measures …
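To picture the workflow orchestration skill named in the listing above (Apache Airflow and similar tools), here is a minimal, hypothetical sketch of a daily extract-then-load DAG; the DAG id, task names and placeholder callables are invented for illustration and are not taken from any of these roles.

```python
# Minimal Airflow DAG sketch (Airflow 2.4+ syntax); all names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder: pull the previous day's records from a source system.
    print(f"extracting orders for {context['ds']}")


def load_warehouse(**context):
    # Placeholder: load the staged extract into the warehouse.
    print(f"loading warehouse for {context['ds']}")


with DAG(
    dag_id="daily_orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_warehouse", python_callable=load_warehouse)
    extract >> load  # run the extract first, then the load
```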
such as lakehouse. Experience with CI/CD pipelines, version control systems like Git, and containerization (e.g., Docker). Experience with ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Strong understanding of data governance and best practices in data management. Experience with cloud platforms and services such as AWS, Azure, or GCP for deploying and managing …
using modern data architectures, such as lakehouse. Experience with CI/CD pipelines and version control systems like Git. Knowledge of ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Knowledge of data governance and best practices in data management. Familiarity with cloud platforms and services such as AWS, Azure, or GCP for deploying and managing data …
data platforms such as Snowflake, Azure SQL Database, Databricks, Microsoft Fabric or Azure Synapse Analytics. Demonstrated success implementing data governance programs with tools like Collibra, Alation, Microsoft Purview, or Informatica, including projects around lineage, cataloging, and quality rules. Strong hands-on development experience in SQL and Python, with working knowledge of Spark or other distributed data processing frameworks. Design …
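As a minimal illustration of the SQL/Python/Spark plus quality-rules requirement above, the following hypothetical PySpark sketch applies two simple rules to a curated dataset; the path and column names are invented for the example.

```python
# Hypothetical data-quality rule check in PySpark; path and columns are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("quality_rule_check").getOrCreate()

df = spark.read.parquet("/data/curated/customers")  # illustrative path

# Rules: customer_id must be present and email must look like an address.
failures = df.filter(
    F.col("customer_id").isNull() | ~F.col("email").rlike(r".+@.+\..+")
)

print(f"records failing quality rules: {failures.count()}")
```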
modeling (conceptual, logical, physical), metadata management, and master data management (MDM). Deep understanding of data integration, transformation, and ingestion techniques using modern tools (e.g., Azure Data Factory, Boomi, Informatica, Talend, dbt, Apache NiFi). Strong familiarity with data warehousing, data lake/lakehouse architectures, and cloud-native analytics platforms. Hands-on experience with SQL and cloud data platforms …
one time, and be able to organise and prioritise work. University degree or equivalent with strong experience in data migration, data integration or similar role. Familiarity with tooling (Alteryx, Informatica, Denodo, Power Query, Power BI, Tableau). Experience with Microsoft technology. Ability to use JIRA or ServiceNow project management tools. Understanding of the data sets within a Bank that has …
data modeling (conceptual, logical, physical), metadata management, and master data management (MDM) Deep understanding of data integration, transformation, and ingestion techniques using modern tools (e.g., Azure Data Factory, Boomi, Informatica, Talend, dbt, Apache NiFi) Strong familiarity with data warehousing, data lake/lakehouse architectures, and cloud-native analytics platforms Hands-on experience with SQL and cloud data platforms (e.g. …
London, England, United Kingdom Hybrid / WFH Options
EXL
using tools like ERwin, ER/Studio, or PowerDesigner, ensuring scalability, performance, and maintainability. ETL/ELT Frameworks: Design and build robust data pipelines with Cloud Composer, Dataproc, Dataflow, Informatica, or IBM DataStage, supporting both batch and streaming data ingestion. Data Governance & Quality: Implement data governance frameworks, metadata management, and data quality controls using Unity Catalog, Profisee, Alation, DQ …
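To show the shape of the batch pipelines that Dataflow (named above) executes, here is a minimal sketch using the Apache Beam Python SDK; the file names, event fields and filter rule are hypothetical.

```python
# Hypothetical Apache Beam batch pipeline; runs on the local DirectRunner as written,
# and the same code can be submitted to Dataflow with runner/project options.
import json

import apache_beam as beam


def parse_event(line):
    # Parse one JSON event and keep only the fields needed downstream (illustrative).
    event = json.loads(line)
    return {"id": event.get("id"), "amount": event.get("amount", 0)}


with beam.Pipeline() as pipeline:
    (
        pipeline
        | "ReadRaw" >> beam.io.ReadFromText("events.jsonl")
        | "Parse" >> beam.Map(parse_event)
        | "KeepPositive" >> beam.Filter(lambda e: e["amount"] > 0)
        | "Serialize" >> beam.Map(json.dumps)
        | "Write" >> beam.io.WriteToText("clean_events")
    )
```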
Data Engineer (Informatica/Teradata/Datawarehouse) - PNC. Locations: Two PNC Plaza (PA374), Birmingham - Brock (AL112), Dallas Innovation Center - Luna Rd (TX270), Strongsville Technology Center (OH537). Full time. Job requisition ID R184257. Position Overview: At PNC, our people are our … data quality and integrity at all times. Stay current with industry trends, technologies and best practices to continuously improve our data ecosystem. Skills: experience with data pipelines and ETL (Informatica); experience in SQL and database management systems; knowledge of data modelling, warehousing concepts, and ETL processes; experience with big data technologies and frameworks such as Hadoop, Hive, Spark; programming … Related PNC openings: Data Engineer Sr - Informatica ETL Expert; Software Engineer Lead - Informatica/Database Tech/Data Solutions …
query optimization. Experience designing large-scale, distributed data systems in hybrid or cloud environments (e.g., Azure SQL, AWS RDS). Strong knowledge of ETL frameworks and integration tools (SSIS, Informatica, Azure Data Factory). Understanding of CI/CD pipelines and automation in database deployments. Excellent problem-solving, communication, and leadership skills. Preferred Qualifications: Cloud certification (Azure Data Engineer …
cloud environments using Azure D&A stack, Databricks, and Azure OpenAI. Proficiency in coding (Python, PL/SQL, Shell Script), relational and non-relational databases, ETL tools (e.g., Informatica), and scalable data platforms. Knowledge of Azure Data and Analytics stack; familiarity with AWS and GCP data solutions. Experience deploying AI solutions on Azure OpenAI, GCP, and AWS …
using Azure D&A stack, Databricks, and Azure OpenAI solutions. Proficiency in coding (Python, PL/SQL, Shell Script), relational and non-relational databases, ETL tooling (such as Informatica), and scalable data platforms. Proficiency in Azure Data and Analytics stack; working knowledge of AWS and GCP data solutions. Good understanding of deploying AI solutions in Azure OpenAI …
A proactive awareness of industry standards, regulations, and developments. Ideally, you'll also have: Experience of Relational Databases and Data Warehousing concepts. Experience of Enterprise ETL tools such as Informatica, Talend, Datastage or Alteryx. Project experience using any of the following technologies: Hadoop, Spark, Scala, Oracle, Pega, Salesforce. Cross and multi-platform experience. Team building and leading. You …
Data Factory), Azure DB, Azure Synapse, Azure Data Lake and Azure Monitor, providing added flexibility for diverse migration and integration projects. Prior experience with tools such as MuleSoft, Boomi, Informatica, Talend, SSIS, or custom scripting languages (Python, PySpark, SQL) for data extraction and transformation. Prior experience with data warehousing and data modelling (Star Schema or Snowflake Schema). Skilled …
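The star-schema modelling mentioned above can be pictured with a small, hypothetical example: one dimension table, one fact table and a typical aggregate query, run here against an in-memory SQLite database purely for illustration.

```python
# Hypothetical star-schema sketch; table names, columns and sample rows are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_name TEXT,
    country      TEXT
);
CREATE TABLE fact_sales (
    sales_key    INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    order_date   TEXT,
    amount       REAL
);
""")

conn.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                 [(1, "Acme Ltd", "UK"), (2, "Globex", "DE")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                 [(10, 1, "2024-01-05", 120.0), (11, 2, "2024-01-06", 75.5)])

# Typical analytical query: total sales by country through the dimension join.
rows = conn.execute("""
    SELECT d.country, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer d ON d.customer_key = f.customer_key
    GROUP BY d.country
""").fetchall()
print(rows)  # e.g. [('DE', 75.5), ('UK', 120.0)]
```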
A proactive awareness of industry standards, regulations, and developments. Ideally, you’ll also have: Experience of Relational Databases and Data Warehousing concepts. Experience of Enterprise ETL tools such as Informatica, Talend, Datastage or Alteryx. Project experience using any of the following technologies: Hadoop, Spark, Scala, Oracle, Pega, Salesforce. Cross and multi-platform experience. Financial services sector experience. Line …
develop along the way. A relevant data or AI qualification, e.g. CDMC, DAMA, DCAM, ISACA. Experience with data and AI platforms and products, e.g. Snowflake, Databricks, Microsoft Purview, Collibra, Informatica or AWS. Experience of implementing and refining data and AI analytics tools to automate internal audit processes, enhancing efficiency and accuracy. Hands-on experience with data and AI analytics …
from complex and disparate data sets and communicate clearly with stakeholders. Hands-on experience with cloud platforms such as AWS, Azure, or GCP. Familiarity with traditional ETL tools (e.g., Informatica, Talend, Pentaho, DataStage) and data warehousing concepts. Strong understanding of data security, compliance, and governance best practices. Experience leading or influencing cross-functional teams in a product or platform …
e.g., Hadoop, Spark, Kafka). • Familiarity with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud). • Expertise in ETL tools and processes (e.g., Apache NiFi, Talend, Informatica). • Proficiency in data integration tools and technologies. • Familiarity with data visualization and reporting tools (e.g., Tableau, Power BI) is a plus. • Deep understanding of data governance frameworks and …
in data modelling (both structured and unstructured data), working directly with the business & data scientists. Ability to optimise data solutions for performance, scalability, and efficiency. Highly Desirable: Experience with Informatica ETL, Hyperion Reporting, and intermediate/advanced PL/SQL. Desirable: Experience in a financial corporation. Lake House/Delta Lake and Snowflake. Experience with Spark clusters, both elastic …
London, England, United Kingdom Hybrid / WFH Options
Sentara
products, stewards, critical data elements, and quality thresholds. Coordinate with data engineers and quality developers to set up monitoring across source systems. Monitor data quality reports using platforms like Informatica, Tableau, or Power BI. Manage operational support for data quality monitoring, root cause analysis, and remediation. Lead data stewardship practice rollout, including training and content development. Support policy management in … specific requirements. Experience: Minimum 5+ years in healthcare or health plan environments. Extensive experience with SQL Server, Databricks, Azure Synapse (5+ years). Experience with data quality platforms like Informatica, SAS Dataflux, etc. (5+ years). Experience with ServiceNow, JIRA, or Azure DevOps. At least 3+ years in business analysis, project management, or training roles. Experience with Service Management …
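The data quality monitoring described above can be illustrated with a small, hypothetical completeness check in Python; the member records, column names and the 95% threshold below are invented for the example, not drawn from the role.

```python
# Hypothetical completeness metric of the kind a data quality report would surface.
import pandas as pd

members = pd.DataFrame({
    "member_id": [1, 2, 3, 4],
    "dob": ["1980-01-01", None, "1975-06-30", "1990-12-12"],
    "plan_code": ["A", "A", None, "B"],
})

completeness = members.notna().mean()  # share of non-null values per column
threshold = 0.95                       # illustrative quality threshold

for column, score in completeness.items():
    status = "PASS" if score >= threshold else "FAIL"
    print(f"{column}: {score:.0%} complete -> {status}")
```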
years experience with reporting tools: Power BI, Business Objects, Tableau or OBI. Understanding of Master Data Management technology landscape, processes and design principles. Minimum 3 years of experience with Informatica MDM or any other MDM tools (both customer and product domains). Understanding of established data management and reporting technologies, and have some knowledge of columnar and NoSQL databases …
platforms (e.g., OpenTWINS, DXC Assure, Sequel, IRIS). Knowledge of BI/MI tooling (e.g., Power BI, Tableau, Qlik). Familiarity with data warehouse technologies (e.g., SQL, Snowflake, Azure, Informatica, etc.). Exposure to Agile delivery and use of tools such as Jira or Azure DevOps.
re Looking For: Deep Understanding of Data Management: Proven expertise in data quality, governance, security, and metadata management. Proficiency in Data Management Tools: Strong technical skills in tools like Informatica, Collibra, Talend, and Erwin. Data Modelling and Architecture: Ability to design and implement complex data models and architectures. Analytical and Problem-Solving Skills: Proficiency in data analysis, problem-solving …