data integration. Deep understanding of ETL concepts, data warehousing principles, and data modeling techniques. Proficiency in SQL and PL/SQL with experience working on major RDBMS platforms (Oracle, SQL Server, Teradata, Snowflake, etc.). Experience with performance tuning and optimization of Informatica mappings and sessions. Strong understanding of data governance, data quality, and metadata management. Familiarity with cloud-based data integration platforms (Informatica Cloud, AWS …
consumption patterns leveraging tools like DBT, Airflow, Matillion, Fivetran, or Informatica. Establish architecture blueprints for multi-region, multi-tenant, and secure Snowflake deployments. Lead migration from legacy data warehouses (Teradata, Oracle, SQL Server, Redshift, BigQuery, etc.) to Snowflake. Qualifications we seek in you! Minimum qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. …
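Orchestration tools like Airflow and DBT, mentioned above, run transformations as a dependency graph: each step declares its upstream dependencies and the scheduler derives a valid run order. A minimal pure-Python sketch of that idea, with hypothetical task names standing in for real pipeline steps:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical ELT task graph: each task maps to the set of tasks it
# depends on, mirroring how Airflow/DBT resolve run order from
# declared dependencies.
tasks = {
    "extract_orders": set(),
    "extract_customers": set(),
    "stage_raw": {"extract_orders", "extract_customers"},
    "transform_marts": {"stage_raw"},
    "publish_snowflake": {"transform_marts"},
}

# static_order() yields tasks in an order that respects every dependency.
run_order = list(TopologicalSorter(tasks).static_order())
print(run_order)
```

This is only an illustration of the scheduling concept; a real Airflow DAG or DBT project adds retries, sensors, and per-task configuration on top of the same dependency model.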
and management of finance data entities, finance data models and attributes across the Finance systems landscape. Experience working with existing finance logical data models from package vendors (e.g. Teradata and IBM) and new cloud technologies such as GCP, AWS and Azure. Understanding of the current technology and data challenges and trends in the Financial Services industry. Excellent communication and presentation …
skills to query and analyze customer databases using SQL or Python. Experience with BI visualization tools and technologies such as Power BI & Business Objects. Understanding of database management systems (Teradata, SQL Server, Big Data) and ETL (Extract, Transform, Load) frameworks. BS/BA in Information Systems, Statistics, Mathematics, Computer Science, Engineering or a relevant field is required. Master's degree in …
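Querying a customer database from Python, as the listing above describes, typically means issuing SQL through a DB-API driver and aggregating the result. A minimal sketch, using the stdlib sqlite3 module as a stand-in for Teradata or SQL Server (the table and column names are hypothetical):

```python
import sqlite3

# sqlite3 stands in for a warehouse connection purely so the sketch runs;
# a Teradata or SQL Server driver exposes the same execute/fetch pattern.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, region TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "EMEA", 120.0), (2, "EMEA", 80.0), (3, "APAC", 200.0)],
)

# A typical analyst query: revenue aggregated by region.
rows = conn.execute(
    "SELECT region, SUM(revenue) FROM customers GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('APAC', 200.0), ('EMEA', 200.0)]
```

The same rows could then feed a Power BI dataset or a pandas DataFrame for further analysis.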
City of London, London, United Kingdom Hybrid / WFH Options
Travelers Europe
Technical & Analytical Skills: Demonstrated experience with data integration tools (ETL/ELT), data platforms (cloud and on-premise), and emerging data technologies. Prior hands-on experience with SQL, Python, Teradata, Hive, and data analysis tools. Strong data profiling and data quality assessment capabilities. Ability to understand metadata management, data lineage, and data cataloging concepts and tools. Governance & Risk Management: Solid …
and Metadata Hub. Proficiency in SQL, Unix/Linux shell scripting, and performance tuning. Familiarity with job schedulers like Control-M or similar. Experience working with RDBMS (e.g., Oracle, Teradata, DB2, PostgreSQL). Strong problem-solving and debugging skills.
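Schedulers like the Control-M mentioned above generally decide whether a job succeeded from the process exit code: zero means OK, non-zero means failed. A minimal hedged sketch of a job wrapper built on that convention (the wrapped command here is a placeholder, not a real load step):

```python
import subprocess
import sys

def run_step(cmd: list[str]) -> int:
    """Run one pipeline step and return its exit code so an external
    scheduler (e.g. Control-M) can mark the job OK or failed."""
    result = subprocess.run(cmd)
    return result.returncode

# Placeholder step: a Python subprocess that exits cleanly. A real job
# would invoke the actual load script and pass rc back via sys.exit(rc).
rc = run_step([sys.executable, "-c", "raise SystemExit(0)"])
print("exit code:", rc)
```

Propagating the child's return code unchanged is the key design point; swallowing a non-zero code would make a failed load look green on the scheduler.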
performance tuning. Strong data analysis skills. Strong skills in translating functional specifications and requirements into technical design. Must-Have Skills: Informatica (PowerExchange, PowerCenter). Must have knowledge of Teradata and UNIX.
Shell scripting: solid working knowledge. Proficient in Excel for data analysis and reporting. Oracle DBA experience. Experience with Oracle 19c, PDBs, and Oracle Enterprise Manager (OEM). Knowledge of Teradata TTU (BTEQ & TPT).
Charlotte, NC. Pay range: $60-65/hr. Must Have: 7+ years of professional experience in Python development with a focus on back-end technologies. Strong understanding of and experience with Teradata development. Must demonstrate strong SQL and database management skills. Familiarity with version control systems (e.g., Git) and development tools. Day to Day: Develop, test, and maintain efficient back-end components …
micro-partitions, table clustering, and materialized views. Hands-on experience with Query Profile analysis, query optimization, time travel, and zero-copy cloning. Experience with Snowpipe, Streams, Tasks, and Data Sharing. Teradata: Proficient in writing and optimizing complex SQL queries and stored procedures. Experience in data warehousing, performance optimization, and query tuning. Python: Skilled in developing automation scripts, data transformation workflows, and integration with Snowflake …
Job Title: Teradata DBA
Location: San Jose, California
Job Type: Contract
Job Description
1. Database Installation and Configuration: Installing, configuring, and upgrading Teradata database systems, including software and utilities.
2. User and Security Management: Creating and managing user accounts, roles, profiles, and basic access privileges to control database access.
3. Backup and Recovery Operations: Performing routine data backups, managing recovery …
… into SQL query optimization, analyzing explain plans, and implementing advanced techniques (e.g., join indexes, partitioning) to enhance system performance.
8. Workload Management (TWM/TDWM): Designing, configuring, and managing Teradata Workload Manager rules to prioritize queries, manage concurrency, and ensure service level agreements (SLAs).
9. Capacity Planning and Growth Management: Analyzing historical usage patterns, forecasting future growth, and planning …
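The explain-plan analysis this DBA listing describes follows one pattern on any RDBMS: inspect the optimizer's plan for a query, add an index (or join index, or partition), and confirm the plan changed from a scan to an index access. A runnable sketch of that workflow using SQLite's EXPLAIN QUERY PLAN as a stand-in for Teradata's EXPLAIN (table and index names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE calls (acct INTEGER, dur REAL)")

def plan(sql: str) -> str:
    # EXPLAIN QUERY PLAN is SQLite's analogue of Teradata's EXPLAIN:
    # it reports whether the optimizer will scan the table or use an index.
    return " ".join(str(row) for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM calls WHERE acct = 42"
before = plan(query)                            # full table scan
conn.execute("CREATE INDEX ix_acct ON calls(acct)")
after = plan(query)                             # search via ix_acct
print(before)
print(after)
```

On Teradata the mechanics differ (collecting statistics, join indexes, PPI), but the tune-and-recheck loop on the explain output is the same.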
help the world’s leading organizations unlock the value of technology and build a more sustainable, more inclusive world. Your Role: Seeking a skilled Data Analyst with expertise in Teradata, Informatica ETL, and risk modeling using SAS and Python. Design and implement ETL workflows to manage data extraction, transformation, and loading into Teradata. Build and maintain risk models for credit … gather requirements and deliver data-driven solutions. Perform exploratory data analysis to uncover trends and support risk mitigation. Automate reporting and dashboard creation using Python and BI tools. Optimize Teradata queries and ETL performance for efficient data processing. Document data flows, model logic, and technical specifications for transparency. Ensure compliance with data governance and contribute to continuous improvement initiatives. Your … Profile: 3+ years of experience in data analysis and ETL development. Strong proficiency in Teradata SQL and Informatica PowerCenter. Experience building and validating risk models using SAS and Python. Solid understanding of statistical techniques and risk modeling frameworks. Familiarity with data governance and compliance standards. Excellent problem-solving and communication skills. ABOUT CAPGEMINI Capgemini is a global business and technology …
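The role above pairs an ETL transform step with a Python risk model. A minimal sketch of both halves in pure Python: the transform casts raw extracted strings to typed records, and a toy logistic score then ranks accounts by default risk. The field names and coefficients are invented for illustration and are not a real credit model:

```python
import math

# Hypothetical raw records as they might land from an extract step.
raw = [
    {"acct": "A1", "balance": "1200.50", "missed_payments": "2"},
    {"acct": "A2", "balance": "300.00", "missed_payments": "0"},
]

def transform(rec: dict) -> dict:
    # Transform step: cast string fields to proper types before loading.
    return {"acct": rec["acct"],
            "balance": float(rec["balance"]),
            "missed_payments": int(rec["missed_payments"])}

def default_probability(rec: dict) -> float:
    # Toy logistic scorecard: made-up coefficients, illustration only.
    z = -3.0 + 0.9 * rec["missed_payments"] + 0.0004 * rec["balance"]
    return 1.0 / (1.0 + math.exp(-z))

loaded = [transform(r) for r in raw]
scores = {r["acct"]: round(default_probability(r), 3) for r in loaded}
print(scores)
```

In production the transform would run inside Informatica or SQL pushed down to Teradata, and the model would be fitted and validated in SAS or scikit-learn rather than hand-set.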
Excel, PowerPoint, etc. Ability to guide multiple onsite/offshore technology team members. Candidates with strong business analyst experience will also be considered. Proficient in software development using SAS, Teradata and Unix scripting. Experience in Python, Spark and Hadoop will be an advantage.