data integration. Deep understanding of ETL concepts, data warehousing principles, and data modeling techniques. Proficiency in SQL and PL/SQL with experience working on major RDBMS platforms (Oracle, SQL Server, Teradata, Snowflake, etc.). Experience with performance tuning and optimization of Informatica mappings and sessions. Strong understanding of data governance, data quality, and metadata management. Familiarity with cloud-based data integration platforms (Informatica Cloud, AWS More ❯
self-serve analytics solutions for business teams using SQL, R, and Excel; Designing and operationalizing data transformation workflows for structured and unstructured datasets across multiple systems including Oracle and Teradata; Validating data integrity, performing quality assurance checks, and troubleshooting anomalies across large datasets; Gathering business requirements, defining KPIs, and translating stakeholder needs into actionable analytics and reporting solutions; Database maintenance More ❯
SQL, Python, Java and/or Spark to build, operate and maintain data analytics solutions. Extensive knowledge of and experience with large-scale database technology (e.g. Snowflake, Netezza, Exadata, Teradata, Greenplum, etc.). Proficiency in implementing data security measures, access controls, and design specifically within the Snowflake platform. Required Skills - FinOps Implementation. A STRONG CANDIDATE WILL ADDITIONALLY HAVE: Experience in the More ❯
City of London, London, United Kingdom Hybrid/Remote Options
Travelers Europe
Technical & Analytical Skills: Demonstrated experience with data integration tools (ETL/ELT), data platforms (cloud and on-premise), and emerging data technologies. Prior hands-on experience with SQL, Python, Teradata, Hive, and data analysis tools. Strong data profiling and data quality assessment capabilities. Ability to understand metadata management, data lineage, and data cataloging concepts and tools. Governance & Risk Management: Solid More ❯
skills to query and analyze customer databases using SQL or Python. Experience with BI visualization tools and technologies such as Power BI & Business Objects. Understanding of database management systems (Teradata, SQL Server, Big Data) and ETL (Extract, Transform, Load) frameworks. BS/BA in Information Systems, Statistics, Mathematics, Computer Science, Engineering or relevant field is required. Master's degree in More ❯
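To make the SQL-plus-Python requirement above concrete, here is a minimal sketch of querying a customer database from Python with pandas and pyodbc; the driver string, server, database, and table names are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch: querying a customer database with SQL from Python.
# Connection details and table/column names are hypothetical placeholders.
import pandas as pd
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-server;DATABASE=customers;Trusted_Connection=yes;"
)

query = """
    SELECT region, COUNT(*) AS customer_count
    FROM dbo.customer
    GROUP BY region
    ORDER BY customer_count DESC
"""

# pandas issues the query and returns the result set as a DataFrame for analysis.
df = pd.read_sql(query, conn)
print(df.head())
conn.close()
```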
understanding of data warehousing concepts, data modeling, and database design. Experience with performance tuning, error handling, and debugging ETL processes. Strong SQL skills and experience working with relational databases (e.g., Oracle, Teradata, or SQL Server). Excellent communication and problem-solving skills. Preferred Skills (Nice to Have): Experience with big data platforms or cloud-based ETL tools. Knowledge of CI/CD processes and version control More ❯
and Metadata Hub. Proficiency in SQL, Unix/Linux shell scripting, and performance tuning. Familiarity with job schedulers like Control-M or similar. Experience working with RDBMS (e.g., Oracle, Teradata, DB2, PostgreSQL). Strong problem-solving and debugging skills. More ❯
Saint Paul, Minnesota, United States Hybrid/Remote Options
Genesis10
SAFe Agile Framework. Preferred Qualifications: 5 or more years of work experience in data modeling and database design. Experience working with industry and reference data models (HL7, FHIR, NCPDP, Teradata Healthcare Data Model, IBM Healthcare Data Model, Pega Foundation for Healthcare, Salesforce Health Cloud, etc.). Experience with relational, document, message and business event data stores and technologies. Experience working with More ❯
performance tuning. Strong data analysis skills. Strong skills in translating functional specifications and requirements into technical design. Must Have Skills: Informatica (PowerExchange, PowerCenter). Must have knowledge of Teradata and UNIX. More ❯
Shell-Scripting: Solid working knowledge. Proficient in Excel for data analysis and reporting. Oracle DBA experience. Experience with Oracle 19c, PDBs, and Oracle Enterprise Manager (OEM). Knowledge of Teradata TTU (BTEQ & TPT). More ❯
quickly learn technology and eager to stretch their skills. Proven ability to lead complex projects to successful, on-time completion. Eager to learn other technologies such as MySQL and Teradata. Required Skills & Qualifications: Lead the design, implementation, and maintenance of the organization's overall database strategy. Define and drive a strategy for database performance. Proactively monitor usage trends and forecast More ❯
Charlotte, NC. Pay range: $60-65/hr. Must Have: 7+ years of professional experience in Python development with a focus on back-end technologies. Strong understanding of and experience with Teradata development. Must demonstrate strong SQL and database management skills. Familiarity with version control systems (e.g., Git) and development tools. Day to Day: Develop, test, and maintain efficient back-end components More ❯
micro-partitions, table clustering, and materialized views. Hands-on experience with Query Profile analysis, query optimization, time travel, and zero-copy cloning. Experience with SnowPipe, Streams, Tasks, and Data Sharing. Teradata: Proficient in writing and optimizing complex SQL queries and stored procedures. Experience in data warehousing, performance optimization, and query tuning. Python: Skilled in developing automation scripts, data transformation workflows, and integration with Snowflake More ❯
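As a rough illustration of the Snowflake features named above (zero-copy cloning and time travel) driven from a Python automation script, here is a minimal sketch using the snowflake-connector-python package; the account, credentials, warehouse, and table names are hypothetical placeholders.

```python
# Minimal sketch: zero-copy cloning and time travel from a Python automation
# script, via the official snowflake-connector-python package. All connection
# details and object names below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)

cur = conn.cursor()
try:
    # Zero-copy clone: the new table shares micro-partitions with the source
    # until either side is modified, so the clone is effectively instant.
    cur.execute("CREATE OR REPLACE TABLE orders_dev CLONE orders")

    # Time travel: query the table as it looked one hour ago.
    cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
    print("Row count one hour ago:", cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```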
Role: Senior Data Engineer (Ab Initio, Teradata, Modern ETL). Location: Addison, TX. Duration: Long Term Contract. ONLY USC/GC.
Experienced in developing and supporting Ab Initio and Teradata-based complex Data Warehousing applications, with the ability to upskill to open-source-based modern ETL technologies like Python, Spark, Java, etc. Should have expertise in full SDLC for … techniques and with all components of Ab Initio such as Scan, Join, Partition by Key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, etc. 3. Experience and expertise in Teradata, PL/SQL, and Unix shell scripting. 4. Financial services experience is a plus and data warehousing experience is required. 5. The candidate should have very robust technical experience … these 9. Work closely with Quality Control teams to deliver quality software to agreed project schedules.
Looking for candidates who have experience in developing and supporting Ab Initio and Teradata-based complex Data Warehousing applications and the ability to upskill to open-source-based modern technologies. Should have expertise in full SDLC for Data Management apps and be More ❯
Job Title: Teradata DBA. Location: San Jose, California. Job Type: Contract.
Job Description:
1. Database Installation and Configuration: Installing, configuring, and upgrading Teradata database systems, including software and utilities.
2. User and Security Management: Creating and managing user accounts, roles, profiles, and basic access privileges to control database access.
3. Backup and Recovery Operations: Performing routine data backups, managing recovery … into SQL query optimization, analyzing explain plans, and implementing advanced techniques (e.g., join indexes, partitioning) to enhance system performance.
8. Workload Management (TWM/TDWM): Designing, configuring, and managing Teradata Workload Manager rules to prioritize queries, manage concurrency, and ensure service level agreements (SLAs).
9. Capacity Planning and Growth Management: Analyzing historical usage patterns, forecasting future growth, and planning More ❯
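For a sense of the query-tuning duties this listing describes (reading explain plans, adding join indexes), here is a minimal sketch using the teradatasql driver from Python; the host, credentials, database, and table names are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch: capture an EXPLAIN plan for a slow query and define a join
# index as one possible remedy. Host, credentials, and object names are
# hypothetical placeholders.
import teradatasql

with teradatasql.connect(host="example-td-host", user="dba_user", password="example") as conn:
    with conn.cursor() as cur:
        # Inspect the optimizer's plan for a frequently joined query.
        cur.execute(
            "EXPLAIN SELECT s.store_id, SUM(f.sales_amt) "
            "FROM sales_db.fact_sales f JOIN sales_db.dim_store s "
            "ON f.store_key = s.store_key GROUP BY s.store_id"
        )
        for row in cur.fetchall():
            print(row[0])  # each row holds one line of the plan text

        # One common remedy: a join index that pre-joins the two tables so the
        # optimizer can satisfy the query without redoing the join at runtime.
        cur.execute(
            "CREATE JOIN INDEX sales_db.ji_sales_store AS "
            "SELECT s.store_id, f.store_key, f.sales_amt "
            "FROM sales_db.fact_sales f JOIN sales_db.dim_store s "
            "ON f.store_key = s.store_key"
        )
```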
help the world’s leading organizations unlock the value of technology and build a more sustainable, more inclusive world.
Your Role: Seeking a skilled Data Analyst with expertise in Teradata, Informatica ETL, and risk modeling using SAS and Python. Design and implement ETL workflows to manage data extraction, transformation, and loading into Teradata. Build and maintain risk models for credit … gather requirements and deliver data-driven solutions. Perform exploratory data analysis to uncover trends and support risk mitigation. Automate reporting and dashboard creation using Python and BI tools. Optimize Teradata queries and ETL performance for efficient data processing. Document data flows, model logic, and technical specifications for transparency. Ensure compliance with data governance and contribute to continuous improvement initiatives.
Your … Profile: 3+ years of experience in data analysis and ETL development. Strong proficiency in Teradata SQL and Informatica PowerCenter. Experience building and validating risk models using SAS and Python. Solid understanding of statistical techniques and risk modeling frameworks. Familiarity with data governance and compliance standards. Excellent problem-solving and communication skills.
ABOUT CAPGEMINI Capgemini is a global business and technology More ❯
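As a hedged sketch of the risk-modeling workflow described above, the following pulls illustrative loan features from Teradata and fits a simple probability-of-default baseline with scikit-learn; the connection details, table, and column names are hypothetical placeholders, and the SAS side of the posting is not shown.

```python
# Minimal sketch: pull features from Teradata and fit a baseline credit-risk
# classifier in Python. All connection details and names are hypothetical.
import pandas as pd
import teradatasql
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

with teradatasql.connect(host="example-td-host", user="analyst", password="example") as conn:
    df = pd.read_sql(
        "SELECT credit_utilization, debt_to_income, months_on_book, default_flag "
        "FROM risk_db.loan_features",
        conn,
    )

X = df[["credit_utilization", "debt_to_income", "months_on_book"]]
y = df["default_flag"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# Logistic regression is a common baseline for probability-of-default models.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Validate with AUC, a standard discrimination metric for risk models.
scores = model.predict_proba(X_test)[:, 1]
print("AUC:", roc_auc_score(y_test, scores))
```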
improve profitability; Utilize statistical analysis, simulations, predictive modeling, or other analytical methods to analyze data and develop practical solutions to business problems; Collect, pull, and push data to Oracle, Teradata, and HDFS storage using SQL, Spark, and data analysis technologies; Identify, analyze, and solve problems, and evaluate methods and results; Utilize statistical techniques including clustering, regression, and time series analysis … Excel analytics including use of PowerPivot; Experience with Python, R, writing complex SQL queries and data mining packages; Pulling and analyzing data from complex and big data sources including Teradata SQL, Oracle SQL, and Databricks; SQL or related data querying and manipulating tools; Strong analytical skills with work experience in a dynamic environment; Demonstrated ability to solve and to lead others More ❯
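To illustrate the "pull and push data to Oracle, Teradata, and HDFS using Spark" duty above, here is a minimal PySpark sketch that reads a table over JDBC and writes it to HDFS as Parquet; the JDBC URL, credentials, table, and HDFS path are hypothetical placeholders, and the matching JDBC driver jar is assumed to be on the Spark classpath.

```python
# Minimal sketch: move data between an RDBMS and HDFS with Spark.
# URL, credentials, table, and path are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("oracle_to_hdfs").getOrCreate()

# Pull: read a table from Oracle over JDBC into a Spark DataFrame.
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//example-db-host:1521/ORCLPDB1")
    .option("dbtable", "sales.orders")
    .option("user", "analyst")
    .option("password", "example")
    .load()
)

# Push: write the data to HDFS as Parquet for downstream analysis.
orders.write.mode("overwrite").parquet("hdfs:///data/warehouse/orders")

spark.stop()
```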
Job Description:
Database Installation and Configuration: Installing, configuring, and upgrading Teradata database systems, including software and utilities.
User and Security Management: Creating and managing user accounts, roles, profiles, and basic access privileges to control database access.
Backup and Recovery Operations: Performing routine data backups, managing recovery processes, and ensuring data integrity and availability.
System Monitoring and Alerting: Monitoring database health … diving into SQL query optimization, analyzing explain plans, and implementing advanced techniques (e.g., join indexes, partitioning) to enhance system performance.
8. Workload Management (TWM/TDWM): Designing, configuring, and managing Teradata Workload Manager rules to prioritize queries, manage concurrency, and ensure service level agreements (SLAs).
9. Capacity Planning and Growth Management: Analyzing historical usage patterns, forecasting future growth, and planning for hardware More ❯