modern data architectures (ideally Azure, AWS, Microsoft Fabric, GCP, Data Factory) and modern data warehouse technologies (Snowflake, Databricks). Experience with database technologies such as SQL, NoSQL, Oracle, Hadoop, or Teradata. Ability to collaborate within and across teams of different technical knowledge to support delivery and educate end users on data products. Expert problem-solving skills, including debugging skills, allowing the …
data modelling, data performance tuning and data integration • Hands-on experience with physical and relational data modelling • Proven experience with ETL tools and API integration • Experience with relational databases (Teradata/Oracle) • Expert knowledge of writing complex SQL queries to pull and summarize large datasets, report creation and ad-hoc analyses • Advanced knowledge of OOP Python and Tableau (certification preferred …
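Postings like the one above repeatedly ask for "complex SQL queries to pull and summarize large datasets". As a hedged illustration of what that typically means in practice, the sketch below uses Python's built-in SQLite in place of Teradata/Oracle, with an invented `orders` table; the aggregate-filter-rank pattern is the same across engines:

```python
import sqlite3

# In-memory SQLite stands in for Teradata/Oracle here; the table and
# column names are invented purely for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (
        order_id   INTEGER PRIMARY KEY,
        region     TEXT,
        amount     REAL,
        order_date TEXT
    );
    INSERT INTO orders (region, amount, order_date) VALUES
        ('EMEA', 120.0, '2024-01-05'),
        ('EMEA',  80.0, '2024-01-09'),
        ('AMER', 250.0, '2024-01-07');
""")

# A summarization query of the kind such roles describe: aggregate per
# region, filter on the aggregate, and rank regions by total revenue.
rows = conn.execute("""
    SELECT region,
           COUNT(*)    AS order_count,
           SUM(amount) AS total_amount,
           AVG(amount) AS avg_amount
    FROM orders
    GROUP BY region
    HAVING SUM(amount) > 50
    ORDER BY total_amount DESC
""").fetchall()

for region, n, total, avg in rows:
    print(region, n, total, round(avg, 2))
```

On Teradata itself the same `GROUP BY` / `HAVING` structure applies; only the connection layer and dialect extras differ.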
and management of finance data entities, finance data models and attributes across the Finance systems landscape. Experience of working with existing finance logical data models from package vendors (e.g. Teradata and IBM) and new cloud technologies such as GCP, AWS and Azure. Understanding of the current technology and data challenges and trends in the Financial Services industry. Excellent communication and presentation …
Document database processes and provide training to technical staff as needed. Provide operational support, reviews, and health checks. Required Qualifications: Proficiency in SQL and experience with relational databases (e.g., Teradata, MySQL, SQL Server). Experience with cloud platforms (e.g., AWS RDS, Azure SQL, Google Cloud SQL). Strong understanding of data modeling, normalization, and schema design. Knowledge of DoD data …
tools like Power BI. Proficient in T-SQL and ability to design efficient queries with a focus on high-performing solutions. The above-mentioned skills, including Oil and Gas experience, Teradata, and Snowflake, are a strong plus. Senior level, typically 8+ years of experience. The ideal candidate will have at least 5 years of strong Power BI & SQL experience. Understands advanced aspects …
skills to query and analyze customer databases using SQL or Python. Experience with BI visualization tools and technologies such as Power BI & Business Objects. Understanding of database management systems (Teradata, SQL Server, Big Data) and the ETL (Extract, Transform, Load) framework. BS/BA in Information Systems, Statistics, Mathematics, Computer Science, Engineering or a relevant field is required. Master's degree in …
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Experis
infrastructure and/or network domains. * Strong understanding of enterprise networking, cloud platforms (e.g., Azure, AWS), and data center operations. * Familiarity with data warehouse technologies and architectures (e.g., Snowflake, Teradata, Azure Synapse). * Experience in documenting technical requirements, process flows, and system interactions. * Excellent stakeholder management and communication skills. * Ability to translate technical concepts into business-friendly language. Desirable Skills …
Schemas. • Data analysis and modeling tools (e.g. Power Designer, ERWin, ER/Studio). • SQL and/or PL/SQL expert. • RDBMS platforms (e.g. SQL Server, Oracle, Netezza, Teradata, DB2/UDB). • Microsoft Excel, Word, PowerPoint and Visio experience. Why should you work for Mindbank? Since 1986, Mindbank has helped hundreds of clients solve some of the …
and Schemas; • Data analysis and modeling tools (e.g. Power Designer, ERWin, ER/Studio); • SQL and/or PL/SQL expert; • RDBMS platforms (e.g. SQL Server, Oracle, Netezza, Teradata, DB2/UDB); • Microsoft Excel, Word, PowerPoint and Visio experience. Why should you work for Mindbank? Mindbank has been providing business solutions to Fortune 1000 companies and government agencies …
performance tuning. Strong data analysis skills. Strong skills in translating functional specifications and requirements into technical designs. Must-Have Skills: Informatica (Power Exchange, Power Center). Must have knowledge of Teradata and UNIX.
Morris Plains, New Jersey, United States Hybrid / WFH Options
Placement Services USA, Inc
scope of functional, regression and end-to-end testing; providing support to onshore and offshore teams; PBM Toolset; SQL; Jenkins; IBM Mainframe; z/OS; Windows; COBOL; JCL; DB2; Teradata; Endevor; Insync; SPUFI; TSO; ISPF; CA-7; PEGA; BTT; CRT; HP-ALM; File Master; and JIRA. Please copy and paste your resume into the email body (do not send attachments …
Role: Senior Data Engineer (Ab Initio, Teradata, Modern ETL) Location: Addison, TX Duration: Long Term Contract ONLY USC/GC Experienced in developing and supporting Ab Initio- and Teradata-based complex Data Warehousing applications, as well as the ability to upskill to open-source-based modern ETL technologies such as Python, Spark, and Java. Should have expertise in full SDLC for … techniques and with all components of Ab Initio such as Scan, Join, Partition by Key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, etc. 3. Experience and expertise in Teradata, PL/SQL and Unix shell scripting. 4. Financial services experience is a plus and data warehousing experience is required. 5. The candidate should have very robust technical experience … these 9. Work closely with Quality Control teams to deliver quality software to agreed project schedules. Looking for candidates who have experience in developing and supporting Ab Initio- and Teradata-based complex Data Warehousing applications, as well as the ability to upskill to open-source-based modern technologies. Should have expertise in full SDLC for Data Management apps and be …
Contract Opportunity: Data Analyst & Modeller - Teradata Platform Banking | Glasgow | Hybrid This role is for a Data Analyst & Modeller supporting a major data transformation programme for a financial services client. The successful candidate will play a key role in analysing existing data processes, tracing data lineage, and building models that support business and technical needs. Role details Title: Data Analyst & Modeller … Location: Glasgow (2-3 days on-site per week) Contract: Initial 12-month contract Requirements: Strong experience with Teradata SQL and BTEQ scripting Proven ability to perform data lineage and mapping Background in financial services or regulated industries Focus of the role: This role will support discovery and analysis activities across the client's data landscape. The successful candidate will interrogate … data using Teradata tools, collaborate with stakeholders to define data requirements, and document data flows and models with clarity and precision. This is a great opportunity to contribute to a long-term transformation programme in a high-impact environment. Please click here to find out more about our Key Information Documents. Please note that the documents provided contain generic information. …
MSP Owner: Shilpa Bajpai, Location: Cleveland, OH, Duration: 6 months, GBaMS ReqID: Role:, Exp: 8-10 Years, Skills: Big Data and Hadoop Ecosystems, Banking and Financial Technology, Ab Initio, Teradata, MySQL, Unix/Linux Basics and Commands. Lead and own all technical aspects of ETL projects from requirements through implementation. Strong technical skills in Ab Initio, UNIX shell scripting, Teradata … of hands-on experience with the Ab Initio ETL tool. Minimum 5 years of experience working in the Banking Domain. Basic knowledge of Python would be preferred. Extensively worked on Teradata or Hadoop as the database using Ab Initio as the ETL tool for large-scale data integration. Good understanding of data warehouse and metadata management concepts and tools. Good knowledge of establishing … both Onshore & Offshore. Experience in working on Agile projects. Should have good communication skills and be able to work with multiple vendors. Basics of Python. Essential Skills: Ab Initio, UNIX, Teradata, Hadoop, Google Big Query, Google Cloud Storage, Basics of data processing tools in GCP and Python Basics. Comments for Suppliers: Rate Details …
help the world’s leading organizations unlock the value of technology and build a more sustainable, more inclusive world. Your Role: Seeking a skilled Data Analyst with expertise in Teradata, Informatica ETL, and risk modeling using SAS and Python. Design and implement ETL workflows to manage data extraction, transformation, and loading into Teradata. Build and maintain risk models for credit … gather requirements and deliver data-driven solutions. Perform exploratory data analysis to uncover trends and support risk mitigation. Automate reporting and dashboard creation using Python and BI tools. Optimize Teradata queries and ETL performance for efficient data processing. Document data flows, model logic, and technical specifications for transparency. Ensure compliance with data governance and contribute to continuous improvement initiatives. Your … Profile: 3+ years of experience in data analysis and ETL development. Strong proficiency in Teradata SQL and Informatica PowerCenter. Experience building and validating risk models using SAS and Python. Solid understanding of statistical techniques and risk modeling frameworks. Familiarity with data governance and compliance standards. Excellent problem-solving and communication skills. ABOUT CAPGEMINI Capgemini is a global business and technology …
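The role above centres on ETL workflows: extracting, transforming, and loading risk data into Teradata via Informatica. A minimal sketch of that extract/transform/load pattern, assuming only Python's standard library with SQLite standing in for the Informatica/Teradata stack named in the posting (the feed, table, and risk-threshold values are invented for illustration):

```python
import csv
import io
import sqlite3

# Hypothetical source feed; in the posting's stack this would be an
# Informatica-managed extract bound for Teradata.
RAW_CSV = """customer_id,balance,segment
1, 2500.50 ,retail
2, 100.00 ,retail
3, 99000.00 ,corporate
"""

def extract(text):
    """Extract: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: trim whitespace, cast types, flag high-risk balances."""
    out = []
    for r in rows:
        balance = float(r["balance"].strip())
        out.append({
            "customer_id": int(r["customer_id"]),
            "balance": balance,
            "segment": r["segment"].strip(),
            "high_risk": balance > 50000,  # illustrative threshold only
        })
    return out

def load(rows, conn):
    """Load: write the transformed rows into the target table."""
    conn.execute("""CREATE TABLE IF NOT EXISTS customer_risk (
        customer_id INTEGER PRIMARY KEY,
        balance REAL, segment TEXT, high_risk INTEGER)""")
    conn.executemany(
        "INSERT INTO customer_risk "
        "VALUES (:customer_id, :balance, :segment, :high_risk)",
        rows,
    )

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
flagged = conn.execute(
    "SELECT customer_id FROM customer_risk WHERE high_risk = 1").fetchall()
print(flagged)
```

In a production Informatica/Teradata pipeline the same three stages exist, but extraction and loading are handled by mappings and connectors rather than hand-written code; the transform logic is what the analyst typically owns.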
Excel, PowerPoint, etc. Ability to guide multiple technology onsite/offshore team members. Candidates with strong business analyst experience will also be considered. Proficient in software development using SAS, Teradata and Unix scripting. Experience in Python, Spark and Hadoop will be an advantage.
glasgow, central scotland, united kingdom Hybrid / WFH Options
GIOS Technology
Job Title: Data Analyst & Modeller – Teradata/SQL/BTEQ/Data Mapping Location: Glasgow, UK (Hybrid, 2–3 days per week in office) Duration: Until 31/12/2026 Job Description: Conduct discovery analysis to evaluate and optimize existing data processes and structures. Perform data lineage, usage analysis, and data mapping to support transformation initiatives. Interrogate and validate … data using Teradata SQL and BTEQ scripts to generate actionable insights. Collaborate with stakeholders to translate business requirements into effective data models. Document data flows, relationships, and lineage with precision and clarity. Identify and resolve data quality issues and inconsistencies across large-scale environments. Key Skills: Teradata SQL, BTEQ, Data Mapping, Data Lineage, Data Modeling, Data Analysis, Data Quality …
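The data-lineage and data-mapping work described above can be sketched as a small graph walk: each target column records the source columns it is derived from, and a column's lineage is the transitive closure of those mappings. A minimal illustration (the table and column names are invented; a real exercise would harvest the mappings from Teradata DDL, BTEQ scripts, and ETL metadata):

```python
# Column-level mapping: target column -> columns it is derived from.
# All names here are hypothetical examples.
MAPPINGS = {
    "mart.customer_summary.total_spend": [
        "stage.orders.amount",
    ],
    "stage.orders.amount": [
        "src.pos_feed.txn_amount",
        "src.web_feed.txn_amount",
    ],
}

def lineage(column, mappings):
    """Return every upstream column feeding `column`, depth-first."""
    upstream = []
    for parent in mappings.get(column, []):
        upstream.append(parent)
        upstream.extend(lineage(parent, mappings))
    return upstream

result = lineage("mart.customer_summary.total_spend", MAPPINGS)
print(result)
```

Documenting lineage as structured data like this, rather than prose, makes the "data flows, relationships, and lineage" deliverable queryable and testable.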