Informatica Case study

Scenario:

Data has to be moved from a legacy application to Siebel staging tables (EIM). The client will provide the data in a delimited flat file. This file contains Contact records which need to be loaded into the EIM_CONTACT table.
Some facts
A contact can be uniquely identified by concatenating the first name, last name, and ZIP code.
Known issues
A potential problem with the load is the telephone number, which is currently stored as a string in XXX-YYY-ZZZZ format. We need to convert this into (XXX) YYYZZZZ format, where XXX is the area code in parentheses, followed by a space and the 7-digit telephone number. Any extensions should be dropped.
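In Informatica this reformatting would typically be done in an Expression transformation (e.g. with SUBSTR and the || concatenation operator). The logic itself can be sketched in Python; the function name and the fallback behavior for unparseable values are assumptions, not part of the case study:

```python
import re

def format_phone(raw: str) -> str:
    """Convert XXX-YYY-ZZZZ (optionally followed by an extension)
    to (XXX) YYYZZZZ. Anything after the 10th digit is dropped.

    Unparseable values are returned unchanged so they can be routed
    to error handling instead of being silently mangled (an assumption,
    since the case study does not specify this).
    """
    m = re.match(r"^(\d{3})-(\d{3})-(\d{4})", raw.strip())
    if not m:
        return raw
    area, prefix, line = m.groups()
    return f"({area}) {prefix}{line}"

# Extensions after the base number are dropped:
print(format_phone("415-555-1234 x99"))  # (415) 5551234
```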
Requirements
The load should have a batch number of 100. If the record count exceeds 500, increment the batch number by 5.
Since the flat file may contain duplicates caused by letter-case differences, it has been decided that all user keys on the table should be stored in uppercase. For uniformity's sake, the first name and last name should also be loaded in uppercase.
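The two requirements above can be sketched as plain functions. This is a minimal illustration, not the Informatica implementation itself: the batch-number rule is read literally (a single +5 once the count exceeds 500; the case study does not say whether it increments again per further block of 500), and the function and parameter names are assumptions:

```python
def batch_number(record_count: int) -> int:
    """Batch number is 100; add 5 once the record count exceeds 500.

    Literal reading of the requirement -- a single increment, not one
    per additional 500 rows.
    """
    return 100 if record_count <= 500 else 105

def user_key(first_name: str, last_name: str, zip_code: str) -> str:
    """User key: first name + last name + ZIP, stored in uppercase
    to avoid case-based duplicates."""
    return (first_name + last_name + zip_code).upper()

print(batch_number(400))                    # 100
print(batch_number(600))                    # 105
print(user_key("John", "Doe", "94107"))     # JOHNDOE94107
```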
Error logging
As per the client's IT standards, it is expected that any data migration run will produce an automated high-level report (a flat-file report is acceptable) showing how many records were read, how many loaded successfully, and how many failed due to errors.
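The report format the client expects might look like the sketch below. In practice the counts would come from the Informatica session statistics (e.g. via a post-session task or the repository); the file layout, path, and function name here are assumptions for illustration:

```python
import os
import tempfile

def write_load_report(path: str, read_count: int,
                      loaded_count: int, failed_count: int) -> None:
    """Write a minimal flat-file load summary: records read,
    loaded successfully, and failed with errors."""
    with open(path, "w") as f:
        f.write("Data Migration Load Report\n")
        f.write(f"Records read   : {read_count}\n")
        f.write(f"Records loaded : {loaded_count}\n")
        f.write(f"Records failed : {failed_count}\n")

# Hypothetical run: 1000 rows read, 990 loaded, 10 rejected
report_path = os.path.join(tempfile.gettempdir(), "load_report.txt")
write_load_report(report_path, 1000, 990, 10)
```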

Output expected from case study:
  1. Informatica mapping from flat file to EIM_CONTACT table
  2. Log file created for error logging

