
Need to get the latest ID

Scenario:

We have data coming from the source as below.
The source is an Oracle database:

OLD_ID       NEW_ID      
---------- ----------
101        102       
102        103       
103        104       
105        106       
106        108
We need the output as below.
OLD_ID     NEW_ID
---------- ------------
101        104         
102        104         
103        104         
105        108         
106        108         


Can anyone help me to do this in Informatica?

Solution:

Mapping design:

                +--> Exp2 --+
Sq --> Exp1 --> |           +--> Jnr --> TGT
                +--> Agg  --+

Explanation:

In Exp1, add two variable ports as shown below:

OLD_ID     NEW_ID     Diff_of_rows       New_id
---------- ---------- ------------------ ----------
101        102        1 (first row)      1
102        103        1 (102-101)        1
103        104        1 (103-102)        1
105        106        2 (105-103)        2
106        108        1 (106-105)        2

Diff_of_rows - maintain the previous row's OLD_ID in an expression variable port, then subtract it from the current row's OLD_ID.

New_id - start with 1; whenever Diff_of_rows is greater than 1 (i.e., there is a gap between the current row's OLD_ID and the previous row's OLD_ID), increment New_id by 1. Rows with consecutive OLD_ID values therefore share the same New_id group number.
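The variable-port logic above can be sketched in Python (this is an illustrative simulation, not Informatica expression syntax; `prev_old` and `grp` play the role of the two variable ports):

```python
# Input rows from the source: (OLD_ID, NEW_ID)
rows = [(101, 102), (102, 103), (103, 104), (105, 106), (106, 108)]

out = []
prev_old = None   # variable port holding the previous row's OLD_ID
grp = 0           # variable port holding the running New_id group number
for old_id, new_id in rows:
    # Diff_of_rows: current OLD_ID minus previous OLD_ID (1 for the first row)
    diff = 1 if prev_old is None else old_id - prev_old
    # A gap in OLD_ID (diff > 1) starts a new group
    if prev_old is None or diff > 1:
        grp += 1
    out.append((old_id, new_id, diff, grp))
    prev_old = old_id

for r in out:
    print(r)
# (101, 102, 1, 1)
# (102, 103, 1, 1)
# (103, 104, 1, 1)
# (105, 106, 2, 2)
# (106, 108, 1, 2)
```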

Then send the below rows to Exp2:

OLD_ID     NEW_ID     New_id
---------- ---------- ----------
101        102        1
102        103        1
103        104        1
105        106        2
106        108        2

In the Agg, group by New_id and take the MAX (i.e., the last) NEW_ID of each group, so the Agg o/p is:

NEW_ID     New_id
---------- ----------
104        1
108        2
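A Python sketch of the Aggregator step (assuming the Exp2 output rows above; the `max` call stands in for the Aggregator's MAX(NEW_ID) grouped by New_id):

```python
# Exp2 output rows: (OLD_ID, NEW_ID, New_id)
exp2_rows = [(101, 102, 1), (102, 103, 1), (103, 104, 1),
             (105, 106, 2), (106, 108, 2)]

# Group by New_id and keep the maximum NEW_ID per group
agg = {}
for old_id, new_id, grp in exp2_rows:
    agg[grp] = max(new_id, agg.get(grp, new_id))

print(agg)  # {1: 104, 2: 108}
```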

Then join the Exp2 o/p with the Agg o/p on the New_id column, and you will get the required o/p:

OLD_ID     NEW_ID
---------- ------------
101        104         
102        104         
103        104         
105        108         
106        108         
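The Joiner step can be sketched the same way (a simulation, not Joiner configuration; it assumes the Exp2 rows and the aggregated lookup from the previous steps):

```python
# Exp2 output: (OLD_ID, NEW_ID, New_id); Agg output: New_id -> MAX(NEW_ID)
exp2_rows = [(101, 102, 1), (102, 103, 1), (103, 104, 1),
             (105, 106, 2), (106, 108, 2)]
agg_rows = {1: 104, 2: 108}

# Join on the New_id group column: each OLD_ID gets its group's latest NEW_ID
target = [(old_id, agg_rows[grp]) for old_id, _, grp in exp2_rows]

print(target)
# [(101, 104), (102, 104), (103, 104), (105, 108), (106, 108)]
```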
