
PowerCenter objects – Introduction

• A repository is the highest physical entity of a project in PowerCenter.

• A folder is a logical entity in a PowerCenter project. For example, Customer_Data is a folder.

• A workflow is analogous to a set of programs in any other programming language: it ties sessions and other tasks together into a unit that can be scheduled and executed (see the pmcmd sketch after this list).

• A mapping is a single program unit that holds the logical mapping between source and target, along with the required transformations. A mapping only states, for example, that a source table named EMP exists with some structure and that a target flat file named EMP_FF exists with some structure. The mapping does not say in which schema the EMP table lives or in which physical location the EMP_FF file is going to be stored.

• A session is the physical representation of a mapping. The session defines what the mapping did not: where the EMP table comes from, in which schema it resides, and the username and password needed to access it. It also specifies the target flat file, including the physical location where the file is going to be created.

• A transformation is a sub-program that performs a specific task on the input it receives and returns some output. It can be thought of as a stored procedure in a database. Typical examples of transformations are Filter, Lookup, Aggregator and Sorter (see the small expression sketch after this list).

• A set of reusable transformations can be built into something called a mapplet. A mapplet is a set of transformations arranged in a specific order of execution.
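
To make the workflow bullet above concrete: a workflow is typically started from the Workflow Manager or Workflow Monitor, or from the command line with the pmcmd utility. The command below is only a minimal sketch; apart from the Customer_Data folder name used earlier, the Integration Service, domain, user and workflow names are placeholders, not values from this post.

    pmcmd startworkflow -sv IS_DEV -d Domain_Dev -u etl_user -p etl_password -f Customer_Data wf_load_customers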
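
Similarly, for the transformation bullet: each transformation is configured with properties or expressions rather than full code. The two lines below are hypothetical examples written in the PowerCenter expression language (IIF and ISNULL are built-in functions); the port names SAL, COMM, DEPTNO and TOTAL_COMP are assumptions used purely for illustration.

    Filter condition:        SAL > 2000 AND DEPTNO = 10
    Expression output port:  TOTAL_COMP = SAL + IIF(ISNULL(COMM), 0, COMM)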

As with any other tool or programming language, PowerCenter also allows parameters to be passed in, so that flexibility is built into the flow. Parameters are passed to PowerCenter as data in flat files, and such a file is called a parameter file.
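
As a minimal sketch of a parameter file: it is a plain text file in which section headers scope the parameters to a folder, workflow or session; mapping parameters and variables begin with $$, while session parameters such as connection or file names begin with a single $. The folder name Customer_Data and the file name EMP_FF come from the examples above, but the workflow, session, parameter and connection names below are placeholders.

    [Customer_Data.WF:wf_load_customers.ST:s_m_load_customers]
    $DBConnection_Source=ORA_EMP_DEV
    $OutputFileName_EMP=EMP_FF.dat
    $$LOAD_DATE=2020-01-01

The parameter file path is then supplied in the workflow or session properties, or with the -paramfile option of pmcmd, so that the Integration Service can resolve these values at run time.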
