
Dynamically generate parameter files

Scenario:

Dynamically generate PowerCenter parameter files from parameter values maintained in database tables.

Solution:

Parameter file format for PowerCenter:

For a workflow-level parameter, which can be used by any session in the workflow, the parameter file must follow this format:

[Folder_name.WF:Workflow_Name]
$$parameter_name1=value
$$parameter_name2=value

For a session-level parameter, which can be used only by that particular session, the parameter file must follow this format:

[Folder_name.WF:Workflow_Name.ST:Session_Name]
$$parameter_name1=value
$$parameter_name2=value
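
For illustration, a complete file containing both section types might look like this (the folder, workflow, session, and parameter names here are hypothetical):

[FIN_DWH.WF:wf_daily_load]
$$LOAD_DATE=2012-01-01

[FIN_DWH.WF:wf_daily_load.ST:s_m_stage_orders]
$$SOURCE_FILE=/data/inbound/orders.dat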


Parameter handling in a data model:

• To have flexibility in maintaining the parameter files.

• To reduce the overhead for the support team of changing the parameter file every time a parameter value changes.

• To ease deployment.

To meet these goals, all the parameters are maintained in Oracle (or any other database) tables, and a PowerCenter session is created to generate the parameter file in the required format automatically.


For this, four tables need to be created in the database:

1. FOLDER table will have one entry for each folder.

2. WORKFLOWS table will have the list of workflows, each with a reference to the FOLDER table indicating which folder the workflow is created in.

3. PARAMETERS table will hold all the parameter names, irrespective of folder/workflow.

4. PARAMETER_VALUES table will hold the parameter value for each session, with references to the PARAMETERS table for the parameter name and the WORKFLOWS table for the workflow name. When SESSION_NAME is NULL, the parameter is a workflow-level parameter that can be used across all the sessions in the workflow (see the sample rows below).
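
For illustration, here are two hypothetical PARAMETER_VALUES rows, one workflow-level (SESSION_NAME is NULL) and one session-level:

WFW_ID  PMR_ID  LOGICAL_NAME  VALUE                 SESSION_NAME
10      1       NULL          2012-01-01            NULL
10      2       FX            /data/inbound/fx.dat  s_m_stage_fx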

Because the PARAMETER_VALUES table holds only the ID columns of the workflow and the parameter, we create a view that resolves all the names into the required parameter file format. Below is the DDL for the view.


a. Parameter file view:


CREATE OR REPLACE VIEW PARAMETER_FILE
(
HEADER,
DETAIL
)
AS
select '[' || fol.folder_name || '.WF:' || wfw.workflow_name || ']' header
,pmr.parameter_name || nvl2(dtl.logical_name, '_' || dtl.logical_name, NULL) || '=' ||
dtl.value detail
from folder fol
,parameters pmr
,WORKFLOWS wfw
,PARAMETER_VALUES dtl
where fol.id = wfw.folder_id
and dtl.pmr_id = pmr.id
and dtl.wfw_id = wfw.id
and dtl.session_name is null
UNION
select '[' || fol.folder_name || '.WF:' || wfw.workflow_name || '.ST:' || dtl.session_name || ']' header
,decode(dtl.mapplet_name, NULL, NULL, dtl.mapplet_name || '.') ||
pmr.parameter_name || nvl2(dtl.logical_name, '_' || dtl.logical_name, NULL) || '=' || dtl.value detail
from folder fol
,parameters pmr
,WORKFLOWS wfw
,PARAMETER_VALUES dtl
where fol.id = wfw.folder_id
and dtl.pmr_id = pmr.id
and dtl.wfw_id = wfw.id
and dtl.session_name is not null;
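
With hypothetical data in the four tables, querying the view returns one row per parameter. The HEADER value repeats for every parameter in a section, so the file-writing mapping only needs to suppress consecutive duplicate headers:

HEADER                                       DETAIL
[FIN_DWH.WF:wf_daily_load]                   $$LOAD_DATE=2012-01-01
[FIN_DWH.WF:wf_daily_load.ST:s_m_stage_fx]   $$SOURCE_FX=/data/inbound/fx.dat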

b. FOLDER table
ID (NUMBER)
FOLDER_NAME (VARCHAR2(50))
DESCRIPTION (VARCHAR2(50))

c. WORKFLOWS table
ID (NUMBER)
WORKFLOW_NAME (VARCHAR2(50))
FOLDER_ID (NUMBER) Foreign Key to FOLDER.ID
DESCRIPTION (VARCHAR2(50))

d. PARAMETERS table
ID (NUMBER)
PARAMETER_NAME (VARCHAR2(50))
DESCRIPTION (VARCHAR2(50))

e. PARAMETER_VALUES table
ID (NUMBER)
WFW_ID (NUMBER) Foreign Key to WORKFLOWS.ID
PMR_ID (NUMBER) Foreign Key to PARAMETERS.ID
MAPPLET_NAME (VARCHAR2(50))
LOGICAL_NAME (VARCHAR2(50))
VALUE (VARCHAR2(50))
SESSION_NAME (VARCHAR2(50))
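
As a minimal sketch, the corresponding Oracle DDL could look like the following (the constraint style and column sizes are assumptions):

CREATE TABLE FOLDER (
  ID          NUMBER PRIMARY KEY,
  FOLDER_NAME VARCHAR2(50),
  DESCRIPTION VARCHAR2(50)
);

CREATE TABLE WORKFLOWS (
  ID            NUMBER PRIMARY KEY,
  WORKFLOW_NAME VARCHAR2(50),
  FOLDER_ID     NUMBER REFERENCES FOLDER(ID),
  DESCRIPTION   VARCHAR2(50)
);

CREATE TABLE PARAMETERS (
  ID             NUMBER PRIMARY KEY,
  PARAMETER_NAME VARCHAR2(50),
  DESCRIPTION    VARCHAR2(50)
);

CREATE TABLE PARAMETER_VALUES (
  ID           NUMBER PRIMARY KEY,
  WFW_ID       NUMBER REFERENCES WORKFLOWS(ID),
  PMR_ID       NUMBER REFERENCES PARAMETERS(ID),
  MAPPLET_NAME VARCHAR2(50),
  LOGICAL_NAME VARCHAR2(50),
  VALUE        VARCHAR2(50),
  SESSION_NAME VARCHAR2(50)
);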

• LOGICAL_NAME is a normalization initiative in the above parameter logic. For example, if a mapping needs $$SOURCE_FX as a parameter and also $$SOURCE_TRANS as another mapping parameter, instead of creating two different parameters in the PARAMETERS table we create one parameter, $$SOURCE; FX and TRANS then become two LOGICAL_NAME records in the PARAMETER_VALUES table, as shown below.
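
With hypothetical file paths as values, the single $$SOURCE parameter with LOGICAL_NAME records FX and TRANS would produce these two lines in the parameter file:

$$SOURCE_FX=/data/inbound/fx.dat
$$SOURCE_TRANS=/data/inbound/trans.dat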

• m_PARAMETER_FILE is the mapping that creates the parameter file in the desired format and the corresponding session name is s_m_PARAMETER_FILE.
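
The internals of m_PARAMETER_FILE are not shown here; as a minimal sketch under that assumption, its Source Qualifier could simply read the view ordered by header, and an Expression transformation could emit each HEADER only when it differs from the previous row, so every section header is written once, followed by its parameter lines:

select header, detail
from parameter_file
order by header, detail;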



