Scenario:

Load session statistics such as the session start and end time, success rows, failed rows, and rejected rows into a database table for audit/logging purposes.

Solution:


After performing the solution steps below, your final workflow will look like this:

START => SESSION1 => ASSIGNMENT TASK => SESSION2

SOLUTION STEPS

SESSION1
This session performs your actual business logic, i.e. the actual data load. It can be any combination of sources and targets: File to Table, Table to Table, Table to File, or File to File.

WORKFLOW VARIABLES

Create the following workflow variables.

=> $$WorkflowName
=> $$SessionStartTime
=> $$SessionEndTime
=> $$TargetSuccessRows
=> $$TargetFailedRows
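
When you create these variables in the Workflow Manager (Workflow > Edit > Variables tab), a reasonable choice of datatypes is shown below; this is only a suggestion, adjust it to your needs:

$$WorkflowName - nstring
$$SessionStartTime - date/time
$$SessionEndTime - date/time
$$TargetSuccessRows - integer
$$TargetFailedRows - integer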


ASSIGNMENT TASK

Use the Expressions tab of the Assignment task and assign as follows:

$$WorkflowName = $PMWorkflowName
$$SessionStartTime = $SESSION1.StartTime
$$SessionEndTime = $SESSION1.EndTime
$$TargetSuccessRows = $SESSION1.TgtSuccessRows
$$TargetFailedRows = $SESSION1.TgtFailedRows
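
The scenario also mentions rejected rows. If you want to capture source-side or rejected-row counts as well, PowerCenter exposes further pre-defined session variables that can be assigned the same way, provided you first create matching workflow variables (the $$Source... and $$TotalTransErrors names below are only illustrative):

$$SourceSuccessRows = $SESSION1.SrcSuccessRows
$$SourceFailedRows = $SESSION1.SrcFailedRows
$$TotalTransErrors = $SESSION1.TotalTransErrors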

SESSION2

This session is used to load the session statistics into a database table.
=> This session should call a mapping, say ‘m_sessionLog’.
=> The mapping m_sessionLog should have mapping variables corresponding to the workflow variables defined above, e.g. $$wfname, $$Stime, $$Etime, $$TSRows and $$TFRows.
=> The mapping m_sessionLog should use a dummy source, an Expression transformation, and a target (the database audit table).
=> Inside the Expression transformation, assign the mapping variables to the output ports:
workflowname=$$wfname
starttime=$$Stime
endtime=$$Etime
SuccessRows=$$TSRows
FailedRows=$$TFRows
=> Create a target database table with the following columns (a DDL sketch is shown after this list):
workflow name, start time, end time, success rows and failed rows.
=> Connect all the required output ports to the target, which is your audit table.
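
A minimal DDL sketch for the audit table is given below. The table name WF_SESSION_AUDIT, the column names and the Oracle-style datatypes are only illustrative; adapt them to your database and naming standards.

CREATE TABLE WF_SESSION_AUDIT
(
    WORKFLOW_NAME   VARCHAR2(100),  -- populated from $$wfname
    START_TIME      DATE,           -- populated from $$Stime
    END_TIME        DATE,           -- populated from $$Etime
    SUCCESS_ROWS    NUMBER(10),     -- populated from $$TSRows
    FAILED_ROWS     NUMBER(10)      -- populated from $$TFRows
);
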
PRE-SESSION VARIABLE ASSIGNMENT
=> In SESSION2, on the Pre-session variable assignment tab (under the session's Components tab), assign each mapping variable to its corresponding workflow variable.
=> In our case:
$$wfname=$$WorkflowName
$$Stime=$$SessionStartTime
$$Etime=$$SessionEndTime
$$TSRows=$$TargetSuccessRows
$$TFRows=$$TargetFailedRows

Workflow Execution
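
After running the workflow, a quick way to confirm that the statistics were captured is to query the audit table (using the illustrative table name from the sketch above):

SELECT * FROM WF_SESSION_AUDIT ORDER BY START_TIME DESC;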

