
SQL Transformation with examples


=============================================================================================


Use: The SQL transformation is a connected transformation used to process SQL queries in the midstream of a pipeline. We can insert, update, delete, and retrieve rows from the database at run time using the SQL transformation. Use the SQL transformation in script mode to run DDL (Data Definition Language) statements such as creating or dropping tables.

The following SQL statements can be used in the SQL transformation.
  • Data Definition Language statements (CREATE, ALTER, DROP, TRUNCATE, RENAME)
  • Data Manipulation Language statements (INSERT, UPDATE, DELETE, MERGE)
  • Data Retrieval statements (SELECT)
  • Data Control Language statements (GRANT, REVOKE)
  • Transaction Control statements (COMMIT, ROLLBACK)
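For illustration only (the table employees and the user report_user below are made up for this example), one statement from each category might look like:

create table employees (emp_id number, emp_name varchar2(100))    -- DDL
insert into employees values (1, 'abc')                           -- DML
select emp_id, emp_name from employees                            -- retrieval
grant select on employees to report_user                          -- DCL
commit                                                            -- transaction control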

Scenario: Let’s say we want to create a temporary table in a mapping while the workflow is running, for some intermediate calculation. We can use the SQL transformation in script mode to achieve this.
Below we will see how to create an SQL transformation in script mode with an example in which we create a table from within the mapping and insert some rows into it.

Solution:

Step 1:
Create two text files in the $PMSourceFileDir directory containing some SQL queries.
1.      sql_script.txt
The file contains the SQL query below (a file can have multiple SQL queries separated by semicolons):

create table create_emp_table
(emp_id number,emp_name varchar2(100))

2.      sql_script2.txt
The file contains the SQL query below (again, multiple queries separated by semicolons are allowed):

       insert into create_emp_table values (1,'abc')

These are the script files that the SQL transformation will execute on the database server.
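Since a script file can hold multiple queries separated by semicolons, the two steps above could equally be combined into a single file (a hypothetical sql_script_combined.txt):

create table create_emp_table
(emp_id number,emp_name varchar2(100));
insert into create_emp_table values (1,'abc')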

Step 2:
We need a source that contains the above script file names with their complete paths.
So, I created another file, Sql_script_input.txt, in the $PMSourceFileDir directory to store these script file names.

The file contains the list of script files with their complete paths:
E:\softs\Informatica\server\infa_shared\SrcFiles\sql_script.txt
E:\softs\Informatica\server\infa_shared\SrcFiles\sql_script2.txt

Step 3:
Now we will create a mapping to execute the script files using the SQL transformation.

Go to the Mapping Designer tool, open the Source Analyzer, and choose Import from File.
Then create the source definition by selecting the file Sql_script_input.txt located at E:\softs\Informatica\server\infa_shared\SrcFiles.
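The imported source definition is just a flat file with a single string port that carries one script file path per record; a minimal sketch (the port name and precision are assumptions) is:

Sql_script_input (flat file source)
  ScriptFileName    string (200)    -- one script file path per record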





Similarly, create a target definition: go to the Target Designer and create a target flat file with result and error ports.
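A minimal sketch of such a target definition (port names and precisions are assumptions) is:

sql_script_output (flat file target)
  ScriptResult    string (10)      -- PASSED / FAILED returned for each script
  ScriptError     string (4000)    -- database error text, if any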



Step 4:
  • Go to the mapping designer and create a new mapping.
  • Drag the flat file into the mapping designer.
  • Go to Transformation in the toolbar, click Create, select the SQL transformation, enter a name, and click Create.
  • Now set the SQL transformation mode to Script mode and the DB type to Oracle, and click OK.

  • The SQL transformation is created with the default ports (a sketch of these ports follows this list).
  • Now connect the source qualifier transformation ports to the SQL transformation input port.
  • Drag the target flat file into the mapping and connect the SQL transformation output ports to the target.
  • Save the mapping. The mapping flow is: flat file source => Source Qualifier => SQL transformation => flat file target.
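For reference, a sketch of the default ports created in script mode (check your PowerCenter version for exact names and precisions):

ScriptName      input     string    -- receives the script file path from the source
ScriptResult    output    string    -- returns PASSED or FAILED for each script
ScriptError     output    string    -- returns any database error raised by the script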


  • Go to the workflow manager; create a new workflow and session.
  • Edit the session. Enter the source and target file directories.
  • For the SQL transformation, set the Oracle database relational connection.
  • Save the workflow and run it.
  • Open the target file; you will find the data below.
"PASSED";
"PASSED";

"PASSED"; -: for sql_script.txt, where it will crate the table and 
"PASSED"; -: For sql_scriptw.txt, where it will insert rows in to the table

Fire a SELECT query on the database to check whether the table was created.
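For example, the following query should return the single row inserted by sql_script2.txt if both scripts ran successfully (expected output sketched below):

SQL> select * from create_emp_table;

    EMP_ID EMP_NAME
---------- --------------------
         1 abc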



