=============================================================================================
SQL Transformation with Examples
Use: The SQL Transformation is a connected transformation used to process SQL queries in the midstream of a pipeline. We can insert, update, delete, and retrieve rows from the database at run time using the SQL transformation. Use the SQL transformation in script mode to run DDL (data definition language) statements such as creating or dropping tables.
The following SQL statements can be used in the SQL transformation:
- Data Definition statements (CREATE, ALTER, DROP, TRUNCATE, RENAME)
- Data Manipulation statements (INSERT, UPDATE, DELETE, MERGE)
- Data Retrieval statements (SELECT)
- Data Control Language statements (GRANT, REVOKE)
- Transaction Control statements (COMMIT, ROLLBACK)
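To make the most common categories concrete, here is a minimal sketch of a DDL, a DML, and a retrieval statement run in sequence, using the same table as the example in this article. Python's built-in sqlite3 stands in for the Oracle database used later, so the column types differ from Oracle's NUMBER/VARCHAR2:

```python
import sqlite3

# In-memory SQLite database as a portable stand-in for the Oracle
# target used in the article (NUMBER/VARCHAR2 map to INTEGER/TEXT).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Data Definition statement: create the table
cur.execute("CREATE TABLE create_emp_table (emp_id INTEGER, emp_name TEXT)")

# Data Manipulation statement: insert a row
cur.execute("INSERT INTO create_emp_table VALUES (1, 'abc')")

# Data Retrieval statement: read the row back
rows = cur.execute("SELECT emp_id, emp_name FROM create_emp_table").fetchall()
print(rows)  # [(1, 'abc')]
```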
Scenario: Let's say we want to create a temporary table in a mapping, while the workflow is running, for some intermediate calculation. We can use the SQL transformation in script mode to achieve this.
Below we will see how to create a SQL transformation in script mode with an example, where we will create a table in the mapping and insert some rows into that table.
Solution:
Step 1:
Create two text files in the $PMSourceFileDir directory with some SQL queries.
1. sql_script.txt
The file contains the SQL query below (a file can have multiple SQL queries, separated by semicolons):
create table create_emp_table
(emp_id number, emp_name varchar2(100))
2. sql_script2.txt
The file contains the SQL query below (a file can have multiple SQL queries, separated by semicolons):
insert into create_emp_table values
(1,'abc')
These are the script files to be executed by the SQL transformation on the database server.
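The script-mode behavior — one input row per script file name, and a PASSED or failure result per script after executing each semicolon-separated statement — can be sketched outside Informatica. This is only an illustration: sqlite3 stands in for the Oracle database, and a temp directory stands in for $PMSourceFileDir:

```python
import os
import sqlite3
import tempfile

def run_script(path, conn):
    """Mimic SQL transformation script mode: execute every semicolon-
    separated statement in the file; return 'PASSED' or the error text."""
    with open(path) as f:
        statements = [s.strip() for s in f.read().split(";") if s.strip()]
    try:
        for stmt in statements:
            conn.execute(stmt)
        conn.commit()
        return "PASSED"
    except Exception as exc:
        return "FAILED: %s" % exc

# Recreate the two script files from Step 1 in a temp directory
# (SQLite types here, since sqlite3 stands in for Oracle).
tmpdir = tempfile.mkdtemp()
scripts = {
    "sql_script.txt": "create table create_emp_table "
                      "(emp_id integer, emp_name text)",
    "sql_script2.txt": "insert into create_emp_table values (1,'abc')",
}
for name, sql in scripts.items():
    with open(os.path.join(tmpdir, name), "w") as f:
        f.write(sql)

conn = sqlite3.connect(":memory:")
results = [run_script(os.path.join(tmpdir, name), conn)
           for name in ("sql_script.txt", "sql_script2.txt")]
print(results)  # ['PASSED', 'PASSED']
```

Running the scripts in order matters: the insert in sql_script2.txt fails unless the create in sql_script.txt has already executed against the same connection.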
Step 2:
We need a source which contains the above script file names with their complete paths. So I created another file, Sql_script_input.txt, in the $PMSourceFileDir directory to store these script file names.
The file contains the list of script files with their complete paths:
E:\softs\Informatica\server\infa_shared\SrcFiles\sql_script.txt
E:\softs\Informatica\server\infa_shared\SrcFiles\sql_script2.txt
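The list file can also be generated programmatically. A minimal sketch (a temp directory stands in for $PMSourceFileDir; the file names are the ones from the article):

```python
import os
import tempfile

# Stand-in for the $PMSourceFileDir directory from the article
src_dir = tempfile.mkdtemp()
script_names = ["sql_script.txt", "sql_script2.txt"]

# Each line of the input file is one script name with its complete path
list_path = os.path.join(src_dir, "Sql_script_input.txt")
with open(list_path, "w") as f:
    for name in script_names:
        f.write(os.path.join(src_dir, name) + "\n")

with open(list_path) as f:
    lines = f.read().strip().split("\n")
print(lines)
```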
Step 3:
Now we will create a mapping to execute the script files using the SQL transformation.
Go to the mapping designer tool, open the source analyzer, and choose Import from File
=> then create the source definition by selecting the file Sql_script_input.txt located at E:\softs\Informatica\server\infa_shared\SrcFiles.
The source definition will look like this:
Similarly, create a target definition: go to the target designer and create a target flat file with result and error ports. This is shown in the image below.
Step 4:
- Go to the mapping designer and create a new mapping.
- Drag the flat file into the mapping designer.
- Go to Transformation in the toolbar, click Create, select the SQL transformation, enter a name and click Create.
- Now select the SQL transformation mode as script mode, set the DB type to Oracle, and click OK.
- The SQL transformation is created with the default ports.
- Now connect the source qualifier transformation ports to the SQL transformation input port.
- Drag the target flat file into the mapping and connect the SQL transformation output ports to the target.
- Save the mapping. The mapping flow image is shown in the below picture.
- Go to the workflow manager; create a new workflow and session.
- Edit the session. For the source and target, enter the source and target file directories.
- For the SQL transformation, enter the Oracle database relational connection as shown below.
- Save the workflow and run it.
- Open the target file; you will find the below data.
"PASSED";
"PASSED";
"PASSED"; -- for sql_script.txt, where it creates the table, and
"PASSED"; -- for sql_script2.txt, where it inserts rows into the table
Fire a select query on the database to check whether the table was created or not.
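That check can be scripted too. In Oracle you would typically query the USER_TABLES data dictionary view; the sketch below uses sqlite3's catalog table as a portable stand-in (the CREATE/INSERT lines simply recreate what the workflow did):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Recreate what the two script files did (SQLite types as a stand-in)
conn.execute("CREATE TABLE create_emp_table (emp_id INTEGER, emp_name TEXT)")
conn.execute("INSERT INTO create_emp_table VALUES (1, 'abc')")

# Oracle equivalent of this existence check:
#   SELECT COUNT(*) FROM user_tables WHERE table_name = 'CREATE_EMP_TABLE';
exists = conn.execute(
    "SELECT COUNT(*) FROM sqlite_master "
    "WHERE type='table' AND name='create_emp_table'"
).fetchone()[0]
print(exists)  # 1
```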
=============================================================================