
Informatica Training Effectiveness Assessment


Trainee:

Trainer:

Date of training:


1.       Informatica is an ETL tool where ETL stands for
a.       Extract – Transform – Load
b.       Evaluate – Transform – Load
c.       Extract – Test – Load
d.       Evaluate – Test – Load

2.       Informatica allows for the following:
a.       One source – multiple targets to be loaded within the same mapping
b.       Multiple source – multiple targets to be loaded within the same mapping
c.       Multiple sources – single target to be loaded within the same mapping
d.       Multiple sources – multiple targets to be loaded provided mapplets are used within the mapping

3.       The ____ manages the connections to the repository from the Informatica client application
a.       Repository Server
b.       Informatica Server
c.       Informatica Repository Manager
d.       Both a & b

4.       During the development phase, which type of tracing level is it best to use to debug errors?
a.       Terse tracing
b.       Verbose tracing
c.       Verbose data tracing
d.       Normal tracing

5.       During Informatica installation, what is the installation sequence?
a.       Informatica Client, Repository Server, Informatica Server
b.       Informatica Server, Repository Server, Informatica Client
c.       Repository Server, Informatica Server, Informatica Client
d.       Any of the above is fine; however, to create the repository we need the Informatica client installed and the Repository Server process running

6.       There is a requirement to concatenate the first name and last name from a flat file and use this concatenated value at 2 locations in the target table. The best way to achieve this functionality is by using the
a.       Expression transformation
b.       Filter transformation
c.       Aggregator transformation
d.       Character transformation
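
A minimal sketch of the kind of port expression that would handle this concatenation in an Expression transformation (the port names FIRST_NAME, LAST_NAME, and o_FULL_NAME are hypothetical):

    -- Expression for output port o_FULL_NAME (String); this single output
    -- port can then be connected to both target columns that need the value.
    FIRST_NAME || ' ' || LAST_NAME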

7.       The workflow monitor does not allow the user to edit workflows.
a.       True
b.       False

8.       There is a requirement to increment a batch number by one for every 5000 records that are loaded. The best way to achieve this is:
a.       Use a mapping parameter in the session
b.       Use a mapping variable in the session
c.       Store the batch information in the workflow manager
d.       Write code in a transformation to update values as required
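
As a rough sketch of how a mapping variable can be made to tick over every 5000 rows (assuming a mapping variable $$BATCH_NO with Max aggregation and a self-referencing counter port v_ROW_COUNT in an Expression transformation; all names are hypothetical):

    -- variable port v_ROW_COUNT: running count of rows seen so far
    v_ROW_COUNT + 1
    -- variable port v_BATCH_NO: bump the mapping variable once every 5000 rows
    IIF(MOD(v_ROW_COUNT, 5000) = 0, SETVARIABLE($$BATCH_NO, $$BATCH_NO + 1), $$BATCH_NO)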

9.       There is a requirement to reuse some complex logic across 3 mappings. The best way to achieve this is:
a.       Create a mapplet to encapsulate the reusable functionality and call this in the 3 mappings
b.       Create a worklet and reuse this at the session level during execution of the mapping
c.       Cut and paste the code across the 3 mappings
d.       Keep this functionality as a separate mapping and call this mapping in 3 different mappings – this would make the code modular and reusable

10.   You imported a delimited flat file “ABC.TXT” from your workstation into the Source Qualifier in the Informatica client. You then proceeded to develop a mapping and validated it for correctness using the “Validate” function. You then set it up for execution in the Workflow Manager. When you execute the mapping, you get an error stating that the file was not found. The most probable cause of this error is:
a.       Your mapping is not correct and the file is not being parsed correctly by the source qualifier
b.       The file cannot be loaded from your workstation; it has to be on the server
c.       Informatica did not have access to the NT directory on your workstation where the file is stored
d.       You forgot to mention the location of the file in the workflow properties and hence the error

11.   Various administrative functions such as folder creation and user access control are done using:
a.       Informatica Administration console
b.       Repository Manager
c.       Informatica Server
d.       Repository Server

12.   You created a mapping a few months back that is now invalid because the database schema was updated with new columns. In order to fix the problem, you would:
a.       Re-import the table definitions from the database
b.       Make the updates to the table structure manually in the mapping
c.       Informatica detects updates to table structures automatically; all you have to do is click the “Validate” option for the mapping
d.       None of the above. The mapping has to be scrapped and a new one needs to be created

13.   The parameter file is used to store the following information
a.       Workflow parameters, session parameters, mapping parameters and variables
b.       Workflow variables, session variables, mapping variables
c.       Mapping parameters, session constants, workflow constants.
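
For reference, a parameter file is a plain text file read at run time; a hedged example of its layout (the folder, workflow, session, and parameter names below are made up):

    [MyFolder.WF:wf_load_customers.ST:s_m_load_customers]
    $$BATCH_NO=1
    $$SOURCE_SYSTEM=CRM
    $InputFile_customers=/data/in/customers.txt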

14.   The Gantt chart view in Informatica is useful for:
a.       Tracking dependencies for sessions and mappings
b.       Scheduling workflows
c.       Viewing the progress of workflows and the overall schedule
d.       Planning start and end dates / times for each workflow run

15.   When using the debugger function, you can stop execution at the following:
a.       Errors or breakpoints
b.       Errors only
c.       Breakpoints only
d.       First breakpoint after the error occurs

16.   There is a requirement to selectively update or insert values in the target table based on the value of a field in the source table. This can be achieved using:
a.       Update Strategy transformation
b.       Aggregator transformation
c.       Router transformation
d.       Use the Expression transformation to write code for this logic
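
A minimal sketch of the kind of expression an Update Strategy transformation would carry for this requirement (the source field SRC_ACTION and its values are hypothetical):

    -- flag each row for update or insert based on a source field
    IIF(SRC_ACTION = 'U', DD_UPDATE, DD_INSERT)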

17.   A mapping can contain more than one source qualifier – one for each source that is imported.
a.       True
b.       False

18.   Which of the following statements is accurate?
a.       Power Channels are used to improve data migration across WAN / LAN networks
b.       Power Channels are adapters that Informatica provides for various ERP / CRM packages
c.       Power Connect is used to improve data migration across WAN / LAN networks
d.       None of the above

19.   To create a valid mapping in Informatica, at a minimum, the following entities are required:
a.       Source, Source Qualifier, Transformation, Target
b.       Source Qualifier, Transformation, Target
c.       Source and Target
d.       Source, Transformation, Target

20.   When one imports a relational database table using the Source Analyzer, it always creates the following in the mapping:
a.       An instance of the table with a source qualifier with a one to one mapping for each field
b.       Source sorter with one to one mapping for each field
c.       None of the above


Name:

Score: 

Pass / Fail:


Answers:
1. a  2. b  3. a  4. c  5. a  6. a  7. a  8. b  9. a  10. b  11. b  12. a, b  13. a  14. c  15. a  16. a  17. b  18. a  19. a  20. a
