Reading the data from Flat File using Expression and Filter

In this example, we implement the following scenarios:

a. How to read data from a CSV flat file

b. How to extract the source file name into the target table
Source: TIME.txt
Target: ODS_TIME
Filter Transformation:

This is an active, connected transformation that filters rows in a mapping.

Only the rows that meet the filter condition pass through the Filter transformation to the next transformation in the pipeline. TRUE and FALSE are the implicit return values of any filter condition we define in a mapping. If the filter condition evaluates to NULL, the row is treated as FALSE.
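As a sketch, a filter condition is any boolean expression written in the Informatica transformation language. The port names below are hypothetical, purely for illustration:

```sql
-- Hypothetical filter conditions; SALARY and DEPT_ID are illustrative port names.
SALARY > 30000

-- ISNULL() guards against NULL inputs: without the guard, a comparison
-- involving NULL evaluates to NULL and the row is dropped as FALSE.
NOT ISNULL(DEPT_ID) AND SALARY > 30000
```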

1. Import the source definition TIME.txt and the target definition ODS_TIME.

2. Create a mapping.

3. Drag the TIME.txt source definition into the Mapping Designer workspace. Observe that a Source Qualifier transformation is added by default.

4. Click Transformation > Create to create an Expression transformation. Select Expression and enter the name EXP_CONVERT_DATA.

5. Drag the DATE, DAY_OF_WEEK, QUARTER, and CurrentlyProcessedFileName columns to the Expression transformation. Rename CurrentlyProcessedFileName to FILE_NAME. (The CurrentlyProcessedFileName port is available when the Add Currently Processed Flat File Name Port option is enabled on the flat file source definition.)

6. Double-click the Expression transformation, and then click the Ports tab. Clear the Output (O) column for DATE and QUARTER; these ports are used as input (I) only, not as output (O).

7. Click the Add button to add the new ports O_DATE_ID, O_MONTH, O_QUARTER, and O_YEAR. Enter an expression for each port, validate it, and click OK.
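The screenshot of the port expressions is missing from the original. The following is one plausible sketch using standard transformation-language functions (TO_DATE, TO_CHAR); the actual expressions depend on the TIME.txt layout and the data mapping sheet, and the assumed MM/DD/YYYY input format is a guess:

```sql
-- Sketch only: assumes DATE arrives as a string in MM/DD/YYYY format.
O_DATE_ID:  TO_DATE(DATE, 'MM/DD/YYYY')
O_MONTH:    TO_CHAR(TO_DATE(DATE, 'MM/DD/YYYY'), 'MM')
O_QUARTER:  QUARTER     -- passed through, or derived via TO_CHAR(..., 'Q')
O_YEAR:     TO_CHAR(TO_DATE(DATE, 'MM/DD/YYYY'), 'YYYY')
```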



8. Create the Filter transformation FIL_ODS_TIME and drag all the ports from EXP_CONVERT_DATA to FIL_ODS_TIME. Edit the Filter transformation and, on the Properties tab, add the condition below:
O_DATE_ID >= TO_DATE('12/04/1999', 'MM/DD/YYYY')

9. Drag in the target definition and link the columns from the Filter transformation to the target definition as per the data mapping sheet.

10. Click Repository > Save to save the new mapping to the repository. Validate the mapping.


11. In the Workflow Manager, create a session that uses the mapping created above, using the Task Developer.
Tools > Task Developer
Tasks > Create
12. Provide the path for the source file and the connection for the target table.
Session > Edit > Mapping > Sources > SQ_TIME > Source File Name: TIME.txt (copy the file to $PMSourceFileDir, e.g. C:/Informatica/9.5.1/server/infa_shared/SrcFiles)
Session > Edit > Mapping > Targets > ODS_TIME > Connection: OLAP_TDBU
13. Choose the target load type, Normal or Bulk.
Session > Edit > Mapping > select the target > Target Load Type
14. Provide the bad file name, ods_time_bad.txt, to capture the rejected records.
Session > Edit > Mapping > select the target > Reject File Name
15. Create the workflow w_ods_time using the Workflow Designer. Assign the appropriate Integration Service.
16. Click Tasks > Link Task. Drag from the Start task to the Session task. Click Repository > Save to save the workflow in the repository.
17. Execute the workflow and verify the data in the target.
Workflow > Start Workflow
a. Verify the session and workflow status in the Workflow Monitor. Review the session log and the workflow log.
b. Verify the data in the target table in the TDBU schema:

SELECT * FROM ods_time;
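Beyond a simple SELECT *, a few additional checks can be run. Table and column names below follow this example's mapping; treat them as a sketch:

```sql
-- Row count loaded by the session
SELECT COUNT(*) FROM ods_time;

-- The filter should have excluded rows before 12/04/1999 (expect 0)
SELECT COUNT(*) FROM ods_time
WHERE date_id < TO_DATE('12/04/1999', 'MM/DD/YYYY');

-- FILE_NAME should carry the source file name for every row
SELECT DISTINCT file_name FROM ods_time;
```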