Header and Trailer record handling in ODI

Hi All,
I have a delimited file that I am loading into an Oracle DB using Oracle Data Integrator, with an LKM based on SQL*Loader. The header and trailer records have a different format from the data records, so one interface loads the data records; and since we also have the requirement to load the header and trailer into a control table for audit purposes, we have a second interface.
When we load the file, the two interfaces cause the file to be scanned twice, which increases the overall turnaround time. Please note these interfaces filter on a field called record_type, whose values 0, 1 and 2 mean header, data and trailer respectively. What we are looking for is a way to avoid this double scan of the file for the header/trailer and data records. What options can be used here?
Regards,
Prashant

Hi Prashant
You can use a combination of an external table and a multi-table insert. In ODI 11g there is an Oracle Multi Table Insert IKM. You will need one interface whose source is the file and whose target is a temporary target; its LKM is LKM File to External Table and its IKM is the Oracle Multi Table Insert IKM, which will set DEFINE_QUERY. The LKM should also set DELETE_TEMPORARY_OBJECTS to false. Then define two interfaces to load your two targets. The first target-loading interface uses the first interface as a source and the same MTI IKM, with IS_TARGET_TABLE set to true and EXECUTE set to false. The second target-loading interface also uses the first interface as a source and the MTI IKM, with IS_TARGET_TABLE set to true and EXECUTE set to true.
Cheers
David
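The one-scan idea can also be sketched outside ODI at the OS level (this is not the MTI approach David describes, just an illustration of routing every record type in a single pass, assuming record_type is the first comma-delimited field with the 0/1/2 values Prashant mentions; file names are hypothetical):

```shell
#!/bin/sh
# Single pass over the file: each record is routed to its own output
# by record_type (field 1; 0 = header, 1 = data, 2 = trailer).
awk -F',' '
  $1 == "0" { print > "header.txt"  }
  $1 == "1" { print > "data.txt"    }
  $1 == "2" { print > "trailer.txt" }
' input.txt
```

The three output files could then feed the data interface and the audit/control-table interface without re-reading the source.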

Similar Messages

  • Open Hub Header and Trailer Record.

    Hi,
    For the Open Hub Destination
    Destination Type is File,
    Application server.
    Type of File Name: 2 Logical File name.
    How to get the Header and Trailer record, which will contain Creation Date, Creation Time and total number of records?
    Header record Layout:
    Creation Date (YYYYMMDD)
    Creation Time (HHMMSS)
    Total record Count
    Trailer record Layout:
    Total number of Records in the file and
    XXXX( Key Figure ) Total.
    Thanks in advance.
    Regards,
    Adhvi.


  • Header and Trailer record validations

    Hi
    I have a file-XI-proxy scenario. The file contains a header record, detail records and a trailer record.
    The header record has a date field; I have to validate that the header record exists and that the field is in date format.
    For the trailer record I have to check that its total record count and total amount equal the number of records processed and the total amount.
    The amount value should also be greater than zero.
    My source structure
    DT_ ACEAwardInformation
    <b>Header</b>
    BeginDate
    EndDate
    <b>DetailRecord</b>
    Field1
    Field2
    Field3
    Field4
    <b>TrailerRecord</b>
    TotalRecord
    TotalAmount
    I also need the content conversion parameters.
    venkat

    Hi Venkat,
    write a UDF in the following way:
    pass the detail node as the first parameter (cache the whole queue) and the trailer count as the second parameter;
    loop through the detail records and count them with a counter variable;
    check the counter against the trailer count outside the loop;
    if they do not match, trigger the alert from the UDF itself.
    See the link below for triggering an alert from a UDF:
    /people/bhavesh.kantilal/blog/2006/07/25/triggering-xi-alerts-from-a-user-defined-function
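The counting logic this reply describes can be sketched outside PI as a quick shell check (the layout is assumed: comma-delimited, record type in field 1, the trailer's count in field 2 of the T record; input.txt is hypothetical). A non-zero exit is where the alert would be raised:

```shell
#!/bin/sh
# Count the detail records and compare against the count carried in
# the trailer record; report a mismatch via the exit status.
awk -F',' '
  $1 == "D" { details++ }       # count each detail record
  $1 == "T" { trailer = $2 }    # remember the trailer count
  END {
    if (details != trailer) { print "MISMATCH"; exit 1 }
    print "OK"
  }
' input.txt
```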

  • Checking header and trailer record

    Hi
    Please can anyone guide me on writing a UDF to check whether the header record exists or not?
    I am using the node function Exists.
    In the UDF I want to check the value of Exists: if it is true, continue the process; if it is false, raise an exception.
    How do we write a UDF for that?
    venkat

    Hi.....Venkat,
    1. Import your sender and receiver XSD files.
    2. Create a user-defined function before doing the mapping; this is where the Java code that implements the looping conditions is written.
    3. When creating the function, give it two arguments (e.g. a and b) as importing parameters, then write the Java code for the conditions.
    4. Write your code.
    5. Use the user-defined function in the rest of the mapping. In this way the conditions are applied during mapping.
    Regards,
    Naresh.K

  • How to do the header and trailer validation in the input file?

    hi,
    What are the ways we can validate whether the header and trailer records exist in the input file?
    How do we do that?
    regards
    Ruban

    File to Proxy Validation
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/99593f86-0601-0010-059d-d2dd39dceaa0
    /people/swaroopa.vishwanath/blog/2005/06/24/generic-approach-for-validating-incoming-flat-file-in-sap-xi--part-1
    /people/swaroopa.vishwanath/blog/2005/06/29/generic-approach-for-validating-incoming-flat-file-in-sap-xi--part-ii
    /people/prateek.shah/blog/2005/06/14/file-to-r3-via-abap-proxy

  • How to Seggragate Header, Footer and Trailer Records in Informatica Cloud ?

    Hi All,
    This is my source file structure, which is a flat file.
    Source file:
    "RecordType","Creation Date","Interface Name"
    "H","06-08-2015","SFC02"
    "RecordType","Account Number","Payoff Amount","Good Through Date"
    "D","123456787","2356.14","06-08-2015"
    "D","12347","2356.14","06-08-2015"
    "D","123487","235.14","06-08-2015"
    "RecordType","Creation Date","TotalRecordCount"
    "T","06-08-2015","5"
    The source file has to be loaded into three targets for the header, detail and trailer records separately.
    Target files:
    File 1: Header.txt
    "RecordType","Creation Date","Interface Name"
    "H","06-08-2015","SFC02"
    File 2: Detail.txt
    "RecordType","Account Number","Payoff Amount","Good Through Date"
    "D","123456787","2356.14","06-08-2015"
    "D","12347","2356.14","06-08-2015"
    "D","123487","235.14","06-08-2015"
    File 3: Trailer.txt
    "RecordType","Creation Date","TotalRecordCount"
    "T","06-08-2015","5"
    I tried this solution:
    1. Source ---> Expression ]---filter 1---Detail.txt
                               ---filter 2---Trailer.txt
    In the source I read the records starting from the 3rd row. This is because if I read from the first row, the detail part contains more fields than the header part (the header has only three fields), so only the first three fields of each detail record would be taken. That is why I skip the first two records (the header field names and the header record); refer to the example above. In filter 1 the condition is Record_Type = 'D'; in filter 2 it is Record_Type = 'T'. So filter 1 loads Detail.txt and filter 2 loads Trailer.txt. In the task's pre-session command I call a Windows .bat script to fetch the first two lines and load them into Header.txt.
    This solution is working fine.
    My query is: can we use two pipeline flows in the same mapping in Informatica Cloud?
    Pipeline flow 1: Source ---> Expression ]---filter 1---Detail.txt
                                              ---filter 2---Trailer.txt
    Pipeline flow 2: Source ---> Expression ]---filter 3 (Record_Type='H')---Header.txt
    The source file is the same in both flows. In the first flow I read from the third row, skipping the header section as mentioned earlier; in the second flow I read the entire content, take only the header record and load it into the target.
    When I add flow 2 to the existing flow 1, I get the error below:
    TE_7020 Internal error. The Source Qualifier [Header_Source] contains an unbound field [Creation_Date]. Contact Informatica Global Customer Support.
    1. Does Informatica Cloud support two parallel flows in the same mapping?
    2. Is there any other best solution for my requirement? Since I am new to Informatica Cloud, can anyone suggest another solution if you have one? It would be very helpful if you could suggest a good solution.
    Thanks
    Sindhu Ravindran

    We are using a Web Services Consumer transformation in our mapping to connect to a RightFax server using a WSDL URL via a business service. In the mapping we send the input parameters through a flat file, connect to the RightFax server via the WS Consumer transformation, then fetch the data and write it to a flat file.
    07/28/2015 10:10:49 **** Importing Connection: Conn_000A7T0B0000000000EM ...
    07/28/2015 10:10:49 **** Importing Source Definition: SRC_RightFax_txt ...
    07/28/2015 10:10:49 **** Importing Target Definition: GetFax_txt ...
    07/28/2015 10:10:49 **** Importing Target Definition: FaultGroup_txt ...
    07/28/2015 10:10:49 **** Importing SessionConfig: default_session_config ...
    <Warning> : The Error Log DB Connection value should have Relational: as the prefix.
    <Warning> : Invalid value for attribute Error Log DB Connection. Will use the default value
    Validating Source Definition SRC_RightFax_txt...
    Validating Target Definition FaultGroup_txt...
    Validating Target Definition GetFax_txt...
    07/28/2015 10:10:49 **** Importing Mapping: Mapping0 ...
    <Warning> : transformation: RIghtfax - Invalid value for attribute Output is repeatable. Will use the default value Never [transformation< RIghtfax >]
    Validating transformations of mapping Mapping0... Validating mapping variable(s).
    07/28/2015 10:10:50 **** Importing Workflow: wf_mtt_000A7T0Z00000000001M ...
    <Warning> : Invalid value Optimize throughout for attribute Concurrent read partitioning. Will use the default value Optimize throughput [Session< s_mtt_000A7T0Z00000000001M > --> File Reader< File Reader >]
    <Warning> : The value entered is not a valid integer.
    <Warning> : Invalid value NO for attribute Fail task after wait time. Will use the default value
    Successfully extracted session instance [s_mtt_000A7T0Z00000000001M]. Starting repository sequence id is [1048287470]
    Kindly provide us a solution. Attached are the logs for reference.
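The pre/post-session split Sindhu describes (first two lines to Header.txt, last two to Trailer.txt, everything between to Detail.txt) can be sketched in shell, assuming the exact layout shown in the question (source.csv is a hypothetical name):

```shell
#!/bin/sh
# Split a file with 2 header lines (field names + H record) and
# 2 trailer lines (field names + T record) into three targets.
total=$(wc -l < source.csv)
head -n 2 source.csv > Header.txt                   # field names + H record
tail -n 2 source.csv > Trailer.txt                  # field names + T record
sed -n "3,$((total - 2))p" source.csv > Detail.txt  # everything in between
```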

  • How to generate Header and Trailer for a file

    Hi Guru
    How can we generate a header and trailer for a file?
    For example:
    I want to generate a header with the date and a trailer with the record count from a table.
    Sample file :
    20120120
    fwsfs
    adfwsfd
    adff
    afsadf
    afdwsg
    adgsg
    adgsgg
    asgdsag
    sdgasgdaf
    sdfsagfadf
    10

    Hi ,
    1. Create an interface to load the data from Oracle to the file, and set the generate-header option to false in the IKM.
    2. Create a variable GET_CURRENT_DATE of alphanumeric datatype and put SELECT to_char(SYSDATE,'yyyymmdd') FROM DUAL under its Refreshing tab.
    3. Create a variable GET_RECORD_COUNT of numeric datatype and put SELECT '<%=odiRef.getPrevStepLog("INSERT_COUNT")%>' FROM DUAL under its Refreshing tab.
    4. Create a package:
    drag in the GET_CURRENT_DATE variable;
    drag in OdiOutFile and paste OdiOutFile "-FILE=D:\ODI_TEST\emp.txt" "-CHARSET_ENCODING=ISO8859_1" "-XROW_SEP=0D0A" #GET_CURRENT_DATE into its Command tab;
    drag in the interface;
    drag in the GET_RECORD_COUNT variable;
    drag in OdiOutFile again and paste OdiOutFile "-FILE=D:\ODI_TEST\emp.txt" -APPEND "-CHARSET_ENCODING=ISO8859_1" "-XROW_SEP=0D0A" #GET_RECORD_COUNT into its Command tab.
    Link all these in sequence, then save and run the package.
    Alternatively, modify the IKM SQL to File Append to achieve the same functionality.
    Thanks,
    Anuradha
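The same header/body/trailer assembly can be sketched at the OS level (not the ODI package above, just the idea; data.txt and emp.txt are hypothetical names):

```shell
#!/bin/sh
# Build the output file: date header (YYYYMMDD), then the extracted
# records, then the record count as the trailer line.
date +%Y%m%d > emp.txt                     # header: current date
cat data.txt >> emp.txt                    # body: the data records
wc -l < data.txt | tr -d ' ' >> emp.txt    # trailer: record count
```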

  • Receiver FCC Structure For Header and Trailer

    Hi Guys ,
    I need to set up receiver FCC for a header and trailer in PI. How does the FCC in the communication channel need to be configured if I take a separate node for the header and trailer in the mapping? The output of the file should be as in the attached file.
    Thanks.
    Regards.

    Hi,
    If you are using a File communication channel as a receiver, you can go with FCC.
    Use Recordset Structure: Header,detail,trailer.
    https://help.sap.com/saphelp_nw04/helpdata/en/d2/bab440c97f3716e10000000a155106/content.htm
    Otherwise, if you are going for SFTP to create the file, follow Indrajit's suggestion.
    If you are still facing any issue, please let us know.
    Thanks,
    Sreenivas...

  • Spooling data with Header and trailer

    Hi everyone,
    I have a problem spooling data.
    I get a .dat file to load into the DB. The .dat file has a header, records, and a trailer; when loading into the DB I load only the records.
    I load the .dat file with SQL*Loader using direct=true and parallel=true: the records go into one table and the header and trailer into another (loading into two different tables using three control files, one each for the header, records and trailer, all from the same .dat file). But when spooling the data out after some modifications to the table, I need to spool it with the header and trailer, and I am able to spool only the records.
    The header and trailer are in a standard form every time I get the .dat file.
    Help me spool data from a table with the HEADER & TRAILER in the spool file.
    Thanks

    Some may argue this is not the right forum - Database General might have been a little better than Downloads - but now the thread is here I'll try an answer.
    As I see it, it depends on your spooling tool.
    Let's assume you are using SQL*Plus.
    In that case the SQL*Plus PROMPT command may serve your purpose.
    Alternatively
    select 'header' from dual
    union all
    select field1||','||field2||','||field3 /* must be a single string expression, but you may concatenate fields */ from sometable
    union all
    select 'trailer' from dual;
    may suit.
    However, I fear either that I have not understood the question or, if I have, that you may have difficulties implementing this.
    Edited by: bigdelboy on 19-Apr-2009 14:32
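The UNION ALL idea can also be done outside the database: wrap the spooled body with fixed header and trailer lines at the OS level (body.txt and spool_out.txt are hypothetical names, and 'header'/'trailer' stand in for the standard lines the poster mentions):

```shell
#!/bin/sh
# Concatenate a fixed header line, the spooled records, and a fixed
# trailer line into the final output file.
{
  echo 'header'
  cat body.txt
  echo 'trailer'
} > spool_out.txt
```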

  • Unix Flat File: Remove header and trailer and put in another file.

    Hi,
    I have a source flat file placed on a Unix box, with a header and trailer.
    I want to remove the header and trailer and put them in one file, and the data in another file.
    I tried the following command in Unix; it works, but it does not put the header and trailer in another file:
    sed '1d;$d' input_source.txt > output_data.txt
    Also, how will I use an OS command for this in ODI?
    Guide me.
    Thanks
    Ashwini

    Hi Ashwini,
    You can run OS commands in a package using an ODI Tool: OdiOSCommand.
    It is also possible to execute OS commands in an ODI procedure using the Operating System or Jython technologies.
    There should be some articles about this on metalink (http://metalink.oracle.com).
    Thanks,
    Julien
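Ashwini's sed command only produces the data file; it can be extended so the header and trailer land in a second file (file names taken from the post, and a one-line header and one-line trailer assumed):

```shell
#!/bin/sh
# Data records: everything except the first and last lines.
sed '1d;$d' input_source.txt > output_data.txt
# Header and trailer: first and last lines into their own file.
head -n 1 input_source.txt  > header_trailer.txt
tail -n 1 input_source.txt >> header_trailer.txt
```

In ODI this could be the command body of an OdiOSCommand step, as Julien suggests.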

  • Suppressing Header and Trailer

    Can I suppress the header and trailer via a runtime parameter? How?
    Thanx!


  • Header and trailer pages in report templates

    Hi:
    I wonder how I can make header and trailer pages in the template reports, so that the objects in the template's header and trailer would be inherited by the generated reports. I use Oracle Reports 6.0.8, and there is neither a Header section nor a Trailer section in the Template Editor. I've read in the Reports online help: "You can use the margin and body of the Header and Trailer sections to create a Header and Trailer “page” as in earlier releases. In future releases, you will be able to add and delete sections." Is there any method I can use to make header and trailer pages in the template?

    I have the same question. This is what I found on Metalink.
    goal: How To Define A Header/Trailer Section In A Reports Template Definition File (TDF)
    fact: Oracle Reports Developer
    fix: It is not possible to have a HEADER/TRAILER section in a Reports Template Definition File (TDF). Templates do not need a HEADER/TRAILER section; in fact there is no sectioning in a template. They just define the visual layout when applied against a report section.
    This did not answer my question!

  • Generating Header and trailer sections in reports..

    Hi all.
    I want to generate reports that have both a header and a trailer section page.
    Usually this was part of the template report, but in Designer 9 the generator first converts the template report (.rdf) to a template definition file (.tdf) and then uses this template.
    But where are the header and trailer pages? You can't see them in Reports Builder when looking at templates, and the generated report does not contain any header or trailer pages.
    Have I missed something?
    Any replies welcome...
    Benny Albrechtsen

    I have the same question. This is what I found on Metalink.
    goal: How To Define A Header/Trailer Section In A Reports Template Definition File (TDF)
    fact: Oracle Reports Developer
    fix: It is not possible to have a HEADER/TRAILER section in a Reports Template Definition File (TDF). Templates do not need a HEADER/TRAILER section; in fact there is no sectioning in a template. They just define the visual layout when applied against a report section.
    This did not answer my question!

  • File content conversion - sender adapter for Header and detail records

    Hi Experts,
    I am receiving a file in fixed-length content format. The first line of the file (the header) follows structure X with some fields, and the subsequent lines (the detail records) follow structure Y with some fields. There is no record identifier for the header and detail records: in each file the first line is the header record and all remaining lines are detail records. What parameters do I have to set in the sender file content conversion, given that I do not have any key field or key field value? Each file has only one header record (the first line) and n detail records from the second line onwards.
    Thanks
    Deepak

    Hi
    Refer to the forum link below:
    Flat file without id
    Regards
    Ramg.

  • RESTORE - Kernel   Bad page - header and trailer not matching

    Hello,
    I have a problem with the recovery of my backup.
    When I want to restore the database, this error is shown in the logfile KNLDIAGERR:
    2008-11-26 21:20:38                               --- Starting GMT 2008-11-26 20:20:38           7.5.0    Build 032-123-111-699
    2008-11-26 21:26:57      0xE0C ERR 20005 Kernel   Bad page - header and trailer not matching
    2008-11-26 21:26:57      0xE0C ERR 20006 Kernel   Header [ 5277032 - data - tab - checksumData - empty ]
    2008-11-26 21:26:57      0xE0C ERR 20007 Kernel   Trailer[ 0 - data - nil - chckNil - empty ]
    2008-11-26 21:26:57      0xE0C ERR 52015 RESTORE  write/check count mismatch 5277032
    2008-11-26 21:26:57      0xDFC ERR 52012 RESTORE  error occured, basis_err 300
    2008-11-26 21:26:57      0xDFC ERR 51080 SYSERROR -9026 BD Bad datapage
    any Help ?
    thanks a lot
    Steven

    Hi Steven,
    the database kernel checks the header and trailer with every read; if it finds a mismatch during a read, the error message you got is logged.
    The same checks are executed when you create a backup. You told us that the backup could be executed without any errors and that the system where you created the backup is OK as well - you executed a Check Data?
    So everything suggests either that the error is in the backup and happened during writes to the external backup medium, or that the backup is OK and the error happens while reading from the external medium.
    To get closer to where the problem is located, I need some more information:
    1. Did you run the restore several times?
    2. Does the bad data page happen always on the same page? Check knldiag.err.
    3. Did you do your backup on tape?
    4. Did you try to create a backup into a file?
    5. Does the restore from a file get the error as well?
    Regards, Christiane Hienger
