Procedure to Normalize Flat File Structure

Hi,
I'm trying to write a procedure that takes a table of customer orders stored in a flat-file format (i.e. customer_id, order_no1, order_no2, etc.) and normalizes it so that it looks like this: customer_id, order_no, order_cd.
Here's my procedure. It doesn't work as intended: it starts at order_no1 and inserts all of those records, but the counter never increments to order_no2, order_no3, etc.
CREATE OR REPLACE PROCEDURE NORMALIZE_ORDERS(
    pSTART_DT IN varchar2
  , pEND_DT   IN varchar2
) AUTHID CURRENT_USER IS
BEGIN
  DECLARE
    loop_beg_dt   varchar2(20)   := pSTART_DT;
    loop_end_dt   varchar2(20)   := pEND_DT;
    intX          number         := 1;
    insert_orders varchar2(5000) := 'insert /*+ append */ into customer_orders_normalized '
      ||'(customer_id, order_no, order_cd, order_type) '
      ||'select customer_id, '|| intX ||', order_cd'|| to_char(intX) ||', order_type'|| to_char(intX)
      ||' from customer_orders_flat where order_cd'|| to_char(intX) ||' is not null '
      ||'AND order_dt >= :1 and order_dt < :2';
  BEGIN
    LOOP
      EXECUTE IMMEDIATE insert_orders USING loop_beg_dt, loop_end_dt;
      COMMIT;
      intX := intX + 1;
      EXIT WHEN intX > 25;
    END LOOP;
  END;
END NORMALIZE_ORDERS;
Essentially, instead of writing 25 insert statements for the 25 possible orders per customer, I want one insert statement inside a loop that increments the counter until the 25th order is inserted, so it behaves like the statements below:
insert /*+ append */ into customer_orders_normalized
(customer_id, order_no, order_cd, order_type)
select customer_id,1,order_no1,order_cd1,order_type1
from customer_orders_flat where order_cd1 is not null
AND order_dt >= '01-apr-2010' and order_dt < '01-apr-2011';
insert /*+ append */ into customer_orders_normalized
(customer_id, order_no, order_cd, order_type)
select customer_id,2,order_no2,order_cd2,order_type2
from customer_orders_flat where order_cd2 is not null
AND order_dt >= '01-apr-2010' and order_dt < '01-apr-2011';
Sorry, I realize I should enclose my SQL in code tags, but the help page doesn't say which tags to use. If someone can tell me the tags to wrap around SQL when I paste it into a message, I would appreciate it.
Thanks,
Ed

http://wikis.sun.com/display/Forums/Forums+FAQ => scroll down to "Are there any useful formatting options not shown on the sidebar? "
It is bad to COMMIT inside a LOOP.
What exactly are you LOOPing on?
You do NOT require EXECUTE IMMEDIATE; just load variables with new/different values
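For what it's worth: the reason the counter never appears to advance is that insert_orders is built once, while intX is still 1, and that same string is then executed 25 times. Below is a minimal sketch of one way to restructure it, assuming the flat table really has order_cd1..order_cd25 and order_type1..order_type25 columns as described in the post (binding real DATE values, or using an explicit TO_DATE, would also be safer than relying on implicit conversion of the varchar2 parameters):
CREATE OR REPLACE PROCEDURE normalize_orders(
    pSTART_DT IN varchar2
  , pEND_DT   IN varchar2
) AUTHID CURRENT_USER IS
  v_sql varchar2(4000);
BEGIN
  FOR i IN 1 .. 25 LOOP
    -- Build the statement inside the loop so the column names follow the counter.
    -- If order_no should come from the order_noN column instead of the counter,
    -- replace the bare i in the select list with 'order_no' || i.
    v_sql := 'insert into customer_orders_normalized '
          || '(customer_id, order_no, order_cd, order_type) '
          || 'select customer_id, ' || i
          || ', order_cd'   || i
          || ', order_type' || i
          || ' from customer_orders_flat'
          || ' where order_cd' || i || ' is not null'
          || ' and order_dt >= :1 and order_dt < :2';
    EXECUTE IMMEDIATE v_sql USING pSTART_DT, pEND_DT;
  END LOOP;
  COMMIT;  -- one commit after the loop instead of 25 inside it
END normalize_orders;
The /*+ append */ hint was left out of this sketch because a direct-path insert would force a commit before the next insert into the same table in the same transaction (ORA-12838); with conventional inserts, a single COMMIT after the loop is enough.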

Similar Messages

  • How to join 5 different tables using SQL to make it into a flat file structure

    I am trying to load five different tables into one flat-file-structure table without a Cartesian product.
    I have five different tables, Jobplan, Jobtask (JT), Joblabor (JL), Jobmaterial (JM) and Jpsequence (JS), and the target table has all five combined into one.
    The data I have is something like this:
    jobplan = 1record
    jobtask = 5 records
    joblabor = 2 records
    jobmaterial = 1 record
    jpsequence = 3 records
    The output has to be like this.
    JPNUM     DESCRIPTION     LOCATION     JT_JPNUM     JT_TASK     JL_JPNUM     JL_labor     JM_JPNUM     JM_MATERIAL     JS_JPNUM     JS_SEQUENCE
    1001     Test Jobplan     USA     NULL     NULL     NULL     NULL     NULL     NULL     NULL     NULL
    1001     Test Jobplan     USA     1001     10     NULL     NULL     NULL     NULL     NULL     NULL
    1001     Test Jobplan     USA     1001     20     NULL     NULL     NULL     NULL     NULL     NULL
    1001     Test Jobplan     USA     1001     30     NULL     NULL     NULL     NULL     NULL     NULL
    1001     Test Jobplan     USA     1001     40     NULL     NULL     NULL     NULL     NULL     NULL
    1001     Test Jobplan     USA     1001     50     NULL     NULL     NULL     NULL     NULL     NULL
    1001     Test Jobplan     USA     NULL     NULL     1001     Sam     NULL     NULL     NULL     NULL
    1001     Test Jobplan     USA     NULL     NULL     1001     Mike     NULL     NULL     NULL     NULL
    1001     Test Jobplan     USA     NULL     NULL     NULL     NULL     1001     Hammer     NULL     NULL
    1001     Test Jobplan     USA     NULL     NULL     NULL     NULL     NULL     NULL     1001     1
    1001     Test Jobplan     USA     NULL     NULL     NULL     NULL     NULL     NULL     1001     2
    1001     Test Jobplan     USA     NULL     NULL     NULL     NULL     NULL     NULL     1001     3
    Please help me out with this issue.
    Thanks,
    Siva
    Edited by: 931144 on Apr 30, 2012 11:35 AM

    Hope the following helps you:
    CREATE TABLE JOBPLAN
    ( JPNUM NUMBER,
      DESCRIPTION VARCHAR2(100)
    );
    INSERT INTO JOBPLAN VALUES(1001,'Test Jobplan');
    CREATE TABLE JOBTASK
    ( LOCATION VARCHAR2(10),
      JT_JPNUM NUMBER,
      JT_TASK  NUMBER
    );
    INSERT INTO JOBTASK VALUES('USA',1001,10);
    INSERT INTO JOBTASK VALUES('USA',1001,20);
    INSERT INTO JOBTASK VALUES('USA',1001,30);
    INSERT INTO JOBTASK VALUES('USA',1001,40);
    INSERT INTO JOBTASK VALUES('USA',1001,50);
    CREATE TABLE JOBLABOR
    ( JL_JPNUM NUMBER,
      JL_LABOR VARCHAR2(10)
    );
    INSERT INTO JOBLABOR VALUES(1001,'Sam');
    INSERT INTO JOBLABOR VALUES(1001,'Mike');
    CREATE TABLE JOBMATERIAL
    ( JM_JPNUM    NUMBER,
      JM_MATERIAL VARCHAR2(10)
    );
    INSERT INTO JOBMATERIAL VALUES(1001,'Hammer');
    CREATE TABLE JOBSEQUENCE
    ( JS_JPNUM    NUMBER,
      JS_SEQUENCE NUMBER
    );
    INSERT INTO JOBSEQUENCE VALUES(1001,1);
    INSERT INTO JOBSEQUENCE VALUES(1001,2);
    INSERT INTO JOBSEQUENCE VALUES(1001,3);
    SELECT   JP.JPNUM        AS JPNUM       ,
             JP.DESCRIPTION  AS DESCRIPTION ,
             NULL            AS LOCATION    ,
             NULL            AS JT_JPNUM    ,
             NULL            AS JT_TASK     ,
             NULL            AS JL_JPNUM    ,
             NULL            AS JL_labor    ,
             NULL            AS JM_JPNUM    ,
             NULL            AS JM_MATERIAL ,
             NULL            AS JS_JPNUM    ,
             NULL            AS JS_SEQUENCE
    FROM JOBPLAN JP
    UNION ALL
    SELECT   JP.JPNUM        AS JPNUM       ,
             JP.DESCRIPTION  AS DESCRIPTION ,
             JT.LOCATION     AS LOCATION    ,
             JT.JT_JPNUM     AS JT_JPNUM    ,
             JT.JT_TASK      AS JT_TASK     ,
             NULL            AS JL_JPNUM    ,
             NULL            AS JL_labor    ,
             NULL            AS JM_JPNUM    ,
             NULL            AS JM_MATERIAL ,
             NULL            AS JS_JPNUM    ,
             NULL            AS JS_SEQUENCE
    FROM JOBPLAN JP, JOBTASK JT
    UNION ALL
    SELECT   JP.JPNUM        AS JPNUM       ,
             JP.DESCRIPTION  AS DESCRIPTION ,
             NULL            AS LOCATION    ,
             NULL            AS JT_JPNUM    ,
             NULL            AS JT_TASK     ,
             JL.JL_JPNUM     AS JL_JPNUM    ,
             JL.JL_labor     AS JL_labor    ,
             NULL            AS JM_JPNUM    ,
             NULL            AS JM_MATERIAL ,
             NULL            AS JS_JPNUM    ,
             NULL            AS JS_SEQUENCE
    FROM JOBPLAN JP, JOBLABOR JL
    UNION ALL
    SELECT   JP.JPNUM        AS JPNUM       ,
             JP.DESCRIPTION  AS DESCRIPTION ,
             NULL            AS LOCATION    ,
             NULL            AS JT_JPNUM    ,
             NULL            AS JT_TASK     ,
             NULL            AS JL_JPNUM    ,
             NULL            AS JL_labor    ,
             JM.JM_JPNUM     AS JM_JPNUM    ,
             JM.JM_MATERIAL  AS JM_MATERIAL ,
             NULL            AS JS_JPNUM    ,
             NULL            AS JS_SEQUENCE
    FROM JOBPLAN JP, JOBMATERIAL JM
    UNION ALL
    SELECT   JP.JPNUM        AS JPNUM       ,
             JP.DESCRIPTION  AS DESCRIPTION ,
             NULL            AS LOCATION    ,
             NULL            AS JT_JPNUM    ,
             NULL            AS JT_TASK     ,
             NULL            AS JL_JPNUM    ,
             NULL            AS JL_labor    ,
             NULL            AS JM_JPNUM    ,
             NULL            AS JM_MATERIAL ,
             JS.JS_JPNUM     AS JS_JPNUM    ,
             JS.JS_SEQUENCE  AS JS_SEQUENCE
    FROM JOBPLAN JP, JOBSEQUENCE JS;
         JPNUM DESCRIPTION     LOCATION      JT_JPNUM    JT_TASK   JL_JPNUM JL_LABOR     JM_JPNUM JM_MATERIA   JS_JPNUM JS_SEQUENCE
          1001 Test Jobplan    NULL       NULL        NULL       NULL       NULL       NULL       NULL    NULL          NULL
          1001 Test Jobplan    USA        1001        10         NULL       NULL       NULL       NULL    NULL          NULL
          1001 Test Jobplan    USA        1001        20         NULL       NULL       NULL       NULL    NULL          NULL
          1001 Test Jobplan    USA        1001        30         NULL       NULL       NULL       NULL    NULL          NULL
          1001 Test Jobplan    USA        1001        40         NULL       NULL       NULL       NULL    NULL          NULL
          1001 Test Jobplan    USA        1001        50         NULL       NULL       NULL       NULL    NULL          NULL
          1001 Test Jobplan    NULL       NULL        NULL       1001       Sam        NULL       NULL    NULL          NULL
          1001 Test Jobplan    NULL       NULL        NULL       1001       Mike       NULL       NULL    NULL          NULL
          1001 Test Jobplan    NULL       NULL        NULL       NULL       NULL       1001       Hammer  NULL          NULL
          1001 Test Jobplan    NULL       NULL        NULL       NULL       NULL       NULL       NULL    1001          1
          1001 Test Jobplan    NULL       NULL        NULL       NULL       NULL       NULL       NULL    1001          2
          1001 Test Jobplan    NULL       NULL        NULL       NULL       NULL       NULL       NULL    1001          3
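    Since the original question asked for the result without a Cartesian product, note that the branches above rely on JOBPLAN having exactly one row. As a sketch, the JOBTASK branch could carry an explicit join condition instead (assuming JT_JPNUM is the join key, which the sample data suggests):
    SELECT   JP.JPNUM        AS JPNUM       ,
             JP.DESCRIPTION  AS DESCRIPTION ,
             JT.LOCATION     AS LOCATION    ,
             JT.JT_JPNUM     AS JT_JPNUM    ,
             JT.JT_TASK      AS JT_TASK     ,
             NULL            AS JL_JPNUM    ,
             NULL            AS JL_labor    ,
             NULL            AS JM_JPNUM    ,
             NULL            AS JM_MATERIAL ,
             NULL            AS JS_JPNUM    ,
             NULL            AS JS_SEQUENCE
    FROM JOBPLAN JP
    JOIN JOBTASK JT ON JT.JT_JPNUM = JP.JPNUM
    The JOBLABOR, JOBMATERIAL and JOBSEQUENCE branches would take the same ON pattern against their own *_JPNUM columns.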

  • Idoc data in flat file structure

    Dear Experts,
    We have IDoc data in a flat file structure. We need to fetch it using FTP and map it to an IDoc.
    Please tell me how to proceed.
    Thanks,
    Aju

    Hi,
    For a flat file you need to use the File Content Conversion parameters.
    Refer to the blog:
    SAP Network Blog: How to process flat files with multiple documents like POs, SOs etc. in a File to IDoc scenario
    Thanks
    Swarup

  • How to Create Hierarchy From Flat file Structure

    Hi Gurus,
    There is a scenario for me regarding the Hierarchy.
    Required hierarchy structure: Region --> Director --> Manager --> Sales id
    I have a flat file which gives info like user id, sales id, manager id, director id.
    But the transaction data flat file has a structure with sales id, region id, sales amt, sales qty.
    Note: Region id is another master data object.
    How can I create the hierarchy from the first flat file, which does not have the region info (it is only available in the transaction data flat file)?
    Is there any way we can create the hierarchy based on the first flat file structure, which contains more than 100,000 records?
    Please advise me in this regard.
    This is urgent.
    Regards,
    Mano

    Hi Mano,
    1. Defining the source system from which to load data
    Choose the source system tree File → Create.
    2. Defining the InfoSource for which you want to load data
    Optional: choose InfoSource Tree → Root (InfoSources) → Create Application Components.
    Choose InfoSource Tree → Your Application Component → Other Functions → Create InfoSource 3.x → Direct Update.
    Choose an InfoObject from the proposal list, and specify a name and a description.
    3. Assigning the source system to the InfoSource
    Choose InfoSource Tree → Your Application Component → Your InfoSource → Assign Source System. The transfer structure maintenance screen appears.
    The system automatically generates DataSources for the three different data types to which you can load data:
    ○ Attributes
    ○ Texts
    ○ Hierarchies (if the InfoObject has access to hierarchies)
    The system automatically generates the transfer structure, the transfer rules, and the communication structure (for attributes and texts).
    4. Maintaining the transfer structure / transfer rules
    Select the DataSource for uploading hierarchies.
    IDoc transfer method: The system automatically generates a proposal for the DataSource and the transfer structure. This consists of an entry for the InfoObject for which hierarchies are loaded. With this transfer method, the structure is converted to the structure of the PSA during loading, which affects performance.
    PSA transfer method: The transfer methods and the communication structure are also generated here.
    5. Maintaining the hierarchy
    Choose Hierarchy Maintenance, and specify a technical name and a description of the hierarchy.
    Hope this helps
    Regards
    Karthik
    Assign points if Helpful

  • IDOC --- XI -- HTTP (via a flat file structure, not XML)

    I am working on a scenario in which we need to send a third-party payroll service provider our HR master data records (message type HROT_UM). The manual method is to create a flat file from an IDoc and place the file in a shared directory, then upload the flat file by logging on to the web server. We would like to automate this process using XI.
    Is it possible to send a flat file structure to a web server using an HTTP receiver adapter? If not, can you provide a basic view of how to accomplish this task? Please note that FTP is not an option for us.
    Any suggestions or recommendations would be greatly appreciated.

    hey
    Flat file over HTTP is not possible; it takes only XML.
    have a look at the following thread
    Send file through http
    thanx
    ahmad
    PL:reward with points for helpful answers

  • Creating JCo IDoc from flat file structure

    Hi,
    I need to send an IDoc into SAP using JCo.
    The input to my program is a string containing lines representing a flat file idoc, e.g.
    Line 1="EDI_DC40                           2   ORDERS04.."
    Line 2="E1EDK01                                          00000100000001    USD..."
    Line 3="E1EDK14                                          0000030000000...."
    Is there a simple way to use JCo to create & send the IDoc? 
    i.e.
    1) If I use JCo and RFC IDOC_INBOUND_ASYNCHRONOUS, what would be all the steps/calls to SAP (create TID, call IDOC_INBOUND_ASYNCHRONOUS, confirm TID..?)
    And can IDOC_INBOUND_ASYNCHRONOUS be called using the flat file structures (without having to map to all the JCo ParameterList fields)?  Since the flat file structures are  in the format required by the RFC, just in one long string.
    Line 1=>IDOC_CONTROL_REC_40
    Lines 2..n=>IDOC_DATA_REC_40
    2) Similarly, if I were to use JCo plus the JCO IDoc library, is there a way to pass the flat file structures without having to do all the mapping to segment fields?
    3) Other options..?
    I want to use ALE to SAP, not files, even though the input is in the flat file structure.

    Your reply gives a link to the general JCo documentation.
    It doesn't give ideas on how to call an RFC or IDoc from JCO without mapping each and every field from a flat file structure.
    I'm looking for a way to do something like this:
    Function IDOC_INBOUND_ASYNCHRONOUS has table parameters
          IDOC_CONTROL_REC_40 STRUCTURE  EDI_DC40
          IDOC_DATA_REC_40 STRUCTURE  EDI_DD40
    Since I have the flat file representation of the IDoc, the first line should overlay exactly onto the EDI_DC40 structure.  And the subsequent lines should overlay onto EDI_DD40.  (all fields in this RFC are strings)
    However JCO and JCO IDoc library seem very strongly typed, so it looks like I would have to map each field from the flat file structure to a field in the JCO Function or JCO IDoc object. 
    This could be done in a generic way using the function/idoc metadata, however there would still be some overhead.
    Is there a way to get round this, and build the function/idoc treating its parameters as one long string?

  • Open Hub Destination with Flat File - Structure file is not correct

    Using the new concept for Open Hub destination (NOT an InfoSpoke).
    Output is to a Flat File. Remember that when output is to a flat file 2 files are generated: the output file and a structure file (S_<name of output file>.CSV)
    The Output file structure corresponds to the structure defined in the Open Hub destination.
    However, the S_ (structure) file does not reflect the proper structure. The S_ file lists all fields sorted in alphabetical order based on the field name, instead of listing all fields in the correct order that they appear in the output file.
    I have not found any settings that I can modify for the S_ file. When using InfoSpokes the S_ file is correct and I do not have to do anything.
    The issue is that the S_ file cannot be used to describe the output file. This is a problem.
    Anyone knows a solution to this issue? The goal is to get the S_ file to reflect the proper structure.
    I could not find any Notes on this. Please do not send general help links, this is a specific question.
    Thank you all for your support.

    Hi,
    However, the S_ (structure) file does not reflect the proper structure. The S_ file lists all fields sorted in alphabetical order based on the field name, instead of listing all fields in the correct order that they appear in the output file.
    As far as I know and have checked, this is not the case. The S_ file has the fields in the order of the InfoSpoke object sequence (and, in the transformation tab, the source structure).
    I would suggest you check it again.
    Derya

  • What are the key steps & order to follow: changes in my flat file structure

    HI,
    I have a Cube which sits on ODS.
    In the ODS, there are 6 characteristics (Char1, Char2, ..., Char6) and 3 key figures (KF1, KF2, and KF3).
    The ODS is loaded through a flat file.
    Through an update rule and a start routine, the ODS updates the cube.
    Now I have a new requirement to add 2 new characteristics (Char10, Char20) and 2 new key figures (KF55, KF66), i.e. the flat file will now come in with these new fields.
    I have an idea, but since this is the first time I really have to implement this, I need to be sure.
    What are the key steps that I need to go through and in what order?
    Thanks

    What version are you running?
    Will you need to load history data for these new fields, or is it just going forward?
    Is it all one-to-one mapping for the new fields?
    1. Make sure you know to which dimensions you need to add the new characteristics. A new dimension for these two?
    2. Are the new characteristics to be made key fields?
    3. Update type for the key figures: Overwrite or Additive?
    Enhance the Cube and DSO, and change the TRFN/TR/UR accordingly.

  • Hiearchy data loading from flat file to sap bw

    Hi experts,
    I am new to this; could you please help me in solving this scenario?
    I have a scenario like this; can you tell me the procedure and the hierarchy flat file structure?
    To develop a data model in SAP BW to analyze sales
    MASTER DATA STRONG ENTITIES
    Create characteristics for the following strong master data entities:
    Customer
    Outlet
    Sales Office
    Sales Region
    Sales Representative
    Material
    Use Calendar Day and Calendar Month as time characteristics
    MASTER DATA WEAK ENTITIES
    Create attributes for the following weak master data entities:
    Customer Name
    Customer Location
    Material Name
    Material Group
    ADDITIONAL MASTER DATA (HIERARCHY)
    Create a hierarchy where sales offices are assigned to sales regions
    and sales representatives are assigned to sales offices
    KEYFIGURES
    Quantity
    Price
    Tax %
    Sales Revenue
    DATA LOADING STRATEGY
    Load all master and transaction data using flat files.
    Thank you in advance
    Edited by: subbaraju on Dec 23, 2009 6:42 PM

    Hi Arun,
    Can you send me the detailed procedure for how to solve the above scenario?
    That is, how many flat files do we need to create (customer f.f., material f.f., hierarchy f.f., transaction f.f.)? I don't know whether that is right or not.
    For tax and sales revenue, what formulas do we need to supply?
    And which one do we need to take as the master data key for the hierarchy?
    Thanks in advance
    Edited by: subbaraju on Dec 24, 2009 7:05 PM

  • Transfer structure sequence for Flat File

    I am wondering whether it is really important to maintain a particular sequence in the DS/TR structure for InfoObjects which are set up to get a constant value in the TR rules. I am having trouble loading a flat file. The TR structure has some compounded characteristics, like 0Fisvar for 0Fiscper and 0Co_area for 0profit_ctr, both getting a fixed value in the transfer rules. How should the flat file and TR structure sequence be maintained?

    Hi Vishan,
    This is what I used to do.
    If there is a compounded InfoObject or a key figure that requires a unit (even though in your case the compounded InfoObject and the unit are always the same), you "need" to add them to the TS.
    After adding them, this is what I would do:
    Move them around to match the flat file structure.
    Then move the fields that are always constants and not coming from the flat file to the end.
    They will be ignored; your TR will populate them.
    If you still get an error, that means your flat file is not formatted correctly. Apart from that, you are fine.
    Ram Chamarthy
    Message was edited by: Ram Chamarthy

  • Create source structure -Need to create flat file or table?

    Hi All,
    Can anyone let me know what needs to be created? Should it be a flat file or a table in the database?
    I am going to receive flat files from the source system.
    If it is a flat file, is there any way to create the flat file structure in DS using SQL query statements?
    Thanks
    Rajeev

    You can create a table with the same structure as the file and use the table instead, but how are you going to populate that table with the data from the flat file? You will need some application to do that. You can do this in DS by creating a file format, using the source file as input for that file format, creating a template table as the target in the database datastore to which you want to load the data from this source file, and then using that table as a source in other dataflows.
    You can also create the table with the same structure directly in the database, import it into the datastore, and use it as the target for the source file.
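    As a minimal illustration of that second option, a staging table mirroring a hypothetical three-column flat file might look like the sketch below (the table and column names are invented for illustration and are not from the original post):
    -- Hypothetical staging table that mirrors the flat file layout one-to-one;
    -- import it into the datastore and use it as the target for the file.
    CREATE TABLE stg_orders_flat
    ( customer_id  VARCHAR2(20),
      order_dt     DATE,
      order_amt    NUMBER(15,2)
    );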

  • Error 1 when loading flat file in BW 7.0

    Hi,
    The flat file structure is the same as the transfer structure. It is a CSV file, and I also checked the delimiters and such. The file is not open; it is closed while I am loading it. The same file loads if I try on another laptop with my ID. If I use my colleague's ID on my system, it also doesn't work, so the basic problem is with my laptop. I know it's not related to the type of data or the transfer structure; it's some setting on my laptop that got changed automatically. If I install other software like Mozilla Firefox or Yahoo Messenger, will that create a problem? I don't understand at all why it's like this. Please help. The error messages I get when I try to load the flat file are:
    Error 1 when loading external data
    Diagnosis
    Error number 1 occurred when loading external data:
    1. Error when reading the file (access rights, file name, ...)
    2. File size or number of records does not correspond to the data in the control file
    3. Error when generating the IDoc
    4. File contains invalid data (errors with an arithmetic operation or data conversion)
    Procedure
    Check whether you have the required access rights and whether the data in the control file is correct (file names, record length, number of records, ...). Correct the data in the control file if necessary and check the data file for invalid data (values of the wrong type, values in the wrong format for conversion exit,...). Check whether the file has headers that have not been specified.
    Error when opening the data file C:\vikki1.csv (origin C)
    Message no.
    Diagnosis
    File C:\vikki1.csv (origin C) could not be opened.
    Origin:
    A : Application server
    C : Client workstation
    Procedure
    Check whether the file entered exists and is not being used by other applications.

    Hi Vikki,
    Error 1 means your flat file is open while you are uploading the data.
    Your flat file should be closed while uploading data into BW;
    that is why it is saying "Error when opening the file...".
    First close that file and then upload; it will work.
    The rest of the things are OK.
    I hope this will help you.
    Regards,
    khyati.

  • Hierarchy flat file extraction

    Hi experts--
    Can anyone guide me through flat file hierarchy extraction, step by step?
    If possible, with screenshots.
    can send it to [email protected]
    regards,
    Rambo.

    Hi,
    Flat file hierarchy extraction is similar to the normal flat file extraction procedures. But the file structure itself can be complex and is different than normal flat files.
    Take a look at the threads below for more details :
    Hierarchy Flat file
    Hierarchy from flat file
    Program to load a flat file in a Hierarchy
    Cheers,
    Kedar

  • Hierarchy using a flat file for a master data load.

    Can anyone please tell me the steps involved in creating a hierarchy in BI 7.0 and loading the data into the created hierarchy using a flat file? I have seen some posts and weblogs, but they were not helpful, and I have been getting some errors which I don't know how to resolve.
    Can someone please give clear steps and a procedure for this?
    Thanks a lot in advance.
    Naveen

    Hi,
    The flat file is generated on the application server by executing an ABAP program in SE38.
    This ABAP program contains the logic to generate the hierarchy.
    In our case we used a DSO to store employees & supervisors.
    This program generated a flat file in the tmp folder on the application server.
    In our case, to load this flat file we created an InfoSource on our master data using direct update (transaction RSA1OLD).
    After the InfoSource is created, right-click on the InfoSource and click on Assign DataSource.
    Give the Source System name.
    When you open this InfoSource you can select the DataSource YHIEROBJ_HIER and then activate the InfoSource.
    Create an InfoPackage to load the hierarchy. Right-click on the PC Files (Source System) and click on "Create InfoPackage".
    Under the Extraction tab, give the flat file name which is generated by the ABAP program in AL11.
    Select the hierarchy which you want to load. You can also rename the hierarchy after loading it. The update method is selected as "Full Update".
    Logic for Generation of flat file & ABAP Program:
    Flat file structure:
    Node ID     InfoObject     NodeName     Parent ID     Date To      Date From     Language
    • "Node ID" indicates the unique number which defines the Node.
    • "InfoObject" gives the name of the InfoObject which is assigned to this Node ID.
    • "NodeName" is the name you specify to that Node.
    • "Parent ID" is the Node ID to which the Current Node ID reports to.
    • "Date To" and "Date From" are taken for time references (You can set them constant).
    • "Language" is set to English (E).
    The program Logic can be :
    Declare the hierarchy structure
    Get the data from DSO into internal table
    Read all the EMP values into NodeName Field of the table
    Build lookup table for Parent Node ID
    Fill Parent Node ID field
    Append this data to work area & then to another internal table
    You need to declare the selection parameter as SSFILE1
    & use following code to write back the file
      OPEN DATASET SSFILE1 FOR OUTPUT IN TEXT MODE ENCODING DEFAULT
         MESSAGE MSG.
      IF SY-SUBRC NE 0.
        MESSAGE E008(ZBW1) WITH MSG.
      ENDIF.
    Populate the output from that internal table into final work area.
    Transfer this final work area to the ssfile1 which is the desired output file.
    and   CLOSE DATASET SSFILE1.
    When this program is executed, selection screen asks for the name of file
    Enter the desired file name & execute
    flat file would be generated in the Application server now.
    To view the File, Go to Transaction AL11. Search the directory /tmp. Double click on this directory.
    This opens the list of flat files under this directory. Double click on the file name.
    Hope this helps.
    Thanks,
    Rashmi.

  • EFT Payment Program. Writing to a flat file using UTL Package

    Hi guys
    I wonder if someone can help me; I'm battling with something in PL/SQL. I have a procedure that writes to a flat file. Each procedure I call writes a single section of the file, e.g. "create_hdr_rec_line_fn" writes the header at the top of the file and "create_std_trx_rec_line_fn" writes the body below the header. Lastly, another procedure writes the trailer at the bottom of the file.
    My problem comes here: I have a proc that calculates the hash total.
    It is written out with "utl_file.put_line(g_file_id, g_seed_number)". I want to write the hash total next to the trailer section of the file and not below it. How can I do this? Here's an example of the flat file produced. The very last line is the hash total, but I want my hash total to be on the same line as the trailer, just next to it. I hope my question is clear. Please see my procedure below the flat file.
    Thanking you in advance....
    My flat file
    FHSSVSDIM15000932008102810483220081028T (header)
    SD0009300100001D19874200019873402211ACSA JOHANNESBURG INTERNATIONA (Line1-Detail)
    SC00093001D14540500014540057261IS H/O MAIN ACCOUNT 0000000124959315207 (Line-Transaction)
    ST00093000000700000000070000000000020000001806378410000000000000000000001806378 (Trailer)
    58298239848772973764654319387982 (hash total)
    My procedure
    PROCEDURE eft(errbuf          OUT NOCOPY VARCHAR2,
                  retcode         OUT NOCOPY NUMBER,
                  p_payment_batch IN VARCHAR2) IS
      v_eft_date VARCHAR2(100);
    BEGIN
      v_eft_date      := TO_CHAR(SYSDATE, 'DD-MON-RRRR_HH24_MI_SS');
      g_payment_batch := p_payment_batch;
      g_file_name     := 'EFT'||v_eft_date||'.txt';
      g_file_id       := utl_file.fopen(g_dir_name, g_file_name, 'W');
      utl_file.put_line(g_file_id, create_hdr_rec_line_fn);
      create_std_trx_rec_line_fn(g_file_id);
      create_std_contra_rec_line_fn(g_file_id);
      create_std_trailer_fn(g_file_id);
      utl_file.put_line(g_file_id, g_seed_number);
      utl_file.fclose_all;
      IF (update_tables != TRUE) THEN
        Fnd_File.put_line(Fnd_File.LOG, 'Failed to update payment batch tables. Cancel the batch');
      END IF;
    EXCEPTION
      WHEN OTHERS THEN
        Fnd_File.put_line(Fnd_File.LOG, 'Print Error - eft' || SUBSTR(SQLERRM, 1, 250));
    END eft;

    user643734 wrote:
    Hi cdkumar
    I'm not quite sure I understand what you mean. Are you saying that I should use "PUT" to write 'create_std_trailer_fn' and then use "PUT_LINE" to write 'g_seed_number'? Could you please show me what you mean by changing the proc I posted in my question with your suggestions?
    How about, rather than us coding it for you, you try changing the code and giving it a go yourself?
    Essentially PUT_LINE will write out the data and put a "new line" character on the end so the next output will appear on the following line; PUT will just output the data without terminating the line, so any subsequent output will just append to the end.
    It's very simple. Give it a go.
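    To make that concrete, here is a minimal sketch of the idea, not the poster's actual procedure: the directory name and record contents below are placeholders, and it assumes the trailer text is available where the call is made (in the original code the trailer is written inside create_std_trailer_fn, so that routine's final write is where PUT would replace PUT_LINE):
    DECLARE
      l_file UTL_FILE.FILE_TYPE;
      l_hash VARCHAR2(40) := '58298239848772973764654319387982';  -- hash total from the example file
    BEGIN
      l_file := UTL_FILE.FOPEN('EFT_DIR', 'eft_demo.txt', 'W');   -- 'EFT_DIR' is a placeholder directory object
      UTL_FILE.PUT_LINE(l_file, 'FH...header record');            -- header: line is terminated
      UTL_FILE.PUT_LINE(l_file, 'SD...detail record');            -- detail: line is terminated
      UTL_FILE.PUT(l_file, 'ST...trailer record');                -- trailer: PUT leaves the line open
      UTL_FILE.PUT_LINE(l_file, l_hash);                          -- hash total lands on the same line and ends it
      UTL_FILE.FCLOSE(l_file);
    END;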
