Related to CSV file data upload

Hi all,
I am new to Oracle technology.
While uploading data from a CSV file I got this error:
"SQL*Loader-350: Syntax error at line 4.
Expecting keyword TABLE, found "xxgfs_gen_text_lookups".
APPEND INTO xxgfs_gen_text_lookups"
my csv file data is
Invoice Match Options,,,Invoice,,
Invoice Match Options,,,Receipt,,
Invoice Match Options,,,Purchase Order,,X
Invoice Type,A 00,Advance,Standard,Standard invoice,
Invoice Type,B 00,Expense,Standard,Standard invoice,
Invoice Type,2 00,Debit Memo,Credit Memo,Credit Memo,
Invoice Type,2 20,EDI Debit Memo,Credit Memo,Credit Memo,
Invoice Type,1 00,Invoice,Standard,Standard invoice,
Invoice Type,1 01,Arrow Credit,Standard,Standard invoice,,
Invoice Type,1 10,Recurring Payments,Standard,Standard invoice,,
Invoice Type,1 20,EDI Invoices,Standard,Standard invoice,,
Invoice Type,1 21,Imaged Invoice,Standard,Standard invoice,X,Default when entering from form
Invoice Tax Code,XMT,DO NOT USE,,,,
Invoice Tax Code,STE,DO NOT USE,,,,
Invoice Tax Code,,,,,,
Invoice Tax Code,,,,,,
Invoice Tax Code,,,,,,
Invoice Tax Code,,,,,,
Invoice Tax Code,,,,,,
Invoice Tax Code,,,,,,
Invoice Tax Code,,,,,,
Invoice Tax Code,,,,,,
Invoice Tax Code
Even if I remove the blank rows, I still get the same problem.
If anybody has faced the same problem, please help me out.
Thanks in advance to you all for your help.
-Rajnish

This is the Oracle Application Express (formerly known as HTML DB) forum. SQL*Loader-related questions should be asked in the Database SQL forum (PL/SQL) or the Database General forum (General Database Discussions), but being a nice guy familiar with SQL*Loader I will give it a go.
The error message you are getting indicates that you have a syntax error in your control file. The syntax for the APPEND keyword is APPEND INTO TABLE table_name. So change "APPEND INTO xxgfs_gen_text_lookups" to "APPEND INTO TABLE xxgfs_gen_text_lookups".
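For reference, a minimal control file along those lines might look like the sketch below; the column list is only a placeholder (substitute your table's actual columns), and TRAILING NULLCOLS keeps the short rows, such as the trailing "Invoice Tax Code" lines, from failing on missing trailing fields.
LOAD DATA
INFILE 'xxgfs_gen_text_lookups.csv'
APPEND INTO TABLE xxgfs_gen_text_lookups
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(lookup_type, lookup_code, description, mapped_value, mapped_description, enabled_flag)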
Mike

Similar Messages

  • CSV file data upload problem

    Hi all,
    I am new to Oracle technology.
    While uploading data from a CSV file I got this error:
    "SQL*Loader-350: Syntax error at line 4.
    Expecting keyword TABLE, found "xxgfs_gen_text_lookups".
    APPEND INTO xxgfs_gen_text_lookups"
    my csv file data is
    Invoice Match Options,,,Invoice,,
    Invoice Match Options,,,Receipt,,
    Invoice Match Options,,,Purchase Order,,X
    Invoice Type,A 00,Advance,Standard,Standard invoice,
    Invoice Type,B 00,Expense,Standard,Standard invoice,
    Invoice Type,2 00,Debit Memo,Credit Memo,Credit Memo,
    Invoice Type,2 20,EDI Debit Memo,Credit Memo,Credit Memo,
    Invoice Type,1 00,Invoice,Standard,Standard invoice,
    Invoice Type,1 01,Arrow Credit,Standard,Standard invoice,,
    Invoice Type,1 10,Recurring Payments,Standard,Standard invoice,,
    Invoice Type,1 20,EDI Invoices,Standard,Standard invoice,,
    Invoice Type,1 21,Imaged Invoice,Standard,Standard invoice,X,Default when entering from form
    Invoice Tax Code,XMT,DO NOT USE,,,,
    Invoice Tax Code,STE,DO NOT USE,,,,
    Invoice Tax Code,,,,,,
    Invoice Tax Code,,,,,,
    Invoice Tax Code,,,,,,
    Invoice Tax Code,,,,,,
    Invoice Tax Code,,,,,,
    Invoice Tax Code,,,,,,
    Invoice Tax Code,,,,,,
    Invoice Tax Code,,,,,,
    Invoice Tax Code
    Even if I remove the blank rows, I still get the same problem.
    If anybody has faced the same problem, please help me out.
    Thanks in advance to you all for your help.
    -Rajnish

    Running SQL*Loader as:
      sqlload userid=... control=... data=... log=...
    The HOSTSTR logical has been set to the same value as your connection string, but
    without the domain name.
    When you have specified a connect string (i.e. SCOTT/TIGER@DATABASE) but no
    domain, you receive these errors:
      SQL*Loader-704: Internal error: ulconnect: OCIServerAttach [0]
      ORA-12154: TNS:could not resolve service name
    When you have not specified a connect string (i.e. SCOTT/TIGER), you receive
    these errors:
      SQL*Loader-704: Internal error: ulconnect: OCIServerAttach [0]
      ORA-12162: TNS:service name is incorrectly specified
    Your sqlnet.ora has a names.default_domain entry:
      names.default_domain = world
    The syntax in your tnsnames.ora entry is correct.
    Your entry in tnsnames.ora does not include the .WORLD extension (default
    domain from sqlnet.ora).
    Solution Description
    Specify the .WORLD in your tnsnames.ora and also in your connect string.
    This will remove the error.
    Also, ensure you are not hitting Bug 893290.  Can you connect to the database from that server using sqlplus?
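    For illustration, a tnsnames.ora entry with the default domain spelled out might look like the sketch below (MYDB, myhost and the port are placeholders, not values from your system):
      MYDB.WORLD =
        (DESCRIPTION =
          (ADDRESS = (PROTOCOL = TCP)(HOST = myhost)(PORT = 1521))
          (CONNECT_DATA = (SERVICE_NAME = MYDB))
        )
    and then connect with the fully qualified name, e.g.
      sqlload userid=scott/tiger@MYDB.WORLD control=... data=... log=...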

  • Load .csv file data with OWb Process flow using Web

    Hi,
    I have a file on my local machine (there are multiple users' machines), and I need to load the data through a web user interface.
    Say we have a web page with multiple radio buttons, one per source; clicking a button passes the path of the .csv file to the application (via an API or Java programming interface), which executes an OWB process flow that accepts the file path as an input parameter and performs the load.
    It should also let users view and update the data through the web, based on user requests.
    I need your guidance on how I can implement this with OWB 11g R2.
    I am assuming this is done through web browser functionality; please confirm, and if so, please shed some light on the steps to implement it.
    Thanks

    Hi David,
    Thanks for your reply.
    I understand your proposed solution, but my requirement is as follows.
    1. We are currently considering a web page, likely implemented in Java, that allows users to load .csv file data into a staging area (loading a flat file into a database table).
    Case 1: assume the OWB software is not installed on the user's machine (I think it is not).
    Is it possible from the web page (in this case a Java page) to trigger a Java procedure, a PL/SQL procedure, or a combination of both to load data into the staging area? If yes, how would it affect performance when loading a 1 GB file?
    Case 2: the OWB client software is installed on the user's machine; does passing parameters at runtime mean passing them manually?
    If it is automated, how should I pass the machine name and path to the OWB runtime web browser?
    Could you please give me guidance on how I should achieve this functionality with the APEX customization part?
    Thanks again for your support.
    Anil

  • CSV file data load

    Hi All,
    I have a CSV file with data in it, and I have to load the CSV file data into a supporting table.
    In the code I pass parameters for the CSV file name, the path, and the supporting table that the file data needs to be inserted into. (This is just sample code; I need to handle many more things.)
    My sample CSV file data is below (comma separated); the third row has a null value. Please help me handle null values in the code below while inserting.
    1,AB,1/1/2013
    2,CD,1/1/2012
    3,<null>,1/1/2013
    Sample code:
    DECLARE
        v_csvfile       VARCHAR2(30) := 'temp.csv';--<<This is the csv file name>>
        v_csvpath       VARCHAR2(30) := --<<This is the path>>
        v_csvtab        VARCHAR2(30) := 'TEMP';
        v_csvdata       VARCHAR2(32767);
        v_csveof        BOOLEAN := FALSE;
        v_csvfilehandle utl_file.file_type;
        v_str           VARCHAR2(32767);
    BEGIN
        v_csvfilehandle := utl_file.fopen(v_csvpath, v_csvfile, 'r', 32767);
        WHILE NOT v_csveof
        LOOP
            BEGIN
                utl_file.get_line(v_csvfilehandle, v_csvdata);
                FOR i IN (SELECT s.data_type, s.internal_column_id
                            FROM user_tab_cols s
                           WHERE s.table_name = UPPER(v_csvtab)
                           ORDER BY internal_column_id)
                LOOP
                    IF i.data_type = 'DATE'
                    THEN
                        v_str := v_str || 'TO_DATE (' || ' REGEXP_SUBSTR( ' || '''' ||
                                 v_csvdata ||
                                 '''' || ' ,' || '''' || '[^,]+' || '''' ||
                                 ' ,1,' || i.internal_column_id || ' ),' || '''' ||
                                 'MM/DD/YYYY' || '''' || ' )';
                    ELSE
                        v_str := v_str || ' REGEXP_SUBSTR( ' || '''' ||
                                 v_csvdata ||
                                 '''' || ' ,' || '''' || '[^,]+' || '''' ||
                                 ' ,1,' || i.internal_column_id || ' ),';
                    END IF;
                END LOOP;
                        --DBMS_OUTPUT.put_line('INSERT INTO  ' || v_csvtab || ' VALUES ( ' || v_str || ' )');
                EXECUTE IMMEDIATE 'INSERT INTO  ' || v_csvtab || ' VALUES ( ' || v_str || ' )';
                                  v_str := NULL;
            EXCEPTION
                WHEN no_data_found THEN
                    v_csveof := TRUE;
            END;
        END LOOP;
        utl_file.fclose(v_csvfilehandle);
    EXCEPTION
        WHEN OTHERS THEN
            dbms_output.put_line(dbms_utility.format_error_backtrace ||
                                 ' Err ' || SQLERRM);
            RAISE;
    END;
    Script to create the sample supporting table:
    CREATE TABLE TEMP
    (col1 VARCHAR2(100),
    col2 VARCHAR2(100),
    col3 DATE );
    Thanks,
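    (A side note on the null handling asked about above, as a sketch only: REGEXP_SUBSTR with '[^,]+' skips empty fields, so values shift left when a field is null. A null-tolerant variant matches the n-th field including empty ones and returns its first capture group, e.g. for the second field of a row like '3,,1/1/2013':
      SELECT REGEXP_SUBSTR('3,,1/1/2013', '([^,]*)(,|$)', 1, 2, NULL, 1) AS col2
        FROM dual;
    which returns NULL rather than shifting the date into col2.)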

    Don't re-invent the wheel.
    If your CSV is on the database server (it must be if you are using UTL_FILE), use an external table
    http://docs.oracle.com/cd/E11882_01/server.112/e22490/et_concepts.htm#SUTIL011
    If your CSV is on a client PC, use SQL*Loader
    http://docs.oracle.com/cd/E11882_01/server.112/e22490/ldr_concepts.htm#SUTIL003
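    As a rough sketch of that approach for the sample TEMP table above (the DATA_DIR directory object is an assumption; create one pointing at the folder that holds temp.csv):
      CREATE TABLE temp_ext (
        col1 VARCHAR2(100),
        col2 VARCHAR2(100),
        col3 VARCHAR2(10)
      )
      ORGANIZATION EXTERNAL (
        TYPE ORACLE_LOADER
        DEFAULT DIRECTORY data_dir
        ACCESS PARAMETERS (
          RECORDS DELIMITED BY NEWLINE
          FIELDS TERMINATED BY ','
          MISSING FIELD VALUES ARE NULL
        )
        LOCATION ('temp.csv')
      )
      REJECT LIMIT UNLIMITED;

      INSERT INTO temp
      SELECT col1, col2, TO_DATE(col3, 'MM/DD/YYYY') FROM temp_ext;
    Empty fields simply come through as NULL, which also answers the null-value question above.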

  • Upload csv file data to sql server tables

    Hi all,
    I want clients to upload a CSV file from their machines to the server.
    Then the program should read all the data from the CSV file and do a bulk insert into the SQL Server tables.
    Please help me with how to go about doing this.
    Thanks in advance.

    1) Use a multipart form with input type="file" to let the client choose a file.
    2) Get the binary stream and put it in a BufferedReader.
    3) Read each line and map it to a DTO and add each DTO to a list.
    4) Persist the list of DTO's.
    Helpful links:
    1) http://www.google.com/search?q=jsp+upload+file
    2) http://www.google.com/search?q=java+io+tutorial
    3) http://www.google.com/search?q=java+bufferedreader+readline
    4) http://www.google.com/search?q=jdbc+tutorial and http://www.google.com/search?q=sql+tutorial
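    If the uploaded file can be staged somewhere the database engine itself can read, another option (a sketch only; the table and path below are made up) is to let SQL Server parse the CSV with BULK INSERT instead of reading it line by line:
      BULK INSERT dbo.uploaded_rows
      FROM 'C:\uploads\clientdata.csv'
      WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);  -- FIRSTROW = 2 skips a header row
    Otherwise the DTO-plus-batch-insert route described above works fine.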

  • Use PL/SQL procedure to guard against malformed CSV files before upload

    Hi all,
    In my CSV upload utility, I've implemented error checking procedures, but they are invoked only AFTER data has been uploaded to temp table. This doesn't catch the following sample scenarios:
    1. The CSV is malformed, with some rows shifted due to fields and separators missing here and there.
    2. The user has chosen the wrong CSV to upload (most likely the number of fields does not match).
    I'm wondering if it is a good idea to have a procedure read in the CSV, scan each line, and count the number of fields for each record (null fields are fine as long as the delimiters show the fields exist). If every single record matches the required number of fields for that table, then the CSV is OK and the insert from the external table to the temp table is allowed. This ensures at least that the CSV file has a valid matrix size for the target table; the rest of the error checking is left until after the temp table is populated.
    One of my concerns is that I specify "missing field values are null" in the external table parameters, which is necessary since not all fields are required. Does this specification cause a row with missing trailing separators to still be considered valid? If so, the stored procedure must be programmed to account for such a case.
    What do you think? Many thanks.
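    A minimal sketch of the kind of pre-check described above, assuming a simple comma delimiter with no embedded commas, and a directory object and expected column count supplied by the caller (all names here are made up):
      CREATE OR REPLACE FUNCTION csv_shape_ok (
          p_dir       IN VARCHAR2,      -- directory object holding the CSV
          p_file      IN VARCHAR2,
          p_col_count IN PLS_INTEGER    -- required number of fields per record
      ) RETURN BOOLEAN
      IS
          l_handle UTL_FILE.FILE_TYPE;
          l_line   VARCHAR2(32767);
      BEGIN
          l_handle := UTL_FILE.FOPEN(p_dir, p_file, 'r', 32767);
          LOOP
              BEGIN
                  UTL_FILE.GET_LINE(l_handle, l_line);
              EXCEPTION
                  WHEN NO_DATA_FOUND THEN EXIT;  -- end of file
              END;
              -- fields = separators + 1; trailing separators still count as (empty) fields
              IF REGEXP_COUNT(l_line, ',') + 1 <> p_col_count THEN
                  UTL_FILE.FCLOSE(l_handle);
                  RETURN FALSE;
              END IF;
          END LOOP;
          UTL_FILE.FCLOSE(l_handle);
          RETURN TRUE;
      END csv_shape_ok;
      /
    Note that with "missing field values are null" a row that drops its trailing separators would fail this count, so decide beforehand whether such a row should be treated as valid.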

    Hi, Cuauhtemoc Amox
    Thank you for your advice. I have set up Web ADI using the PL/SQL interface.
    I have decided to go further.
    I have a procedure like this:
    procedure delete_old_data (p_id in number, p_description in varchar2)
    as
    begin
       begin
          if g_flag = 'N' then
             delete from xx_test;
             g_flag := 'Y';
          else
             null;
          end if;
       end;
       insert into xx_test values (p_id, p_description);
    end;
    G_FLAG is a global variable with default value 'N' in my package. When Web ADI uploads the
    first row from the Excel sheet, the procedure deletes all data from the table and changes g_flag to 'Y'.
    All other rows upload successfully; it works fine.
    But when the user changes data in the Excel sheet and tries to upload a second time, the DELETE_OLD_DATA procedure does not work properly.
    I have found what the problem is: when the user uses the same template and tries to upload data several times, the SESSION_ID in the database stays the same.
    That is, g_flag = 'Y' when the user tries to upload a second time. But when the user logs off and creates the template again, it works properly.
    My question is: how can I distinguish different upload attempts? Maybe there is some ID for each upload process.
    If I can use this ID in my procedure, the user will be able to reuse the same template.
    Thank you

  • How to align the CSV file on upload?

    Hi All,
    I have to upload a CSV file as an attachment to a mail. The data in the internal table that has to be written to the CSV file is separated by commas, but on upload it all appears in the same column of the CSV file; I need the data separated into different columns and different lines.
    Please help; it is very urgent.
    Thanks in Advance...

    Hi
    From my understanding, you are talking about a download.
    For that you have to concatenate the fields of the final internal table (gt_itab, for example) using a comma as separator, as below.
    TYPES: BEGIN OF ty_lines,
             line(1023) TYPE c,
           END OF ty_lines.
    DATA: l_filename TYPE string VALUE 'C:\temp\abcd.csv',
          gt_lines  TYPE TABLE OF ty_lines,
          gw_lines  TYPE ty_lines.
      LOOP AT gt_itab INTO gw_itab.
        CONCATENATE gw_itab-f1
                    gw_itab-f2 .....
               INTO gw_lines-line
               SEPARATED BY ','.
         APPEND gw_lines TO gt_lines.
      ENDLOOP.
      CALL FUNCTION 'GUI_DOWNLOAD'
        EXPORTING
          filename                = l_filename
          filetype                = 'ASC'
          confirm_overwrite       = 'X'
          no_auth_check           = 'X'
        TABLES
          data_tab                = gt_lines.

  • Flat file data upload - happens for only 1 column

    Dear Experts,
    Goal: To upload data from flat file (with 3 columns-char3,language,text) to characteristic (texts).
    Problem: I am loading through a CSV file, but when I click on preview I can see data only in the first column; the other two columns are empty.
    More details:
    I have created this data source in data source tab.
    Adapter - Load text type file from Workstation.
    Data format - Separated with separator (CSV).
    Data Separator     - ,
    Escape Sign     - "
    File name      - C:\Documents and Settings\hans\Desktop\mtype.txt
    Fields
    /BIC/TSMOVTYPE           CHAR     3
    LANGU               CHAR     2
    TXTSH               CHAR     20
    Can anyone give me an idea?
    regards
    BI Learner

    Hi,
    You can try this:
    1. After ESCAPE SIGN, there will be an option " Number of rows to be left". Put 1 there.
    2. In your flat file, leave the first row blank or add anything in the first row for all three columns, then save as a CSV file. Note that the file must have the same field sequence as in the DataSource.
    3. Check your transformations. Data source fields should be mapped correctly to the right side.
    Try the load now. Hope it helps.
    Preet

  • Related to CSV file .

    Hi,
    How do we read a CSV file using the OO (object-oriented) approach? And at the same time, how can we upload it? Are there any special functions to do that? Please reply; this is urgent.
    Thanks,
    Sindhu.

    Here is a slightly more dynamic approach.
    report zrich_0001.
    types: begin of ttab,
           fld1(20) type c,
           fld2(20) type c,
           fld3(20) type c,
           end of ttab.
    data: itab type table of ttab.
    data: wa type ttab.
    data: iup type table of string.
    data: xup type string.
    data: isplit type table of string with header line.
    field-symbols: <fs>.
    call method cl_gui_frontend_services=>gui_upload
      exporting
        filename                = 'C:\test.csv'
      changing
        data_tab                = iup.
    loop at iup into xup.
      clear wa.
      clear isplit.  refresh isplit.
      split xup at ',' into table isplit.
      loop at isplit.
        assign component sy-tabix of structure wa to <fs>.
        if sy-subrc <> 0.
          exit.
        endif.
        <fs> = isplit.
      endloop.
      append wa to itab.
    endloop.
    loop at itab into wa.
      write:/ wa-fld1, wa-fld2, wa-fld3.
    endloop.
    Regards,
    Rich Heilman

  • Best Practice for Flat File Data Uploaded by Users

    Hi,
    I have the following scenario:
    1.     Users would like to upload data from flat file and subsequently view their reports.
    2.     SAP BW support team would not be involved in data upload process.
    3.     Users would not go to RSA1 and use InfoPackages & DTPs. Hence, another mechanism for data upload is required.
    4.     Users consist of two groups, external and internal. External users would not have access to the SAP system; however, access via a portal is acceptable.
    What are the best practice we should adopt for this scenario?
    Thanks!

    Hi,
    I can share what we do in our project.
    We get the files from the web to the application server, into a path dedicated to this process. The file placed on the server follows a naming convention based on your project; you can name it as you like. Every day a file with the same name is placed on the server with different data. The path in the InfoPackage is fixed to that location on the server. After this the process chain triggers and loads the data from that particular path on the application server. After the load completes, a copy of the file is taken as a backup and the file is deleted from that path.
    So this happens every day.
    Rgds
    SVU123
    Edited by: svu123 on Mar 25, 2011 5:46 AM

  • Commas in my CSV file data

    Hi everyone,
    I am trying to create a csv file.
    One field that I am trying to output is the description, which contains commas; that is throwing my columns out of sync.
    I have tried putting the character ' around the data, but all it does is output that character in the data.
    my code is this
         public File createFile(Vector downloadObject){
              Iterator iterator = downloadObject.iterator();
              String query ="";
              try {
                   Format formatter2 = new SimpleDateFormat("ddMMyyyy");
                   String fileName="DELIVERY" + ".csv";
                   File file = new File(fileName);
                   StringBuffer output = new StringBuffer();
                   while (iterator.hasNext()){
                        VODownloadObject downloadObject2 = (VODownloadObject)iterator.next();
                        query =
                             "'" +
                             downloadObject2.getStrokeNumber().trim()
                             + "','"+ downloadObject2.getJdeIdNumber()
                             + "','"+ downloadObject2.getRollNumber()
                             + "','"+ downloadObject2.getSupplier()
                             + "',"+ ",'"+ downloadObject2.getQuality()
                             + "','"+ downloadObject2.getColour()
                             + "','"+ downloadObject2.getSupplierRollNumber()
                             + "',"+ ",'"+ downloadObject2.getDeliveryDate()
                             + "','"+ downloadObject2.getFabricComposition()
                             + "',"+ ","+ ",'"+ downloadObject2.getTicketLength()
                             + "','"+ downloadObject2.getTicketWidth()
                             + "',"+ ","+ ","+ ",";
                         output.append(query + "\r\n");
                    }
                    BufferedWriter out = new BufferedWriter(new FileWriter(fileName));
                   out.write(output.toString());
                   out.close();
                   return file;
              } catch (IOException e) {
                   e.printStackTrace();
                   return null;
               }
          }
     The output is this:
    'PA?????T','2/664659 ','153920','Menswear Formal, Shirts ',,'null','null',' ',,'11/07/2006','null',,,'-330000',' ',
    The 'Menswear Formal, Shirts ' value is one field, but it is being split into two fields because there is a comma after Formal. Does anyone know how I can keep this as one field with a comma in it?
    thanks in advance

    I have abandoned my last approach altogether.
    I am using com.Ostermiller.util.CSVPrinter.
    Here is my final working class:
         public File createFile(Vector downloadObject){
              Iterator iterator = downloadObject.iterator();
              try {
                   String fileName = "DELIVERY" + ".csv";
                   File file = new File(fileName);
                   BufferedWriter out = new BufferedWriter(new FileWriter(fileName));
                   ExcelCSVPrinter ecsvp = new ExcelCSVPrinter(out);
                   while (iterator.hasNext()){
                        VODownloadObject downloadObject2 = (VODownloadObject)iterator.next();
                        String quality = " ";
                        if (downloadObject2.getQuality()!=null){
                             quality = downloadObject2.getQuality().trim();
                        }
                        String colour = " ";
                        if (downloadObject2.getColour()!=null){
                             colour = downloadObject2.getColour().trim();
                        }
                        String fabricComposition = " ";
                        if (downloadObject2.getFabricComposition()!=null){
                             fabricComposition = downloadObject2.getFabricComposition().trim();
                        }
                        ecsvp.writeln(new String [] {
                                  downloadObject2.getStrokeNumber().trim(), downloadObject2.getJdeIdNumber().trim(),
                                  downloadObject2.getRollNumber(), downloadObject2.getSupplier().trim(), " ",
                                  quality, colour, downloadObject2.getSupplierRollNumber().trim(), " ",
                                  downloadObject2.getDeliveryDate().trim(), fabricComposition, " ", " ",
                                  downloadObject2.getTicketLength(), downloadObject2.getTicketWidth().trim() });
                   }
                   out.close();
                   return file;
              } catch (IOException e) {
                   e.printStackTrace();
                   return null;
              }
         }

  • Hi, related to CSV file output

    hi all,
    I am producing output in ALV and in .CSV format (to a Unix path).
    The ALV output is fine.
    In one of the material descriptions the text is like this:
    my name is (Activ.Foil+S,sr)
    So for the description the CSV ends up with: my name is (Activ.Foil+S
    and the remaining text goes into the next field,
    i.e. for material group it gets: sr)
    so all the values are shifted by one field.
    Because of this the BW team is facing a problem while extracting the data.
    Can anyone please help me with how to solve this problem?
    Thanks and Regards.
    C.Shamsundher.

    Hi Prem,
    In such a case you need to change the delimiter, or else remove the comma from the material description once you retrieve the data from the database into the internal table:
    REPLACE ALL OCCURRENCES OF ',' in WA_TAB-DESC with ' '.
    Then move the material description to the CSV file.
    Best regards,
    Prashant

  • Manipulate CSV File Data

    Hello,
    Our Active Directory structure only lists the person's manager in the format of an email address. After running a Get-ADUser command and exporting its data to a CSV file, I want to import the CSV file, which will be in the format shown below, then manipulate it so the manager1 field is converted to the relevant objectGUID and populated in the Manager_objectGUID field.
    I'm completely new to PowerShell so any assistance would be much appreciated.
    Thanks
    Stuart

    $OldCSV = "C:\OldADUsers.csv"
    $NewCSV = "C:\NewADUsers.csv"
    Add-Content $OldCSV "objectGUID,mail,givenName,sn,manager1,Manager_objectGUID"
    Add-Content $OldCSV "7c1be78f,[email protected],Mickey,Mouse,,"
    Add-Content $OldCSV "982874ab,[email protected],Donald,Duck,[email protected],"
    $ADUsers = Import-CSV $OldCSV
    # collect managers
    $Managers = @{}
    ForEach ($ADUser in $ADUsers) {
        $Managers.Add($ADUser.Mail, $ADUser.ObjectGUID)
    }
    # Get Managers
    ForEach ($ADUser in $ADUsers) {
        If ($ADUser.manager1 -NE "") {
            If ($Managers[$ADUser.manager1] -NE $Null) {
                $ADUser.Manager_objectGUID = $Managers[$ADUser.manager1]
            }
        }
    }
    $ADUsers | Export-CSV $NewCSV
    Get-Content $NewCSV
    Gives this output
    #TYPE System.Management.Automation.PSCustomObject
    "objectGUID","mail","givenName","sn","manager1","Manager_objectGUID"
    "7c1be78f","[email protected]","Mickey","Mouse","",""
    "982874ab","[email protected]","Donald","Duck","[email protected]","7c1be78f"

  • CSV  file data load error

    Hi,
    Data load from Flat File.
    The master data is loaded using CSV file.
    Emp ID: in the source file it is represented as 1, 2, 3 ... 10 ... 1009.
    But when I load the data it is being converted to 0000000001, 0000000002, etc.
    The InfoObject data type is CHAR, the length is 10, and the conversion routine is none.
    I want it to be loaded as 1, 2, 3 ..., exactly as it is in the source file.
    Thanks

    Hi,
    Right-click on that cell -> Format Cells -> Number (tab) -> select Text and click OK.
    Hope this helps.
    PB

  • Flat file data upload to ODS failed

    Hi guys,
    While loading the flat file data to the ODS, it fails immediately without loading a single record. The error messages show:
    "Error when opening the data file <file name> (origin A)."
    "Error in the data request"
    "Error occurred in the data selection"
    I am trying to load the data manually through an InfoPackage; the file is on the application server and full access has been given to it. But the important part is that when we try to load the file again without making any changes, it runs fine and loads the data without any issues. This happens with every load, every day.
    Any solutions?
    Regards
    Sanjiv

    Hi CK,
    The step-by-step analysis statuses are as given below:
    <Red>         Data request sent off?
    <No colour>   RFC to source system successful?
    <No colour>   Does selectable data exist in the source?
    <Red>         Data selection successfully started?
    <Red>         Data selection successfully finished?
    <Red>         Processing error in source system reported?
    <No colour>   RFC to Warehouse successful?
    <Red>         Processing error in Warehouse reported?
    <No colour>   Processing successfully finished?
    <No colour>   All reported data packets received?
    <No colour>   All data packets complete?
    <Green>       Have all processing steps been carried out?
    <Green>       All data packets updated in all targets?
    <Green>       Inadmissible aggregation?
    Regards
    Sanjiv
