Read a CSV file, fill a dynamic table, and insert into a standard table

Hi everybody,
I have a problem here and I need your help:
I have to read a CSV file and insert its data into a standard table.
1 - On the parameter screen I have to indicate the standard table and the CSV file.
2 - I need to create a dynamic table of the same type as the table I choose on the parameter screen.
3 - Then I need to read the CSV file and put the data into this dynamic table.
4 - Later I need to insert the data from the dynamic table into the standard table (the one from the parameter screen).
How do I do this? Do you have an example? Thanks.

Here is an example program which shows how to upload a CSV file from the frontend into a dynamic internal table.  You can of course modify this to update your database table.
report zrich_0002.
type-pools: slis.
field-symbols: <dyn_table> type standard table,
               <dyn_wa>,
               <dyn_field>.
data: it_fldcat type lvc_t_fcat,
      wa_it_fldcat type lvc_s_fcat.
type-pools : abap.
data: new_table type ref to data,
      new_line  type ref to data.
data: xcel type table of alsmex_tabline with header line.
selection-screen begin of block b1 with frame title text-001.
parameters: p_file type  rlgrap-filename default 'C:\Test.csv'.
parameters: p_flds type i.
selection-screen end of block b1.
start-of-selection.
* Add X number of fields to the dynamic itab catalog
  do p_flds times.
    clear wa_it_fldcat.
    wa_it_fldcat-fieldname = sy-index.
    wa_it_fldcat-datatype = 'C'.
    wa_it_fldcat-inttype = 'C'.
    wa_it_fldcat-intlen = 10.
    append wa_it_fldcat to it_fldcat .
  enddo.
* Create dynamic internal table and assign to FS
  call method cl_alv_table_create=>create_dynamic_table
               exporting
                  it_fieldcatalog = it_fldcat
               importing
                  ep_table        = new_table.
  assign new_table->* to <dyn_table>.
* Create dynamic work area and assign to FS
  create data new_line like line of <dyn_table>.
  assign new_line->* to <dyn_wa>.
* Upload the file
  call function 'ALSM_EXCEL_TO_INTERNAL_TABLE'
       exporting
            filename                = p_file
            i_begin_col             = '1'
            i_begin_row             = '1'
            i_end_col               = '200'
            i_end_row               = '5000'
       tables
            intern                  = xcel
       exceptions
            inconsistent_parameters = 1
            upload_ole              = 2
            others                  = 3.
* Reformat into the dynamic internal table
  loop at xcel.
    assign component xcel-col of structure <dyn_wa> to <dyn_field>.
    if sy-subrc = 0.
     <dyn_field> = xcel-value.
    endif.
    at end of row.
      append <dyn_wa> to <dyn_table>.
      clear <dyn_wa>.
    endat.
  endloop.
* Write out data from table.
  loop at <dyn_table> into <dyn_wa>.
    do.
      assign component  sy-index  of structure <dyn_wa> to <dyn_field>.
      if sy-subrc <> 0.
        exit.
      endif.
      if sy-index = 1.
        write:/ <dyn_field>.
      else.
        write: <dyn_field>.
      endif.
    enddo.
  endloop.
Regards,
Rich Heilman
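
The example above stops at writing the data to the list; for step 4 of the original question (inserting the dynamic internal table into the standard table chosen on the selection screen) it is easier to type the internal table directly from the database table name, because then its structure matches the target exactly. A minimal sketch, assuming an extra parameter p_tab that holds the table name (this is not part of Rich's program):

parameters: p_tab type tabname.

data: lr_tab type ref to data.
field-symbols: <lt_data> type standard table.

* Build an internal table with exactly the structure of the chosen DB table
create data lr_tab type standard table of (p_tab).
assign lr_tab->* to <lt_data>.

* ... fill <lt_data> from the CSV file, e.g. with GUI_UPLOAD and SPLIT ...

* Insert the rows into the standard table named on the selection screen.
* INSERT dumps on duplicate keys; MODIFY (p_tab) FROM TABLE <lt_data>
* would update existing rows instead.
insert (p_tab) from table <lt_data>.
if sy-subrc = 0.
  commit work.
endif.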

Similar Messages

  • Read data from E$ table and insert into staging table

    Hi all,
    I am new to ODI. I want your help in understanding how to read data from an "E$" table and insert it into a staging table.
    Scenario:
    Two columns in a flat file, employee id and employee name, needs to be loaded into a datastore EMP+. A check constraint is added to allow data with employee names in upper case only to be loaded into the datastore. Check control is set to flow and static. Right click on the datastore, select control and then check. The rows that have violated the check constraint are kept in E$_EMP+ table.
    Problem:
    Issue is that I want to read the data from the E$_EMP+ table and transform the employee name into uppercase and move the corrected data from E$_EMP+ into EMP+. Please advise me on how to automatically handle "soft" exceptions in ODI.
    Thank you

    Hi Himanshu,
    I have tried the approach you suggested already. That works. However, the scenario I described was very basic. Based on the error logged into the E$_EMP table, I will have to design the interface to apply more business logic before the data is actually merged into the target table.
    Before the business logic is applied, I need to read each row from the E$_EMP table first, and I do not know how to read from the E$_EMP table.
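    If the fix really is just upper-casing the name, one simple approach is to recycle the rejected rows with a plain SQL step before re-running the interface. A rough sketch; the table and column names below (EMP_PLUS, E$_EMP_PLUS, EMP_ID, EMP_NAME) are placeholders, so check the actual definition of your E$ table, which also carries ODI error columns such as ERR_TYPE and ERR_MESS:
    INSERT INTO EMP_PLUS (EMP_ID, EMP_NAME)
    SELECT EMP_ID, UPPER(EMP_NAME)
      FROM E$_EMP_PLUS;
    DELETE FROM E$_EMP_PLUS;
    COMMIT;
    Many IKMs also have a RECYCLE_ERRORS option that feeds E$ rows back into the flow, which may be the more ODI-native way to handle these soft exceptions.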

  • Import data from few tables and export into the same tables on different db

    I want to import data from a few tables and export it into the same tables on a different database. But on the target database, additional columns have been added
    to those tables. How can I do the import?
    It's urgent, can anyone please help me do this?
    Thanks.

    Hello Junior DBA,
    maybe try it with the COPY command.
    http://download.oracle.com/docs/cd/B14117_01/server.101/b12170/apb.htm
    Have a look at the section "Understanding COPY Command Syntax".
    Here is an example of a COPY command that copies only two columns from the source table, and copies only those rows in which the value of DEPARTMENT_ID is 30 (see the sketch below this reply).
    Regards
    Stefan
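    A sketch of the kind of COPY command Stefan means, with placeholder connect strings, table, and column names (the exact example is in the linked "Understanding COPY Command Syntax" section):
    COPY FROM scott@source_db -
    CREATE emp_copy (empno, ename) -
    USING SELECT empno, ename FROM emp WHERE department_id = 30
    For the original question here (target tables that already exist with extra columns), an INSERT INTO target_table (col1, col2) SELECT ... over a database link is another straightforward option, since the columns you do not list keep their defaults.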

  • Fetch data from one table and insert into two tables in desired format

    I have similar to the following data in a table and it is not normalized. The groupID is being used to group two records of similar nature.
    DECLARE @OldDoc TABLE (oldDocID INT, groupID INT, deptID INT)
    INSERT INTO @OldDoc (oldDocID, groupID, deptID) VALUES (1, NULL, 111),(2,NULL,111),(3,1,111),(4,NULL,333),(5,1,222),(6,NULL,333),(7,2,222),(8,2,333),(9,NULL,111),(10,3,222),(11,NULL,333),(12,3,444)
    I need to process the data from the above table (@OldDoc) and write into two new tables (@NewDoc and @NewDocGroup) as follows.
    oldDocID should be stored as newDocID when inserting to @NewDoc table. Only records with groupID NULL and one record (first one) per group should be considered (For example, oldDocID 5 is not considered as 3 and 5 belong to the same groupID 1) for insertion. 
    DECLARE @NewDoc TABLE (newDocID INT)
    INSERT INTO @NewDoc (newDocID) VALUES (1),(2),(3),(4),(6),(7),(9),(10),(11)
    All records from @OldDoc should be considered for insertion into @NewDocGroup table. OldDocID is inserted as NewDocID and deptID is as-is. Instead of groupID, the ID of the first record in the 
    group should be considered as parentNewDocID (For example, 3 is considered as parentNewDocID for newDocID 5 as 3 and 5 belong to the same groupID in @OldDoc table) for the newDocID.
    DECLARE @NewDocGroup TABLE (newDocID INT, parentNewDocID INT, deptID INT)
    INSERT INTO @NewDocGroup (newDocID, parentNewDocID, deptID) VALUES (1,1,111),(2,2,111),(3,3,111),(4,4,333),(5,3,222),(6,6,333),(7,7,222),(8,7,333),(9,9,111),(10,10,222),(11,11,333),(12,10,444)
    How do I accomplish the above using SQL ? Thanks for the help.

    >> I have similar to the following data in a table and it is not normalized. The group_id is being used to group two records [sic] of similar nature. <<
    Rows are not records. Tables have to have a key by definition. You do not do math with identifiers, so they should not be numeric. Let's ignore that error for now. In short, you are posting garbage. If you had followed Forum Netiquette, would you have posted
    this? 
    CREATE TABLE Old_Documents
    (old_doc_id INTEGER NOT NULL PRIMARY KEY, 
     group_id INTEGER, 
     dept_nbr INTEGER NOT NULL
       REFERENCES Departments (dept_nbr));
    INSERT INTO Old_Documents(old_doc_id, group_id, dept_nbr) 
    VALUES  (1, NULL, 111), 
    (2, NULL, 111), 
    (3, 1, 111), 
    (4, NULL, 333), 
    (5, 1, 222), 
    (6, NULL, 333), 
    (7, 2, 222), 
    (8, 2, 333), 
    (9, NULL, 111), 
    (10, 3, 222), 
    (11, NULL, 333), 
    (12, 3, 444);
    >> I need to process the data from the above table (Old_Documents) and write into two new tables (New_Documents and New_Documents_Groups) as follows. <<
    Just like punch cards and mag tape data processing! Being old and being new are a status, not another kind of entity. But that is how mag tapes work. And you even use the verb "fetch" from tape files. This design flaw is called  attribute splitting.
    Do you have a Male_Personnel and Female_Personnel table? NO! It is just Personnel! 
    >> old_doc_id should be stored as new_doc_id when inserting to New_Documents table. Only records [sic] with group_id NULL and one record [sic] (first [sic; no ordering in a table] one) per group should be considered (For example, old_doc_id 5 is not considered
    as 3 and 5 belong to the same group_id =1) for insertion. <<
    Think about your punch card mindset. Why did you physically materialize that redundant New_Documents table? Let me answer that: this is how you work with punch cards! In SQL we use a VIEW:
    CREATE VIEW New_Documents (new_doc_id)
    AS 
    SELECT old_doc_id 
      FROM Old_Documents;
    >> All records [sic] from Old_Documents should be considered for insertion into New_Documents_Groups table. The old_doc_id is inserted as new_doc_id and dept_nbr is as-is. Instead of group_id, the ID [sic: which identifier??] of the first [sic: tables
    have no ordering like a deck of punch cards] record [sic] in the group should be considered as parent_new_doc_id (For example, 3 is considered as parent_new_doc_id for new_doc_id 5 as 3 and 5 belong to the same group_id in Old_Documents table) for the new_doc_id.
    <<
    Why not use 5 as the parent? My guess is that you are trying to form equivalence classes. See:
    https://www.simple-talk.com/content/print.aspx?article=2020
    --CELKO-- Books in Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice Data / Measurements and Standards in SQL SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking
    in Sets / Trees and Hierarchies in SQL
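    Setting the design debate aside, here is a sketch of the two inserts against the table variables as originally posted, reading "first in the group" as the lowest oldDocID (which matches the sample output above):
    INSERT INTO @NewDoc (newDocID)
    SELECT oldDocID
    FROM @OldDoc o
    WHERE o.groupID IS NULL
       OR o.oldDocID = (SELECT MIN(oldDocID) FROM @OldDoc WHERE groupID = o.groupID);
    INSERT INTO @NewDocGroup (newDocID, parentNewDocID, deptID)
    SELECT o.oldDocID,
           COALESCE((SELECT MIN(oldDocID) FROM @OldDoc WHERE groupID = o.groupID), o.oldDocID),
           o.deptID
    FROM @OldDoc o;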

  • Select from two tables and insert into a third

    I'm trying to do a select from two tables and do an insert into a third table from the two resulting columns.
    I have the following....
    DECLARE
    tempsid number;
    temphostid number;
    BEGIN
    select "DBSID_ID","ID" into tempsid,temphostid from "DBSIDS","SERVERS"
    where "HOST_SID" like '%'||"DBSID_NAME"||'%'
    and "HOST_NAME" not like 'vio%'
    and exists (select "DBSID_NAME" from DBSIDS)
    order by "DBSID_NAME";
    insert into "DBSID_LOOKUP" ("SIDLOOKUP_ID", "SERVERLOOKUP_ID")
    values(tempsid, temphostsid);
    END;
    run;
    I get the error ....
    ORA-06550: line 11, column 18:
    PL/SQL: ORA-00984: column not allowed here
    ORA-06550: line 10, column 1:
    PL/SQL: SQL Statement ignored
    1. DECLARE
    2. tempsid number;
    3. temphostid number;

    okay ... I tried a different way ...
    DECLARE
    a number;
    b number;
    BEGIN
    select "DBSID_ID","ID" into a,b from "DBSIDS","SERVERS"
    where "HOST_SID" like '%'||"DBSID_NAME"||'%'
    and "HOST_NAME" not like 'vio%'
    and exists (select "DBSID_NAME" from DBSIDS)
    order by "DBSID_NAME";
    insert into "DBSID_LOOKUP" (SIDLOOKUP_ID, SERVERLOOKUP_ID) values (a, b);
    END;
    and now it whines about ...
    ORA-01422: exact fetch returns more than requested number of rows
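    Two separate things are going wrong above. The ORA-00984 in the first attempt comes from the typo temphostsid (the variable is declared as temphostid), so the parser treats it as a column name inside VALUES. The ORA-01422 in the second attempt happens because SELECT ... INTO only works for exactly one row. If the goal is simply to load every matching pair, skip the scalar variables and use INSERT ... SELECT; a sketch based on the join condition in the posts (the EXISTS and ORDER BY add nothing here and are dropped):
    BEGIN
      INSERT INTO "DBSID_LOOKUP" ("SIDLOOKUP_ID", "SERVERLOOKUP_ID")
      SELECT "DBSID_ID", "ID"
        FROM "DBSIDS", "SERVERS"
       WHERE "HOST_SID" LIKE '%' || "DBSID_NAME" || '%'
         AND "HOST_NAME" NOT LIKE 'vio%';
    END;
    /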

  • Program to upload csv file to internal table and insert into database table

    Hi, I'm writing a program where I need to upload a CSV file into an internal table using GUI_UPLOAD, and I also need the program to insert the data into my custom database table using the SPLIT command.  Does anybody have any samples to help? It's urgent!

    Hi,
    Check this program, maybe it will give you a hint...
    REPORT z_table_upload LINE-SIZE 255.
    * Data
    DATA: it_dd03p TYPE TABLE OF dd03p,
          is_dd03p TYPE dd03p.
    DATA: it_rdata  TYPE TABLE OF text1024,
          is_rdata  TYPE text1024.
    DATA: it_fields TYPE TABLE OF fieldname.
    DATA: it_file  TYPE REF TO data,
          is_file  TYPE REF TO data.
    DATA: w_error  TYPE text132.
    * Macros
    DEFINE write_error.
      concatenate 'Error: table'
                  p_table
                  &1
                  &2
             into w_error
             separated by space.
      condense w_error.
      write: / w_error.
      stop.
    END-OF-DEFINITION.
    * Field symbols
    FIELD-SYMBOLS: <table> TYPE STANDARD TABLE,
                   <data>  TYPE ANY,
                   <fs>    TYPE ANY.
    * Selection screen
    SELECTION-SCREEN: BEGIN OF BLOCK b01 WITH FRAME TITLE text-b01.
    PARAMETERS: p_file  TYPE localfile DEFAULT 'C:\temp\' OBLIGATORY,
                p_separ TYPE c DEFAULT ';' OBLIGATORY.
    SELECTION-SCREEN: END OF BLOCK b01.
    SELECTION-SCREEN: BEGIN OF BLOCK b02 WITH FRAME TITLE text-b02.
    PARAMETERS: p_table TYPE tabname OBLIGATORY
                                     MEMORY ID dtb
                                     MATCHCODE OBJECT dd_dbtb_16.
    SELECTION-SCREEN: END OF BLOCK b02.
    SELECTION-SCREEN: BEGIN OF BLOCK b03 WITH FRAME TITLE text-b03.
    PARAMETERS: p_create TYPE c AS CHECKBOX.
    SELECTION-SCREEN: END OF BLOCK b03,
                      SKIP.
    SELECTION-SCREEN: BEGIN OF BLOCK b04 WITH FRAME TITLE text-b04.
    PARAMETERS: p_nodb RADIOBUTTON GROUP g1 DEFAULT 'X'
                                   USER-COMMAND rg1,
                p_save RADIOBUTTON GROUP g1,
                p_dele RADIOBUTTON GROUP g1.
    SELECTION-SCREEN: SKIP.
    PARAMETERS: p_test TYPE c AS CHECKBOX,
                p_list TYPE c AS CHECKBOX DEFAULT 'X'.
    SELECTION-SCREEN: END OF BLOCK b04.
    * At selection screen
    AT SELECTION-SCREEN.
      IF sy-ucomm = 'RG1'.
        IF p_nodb IS INITIAL.
          p_test = 'X'.
        ENDIF.
      ENDIF.
    * At selection screen
    AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
      CALL FUNCTION 'F4_FILENAME'
           EXPORTING
                field_name = 'P_FILE'
           IMPORTING
                file_name  = p_file.
    * Start of selection
    START-OF-SELECTION.
      PERFORM f_table_definition USING p_table.
      PERFORM f_upload_data USING p_file.
      PERFORM f_prepare_table USING p_table.
      PERFORM f_process_data.
      IF p_nodb IS INITIAL.
        PERFORM f_modify_table.
      ENDIF.
      IF p_list = 'X'.
        PERFORM f_list_records.
      ENDIF.
    * End of selection
    END-OF-SELECTION.
    *       FORM f_table_definition
    *  -->  VALUE(IN_TABLE)
    FORM f_table_definition USING value(in_table).
      DATA: l_tname TYPE tabname,
            l_state TYPE ddgotstate,
            l_dd02v TYPE dd02v.
      l_tname = in_table.
      CALL FUNCTION 'DDIF_TABL_GET'
           EXPORTING
                name          = l_tname
           IMPORTING
                gotstate      = l_state
                dd02v_wa      = l_dd02v
           TABLES
                dd03p_tab     = it_dd03p
           EXCEPTIONS
                illegal_input = 1
                OTHERS        = 2.
      IF l_state NE 'A'.
        write_error 'does not exist or is not active' space.
      ENDIF.
      IF l_dd02v-tabclass NE 'TRANSP' AND
         l_dd02v-tabclass NE 'CLUSTER'.
        write_error 'is type' l_dd02v-tabclass.
      ENDIF.
    ENDFORM.
    *       FORM f_prepare_table
    *  -->  VALUE(IN_TABLE)
    FORM f_prepare_table USING value(in_table).
      DATA: l_tname TYPE tabname,
            lt_ftab TYPE lvc_t_fcat.
      l_tname = in_table.
      CALL FUNCTION 'LVC_FIELDCATALOG_MERGE'
           EXPORTING
                i_structure_name = l_tname
           CHANGING
                ct_fieldcat      = lt_ftab
           EXCEPTIONS
                OTHERS           = 1.
      IF sy-subrc NE 0.
        WRITE: / 'Error while building field catalog'.
        STOP.
      ENDIF.
      CALL METHOD cl_alv_table_create=>create_dynamic_table
        EXPORTING
          it_fieldcatalog = lt_ftab
        IMPORTING
          ep_table        = it_file.
      ASSIGN it_file->* TO <table>.
      CREATE DATA is_file LIKE LINE OF <table>.
      ASSIGN is_file->* TO <data>.
    ENDFORM.
    *       FORM f_upload_data
    *  -->  VALUE(IN_FILE)
    FORM f_upload_data USING value(in_file).
      DATA: l_file    TYPE string,
            l_ltext   TYPE string.
      DATA: l_lengt   TYPE i,
            l_field   TYPE fieldname.
      DATA: l_missk   TYPE c.
      l_file = in_file.
      l_lengt = strlen( in_file ).
      FORMAT INTENSIFIED ON.
      WRITE: / 'Reading file', in_file(l_lengt).
      CALL FUNCTION 'GUI_UPLOAD'
           EXPORTING
                filename = l_file
                filetype = 'ASC'
           TABLES
                data_tab = it_rdata
           EXCEPTIONS
                OTHERS   = 1.
      IF sy-subrc <> 0.
        WRITE: /3 'Error uploading', l_file.
        STOP.
      ENDIF.
    * File not empty
      DESCRIBE TABLE it_rdata LINES sy-tmaxl.
      IF sy-tmaxl = 0.
        WRITE: /3 'File', l_file, 'is empty'.
        STOP.
      ELSE.
        WRITE: '-', sy-tmaxl, 'rows read'.
      ENDIF.
    * File header on first row
      READ TABLE it_rdata INTO is_rdata INDEX 1.
      l_ltext = is_rdata.
      WHILE l_ltext CS p_separ.
        SPLIT l_ltext AT p_separ INTO l_field l_ltext.
        APPEND l_field TO it_fields.
      ENDWHILE.
      IF sy-subrc = 0.
        l_field = l_ltext.
        APPEND l_field TO it_fields.
      ENDIF.
    * Check all key fields are present
      SKIP.
      FORMAT RESET.
      FORMAT COLOR COL_HEADING.
      WRITE: /3 'Key fields'.
      FORMAT RESET.
      LOOP AT it_dd03p INTO is_dd03p WHERE NOT keyflag IS initial.
        WRITE: /3 is_dd03p-fieldname.
        READ TABLE it_fields WITH KEY table_line = is_dd03p-fieldname
                             TRANSPORTING NO FIELDS.
        IF sy-subrc = 0.
          FORMAT COLOR COL_POSITIVE.
          WRITE: 'ok'.
          FORMAT RESET.
        ELSEIF is_dd03p-datatype NE 'CLNT'.
          FORMAT COLOR COL_NEGATIVE.
          WRITE: 'error'.
          FORMAT RESET.
          l_missk = 'X'.
        ENDIF.
      ENDLOOP.
    * Log other fields
      SKIP.
      FORMAT COLOR COL_HEADING.
      WRITE: /3 'Other fields'.
      FORMAT RESET.
      LOOP AT it_dd03p INTO is_dd03p WHERE keyflag IS initial.
        WRITE: /3 is_dd03p-fieldname.
        READ TABLE it_fields WITH KEY table_line = is_dd03p-fieldname
                             TRANSPORTING NO FIELDS.
        IF sy-subrc = 0.
          WRITE: 'X'.
        ENDIF.
      ENDLOOP.
    * Missing key field
      IF l_missk = 'X'.
        SKIP.
        WRITE: /3 'Missing key fields - no further processing'.
        STOP.
      ENDIF.
    ENDFORM.
    *       FORM f_process_data
    FORM f_process_data.
      DATA: l_ltext TYPE string,
            l_stext TYPE text40,
            l_field TYPE fieldname,
            l_datat TYPE c.
      LOOP AT it_rdata INTO is_rdata FROM 2.
        l_ltext = is_rdata.
        LOOP AT it_fields INTO l_field.
          ASSIGN COMPONENT l_field OF STRUCTURE <data> TO <fs>.
          IF sy-subrc = 0.
    *       Field value comes from file, determine conversion
            DESCRIBE FIELD <fs> TYPE l_datat.
            CASE l_datat.
              WHEN 'N'.
                SPLIT l_ltext AT p_separ INTO l_stext l_ltext.
                WRITE l_stext TO <fs> RIGHT-JUSTIFIED.
                OVERLAY <fs> WITH '0000000000000000'.           "max 16
              WHEN 'P'.
                SPLIT l_ltext AT p_separ INTO l_stext l_ltext.
                TRANSLATE l_stext USING ',.'.
                <fs> = l_stext.
              WHEN 'F'.
                SPLIT l_ltext AT p_separ INTO l_stext l_ltext.
                TRANSLATE l_stext USING ',.'.
                <fs> = l_stext.
              WHEN 'D'.
                SPLIT l_ltext AT p_separ INTO l_stext l_ltext.
                TRANSLATE l_stext USING '/.-.'.
                CALL FUNCTION 'CONVERT_DATE_TO_INTERNAL'
                     EXPORTING
                          date_external = l_stext
                     IMPORTING
                          date_internal = <fs>
                     EXCEPTIONS
                          OTHERS        = 1.
              WHEN 'T'.
                SPLIT l_ltext AT p_separ INTO l_stext l_ltext.
                CALL FUNCTION 'CONVERT_TIME_INPUT'
                     EXPORTING
                          input  = l_stext
                     IMPORTING
                          output = <fs>
                     EXCEPTIONS
                          OTHERS = 1.
              WHEN OTHERS.
                SPLIT l_ltext AT p_separ INTO <fs> l_ltext.
            ENDCASE.
          ELSE.
            SHIFT l_ltext UP TO p_separ.
            SHIFT l_ltext.
          ENDIF.
        ENDLOOP.
        IF NOT <data> IS INITIAL.
          LOOP AT it_dd03p INTO is_dd03p WHERE datatype = 'CLNT'.
    *       This field is mandant
            ASSIGN COMPONENT is_dd03p-fieldname OF STRUCTURE <data>
                                                          TO <fs>.
            <fs> = sy-mandt.
          ENDLOOP.
          IF p_create = 'X'.
            IF is_dd03p-rollname = 'ERDAT'.
              ASSIGN COMPONENT is_dd03p-fieldname OF STRUCTURE <data>
                                                            TO <fs>.
              <fs> = sy-datum.
            ENDIF.
            IF is_dd03p-rollname = 'ERZET'.
              ASSIGN COMPONENT is_dd03p-fieldname OF STRUCTURE <data>
                                                            TO <fs>.
              <fs> = sy-uzeit.
            ENDIF.
            IF is_dd03p-rollname = 'ERNAM'.
              ASSIGN COMPONENT is_dd03p-fieldname OF STRUCTURE <data>
                                                            TO <fs>.
              <fs> = sy-uname.
            ENDIF.
          ENDIF.
          IF is_dd03p-rollname = 'AEDAT'.
            ASSIGN COMPONENT is_dd03p-fieldname OF STRUCTURE <data>
                                                          TO <fs>.
            <fs> = sy-datum.
          ENDIF.
          IF is_dd03p-rollname = 'AETIM'.
            ASSIGN COMPONENT is_dd03p-fieldname OF STRUCTURE <data>
                                                          TO <fs>.
            <fs> = sy-uzeit.
          ENDIF.
          IF is_dd03p-rollname = 'AENAM'.
            ASSIGN COMPONENT is_dd03p-fieldname OF STRUCTURE <data>
                                                          TO <fs>.
            <fs> = sy-uname.
          ENDIF.
          APPEND <data> TO <table>.
        ENDIF.
      ENDLOOP.
    ENDFORM.
    *       FORM f_modify_table
    FORM f_modify_table.
      SKIP.
      IF p_save = 'X'.
        MODIFY (p_table) FROM TABLE <table>.
      ELSEIF p_dele = 'X'.
        DELETE (p_table) FROM TABLE <table>.
      ELSE.
        EXIT.
      ENDIF.
      IF sy-subrc EQ 0.
        FORMAT COLOR COL_POSITIVE.
        IF p_save = 'X'.
          WRITE: /3 'Modify table OK'.
        ELSE.
          WRITE: /3 'Delete table OK'.
        ENDIF.
        FORMAT RESET.
        IF p_test IS INITIAL.
          COMMIT WORK.
        ELSE.
          ROLLBACK WORK.
          WRITE: '- test only, no update'.
        ENDIF.
      ELSE.
        FORMAT COLOR COL_NEGATIVE.
        WRITE: /3 'Error while modifying table'.
        FORMAT RESET.
      ENDIF.
    ENDFORM.
    *       FORM f_list_records
    FORM f_list_records.
      DATA: l_tleng TYPE i,
            l_lasti TYPE i,
            l_offst TYPE i.
    * Output width
      l_tleng = 1.
      LOOP AT it_dd03p INTO is_dd03p.
        l_tleng = l_tleng + is_dd03p-outputlen.
        IF l_tleng LT sy-linsz.
          l_lasti = sy-tabix.
          l_tleng = l_tleng + 1.
        ELSE.
          l_tleng = l_tleng - is_dd03p-outputlen.
          EXIT.
        ENDIF.
      ENDLOOP.
    * Output header
      SKIP.
      FORMAT COLOR COL_HEADING.
      WRITE: /3 'Contents'.
      FORMAT RESET.
      ULINE AT /3(l_tleng).
    * Output records
      LOOP AT <table> ASSIGNING <data>.
        LOOP AT it_dd03p INTO is_dd03p FROM 1 TO l_lasti.
          IF is_dd03p-position = 1.
            WRITE: /3 sy-vline.
            l_offst = 3.
          ENDIF.
          ASSIGN COMPONENT is_dd03p-fieldname OF STRUCTURE <data> TO <fs>.
          l_offst = l_offst + 1.
          IF is_dd03p-decimals LE 2.
            WRITE: AT l_offst <fs>.
          ELSE.
            WRITE: AT l_offst <fs> DECIMALS 3.
          ENDIF.
          l_offst = l_offst + is_dd03p-outputlen.
          WRITE: AT l_offst sy-vline.
        ENDLOOP.
      ENDLOOP.
    * Output end
      ULINE AT /3(l_tleng).
    ENDFORM.
    Regards,
    Joy.
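    For the original question (GUI_UPLOAD plus SPLIT into one specific custom table) a much shorter sketch is enough. ZMYTABLE and its fields FIELD1, FIELD2, FIELD3 are made-up names here, and the file is assumed to be semicolon-separated, so adapt the structure and separator to your table:
    REPORT zcsv_to_ztable.
    PARAMETERS p_file TYPE localfile DEFAULT 'C:\temp\data.csv'.
    DATA: lv_file TYPE string,
          lt_raw  TYPE TABLE OF string,
          lv_line TYPE string,
          lt_data TYPE TABLE OF zmytable,
          ls_data TYPE zmytable.
    START-OF-SELECTION.
      lv_file = p_file.
      CALL FUNCTION 'GUI_UPLOAD'
        EXPORTING
          filename = lv_file
          filetype = 'ASC'
        TABLES
          data_tab = lt_raw
        EXCEPTIONS
          OTHERS   = 1.
      IF sy-subrc <> 0.
        WRITE: / 'Error uploading', p_file.
        STOP.
      ENDIF.
    * Split each line at the separator into the fields of the work area
      LOOP AT lt_raw INTO lv_line.
        CLEAR ls_data.
        SPLIT lv_line AT ';' INTO ls_data-field1 ls_data-field2 ls_data-field3.
        APPEND ls_data TO lt_data.
      ENDLOOP.
    * MODIFY inserts new rows and updates rows whose key already exists
      MODIFY zmytable FROM TABLE lt_data.
      COMMIT WORK.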

  • How to read data from flatfile and insert into other relevant tables ? Please suggest me the query ?

    Hi to all,
    I have flat files in different locations; through FTP I need to fetch those files and load them into the relevant tables of the database.
    Please share the query to do it.

    You would need a ForEach Loop to iterate through the files. Initially the FTP task will pull the files from the locations to a landing folder. Once that's done, the ForEach Loop will iterate through the files in the folder and will have a Data Flow Task inside to transfer
    the file data to tables.
    If you want a more secure option you can also use SFTP (secured FTP) and can implement it using the free WinSCP client. I've explained a method of doing it for dynamic files here
    http://visakhm.blogspot.in/2012/12/implementing-dynamic-secure-ftp-process.html
    for iterating through files see this example
    http://visakhm.blogspot.in/2012/05/package-to-implement-daily-processing.html
    you may not need the validation step inside the loop in your case
    Visakh

  • Dynamic Upload and Insert for each Z Table

    Hi all,
    I am searching for a blog or a solution for dynamically uploading any Z table with external data.
    We have a lot of migration tables and we want to fill them dynamically from .txt, Excel, or other files, and load the data into SAP without checks.
    What I would like is an ABAP program with these parameters: the Z table name and the path of the file to upload.
    Inside the ABAP program, take the structure from the Z table and merge the external data into the table.
    Thanks for your help.

    Here is the code for you:
    http://www.sapfans.com/sapfans/repos/neil.htm

  • How to read LONG RAW data from one  table and insert into another table

    Hello everybody,
    I have a table called SOUND with the following attributes. In the MUSIC attribute I have stored some messages in different languages such as Hindi, English, etc. I want to concatenate all Hindi messages and store them in another table with only one attribute of type LONG RAW, and this attribute is attached to the sound item.
    When I click the play button of the sound item, all the messages recorded in Hindi should play one by one automatically. For that I'm doing the following.
    I have written the following WHEN-BUTTON-PRESSED trigger, which concatenates all the messages of the selected language from the SOUND table and stores them in another table called TEMP.
    The sound will then be played from the TEMP table.
    declare
         tmp sound.music%type;
         temp1 sound.music%type;
         item_id ITEM;
    cursor c1
    is select music
    from sound
    where lang=:LIST10;
    begin
         open c1;
         loop
               fetch c1 into tmp; -- THIS LINE GENERATES THE ERROR
              temp1:=temp1||tmp;
              exit when c1%notfound;
         end loop;
    CLOSE C1;
    insert into temp values(temp1);
    item_id:=Find_Item('Music');
    go_item('music');
    play_sound(item_id);
    end;
    But when I click the button, it generates the following error:
    WHEN-BUTTON-PRESSED TRIGGER RAISED UNHANDLED EXCEPTION ORA-06502.
    ORA-06502: PL/SQL: numeric or value error
    SQL> desc sound;
    Name        Null?    Type
    SL_NO                NUMBER(2)
    MUSIC                LONG RAW
    LANG                 CHAR(10)
    If my process to solve the above problem is OK, then please tell me the solution for the error. Otherwise, please suggest any other way to solve the above problem.
    Thanks in advance.
    D. Prasad


  • Join two source tables and replicat into a target table with BLOB

    Hi,
    I am working on an integration to source transaction data from a legacy application to an ESB using GG.
    What I need to do is join two source tables (to de-normalize the area_id) to form the transaction detail, then transform by concatenating the transaction detail fields into a values-only CSV, and replicate it into the target ESB IN_DATA table's BLOB content field.
    Based on what I have researched, a lookup joining two source tables requires SQLEXEC, which doesn't support BLOB.
    What alternatives are there, and what does GG recommend in such a use case?
    Any helpful advice is much appreciated.
    thanks,
    Xiaocun

    Xiaocun,
    Not sure what your data looks like, but it's possible that the comma separated value (CSV) requirement may be solved by something like this in your MAP statement:
    colmap (usedefaults,
    my_blob = @STRCAT (col02, ",", col03, ",", col04));
    Since this is not 1:1 you'll be using a sourcedefs file, which is nice because it will do the datatype conversion for you under the covers (also a nice trick when migrating long raws to blobs). So col02 can be varchar2, col03 a number, and col04 a clob and they'll convert in real-time.
    Mapping two tables to one is simple enough with two MAP statements; the harder challenge is joining operations from separate transactions, because OGG is operation based and doesn't work on aggregates. It's possible you could end up using a combination of built-in parameters and functions with SQLEXEC and SQL/PL/SQL for more complicated scenarios, all depending on the design of the target table. But you have several scenarios to address.
    For example, is the target table really a history table or are you actually going to delete from it? If just the child is deleted but you don't want to delete the whole row yet, you may want to use NOCOMPRESSDELETES & UPDATEDELETES and COLMAP a new flag column to denote it was deleted. It's likely that the insert on the child may really mean an update to the target (see UPDATEINSERTS).
    If you need to update the LOB by appending or prepending new data then that's going to require some custom work, staging tables and a looping script, or a user exit.
    Some parameters you may want to become familiar with if not already:
    COLS | COLSEXCEPT
    COLMAP
    OVERRIDEDUPS
    INSERTDELETES
    INSERTMISSINGUPDATES
    INSERTUPDATES
    GETDELETES | IGNOREDELETES
    GETINSERTS | IGNOREINSERTS
    GETUPDATES | IGNOREUPDATES
    Good luck,
    -joe
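    To make joe's COLMAP fragment concrete, a complete (hypothetical) replicat MAP entry might look like the lines below; legacy.txn_detail, esb.in_data, my_blob and the col0x names are placeholders for your own schema, table, and column names:
    MAP legacy.txn_detail, TARGET esb.in_data,
    COLMAP (USEDEFAULTS,
      my_blob = @STRCAT (col02, ",", col03, ",", col04));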

  • Build a sql query fro different table and insert into a table

    Hi, I have a requirement:
    I have two tables. Table 1 holds the rules I have to apply to the SQL query.
    Table 2 holds the table name, the column name, and the transformation.
    With the help of these 2 tables I want to build a SQL query, and the SQL query needs to be inserted into another table.

    Hi Karthick,
    I am not going to build a dynamic data model.
    I already have table 1, which contains rules, let's say COUNT,
    and another table which contains the source table name, column name, target table name, and so on,
    so with the help of these 2 tables I want to build the SQL query and insert it into a 3rd table.
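    A very rough sketch of the idea; the table and column names here (query_rules, query_metadata, query_store and their columns) are entirely made up, so map them to your own metadata tables:
    DECLARE
      v_sql VARCHAR2(4000);
    BEGIN
      FOR r IN (SELECT m.source_table, m.column_name, m.target_table, q.rule_name
                  FROM query_metadata m
                  JOIN query_rules q ON q.rule_id = m.rule_id)
      LOOP
        -- e.g. rule_name = 'COUNT' yields SELECT COUNT(column_name) FROM source_table
        v_sql := 'SELECT ' || r.rule_name || '(' || r.column_name || ') FROM ' || r.source_table;
        INSERT INTO query_store (target_table, query_text)
        VALUES (r.target_table, v_sql);
      END LOOP;
      COMMIT;
    END;
    /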

  • Looking up data from more than one table and inserting into another

    Hello,
    I am giving you the table structures as per my requirement:
    CREATE TABLE TEMP_A
    (NAME VARCHAR2(100) primary key);
    CREATE TABLE TEMP_B
    (STRUCTURE VARCHAR2(10));
    CREATE TABLE TEMP_C
    ( NAME VARCHAR2(100),
    STRUCTURE VARCHAR2(10),
    VALUE VARCHAR2(10));
    Alter table TEMP_C
    add constraint fk_name_tempc foreign key(name) references TEMP_A(name)
    INSERT INTO TEMP_A VALUES('SMITH');
    INSERT INTO TEMP_A VALUES('ALLEN');
    INSERT INTO TEMP_A VALUES('WARD');
    INSERT INTO TEMP_A VALUES('JONES');
    COMMIT;
    INSERT INTO TEMP_B VALUES('IN');
    INSERT INTO TEMP_B VALUES('IN_MIN');
    INSERT INTO TEMP_B VALUES('IN_TYP');
    INSERT INTO TEMP_B VALUES('IN_MAX');
    INSERT INTO TEMP_B VALUES('DIP');
    INSERT INTO TEMP_B VALUES('TIM');
    COMMIT;
    INSERT INTO TEMP_c VALUES('SMITH','C1','');
    INSERT INTO TEMP_c VALUES('SMITH','C2','');
    INSERT INTO TEMP_c VALUES('SMITH','D1','');
    INSERT INTO TEMP_c VALUES('ALLEN','D2','');
    INSERT INTO TEMP_c VALUES('ALLEN','R1','');
    INSERT INTO TEMP_c VALUES('WARD','R2','');
    COMMIT;
    What I want to say is that it should insert values into table 'TEMP_C' as follows:
    For 'SMITH' there should be (6 * 3) = 18 records.
    ( 6 distinct values of TEMP_B for SMITH to be inserted into TEMP_C against 'C1')
    ( 6 distinct values of TEMP_B for SMITH to be inserted into TEMP_C against 'C2')
    ( 6 distinct values of TEMP_B for SMITH to be inserted into TEMP_C against 'D1')
    For 'ALLEN' there should be (6 * 2) = 12 records.
    ( 6 distinct values of TEMP_B for ALLEN to be inserted into TEMP_C against 'D2')
    ( 6 distinct values of TEMP_B for ALLEN to be inserted into TEMP_C against 'R1')
    For 'WARD' there should be (6 * 1) = 6 records.
    ( 6 distinct values of TEMP_B for WARD to be inserted into TEMP_C against 'R2')
    Likewise, if there are records for JONES, it should do the same (depending on the number of records present
    in the table 'TEMP_C' for 'JONES').
    Thanks in advance,
    Amkotz

    Is this what you are looking for?
    SQL> insert into temp_c (name, structure, value)
      2  select c.name, c.structure, b.structure
      3  from temp_a a, temp_c c, temp_b b
      4  where a.name = c.name
      5  ;
    36 rows created.
    SQL> select * from temp_c;
    NAME    STRUCTURE  VALUE
    SMITH   C1
    SMITH   C2
    SMITH   D1
    ALLEN   D2
    ALLEN   R1
    WARD    R2
    SMITH   C1         IN
    SMITH   C1         IN_MIN
    SMITH   C1         IN_TYP
    SMITH   C1         IN_MAX
    SMITH   C1         DIP
    SMITH   C1         TIM
    SMITH   C2         IN
    SMITH   C2         IN_MIN
    SMITH   C2         IN_TYP
    SMITH   C2         IN_MAX
    SMITH   C2         DIP
    SMITH   C2         TIM
    SMITH   D1         IN
    SMITH   D1         IN_MIN
    SMITH   D1         IN_TYP
    SMITH   D1         IN_MAX
    SMITH   D1         DIP
    SMITH   D1         TIM
    ALLEN   D2         IN
    ALLEN   D2         IN_MIN
    ALLEN   D2         IN_TYP
    ALLEN   D2         IN_MAX
    ALLEN   D2         DIP
    ALLEN   D2         TIM
    ALLEN   R1         IN
    ALLEN   R1         IN_MIN
    ALLEN   R1         IN_TYP
    ALLEN   R1         IN_MAX
    ALLEN   R1         DIP
    ALLEN   R1         TIM
    WARD    R2         IN
    WARD    R2         IN_MIN
    WARD    R2         IN_TYP
    WARD    R2         IN_MAX
    WARD    R2         DIP
    WARD    R2         TIM
    42 rows selected.
    SQL>

  • Merge statement and insert into a third table

    I'm wondering if the merge statement will work in my case. I'm loading a file into a temporary table. I want to compare this temporary table with an existing source table (there are multiple source tables, I've thought of breaking them up into separate statements) and write out any differences to a third table. The third table would be used for future processing on the rows that are found to be different. Do you think this is possible using the merge command?

    Why use the merge command for that purpose?
    If you want to find the differences between two tables and put the result into a third one, then you can do
    insert into table_c
    query_a
    minus
    query_b
    or
    insert into table_c
    (query_a
    minus
    query_b)
    union all
    (query_b
    minus
    query_a)
    hth
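    As a concrete (made-up) example of the symmetric-difference version, written as two inserts to keep the syntax simple; stg_customers is the freshly loaded temporary table, src_customers the existing source table, and diff_customers the third table (all three need compatible column lists):
    INSERT INTO diff_customers
    SELECT * FROM stg_customers
    MINUS
    SELECT * FROM src_customers;
    INSERT INTO diff_customers
    SELECT * FROM src_customers
    MINUS
    SELECT * FROM stg_customers;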

  • How to get the data from one table and insert into another table

    Hi,
    We have a requirement to build an OA page where the data needs to be populated from one table and, on save, written into another table.
    For the above requirement, what is the best way to implement this in OAF?
    I understand that if we attach a VO object instance to a region/page, we can only pull and put data into one table.
    Thanks

    You can achieve this in many different ways, one is
    1. Create another VO based on the EO which is based on the dest table.
    2. At save, copy the contents of the source VO into the dest VO (see copy routine in dev guide).
    3. Committing the transaction will push the data into the dest table on which the dest VO is based.
    I understand that if we attach VO object instance to region/page, we only can pull and put data in to only one table.
    If by table you mean a DB table, then no, you can have a VO based on multiple EOs which will do the DMLs accordingly. Thanks
    Tapash

  • Fetch data from two tables and insert into one table

    I have similar to the following data in two tables (@Plant, @PlantDirector) and need to consolidate into one table (@PlantNew) as follows.
    DECLARE @Plant TABLE (PlantID INT, PlantName VARCHAR(100))
    INSERT INTO @Plant (PlantID, PlantName) VALUES (1, 'Name One'),(2, 'Name Two'),(3, 'Name Three'),(4, 'Name Four'),(5, 'Name Five'),(6, 'Name Six')
    Director data for the Plants exist in the following table. Assistant value 1 means Assistant Director and 0 means Director. 
    Data exists only for subset of plants and a Plant may have one or both roles.
    DECLARE @PlantDirector TABLE (PlantID INT, PlantDirectorID INT, Assistant bit)
    INSERT INTO @PlantDirector (PlantID, PlantDirectorID, Assistant) VALUES (2, 111, 1),(2, 222, 0),(4, 333, 0),(6,444,1)
    The above data needs to be inserted into one table (@PlantNew) as follows: 
    PlantID in @Plant table is IDENTITY value and needs to be inserted as-is into this table.
    PlantDirExists will get a value of 1 if at least one record exists in @PlantDirector table for a PlantID. PlantAssistantDirID and PlantDirID should be set to the corresponding PlantDirID or NULL appropriately depending on the data.
    DECLARE @PlantNew TABLE (PlantID INT, PlantName VARCHAR(100), PlantDirExists bit, PlantAssistantDirID INT, PlantDirID INT)
    INSERT INTO @PlantNew (PlantID, PlantName, PlantDirExists, PlantAssistantDirID, PlantDirID)
    VALUES (1, 'Name One', 0, NULL, NULL),(2, 'Name Two', 1, 111, 222),(3, 'Name Three', 0, NULL, NULL),(4, 'Name Four', 1, NULL, 333),(5, 'Name Five', 0, NULL, NULL),(6, 'Name Six',1, 444, NULL)
    How do I achieve the above using SQL ? Thanks.

    like this
    INSERT @PlantNew  (PlantID, PlantName, PlantDirExists, PlantAssistantDirID, PlantDirID) 
    SELECT p.PlantID,
    p.PlantName,
    CASE WHEN pd.PlantID IS NULL THEN 0 ELSE 1 END,
    PlantAssistantDirID,
    PlantDirID
    FROM @Plant p
    LEFT JOIN (SELECT PlantID,
    MAX(CASE WHEN Assistant = 1 THEN PlantDirectorID END) AS PlantAssistantDirID,
    MAX(CASE WHEN Assistant = 0 THEN PlantDirectorID END) AS PlantDirID
    @PlantDirector
    GROUP BY PlantID)pd
    ON pd.PlantID = p.PlantID
    Visakh
    You're missing a FROM
    insert into @PlantNew
    SELECT p.PlantID,
    p.PlantName,
    CASE WHEN pd.PlantID IS NULL THEN 0 ELSE 1 END,
    PlantAssistantDirID,
    PlantDirID
    FROM @Plant p
    LEFT JOIN (SELECT PlantID,
    MAX(CASE WHEN Assistant = 1 THEN PlantDirectorID END) AS PlantAssistantDirID,
    MAX(CASE WHEN Assistant = 0 THEN PlantDirectorID END) AS PlantDirID
    from @PlantDirector
    GROUP BY PlantID)pd
    ON pd.PlantID = p.PlantID
