Retrieve data from BAPI/FM and insert into WAS database

Hello all,
I need to retrieve data from SAP in a Web Dynpro project using a BAPI/FM, and create tables in the WAS database to insert it into.
I have created the front end in the Web Dynpro project to access the data using the BAPI/FM. It works fine.
How do I create these tables in the Web Dynpro project to insert the data coming from SAP (BAPI/FM)?
I created the tables in a dictionary:
--> Create a new project --> project type Dictionary Project --> Create Tables.
How can I insert the data from the BAPI into the tables created in the dictionary?
Best Regards.
Taylor
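
For reference, on the WAS Java stack a Java Dictionary table is normally written to through a JNDI-registered DataSource using plain JDBC. A minimal sketch of the insert side; the data-source alias, table, and column names below are all placeholders, not SAP defaults:

import java.sql.Connection;
import java.sql.PreparedStatement;
import javax.naming.InitialContext;
import javax.sql.DataSource;

public class BapiResultDao {
    public void insertRow(String matnr, String maktx) throws Exception {
        // "jdbc/MY_DATASOURCE" is an assumed alias; use whatever data source
        // your WAS Java engine exposes for the dictionary tables.
        InitialContext ctx = new InitialContext();
        DataSource ds = (DataSource) ctx.lookup("jdbc/MY_DATASOURCE");
        Connection con = ds.getConnection();
        try {
            PreparedStatement ps = con.prepareStatement(
                "INSERT INTO ZMY_TABLE (MATNR, MAKTX) VALUES (?, ?)");
            ps.setString(1, matnr); // values read from the BAPI/FM output node
            ps.setString(2, maktx);
            ps.executeUpdate();
            ps.close();
        } finally {
            con.close();
        }
    }
}

The Web Dynpro controller would then loop over the BAPI output node elements and call insertRow once per element.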

Hi Taylor,
I think you have opened two threads for the same question. Please close this thread.
Regards,
Gopal.

Similar Messages

  • How to retrieve data from catsdb table and convert into xml using BAPI

    Points will be rewarded,
    Thank you,
    Regards,
    Jagrut BharatKumar Shukla

    Hi,
    This is not exactly your requirement, but you can try this:
    CREATE OR REPLACE DIRECTORY text_file AS 'D:\TEXT_FILE\';
    GRANT READ ON DIRECTORY text_file TO fah;
    GRANT WRITE ON DIRECTORY text_file TO fah;
    DROP TABLE load_a;
    CREATE TABLE load_a
    (a1 VARCHAR2(20),
     a2 VARCHAR2(200))
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
     DEFAULT DIRECTORY text_file
     ACCESS PARAMETERS
     (FIELDS TERMINATED BY ',')
     LOCATION ('data.txt'));
    SELECT * FROM load_a;
    CREATE TABLE a AS SELECT * FROM load_a;
    SELECT * FROM a;
    Regards
    Faheem Latif

  • How to read the data from XML file and insert into oracle DB

    Hi All,
    I have the below requirement:
    I will receive data in an XML file. I then need to read that data and insert it into Oracle tables. Please let me know how this can be handled.
    Many Thanks.

    Sounds a lot like this question, only with fewer details.
    how to read data from XML  variable and insert into table variable
    We can only help if you provide details, as we cannot see what you are doing and only know what you tell us. Plenty of examples that cover the topics you seek abound on the forums as well.

  • Hi, extract data from xml file and insert into another existing xml file

    I am searching for code to extract data from an XML file and insert it into another existing XML file using a Java program. I understand it is easy to extract data from an XML file; however, rather than creating a new XML file, we want to insert the extracted data into another existing XML file. Suggestions?
    The first XML file, which has the two lines (text1.xml):
    <?xml version="1.0" encoding="iso-8859-1"?>
    <xs:PrintDataRequest xmlns:xs="http://com.unisys.com/Anid"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://com.unisys.com/Anid file:ANIDWS.xsd">
    <xs:Person>
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://com.unisys.com/Anid file:ANIDWS.xsd">
    These two lines have to be inserted into the other existing XML file (text2.xml), at lines 3 and 4.
    Regards,
    bubbly

    Jadz_Core wrote:
    > RandomAccessFile? If you know where you want to insert it.
    Are you sure about this? With RandomAccessFile, the bytes inserted would have to exactly match the number of bytes they replace. I'm thinking that you'll likely have to stream through the second XML with a SAX parser and copy information (or insert new information) as you stream with an XML writer of some sort.
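    A rough sketch of that copy-as-you-stream idea, using StAX (javax.xml.stream) rather than raw SAX since it pairs a reader with a writer; the file names and the target element are assumptions:
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import javax.xml.stream.XMLEventFactory;
    import javax.xml.stream.XMLEventReader;
    import javax.xml.stream.XMLEventWriter;
    import javax.xml.stream.XMLInputFactory;
    import javax.xml.stream.XMLOutputFactory;
    import javax.xml.stream.events.XMLEvent;

    public class StreamInsert {
        public static void main(String[] args) throws Exception {
            XMLEventReader reader = XMLInputFactory.newInstance()
                    .createXMLEventReader(new FileInputStream("text2.xml"));
            XMLEventWriter writer = XMLOutputFactory.newInstance()
                    .createXMLEventWriter(new FileOutputStream("merged.xml"));
            XMLEventFactory events = XMLEventFactory.newInstance();
            while (reader.hasNext()) {
                XMLEvent e = reader.nextEvent();
                writer.add(e); // copy every event through unchanged
                // Right after the opening tag of the (assumed) target element,
                // inject the content extracted from the first file.
                if (e.isStartElement()
                        && "Person".equals(e.asStartElement().getName().getLocalPart())) {
                    writer.add(events.createCharacters("data extracted from text1.xml"));
                }
            }
            reader.close();
            writer.close();
        }
    }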

  • Extract data from xml file and insert into another existing xml file

    Hello,
    I am searching for a way to extract data from an XML file and insert it into another existing XML file using a Java program. I understand it is easy to extract data from an XML file; however, rather than creating a new XML file, we want to insert the extracted data into another existing XML file. Suggestions?
    Regards,
    Zhuozhi

    If the files are small, you can load the target file in a DOM document, insert the data from the source file and persist the DOM.
    If the files are large, you probably want to use StAX or SAX.
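    A minimal sketch of the small-file DOM route (file names and the insertion point are assumptions): parse both documents, import the node, and persist the target.
    import java.io.File;
    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.Document;
    import org.w3c.dom.Node;

    public class DomInsert {
        public static void main(String[] args) throws Exception {
            DocumentBuilder builder =
                    DocumentBuilderFactory.newInstance().newDocumentBuilder();
            Document source = builder.parse(new File("text1.xml"));
            Document target = builder.parse(new File("text2.xml"));
            // Nodes cannot move between documents directly; import first.
            Node imported = target.importNode(source.getDocumentElement(), true);
            target.getDocumentElement().appendChild(imported);
            // Persist the modified target DOM back to disk.
            Transformer t = TransformerFactory.newInstance().newTransformer();
            t.transform(new DOMSource(target), new StreamResult(new File("text2.xml")));
        }
    }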

  • Retrieving data from an ArrayList and presenting them in a JSP

    Dear Fellow Java Developers:
    I am developing a catalogue site that would give the user the option of viewing items on a JSP page. The thing is, a user may request to view a certain item of which there are several varieties, for example "shirts". There may be 20 different varieties, and the catalogue page would present the 20 different shirts on the page. However, I want to give the user the option of either viewing all the shirts on a single page, or view a certain number at a time, and view the other shirts on a second or third JSP page.
    See the following link as an example:
    http://www.eddiebauer.com/eb/cat_default.asp?nv=2|21472|9&tid=&c=&sc=&cm_cg=&lp=n2f
    I am able to retrieve the data from the database and present all of it on a JSP; however, I am not sure how to implement the functionality so that the items can be viewed a certain number at a time. So if I want to present, say, 12 items on a page and the database result set brings back 30 items, I should be able to present the first 12 items on page 1, the next 12 on page 2, and the remaining 6 on page 3. How would I do this? Below is the scriptlet code that I use to retrieve information from the ArrayList that I get from my database and present it in its entirety on a single JSP page:
    <table>
    <tr>
    <%
    String product = request.getParameter("item");
    ArrayList list = aBean.getCatalogueData(product);
    int j = 0, n = 2;
    for (Iterator i = list.iterator(); i.hasNext(); j++) {
        if (j > n) {
            out.print("</tr><tr>");
            j = 0;
        }
        Integer id = (Integer) i.next();
        String name = (String) i.next();
        String productURL = (String) i.next();
        out.print("a bunch of html with the above variables embedded inside");
    }
    %>
    </tr>
    </table>
    where aBean is an instance of a JavaBean that retrieves the data from the database.
    I have two ideas. Because each iteration of the for loop represents one row from the database, I was thinking of introducing another int variable k that would count all the iterations in the for loop, thus giving the exact number of rows. Once we had that value, we would be able to determine whether it was greater than, less than, or equal to the maximum number of items we wanted to present on the JSP page (in this case 12). We would then pass that value along to the next page, and the for loop in each subsequent JSP page would continue from where the previous page left off. The other option would be to create a new ArrayList for each JSP page, where each page would have an ArrayList holding only the items it would present. Which approach is best? And more importantly, how would I implement it?
    Just wondering.
    Thanks in advance to all that reply.
    Sincerely;
    Fayyaz

    - You said to pass two parameters in the request, "start" and "count". The initial value for "start" would be zero, and the initial value for "count" would be the number of rows in the resultSet from the database, correct?
    Correct.
    -I am a little fuzzy about the following block of code
    you gave:
    //Set start and count for next page
    start += count; // If fewer than count items are left in the array, send the number left to the next page
    count = ((start*3)+(count*3) < list.size()) ? (count) : ((list.size() - (start*3))/3);
    Could you explain the above block of code a little further please?
    Okay, first, I was using the ternary operator: (boolean) ? val_if_true : val_if_false;
    This works like an if/else statement: the expression before the ? is the condition, the value directly after the ? is returned if the condition is true, and the value after the : is returned if the condition is false. These two statements below, one using the ternary ? : operator and the other an if/else, do the same thing:
    count = ((start*3)+(count*3) < list.size()) ? (count) : ((list.size() - (start*3))/3);
    if ((start*3)+(count*3) < list.size()) count = count;
    else count = ((list.size() - (start*3))/3);
    Now, why all the multiplying by 3s? Because you store three values in your list for each product: 1) the product ID, 2) the product name, and 3) the product URL. So to look at the third product, we need to be looking at the 9th, 10th, and 11th values in the list.
    I want to avoid an ArrayIndexOutOfBoundsException if I try to access the list by an index greater than the size of the list. Since I am working with product numbers, and not the number of values stored in the list, I need to multiply both start and count by three to make sure their sum does not exceed the size of the list. I test this with ((start*3)+(count*3) < list.size()). It could also have been written as ((start + count) * 3 < list.size()).
    If this is true, the next page can begin looking through the list at the correct start position and find the number of items we want to display without overstepping the end of the list, so we can leave count the way it is.
    If this is false, then we want to change count so we will only traverse the number of products that are left in the list. To do this, we subtract where the next page is going to start looking in the list (start*3) from the total size of the list and divide that by 3 to get the number of products left (which will be less than count). I used ((list.size() - (start*3))/3), but I could have used ((list.size()/3) - start).
    Does this explain it enough? I don't think I used the best math in the original post, and the line might be better written as:
    count = ((start + count)*3 < list.size()) ? (count) : ((list.size()/3) - start);
    All this math would be unnecessary if you made a ProductBean that stored the three values in it.
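    A sketch of such a bean (the names are just illustrative):
    // One object per product, so the list holds list.size() products
    // and none of the *3 index math is needed.
    public class ProductBean {
        private final Integer id;
        private final String name;
        private final String productURL;

        public ProductBean(Integer id, String name, String productURL) {
            this.id = id;
            this.name = name;
            this.productURL = productURL;
        }

        public Integer getId() { return id; }
        public String getName() { return name; }
        public String getProductURL() { return productURL; }
    }
    A page is then just list.subList(start, Math.min(start + count, list.size())).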
    - You have the following code snippet:
    //Get the string to display this same JSP, but with new start and count
    String nextDisplayURL = encodeRedirectURL("thispage.jsp?start=" + start + "&count=" + count + "&item=" + product);
    %>
    <a href="<%=nextDisplayURL%>">Next Page</a>
    How would you do a previous page URL? Also, I need to
    place the "previous", "next" and the different page
    number values at the top of the JSP page, as in the
    following url:
    http://www.eddiebauer.com/eb/cat_default.asp?nv=2|21472|9&tid=&c=&sc=&cm_cg=&lp=n2f
    How do I do this? I figure this might be a problem since I am processing the code in the middle of the page, and then I need to have these variables right at the top. Any suggestions?
    One way is to make new variable names: 'nextStart', 'previousStart', 'nextCount', 'previousCount'.
    Calculate these at the top, based on start and count, but leave the ints you use for start and count unchanged. You may want to store the count in the session rather than pass it along, if this is the case. Then you just worry about the start, or page number (start would be (page number - 1) * count). This would be better because the count for the last page will be less than the count on other pages; if we put the count in the session we will remember that for previous pages...
    I think with the details I provided in this and previous post, you should be able to write the necessary code.
    Thanks once again for all of your help, I really do
    appreciate the time and effort.
    Take care.
    Sincerely;
    Fayyaz

  • Need to read data from a jsp and insert in other jsp

    Hi,
    I have a JSP where I need to read from another JSP (which has the data "header ---insert-----footer"), find the string "insert" in it, and replace that string with some value in the current JSP.
    Thanks in advance.
    Sridhar.

    I have a JSP which needs to take headers and footers dynamically and insert them into my JSP.
    Currently what I am doing is using a script to get the header and footer, and using document.write.
    MYJSP.jsp looks this way:
    <script src="/../Headerfooter.jsp"/>
    <script language="javascript">
    document.write(header);  // defined in /HeaderFooter.jsp
    document.write("<table>...</table>");
    document.write(footer);  // defined in /HeaderFooter.jsp
    </script>
    /HeaderFooter.jsp looks this way:
    header = "<html>...<table width=\"100\">....";
    footer = "....</html>";
    I don't want to use a script because I am getting problems in Netscape...
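    A server-side alternative that avoids browser scripting entirely, assuming the header and footer markup can live in their own JSP fragments, is the standard <jsp:include> action:
    <%-- MYJSP.jsp: fragments are merged on the server, so there is no
         document.write and no browser-specific behavior --%>
    <jsp:include page="header.jsp" />
    <table>
      <%-- page body --%>
    </table>
    <jsp:include page="footer.jsp" />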

  • Fetch data from one table and insert into two tables in desired format

    I have similar to the following data in a table and it is not normalized. The groupID is being used to group two records of similar nature.
    DECLARE @OldDoc TABLE (oldDocID INT, groupID INT, deptID INT)
    INSERT INTO @OldDoc (oldDocID, groupID, deptID) VALUES (1,NULL,111),(2,NULL,111),(3,1,111),(4,NULL,333),(5,1,222),(6,NULL,333),(7,2,222),(8,2,333),(9,NULL,111),(10,3,222),(11,NULL,333),(12,3,444)
    I need to process the data from the above table (@OldDoc) and write into two new tables (@NewDoc and @NewDocGroup) as follows.
    oldDocID should be stored as newDocID when inserting to @NewDoc table. Only records with groupID NULL and one record (first one) per group should be considered (For example, oldDocID 5 is not considered as 3 and 5 belong to the same groupID 1) for insertion. 
    DECLARE @NewDoc TABLE (newDocID INT)
    INSERT INTO @NewDoc (newDocID) VALUES (1),(2),(3),(4),(6),(7),(9),(10),(11)
    All records from @OldDoc should be considered for insertion into @NewDocGroup table. OldDocID is inserted as NewDocID and deptID is as-is. Instead of groupID, the ID of the first record in the 
    group should be considered as parentNewDocID (For example, 3 is considered as parentNewDocID for newDocID 5 as 3 and 5 belong to the same groupID in @OldDoc table) for the newDocID.
    DECLARE @NewDocGroup TABLE (newDocID INT, parentNewDocID INT, deptID INT)
    INSERT INTO @NewDocGroup (newDocID, parentNewDocID, deptID) VALUES (1,1,111),(2,2,111),(3,3,111),(4,4,333),(5,3,222),(6,6,333),(7,7,222),(8,7,333),(9,9,111),(10,10,222),(11,11,333),(12,10,444)
    How do I accomplish the above using SQL ? Thanks for the help.

    >> I have similar to the following data in a table and it is not normalized. The group_id is being used to group two records [sic] of similar nature. <<
    Rows are not records. Tables have to have a key by definition. You do not do math with identifiers, so they should not be numeric. Let's ignore that error for now. In short, you are posting garbage. If you had followed Forum Netiquette, would you have posted this?
    CREATE TABLE Old_Documents
    (old_doc_id INTEGER NOT NULL PRIMARY KEY, 
     group_id INTEGER, 
     dept_nbr INTEGER NOT NULL
       REFERENCES Departments (dept_nbr));
    INSERT INTO Old_Documents(old_doc_id, group_id, dept_nbr) 
    VALUES  (1, NULL, 111), 
    (2, NULL, 111), 
    (3, 1, 111), 
    (4, NULL, 333), 
    (5, 1, 222), 
    (6, NULL, 333), 
    (7, 2, 222), 
    (8, 2, 333), 
    (9, NULL, 111), 
    (10, 3, 222), 
    (11, NULL, 333), 
    (12, 3, 444);
    >> I need to process the data from the above table (Old_Documents) and write into two new tables (New_Documents and New_Documents_Groups) as follows. <<
    Just like punch cards and mag tape data processing! Being old and being new are a status, not another kind of entity. But that is how mag tapes work. And you even use the verb "fetch" from tape files. This design flaw is called  attribute splitting.
    Do you have a Male_Personnel and Female_Personnel table? NO! It is just Personnel! 
    >> old_doc_id should be stored as new_doc_id when inserting to New_Documents table. Only records [sic] with group_id NULL and one record [sic] (first [sic; no ordering in a table] one) per group should be considered (For example, old_doc_id 5 is not considered
    as 3 and 5 belong to the same group_id =1) for insertion. <<
    Think about your punch card mindset. Why did you physically materialize that redundant New_Documents table? Let me answer that: this is how you work with punch cards! In SQL we use a VIEW:
    CREATE VIEW New_Documents (new_doc_id)
    AS 
    SELECT old_doc_id 
      FROM Old_Documents;
    >> All records [sic] from Old_Documents should be considered for insertion into New_Documents_Groups table. The old_doc_id is inserted as new_doc_id and dept_nbr is as-is. Instead of group_id, the ID [sic: which identifier??] of the first [sic: tables
    have no ordering like a deck of punch cards] record [sic] in the group should be considered as parent_new_doc_id (For example, 3 is considered as parent_new_doc_id for new_doc_id 5 as 3 and 5 belong to the same group_id in Old_Documents table) for the new_doc_id.
    <<
    Why not use 5 as the parent? My guess is that you are trying to form equivalence classes. See:
    https://www.simple-talk.com/content/print.aspx?article=2020
    --CELKO-- Books in Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice Data / Measurements and Standards in SQL SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking
    in Sets / Trees and Hierarchies in SQL

  • Read data from E$ table and insert into staging table

    Hi all,
    I am new to ODI. I want your help in understanding how to read data from an "E$" table and insert it into a staging table.
    Scenario:
    Two columns in a flat file, employee id and employee name, needs to be loaded into a datastore EMP+. A check constraint is added to allow data with employee names in upper case only to be loaded into the datastore. Check control is set to flow and static. Right click on the datastore, select control and then check. The rows that have violated the check constraint are kept in E$_EMP+ table.
    Problem:
    Issue is that I want to read the data from the E$_EMP+ table and transform the employee name into uppercase and move the corrected data from E$_EMP+ into EMP+. Please advise me on how to automatically handle "soft" exceptions in ODI.
    Thank you

    Hi Himanshu,
    I have tried the approach you suggested already. That works. However, the scenario I described was very basic. Based on the error logged into the E$_EMP table, I will have to design the interface to apply more business logic before the data is actually merged into the target table.
    Before the business logic is applied, I need to read each row from the E$_EMP table first, and I do not know how to read from the E$_EMP table.

  • Pull data from text file and insert into database

    Can anyone tell me how to pull data from a text file and insert it into a database?
    let's say my text file is in this format
    123456 Peter 22
    234567 Nicholas 24
    345678 Jane 20
    Then I need to insert all the values for these three columns into a table which has the three columns ID, Name, and Age.
    Anyone know? I need to do this urgently... Thanks in advance.

    1. Use BufferedReader and read the file line by line.
    2. Loop through the file and do the following steps within the loop.
    3. Use StringTokenizer to separate each line into three values (columns).
    4. Now create an insert statement with these values and add the statement to the batch (using the addBatch() method of PreparedStatement or Statement).
    5. Finally (after exiting the loop), execute the batch of statements (using ps.executeBatch()).
    Sudha
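    A minimal sketch of those five steps; the file name, connection URL, credentials, and PERSON (ID, NAME, AGE) table are all placeholders:
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.util.StringTokenizer;

    public class LoadPersons {
        public static void main(String[] args) throws Exception {
            Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@localhost:1521:orcl", "scott", "tiger");
            con.setAutoCommit(false);
            PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO person (id, name, age) VALUES (?, ?, ?)");
            BufferedReader in = new BufferedReader(new FileReader("data.txt"));
            String line;
            while ((line = in.readLine()) != null) {            // steps 1-2: read line by line
                StringTokenizer st = new StringTokenizer(line); // step 3: split into columns
                ps.setInt(1, Integer.parseInt(st.nextToken())); // ID
                ps.setString(2, st.nextToken());                // Name
                ps.setInt(3, Integer.parseInt(st.nextToken())); // Age
                ps.addBatch();                                  // step 4: queue the insert
            }
            in.close();
            ps.executeBatch();                                  // step 5: run the whole batch
            con.commit();
            ps.close();
            con.close();
        }
    }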

  • How to get the data from a file and insert into a table

    Good morning,
    I need to read the file sui_facturacion_alcantarillado_15085_2011_01_76845_00A.csv, which contains the following information:
    NUID,NUMERO_DE_CUENTA_CONTRATO,CÓDIGO_DANE_DEPARTAMENTO,CÓDIGO_DANE_MUNICIPIO,ZONA_IGAC,SECTOR_IGAC,MANZANA_O_VEREDA_IGAC,NÚMERO_DEL_PREDIO_IGAC,CONDICION_DE_PROPIEDAD_DEL_PREDIO_IGAC,DIRECCIÓN_DEL_PREDIO,NÚMERO_DE_FACTURA,FECHA_DE_EXPEDICIÓN_DE_LA_FACTURA,FECHA_DE_INICIO_DEL_PERÍODO_DE_FACTURACIÓN,DIAS_FACTURADOS,CÓDIGO_CLASE_DE_USO,UNIDADES_MULTIUSUARIO_RESIDENCIAL,UNIDADES_MULTIUSUARIO_NO_RESIDENCIAL,HOGAR_COMUNITARIO_O_SUSTITUTO,USUARIO_FACTURADO_CON_AFORO,USUARIO_CUENTA_CON_CARACTERIZACIÓN,CARGO_FIJO,CARGO_POR_VERTIMIENTO_BASICO,CARGO_POR_VERTIMIENTO_COMPLEMENTARIO,CARGO_POR_VERTIMIENTOSUNTUARIO,CMT,VERTIMIENTO_DEL_PERIOD_EN_METROS_CUBICOS,VALOR_FACTURADO_POR_VERTIDO,VALOR_DEL_SUBSIDIO,VALOR_DE_LA_CONTRIBUCIÓN,FACTOR_DE_SUBSIDIO_O_CONTRIBUCIÓN_CARGO_FIJO,FACTOR_DE_SUBSIDIO_O_CONTRIBUCIÓN_VERTIMIENTO,CARGOS_POR_CONEXIÓN,PAGO_ANTICIPADO_DEL_SERVICIO,DÍAS_DE_MORA,VALOR_DE_MORA,INTERESES_POR_MORA,OTROS_COBROS,CAUSAL_DE_REFACTURACIÓN,NUMERO_DE_LA_FACTURA_OBJETO_DE_REFACTURACIÓN,VALOR_TOTAL_FACTURADO,PAGOS_DEL_CLIENTE_DURANTE_EL_PERÍODO_FACTURADO
    242602,242602,76,845,99,99,9999,9999,999,CLL 5 CRA 7 PEATONAL,24911920,12-01-2011,01-12-2010,30,01,,,0,0,0,1,0000000000.00,0000000000.00,0000000000.00,0000000000.00,0000000005,0000002200.00,0000000000,0000000000,0.000,0.000,0000000000.00,0000000000.00,0,0000000000.00,0000000000.00,0000000000.00,0,0,0000002201.00,0000000000.00
    242604,242604,76,845,99,99,9999,9999,999,CRA 4 # 6 - 13,24911846,12-01-2011,01-12-2010,30,01,,,0,0,0,1,0000000000.00,0000000000.00,0000000000.00,0000000000.00,0000000013,0000002200.00,0000000000,0000000000,0.000,0.000,0000000000.00,0000000000.00,0,0000000000.00,0000000000.00,0000000000.00,0,0,0000002201.00,0000004411.00
    242605,242605,76,845,99,99,9999,9999,999,CRA 2 CLLES 3 Y 4,24911509,12-01-2011,01-12-2010,30,01,,,0,0,0,1,0000000000.00,0000000000.00,0000000000.00,0000000000.00,0000000004,0000002200.00,0000000000,0000000000,0.000,0.000,0000000000.00,0000000000.00,0,0000000000.00,0000000000.00,0000000000.00,0,0,0000002201.00,0000002200.00
    This is the function that I have:
    <<function_test>>
    DECLARE
    TOTAL_CAR NUMBER;
    POS_1 NUMBER:= 0;
    POS_2 NUMBER:= 0;
    REST NUMBER:= 0;
    ACUM NUMBER:= 0;
    CADEN VARCHAR2(200);
    nom_archivo varchar2(80);
    v1 utl_file.file_type;
    v2 varchar2(2048);
    BEGIN
         nom_archivo := 'sui_facturacion_alcantarillado_15085_2011_01_76845_00A.csv';
         v1:= utl_file.fopen('PUBLIC_ACCESS',nom_archivo,'R',32767);
         utl_file.get_line(v1,v2);
         SELECT LENGTH(v2) INTO TOTAL_CAR FROM DUAL;
         ACUM:=1;
         POS_1:=0;
         WHILE ACUM <= 60
         LOOP
              select instr(v2, ',', 1, ACUM) PRUEBA
              INTO      POS_2
              FROM      DUAL;
              dbms_output.put_line(' TOTAL POSICION 1--> '|| POS_1);
              dbms_output.put_line(' TOTAL POSICION 2--> '|| POS_2);
              dbms_output.put_line(' TOTAL ACUMULADO --> '|| ACUM);
              REST := (POS_2-POS_1)-1;
              SELECT SUBSTR(v2,(POS_1+1),REST) PRUEBA2
                   INTO CADEN
              FROM      DUAL;     
              dbms_output.put_line(' CADENA SELECCIONADA --> '|| CADEN);
              ACUM := ACUM + 1;
              POS_1:= POS_2;
         END LOOP;
         utl_file.fclose(v1);
         dbms_output.put_line(' -->');
         dbms_output.put_line(' TOTAL POSICION 1-->'|| POS_1);
         dbms_output.put_line(' TOTAL POSICION 2-->'|| POS_2);
         dbms_output.put_line(' TOTAL ACUMULADO -->'|| ACUM);
         dbms_output.put_line(' TOTAL DE CARACTERES -->'|| TOTAL_CAR);
         dbms_output.put_line(' ');     
         EXCEPTION
              WHEN NO_DATA_FOUND THEN
                   dbms_output.put_line('NO SE ENCONTRARON MAS CARACTERES');
              WHEN OTHERS THEN
                   dbms_output.put_line('OTRO TIPO DE ERROR ');
                   dbms_output.put_line('CODIGO ERROR '|| SQLCODE ||' '||SQLERRM);
    END;
    The fields are separated by commas and must be inserted into a table. The function I have only reads and extracts data from row number 1, whose information I do not need.
    I need the information from rows 2, 3 and 4. In each row there are 41 fields, which I want to insert into a table called Dato_archivos.
    How can I change the function to do this?
    I appreciate the cooperation and explanation...
    Good day,
    REYNEL SALAZAR MARTINEZ
    COLOMBIA ...

    When you get an error with external tables (or sql*loader) look in the same folder as the data file and you should get a .log file and maybe a .bad file too.
    The log file should indicate the nature of the error it has trying to load the data.
    I've just copied your sample data from your first post to a file on my server and tried it to find that you are not specifying the required format for your dates. The below shows it now working...
    CREATE TABLE tabla_prueba
      (NUID NUMBER,
       NUM_CUENTA_CONTRATO NUMBER,
       COD_DANE_DD NUMBER,
       COD_DANE_MM NUMBER,
       ZONA_IGAC NUMBER,
       SECTOR_IGAC NUMBER,
       MANZANA_VEREDA_IGAC NUMBER,
       NUM_PREDIO_IGAC NUMBER,
       CONDICION_PREDIO_IGAC NUMBER,
       DIRECCION_PREDIO_IGAC VARCHAR2(80),
       NUM_FACTURA NUMBER,
       FECHA_EXPED_FACTURA DATE,
       FECHA_INI_PERIODO_FACTURACION DATE,
       DIAS_FACTURADOS NUMBER,
       COD_CLASE_USO NUMBER,
       UNI_MULTIUSUARIO_RESIDENCIAL NUMBER,
       UNI_MULTIUSUARIO_NORESIDENCIAL NUMBER,
       HOGAR_COMUNITARIO NUMBER,
       USUARIO_FACTURADO_AFORO NUMBER,
       USUARIO_CON_CARACTERIZACION NUMBER,
       CARGO_FIJO NUMBER,
       CARGO_VERTIMENTO_BAS NUMBER,
       CARGO_VERTIMENTO_COMP NUMBER,
       CARGO_VERTIMENTO_SUNT NUMBER,
       CMT NUMBER,
       VLR_FACTURADO_VERTIDO NUMBER,
       VLR_SUBSIDIO NUMBER,
       VLR_CONTRIBUCCION NUMBER,
       FACTOR_SUBS_CONTR_CARGO_FIJO NUMBER,
       FACTOR_SUBS_CONTR_VERTIMENTO NUMBER,
       CARGO_CONEXION NUMBER,
       PAGO_ANTICIPADO_SERVICIO NUMBER,
       DIAS_MORA NUMBER,
       VLR_MORA NUMBER,
       INTERES_MORA NUMBER,
       OTROS_COBROS NUMBER,
       CAUSAL_REFACTURACION NUMBER,
       NUM_FACTURA_OBJ_REFACTURACION NUMBER,
       VLR_TOTAL_FACTURADO NUMBER,
       PAGOS_CLIENTE_DURANTE_PERIODO NUMBER)
    ORGANIZATION EXTERNAL
      (TYPE ORACLE_LOADER
       DEFAULT DIRECTORY TEST_DIR
       ACCESS PARAMETERS
        (RECORDS DELIMITED BY NEWLINE
         SKIP 1
         FIELDS TERMINATED BY ","
         OPTIONALLY ENCLOSED BY '"'
          (NUID,
           NUM_CUENTA_CONTRATO,
           COD_DANE_DD,
           COD_DANE_MM,
           ZONA_IGAC,
           SECTOR_IGAC,
           MANZANA_VEREDA_IGAC,
           NUM_PREDIO_IGAC,
           CONDICION_PREDIO_IGAC,
           DIRECCION_PREDIO_IGAC,
           NUM_FACTURA,
           FECHA_EXPED_FACTURA CHAR DATE_FORMAT DATE MASK "DD-MM-YYYY",
           FECHA_INI_PERIODO_FACTURACION CHAR DATE_FORMAT DATE MASK "DD-MM-YYYY",
           DIAS_FACTURADOS,
           COD_CLASE_USO,
           UNI_MULTIUSUARIO_RESIDENCIAL ,
           UNI_MULTIUSUARIO_NORESIDENCIAL,
           HOGAR_COMUNITARIO ,
           USUARIO_FACTURADO_AFORO ,
           USUARIO_CON_CARACTERIZACION ,
           CARGO_FIJO ,
           CARGO_VERTIMENTO_BAS ,
           CARGO_VERTIMENTO_COMP,
           CARGO_VERTIMENTO_SUNT,
           CMT,
           VLR_FACTURADO_VERTIDO,
           VLR_SUBSIDIO ,
           VLR_CONTRIBUCCION ,
           FACTOR_SUBS_CONTR_CARGO_FIJO ,
           FACTOR_SUBS_CONTR_VERTIMENTO ,
           CARGO_CONEXION ,
           PAGO_ANTICIPADO_SERVICIO ,
           DIAS_MORA ,
           VLR_MORA ,
           INTERES_MORA ,
           OTROS_COBROS ,
           CAUSAL_REFACTURACION ,
           NUM_FACTURA_OBJ_REFACTURACION,
           VLR_TOTAL_FACTURADO,
            PAGOS_CLIENTE_DURANTE_PERIODO))
        LOCATION ('test.csv'));
    SQL> select * from tabla_prueba;
          NUID NUM_CUENTA_CONTRATO COD_DANE_DD COD_DANE_MM  ZONA_IGAC SECTOR_IGAC MANZANA_VEREDA_IGAC NUM_PREDIO_IGAC CONDICION_PREDIO_IGAC DIRECCION_PREDIO_IGAC                                                    NUM_FACTURA FECHA_EXPE FECHA_INI_
    DIAS_FACTURADOS COD_CLASE_USO UNI_MULTIUSUARIO_RESIDENCIAL UNI_MULTIUSUARIO_NORESIDENCIAL HOGAR_COMUNITARIO USUARIO_FACTURADO_AFORO USUARIO_CON_CARACTERIZACION CARGO_FIJO CARGO_VERTIMENTO_BAS CARGO_VERTIMENTO_COMP CARGO_VERTIMENTO_SUNT        CMT
    VLR_FACTURADO_VERTIDO VLR_SUBSIDIO VLR_CONTRIBUCCION FACTOR_SUBS_CONTR_CARGO_FIJO FACTOR_SUBS_CONTR_VERTIMENTO CARGO_CONEXION PAGO_ANTICIPADO_SERVICIO  DIAS_MORA   VLR_MORA INTERES_MORA OTROS_COBROS CAUSAL_REFACTURACION NUM_FACTURA_OBJ_REFACTURACION
    VLR_TOTAL_FACTURADO PAGOS_CLIENTE_DURANTE_PERIODO
        242602              242602          76         845         99          99                9999         9999              999 CLL 5 CRA 7 PEATONAL                                                         24911920 12-01-2011 01-12-2010
                 30             1                                                                             0               0                           0          1                    0                     0                     0          0
                        5         2200                 0                            0                            0              0                        0          0          0            0            0                    0                             0
                      0                          2201
        242604              242604          76         845         99          99                9999         9999              999 CRA 4 # 6 - 13                                                               24911846 12-01-2011 01-12-2010
                 30             1                                                                             0               0                           0          1                    0                     0                     0          0
                       13         2200                 0                            0                            0              0                        0          0          0            0            0                    0                             0
                      0                          2201
        242605              242605          76         845         99          99                9999         9999              999 CRA 2 CLLES 3 Y 4                                                            24911509 12-01-2011 01-12-2010
                 30             1                                                                             0               0                           0          1                    0                     0                     0          0
                        4         2200                 0                            0                            0              0                        0          0          0            0            0                    0                             0
                      0                          2201
    SQL>

  • How to get the data from one table and insert into another table

    Hi,
    We have a requirement to build an OA page where the data needs to be populated from one table and, on save, written into another table.
    For the above requirement, what is the best way to implement this in OAF?
    I understand that if we attach a VO instance to a region/page, we can only pull and put data into one table.
    Thanks

    You can achieve this in many different ways, one is
    1. Create another VO based on the EO which is based on the dest table.
    2. At save, copy the contents of the source VO into the dest VO (see copy routine in dev guide).
    3. Committing the transaction will push the data into the dest table on which the dest VO is based.
    > I understand that if we attach VO object instance to region/page, we only can pull and put data in to only one table.
    If by table you mean a DB table, then no: you can have a VO based on multiple EOs which will do the DMLs accordingly.
    Thanks
    Tapash

  • HOW TO READ DATA FROM A FILE AND INSERT INTO A TABLE USING UTL_FILE

    Hi..
    I have a file. I want to read the data from the file and load it into a table using UTL_FILE.
    How can I do it?
    Any reply appreciated...

    Hi,
    This is not exactly your requirement, but you can try this:
    CREATE OR REPLACE DIRECTORY text_file AS 'D:\TEXT_FILE\';
    GRANT READ ON DIRECTORY text_file TO fah;
    GRANT WRITE ON DIRECTORY text_file TO fah;
    DROP TABLE load_a;
    CREATE TABLE load_a
    (a1 VARCHAR2(20),
     a2 VARCHAR2(200))
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
     DEFAULT DIRECTORY text_file
     ACCESS PARAMETERS
     (FIELDS TERMINATED BY ',')
     LOCATION ('data.txt'));
    SELECT * FROM load_a;
    CREATE TABLE a AS SELECT * FROM load_a;
    SELECT * FROM a;
    Regards
    Faheem Latif

  • Getting data from some source and insert into corresponding source

    Hi all,
    I am using DB 10g.
    I have data in this format:
    MMA : add1:add2:add3
    The actual data will be something like:
    MMA : bank street : no:32 : tel: +9127546663
    add1 = bank street
    add2 = no:32
    add3 = tel: +9127546663
    My requirement is to store add1, add2 and add3 in the table. To extract the data I am using the SUBSTR and INSTR functions with respect to ':'. My problem is that, for example, in the add2 position there is no:32; my task is to fetch no:32 from that line and store it into the table exactly as no:32, and in that case I cannot split the value on ':',
    since ':' is the delimiter (the separator for the data elements).
    How can I solve this issue?
    Thanks..
    Edited by: user13329002 on Jan 1, 2011 11:37 PM

    Your first addr1 is terminated by (:).
    Your second addr2 starts with (no:) and is also terminated by (:).
    And your third addr3 starts with (tel:).
    Only the first and third (:) will be used as delimiters, so how about this?
    WITH T AS (SELECT 'bank street:no:32:tel:+9127546663' AS STR FROM DUAL)
    SELECT SUBSTR(STR, 1, INSTR(STR, ':', 1, 1)-1) AS ADD1,
    SUBSTR(STR, INSTR(STR, ':', 1, 1)+1, INSTR(STR, ':', 1, 3)-INSTR(STR, ':', 1, 1)-1) AS ADD2,
    substr(str, instr(str, ':', -1, 2)+1) as add3 from t;
    bank street       no:32     tel:+9127546663
    Or can't you just ask the people who gave you the requirement to change the delimiter used in the format to something else, like a pipe (|)?

  • How to read LONG RAW data from one  table and insert into another table

    Hello everybody,
    I have a table called SOUND with the following attributes. In the MUSIC attribute I have stored some messages in different languages, like Hindi, English, etc. I want to concatenate all the Hindi messages and store them in another table with only one attribute, of type LONG RAW, and this attribute is attached to the sound item.
    When I click the play button of the sound item, all the messages recorded in Hindi should play one by one automatically. For that I'm doing the following.
    I have written the following WHEN-BUTTON-PRESSED trigger, which concatenates all the messages of the language selected from the SOUND table and stores them in another table called TEMP.
    The sound will then be played from the TEMP table.
    declare
         tmp sound.music%type;
         temp1 sound.music%type;
         item_id ITEM;
    cursor c1
    is select music
    from sound
    where lang=:LIST10;
    begin
         open c1;
         loop
          fetch c1 into tmp; -- THIS LINE GENERATES THE ERROR
              temp1:=temp1||tmp;
              exit when c1%notfound;
         end loop;
    CLOSE C1;
    insert into temp values(temp1);
    item_id:=Find_Item('Music');
    go_item('music');
    play_sound(item_id);
    end;
    but when i'm clicking the button it generates the following error.
    WHEN-BUTTON-PRESSED TRIGGER RAISED UNHANDLED EXCEPTION ORA-06502.
    ORA-06502: PL/SQL: numeric or value error
    SQL> desc sound;
    Name                            Null?    Type
    ------------------------------- -------- ------------
    SL_NO                                    NUMBER(2)
    MUSIC                                    LONG RAW
    LANG                                     CHAR(10)
    If my process to solve the above problem is OK, then please tell me the solution for the error. Otherwise, please suggest any other way to solve the above problem.
    Thanks in advance.
    D. Prasad

