How to read LONG RAW data from one table and insert into another table

Hello everybody,
I have a table called SOUND with the attributes shown below. The MUSIC column stores messages in different languages (Hindi, English, etc.). I want to concatenate all the Hindi messages and store the result in another table that has a single attribute of type LONG RAW; that attribute is attached to a sound item.
When I click the sound item's play button, all the messages recorded in Hindi should play one after another. To do that, I wrote the following WHEN-BUTTON-PRESSED trigger, which concatenates all the messages of the selected language from the SOUND table and stores the result in another table called TEMP.
The sound is then played from the TEMP table.
declare
     tmp sound.music%type;
     temp1 sound.music%type;
     item_id ITEM;
     cursor c1 is
          select music
          from sound
          where lang = :LIST10;
begin
     open c1;
     loop
          fetch c1 into tmp;      -- THIS LINE GENERATES THE ERROR
          exit when c1%notfound;  -- exit right after the fetch, or the last row is appended twice
          temp1 := temp1 || tmp;
     end loop;
     close c1;
     insert into temp values (temp1);
     item_id := Find_Item('Music');
     go_item('music');
     play_sound(item_id);
end;
But when I click the button, it raises the following error:
WHEN-BUTTON-PRESSED TRIGGER RAISED UNHANDLED EXCEPTION ORA-06502.
ORA-06502: PL/SQL: numeric or value error
SQL> desc sound;
 Name      Null?    Type
 --------- -------- ------------
 SL_NO              NUMBER(2)
 MUSIC              LONG RAW
 LANG               CHAR(10)
If my approach to the problem above is reasonable, please tell me how to fix the error; otherwise, please suggest another way to solve the problem.
Thanks in advance.
D. Prasad
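
For what it's worth, ORA-06502 here is consistent with PL/SQL's limits on LONG RAW: a local variable declared as sound.music%TYPE holds at most 32,760 bytes, so fetching a large clip, or concatenating several clips with ||, overflows it. A common workaround, sketched below but not taken from this thread, is to migrate the MUSIC column from the long-deprecated LONG RAW type to BLOB and build the combined clip on the database side with DBMS_LOB, which has no such limit; the migration itself and a BLOB column in TEMP are assumptions.

declare
     l_combined blob;
     cursor c1 is
          select music              -- assumes MUSIC has been migrated to BLOB
          from sound
          where lang = :LIST10;     -- Forms list item, as in the original trigger
begin
     dbms_lob.createtemporary(l_combined, true);
     for rec in c1 loop
          dbms_lob.append(l_combined, rec.music);   -- no 32K PL/SQL variable limit
     end loop;
     insert into temp values (l_combined);          -- TEMP's single column assumed BLOB
     dbms_lob.freetemporary(l_combined);
end;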


Similar Messages

  • Hi, extract data from xml file and insert into another existing xml file

    I am looking for code to extract data from an XML file and insert it into another existing XML file from a Java program. I understand it is easy to extract data from an XML file; what we want, however, is to insert the extracted data into another existing XML file without creating a new one. Suggestions?
    The first XML file, which holds the two lines to copy (text1.xml):
    <?xml version="1.0" encoding="iso-8859-1"?>
    <xs:PrintDataRequest xmlns:xs="http://com.unisys.com/Anid"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://com.unisys.com/Anid file:ANIDWS.xsd">
    <xs:Person>
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://com.unisys.com/Anid file:ANIDWS.xsd">
    These two lines have to be inserted into the other, existing XML file (text2.xml), at lines 3 and 4.
    Regards,
    bubbly

    Jadz_Core wrote:
    RandomAccessFile? If you know where you want to insert it.
    Are you sure about this? With RandomAccessFile, the bytes inserted into the receiving file would have to exactly match the number of bytes replaced. I'm thinking you'll likely have to stream through the second XML with a SAX parser and copy information (or insert the new information) as you stream, using an XML writer of some sort.

  • Extract data from xml file and insert into another existing xml file

    hello,
    I am looking for a way to extract data from an XML file and insert it into another existing XML file from a Java program. I understand it is easy to extract data from an XML file; what we want, however, is to insert the extracted data into another existing XML file without creating a new one. Suggestions?
    Regards,
    Zhuozhi

    If the files are small, you can load the target file into a DOM document, insert the data from the source file, and persist the DOM.
    If the files are large, you probably want to use StAX or SAX.

  • How to get the data from one table and insert into another table

    Hi,
    We have a requirement to build an OA page whose data is populated from one table and, on save, written into another table.
    What is the best way to implement this in OAF?
    I understand that if we attach a VO instance to a region/page, we can only pull and put data into one table.
    Thanks

    You can achieve this in many different ways; one is:
    1. Create another VO based on the EO which is based on the dest table.
    2. At save, copy the contents of the source VO into the dest VO (see the copy routine in the dev guide).
    3. Committing the transaction will push the data into the dest table on which the dest VO is based.
    "I understand that if we attach VO object instance to region/page, we only can pull and put data in to only one table."
    If by table you mean a DB table, then no: you can have a VO based on multiple EOs, which will do the DMLs accordingly.
    Thanks,
    Tapash

  • How to read a XML file from BLOB column and insert in a table - PL/SQL Only

    Hi,
    To make data loading simpler for the end user, instead of placing the file on the server and using SQL*Loader, I came up with the idea of using the Oracle E-Business Suite attachment functionality, which loads an XML file from the local PC into a database column (the table is FND_ATTACHMENTS; the default data type there is BLOB).
    I tried DBMS_LOB but didn't get anywhere.
    Can anyone tell me how to read the BLOB column using PL/SQL and store the data in an Oracle table? Here are the sample XML file and table structure, FYI.
    <?xml version="1.0" encoding="UTF-8"?>
    <dataroot xmlns:od="urn:schemas-microsoft-com:officedata" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="Corporate_alloc.xsd" generated="2009-07-07T14:17:49">
    <Corporate_alloc>
    <PKG_CODE>BKCORP</PKG_CODE>
    <PKG_NAME>Corporate Edition - Books</PKG_NAME>
    <DET_CODE>B9780080543758</DET_CODE>
    <DET_NAME>Waves, Tides and Shallow-Water Processes</DET_NAME>
    <ALLOCATION_RATIO>0.000041</ALLOCATION_RATIO>
    </Corporate_alloc>
    <Corporate_alloc>
    <PKG_CODE>BKCORP</PKG_CODE>
    <PKG_NAME>Corporate Edition - Books</PKG_NAME>
    <DET_CODE>B9780080534343</DET_CODE>
    <DET_NAME>Hydrostatically Loaded Structures</DET_NAME>
    <ALLOCATION_RATIO>0.000127</ALLOCATION_RATIO>
    </Corporate_alloc>
    </dataroot>
    CREATE TABLE TEST_XML
    ( PKG_CODE VARCHAR2(50),
    PKG_NAME VARCHAR2(100),
    DET_CODE VARCHAR2(20),
    DET_NAME VARCHAR2(500),
    ALLOCATION_RATIO NUMBER );
    Thanks
    EBV

    In regards to #3, use the COLUMNS functionality of XMLTable instead of using Extract. Two simple examples:
    Re: XML Data - Caliculate fields
    Re: Extractvalue function not recognised
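
    As a rough illustration of that advice, the sample document above can be shredded straight into TEST_XML by wrapping the BLOB in an XMLType and using XMLTable's COLUMNS clause. The sketch below is mine, not from the thread: the FND_ATTACHMENTS/FILE_DATA column names, the :file_id bind, and the AL32UTF8 character set are all assumptions to adjust to the real schema.

    INSERT INTO test_xml (pkg_code, pkg_name, det_code, det_name, allocation_ratio)
    SELECT x.pkg_code, x.pkg_name, x.det_code, x.det_name, x.allocation_ratio
    FROM   fnd_attachments a,
           XMLTABLE('/dataroot/Corporate_alloc'
                    PASSING XMLTYPE(a.file_data, NLS_CHARSET_ID('AL32UTF8'))
                    COLUMNS
                      pkg_code         VARCHAR2(50)  PATH 'PKG_CODE',
                      pkg_name         VARCHAR2(100) PATH 'PKG_NAME',
                      det_code         VARCHAR2(20)  PATH 'DET_CODE',
                      det_name         VARCHAR2(500) PATH 'DET_NAME',
                      allocation_ratio NUMBER        PATH 'ALLOCATION_RATIO') x
    WHERE  a.file_id = :file_id;  -- hypothetical key identifying the uploaded attachment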

  • HOW TO READ DATA FROM A FILE AND INSERT INTO A TABLE USING UTL_FILE

    Hi,
    I have a file. I want to read the data from the file and load it into a table using UTL_FILE.
    How can I do it?
    Any reply appreciated...

    Hi,
    This is not exactly your requirement, but you can try an external table:
    CREATE OR REPLACE DIRECTORY text_file AS 'D:\TEXT_FILE\';
    GRANT READ ON DIRECTORY text_file TO fah;
    GRANT WRITE ON DIRECTORY text_file TO fah;
    DROP TABLE load_a;
    CREATE TABLE load_a
    (a1 varchar2(20),
     a2 varchar2(200))
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
     DEFAULT DIRECTORY text_file
     ACCESS PARAMETERS
     (FIELDS TERMINATED BY ',')
     LOCATION ('data.txt'));
    select * from load_a;
    CREATE TABLE a AS select * from load_a;
    SELECT * FROM a;
    Regards
    Faheem Latif
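
    Since the question asked for UTL_FILE specifically, here is a minimal sketch of that route as well (mine, not from the thread); it reuses the TEXT_FILE directory and LOAD_A table from the answer above and assumes a two-column, comma-separated layout.

    declare
         v_file  utl_file.file_type;
         v_line  varchar2(4000);
         v_comma pls_integer;
    begin
         v_file := utl_file.fopen('TEXT_FILE', 'data.txt', 'R', 4000);
         loop
              begin
                   utl_file.get_line(v_file, v_line);
              exception
                   when no_data_found then exit;  -- GET_LINE raises NO_DATA_FOUND at end of file
              end;
              v_comma := instr(v_line, ',');
              insert into load_a (a1, a2)
              values (substr(v_line, 1, v_comma - 1), substr(v_line, v_comma + 1));
         end loop;
         utl_file.fclose(v_file);
         commit;
    end;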

  • How to get the data from a file and insert into a table

    Good morning,
    I need to read the file sui_facturacion_alcantarillado_15085_2011_01_76845_00A.csv, which contains the following information:
    NUID,NUMERO_DE_CUENTA_CONTRATO,CÓDIGO_DANE_DEPARTAMENTO,CÓDIGO_DANE_MUNICIPIO,ZONA_IGAC,SECTOR_IGAC,MANZANA_O_VEREDA_IGAC,NÚMERO_DEL_PREDIO_IGAC,CONDICION_DE_PROPIEDAD_DEL_PREDIO_IGAC,DIRECCIÓN_DEL_PREDIO,NÚMERO_DE_FACTURA,FECHA_DE_EXPEDICIÓN_DE_LA_FACTURA,FECHA_DE_INICIO_DEL_PERÍODO_DE_FACTURACIÓN,DIAS_FACTURADOS,CÓDIGO_CLASE_DE_USO,UNIDADES_MULTIUSUARIO_RESIDENCIAL,UNIDADES_MULTIUSUARIO_NO_RESIDENCIAL,HOGAR_COMUNITARIO_O_SUSTITUTO,USUARIO_FACTURADO_CON_AFORO,USUARIO_CUENTA_CON_CARACTERIZACIÓN,CARGO_FIJO,CARGO_POR_VERTIMIENTO_BASICO,CARGO_POR_VERTIMIENTO_COMPLEMENTARIO,CARGO_POR_VERTIMIENTOSUNTUARIO,CMT,VERTIMIENTO_DEL_PERIOD_EN_METROS_CUBICOS,VALOR_FACTURADO_POR_VERTIDO,VALOR_DEL_SUBSIDIO,VALOR_DE_LA_CONTRIBUCIÓN,FACTOR_DE_SUBSIDIO_O_CONTRIBUCIÓN_CARGO_FIJO,FACTOR_DE_SUBSIDIO_O_CONTRIBUCIÓN_VERTIMIENTO,CARGOS_POR_CONEXIÓN,PAGO_ANTICIPADO_DEL_SERVICIO,DÍAS_DE_MORA,VALOR_DE_MORA,INTERESES_POR_MORA,OTROS_COBROS,CAUSAL_DE_REFACTURACIÓN,NUMERO_DE_LA_FACTURA_OBJETO_DE_REFACTURACIÓN,VALOR_TOTAL_FACTURADO,PAGOS_DEL_CLIENTE_DURANTE_EL_PERÍODO_FACTURADO
    242602,242602,76,845,99,99,9999,9999,999,CLL 5 CRA 7 PEATONAL,24911920,12-01-2011,01-12-2010,30,01,,,0,0,0,1,0000000000.00,0000000000.00,0000000000.00,0000000000.00,0000000005,0000002200.00,0000000000,0000000000,0.000,0.000,0000000000.00,0000000000.00,0,0000000000.00,0000000000.00,0000000000.00,0,0,0000002201.00,0000000000.00
    242604,242604,76,845,99,99,9999,9999,999,CRA 4 # 6 - 13,24911846,12-01-2011,01-12-2010,30,01,,,0,0,0,1,0000000000.00,0000000000.00,0000000000.00,0000000000.00,0000000013,0000002200.00,0000000000,0000000000,0.000,0.000,0000000000.00,0000000000.00,0,0000000000.00,0000000000.00,0000000000.00,0,0,0000002201.00,0000004411.00
    242605,242605,76,845,99,99,9999,9999,999,CRA 2 CLLES 3 Y 4,24911509,12-01-2011,01-12-2010,30,01,,,0,0,0,1,0000000000.00,0000000000.00,0000000000.00,0000000000.00,0000000004,0000002200.00,0000000000,0000000000,0.000,0.000,0000000000.00,0000000000.00,0,0000000000.00,0000000000.00,0000000000.00,0,0,0000002201.00,0000002200.00
    This is the block I have so far:
    <<function_test>>
    DECLARE
    TOTAL_CAR NUMBER;
    POS_1 NUMBER:= 0;
    POS_2 NUMBER:= 0;
    REST NUMBER:= 0;
    ACUM NUMBER:= 0;
    CADEN VARCHAR2(200);
    nom_archivo varchar2(80);
    v1 utl_file.file_type;
    v2 varchar2(2048);
    BEGIN
         nom_archivo := 'sui_facturacion_alcantarillado_15085_2011_01_76845_00A.csv';
         v1:= utl_file.fopen('PUBLIC_ACCESS',nom_archivo,'R',32767);
         utl_file.get_line(v1,v2);
         SELECT LENGTH(v2) INTO TOTAL_CAR FROM DUAL;
         ACUM:=1;
         POS_1:=0;
         WHILE ACUM <= 60
         LOOP
              select instr(v2, ',', 1, ACUM) PRUEBA
              INTO      POS_2
              FROM      DUAL;
              dbms_output.put_line(' TOTAL POSICION 1--> '|| POS_1);
              dbms_output.put_line(' TOTAL POSICION 2--> '|| POS_2);
              dbms_output.put_line(' TOTAL ACUMULADO --> '|| ACUM);
              REST := (POS_2-POS_1)-1;
              SELECT SUBSTR(v2,(POS_1+1),REST) PRUEBA2
                   INTO CADEN
              FROM      DUAL;     
              dbms_output.put_line(' CADENA SELECCIONADA --> '|| CADEN);
              ACUM := ACUM + 1;
              POS_1:= POS_2;
         END LOOP;
         utl_file.fclose(v1);
         dbms_output.put_line(' -->');
         dbms_output.put_line(' TOTAL POSICION 1-->'|| POS_1);
         dbms_output.put_line(' TOTAL POSICION 2-->'|| POS_2);
         dbms_output.put_line(' TOTAL ACUMULADO -->'|| ACUM);
         dbms_output.put_line(' TOTAL DE CARACTERES -->'|| TOTAL_CAR);
         dbms_output.put_line(' ');     
         EXCEPTION
              WHEN NO_DATA_FOUND THEN
                   dbms_output.put_line('NO SE ENCONTRARON MAS CARACTERES');
              WHEN OTHERS THEN
                   dbms_output.put_line('OTRO TIPO DE ERROR ');
                   dbms_output.put_line('CODIGO ERROR '|| SQLCODE ||' '||SQLERRM);
    END;
    The fields are separated by commas and need to go into a table. The block above only reads and parses row number 1, whose data I do not need.
    I need the data from rows 2, 3 and 4. Each row has 41 fields, which I want to insert into a table called DATO_ARCHIVOS.
    How should this be written? I appreciate the cooperation and explanation...
    GOOD DAY...
    REYNEL SALAZAR MARTINEZ
    COLOMBIA...

    When you get an error with external tables (or SQL*Loader), look in the same folder as the data file: you should find a .log file and maybe a .bad file too.
    The log file should indicate the nature of the error hit while trying to load the data.
    I've just copied your sample data from your first post to a file on my server and tried it, only to find that you are not specifying the required format for your dates. The below shows it now working...
    CREATE TABLE tabla_prueba
      (NUID NUMBER,
       NUM_CUENTA_CONTRATO NUMBER,
       COD_DANE_DD NUMBER,
       COD_DANE_MM NUMBER,
       ZONA_IGAC NUMBER,
       SECTOR_IGAC NUMBER,
       MANZANA_VEREDA_IGAC NUMBER,
       NUM_PREDIO_IGAC NUMBER,
       CONDICION_PREDIO_IGAC NUMBER,
       DIRECCION_PREDIO_IGAC VARCHAR2(80),
       NUM_FACTURA NUMBER,
       FECHA_EXPED_FACTURA DATE,
       FECHA_INI_PERIODO_FACTURACION DATE,
       DIAS_FACTURADOS NUMBER,
       COD_CLASE_USO NUMBER,
       UNI_MULTIUSUARIO_RESIDENCIAL NUMBER,
       UNI_MULTIUSUARIO_NORESIDENCIAL NUMBER,
       HOGAR_COMUNITARIO NUMBER,
       USUARIO_FACTURADO_AFORO NUMBER,
       USUARIO_CON_CARACTERIZACION NUMBER,
       CARGO_FIJO NUMBER,
       CARGO_VERTIMENTO_BAS NUMBER,
       CARGO_VERTIMENTO_COMP NUMBER,
       CARGO_VERTIMENTO_SUNT NUMBER,
       CMT NUMBER,
       VLR_FACTURADO_VERTIDO NUMBER,
       VLR_SUBSIDIO NUMBER,
       VLR_CONTRIBUCCION NUMBER,
       FACTOR_SUBS_CONTR_CARGO_FIJO NUMBER,
       FACTOR_SUBS_CONTR_VERTIMENTO NUMBER,
       CARGO_CONEXION NUMBER,
       PAGO_ANTICIPADO_SERVICIO NUMBER,
       DIAS_MORA NUMBER,
       VLR_MORA NUMBER,
       INTERES_MORA NUMBER,
       OTROS_COBROS NUMBER,
       CAUSAL_REFACTURACION NUMBER,
       NUM_FACTURA_OBJ_REFACTURACION NUMBER,
       VLR_TOTAL_FACTURADO NUMBER,
       PAGOS_CLIENTE_DURANTE_PERIODO NUMBER)
    ORGANIZATION EXTERNAL
      (TYPE ORACLE_LOADER
       DEFAULT DIRECTORY TEST_DIR
       ACCESS PARAMETERS
        (RECORDS DELIMITED BY NEWLINE
         SKIP 1
         FIELDS TERMINATED BY ","
         OPTIONALLY ENCLOSED BY '"'
          (NUID,
           NUM_CUENTA_CONTRATO,
           COD_DANE_DD,
           COD_DANE_MM,
           ZONA_IGAC,
           SECTOR_IGAC,
           MANZANA_VEREDA_IGAC,
           NUM_PREDIO_IGAC,
           CONDICION_PREDIO_IGAC,
           DIRECCION_PREDIO_IGAC,
           NUM_FACTURA,
           FECHA_EXPED_FACTURA CHAR DATE_FORMAT DATE MASK "DD-MM-YYYY",
           FECHA_INI_PERIODO_FACTURACION CHAR DATE_FORMAT DATE MASK "DD-MM-YYYY",
           DIAS_FACTURADOS,
           COD_CLASE_USO,
           UNI_MULTIUSUARIO_RESIDENCIAL ,
           UNI_MULTIUSUARIO_NORESIDENCIAL,
           HOGAR_COMUNITARIO ,
           USUARIO_FACTURADO_AFORO ,
           USUARIO_CON_CARACTERIZACION ,
           CARGO_FIJO ,
           CARGO_VERTIMENTO_BAS ,
           CARGO_VERTIMENTO_COMP,
           CARGO_VERTIMENTO_SUNT,
           CMT,
           VLR_FACTURADO_VERTIDO,
           VLR_SUBSIDIO ,
           VLR_CONTRIBUCCION ,
           FACTOR_SUBS_CONTR_CARGO_FIJO ,
           FACTOR_SUBS_CONTR_VERTIMENTO ,
           CARGO_CONEXION ,
           PAGO_ANTICIPADO_SERVICIO ,
           DIAS_MORA ,
           VLR_MORA ,
           INTERES_MORA ,
           OTROS_COBROS ,
           CAUSAL_REFACTURACION ,
           NUM_FACTURA_OBJ_REFACTURACION,
           VLR_TOTAL_FACTURADO,
            PAGOS_CLIENTE_DURANTE_PERIODO))
        LOCATION ('test.csv'));
    SQL> select * from tabla_prueba;
          NUID NUM_CUENTA_CONTRATO COD_DANE_DD COD_DANE_MM  ZONA_IGAC SECTOR_IGAC MANZANA_VEREDA_IGAC NUM_PREDIO_IGAC CONDICION_PREDIO_IGAC DIRECCION_PREDIO_IGAC                                                    NUM_FACTURA FECHA_EXPE FECHA_INI_
    DIAS_FACTURADOS COD_CLASE_USO UNI_MULTIUSUARIO_RESIDENCIAL UNI_MULTIUSUARIO_NORESIDENCIAL HOGAR_COMUNITARIO USUARIO_FACTURADO_AFORO USUARIO_CON_CARACTERIZACION CARGO_FIJO CARGO_VERTIMENTO_BAS CARGO_VERTIMENTO_COMP CARGO_VERTIMENTO_SUNT        CMT
    VLR_FACTURADO_VERTIDO VLR_SUBSIDIO VLR_CONTRIBUCCION FACTOR_SUBS_CONTR_CARGO_FIJO FACTOR_SUBS_CONTR_VERTIMENTO CARGO_CONEXION PAGO_ANTICIPADO_SERVICIO  DIAS_MORA   VLR_MORA INTERES_MORA OTROS_COBROS CAUSAL_REFACTURACION NUM_FACTURA_OBJ_REFACTURACION
    VLR_TOTAL_FACTURADO PAGOS_CLIENTE_DURANTE_PERIODO
        242602              242602          76         845         99          99                9999         9999              999 CLL 5 CRA 7 PEATONAL                                                         24911920 12-01-2011 01-12-2010
                 30             1                                                                             0               0                           0          1                    0                     0                     0          0
                        5         2200                 0                            0                            0              0                        0          0          0            0            0                    0                             0
                      0                          2201
        242604              242604          76         845         99          99                9999         9999              999 CRA 4 # 6 - 13                                                               24911846 12-01-2011 01-12-2010
                 30             1                                                                             0               0                           0          1                    0                     0                     0          0
                       13         2200                 0                            0                            0              0                        0          0          0            0            0                    0                             0
                      0                          2201
        242605              242605          76         845         99          99                9999         9999              999 CRA 2 CLLES 3 Y 4                                                            24911509 12-01-2011 01-12-2010
                 30             1                                                                             0               0                           0          1                    0                     0                     0          0
                        4         2200                 0                            0                            0              0                        0          0          0            0            0                    0                             0
                      0                          2201
    SQL>

  • How to Query from table and insert into another table.

    Hi,
    I am using the following query in a VO, and all the columns are attached to an EO (table name emp_temp):
    select a.npw_number, a.person_id,b.assignment_id,a.title,a.last_name,a.first_name,a.date_of_birth,a.sex,
    b.organization_name,b.organization_id,b.job_id,b.job_name,b.position_id,b.position_name,b.supervisor_id,
    b.supervisor_name,b.location_id,b.effective_start_date,b.effective_end_date
    from per_all_people_f a,per_assignments_v b
    where a.person_id=b.person_id
    and a.npw_number=:1
    I can query the data on the screen. Now I need to insert that data into emp_temp.
    I don't know how to do this; please help me.
    Thanks,
    Subra

    You can create a VO based on an EO on the emp_temp table.
    And you have attached a different VO on the page, right?
    Now what you can do is: once the user clicks Apply, set each attribute of the EO-based VO explicitly via code, from the values of the second VO, and then commit.
    Perhaps this might help...

  • How to read the data from XML file and insert into oracle DB

    Hi All,
    I have the below requirement:
    I will receive data in an XML file. I then need to read that data and insert it into Oracle tables. Please let me know how this can be handled.
    Many thanks.

    Sounds a lot like this question, only with fewer details:
    how to read data from XML variable and insert into table variable
    We can only help if you provide us the details, as we cannot see what you are doing and only know what you tell us. Plenty of examples abound on the forums covering the topics you seek as well.
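
    For a concrete starting point, here is a sketch of one common route (mine, not from the thread): expose the file through an Oracle directory object and let XMLTable shred it into relational rows. The XML_DIR directory, the data.xml file name, the /root/row structure, and the my_table target are all hypothetical.

    INSERT INTO my_table (col1, col2)
    SELECT x.col1, x.col2
    FROM   XMLTABLE('/root/row'
                    PASSING XMLTYPE(BFILENAME('XML_DIR', 'data.xml'),
                                    NLS_CHARSET_ID('AL32UTF8'))
                    COLUMNS
                      col1 VARCHAR2(50) PATH 'col1',
                      col2 NUMBER       PATH 'col2') x;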

  • Read data from E$ table and insert into staging table

    Hi all,
    I am new to ODI. I would like your help in understanding how to read data from an "E$" (error) table and insert it into a staging table.
    Scenario:
    Two columns in a flat file, employee id and employee name, need to be loaded into a datastore EMP+. A check constraint is added so that only rows whose employee names are in upper case may be loaded into the datastore. Check control is set to flow and static. Right-click the datastore, select Control and then Check. The rows that violate the check constraint are kept in the E$_EMP+ table.
    Problem:
    The issue is that I want to read the data from the E$_EMP+ table, transform the employee names into uppercase, and move the corrected data from E$_EMP+ into EMP+. Please advise me on how to automatically handle "soft" exceptions in ODI.
    Thank you

    Hi Himanshu,
    I have already tried the approach you suggested, and it works. However, the scenario I described was very basic. Based on the error logged into the E$_EMP table, I will have to design the interface to apply more business logic before the data is actually merged into the target table.
    Before the business logic is applied, I need to read each row from the E$_EMP table first, and I do not know how to read from the E$_EMP table.
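
    For what it's worth, an E$ table is an ordinary table in the work schema, so it can be read with plain SQL, for example from an ODI procedure or from an interface that uses it as a source datastore. A rough sketch, assuming ODI's usual layout in which E$_EMP carries the source columns alongside error metadata such as ERR_TYPE and ERR_MESS (all names here are assumptions to verify against the actual table):

    INSERT INTO emp (emp_id, emp_name)
    SELECT e.emp_id,
           UPPER(e.emp_name)   -- the correction for the upper-case check violation
    FROM   e$_emp e;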

  • Getting data from some source and insert into corresponding source

    Hi all,
    I am using DB 10g.
    I have data like
    MMA : add1:add2:add3
    The above is the format; the actual data will be something like
    MMA : bank street : no:32 : tel: +9127546663
    add1 = bank street
    add2 = no:32
    add3 = tel: +9127546663
    My requirement is to store add1, add2 and add3 in a table.
    To pick the data apart I am using the SUBSTR and INSTR functions, splitting on ':'. My problem is that in the add2 position the value is no:32, and I need to fetch no:32 from the line and store it in the table as no:32; I cannot split that value on ':',
    since ':' is the delimiter (the separator between data elements).
    How can I solve this issue?
    Thanks...
    Edited by: user13329002 on Jan 1, 2011 11:37 PM

    Your first field, add1, is terminated by ':'.
    Your second, add2, starts with 'no:' and is also terminated by ':'.
    And the third, add3, starts with 'tel:'.
    Only the first and third ':' are used as delimiters, so how about this?
    WITH t AS (SELECT 'bank street:no:32:tel:+9127546663' AS str FROM DUAL)
    SELECT SUBSTR(str, 1, INSTR(str, ':', 1, 1) - 1) AS add1,
           SUBSTR(str, INSTR(str, ':', 1, 1) + 1, INSTR(str, ':', 1, 3) - INSTR(str, ':', 1, 1) - 1) AS add2,
           SUBSTR(str, INSTR(str, ':', -1, 2) + 1) AS add3
    FROM   t;

    bank street     no:32     tel:+9127546663

    Or can't you just ask the people who gave you the requirement to change the delimiter used in the format to something else, like a pipe (|)?

  • How to read from one file and write into another file?

    Hi,
    I am trying to read a file and write its contents into another file. This is the code I am using, but only the last line gets written. How can I resolve this? The code is as follows:
    public String get() {
         FileReader fr;
         try {
              fr = new FileReader(f);
              String str;
              BufferedReader br = new BufferedReader(fr);
              try {
                   while ((str = br.readLine()) != null) {
                        generate = str;  // overwrites generate each iteration, so only the last line survives
                   }
              } catch (IOException e1) {
                   e1.printStackTrace();
              }
         } catch (FileNotFoundException e) {
              e.printStackTrace();
         }
         return generate;
    }
    where generate is a String declared globally.
    How should I go about it?
    Thanks in advance for your reply.

    If you want to copy files as fast as possible, without processing them (like the DOS "copy" or the Unix "cp" command), you can try the java.nio.channels package:
    import java.nio.*;
    import java.nio.channels.*;
    import java.io.*;
    import java.util.*;
    import java.text.*;
    class Kopy {
        /**
         * @param args [0] = source filename
         *        args [1] = destination filename
         */
        public static void main(String[] args) throws Exception {
            if (args.length != 2) {
                System.err.println("Syntax: java -cp . Kopy source destination");
                System.exit(1);
            }
            File in = new File(args[0]);
            long fileLength = in.length();
            long t = System.currentTimeMillis();
            FileInputStream fis = new FileInputStream(in);
            FileOutputStream fos = new FileOutputStream(args[1]);
            FileChannel fci = fis.getChannel();
            FileChannel fco = fos.getChannel();
            fco.transferFrom(fci, 0, fileLength);
            fis.close();
            fos.close();
            t = System.currentTimeMillis() - t;
            NumberFormat nf = new DecimalFormat("#,##0.00");
            System.out.print(nf.format(fileLength / 1024.0) + " kB copied");
            if (t > 0) {
                System.out.println(" in " + t + "ms: " + nf.format(fileLength / 1.024 / t) + " kB/s");
            }
        }
    }

  • Fetch data from one table and insert into two tables in desired format

    I have data similar to the following in a table, and it is not normalized. The groupID is used to group two records of a similar nature.
    DECLARE @OldDoc TABLE (oldDocID INT, groupID INT, deptID INT)
    INSERT INTO @OldDoc (oldDocID, groupID, deptID) VALUES (1,NULL,111),(2,NULL,111),(3,1,111),(4,NULL,333),(5,1,222),(6,NULL,333),(7,2,222),(8,2,333),(9,NULL,111),(10,3,222),(11,NULL,333),(12,3,444)
    I need to process the data from the above table (@OldDoc) and write into two new tables (@NewDoc and @NewDocGroup) as follows.
    oldDocID should be stored as newDocID when inserting into the @NewDoc table. Only records with groupID NULL, plus one record (the first) per group, should be considered for insertion (for example, oldDocID 5 is not considered, as 3 and 5 belong to the same groupID 1).
    DECLARE @NewDoc TABLE (newDocID INT)
    INSERT INTO @NewDoc (newDocID) VALUES (1),(2),(3),(4),(6),(7),(9),(10),(11)
    All records from @OldDoc should be considered for insertion into the @NewDocGroup table. oldDocID is inserted as newDocID and deptID is carried over as-is. Instead of groupID, the ID of the first record in the group should be used as parentNewDocID (for example, 3 is used as parentNewDocID for newDocID 5, as 3 and 5 belong to the same groupID in the @OldDoc table).
    DECLARE @NewDocGroup TABLE (newDocID INT, parentNewDocID INT, deptID INT)
    INSERT INTO @NewDocGroup (newDocID, parentNewDocID, deptID) VALUES (1,1,111),(2,2,111),(3,3,111),(4,4,333),(5,3,222),(6,6,333),(7,7,222),(8,7,333),(9,9,111),(10,10,222),(11,11,333),(12,10,444)
    How do I accomplish the above using SQL? Thanks for the help.

    >> I have similar to the following data in a table and it is not normalized. The group_id is being used to group two records [sic] of similar nature. <<
    Rows are not records. Tables have to have a key by definition. You do not do math with identifiers, so they should not be numeric. Let's ignore that error for now. In short, you are posting garbage. If you had followed Forum Netiquette, would you have posted this?
    CREATE TABLE Old_Documents
    (old_doc_id INTEGER NOT NULL PRIMARY KEY, 
     group_id INTEGER, 
     dept_nbr INTEGER NOT NULL
       REFERENCES Departments (dept_nbr));
    INSERT INTO Old_Documents(old_doc_id, group_id, dept_nbr) 
    VALUES  (1, NULL, 111), 
    (2, NULL, 111), 
    (3, 1, 111), 
    (4, NULL, 333), 
    (5, 1, 222), 
    (6, NULL, 333), 
    (7, 2, 222), 
    (8, 2, 333), 
    (9, NULL, 111), 
    (10, 3, 222), 
    (11, NULL, 333), 
    (12, 3, 444);
    >> I need to process the data from the above table (Old_Documents) and write into two new tables (New_Documents and New_Documents_Groups) as follows. <<
    Just like punch cards and mag tape data processing! Being old and being new are a status, not another kind of entity. But that is how mag tapes work. And you even use the verb "fetch" from tape files. This design flaw is called attribute splitting.
    Do you have a Male_Personnel and Female_Personnel table? NO! It is just Personnel! 
    >> old_doc_id should be stored as new_doc_id when inserting to New_Documents table. Only records [sic] with group_id NULL and one record [sic] (first [sic; no ordering in a table] one) per group should be considered (For example, old_doc_id 5 is not considered as 3 and 5 belong to the same group_id = 1) for insertion. <<
    Think about your punch card mindset. Why did you physically materialize that redundant New_Documents table? Let me answer that: this is how you work with punch cards! In SQL we use a VIEW:
    CREATE VIEW New_Documents (new_doc_id)
    AS 
    SELECT old_doc_id 
      FROM Old_Documents;
    >> All records [sic] from Old_Documents should be considered for insertion into New_Documents_Groups table. The old_doc_id is inserted as new_doc_id and dept_nbr is as-is. Instead of group_id, the ID [sic: which identifier??] of the first [sic: tables have no ordering like a deck of punch cards] record [sic] in the group should be considered as parent_new_doc_id (For example, 3 is considered as parent_new_doc_id for new_doc_id 5 as 3 and 5 belong to the same group_id in Old_Documents table) for the new_doc_id. <<
    Why not use 5 as the parent? My guess is that you are trying to form equivalence classes. See:
    https://www.simple-talk.com/content/print.aspx?article=2020
    --CELKO-- Books in Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice Data / Measurements and Standards in SQL SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking
    in Sets / Trees and Hierarchies in SQL
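
    The thread never shows working code for the split itself, so here is a minimal T-SQL sketch (mine, not from the thread), using the corrected table variables from the question and reading "first record in the group" as the lowest oldDocID:

    -- Parent resolution: an ungrouped row is its own parent; a grouped row's
    -- parent is the lowest oldDocID in its group.
    INSERT INTO @NewDocGroup (newDocID, parentNewDocID, deptID)
    SELECT o.oldDocID,
           CASE WHEN o.groupID IS NULL THEN o.oldDocID
                ELSE MIN(o.oldDocID) OVER (PARTITION BY o.groupID) END,
           o.deptID
    FROM @OldDoc AS o;

    -- @NewDoc keeps the ungrouped rows plus the first row of each group.
    INSERT INTO @NewDoc (newDocID)
    SELECT o.oldDocID
    FROM @OldDoc AS o
    WHERE o.groupID IS NULL
       OR o.oldDocID = (SELECT MIN(x.oldDocID) FROM @OldDoc AS x
                        WHERE x.groupID = o.groupID);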

  • Copying from one arrangement and pasting into another arrangement

    Hi, I'm using Logic Express 8.
    I'm wondering if there is a way to copy several tracks from one arrangement window into another. I've tried simply copying and pasting, but when I paste into the new arrangement window I get error messages, and the tracks pasted into the new window have a speaker icon with an "X" through it, and obviously no sound.
    Am I missing something, or perhaps this is simply a limitation that exists with using express instead of pro?
    Thanks a lot.

    You can do this by simply importing the data from one project into your new one. Check it out here.

  • Pull data from text file and insert into database

    Can anyone tell me how to pull data from a text file and insert it into a database?
    Let's say my text file is in this format:
    123456 Peter 22
    234567 Nicholas 24
    345678 Jane 20
    Then I need to insert the values for these three columns into a table which has the three columns ID, Name, Age.
    Does anyone know how? I need to do this urgently... Thanks in advance.

    1. Use BufferedReader and read the file line by line.
    2. Loop through the file and do the following steps within that loop.
    3. Use StringTokenizer to separate each line into three values (columns).
    4. Create an insert statement with these values and add the statement to the batch (using the addBatch() method of PreparedStatement or Statement).
    5. Finally (after exiting the loop), execute the batch of statements (using ps.executeBatch()).
    Sudha

Maybe you are looking for

  • How to retrive guid in do_init method for send_mail

    Hi All, I have a requirement to default the CC mail id's when a follow up email is created from complaint i am trying to acheive this by redefining DO_INIT method and created a new method set_CCmailid where i am trying to call order_read to get the p

  • File association with OpenOffice

    Hello, I'm a newbie to the Mac - well returning after many years. So, the questions revolves around having files with .doc extension open with Open Office vs, the preinstalled trial version of MS Word. Does not appear to be a setting in Open Office,

  • Bom explosion how to restrict the item level posnr

    Hi friends TABLES : MAST. DATA: BEGIN OF ISTPO OCCURS 1000.         INCLUDE STRUCTURE STPOX.       DATA: END OF ISTPO. DATA: W_TOPMAT LIKE CSTMAT. SELECT-OPTIONS : P_MATNR FOR MAST-MATNR. PARAMETERS     : P_WERKS TYPE MAST-WERKS. DATA : BEGIN OF ITAB

  • Tecra R10-10J 3G broadband problem

    I've got a new Toshiba Tecra R10-10J I don't manage to make 3G work. I've inserted my 3G SIM card in the slot (behind the battery) Toshiba 3G F3507G broadband appears three times in device manager (network card, modem, and com port) However, When I s

  • Complex VO as AdvancedDataGrid dataProvider

    I would like to access fields in a complex VO from my AdvancedDataGrid. By 'complex' I mean - that my AdvancedDataGrid dataProvider point to VO of type 'A' which containes (besides properties) VO of type 'B'. I would like that part of my colums will