Retrieve data from a table in a typical format

Hi,
I am trying to retrieve the telephone numbers of all employees from a table. The data entered into the table by users is free-style, e.g. (111)111-1111 or 214-548-9874 or 587-632-ABCD or 1 985 657 4879 or 001 987-587-3654.
Any combination of formats is possible: there may be '-' or <space> or '( )' in between, or even alphanumeric values.
My requirement is to fetch only the 10 digits of the number (or fewer, if that is all there is), irrespective of the country code.
Note: the users cannot be restricted to enter their contact number in a particular format while making a data entry. :(((
Please let me know how I can accomplish this.
Thanks!
Edited by: 807404 on Nov 3, 2010 12:49 PM

Hi,
807404 wrote:
Hi,
The same thing is happening with the Translate function as well.
If the user has entered a telephone number with fewer than 10 digits, it gives me NULL.
Edited by: 807404 on Nov 3, 2010 1:20 PM
It works for me.
Post some test data (CREATE TABLE and INSERT statements), the results you want from that data, and the query you're running.
For example, with this data:
CREATE TABLE  table_x
(       x_id          NUMBER
,       phone_num     VARCHAR2 (30)
);
INSERT INTO table_x (x_id, phone_num) VALUES (0 , NULL);
INSERT INTO table_x (x_id, phone_num) VALUES (1 , '(111)111-1111');
INSERT INTO table_x (x_id, phone_num) VALUES (2 , ' 214-548-9874');
INSERT INTO table_x (x_id, phone_num) VALUES (3 , '587-632-ABCD');
INSERT INTO table_x (x_id, phone_num) VALUES (4 , '1 985 657 4879');
INSERT INTO table_x (x_id, phone_num) VALUES (5 , '001 987-587-3654');
INSERT INTO table_x (x_id, phone_num) VALUES (11, '12 ext. 34');
INSERT INTO table_x (x_id, phone_num) VALUES (12, '5-6-7');
INSERT INTO table_x (x_id, phone_num) VALUES (21, '(+++) +++-++++');
COMMIT;
I expect (and actually get) these results:
     X_ID PHONE_NUM                      STANDARDIZED
         0
         1 (111)111-1111                  1111111111
         2  214-548-9874                  2145489874
         3 587-632-ABCD                   587632ABCD
         4 1 985 657 4879                 9856574879
         5 001 987-587-3654               9875873654
        11 12 ext. 34                        12ext34
        12 5-6-7                                 567
        21 (+++) +++-++++
using this query:
SELECT    table_x.*
,         SUBSTR ( '         ' || TRANSLATE ( phone_num
                                             , 'A ()[].-_/+'
                                             , 'A'
                                             )
                 , -10
                 )     AS standardized
FROM      table_x
ORDER BY  x_id
;
The standardized column is NULL only for x_id=0 (where phone_num itself is NULL) and x_id=21 (where phone_num contains only designated bad characters).
For the rows with "numbers" containing fewer than 10 characters (x_id=11 and x_id=12), the results are padded with blanks. If you'd prefer just the non-blank characters, use LTRIM.
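For example, a lightly modified version of the query above (same TRANSLATE/SUBSTR logic, just wrapped in LTRIM) would return the short values without the leading blanks:
SELECT    table_x.*
,         LTRIM ( SUBSTR ( '         ' || TRANSLATE ( phone_num
                                                     , 'A ()[].-_/+'
                                                     , 'A'
                                                     )
                         , -10
                         )
                )     AS standardized
FROM      table_x
ORDER BY  x_id
;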

Similar Messages

  • ABAP program to output data from an SAP table to an XML format file?

    hello ABAP experts,
    Does anyone know how to output data from an SAP table to an XML format file? It would be appreciated if someone could show detailed sample code, and we will give you reward points!
    Thanks!

    Edited by: Jose Hugo De la cruz on Aug 19, 2009 8:23 PM

  • Retrieving data from 2 tables using a sender JDBC channel

    Hi all,
    We got a new requirement where we have to select data from 2 tables and update fields in 2 tables at the same time.
    I have a few queries regarding this.
    Can we retrieve data from 2 tables using a select query in a sender JDBC channel?
    If yes, how can we achieve this?
    Can we use inner/outer joins in the select query?
    Can we update fields of 2 tables using the update query in a sender JDBC channel?
    Your help is greatly rewarded.
    With Regards
    Sudha.

    Hi,
    Even I have the same requirement, where data has to be read from 2 tables and the flag has to be updated later, once done.
    SELECT query:
    SELECT t1.KUNNR,t1.SETT_KEY,t1.QUART_START,t1.QUART_END,t2.PAY_METH,t2.MAT_NDC,t2.AMOUNT   FROM TSAP_REBATE_MEDI t1  INNER JOIN  TSAP_REBATE_LINE t2  ON  t1.KUNNR=t2.KUNNR AND t1.SETT_KEY=t2.SETT_KEY  WHERE  t1.PROCESSING_STATUS = 'N' AND t2.PROCESSING_STATUS = 'N'
    This is working fine.
    Can somebody help me with the update query for this, where the flag PROCESSING_STATUS has to be updated to 'P'?
    I tried a lot but couldn't get the answer.
    Br,
    Manoj
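    A minimal hedged sketch of the matching update statements (one per table, mirroring the WHERE clause of the SELECT; whether the sender channel can run both, or needs a stored procedure instead, depends on the adapter configuration and is an assumption here):
    UPDATE TSAP_REBATE_LINE
       SET PROCESSING_STATUS = 'P'
     WHERE PROCESSING_STATUS = 'N';
    UPDATE TSAP_REBATE_MEDI
       SET PROCESSING_STATUS = 'P'
     WHERE PROCESSING_STATUS = 'N';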

  • How to retrieve data from a table and a structure

    Hi experts,
    I have a requirement to create an ALV report. I have to include fields from two tables, 'PA0001' and 'CATSDB', and from one structure, 'CATSAPPR'. I am able to retrieve data from the tables using an INNER JOIN, but I have no idea how to deal with the structure.
    I know that structure does not contain any data.
    But I am not able to find any transparent table which has the field 'ORDERTEXT'. I found the field 'ORDERTEXT' only in the structure 'CATSAPPR'.
    So please, can anyone tell me either a transparent table which has the field 'ORDERTEXT', or how to deal with the structure in this case?
    Please help me,
    Thanks in advance
    Anu

    Check these (table-field / short description):
      AUFK-KTEXT              Order master data
      COCH-KTXT               Process Management: Control Recipe Header
      COCHP-KTXT              PI sheet: Control Recipe Header
      COMPHDR-KTEXT           OCM: Comparison results, indiv. records for order headers
      HIKO-KTEXT              Order master data history
      MAAVC_MDO-KTEXT         MPO: Individual Master Data for Orders
      MCHPVS-AUFTEXT          Batch Record: Shadow Table for Link to Archive
      OCMHOMO_ARCH-KFELD      Backup Table of Homogeneity List in Case of Archiving
      PSJHIEDATA-KTEXT        Hierarchy data to test LDB PSJ
      TEWOCODEST-DESCRIPTION  Order Codes (Texts)
      TEWOCODET-KTEXT         Allocation of Message Codes to Order
      VLCSMORDER-KTEXT        VELO: Service Order
      VSAUFK_CN-KTEXT         Version: Order master data
    Yogesh N

  • Convert data from internal table to XML file.

    Hi All,
    I am selecting data from the database into one internal table.
    Now I want to convert the data from the internal table to XML file format and save it to my desktop. Please suggest how I can achieve this requirement.
    Kindly reply ASAP.

    Use this FM. SAP_CONVERT_TO_XML_FORMAT
    Check this link too -
    Re: Data Export in XML format
    XML files from ABAP programs

  • How to retrieve data from a maintenance view?

    Hi all,
    I have used a select query to retrieve data from a maintenance view, but it gave me a syntax error that XYZ is not in the ABAP Dictionary; it's a database/projection view.
    So can't we get data from a view using a select query?
    Please share if you have answers.
    Thanks & regards,
    Chandru

    Hi,
    A maintenance view is created over database tables only.
    Just display the maintenance view in SE11 and click the Tables tab on the screen.
    There you can find the list of tables used to create the maintenance view.
    You can use those tables to write the SELECT.
    If you give us the maintenance view name, we will help you.
    Thanks,
    Ramakrishna

  • Retrieve data from wa structure into fields.

    Hi friends,
    I am having a query regarding a classical report. I want to retrieve data for a vendor bill-wise realisation report. I am getting a few fields onto my list screen (fields from table BSIS). The remaining fields I want to retrieve from table BSEG (AUGDT, AUGBL, ...), with the condition that the BELNR of both tables should be the same and the account type = 'K' (vendors).
    I tried using select queries for BSIS and BSEG, but data is retrieved only from one table, BSIS, and not from BSEG. So for the data from BSEG I used a work area (wa); data is retrieved for only a single document number and not for the other documents.
    If there is any problem understanding this, please give me your mail id and I will send the details.
    Thanks and regards,     
    Anand.

    Hi Anand,
    You should not use a work area if you are expecting N records. Please use an internal table.
    SELECT * FROM BSIS
        into table i_bsis
         where ....
    if not i_bsis[] is initial.
    sort i_bsis by belnr.
    SELECT * FROM BSEG
    into table i_bseg
    for all entries in i_bsis
    where belnr = i_bsis-belnr.
    endif.
    Best regards,
    Prashant

  • Adding Data From One Table to Another

    Now, this doesn't strike me as a particularly complex problem, but I've either strayed outside the domain of Numbers or I'm just not looking at the problem from the right angle. In any case, I'm sure you guys can offer some insight.
    What I'm trying to do is, essentially, move data from one table to another. One table is a calendar, a simple two column 'date/task to be completed' affair, the other is a schedule of jogging workouts, i.e, times, distances. Basically, I'm trying to create a formula that copies data from the second table onto the first but only for odd days of the week, excepting Sundays (and assuming Monday as the start of the week). Now, this isn't the hard part, I can do that. The problem comes when I replicate the formula down the calendar. Even on the days when the 'if' statement identifies it as an 'even day', the cell reference to the appropriate workout on the second table is incremented, so when it comes to the next 'odd day', it has skipped a workout.
    I can't seem to see any way of getting it to specifically copy the NEXT line in the second table, and not the corresponding line.
    This began as a distraction to try and organise my running so I could see at a glance what I had to do that day and track my progress, but now it's turned into an obsession. SURELY there's a solution?
    Cheers.

    Hi Sealatron,
    Welcome to Apple Discussions and the Numbers '09 forum.
    Several possible ways to move the data occur to me, but the devil's in the details of how the data is currently arranged.
    Is it
    • a list of three workouts, one for each of Monday, Wednesday and Friday, then the same three repeated the following week?
    • an open-ended list that does not repeat?
    • something else?
    Regards,
    Barry

  • Error while selecting data from an external table

    Hello all,
    I am getting the following error while selecting data from an external table. Any idea why?
    SQL> CREATE TABLE SE2_EXT (SE_REF_NO VARCHAR2(255),
      2        SE_CUST_ID NUMBER(38),
      3        SE_TRAN_AMT_LCY FLOAT(126),
      4        SE_REVERSAL_MARKER VARCHAR2(255))
      5  ORGANIZATION EXTERNAL (
      6    TYPE ORACLE_LOADER
      7    DEFAULT DIRECTORY ext_tables
      8    ACCESS PARAMETERS (
      9      RECORDS DELIMITED BY NEWLINE
    10      FIELDS TERMINATED BY ','
    11      MISSING FIELD VALUES ARE NULL
    12      (
    13        country_code      CHAR(5),
    14        country_name      CHAR(50),
    15        country_language  CHAR(50)
    16      )
    17    )
    18    LOCATION ('SE2.csv')
    19  )
    20  PARALLEL 5
    21  REJECT LIMIT UNLIMITED;
    Table created.
    SQL> select * from se2_ext;
    SQL> select count(*) from se2_ext;
    select count(*) from se2_ext
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04043: table column not found in external source: SE_REF_NO
    ORA-06512: at "SYS.ORACLE_LOADER", line 19

    It would appear that your external table definition and the external data file do not match up. Post a few input records so someone can duplicate the problem and determine the fix.
    HTH -- Mark D Powell --
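    In this particular case the mismatch is visible in the DDL itself: the field list inside ACCESS PARAMETERS describes country_code/country_name/country_language, while the table columns are SE_REF_NO, SE_CUST_ID, SE_TRAN_AMT_LCY and SE_REVERSAL_MARKER, hence KUP-04043. A hedged sketch of a matching definition (the CHAR field sizes are assumptions):
    CREATE TABLE se2_ext (
      se_ref_no          VARCHAR2(255),
      se_cust_id         NUMBER(38),
      se_tran_amt_lcy    FLOAT(126),
      se_reversal_marker VARCHAR2(255)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_tables
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
        ( se_ref_no          CHAR(255),
          se_cust_id         CHAR(40),
          se_tran_amt_lcy    CHAR(40),
          se_reversal_marker CHAR(255) )
      )
      LOCATION ('SE2.csv')
    )
    REJECT LIMIT UNLIMITED;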

  • Can we export DATA from all tables in a schema?

    Hi,
    I have a question: can we export all the DATA from all the tables present in a schema to some target format (either CSV, TXT, DAT, etc.)?
    Or
    Can I have a PL/SQL procedure to display DATA from all Tables in a schema?
    With Best Regards,
    - Naveed

    Hi,
    This is pretty much what you need.
    DECLARE
        v_count       NUMBER;
        v_sql         VARCHAR2(1000);
        in_owner_name VARCHAR2(100) := 'AP';   -- schema name
        TYPE t_col_name IS TABLE OF VARCHAR2(1000) INDEX BY BINARY_INTEGER;
        v_col_tbl     t_col_name;
    BEGIN
        FOR i IN (SELECT object_name
                    FROM dba_objects
                   WHERE owner = in_owner_name
                     AND object_type = 'TABLE'
                     AND rownum < 2)
        LOOP
            v_sql := 'SELECT COUNT(*) FROM ' || i.object_name;
            EXECUTE IMMEDIATE v_sql INTO v_count;
            IF v_count > 0 THEN
                v_sql := 'SELECT * FROM ' || i.object_name;
                SELECT column_name
                  BULK COLLECT INTO v_col_tbl
                  FROM dba_tab_columns
                 WHERE table_name = i.object_name
                   AND owner      = in_owner_name;
                -- start selecting the columns and exporting using the column names selected
            END IF;
            IF v_col_tbl.COUNT > 0 THEN
                FOR j IN v_col_tbl.FIRST .. v_col_tbl.LAST LOOP
                    DBMS_OUTPUT.PUT_LINE(v_col_tbl(j));
                END LOOP;
            END IF;
            DBMS_OUTPUT.PUT_LINE(i.object_name || '-' || v_count);
        END LOOP;
    END;
    - Ronel
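    As a complement, here is a minimal hedged sketch of the inner "export" step for a single table: it builds the SELECT list from the dictionary and prints comma-separated lines with DBMS_OUTPUT. The schema 'AP' and table 'INVOICES' are placeholder assumptions; for a whole schema you would wrap this in a loop over dba_tables, and use UTL_FILE if you want real files rather than spooled output (LOB columns would need extra handling):
    DECLARE
        v_cols  VARCHAR2(4000);
        v_sql   VARCHAR2(4000);
        TYPE t_lines IS TABLE OF VARCHAR2(4000);
        v_lines t_lines;
    BEGIN
        -- build "COL1 || ',' || COL2 || ..." from the dictionary
        FOR c IN (SELECT column_name
                    FROM all_tab_columns
                   WHERE owner = 'AP'             -- assumed schema
                     AND table_name = 'INVOICES'  -- assumed table
                   ORDER BY column_id)
        LOOP
            IF v_cols IS NOT NULL THEN
                v_cols := v_cols || ' || '','' || ';
            END IF;
            v_cols := v_cols || c.column_name;
        END LOOP;
        v_sql := 'SELECT ' || v_cols || ' FROM AP.INVOICES';
        EXECUTE IMMEDIATE v_sql BULK COLLECT INTO v_lines;
        FOR i IN 1 .. v_lines.COUNT LOOP
            DBMS_OUTPUT.PUT_LINE(v_lines(i));     -- spool this to a .csv file
        END LOOP;
    END;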

  • Copying a large amount of data from one table to another is getting slower

    I have a process that copies data from one table (big_tbl) into a very big archive table (vb_archive_tbl, 30 million records, partitioned). If there are fewer than 1 million records in big_tbl, the copy to vb_archive_tbl is fast (about 10 minutes) and, more importantly, consistent. However, if the number of records in big_tbl is greater than 1 million, copying the data into vb_archive_tbl is very slow (30 minutes to 4 hours) and very inconsistent. Every few days the time it takes to copy the same amount of data grows significantly.
    Here's an example of the code I'm using, which uses BULK COLLECT and FORALL INSERT to copy the data.
    I occasionally change 'LIMIT 5000' to see performance differences.
    DECLARE
        TYPE t_rec_type IS RECORD (fact_id    NUMBER(12,0),
                                   store_id   VARCHAR2(10),
                                   product_id VARCHAR2(20));
        TYPE cff_type IS TABLE OF t_rec_type
            INDEX BY BINARY_INTEGER;
        t_cff cff_type;
        CURSOR c_cff IS SELECT *
                          FROM big_tbl;
    BEGIN
        OPEN c_cff;
        LOOP
            FETCH c_cff BULK COLLECT INTO t_cff LIMIT 5000;
            FORALL i IN t_cff.first..t_cff.last
                INSERT INTO vb_archive_tbl
                VALUES t_cff(i);
            COMMIT;
            EXIT WHEN c_cff%NOTFOUND;
        END LOOP;
        CLOSE c_cff;
    END;
    Thanks you very much for any advice
    Edited by: reid on Sep 11, 2008 5:23 PM

    Assuming that there is nothing else in the code that forces you to use PL/SQL for processing, I'll second Tubby's comment that this would be better done in SQL. Depending on the logic and partitioning approach for the archive table, you may be better off doing a direct-path load into a staging table and then doing a partition exchange to load the staging table into the partitioned table. Ideally, you could just move big_tbl into the vb_archive_tbl with a single partition exchange operation.
    That said, if there is a need for PL/SQL, have you traced the session to see what is causing the slowness? Is the query plan different? If the number of rows in the table is really a trigger, I would tend to suspect that the number of rows is causing the optimizer to choose a different plan (with your sample code, the plan is obvious, but perhaps you omitted some where clauses to simplify things down) which may be rather poor.
    Justin
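    A hedged sketch of the partition-exchange route Justin describes (the staging table vb_archive_stage and the partition name p_current are assumptions):
    -- 1) direct-path load the rows to be archived into a staging table
    --    shaped like one partition of vb_archive_tbl
    INSERT /*+ APPEND */ INTO vb_archive_stage
    SELECT * FROM big_tbl;
    COMMIT;
    -- 2) swap the loaded segment into the partitioned archive table
    ALTER TABLE vb_archive_tbl
      EXCHANGE PARTITION p_current WITH TABLE vb_archive_stage
      INCLUDING INDEXES WITHOUT VALIDATION;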

  • Moving time-dependent data from one table to another (archiving)

    Hello all
    I would like to know if there's an easier solution or a "best practice" to move data from one table to another. The context of this issue can be found within "archiving".
    More concretely: we have an application that uses several tables to log information to.
    These tables are growing like crazy, and we would like to keep only "relevant" data in them, so I was thinking about moving data that has been in those tables for, say, 2 months to "archiving" tables.
    I figured there must be some kind of "best practice" to get this done.
    I have already written a procedure that loops over the table that has the time indicator and inserts the records from the normal tables into the archive tables (and afterwards deletes this data), but it seems to be taking ages to get it done.
    Thanks in advance!
    Message was edited by:
    timschraepen

    You do not need PL/SQL for this.
    You can refer below links:
    http://www.lc.leidenuniv.nl/awcourse/oracle/server.920/a96524/c12parti.htm
    http://www.stanford.edu/dept/itss/docs/oracle/10g/server.101/b10739/partiti.htm#i1006727
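    In plain SQL, the set-based version of that archiving procedure is just two statements per table; a hedged sketch (the table and date-column names are assumptions), with date-range partitioning from those links as the longer-term fix so old data can be dropped or exchanged one partition at a time:
    INSERT /*+ APPEND */ INTO app_log_archive
    SELECT * FROM app_log
     WHERE log_date < ADD_MONTHS(TRUNC(SYSDATE), -2);
    COMMIT;
    DELETE FROM app_log
     WHERE log_date < ADD_MONTHS(TRUNC(SYSDATE), -2);
    COMMIT;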

  • Insert old missing data from one table to another (database trigger)

    Hello,
    I want to do two things:
    1) I want to insert old, missing data from one table into another through a database trigger, but it cannot be executed that way, and I don't know what to do to copy the old data from table_1 into table_2.
    2) Should I use :NEW or :OLD?
    3) What should I do if records already exist between the two dates? I want to suppress the existing records.
    The following code is what I have, but it has no effect.
    CREATE OR REPLACE TRIGGER ATTENDANCEE_FOLLOWS
    AFTER INSERT ON ACCESSLOG
    REFERENCING NEW AS NEW OLD AS OLD
    FOR EACH ROW
    DECLARE
    V_COUNT       NUMBER(2);
    V_TIME_OUT    DATE;
    V_DATE_IN     DATE;
    V_DATE_OUT    DATE;
    V_TIME_IN     DATE;
    V_ATT_FLAG    VARCHAR2(3);
    V_EMP_ID      NUMBER(11);
    CURSOR EMP_FOLLOWS IS
    SELECT   EMPLOYEEID , LOGDATE , LOGTIME , INOUT
    FROM     ACCESSLOG
    WHERE    LOGDATE
    BETWEEN  TO_DATE('18/12/2008','dd/mm/rrrr') 
    AND      TO_DATE('19/12/2008','dd/mm/rrrr');
    BEGIN
    FOR EMP IN EMP_FOLLOWS LOOP
    SELECT COUNT(*)
    INTO  V_COUNT
    FROM  EMP_ATTENDANCEE
    WHERE EMP_ID    =  EMP.EMPLOYEEID
    AND    DATE_IN   =  EMP.LOGDATE
    AND    ATT_FLAG = 'I';
    IF V_COUNT = 0  THEN
    INSERT INTO EMP_ATTENDANCEE (EMP_ID, DATE_IN ,DATE_OUT
                                ,TIME_IN ,TIME_OUT,ATT_FLAG)
         VALUES (TO_NUMBER(TO_CHAR(:NEW.employeeid,99999)),
                 TO_DATE(:NEW.LOGDATE,'dd/mm/rrrr'),       -- DATE_IN
                 NULL,
                 TO_DATE(:NEW.LOGTIME,'HH24:MI:SS'),      -- TIME_IN
                 NULL ,'I');
    ELSIF   V_COUNT > 0 THEN
    UPDATE  EMP_ATTENDANCEE
        SET DATE_OUT       =  TO_DATE(:NEW.LOGDATE,'dd/mm/rrrr'), -- DATE_OUT,
            TIME_OUT       =   TO_DATE(:NEW.LOGTIME,'HH24:MI:SS'), -- TIME_OUT
            ATT_FLAG       =   'O'
            WHERE EMP_ID   =   TO_NUMBER(TO_CHAR(:NEW.employeeid,99999))
            AND   DATE_IN <=  (SELECT MAX (DATE_IN )
                               FROM EMP_ATTENDANCEE
                               WHERE EMP_ID = TO_NUMBER(TO_CHAR(:NEW.employeeid,99999))
                               AND   DATE_OUT IS NULL
                               AND   TIME_OUT IS NULL )
    AND   DATE_OUT  IS NULL
    AND   TIME_OUT IS NULL  ;
    END IF;
    END LOOP;
    EXCEPTION
    WHEN OTHERS THEN RAISE;
    END ATTENDANCEE_FOLLOWS ;
                            Regards,
    Abdetu..

    INSERT INTO SALES_MASTER
       ( NO
       , Name
       , PINCODE )
       SELECT SALESMANNO
            , SALESMANNAME
            , PINCODE
         FROM SALESMAN_MASTER;
    Regards,
    Christian Balz

  • Loading data from one table to another table

    I need to load the data (data conversion) from one table into 3 different tables.
    I have to load the data from the source table into the following 3 target tables.
    The CONSUMER table has EMAIL as its primary key and has a relationship with the CONSUMER_RCV table, and the CONSUMER_RCV table has a relationship with the CONSUMER_ATTR table. I am doing this right now with PL/SQL, but I am looking to use a multi-table INSERT, SQL*Loader, or another efficient way.
    I tried a multi-table INSERT, but it fails with unique-constraint violations because the source table may contain duplicate emails.
    Could you please show me how I can load the data efficiently?
    Also, I need to load the CONSUMER_ATTR table from the CONTACT table, which has ATTR1 and ATTR2 columns.
    So my mapping will be:
    CONTACT.EMAIL, FIRST_NAME and LAST_NAME will go into the CONSUMER table.
    Then in CONSUMER_RCV I generate the ID column value from a sequence, load EMAIL and DATE_CREATED from the CONTACT table, and the rest of the values are hard-coded.
    Then in the CONSUMER_ATTR table I generate CONSUMER_ATTR_ID from a sequence; for ID I use the same value that the sequence generated for CONSUMER_RCV, set ATTR_TYPE = 'ATTR1', and for ATTR_VALUE I insert the value from the CONTACT table's ATTR1 column.
    Then I generate another CONSUMER_ATTR_ID, again use the same ID value generated for CONSUMER_RCV, set ATTR_TYPE = 'ATTR2', and for ATTR_VALUE I insert the value from the CONTACT table's ATTR2 column.
    Please let me know if you need further explanation.
    I am using Oracle 9i R2.
    Please consider CONTACT as the source table, and CONSUMER as Target1, CONSUMER_RCV as Target2 and CONSUMER_ATTR as Target3,
    to simplify the problem.
    Following is the Table structure
    =========================
    CREATE TABLE CONTACT (
    ID VARCHAR2 (40) NOT NULL, -- will go into Target1
    EMAIL VARCHAR2 (100) ,      -- might be duplicate and will go into Target1 and Target2
    FIRST_NAME VARCHAR2 (100) NOT NULL, -- will go into Target1
    LAST_NAME VARCHAR2 (100) NOT NULL, -- will go into Target1
    COUNTRY VARCHAR2 (40) NOT NULL, -- will go into Target1
    PHONE VARCHAR2 (100), -- will go into Target1
    NOTIFY VARCHAR2 (3), -- will go into Target2 table's RECEIVE_EMAIL column
    CREATE_DATE DATE ,               -- will go into Target1 and target2
    ATTR1 VARCHAR2 (400),           -- will go into Target3 as attr_type= ATTR1 and will load actual value in ATTR_VALUE
    ATTR2 VARCHAR2(100),          -- will go into Target3 as attr_type= ATTR2 and will load actual value in ATTR_VALUE
    CONSTRAINT CONTACT_PK
    PRIMARY KEY ( ID ) ) ;
    CREATE TABLE CONSUMER(
    EMAIL VARCHAR2 (100) NOT NULL, -- PK
    TITLE VARCHAR2 (40),
    FIRST_NAME VARCHAR2 (40),
    LAST_NAME VARCHAR2 (40),
    ADDRESS1 VARCHAR2 (40),
    ADDRESS2 VARCHAR2 (40),
    CITY VARCHAR2 (30),
    STATE          VARCHAR2 (30),
    ZIP          VARCHAR2 (10),
    COUNTRY VARCHAR2 (40),
    PHONE VARCHAR2 (15),
    DATE_CREATED DATE NOT NULL,
    CONSTRAINT CONSUMER_PK
    PRIMARY KEY ( EMAIL ) ) ;
    CREATE TABLE CONSUMER_RCV (
    ID                VARCHAR2 (40) NOT NULL,-- PK
    EMAIL VARCHAR2 (100) NOT NULL,-- FK reference to Consumer
    SITE VARCHAR2 (100) NOT NULL, -- default website
    RECEIVE_EMAIL VARCHAR2 (1),
    CREATE_DATE DATE NOT NULL,
    CONSTRAINT CONSUMER_RCV_PK
    PRIMARY KEY (ID) ) ;
    ALTER TABLE CONSUMER_RCV ADD CONSTRAINT CONSUMER_RCV_FK
    FOREIGN KEY (EMAIL)
    REFERENCES CONSUMER (EMAIL) ;
    CREATE TABLE CONSUMER_ATTR (
    CONSUMER_ATTR_ID           VARCHAR2 (40) NOT NULL, -- PK
    ID                     VARCHAR2 (40) NOT NULL, -- FK reference to COnsumer_RCV
    ATTR_TYPE           VARCHAR2 (100) NOT NULL,
    ATTR_VALUE           VARCHAR2 (4000) NOT NULL,
    CONSTRAINT CONSUMER_ATTR_PK
    PRIMARY KEY ( CONSUMER_ATTR_ID ) ) ;
    ALTER TABLE CONSUMER_ATTR ADD CONSTRAINT CONSUMER_ATTR_FK
    FOREIGN KEY (ID)
    REFERENCES CONSUMER_RCV (ID) ;

    Hi Hema,
    How are the entries related? Is it the case that for one entry in the Y table there is more than one entry in the X table? If so, you have to use a loop within a loop. Here are both cases:
    1) For each entry in Y there is more than one entry in X
      sort y by quota trpid.
      sort x by quota trpid.
      loop at y.
      loop at x where quota eq y-quota
                            and trpid eq y-trpid.
      z-value = x-value.
      append z.
      endloop.
      endloop.
    2) For each y there is one entry in x table.
    sort y by quota trpid.
      sort x by quota trpid.
      loop at y.
      read table x with key quota = y-quota
                                trpid = y-trpid binary search.
    if sy-subrc eq 0.
      z-value = x-value.
      append z.
    Endif.
      endloop.
    Mahesh
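    Back to the original Oracle question: a hedged 9i-style sketch of the multi-table insert, with duplicate emails removed first so the CONSUMER primary key is not violated. The ROW_NUMBER de-duplication rule, the sequence name and the hard-coded SITE value are assumptions; the CONSUMER_ATTR rows are easier as a separate INSERT ... SELECT joined back to CONSUMER_RCV on EMAIL.
    INSERT ALL
      INTO consumer     (email, first_name, last_name, country, phone, date_created)
      VALUES            (email, first_name, last_name, country, phone, create_date)
      INTO consumer_rcv (id, email, site, receive_email, create_date)
      VALUES            (consumer_rcv_seq.NEXTVAL, email, 'www.example.com',
                         SUBSTR(notify, 1, 1), create_date)
    SELECT email, first_name, last_name, country, phone, notify, create_date
    FROM  (SELECT c.*,
                  ROW_NUMBER() OVER (PARTITION BY c.email
                                     ORDER BY c.create_date DESC) AS rn
             FROM contact c)
    WHERE  rn = 1;   -- keeps one row per email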

  • Copying data from one table to another, but without duplicates

    Good afternoon!
    I am new to Oracle SQL and I have a difficulty.
    I have a script that copies data from one table into another table.
    If table 01 already has "Registry 01", then when the data is copied to table 02, "Registry 01" must not be duplicated again.
    Since the table has existed since the beginning of the year before last and contains duplicate information, I cannot apply a UNIQUE constraint because of the error it raises; I have to make this change from now on.
    How do I perform this validation so that no data is duplicated?
    DECLARE
        w_cont NUMBER;
        CURSOR c_simpro IS
            SELECT sc.cd_simpro,
                   sc.ds_produto,
                   sp.qt_embalagem,
                   MAX(sp.dt_vigencia)
              FROM simpro_cadastro sc,
                   simpro_preco    sp
             WHERE sc.cd_simpro = sp.cd_simpro
             GROUP BY sc.cd_simpro,
                      sc.ds_produto,
                      sp.qt_embalagem;
    BEGIN
        FOR r_simpro IN c_simpro LOOP
            w_cont := 0;
            SELECT COUNT(1)
              INTO w_cont
              FROM pls_material pm
             WHERE pm.cd_material_ops = r_simpro.cd_simpro;
            IF w_cont = 0 THEN
                INSERT INTO pls_material(nr_sequencia,
                                         dt_atualizacao,
                                         nm_usuario,
                                         dt_atualizacao_nrec,
                                         nm_usuario_nrec,
                                         ie_tipo_despesa,
                                         cd_estabelecimento,
                                         nr_seq_estrut_mat,
                                         cd_simpro,
                                         ds_material,
                                         ie_situacao,
                                         ds_material_sem_acento,
                                         dt_inclusao,
                                         cd_material_ops_orig,
                                         cd_unidade_medida,
                                         cd_material_ops,
                                         qt_conversao_simpro)
                VALUES(pls_material_seq.nextval,
                       SYSDATE,
                       'ES-SIMPRO',
                       SYSDATE,
                       'ES-SIMPRO',
                       3,
                       1,
                       3,
                       r_simpro.cd_simpro,
                       r_simpro.ds_produto,
                       'A',
                       r_simpro.ds_produto,
                       SYSDATE,
                       r_simpro.cd_simpro,
                       'un',
                       TRIM(to_char(r_simpro.cd_simpro,'0000099999')),
                       r_simpro.qt_embalagem);
                COMMIT;
            END IF;
            IF w_cont > 0 THEN
                UPDATE pls_material p
                   SET p.qt_conversao_simpro = r_simpro.qt_embalagem,
                       p.dt_atualizacao      = SYSDATE
                 WHERE p.cd_simpro = r_simpro.cd_simpro;
                COMMIT;
            END IF;
        END LOOP;
    END;
    Edited by: 983464 on 22/01/2013 10:30

    Hi,
    In addition to what Marwin has already said, I suggest you post CREATE TABLE and INSERT statements (as mentioned in the FAQ).
    The error you are getting from the MERGE command is because you need a way to uniquely identify rows within the table. So it is important to know whether your table has a primary key or a unique index, so that those keys can be used in the MERGE command.
    Additionally when you put some code or output please enclose it between two lines starting with {noformat}{noformat}
    i.e.:
    {noformat}{noformat}
    SELECT ...
    {noformat}{noformat}
    Regards.
    Al
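    For reference, a hedged sketch of what that MERGE could look like for the loop above. It assumes cd_simpro uniquely identifies a row both in pls_material and in the grouped source query (otherwise pick one row per cd_simpro first), and the insert column list is abbreviated compared with the original procedure:
    MERGE INTO pls_material p
    USING (SELECT sc.cd_simpro,
                  sc.ds_produto,
                  sp.qt_embalagem
             FROM simpro_cadastro sc,
                  simpro_preco    sp
            WHERE sc.cd_simpro = sp.cd_simpro
            GROUP BY sc.cd_simpro, sc.ds_produto, sp.qt_embalagem) s
    ON (p.cd_simpro = s.cd_simpro)
    WHEN MATCHED THEN UPDATE
         SET p.qt_conversao_simpro = s.qt_embalagem,
             p.dt_atualizacao      = SYSDATE
    WHEN NOT MATCHED THEN INSERT
         (nr_sequencia, dt_atualizacao, nm_usuario, cd_simpro,
          ds_material, ie_situacao, cd_material_ops, qt_conversao_simpro)
         VALUES
         (pls_material_seq.NEXTVAL, SYSDATE, 'ES-SIMPRO', s.cd_simpro,
          s.ds_produto, 'A', TRIM(TO_CHAR(s.cd_simpro, '0000099999')), s.qt_embalagem);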
