Copy a table with long data

I have a table in which one column is LONG. The table also contains some records.
How can I make another table exactly the same as this one (same structure and data)?

hagrid wrote:
I have a table in which one column is LONG. The table also contains some records.
How can I make another table exactly the same as this one (same structure and data)?
Connected to:
Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
SQL> help copy
COPY
Copies data from a query to a table in the same or another
database. COPY supports CHAR, DATE, LONG, NUMBER and VARCHAR2.
COPY {FROM database | TO database | FROM database TO database}
            {APPEND|CREATE|INSERT|REPLACE} destination_table
            [(column, column, column, ...)] USING query
where database has the following syntax:
     username[/password]@connect_identifier

The LONG datatype has been deprecated for at least a decade.
STOP using it NOW!
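That said, a minimal sketch of duplicating a table that still has a LONG column via COPY; the connect string and table names below are placeholders, not taken from the thread:

SET LONG 2000000000
COPY FROM scott/tiger@orcl -
CREATE emp_copy -
USING SELECT * FROM emp

SET LONG must be raised first, or COPY truncates LONG values at the 80-byte default reported in its "Maximum long size" message.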

Similar Messages

  • Moving a table with long data type column

    hi
    1. How do I move a table with a LONG data type column in an 8.1.7.3.0 database?
    alter table APPLSYS.FND_LOBS_DOCUMENT move lob(BLOB_CONTENT) store as (tablespace testing)
    ERROR at line 1:
    ORA-00997: illegal use of LONG datatype
    2. And how do I move a table with a VARRAY type column?
    alter table APPLSYS.WF_ERROR move lob("USER_DATA"."PARAMETER_LIST") store as (tablespace testing)
    ERROR at line 1:
    ORA-22917: use VARRAY to define the storage clause for this column or attribute
    table description is:
    SQL> desc applsys.wf_error;
    Name                 Null?    Type
    Q_NAME                        VARCHAR2(30)
    MSGID                NOT NULL RAW(16)
    CORRID                        VARCHAR2(128)
    PRIORITY                      NUMBER
    STATE                         NUMBER
    DELAY                         DATE
    EXPIRATION                    NUMBER
    TIME_MANAGER_INFO             DATE
    LOCAL_ORDER_NO                NUMBER
    CHAIN_NO                      NUMBER
    CSCN                          NUMBER
    DSCN                          NUMBER
    ENQ_TIME                      DATE
    ENQ_UID                       NUMBER
    ENQ_TID                       VARCHAR2(30)
    DEQ_TIME                      DATE
    DEQ_UID                       NUMBER
    DEQ_TID                       VARCHAR2(30)
    RETRY_COUNT                   NUMBER
    EXCEPTION_QSCHEMA             VARCHAR2(30)
    EXCEPTION_QUEUE               VARCHAR2(30)
    STEP_NO                       NUMBER
    RECIPIENT_KEY                 NUMBER
    DEQUEUE_MSGID                 RAW(16)
    SENDER_NAME                   VARCHAR2(30)
    SENDER_ADDRESS                VARCHAR2(1024)
    SENDER_PROTOCOL               NUMBER
    USER_DATA                     APPS.WF_EVENT_T
    LOB columns:
    SQL> select owner,table_name,column_name from dba_lobs where table_name='WF_ERROR';
    OWNER     TABLE_NAME   COLUMN_NAME
    APPLSYS   WF_ERROR     "USER_DATA"."PARAMETER_LIST"
    APPLSYS   WF_ERROR     "USER_DATA"."EVENT_DATA"
    pls help me
    thanks and regards
    srinivas

    1. Export and import
    2. SQL*Plus 'copy' command
    It is a good idea to move from 'LONG' to 'CLOB'.
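    As a sketch of that LONG-to-CLOB move (the table, column, and tablespace names below are placeholders, not srinivas's actual objects):
    -- In-place conversion, available since Oracle 8i; note it is one-way:
    ALTER TABLE t_old MODIFY (long_col CLOB);
    -- After the conversion, the move works, with the LOB storage clause now allowed:
    ALTER TABLE t_old MOVE TABLESPACE testing LOB (long_col) STORE AS (TABLESPACE testing);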

  • How to copy a table with LONG and CLOB datatype over a dblink?

    Hi All,
    I need to copy a table from an external database into a local one. Note that this table has both LONG and CLOB datatypes included.
    I have taken 2 approaches to do this:
    1. Use the CREATE TABLE AS....
    SQL> create table XXXX_TEST as select * from XXXX_INDV_DOCS@ext_db;
    create table XXXX_TEST as select * from XXXX_INDV_DOCS@ext_db
    ERROR at line 1:
    ORA-00997: illegal use of LONG datatype
    2. After reading some threads I tried to use the COPY command:
    SQL> COPY FROM xxxx/pass@ext_db TO xxxx/pass@target_db REPLACE XXXX_INDV_DOCS USING SELECT * FROM XXXX_INDV_DOCS;
    Array fetch/bind size is 15. (arraysize is 15)
    Will commit when done. (copycommit is 0)
    Maximum long size is 80. (long is 80)
    CPY-0012: Datatype cannot be copied
    If my understanding is correct, the 1st statement fails because there is a LONG datatype in the XXXX_INDV_DOCS table, and the 2nd one fails because there is a CLOB datatype.
    Is there a way to copy the entire table (all columns, including both LONG and CLOB) over a dblink?
    Would greatly appreciate any workaround or ideas!
    Regards,
    Pawel.

    Hi Nicolas,
    There is a reason I am not using export/import:
    - I would like to have a one-script solution for this problem (meaning execute one script on one machine)
    - I am not able to make an SSH connection from the target DB to the local one (although it works fine the other way around), which means I cannot copy the dump file from the target server to the local one.
    - with export/import I need to have an SSH connection on the target DB in order to issue the exp command...
    Therefore, I am looking for a solution (or a workaround) which will work over a DBLINK.
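    For what it's worth, a one-script sketch of a two-pass workaround: CTAS over a dblink handles CLOBs but not LONGs, while SQL*Plus COPY handles LONGs but not CLOBs, so the copy can be split between them. The key column doc_id, the other column names, and the staging table are assumptions, not Pawel's actual schema.
    SET LONG 2000000000
    -- Pass 1: everything except the LONG column (CLOBs copy fine in a CTAS over a dblink):
    CREATE TABLE xxxx_test AS
      SELECT doc_id, clob_col FROM xxxx_indv_docs@ext_db;
    -- Pass 2: the key plus the LONG column, via COPY, into a staging table:
    COPY FROM xxxx/pass@ext_db -
    CREATE xxxx_long_stage -
    USING SELECT doc_id, long_col FROM xxxx_indv_docs
    -- Then convert the staged LONG to a CLOB (e.g. TO_LOB in a CTAS) and join it back on doc_id.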
    Regards,
    Pawel.

  • Function Module To Copy Abap tables with data

    Hi,
    Is there an easy way to copy ABAP tables with their data to another destination table using function modules or sample code?

    What do you mean by "destination"? Another SAP system, e.g. an RFC destination?
    And what kind of table do you need to copy? Z-tables? Standard tables?
    If you mean standard tables, that isn't a safe procedure.

  • Using DBMS_DATAPUMP with LONG data type

    I've got a procedure below that uses the DBMS_DATAPUMP package with a REMOTE_LINK to move a schema from one database to another. However, a couple of the tables within that schema have columns with the LONG data type, and when I run it I get an error saying that you cannot move data with the LONG data type using a REMOTE LINK. So no data in those particular tables gets moved over.
    Has anyone else had this issue? If so, do you have a workaround? I tried adding a CLOB column to my table and setting the new CLOB equal to the LONG, but I couldn't get that to work either...even when I tried using TO_LOB. If I could get that to work, then I could just drop the LONG, move the schema, then recreate the LONG column on the opposite side.
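    (One hedged guess at why the TO_LOB attempt failed: TO_LOB is only valid in the select list of a CREATE TABLE ... AS SELECT or an INSERT ... SELECT, not in an UPDATE ... SET. A sketch with made-up names:)
    -- Rejected: UPDATE t SET clob_col = TO_LOB(long_col) ;
    -- Accepted:
    CREATE TABLE t_stage AS
      SELECT id, TO_LOB(long_col) AS clob_col FROM t ;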
    Here's my procedure....
    DECLARE
       /* EXPORT/IMPORT VARIABLES */
       v_dp_job_handle           NUMBER ;         -- Data Pump job handle
       v_count                   NUMBER ;         -- Loop index
       v_percent_done            NUMBER ;         -- Percentage of job complete
       v_job_state               VARCHAR2(30) ;   -- To keep track of job state
       v_message                 KU$_LOGENTRY ;   -- For WIP and error messages
       v_job_status              KU$_JOBSTATUS ;  -- The job status from get_status
       v_status                  KU$_STATUS ;     -- The status object returned by get_status
       v_logfile                 NUMBER ;
       v_date                    VARCHAR2(13) ;
       v_project                 VARCHAR2(30) ;   -- Schema to move
       v_source_server_name      VARCHAR2(50) ;
       v_destination_server_name VARCHAR2(50) ;
    BEGIN
       v_project := 'TEST' ;
       v_date := TO_CHAR(SYSDATE, 'MMDDYYYY_HHMI') ;
       v_source_server_name := 'TEST_DB' ;
       v_dp_job_handle := DBMS_DATAPUMP.OPEN(
          OPERATION   => 'IMPORT',
          JOB_MODE    => 'SCHEMA',
          REMOTE_LINK => v_source_server_name,
          JOB_NAME    => v_project||'_EXP_'||v_date,
          VERSION     => 'LATEST') ;
       v_logfile := DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE ;
       DBMS_DATAPUMP.ADD_FILE(
          HANDLE    => v_dp_job_handle,
          FILENAME  => v_project||'_EXP_'||v_date||'.LOG',
          DIRECTORY => 'DATAPUMP',
          FILETYPE  => v_logfile) ;
       DBMS_DATAPUMP.METADATA_FILTER(
          HANDLE => v_dp_job_handle,
          NAME   => 'SCHEMA_EXPR',
          VALUE  => '= '''||v_project||''' ') ;
       DBMS_DATAPUMP.START_JOB(v_dp_job_handle) ;
       v_percent_done := 0 ;
       v_job_state := 'UNDEFINED' ;
       WHILE (v_job_state != 'COMPLETED') AND (v_job_state != 'STOPPED')
       LOOP
          DBMS_DATAPUMP.GET_STATUS(
             v_dp_job_handle,
             DBMS_DATAPUMP.KU$_STATUS_JOB_ERROR + DBMS_DATAPUMP.KU$_STATUS_JOB_STATUS + DBMS_DATAPUMP.KU$_STATUS_WIP,
             -1,
             v_job_state,
             v_status) ;
          v_job_status := v_status.JOB_STATUS ;
          IF v_job_status.PERCENT_DONE != v_percent_done THEN
             DBMS_OUTPUT.PUT_LINE('*** Job percent done = '||TO_CHAR(v_job_status.PERCENT_DONE)) ;
             v_percent_done := v_job_status.PERCENT_DONE ;
          END IF ;
          IF BITAND(v_status.MASK, DBMS_DATAPUMP.KU$_STATUS_WIP) != 0 THEN
             v_message := v_status.WIP ;
          ELSIF BITAND(v_status.MASK, DBMS_DATAPUMP.KU$_STATUS_JOB_ERROR) != 0 THEN
             v_message := v_status.ERROR ;
          ELSE
             v_message := NULL ;
          END IF ;
          IF v_message IS NOT NULL THEN
             v_count := v_message.FIRST ;
             WHILE v_count IS NOT NULL
             LOOP
                DBMS_OUTPUT.PUT_LINE(v_message(v_count).LOGTEXT) ;
                v_count := v_message.NEXT(v_count) ;
             END LOOP ;
          END IF ;
       END LOOP ;
       DBMS_OUTPUT.PUT_LINE('Job has completed') ;
       DBMS_OUTPUT.PUT_LINE('Final job state = '||v_job_state) ;
       DBMS_DATAPUMP.DETACH(v_dp_job_handle) ;
    END ;

    "But the application we have that uses the database cannot be changed to read from a CLOB"
    Why can't you change the application?
    Well, anyway, you should point out to your superiors that Oracle documented years ago that LONGs should no longer be used...
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14220/datatype.htm#sthref3806
    It clearly states:
    LONG Datatype
    Note:
    Do not create tables with LONG columns. Use LOB columns (CLOB, NCLOB) instead. LONG columns are supported only for backward compatibility.
    Oracle also recommends that you convert existing LONG columns to LOB columns. LOB columns are subject to far fewer restrictions than LONG columns. Further, LOB functionality is enhanced in every release, whereas LONG functionality has been static for several releases.
    "How do I go from CLOB to LONG?"
    I'm sorry, I cannot help you on that one; I don't think you can do that at all (Oracle wants us to stop using LONGs, so it's a one-way conversion...):
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:1037232794454#15512131314505
    So: no built-in; you'll need to write a program. If the CLOB is ALWAYS LESS THAN 32K in size, you can use PL/SQL...but is that the case in your case? Only you know that.
    I believe that question is still unanswered on this forum, but you might try searching for answers on this forum and the 'Database-General' forum: General Database Discussions
    Perhaps you can google a Q&D workaround...
    (And consider convincing your colleagues to just convert your LONGs to LOBs.)
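    A minimal sketch of that PL/SQL route, workable only if every CLOB value fits in a 32K PL/SQL VARCHAR2 (the table and column names are hypothetical):
    DECLARE
       v_buf VARCHAR2(32767) ;
    BEGIN
       FOR r IN (SELECT id, clob_col FROM t) LOOP
          -- DBMS_LOB.SUBSTR(lob, amount, offset) returns a VARCHAR2,
          -- and a VARCHAR2 value can be assigned into a LONG column:
          v_buf := DBMS_LOB.SUBSTR(r.clob_col, 32767, 1) ;
          UPDATE t SET long_col = v_buf WHERE id = r.id ;
       END LOOP ;
       COMMIT ;
    END ;
    /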
    Edited by: hoek on Apr 8, 2009 5:43 PM

  • Exporting a table with Long datatype col. name

    Hello,
    I need help exporting a table; one of the columns is of LONG datatype.
    How can I do this without using the export utility?
    Is it possible?
    (If someone has a solution then PLEASE Email
    me)
    URGENT.
    Thanks.
    Pankaj Patel.

    Just wanted to find out if you already solved this problem. I have a similar issue with a LONG column. I am trying to dump a table with a LONG column to a spool file that will be imported into another database (probably using SQL*Loader), but the spooled file puts the data from the LONG column on separate 80-character lines instead of on a single line. I have set LONG to 64000 and LINESIZE to 32000. WRAP is on too.
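    A hedged guess at the missing setting: SQL*Plus wraps LONG output at LONGCHUNKSIZE, which defaults to 80 regardless of LINESIZE, so something like the following may be needed (the table and column names are made up):
    SET LONG 64000
    SET LONGCHUNKSIZE 32000
    SET LINESIZE 32000
    SET TRIMSPOOL ON
    SPOOL dump.txt
    SELECT long_col FROM my_table;
    SPOOL OFF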

  • How to fill internal table with no data in debugging mode

    Hi all,
    I modified an existing program and now I want to test it, but I was not given test data. In the middle of debugging I found that one internal table has no data. My problem is how to fill that internal table with a few records while in debugging mode, just as we can change contents in debugging mode. To proceed further, that internal table must have some records.
    I don't know how to create test data, so I am trying to create values temporarily in debugging mode only.
    Thanks,
    Balaji

    Hi,
    In the debugger, do the following:
    Click the Table button..
    Double click on the internal table name..
    Then in the bottom of the screen you will get the buttons like CHANGE, INSERT, APPEND, DELETE..
    Use the APPEND button to insert records to the internal table..
    Thanks,
    Naren

  • KE28 with" Copy Characteristic Value with Reference Data" doesn't work

    Hi experts:
    We need to run a top-down distribution with the processing option 'Copy Characteristic Value with Reference Data'. We have the following source data:
    Customer   Business Unit   Value field
    6          #               100
    On the other hand, we have plan data as reference data:
    Customer   Business Unit   Value field
               A               40
               C               40
               D               20
    We need to run a top-down distribution from customer to Business Unit while copying the customer from the reference data. So we set 'Copy Characteristic Value with Reference Data' and, in the selection criteria, we set '*' for customer.
    After running it, the system finds 1 sender and 3 receivers, which is exactly what we were expecting. However, the program doesn't create individual line items and nothing is distributed.
    We have found notes 1086282 and 1273924 but the result is the same after implementing these notes.
    Thanks in advance for your help.
    Best regards
    Jose

    There are some restrictions on the XML Schema format that you can report off of in Crystal Reports.
    If you're using the ODBC XML driver, you may find this of relevance:
    [http://resources.businessobjects.com/support/communitycs/TechnicalPapers/cr_xml_data_sources.pdf|http://resources.businessobjects.com/support/communitycs/TechnicalPapers/cr_xml_data_sources.pdf]
    and if you're using the native XML driver, the following gives a guide for the accepted formats:
    [http://resources.businessobjects.com/support/communitycs/TechnicalPapers/cr_xi_native_xml_driver.pdf|http://resources.businessobjects.com/support/communitycs/TechnicalPapers/cr_xi_native_xml_driver.pdf]
    Sincerely,
    Ted Ueda

  • Query on a table with indexed date field

    I have a table with a date column which is indexed. If I run a query like "select column1 where date_field = '20-JAN-04'", for example, it is fast and uses the index.
    If I run "select column1 where date_field < '20-JAN-04'", it is slow and doesn't use the index. I logged a TAR and Oracle told me that this is to be expected, as not using the index in this case is the most efficient way of doing the query.
    Now, my concept of an index is like the index of the Yellow Pages (telephone directory), for example. In this analogy, if I look for a name that is, say, "Halfords" or below, I can see all entries from Halfords all the way to ZZZ in one block.
    I just can't see, in a common-sense way, why Oracle won't use the index in this type of query.
    George

    Using the concept of a telephone directory is wrong. In a telephone directory you have all information ordered by name. However, in your table (if it is not an IOT) you don't have all information/rows ordered by your date_field. Rather, think of the document "Oracle9i Database Concepts" and its index.
    Let's say you want to find all indexed words greater than "ISO SQL standard" (OK, that doesn't make sense, but it is just an example). So would it be faster to read the whole document, or to look up each word in the index and then read the entire page (Oracle block) to find the word?
    It's not always easy to know in advance whether the query will be faster over the index or a full table scan. What you need to do is analyze (dbms_stats) the table and its index well; in most cases Oracle chooses the right way. You may also use the hint /*+ index(table_name index_name) */ and see whether it would be faster over the index or not.
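    To make those two suggestions concrete, a sketch with hypothetical table and index names:
    -- Refresh optimizer statistics on the table and its indexes:
    EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'MY_TABLE', cascade => TRUE)
    -- Force the index to compare its timing against the full table scan:
    SELECT /*+ index(t my_date_idx) */ column1
    FROM   my_table t
    WHERE  date_field < TO_DATE('20-JAN-04', 'DD-MON-RR');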
    A good document about that subject is:
    http://www.ioug.org/tech/IOUGinDefense.pdf
    HTH
    Maurice

  • How To Create Table with Static Data

    JDEV 10.1.3
    ADF BC
    ADF Faces
    I am trying to make some simple screen/screenflow diagrams to help flesh out some requirements. To do that, I need to make a table with static data that is not hooked up to a data source (because the data model has not yet been clearly defined, and I'm using the diagrams to help iterate the requirements).
    Is it possible to create a table that shows static data (i.e. a set of rows that does not come from a model data source, but rather is hardcoded.)
    If not, how does one create mock ups without actually implementing the data model?
    Thank you.

    Deepak, what specifically in those 2 links from Amis is useful? Those 2 posts are about bind variables, not static lists of values.
    In response to the original poster, I'll attempt to help a little more.
    In the 11g release you can create VOs based on a static list of values. However, in your case on 10.1.3, the best method I've found is to create a VO based on a SELECT <columns> FROM DUAL statement. The columns then include your dummy data. If you need more than one row, simply UNION ALL a number of SELECT statements together.
    What I haven't checked is whether, when you eventually transform the VO based on the SELECT FROM DUAL statement into a VO based on an EO drawing real data from the database, it is an easy process. I recommend you try this out before committing to the approach above. Let us know how you go.
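    A minimal sketch of the query behind such a VO (the column names and values are invented):
    SELECT 'Smith' AS last_name, 'Sales' AS dept FROM DUAL
    UNION ALL
    SELECT 'Jones', 'Finance' FROM DUAL
    UNION ALL
    SELECT 'Lee', 'IT' FROM DUAL;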
    Regards,
    CM.

  • How to compare two rows from two table with different data

    How do I compare two rows from two tables with different data?
    e.g.
    Table 1
    ID   DESC
    1     aaa
    2     bbb
    3     ccc
    Table 2
    ID   DESC
    1     aaa
    2     xxx
    3     ccc
    Result
    2

    CREATE TABLE tab1 (ID int, DE char(10))
    CREATE TABLE tab2 (ID int, DE char(10))
    INSERT INTO tab1 VALUES (1, 'aaa')
    INSERT INTO tab1 VALUES (2, 'bbb')
    INSERT INTO tab1 VALUES (3, 'ccc')
    INSERT INTO tab1 VALUES (4, 'dfe')
    INSERT INTO tab2 VALUES (1, 'aaa')
    INSERT INTO tab2 VALUES (2, 'xx')
    INSERT INTO tab2 VALUES (3, 'ccc')
    INSERT INTO tab2 VALUES (6, 'wdr')
    SELECT tab1.ID, tab2.ID AS T2
    FROM tab1
    FULL JOIN tab2 ON tab1.ID = tab2.ID
    WHERE BINARY_CHECKSUM(tab1.ID, tab1.DE) <> BINARY_CHECKSUM(tab2.ID, tab2.DE)
       OR tab1.ID IS NULL
       OR tab2.ID IS NULL
    The ID column is considered the primary key.
    Apart from differing records, the above query also reports records missing from either table.
    Result Set
    ID    ID
    2     2
    4     NULL
    NULL  6
    ganeshk

  • Sample report for filling the database table with test data .

    Hi ,
    Can anyone provide me with a sample report for filling a database table with test data?
    Thanks ,
    Abhi.

    Hi,
    The code:
    " Build an internal table with two test rows, then insert them
    " into the database table Z6731_DEPTDETAIL.
    DATA: itab TYPE TABLE OF z6731_deptdetail,
          wa   TYPE z6731_deptdetail.
    wa-dept_id     = 'z897hkjh'.
    wa-description = 'computer'.
    APPEND wa TO itab.
    wa-dept_id     = 'z897hkjhd'.
    wa-description = 'computer'.
    APPEND wa TO itab.
    LOOP AT itab INTO wa.
      INSERT z6731_deptdetail FROM wa.
    ENDLOOP.
    Rewards if helpful.

  • Web Analysis : populate the same table with multiple data sources

    Hi folks,
    I would like to know if it is possible to populate a table with multiple data sources.
    For instance, I'd like to create a table with 3 columns : Entity, Customer and AvgCostPerCust.
    Entity and Customer come from one Essbase, AvgCostPerCust comes from HFM.
    The objective is to get a calculated member which is Customer * AvgCostPerCust.
    Any ideas ?
    Once again, thanks for your help.

    I would like to have the following output:
    File 1 - Store 2 - Query A + Store 2 - Query B
    File 2 - Store 4 - Query A + Store 4 - Query B
    File 3 - Store 5 - Query A + Store 5 - Query B
    The bursting level should be given at:
    File 1 - Store 2 - Query A + Store 2 - Query B
    so the tag in the XML has to be split by what is common to these three rows.
    Since the data comes from different queries, it is not going to be under a single tag, so you cannot burst it using a concatenated data source.
    But you can do it using a data template: link the queries and get the data for each file under a single parent query:
    select distinct store_name from all_stores
    select * from query1 where store_name = :store_name === 1st query
    select * from query2 where store_name = :store_name === 2nd query
    define the datastructure the way you wanted,
    the XML will contain something like this:
    <stores>
      <store> </store> - for store 2
      <store> </store> - for store 3
      <store> </store> - for store 4
      <store> </store> - for store 5
    </stores>
    now you can burst it at store level.

  • Copying database table with data

    Dear Team,
    I want to copy a database table along with all entries of that table. For example: if I copy the MARA table as ZMARA, I want all entries of MARA to also exist in ZMARA.
    Is it possible to do this without writing any code?
    Regards,
    N.Jain

    Hi,
    It is not possible to do that without coding.
    Regards,
    Sriram
    Edited by: Srirama Murthy Maddirala on Jun 4, 2008 11:14 AM

  • Error while accessing BSAD Table with dunning date

    Hi ,
    I developed a report for the FI module accessing the BSAD table with default customer ranges and specific dunning dates. It ran for a very long time and timed out (I know this is due to the huge volume of data).
    Is there any way to access the BSAD table efficiently by dunning date (other than creating an index on it)?
    Or any standard function module available ??
    Regards
    Rajesh.

    Hi
    Try the below tables for the dunning data details:
    MHND     Dunning data
    MHNDO    Dunning data version before the next change
    MHNK     Dunning data (account entries)
    MHNKA    Version administration of dunning changes
    MHNKO    Dunning data (account entries) version before the next change
    SKS
