Possible to export Oracle 8.1.6 Solaris / import 8.1.5 Linux?

I tried to export a tablespace from 8.1.6 on Solaris and import into 8.1.5 on Linux. I get an IMP-00010 error complaining that the header is invalid or corrupt. Is this sort of export/import allowed?

quote: Originally posted by Jason Pepper ([email protected]):
You can go from 815 on any OS to 816 on any OS.
You can go from 816 to 815 if you use the 816 import utility.
You cannot use the 815 import utility to import 816 files.
Well, I did just that...
Oracle 8i Release 2 (8.1.6), Windows NT, SP5 - I needed all of the data from several tables in this 8.1.6 db... they needed to go on my Linux (2.2.14, Red Hat, Oracle 8.1.5.2) server...
Solution: On the Linux (8.1.5.2) box, I ran the export utility, connected as
user@remote_sid.domain.com with the password, and exported the user/tables... then I used the import utility on the 8.1.5.2 server to import them... no problems at all... all triggers, procs and views showed up... I did have to recreate the sequences and recompile the triggers, but it came off without a hitch
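For reference, a minimal sketch of that workaround, using the lower-version export utility over SQL*Net; the connect strings and names below are placeholders, not the poster's actual values:

# run both steps on the Linux 8.1.5.2 box, so the dump file is written by 8.1.5 exp:
exp user/pwd@remote_sid.domain.com owner=user file=user.dmp log=exp.log
imp user/pwd file=user.dmp log=imp.log fromuser=user touser=user

This sidesteps the IMP-00010 header error because the 8.1.5 imp only ever sees a file created by an 8.1.5 exp.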

Similar Messages

  • Is it possible to export my bookmarks from Firefox and import to another instance of the browser but without any type of cookie being imported?

    For convenience I want to have my already created bookmarks imported into another Firefox user's browser. For security reasons I don't want any cookies or flash cookies to be imported when I do this and I don't want to Sync browsers. Is this possible? If so, how, or are cookies not part of the exporting of bookmark data anyway?

    You can transfer bookmarks using two different formats:
    * Backup format: if you want to preserve tags, this format can do that. However, the Restore process completely replaces all existing bookmarks with what is in the backup. If you want to merge different sets of bookmarks together, do not use this method. More info:
    ** [[Restore bookmarks from backup or move them to another computer]]
    * Export format: with this option, Firefox creates an HTML file with an ancient format compatible with most browsers. It is not as complete as the newer Backup format. More info:
    ** [[Export Firefox bookmarks to an HTML file to back up or transfer bookmarks]]
** [[Import Bookmarks from an HTML file]]
    Neither of these files contains cookies or passwords.

• Is it possible to install Oracle 9i 32-bit on Sun Solaris Intel Edition?

    Dear OTN Members ,
Is it possible to install Oracle 9i for Sun SPARC Solaris (32-bit) on Sun Solaris Intel Edition 2.8? Please inform me by email:
    [email protected]
    [email protected]
    Thanking You
    Piyush Patel
    - Server name :- pi.com
    - Filename
    - Date/Time
    - Browser + Version : Netscape 4.7
    - O/S + Version : Sun Solaris Intel Edition 2.8

    857211 wrote:
I just need some advice on installing oracle 11g
What part/product of "oracle 11g" exactly?
Installation Guides should be clear enough, if read. Also read the Release Notes for additional supported/unsupported info.
    http://docs.oracle.com/cd/E11882_01/install.112/e24186/reqs.htm#CHDHGGFE
    http://docs.oracle.com/cd/E11882_01/install.112/e24187/pre_install.htm
    However the Installation Guide for Database Client adds:
    "Note: Oracle provides 32-bit (Windows x86) and 64-bit (Windows x64) versions of Oracle Database Client. _Oracle certifies 32-bit Oracle Database Client on Windows x64_." (underline added)

  • Is it possible to export (expdp) oracle table from client machine ?

Is it possible to export (expdp) an Oracle table from a client machine, without using a Remote Desktop connection?
    1) Database-10g server is in America
    2) client machine is in India
    3) Oracle client-10g software is installed in my machine
    4) I am able to connect to the server, could see all tables
5) one table has 35 million records; I want to export this table using Data Pump. Is it possible?
note: without using a Remote Desktop connection.

    Hi
I used the following syntax, but I am getting errors:
    6) connect sys as sysdba
create directory test_dir as '/dbusr1/exp_test';
    grant read, write on directory test_dir to user1;
    7) expdp user1/pwd1@xyz tables=test_1 directory=test_dir dumpfile=test_file.dmp logfile=test_file.log
    8) Errors
    In linux prompt I am getting this error
    -bash: expdp: command not found
    In windows cmd prompt I am getting this error
    ORA-39002: invalid operation
    ORA-39070: Unable to open the log file.
    ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 488
    ORA-29283: invalid file operation
    9) why ?
-- Should the user ("user1") have any system-level privilege to export his own table?
-- This folder ('/dbusr1/exp_test') has write privileges at the OS level.
    Thanks in advance
    sbmk_design
    Edited by: sbmk_design on Aug 24, 2009 12:06 AM
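For what it's worth, a sketch of the usual resolution, under the assumption that the paths in the post are accurate. Two separate problems are visible: "-bash: expdp: command not found" means the Data Pump client binary is not installed on that machine (it ships with the full client or server installation, not with Instant Client), and ORA-39002/ORA-29283 means the directory path must exist on the database server and be writable by the oracle OS user there, because Data Pump always reads and writes its files on the server, never on the client:

-- on the database server, as a DBA:
create directory test_dir as '/dbusr1/exp_test';
grant read, write on directory test_dir to user1;
-- on the server's OS, verify the path exists and the oracle user can write to it:
-- ls -ld /dbusr1/exp_test
-- then run expdp from any machine that has the binary:
expdp user1/pwd1@xyz tables=test_1 directory=test_dir dumpfile=test_file.dmp logfile=test_file.log

No extra system privilege is needed to export your own tables; the read/write grant on the directory object is enough.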

• How can I install Oracle Database 10g for Solaris (SPARC) from the console

    Dear Forum Members,
In my office, I have to install Oracle Database 10g for Solaris (SPARC), but I have to do it without a display monitor. Is it possible to install it by remote login to this server using a response file (silent mode) or something like that?
If yes, then how?
If anyone has the exact solution, I need your feedback. I shall wait for your reply.
    Thanks
    Aungshuman Paul

    There are 2 possible ways to accomplish this.
    First,
Silent installation (a command sketch follows the steps below)
    http://www.informit.com/articles/article.asp?p=174771&rl=1
    Second, (cut/paste from other site)
    How to install Oracle software remotely?
    Remote Software Installation Steps: (For Solaris only)
    If you want to install Oracle Software remotely, you should perform the following steps. These steps are applicable only if your source and target machine are running Unix.
    For example, you can install Oracle Software from your home from Washington, DC to a target source in California.
    1. Pick your source server or machine for remote installation.
    2. Check that your CD is in your source CD-ROM drive.
3. On the target machine, find your target machine name from the output of /usr/bin/hostname.
4. On the source machine, log in as a user.
5. On the source machine, enable client access: % /usr/openwin/bin/xhost + target-machine-name
6. Become root user by typing: su (don’t use -)
7. Check that Volume Manager is running: # ps -ef | grep vold (if you see an entry that contains /usr/sbin/vold, Volume Manager is running; skip to Step 10.)
8. If not, then do the following: # mkdir -p /cdrom/your-cd-file-name
9. # mount -F hsfs -r cdrom-device /cdrom/your-cd-file-name
10. Add the following line to your /etc/dfs/dfstab file: share -F nfs -o ro /cdrom/your-cd-file-name
11. Verify whether your source machine is an NFS server: # ps -ef | grep nfsd
12. If you see an entry that contains /usr/lib/nfs/nfsd -a 16, then nfsd is running; continue with Step 13. Otherwise skip to Step 14.
13. If nfsd is running, then type: # /usr/sbin/shareall and skip to Step 16.
14. If nfsd is not running, then start nfsd by typing: # /etc/init.d/nfs.server start
15. Verify whether your source machine is an NFS server again by typing: # ps -ef | grep nfsd
16. Make sure your source machine is exporting your product directory by typing: # /usr/sbin/dfshares
17. Now, log in to the target machine by typing: # rlogin target-machine-name -l user (not root)
18. Then log in as the root user by typing: # su
19. Go to the source machine's CD by typing: # cd /net/source-machine/cdrom/your-cd-file-name , then skip to Step 24.
20. If you cannot change to that directory in Step 19 and you do not have an auto-mounter on your network, then create a mount point by typing the following commands.
21. # mkdir /remote_products
22. # /usr/sbin/mount -F nfs -r source-machine:/cdrom/your-cd-file-name /remote_products
23. # cd /remote_products
24. Redirect the target machine display to the source machine by typing: # DISPLAY=source-machine:0; export DISPLAY (if you use a Bourne or Korn shell).
25. Start the Web Start installer by typing: # ./installer (or whatever the installer program is named).
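As a concrete illustration of the first (silent) option, a minimal sketch using the standard OUI flags; the response-file path is a placeholder, and its contents would come from the templates shipped in the response directory on the installation media:

# copy a response file template from the media, edit it, then run:
./runInstaller -silent -responseFile /export/home/oracle/enterprise.rsp

With the DISPLAY redirection in the second option, the installer's GUI draws on the source machine's X server, so the target needs no monitor in either case.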

  • Recommendations - Oracle RAC 10g on Solaris 10 Containers Logical/Local..

    Dear Oracle Experts et all
    I have a couple of questions for Oracle 10g RAC implementation on Solaris and seek your advice. we are attempting to implement oracle 10g RAC on Solaris OS and SPARC Platform.
1 We are wondering if Oracle 10g RAC could be implemented on Solaris Local/Logical Containers? I was assuming that Oracle always links itself with OS binaries and libraries during software installation and hence will need an OS image/root disk on which to go. However, in containers, I assume we have a single Solaris installation and configuration which is then shared with the containers configured within it. In such situations, how does the Oracle installation proceed? Do I need to look at a scenario where the global Container/Zone holds the Oracle install and this image is shared across to the zones/containers accordingly? If so, which filesystems from the OS would need to be shared across to these zones/containers?
Additionally, even if this approach is supported, is it a recommended approach? I am unsure about the stability and functionality of Oracle in such cases and am not able to completely conceptualize it. However, I assume there could be certain items which need to be appropriately taken care of. It would help if you could share observations from your experience.
2 The idea of RAC we are looking at is to have multiple Oracle installations on top of a native clustering solution, say Veritas Cluster/Sun Cluster. Do we still need to have the Oracle cluster solution, Clusterware (ORACRS), on top of this to achieve Oracle clustering? Will I be able to install Oracle as a standalone installation on top of a native clustering solution, say Veritas Cluster/Sun Cluster?
Our requirement is to have the above-mentioned multiple Oracle installations spread across two (2) separate H/W platforms, say Node A and Node B, and configure our cluster solution to behave as active-passive across Node A and Node B. In other words, I will configure a clustering solution like VRTS/SunCluster in active-passive, then have 3 Oracle installations on Node A and another 3 on Node B. I will configure one database for each of these Oracle S/W installations (with the idea of not having Clusterware between the clustering solution VRTS/SunCluster and the Oracle installation, if that works). I will then run 3 databases on each of these nodes. If any downtime happens on one of the nodes, say Node A, I will fail all Oracle databases and S/W over to the alternate available node, Node B in this case, using the native clustering solution, and I will want the databases to behave as they did earlier on Node A. I am not sure, though, if I will be able to bring a database up on Node B once the OS-level resources are failed over.
We want to use Oracle 10g RAC Release 2 EE on the latest Solaris 10 OS release, or the one before the latest.
    Please share your thoughts.
    Regards!
    Sarat

    Sarat Chandra C wrote:
    Dear Oracle Experts et all
    I have a couple of questions for Oracle 10g RAC implementation on Solaris and seek your advice. we are attempting to implement oracle 10g RAC on Solaris OS and SPARC Platform.
1 We are wondering if Oracle 10g RAC could be implemented on Solaris Local/Logical Containers?
My understanding is that RAC in a Zone (Container) is not supported by Oracle, and will not work anyway. Regardless of installation, RAC needs to do cluster-level stuff about the cluster configuration, changing network addresses dynamically, and sending guaranteed messages over the cluster interconnect. None of this stuff can be done in a Local Zone in Solaris, because Local Zones have fewer permissions than the Global Zone. This is part of the design of Solaris Zones, and nothing to do with how Oracle RAC itself works on them.
    This is all down to the security model of Zones, and Local Zones lack the ability to do certain things, to stop them reconfiguring themselves and impacting other Zones. Hence RAC cannot do dynamic cluster reconfiguration in a Local Zone, such as changing virtual network addresses when a node fails.
    My understanding is that RAC just cannot work in a Local Zone. This was certainly true 5 years ago (mid 2005), and was a result of the inherent design and implementation of Zones in Solaris. Things may have changed, so check the Solaris documentation, and check if Oracle RAC is supported in Local Zones. However, as I said, this limitation was inherent in the design of Zones, so I do not see how Sun could possibly have changed it so that RAC would work in a Local Zone.
    To me, your only option is the Global Zone. Which pretty much destroys the argument for having Zones on a Solaris system, unless you can host other non-Oracle application on the other Zones.
2 The idea of RAC we are looking at is to have multiple Oracle installations on top of a native clustering solution, say Veritas Cluster/Sun Cluster. Do we still need to have the Oracle cluster solution, Clusterware (ORACRS), on top of this to achieve Oracle clustering? Will I be able to install Oracle as a standalone installation on top of a native clustering solution, say Veritas Cluster/Sun Cluster?
I am not sure the term 'native' is correct. All 'Cluster' software is low level, and has components that run within the operating system. Whether this is Sun Cluster, Veritas Cluster Server, or Oracle Clusterware, they are all as 'native' to Solaris as each other. They all perform the same function for Oracle RAC around cluster management - which nodes are members of the cluster, heartbeats between nodes, reliable fast message delivery, etc.
    You only need one piece of Cluster software. So pick one and use it. If you use the Sun or Veritas cluster products, then you do not need the Oracle Clusterware software. But I would use it, because it is free (included with RAC), is from Oracle themselves and so guaranteed to work, is fully supported, and is one less third party product to deal with. Having an all Oracle software stack makes things simpler and more reliable, as far as I am concerned. You can be sure that Oracle will have fully tested RAC on their own Clusterware, and be able to replicate any issues in their own support environments.
    Officially the Sun and Veritas products will work and are supported. But when you get a problem with your Cluster environment, who are you going to call? You really want to avoid "finger pointing" when you have a problem, with each vendor blaming the cause of the problem on another vendor. Using an all Oracle stack is simpler, and ensures Oracle will "own" all your support problems.
    Also future upgrades between versions will be simpler, as Oracle will release all their software together, and have tested it together. When using third party Cluster software, you have to wait for all vendors to release new versions of their own software, and then wait again while it is tested against all the different third party software that runs on it. I have heard of customers stuck on old versions of certain cluster products, who cannot upgrade because there are no compatible combinations in the support matrices between the cluster product and Oracle database versions.
I will configure Clustering Solution like VRTS/SunCluster in Active-Passive, then have 3 Oracle installations on Node A, another 3 on Node B.
As I said before, these 3 Oracle installations will actually all be on the same Global Zone, because RAC will not go into Local Zones.
    John

  • Is it possible to export tables from diffrent schema using expdp?

    Hi,
We can export tables from different schemas using exp, e.g.: exp user/pass file=sample.dmp log=sample.log tables=scott.dept,system.sales
But is it possible in expdp?
    Thanks in advance ..
    Thanks,

    Hi,
{quote}you have to use schemas=user1,user2 include=table:"in('table1,table2')"; use a parfile:
expdp scott/tiger@db10g schemas=SCOTT include=TABLE:"IN ('EMP', 'DEPT')" directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log{quote}
I am not able to perform it using a parfile either. Using a parfile it shows "UDE-00010: multiple job modes requested, schema and tables."
When trying the below, I get an error:
    {code}
    bash-3.00$ expdp directory=EXP_DUMP dumpfile=test.dmp logfile=test.log SCHEMAS=(\'MM\',\'MMM\') include=TABLE:\"IN\(\'EA_EET_TMP\',\'WS_DT\'\)\"
    Export: Release 10.2.0.4.0 - 64bit Production on Friday, 15 October, 2010 18:34:32
    Copyright (c) 2003, 2007, Oracle. All rights reserved.
    Username: / as sysdba
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "SYS"."SYS_EXPORT_SCHEMA_01": /******** AS SYSDBA directory=EXP_DUMP dumpfile=test.dmp logfile=test.log SCHEMAS=('MM','MMM') include=TABLE:"IN('EA_EET_TMP','WS_DT')"
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 0 KB
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "MM"."EA_EET_TMP" 0 KB 0 rows
    ORA-39165: Schema MMM was not found.
    Master table "SYS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
    Dump file set for SYS.SYS_EXPORT_SCHEMA_01 is:
    /export/home/nucleus/dump/test.dmp
    Job "SYS"."SYS_EXPORT_SCHEMA_01" completed with 1 error(s) at 18:35:19
    {code}
    When checking expdp help=y shows :-
    {code}TABLES Identifies a list of tables to export - one schema only.{code}
Based on this testing, tables from different schemas cannot be exported using expdp in a single command.
    Anand
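For what it's worth, a minimal sketch of the workaround implied by that help text: run one schema-mode job per schema, filtering tables with INCLUDE. The schema, table, and directory names are taken from the session above; splitting the job in two is my assumption, and the ORA-39165 above simply means schema MMM did not exist in that database:

expdp directory=EXP_DUMP dumpfile=mm.dmp logfile=mm.log schemas=MM include=TABLE:\"IN\(\'EA_EET_TMP\'\)\"
expdp directory=EXP_DUMP dumpfile=mmm.dmp logfile=mmm.log schemas=MMM include=TABLE:\"IN\(\'WS_DT\'\)\"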

  • Oracle RAC 10g on Solaris 10 in a non-global zone

    I need to run Oracle RAC 10g on Solaris 10 in a non-global zone as I must cap the CPUs used for Oracle licensing limitations. My question is a simple one, but one for which I'm getting conflicting information depending upon whom I ask.
    If I want to run RAC in a non-global zone on two nodes, does this require the use of Solaris Cluster?
    I know there are good reasons to use Solaris Cluster, but the company for which I work cannot afford the additional expense of Solaris Cluster at this time. Is it possible to run Oracle RAC 10g in a capped container without Solaris Cluster or is Solaris Cluster absolutely required?
    Thanks in advance for any insight you can provide.

AFAIK, Oracle 10g RAC is not supported in Solaris containers.
It is, however, supported in Solaris zone clusters; in order to use that, you would have to use Sun Cluster 3.2 (if I'm not mistaken).

• Migrating Oracle 8.1.7 from Solaris 8 to Solaris 10

Is it possible to migrate Oracle 8.1.7 from Solaris 8 to Solaris 10? I have found papers and write-ups on the internet describing this being done before. Do you have any documents on this, or anything that guarantees success?

    Perhaps contact Oracle.
    http://www.oracle.com
    Maybe they will know how to make version 8.1.7 work with newer
    distributions of various OS's.
    After all, it is their product, not Sun's product.

  • DTrace probes for oracle database 10g in solaris 10

Hi guys, for the past month I've been learning about Solaris DTrace and its D scripts, and I've tried to find probes for administering Oracle Database, but so far, nothing! So my question: are there DTrace probes for Oracle applications? I really need this now; it's my project: tracing Oracle with DTrace on Solaris 10 SPARC! Can anyone help me, please?

Hey! Of course that's a great site, but you know, I've already visited it and it doesn't talk about probes for Oracle! However, I thought about another option: what do you think about instrumenting the Oracle instance... I mean, do you think it's possible to monitor the Oracle processes (LGWR, PMON, DBWR, SMON, ...) with DTrace by using providers like fbt or io? I don't know much! Remember, the aim is monitoring Oracle database performance!
Regards!
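For what it's worth, a minimal sketch of that generic-provider idea; Oracle shipped no DTrace probes of its own in that era, but the syscall provider can profile any background process by pid (1234 below is a placeholder for, say, the LGWR pid found via ps -ef | grep lgwr):

# count the system calls made by one Oracle background process:
dtrace -n 'syscall:::entry /pid == $target/ { @[probefunc] = count(); }' -p 1234

The aggregation prints on Ctrl-C and shows which system calls dominate; the io provider can similarly attribute disk I/O to the process.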

  • How to export oracle schema with only TABLE and TABLE DATA

    Dear All,
Is it possible to export (expdp) all tables in a schema?
    Thank you in advance

Hi, yes it is possible.
For example:
expdp user/password directory=my_dir dumpfile=my_dump.dmp tables=schema1.table1,schema1.table2 content=metadata_only include=table
OR
expdp system/pass directory=DATA_PUMP dumpfile=test.dmp logfile=test.log schemas=schema_name
    https://forums.oracle.com/message/4103671#4103671
    https://forums.oracle.com/thread/1087857?start=0&tstart=0
    ORACLE-BASE - Oracle Data Pump (expdp and impdp) in Oracle Database 10g
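To match the thread title (tables and table data only), a hedged sketch in schema mode; the names are placeholders, and note that INCLUDE=TABLE also pulls in the tables' dependent objects such as indexes and constraints, which would need explicit EXCLUDE clauses if unwanted:

expdp user/password schemas=schema1 include=TABLE directory=my_dir dumpfile=tables_only.dmp logfile=tables_only.log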

  • How to start Oracle Enterprise Manager(Oracle 9i) on Sun Solaris 9 platform

    Hi
    How to start Oracle Enterprise Manager(Oracle 9i) on Sun Solaris 9 platform and Oracle Enterprise Manager(Oracle 10G) on Sun Solaris 10 platform?
    Thanks.
    RJ.

I need to use it to unlock some accounts.
There is a SQL command to do that:
    SQL> alter user <username> account unlock;
    Anyway...
    $ export ORACLE_SID=chucky
    $ emctl start dbconsole
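One hedged caveat, since the question covers both releases: emctl start dbconsole is the 10g mechanism (Database Control). 9i has no dbconsole; there the standalone OEM Java console is typically launched with oemapp console, and the Management Server with oemctl start oms. Treat the exact commands as an assumption to verify against the 9i documentation:

$ export ORACLE_SID=chucky
$ oemapp console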

• Export Oracle Database Objects into a SQL file

    Hi Experts,
I searched and could not find anything about whether it is possible to dump an Oracle database into a SQL file.
    Can some one clarify this?
    Thanks,
    Dharan V

    Hi,
    Still struggling here,
    CREATE OR REPLACE DIRECTORY DUMP_DIR AS 'c:\';
    GRANT DATAPUMP_EXP_FULL_DATABASE TO SCOTT;
    GRANT DATAPUMP_IMP_FULL_DATABASE TO SCOTT;
    EXPDP SCOTT@LOCAL-DB directory=DUMP_DIR dumpfile=scott.dmp content=metadata_only Full=Y
    password: tiger
C:\Documents and Settings\Dharan>expdp SCOTT@LOCAL-DB directory=DUMP_DIR dumpfile=scott.dmp content=metadata_only Full=Y
    Export: Release 11.1.0.6.0 - Production on Sunday, 31 January, 2010 16:48:42
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Password:
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SCOTT"."SYS_EXPORT_FULL_01":  SCOTT/********@LOCAL-DB directory=DUMP_DIR dumpfile=scott.dmp content=metadata_only Full=Y
    Processing object type DATABASE_EXPORT/TABLESPACE
    Processing object type DATABASE_EXPORT/PROFILE
    Processing object type DATABASE_EXPORT/SYS_USER/USER
    Processing object type DATABASE_EXPORT/SCHEMA/USER
    Processing object type DATABASE_EXPORT/ROLE
    Processing object type DATABASE_EXPORT/GRANT/SYSTEM_GRANT/PROC_SYSTEM_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/GRANT/SYSTEM_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/POST_SCHEMA/PROCACT_SCHEMA
    Processing object type DATABASE_EXPORT/AUDIT
    Master table "SCOTT"."SYS_EXPORT_FULL_01" successfully loaded/unloaded
    Dump file set for SCOTT.SYS_EXPORT_FULL_01 is:
      C:\SCOTT.DMP
    Job "SCOTT"."SYS_EXPORT_FULL_01" successfully completed at 16:51:20But i don't see any scott.dmp in either c:\ [on newly created directory]
    OR C:\Documents and Settings\Dharan.
    Ok...i Tried similarly the second one now
C:\Documents and Settings\Dharan>impdp SCOTT@LOCAL-DB directory=DUMP_DIR dumpfile=scott.dmp content=metadata_only Full=Y
    Export: Release 11.1.0.6.0 - Production on Sunday, 31 January, 2010 16:48:42
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Password:
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "SCOTT"."SYS_SQL_FILE_FULL_01" successfully loaded/unloaded
    Starting "SCOTT"."SYS_SQL_FILE_FULL_01":  SCOTT/********@LOCAL-DB dire
    tory=DUMP_DIR dumpfile=scott.dmp SQLFILE=SCOTT.sql
    Processing object type DATABASE_EXPORT/TABLESPACE
    Processing object type DATABASE_EXPORT/PROFILE
    Processing object type DATABASE_EXPORT/SYS_USER/USER
    Processing object type DATABASE_EXPORT/SCHEMA/USER
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/POST_INSTANCE/PROCDEPOBJ
    Processing object type DATABASE_EXPORT/SCHEMA/POST_SCHEMA/PROCOBJ
    Processing object type DATABASE_EXPORT/SCHEMA/POST_SCHEMA/PROCACT_SCHEMA
    Processing object type DATABASE_EXPORT/AUDIT
    Job "SCOTT"."SYS_SQL_FILE_FULL_01" successfully completed at 17:13:35Even now i can't see any .sql file either c:\ [on newly created directory]
    OR C:\Documents and Settings\Dharan.
    Suggest what am doing wrong here.
    Thanks,
    Dharan V
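A likely explanation, offered as an assumption since the topology isn't stated: Data Pump writes its dump, log, and SQLFILE output on the database server, not on the client running the expdp/impdp command. If the LOCAL-DB alias points at another machine (the banner shows an 11.1.0.7 server while the client is 11.1.0.6), the files land in that machine's c:\, which is why nothing appears locally. A minimal sketch of the check and the SQLFILE step:

-- on the server, confirm where DUMP_DIR points:
select directory_path from dba_directories where directory_name = 'DUMP_DIR';
-- then look for C:\scott.dmp and SCOTT.sql on *that* machine:
impdp SCOTT@LOCAL-DB directory=DUMP_DIR dumpfile=scott.dmp sqlfile=scott.sql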

  • Is it possible to export a Graph?

I have a requirement where an Oracle page has data related to two View Objects. One View Object is displayed in a table, the other is represented in a graph.
When I try to export all the VOs to Excel, it does not export the graphs.
Is it possible to export graphs to Excel?
    If so, can anyone help me how to do it.
    Thanks,
    Ranjitha

    You could try using Network Shared Variables to pass the information into a chart. You'll only need a basic WLAN setup on the mobile device. (This concept depends on compatibility of shared variables with the PDA release of LabVIEW, which I'm not certain about).
Failing that, you could set up a simple UDP broadcast of the data from the PC.
    Thoric (CLA, CLED, CTD and LabVIEW Champion)

  • Export Oracle Long columns

Is it possible to export data from a table that contains a LONG data type column?
If so, how?
I've been trying, but it doesn't export any data.

The link you've mentioned doesn't say that Oracle LONG is not supported. In fact it is supported, and the corresponding Java data type for WLS is Longvarchar.
For more on this you can refer to http://e-docs.bea.com/wls/docs61//oracle/advanced.html#1158561 under "Data Types". Here you'll find an elaborate discussion of the Java-supported data types for Oracle.
    Regards,
    Santanu
    "Gregory Gerard" <[email protected]> wrote in message
    news:3c1f0784$[email protected]..
I noticed that the Oracle LONG is not supported in http://edocs.bea.com/wls/docs61/ejb/cmp.html#1059575, but it doesn't seem to cause an error when I use it in the descriptors or column types.
Am I going to be bopped on the head down the road with hidden issues? I'm trying to store arbitrarily (well, reasonably so) long strings. Should I use CLOBs instead? LONGs are easier to work with in other database tools, which is why I'd prefer them, but...
thanks,
greg
