Import from 8 to 10g

Hi all.
I have a .DMP file from Oracle 8 and I must import it into 10g R2, but I have no idea what's wrong with my import.
I created the instance from scratch and I still have the same error.
I suspect the problem is the TEMP tablespace, but I don't know how to resolve it.
This is my .log file:
Connected to: Oracle Database 10g Release 10.2.0.2.0 - Production
Export file created by EXPORT:V08.01.07 via conventional path
Warning: the objects were exported by EXPORTA, not by you
import done in US7ASCII character set and AL16UTF16 NCHAR character set
import server uses WE8ISO8859P1 character set (possible charset conversion)
export client uses WE8ISO8859P1 character set (possible charset conversion)
export server uses WE8ISO8859P1 NCHAR character set (possible ncharset conversion)
. importing SYSTEM's objects into SYSTEM
IMP-00017: following statement failed with ORACLE error 12913:
"CREATE TABLESPACE "TEMP" DATAFILE 'D:\ORACLE\ORADATA\INEVOLUP\TEMP01.DBF' "
"SIZE 1073741824 , 'D:\ORACLE\ORADATA\INEVOLUP\TEMP02.DBF' SIZE 1073741"
"824 DEFAULT STORAGE(INITIAL 65536 NEXT 65536 MINEXTENTS 1 MAXEXTENTS"
" 2147483645 PCTINCREASE 0) ONLINE TEMPORARY "
IMP-00003: ORACLE error 12913 encountered
ORA-12913: Cannot create dictionary managed tablespace
IMP-00017: following statement failed with ORACLE error 3249:
"CREATE TABLESPACE "TOOLS" DATAFILE 'D:\ORACLE\ORADATA\INEVOLUP\TOOLS01.DBF"
"' SIZE 209715200 DEFAULT STORAGE(INITIAL 32768 NEXT 32768 MINEXTENTS"
" 1 MAXEXTENTS 4096 PCTINCREASE 0) ONLINE PERMANENT "
IMP-00003: ORACLE error 3249 encountered
ORA-03249: Uniform size for auto segment space managed tablespace should have atleast 5 blocks
IMP-00017: following statement failed with ORACLE error 3249:
"CREATE TABLESPACE "INDX" DATAFILE 'D:\ORACLE\ORADATA\INEVOLUP\INDX01.DBF' "
"SIZE 2500M , 'D:\ORACLE\ORADATA\INEVOLUP\INDX02.DBF' SIZE 2500M "
"DEFAULT STORAGE(INITIAL 16384 NEXT 16384 MINEXTENTS 1 MAXEXTENTS 5000 PCTI"
"NCREASE 0) ONLINE PERMANENT "
IMP-00003: ORACLE error 3249 encountered
ORA-03249: Uniform size for auto segment space managed tablespace should have atleast 5 blocks
IMP-00017: following statement failed with ORACLE error 24344:
"BEGIN SYS.DBMS_PSWMG_IMPORT.IMPORT_PSW_VERIFY_FN(' P_USER_GRAL ', ' VERIF"
"Y_FUNCTION ', ' username varchar2,password varchar2,old_password varchar2) "
"RETURN boolean IS"
" m integer;"
" ischar boolean;"
" chararray varchar2(52);"
"BEGIN"
" chararray:= ''abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ'';"
" -- Check if the password is same as the username"
" IF password = username THEN"
" raise_application_error(-20001, ''La clave no puede ser igual al usuar"
"io'');"
" END IF;"
" -- Check for the minimum length of the password"
" IF length(password) < 8 THEN"
" raise_application_error(-20002, ''La clave no puede ser menor de 8 ca"
"racteres'');"
" END IF;"
" -- 2. Check for the character"
" ischar:=FALSE;"
" m := length(password);"
" FOR i IN 1..length(chararray) LOOP"
" FOR j IN 1..m LOOP"
" IF substr(password,j,1) = substr(chararray,i,1) THEN"
" ischar:=TRUE;"
" GOTO findpunct;"
" END IF;"
" END LOOP;"
" END LOOP;"
" IF ischar = FALSE THEN"
" raise_application_error(-20003, ''La clave debe ser alfanumerica'');"
" END IF;"
" <<findpunct>>"
" -- Everything is fine; return TRUE ;"
" RETURN(TRUE);"
"END;"
"'); END;"
IMP-00003: ORACLE error 24344 encountered
ORA-24344: success with compilation error
ORA-06512: at "SYS.DBMS_PSWMG_IMPORT", line 100
ORA-06512: at line 1
IMP-00017: following statement failed with ORACLE error 28004:
"ALTER PROFILE "P_USER_GRAL" LIMIT PASSWORD_VERIFY_FUNCTION "VERIFY_FUNCTION"
IMP-00003: ORACLE error 28004 encountered
ORA-28004: invalid argument for function specified in PASSWORD_VERIFY_FUNCTION VERIFY_FUNCTION
IMP-00017: following statement failed with ORACLE error 28004:
"ALTER PROFILE "P_USER_ADMIN" LIMIT PASSWORD_VERIFY_FUNCTION "VERIFY_FUNCTIO"
"N""
IMP-00003: ORACLE error 28004 encountered
ORA-28004: invalid argument for function specified in PASSWORD_VERIFY_FUNCTION VERIFY_FUNCTION
IMP-00017: following statement failed with ORACLE error 959:
"ALTER USER "SYSTEM" IDENTIFIED BY VALUES 'F5145E3A6C640AC2' DEFAULT TABLESP"
"ACE "TOOLS" TEMPORARY TABLESPACE "TEMP" PROFILE "P_USER_ADMIN""
IMP-00003: ORACLE error 959 encountered
ORA-00959: tablespace 'TOOLS' does not exist
IMP-00017: following statement failed with ORACLE error 959:
"CREATE USER "TOAD" IDENTIFIED BY VALUES '4759257F78A8B5A3' DEFAULT TABLESPA"
"CE "TOOLS" TEMPORARY TABLESPACE "INDX" PROFILE "P_USER_ADMIN""
IMP-00003: ORACLE error 959 encountered
ORA-00959: tablespace 'TOOLS' does not exist
IMP-00017: following statement failed with ORACLE error 959:
Thanks a lot!

What import command did you use?
Oracle tries to create the tablespaces during import if they don't exist. However, the tablespaces in 8i were dictionary managed, which is deprecated in 10g. Furthermore, if your target OS is different from the source of the dump file, the datafile path syntax won't be correct. For example, creating a tablespace on Linux using a Windows path will fail.
In your case, you'd better pre-create the tablespaces first to avoid the errors. Because the CREATE TABLESPACE statements failed, all subsequent CREATE TABLE statements will fail too.
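For example, pre-creating them as locally managed tablespaces (a sketch using the names, paths and sizes from the log above; adjust to your environment, and note that in 10g TEMP should be a true temporary tablespace with tempfiles):

```sql
-- Temporary tablespace, 10g style, replacing the dictionary managed TEMP:
CREATE TEMPORARY TABLESPACE "TEMP"
  TEMPFILE 'D:\ORACLE\ORADATA\INEVOLUP\TEMP01.DBF' SIZE 1G,
           'D:\ORACLE\ORADATA\INEVOLUP\TEMP02.DBF' SIZE 1G
  EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1M;

-- Permanent tablespaces as locally managed:
CREATE TABLESPACE "TOOLS"
  DATAFILE 'D:\ORACLE\ORADATA\INEVOLUP\TOOLS01.DBF' SIZE 200M
  EXTENT MANAGEMENT LOCAL SEGMENT SPACE MANAGEMENT AUTO;

CREATE TABLESPACE "INDX"
  DATAFILE 'D:\ORACLE\ORADATA\INEVOLUP\INDX01.DBF' SIZE 2500M,
           'D:\ORACLE\ORADATA\INEVOLUP\INDX02.DBF' SIZE 2500M
  EXTENT MANAGEMENT LOCAL SEGMENT SPACE MANAGEMENT AUTO;
```

With the tablespaces in place, run the import with ignore=y so that imp continues past the "already exists" errors for pre-created objects.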

Similar Messages

  • Oracle 11g - Strange behaviour of query after importing from Oracle 10g

    Hi,
    I have a table in Oracle 10g as follows:
    Create Table xyz (col1 varchar2(50), col2 varchar2(50));
    With following Data
    Col1     Col2
    A     320
    A     110
    A     290
    A     380
    B     ABC
    B     256
    B     LMN
    I am running following Query
    select * from xyz
    Where Col1='A' and Col2=110
    It works fine. But when I export this table and import it into Oracle 11g, it says invalid identifier.
    But if I enclose 110 in single quotes, it works fine.
    Also If I recreate this table in Oracle 11g like
    Create table xyz1
    as select * from xyz;
    Now I am also able to run this query smoothly.
    select * from xyz1
    Where Col1='A' and Col2=110
    What is wrong with exporting this table from 10g to 11g?
    Any comments/suggestions?
    Aarbi

    The check in your where clause
    Col2 = 110
    is comparing a string (Col2 is defined as a VARCHAR2) with a numeric literal, so an implicit conversion from character to number takes place. The query then fails on the B/LMN row when 'LMN' fails number conversion.
    I'm guessing there was an index on the table in your 10g installation which allowed the query to be satisfied without checking the B/ABC or B/LMN rows, but it is not present or not used in the 11g installation, so a full table scan results in an attempt to convert 'ABC' and 'LMN' to a number. Check the explain plans.
    Or it could even just be a difference between the two versions in the order in which the two conditions in the where clause are evaluated.
    The solution, as you have already found, is to do a string comparison:
    Col2 = '110'
    Edited by: Cyn on Dec 7, 2009 12:38 PM
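    The failure is easy to reproduce directly (a sketch using the table definition from the question):

    ```sql
    CREATE TABLE xyz (col1 VARCHAR2(50), col2 VARCHAR2(50));
    INSERT INTO xyz VALUES ('A', '110');
    INSERT INTO xyz VALUES ('B', 'LMN');

    -- Implicit conversion: evaluated as TO_NUMBER(col2) = 110, which raises
    -- ORA-01722 (invalid number) if the 'LMN' row is examined:
    SELECT * FROM xyz WHERE col1 = 'A' AND col2 = 110;

    -- Safe: compare strings with strings.
    SELECT * FROM xyz WHERE col1 = 'A' AND col2 = '110';
    ```

    Whether the unsafe form happens to work depends entirely on which rows the plan touches, which is why it can differ between versions.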

  • Data pump import from 11g to 10g

    I have 2 database: first is 11.2.0.2.0 and second is 10.2.0.1.0
    On 10g I created a database link pointing to the 11g database:
    CREATE DATABASE LINK "TEST.LINK"
    CONNECT TO "monservice" IDENTIFIED BY "monservice"
    USING '(DESCRIPTION = (ADDRESS_LIST = (ADDRESS = (PROTOCOL = TCP)(HOST = host)(PORT = port))) (CONNECT_DATA = (SID = sid)))';
    And execute this query for test dbLink which work fine:
    select * from v$version@TEST.LINK;
    After it i try to call open function:
    declare
      h number;
    begin
      h := dbms_datapump.open('IMPORT', 'TABLE', 'TEST.LINK', null, '10.2');
    end;
    and get the exception ORA-39001: "invalid argument value".
    If I remove 'TEST.LINK' from the arguments, it works fine.
    Edited by: 990594 on 26.02.2013 23:41

    Hi Richard Harrison,
    result for import from 11g to 10g:
    impdp user/pass@dburl schemas=SCHEMANAME network_link=TEST_LINK version=10.2
    Connected to: Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
    ORA-39001: invalid argument value
    ORA-39169: Local version of 10.2.0.1.0 cannot work with remote version of 11.2.0.2.0
    result for export from 11g to 10g:
    expdp user/pass@dburl schemas=SCHEMANAME network_link=TEST_LINK version=10.2
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64 bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing option
    ORA-39006: internal error
    ORA-39065: unexpected master process exception in DISPATCH
    ORA-04052: error occurred when looking up remote object SYS.KUPM$MCP@TEST_LINK
    ORA-00604: error occurred at recursive SQL level 3
    ORA-06544: PL/SQL: internal, error, arguments: [55916], [], [], [], [], [], [], []
    ORA-06553: PLS-801: internal error [55916]
    ORA-02063: preceding 2 lines from TEST_LINK
    ORA-39097: Data Pump job encountered unexpected error -4052

  • Import from 10g to 11g

    Hi,
    we have a DB in 10g R2. We have a DB server in 11g on another physical machine. Can we import from export dumpfile of DB in 10g R2 to 11g ?
    Thanks.

    Sure, this is no problem. The fastest way would be to import the data with Data Pump through a network_link.
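    A sketch of what that could look like (link name, credentials, schema and TNS alias are placeholders; the link lives in the 11g target and points at the 10g source):

    ```sql
    -- In the 11g target, create a database link to the 10g source:
    CREATE DATABASE LINK src10g
      CONNECT TO scott IDENTIFIED BY tiger
      USING 'SRC10G_TNS';

    -- Then, from the OS shell, pull the schema straight across (no dump file):
    --   impdp system/password schemas=SCOTT network_link=SRC10G \
    --         directory=DATA_PUMP_DIR logfile=imp_scott.log
    ```

    Because the data travels over the link, no intermediate dump file or shared filesystem is needed.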

  • Datapump network import from 10g to 11g database is not progressing

    We have an 11.2.0.3 shell database on a RedHat Linux 5 64-bit platform, and we are pulling data from a 10g database (10.2.0.4, HP-UX platform) using Data Pump via NETWORK_LINK. However, the import does not seem to progress or complete. We left it running for almost 13 hours and it did not import even a single row into the 11g database. We even tried importing just one table with 0 rows, but it still does not complete; the log file just keeps looping on the following:
    Worker 1 Status:
    Process Name: DW00
    State: EXECUTING
    Estimate in progress using BLOCKS method...
    Job: IMP_NONLONG4_5
    Operation: IMPORT
    Mode: TABLE
    State: EXECUTING
    Bytes Processed: 0
    Current Parallelism: 1
    Job Error Count: 0
    Worker 1 Status:
    Process Name: DW00
    State: EXECUTING
    We also see this:
    EVENT SECONDS_IN_WAIT STATUS STATE ACTION
    SQL*Net message from dblink 4408 ACTIVE WAITING IMP_NONLONG4_5
    Below is our par file:
    NETWORK_LINK=DATABASE_10G
    DIRECTORY=MOS_UPGRADE_DUMPLOC1
    LOGFILE=imp_nonlong_grp-4_5.log
    STATUS=300
    CONTENT=DATA_ONLY
    JOB_NAME=IMP_NONLONG4_5
    TABLES=SYSADM.TEST_TBL
    Any ideas? Thanks.

    Thanks a lot to all who have looked at and responded to this; I appreciate you giving time for suggestions. As a recap: Data Pump export and import via dump file works on both the 10g and 11g databases. We are only having an issue pulling data from the 10g database by executing a Data Pump network import (using a dblink) from the 11g database.
    SOLUTION: The culprit was the parameter optimizer_features_enable='8.1.7' that was set on the 10g database. We took that out of the parameter file of the 10g database and the network import worked flawlessly.
    HOW WE FIGURED IT OUT: We turned on a trace for the Data Pump sessions and the trace file showed something about the optimizer (see below). So we removed, one by one, each parameter related to the optimizer, and found it was the optimizer_features_enable='8.1.7' parameter that, when removed, makes the network import successful.
    SELECT /* OPT_DYN_SAMP */ /*+ ALL_ROWS opt_param('parallel_execution_enabled',
    'false') NO_PARALLEL(SAMPLESUB) NO_PARALLEL_INDEX(SAMPLESUB) NO_SQL_TUNE
    */ NVL(SUM(C1),0), NVL(SUM(C2),0), NVL(SUM(C3),0)
    FROM
    (SELECT /*+ NO_PARALLEL("SYS_IMPORT_TABLE_02") INDEX("SYS_IMPORT_TABLE_02"
    SYS_C00638241) NO_PARALLEL_INDEX("SYS_IMPORT_TABLE_02") */ 1 AS C1, 1 AS C2,
    1 AS C3 FROM "DBSCHEDUSER"."SYS_IMPORT_TABLE_02" "SYS_IMPORT_TABLE_02"
    WHERE "SYS_IMPORT_TABLE_02"."PROCESS_ORDER"=:B1 AND ROWNUM <= 2500)
    SAMPLESUB
    SELECT /* OPT_DYN_SAMP */ /*+ ALL_ROWS IGNORE_WHERE_CLAUSE
    NO_PARALLEL(SAMPLESUB) opt_param('parallel_execution_enabled', 'false')
    NO_PARALLEL_INDEX(SAMPLESUB) NO_SQL_TUNE */ NVL(SUM(C1),0), NVL(SUM(C2),0)
    FROM
    (SELECT /*+ IGNORE_WHERE_CLAUSE NO_PARALLEL("SYS_IMPORT_TABLE_02")
    FULL("SYS_IMPORT_TABLE_02") NO_PARALLEL_INDEX("SYS_IMPORT_TABLE_02") */ 1
    AS C1, CASE WHEN "SYS_IMPORT_TABLE_02"."PROCESS_ORDER"=:B1 THEN 1 ELSE 0
    END AS C2 FROM "DBSCHEDUSER"."SYS_IMPORT_TABLE_02" "SYS_IMPORT_TABLE_02")
    SAMPLESUB
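    In parameter terms, the fix described above amounts to something like this on the 10g source (a sketch; whether you use SCOPE=SPFILE or edit a pfile depends on how the instance is started):

    ```sql
    -- Remove the forced 8.1.7 optimizer behaviour:
    ALTER SYSTEM RESET optimizer_features_enable SCOPE=SPFILE SID='*';
    -- ...then bounce the instance, or set it explicitly to the native version:
    ALTER SYSTEM SET optimizer_features_enable = '10.2.0.4' SCOPE=BOTH;
    ```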

  • Importing from 8 to oracle 10g

    how can i export oracle 8 data to oracle 10g

    Hi,
    There are two ways to do it:
    1) Export from the Oracle 8 database and import into the Oracle 10g database.
    2) Export the Oracle 8 data from the Oracle 10g server by using a TNS alias, and import it into the Oracle 10g database.
    Example: from the Oracle 10g server
         (tnsalias = TNS alias for connecting to Oracle 8 from Oracle 10g)
         exp system/XXXX@<tnsalias> file=xyz.dmp log=xyz.log ......
    Alex

  • Problem when exporting and importing project from odi 10g to odi 11g

    Hi,
    I want to migrate my project from ODI 10g to ODI 11g.
    But when I import the interface, it gives an error about missing references.
    I exported the project (without its child components), models (including my datastore), KMs, folders (without their child components), packages (with child components), interfaces (with child components), procedures (with child components), and variables from ODI 10g.
    After exporting all these objects, I imported them all with the import type set to "Synonym Mode INSERT" into ODI 11g, but when I imported the interface it gave the error about missing references.
    The source technology is Oracle and the target technology is Postgres.
    The topologies in ODI 11g have been made the same as in ODI 10g.
    Please help.

    You don't need to migrate the complete repository. You can migrate one project at a time into ODI 11.1.1.5.x.
    You have to be careful while importing: you have to follow a sequence.
    Empty Project -> KMs -> Models (with datastores) -> Variables -> Empty Folders -> Interfaces -> Procedures -> Packages ---- all in Synonym Mode INSERT (no exceptions)
    And your repository ID in 11g MUST be different from the one in 10g.

  • Hebrew Characters...Chars display as junk after import from 8i to 10g

    Gurus,
    I have a problem with a customer upgrade...
    Background of the issue: the customer is an Agile PLM customer on version 8.5, with the database hosted on Oracle 8i. He intends to upgrade from Agile 8.5 to Agile 9.2.2.4. During this process he has upgraded the db from Agile 8.5 to Agile 9.2.2.4, and has also moved the DB platform from 8i to 10g.
    Problem: There were Hebrew characters entered in VARCHAR2 columns (Oracle 8i) which, after the upgrade, do not display correctly. Newly entered Hebrew characters after the upgrade display correctly in the UI.
    Customer DB parameters: The NLS parameters on the source db before the upgrade (8i) are American_America.WE8ISO8859P1, and on the destination db they are American_America.UTF8.
    What I have done to deal with the issue: I have tried exporting the db using UTF8 and importing it into 10g using UTF8, but the characters still show as garbled. I have tried various export/import combinations of the WE8ISO8859P1 and IW8ISO8859P8 character sets, as I learned during my research that Hebrew characters are supported in the IW8ISO8859P8 character set and not in WE8ISO8859P1. My suspicion is that the problem is the export and import from 8i to 10g and the character conversion that happens during this process (this is my guess and I might be wrong).
    Currently this is a hot issue with the customer, and it needs an immediate fix (to display the Hebrew characters properly after the upgrade).
    I am a layman on the NLS settings and couldn't figure out what else to do. I would request all the gurus out there to help us figure out the problem and try to resolve it.
    Thanks for your Help in Advance
    Regards,
    Raja.

    Hebrew characters aren't supported using the ISO 8859-1 character set. In the original system, what must be happening is that the NLS_LANG on the client matches the database character set, which tells the Oracle client not to do character set conversion. This causes Oracle to treat character data just as a binary data stream and to store whatever bits the client sends. So long as the client is really sending ISO 8859-8 data, telling Oracle it is ISO 8859-1, and on the return side asking Oracle for ISO 8859-1 and treating it internally as ISO 8859-8 while Oracle is doing no character set conversions, this will appear to work. But the data in the database is fundamentally wrong. You can configure things so that you can, apparently, store Chinese data in a US7ASCII database using this sort of approach, but you end up with screwed up data.
    This sort of problem usually comes to light when you start using tools (like export) that don't know how to mis-identify the data they are sending and retrieving like your application is doing or when character set conversion is necessary. Since the data in the database isn't valid ISO 8859-1, Oracle has no idea how to translate it to UTF8 properly.
    As previously suggested, the safest option is to move the data with a solution that replicates the behavior of the application. So
    - Set the client NLS_LANG to match the database character set (WE8ISO8859P1)
    - Extract the data to a flat file using SQL*Plus or a C/C++ application
    - This data file will, presumably, really be ISO 8859-8 encoded
    - Use SQL*Loader to load the data into the UTF8 database, specifying the actual character set (ISO 8859-8) in the control file.
    If you're rather more adventurous and working with Oracle Support, it is potentially possible to change the character set of the source database from ISO 8859-1 to ISO 8859-8 and then export and import will work properly. But this would require some undocumented options that shouldn't be used without Oracle Support being involved and introduces a number of risks that the flat file option avoids.
    Justin
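    The SQL*Loader step might look like this (a sketch; the table, file and field names are hypothetical, and the essential part is the CHARACTERSET clause declaring the real encoding of the data file):

    ```sql
    -- load_hebrew.ctl (hypothetical control file)
    LOAD DATA
    CHARACTERSET IW8ISO8859P8
    INFILE 'hebrew_data.dat'
    APPEND INTO TABLE my_table
    FIELDS TERMINATED BY '|'
    (id, description)
    ```

    Run it with something like: sqlldr userid=user/password@utf8db control=load_hebrew.ctl. Because the control file states the true encoding, SQL*Loader performs a correct ISO 8859-8 to UTF8 conversion on the way in.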

  • Import from 11g db to 10g db

    Hi David,
    I would need to export 11g database tables and import them into a 10g database. I have exported from 11g using the exp command. The command I used is:
    exp scott/tiger file=emp.dmp tables=emp,dept;
    Now I need to import emp.dmp into Oracle 10g. Would the imp command for Oracle 10g work in this case? Is the command I need to use as given below?
    imp scott/tiger file=emp.dmp full=yes
    Can you please help me in this regard.
    Thanks as always,
    ChD

    David,
    Thanks. This has been helpful. I have some questions about working with AWM.
    In my case I have the following dimensions:
    Position Dim
    Position Dim Hierarchy (Nation --> Region --> District --> Territory)
    Product Dim
    Product Dim Hierarchy (Market --> TA --> Brand --> Item)
    Customer Dim
    Now in my reports I am showing the Products and Positions from the hierarchies (not from the dimensions). In this case, do I need to specify aggregation based on the Position/Product dims, or on the Position/Product hierarchies only?
    Again, I have some reports based on the Customer Dim. But the Customer Dim has no hierarchy. So do I need to aggregate the data based on the Customer Dim at all?
    Also, please tell me whether I need to follow any specific order of dimensions in the case of aggregation (or is that something AWM will itself take care of)?
    Lastly, there are a lot of calculations in the Aggregation drop-down. Should I use 'Sum' while aggregating in this case and rolling up data based on hierarchies?
    Thanks in advance.
    ChD

  • Export from Oracle 9i to be Imported into Oracle 10g on 32bit platform

    Hi,
    Currently, I am in the process of performing a homogeneous system copy due to a domain change in our organization, using the R3load procedure.
    I have taken an export of the existing system, which is on Oracle 9i.
    Initially, I was planning on setting up the new system on Oracle 9i; now that Oracle 9i is out of support, I would have to go with Oracle 10g.
    I found a note on this topic, SAP Note 932722; however, this note explicitly talks about the database-specific system copy procedure, and it is also for the 64-bit platform.
    Can you point me towards a Note or documentation which can be used as a reference for importing an Oracle 9i export into Oracle 10g on a 32-bit platform (the system copy procedure is R3load)?
    OS: Windows 2003 32-bit, ECC 5.0
    Regards,
    Vishnu.

    Hi Gagan,
    Thanks for your response.
    As per your suggestion, I looked at the System Copy guide for a heterogeneous system copy.
    On service.sap.com/osdbmigration, under FAQs, this is what I found:
    "I plan to copy my SAP system to new hardware with the same OS and DB products but higher release levels. Is this a migration or a homogeneous system copy?"
    "A change of the version of the operating system and/or the database system is not relevant in the context of a migration or a homogeneous system copy. If you don't change the product, but only the release level, it is a homogeneous system copy. If you change OS and/or DB, it is a migration."
    The system copy guide doesn't explicitly talk about an export from Oracle 9i imported into Oracle 10g.
    Can you point me to a guide or Note?
    Regards,
    Vishnu

  • Lightroom 5.5 Import from Catalog *very* slow performance.

    Importing from a second catalog from a shoot is *very* slow with the last version or couple of versions of Lightroom.  It's been about 20 minutes and the import is moving glacially slow and is only about 20% done.
    Here are as many stats as I have:
    iMac 27" 3.5Ghz Core i7, 32G of ram, 3T fusion drive (the late 2013 model maxed out)
    Master catalog has 117,000 images in it.
    The catalog I'm importing has 948 images in it.
    CPU use during the import is 100% used by lightroom (note that the multiple CPUs can go to more than 100% so this means I think that one CPU is maxed out, though LR can sometimes take 500% of my CPUs during an export)
    Lightroom is using 4.2G of RAM (10G free in activity monitor)
    Hard drive is fairly full, but still has 366G free on it (so about 10%)
    Lightroom Mobile sync is enabled (not on this folder I don't think) but says "service unavailable" at this point.
    Lightroom is fully up to date (5.5) on both iMac and laptop (also a mac in case that matters).  Both are up to date with the latest OS software.
    How I did it was:
    Export the shoot as a new catalog on my iMac
    Transfer it to my laptop (smart previews only, no masters)
    Edit images, metadata, etc
    Transfer the entire folder back (.lrcat, previews, smart previews) to iMac
    In my master catalog do an import from catalog
    Waited for ages for checking for dupes (which I can understand with such a big catalog)
    Selected to import all, changing metadata and develop edits, and selecting to put new images in the folder (not sure why it asked as there weren't any new images)
    The last time I did an import I started it and then went out somewhere, so I didn't notice how long it took, but this seems completely crazy. In the time it's taken to write this up, it's now about 20 minutes later and the progress bar is maybe in the 33% range. It's still working away; it's just crazy, crazy slow.
    Anything to help out or debug would be greatly appreciated. I suspect it's the sheer size of the catalog, but 120k isn't outside the realm of what I've heard people say works fine. Or maybe it's the Fusion drive (maybe the OS is trying to move files around behind the scenes)?
    Either way, this didn't seem to be an issue until the last couple of versions (I haven't gone and re-installed 5.4 or 5.3 to check), and it's starting to really frustrate me.

    Hi Jim,
    Great suggestion! It opens perfectly fine on its own. I even went as far as re-exporting it once it loaded but am still experiencing the same issue. That same catalog loads perfectly fine on my assistant's iMac.
    I can't really think of anything that changed in my environment, with the exception of a new Boot Camp partition to run Windows on an external Thunderbolt HD, but I doubt this is related.
    I'm open to any suggestions while I am trying this again... 10 minutes of 'Checking for changed and duplicate photos'.

  • Retrieve data from a large table from ORACLE 10g

    I am working on a Microsoft Visual Studio project that requires retrieving data from a large table in an Oracle 10g database and exporting the data to the hard drive.
    The problem here is that I am not able to connect to the database directly because of a license issue, but I can use a third-party API to retrieve data from the database. This API has sufficient privilege/license permission on the database to perform retrieval of data. So I am not able to use DTS/SSIS or another tool to import data from the database by connecting to it directly.
    My approach is to first retrieve the data using the API into a .NET DataTable, and then dump the records from it onto the hard drive in a specific format (perhaps an Excel file or another SQL Server database).
    When I try to retrieve the data from a large table having over 13 lakh (1.3 million) records (3-4 GB) into a DataTable using the Visual Studio project, I get an out-of-memory exception.
    Is there a better way to retrieve the records chunk by chunk and do the export without losing the state of the data in the table?
    Any help on this problem will be highly appreciated.
    Thanks in advance...
    -Jahedur Rahman
    Edited by: Jahedur on May 16, 2010 11:42 PM

    Girish, thanks for your reply. And I am sorry for the confusion. Let me explain:
    1. "export the data into another media into the hard drive."
    What is meant by this line, i.e. another media on the hard drive?
    ANS: Sorry, I just want to write the data to a file or to a table in a SQL Server database.
    2. "I am not able to connect to the database directly because of license issue"
    Huh? I have never heard of a user not being able to connect to the db because of a license. What error/message are you getting?
    ANS: My company uses a 3rd-party application that uses Oracle 10g. My company is licensed to use the 3rd-party application (app + database is a package) and did not purchase an Oracle license to use it directly. So I cannot connect to the database directly.
    3. I am not sure which API you are talking about, but I am running an application with the Visual Studio data grid or similar controls, in which I can select (select query) as many rows as I need; no issue.
    ANS: This API is provided by the 3rd-party application vendor. I can pass a query to it and it returns a DataTable.
    4. "better way to retrieve the records chunk by chunk and do the export without losing the state of the data in the table?"
    ANS: As I get a system error (out of memory) when I select all rows into a DataTable at once, I want to retrieve the data in multiple phases.
    E.g: 1 to 20,000 records in 1st phase
    20,001 to 40,000 records in 2nd phase
    40,001 to ...... records in 3nd phase
    and so on...
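    Passing a ROWNUM-windowed query to the API would give exactly those phases (a sketch; big_table and id are hypothetical names, and id should be a unique, stable ordering key so the phases don't overlap or skip rows):

    ```sql
    -- Phase n covers rows :lo+1 .. :hi, e.g. :lo = 0, :hi = 20000 for phase 1
    SELECT *
      FROM (SELECT t.*, ROWNUM rn
              FROM (SELECT * FROM big_table ORDER BY id) t
             WHERE ROWNUM <= :hi)
     WHERE rn > :lo;
    ```

    Each phase can then be written out and discarded before fetching the next, keeping memory use bounded.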
    Please let me know if this does not clarify your confusion... :)
    Thanks...
    -Jahedur Rahman
    Edited by: user13114507 on May 12, 2010 11:28 PM

  • Calling a web service from Oracle 10g forms

    Hi Everybody
    I want to send SMS from my 10g Forms. I searched for the topic and I got the following link:
    "http://www.oracle.com/technology/products/forms/htdocs/10gr2/howto/webservicefromforms/ws_10_1_3_from_forms.html"
    Now I have downloaded "wsclient_extended_101320.zip"
    1. extracted the jar file,
    2. copied the jar file in "c:\ids10\WSclient"
    3. Specified jar file name in "default.env" "formsweb.cfg"
    4. Also specified the jar file name in "CLASSPATH" environment variable
    5. restarted OC4J instance
    6. Restarted Form Builder
    But I couldn't see "webServiceProxy.proxy.SendServiceSoapClient" in the Form Builder Java Class Importer.
    Can anyone please help?
    What may be the problem?
    Do I have to follow the steps to create the service as specified in the demo page, or is it sufficient to download the jar file?
    Or is there any other setting that I am missing?
    Thanks in advance.

    I created a new folder "WSclient" in "c:\ids10\forms"
    and added the full path name to CLASSPATH.

  • ORA-02248 when importing from the RAU

    Designer 6i rel. 4.6
    ODS 10g rel. 9.0.4
    Oracle 9.2.0.5
    I did a full Designer 6i repository export from the RAU and then imported the dmp file into another Designer 6i repository from its RAU. The RAUs from both instances use the import/export utilities from ODS 10g rel. 9.0.4.
    The export completes succesfully, but the import gives me the following warnings:
    IMP-00017: following statement failed with ORACLE error 2248:
    "ALTER SESSION SET "_PLSQL_LOAD_WITHOUT_COMPILE" = TRUE"
    IMP-00003: ORACLE error 2248 encountered
    ORA-02248: invalid option for ALTER SESSION
    IMP-00017: following statement failed with ORACLE error 2248:
    "ALTER SESSION SET "_PLSQL_LOAD_WITHOUT_COMPILE" = FALSE"
    IMP-00003: ORACLE error 2248 encountered
    ORA-02248: invalid option for ALTER SESSION
    I've searched Metalink, but the only relevant information I could find was Note 19481.1. This note hasn't been updated since 1999, so I don't know if it's still relevant to the above versions.
    Does anyone know what these warnings mean?
    Thanks.

    Yes, I got it and talked to Oracle. As per them it's a bug, but I got a workaround for this problem.
    If you still haven't got through this, mail me at [email protected]
    Regards,
    Nikunj Sharma

  • Issues with importing from excel

    I have been running into several issues with importing from Excel.
    First my configuration
    I am running SQL Developer ver 1.5.5 Build MAIN-5969
    I am on a Windows XP Professional Version 2002 with Service Pack 3
    I am importing into an Oracle 10g database.
    1. SQL Developer doesn't work with Excel 2007 files; I have to save them all in Excel 97-2003 format before I can use them.
    2. If I run into an error while loading and stop the process, SQL Developer doesn't release the Excel file, and I get a sharing violation if I try to save the spreadsheet without closing SQL Developer.
    3. I have found that I have to set the print area to the range I want to work with; otherwise the import wizard keeps adding rows.
    4. The Import wizard keeps adding commas to numeric fields unless I format the column in Excel as text. Currently the column is formatted as General in the spreadsheet. I could change the wizard format to say the column is an integer, but it is actually a code field made up of numeric characters, so it may have leading zeroes that I need to keep.
    This might be related:
    I have a column in Excel defined as text that contains only numerics. It is of length 4, but the wizard sets a precision of 5 with a datatype of VARCHAR2. If I try to change it to 4, I get an error saying the field is not large enough. Yet when I do a LEN on the column, it only gives me 4. Other fields in the same sheet that are 3- and 2-position numerics are fine. I suspect this is related to the comma being inserted into any numeric field of more than 3 positions.
    5. Importing Excel dates into Oracle dates doesn't work. I have to convert the Excel date column to text, import it as a VARCHAR2, then convert it to DATE once it is in the database.
    6. Nullable is not set by default on any column, and I have to set it before the import. (I would prefer it to default to nullable, so that I have to uncheck the box to make a column NOT NULL. I would rather import all of the data and then deal with the nulls after they have been pulled in.)
    7. When I select "header columns included", the wizard uses the headers as the column names. Is it possible to do the name-length check at that point? It has bitten me a few times: I forget to check the name length and then get an error once the import starts running.
    8. If one of the columns to import contains only nulls, the precision comes out as 0, and if it isn't changed an error occurs on import. Could this case default to a precision of 1?
    9. It would be nice if there were a completion message and a cancel option while the import is running.
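
    For point 5, a minimal sketch of the workaround, assuming a hypothetical staging table STAGING whose Excel date text was imported into a VARCHAR2 column DATE_TXT, with dates written as MM/DD/YYYY (the format mask must match whatever text your spreadsheet produced; all names here are examples, not from the original post):

    ```sql
    -- STAGING, DATE_TXT and ORDER_DATE are example names
    ALTER TABLE staging ADD (order_date DATE);

    UPDATE staging
       SET order_date = TO_DATE(date_txt, 'MM/DD/YYYY');
    COMMIT;

    -- optional: drop the text column once the conversion is verified
    ALTER TABLE staging DROP COLUMN date_txt;
    ```

    If any rows fail with ORA-01843 or similar, the format mask doesn't match the text in those rows and needs adjusting before the UPDATE is rerun.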

    On point 4.
    I have a column in Excel that consists of 4-digit numeric codes, e.g. 1111, 2345, etc.
    The column's format is General; it displays as just the 4 digits.
    When I start the wizard, the column appears as 1,111, 2,345, etc. on the Data Preview screen.
    The wizard determines the precision to be 5 on the column definition screen.
    If I change the precision to 4 and continue, that field errors out on verify with "not big enough to hold the data in source columns".
    If I change the Excel column to a TEXT column:
    Excel still displays 1111, 2345, etc.
    The wizard then shows the same data, 1111, 2345, on the Data Preview screen.
    Yet when I get to the column definition screen it still sizes the column as 5.
    If I change it to 4, I get the same error when verifying.
    If I leave it at 5, it processes just fine.
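
    If a code column does come through as a number and loses its leading zeroes, one possible repair after the import (table and column names are examples only, assuming the column was created as VARCHAR2):

    ```sql
    -- Pad codes back to a fixed width of 4; e.g. '111' becomes '0111'
    UPDATE staging
       SET code_col = LPAD(code_col, 4, '0')
     WHERE LENGTH(code_col) < 4;
    ```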

  • Tables do not import when importing from 10gR2 to 9iR2

    Hello forum members
    I took a full database dump from the 10g database as shown below:
    C:\Documents and Settings\Vugar>exp
    Export: Release 9.2.0.1.0 - Production on Thu Aug 6 19:05:47 2009
    Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
    Username: sys/sys_pass@prod as sysdba
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Enter array fetch buffer size: 4096 >
    Export file: EXPDAT.DMP > c:\full6.dmp
    (1)E(ntire database), (2)U(sers), or (3)T(ables): (2)U > E
    Export grants (yes/no): yes >
    Export table data (yes/no): yes >
    Compress extents (yes/no): yes >
    And then I imported the dump file into the 9i database as below:
    C:\Documents and Settings\Vugar>imp
    Import: Release 9.2.0.1.0 - Production on Thu Aug 6 19:08:18 2009
    Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
    Username: sys/sys_pass as sysdba
    Connected to: Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.1.0 - Production
    Import file: EXPDAT.DMP > c:\full6.dmp
    Enter insert buffer size (minimum is 8192) 30720>
    Export file created by EXPORT:V09.02.00 via conventional path
    import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
    import server uses AL32UTF8 character set (possible charset conversion)
    List contents of import file only (yes/no): no >
    Ignore create error due to object existence (yes/no): no >
    Import grants (yes/no): yes >
    Import table data (yes/no): yes >
    Import entire export file (yes/no): no > Y
    Everything appears to go fine: all of the users and their grants are created in the target database, but the users' tables and other objects are not. After the full import I import each user with the FROMUSER/TOUSER clause, and after that all of the objects are created under each user. Can anyone tell me why the users' objects are not created when performing a full import? What is my mistake in the full import?
    Thanks.

    Shafi Rasulov wrote:
    Thanks for the reply.
    Virendra, I exported from 10g with the 9i exp utility, so the 9i exp utility doesn't export objects that are incompatible with 9i; a full export/import from 10g to 9i should therefore be possible.
    Anand, as you say, I exported/imported from 10g with the 9i utility. In the documentation and other tutorials I haven't seen any mention of tables and other objects going missing during export/import.
    Pavan, as you say there are several ways to migrate from 10g to 9i, but I'm interested in why a full export/import doesn't create the tables and other objects when I do it.
    I say again: after the full import, I import each user with the FROMUSER/TOUSER clause and all of the objects are created under the relevant user. I want to know why the tables and other objects are not created during the full import.

    I don't have a 9i database with me to check what is happening, but there should not be any error if you used the proper binary of the right version to do the export. What are those "incompatible objects" you are talking about that are not exported/imported? You don't need to use FROMUSER/TOUSER: if you mention a 10g schema and FULL=Y, your import should work properly.
    HTH
    Aman....
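
    As a sketch of the FULL=Y approach, the interactive prompts in the transcripts above can be replaced with parameter files. File names, paths and credentials below are examples only, and the export must still be taken with the 9i exp client against the 10g database if the dump is destined for 9i. Export side, run as `exp system/<password>@prod parfile=exp_full.par`:

    ```
    FILE=c:\full6.dmp
    FULL=Y
    GRANTS=Y
    ROWS=Y
    COMPRESS=Y
    LOG=c:\exp_full.log
    ```

    Import side, run as `imp system/<password> parfile=imp_full.par`:

    ```
    FILE=c:\full6.dmp
    FULL=Y
    GRANTS=Y
    IGNORE=Y
    LOG=c:\imp_full.log
    ```

    IGNORE=Y makes imp skip "object already exists" errors for objects (such as the users) created by an earlier attempt; the LOG files record exactly which statements failed, which should show whether the CREATE TABLE statements ran at all during the full import.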
