Sybase 12.5 data dump to be imported to Oracle.
Hello,
Can someone please point to any relevant links where I can pull data from Sybase and put it in Oracle? Any pointers will be appreciated.
Thanks
Sami
In that case, you want the Oracle Migration Workbench, which is also included in the most recent version of SQL Developer.
Justin
Similar Messages
-
Create data dump in Oracle 10g
Hi All,
Can somebody please tell me how to create a data dump and then import the database structure and data from an existing database?
The data dump would consist of the complete database structure, I suppose.
I am using Oracle 10g Rel 1.
Thanks in Advance,
Tanuja
Note that neither classical export nor Data Pump is really a database backup. They only perform a logical data extraction, and it is difficult to recreate a lost database from export dumps alone. You will also lose data, because you can only restore the data as it was at the time of the export; changes made later are gone. Treat export/import as an add-on, but back up your database physically. The best way to do that is (in my opinion) RMAN.
http://download.oracle.com/docs/cd/B19306_01/backup.102/b14193/toc.htm
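For illustration only, a minimal RMAN run might look like this (a sketch, assuming the database is in archivelog mode and the default backup destination is used; adjust to your environment):
RMAN> CONNECT TARGET /
RMAN> BACKUP DATABASE PLUS ARCHIVELOG;
RMAN> LIST BACKUP SUMMARY;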
Werner -
How to import a data dump into Oracle express
Hi
Thank you for reading my post
How can I import a data dump, which is a dump of my schema in Oracle 10g, into Oracle Database Express?
Indeed, it is just 20 tables and one sequence that I exported from Oracle 10g, and I want to import it into my Oracle Express 10g.
thanks
Hi,
I. Using classic export/import (exp/imp):
1) export the data from your source database XE
C:\WINDOWS\system32>exp userid=USER1@XE1 file=user1.dmp
2) in the target XE database you would have to recreate the same user, e.g. USER1
3) run imp on the same dump file:
C:\WINDOWS\system32>imp userid=USER1@xe2 file=user1.dmp full=yes ignore=yes
II. Using Data Pump (expdp/impdp):
From DOS prompt make directory
DOS>md c:\oraclexe\tmp
DOS>cd c:\oraclexe\app\product\10.2.0\server\BIN
DOS>sqlplus SYS/Password AS SYSDBA
It is also possible to use SYS/Password@SID AS SYSDBA.
SQLPlus>CREATE OR REPLACE DIRECTORY XMLDIR AS 'C:\oraclexe\Tmp';
SQLPlus>GRANT READ, WRITE ON DIRECTORY XMLDIR TO property;
Go back to the DOS prompt. There are two files - expdp.exe and impdp.exe - in the BIN directory.
DOS>expdp SYS/Password AS SYSDBA SCHEMAS=property DIRECTORY=XMLDIR DUMPFILE=property.dmp LOGFILE=property.log
From DOS prompt make directory
DOS>md c:\oraclexe\tmp
DOS>cd c:\oraclexe\app\product\10.2.0\server\BIN
DOS>sqlplus SYS/Password AS SYSDBA
It is also possible to use SYS/Password@SID AS SYSDBA.
SQLPlus>ALTER SYSTEM SET NLS_LENGTH_SEMANTICS='CHAR' SCOPE=BOTH;
SQLPlus>CREATE OR REPLACE DIRECTORY XMLDIR AS 'C:\oraclexe\Tmp';
SQLPlus>GRANT READ, WRITE ON DIRECTORY XMLDIR TO public;
Go back to the DOS prompt. There are two files - expdp.exe and impdp.exe - in the BIN directory.
DOS>impdp SYS/Password AS SYSDBA SCHEMAS=property DIRECTORY=XMLDIR DUMPFILE=property.dmp LOGFILE=property.log
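Note: depending on the command shell, the AS SYSDBA connect string usually has to be quoted (or escaped), otherwise expdp/impdp treats it as extra parameters. A sketch of the quoted form (quoting rules differ between cmd.exe and Unix shells):
DOS>expdp "SYS/Password AS SYSDBA" SCHEMAS=property DIRECTORY=XMLDIR DUMPFILE=property.dmp LOGFILE=property.log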
There are so many threads about export and import in XE.
Konstantin -
Problem while importing same data dump in 2 databases
Hi,
Oracle Version : 11G r2
I have installed oracle 11g and created a database db1. I created tablespace tbl1, user usr1, granted required permissions and imported oracle data dump into it using the impdp command.
Now, I want to import the same dump into another database. I created a fresh database db2, but the dump requires the tablespace name tbl1 and the user name usr1.
How do I import the dump into this new db2?
FYI - I tried a few things. I created db2, logged into it using the sqlplus system/pwd@db2 command, and created a new tablespace tbl1 and user usr1. I was able to create them successfully, without errors like 'already existing'. But when I issue the impdp command, I get 'table already exists' for all tables in the dump. :(
Please tell me how to do this.
Thanks,
Anandhi
Hi,
When you import into the second, freshly created database, the tbl1 tablespace does not exist there yet. 1. You can create this tablespace in the second database, or
2. if you don't want to create the same tablespace again, use the REMAP_TABLESPACE and REMAP_SCHEMA options of impdp,
for example :
impdp REMAP_TABLESPACE=TBL1:USERS REMAP_SCHEMA=USR1:USR2 DUMPFILE=DATA_PUMP_DIR:DUMP.DMP
Regards
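Regarding the 'table already exists' errors: if the tables are already present in the target schema (for example because the dump is being imported a second time), the impdp parameter TABLE_EXISTS_ACTION decides what happens - SKIP, APPEND, TRUNCATE or REPLACE. A sketch with placeholder credentials, not a command taken from this thread:
impdp usr1/password@db2 DUMPFILE=DATA_PUMP_DIR:DUMP.DMP TABLE_EXISTS_ACTION=REPLACE LOGFILE=DATA_PUMP_DIR:imp2.log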
Mahir M. Quluzade -
Script cannot find the Perl Data::Dumper module
Good day. I live in Australia and use a script called Shepherd to update my mythtv program listings.
After upgrading a couple of days ago (first time in about 6 months) Shepherd has stopped working.
Shepherd doesn't seem to find the perl data-dumper module, which is part of the base perl install.
All my packages are up to date as of this post, and I am on the current kernel -
uname -r
3.4.9-1-ARCH
~/.shepherd/shepherd --check
ERROR:
Mandatory module 'Data::Dumper' not found.
Please see the Wiki at http://svn.whuffy.com/wiki/Installation
for details on how to install this module.
Shepherd is updating ~/.shepherd/output.xmltv (the listings information scraped from the net) but evidently needs the dumper module to post the data into MythTV.
I'm not sure if it has something to do with this, which I don't properly understand -
The current Arch Linux default perl installation installs updates to core modules into the perl core directories, creating file conflicts. Examples include modules such as Data::Dumper and version.
which is mentioned here.
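A quick way to check whether the copy under site_perl is owned by any package is pacman's file-ownership query (a sketch; the path is the one that shows up later in this thread - if no package owns the file, it was most likely installed via CPAN and is what shadows the core module):
pacman -Qo /usr/lib/perl5/site_perl/Data/Dumper.pm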
I appear to have it installed as part of the perl base package -
sudo pacman -Qi perl
Name : perl
Version : 5.16.1-1
URL : http://www.perl.org
Licenses : GPL PerlArtistic
Groups : base
Provides : perl-archive-extract=0.58 perl-archive-tar=1.82 perl-attribute-handlers=0.93 perl-autodie=2.10
perl-autoloader=5.72 perl-autouse=1.07 perl-b-debug=1.17 perl-b-deparse=1.14 perl-b-lint=1.14
perl-base=2.18 perl-bignum=0.29 perl-carp=1.26 perl-cgi=3.59 perl-compress-raw-bzip2=2.048
perl-compress-raw-zlib=2.048 perl-constant=1.23 perl-cpan=1.9800 perl-cpan-meta=2.120630
perl-cpan-meta-yaml=0.007 perl-cpanplus=0.9121 perl-cpanplus-dist-build=0.62 perl-data-dumper=2.135
So it seems Shepherd is suddenly not finding it for some reason, after the upgrade.
I tried installing the perl-data-dumper packages from the AUR (simple and concise) and neither of those made any difference.
I also tried installing the module with a cpan command (as root) but that gave me errors (which I'd rather not have to deal with unless I have to to fix this issue).
cpan
Terminal does not support AddHistory.
cpan shell -- CPAN exploration and modules installation (v1.9800)
Enter 'h' for help.
cpan[1]> install data:dumper
Reading '/root/.cpan/sources/authors/01mailrc.txt.gz'
sh: /bin/gzip: No such file or directory
............................................................................DONE
Reading '/root/.cpan/sources/modules/02packages.details.txt.gz'
sh: /bin/gzip: No such file or directory
Warning: Your /root/.cpan/sources/modules/02packages.details.txt.gz does not contain a Line-Count header.
Please check the validity of the index file by comparing it to more
than one CPAN mirror. I'll continue but problems seem likely to
happen.
Warning: Your /root/.cpan/sources/modules/02packages.details.txt.gz does not contain a Last-Updated header.
Please check the validity of the index file by comparing it to more
than one CPAN mirror. I'll continue but problems seem likely to
happen.
Subroutine AUTOLOAD redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 31.
Subroutine import redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 49.
Subroutine tv_interval redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 70.
.Could not split line["H\"rW{0M4K\cV1=YFH\cV\c^rqv\c?\c]hKUe\cGL)4:4\cH]C\cPW`_|#MU4}\c_\c_%\cQ\cN\cX8M}%e\cR|\cG-K\c?~\cAM_cREx#AnvAZDi\$\c\!h\cZ\cI^uG0 )>TQ}p{Q[Ah\cNIK\cV+R"]
Could not split line["-6W/GQp\cOm`>UwBP[5"]
Could not split line["\@\cPI\cW,(zY2]uA!fVegsf\\V\cGfL\cHY4ki\cFvLgY\\\c\Ed\cK\cTe-\cC0\@d!?\cSU/\\G8\cAt?(\cY.L\cZ\cBrtl~s[4\@\cF\cWu*D>I"]
Could not split line["hb\cSKA1\cCjX(w/K\cA"]
Giving up parsing your /root/.cpan/sources/modules/02packages.details.txt.gz, too many errorsReading '/root/.cpan/sources/authors/01mailrc.txt.gz'
sh: /bin/gzip: No such file or directory
............................................................................DONE
Reading '/root/.cpan/sources/modules/02packages.details.txt.gz'
sh: /bin/gzip: No such file or directory
Warning: Your /root/.cpan/sources/modules/02packages.details.txt.gz does not contain a Line-Count header.
Please check the validity of the index file by comparing it to more
than one CPAN mirror. I'll continue but problems seem likely to
happen.
Subroutine AUTOLOAD redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 31.
Subroutine import redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 49.
Subroutine tv_interval redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 70.
Warning: Your /root/.cpan/sources/modules/02packages.details.txt.gz does not contain a Last-Updated header.
Please check the validity of the index file by comparing it to more
than one CPAN mirror. I'll continue but problems seem likely to
happen.
Subroutine AUTOLOAD redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 31.
Subroutine import redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 49.
Subroutine tv_interval redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 70.
.Could not split line["H\"rW{0M4K\cV1=YFH\cV\c^rqv\c?\c]hKUe\cGL)4:4\cH]C\cPW`_|#MU4}\c_\c_%\cQ\cN\cX8M}%e\cR|\cG-K\c?~\cAM_cREx#AnvAZDi\$\c\!h\cZ\cI^uG0 )>TQ}p{Q[Ah\cNIK\cV+R"]
Could not split line["-6W/GQp\cOm`>UwBP[5"]
Could not split line["\@\cPI\cW,(zY2]uA!fVegsf\\V\cGfL\cHY4ki\cFvLgY\\\c\Ed\cK\cTe-\cC0\@d!?\cSU/\\G8\cAt?(\cY.L\cZ\cBrtl~s[4\@\cF\cWu*D>I"]
Could not split line["hb\cSKA1\cCjX(w/K\cA"]
Giving up parsing your /root/.cpan/sources/modules/02packages.details.txt.gz, too many errorsTerminal does not support GetHistory.
Lockfile removed.
It says that the 02packages.details.txt.gz file is missing, but it looks like it's there -
ls /root/.cpan/sources/modules/
02packages.details.txt.gz 03modlist.data.gz
I thought I might uninstall and reinstall perl but I got dependency errors and I'm not game to force an uninstall unless I get advice to do so -
pacman -Rs perl
checking dependencies...
error: failed to prepare transaction (could not satisfy dependencies)
:: automake: requires perl
:: cdrkit: requires perl
:: ddclient: requires perl
:: git: requires perl>=5.14.0
:: groff: requires perl
:: gtk-doc: requires perl
:: hspell: requires perl
:: hyphen: requires perl
:: imagemagick: requires perl
:: lm_sensors: requires perl
:: mod_perl: requires perl
:: openssl: requires perl
:: perl-algorithm-diff: requires perl
:: perl-archive-zip: requires perl>=5.10.0
:: perl-class-factory-util: requires perl
:: perl-class-inspector: requires perl>=5.6.0
:: perl-class-load: requires perl
:: perl-class-methodmaker: requires perl>=5.10.0
:: perl-class-singleton: requires perl>=5.10.0
:: perl-convert-binhex: requires perl
:: perl-data-dump: requires perl>=5.006
:: perl-data-optlist: requires perl-scalar-list-utils
:: perl-datetime-format-builder: requires perl
:: perl-dbi: requires perl
:: perl-digest-sha1: requires perl
:: perl-email-date-format: requires perl
:: perl-encode-locale: requires perl>=5.008
:: perl-error: requires perl>=5.10.0
:: perl-fcgi: requires perl
:: perl-file-listing: requires perl>=5.8.8
:: perl-file-slurp: requires perl>=5.14.0
:: perl-font-afm: requires perl>=5.5.0
:: perl-html-form: requires perl>=5.8.8
:: perl-html-formattext: requires perl>=5.10.0
:: perl-html-parser: requires perl>=5.12.1
:: perl-html-tagset: requires perl>=5.10.0
:: perl-html-tree: requires perl>=5.10.0
:: perl-http-cache-transparent: requires perl>=5.10.0
:: perl-http-cookies: requires perl>=5.8.8
:: perl-http-daemon: requires perl>=5.8.8
:: perl-http-date: requires perl>=5.8.8
:: perl-http-message: requires perl>=5.8.8
:: perl-http-negotiate: requires perl>=5.8.8
:: perl-image-size: requires perl
:: perl-io-socket-ssl: requires perl>=5.10.0
:: perl-io-string: requires perl>=5.10.0
:: perl-io-stringy: requires perl
:: perl-libwww: requires perl>=5.8.1
:: perl-lingua-en-numbers-ordinate: requires perl>=5.10.0
:: perl-lingua-preferred: requires perl>=5.10.0
:: perl-linux-pid: requires perl
:: perl-list-moreutils: requires perl>=5.5.30
:: perl-lwp-mediatypes: requires perl>=5.6.2
:: perl-lwp-protocol-https: requires perl>=5.8.1
:: perl-math-round: requires perl
:: perl-mime-lite: requires perl
:: perl-mime-types: requires perl
:: perl-mozilla-ca: requires perl>=5.006
:: perl-net-http: requires perl>=5.6.2
:: perl-net-upnp: requires perl
:: perl-package-stash: requires perl>=5.8.1
:: perl-package-stash: requires perl-scalar-list-utils
:: perl-package-stash-xs: requires perl>=5.8.1
:: perl-params-classify: requires perl>=5.10.1
:: perl-params-util: requires perl>=5.5.30
:: perl-params-validate: requires perl
:: perl-parse-recdescent: requires perl>=5.10.0
:: perl-perl4-corelibs: requires perl>=5.006
:: perl-soap-lite: requires perl
:: perl-socket6: requires perl
:: perl-sub-exporter: requires perl>=5.6.0
:: perl-sub-install: requires perl
:: perl-sub-uplevel: requires perl
:: perl-term-readkey: requires perl
:: perl-test-warn: requires perl>=5.10.0
:: perl-text-iconv: requires perl
:: perl-timedate: requires perl
:: perl-tk: requires perl
:: perl-unicode-string: requires perl
:: perl-unicode-utf8simple: requires perl>=5.10.0
:: perl-uri: requires perl>=5.10.0
:: perl-www-robotrules: requires perl>=5.8.1
:: perl-xml-dom: requires perl>=5.10.0
:: perl-xml-namespacesupport: requires perl
:: perl-xml-parser: requires perl
:: perl-xml-regexp: requires perl
:: perl-xml-sax: requires perl
:: perl-xml-sax-base: requires perl
:: perl-xml-simple: requires perl
:: perl-xml-twig: requires perl
:: perl-xml-writer: requires perl
:: perl-xml-xpath: requires perl
:: perl-yaml: requires perl>=5.10.0
:: perl-yaml-syck: requires perl
:: rsync: requires perl
:: system-tools-backends: requires perl
:: webmin: requires perl
Any help would be greatly appreciated.
Thanks
belbo
Last edited by belbo (2012-08-26 00:33:52)
Thanks
Results as follows -
perl -MData::Dumper -e1
Perl API version v5.14.0 of Data::Dumper does not match v5.16.0 at /usr/lib/perl5/site_perl/XSLoader.pm line 95.
Compilation failed in require.
BEGIN failed--compilation aborted.
$ perl /mnt/Data/Scripts/Perl_Module_Check.pl wheremod Data::Dumper
Data::Dumper /usr/lib/perl5/site_perl/Data/Dumper.pm
Data::Dumper /usr/lib/perl5/core_perl/Data/Dumper.pm
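For what it's worth, the module search order decides which of those two copies wins: directories earlier in @INC are used first, and site_perl comes before core_perl, so a stale site_perl copy shadows the core one. Two quick checks (a sketch; adjust the paths if your layout differs):
perl -e 'print "$_\n" for @INC'
find /usr/lib/perl5 /usr/share/perl5 -name Dumper.pm -path '*/Data/*'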
I ran the full delete command after backing up the directories (there was a lot in them)
rm -rf /usr/lib/perl5/site_perl/* /usr/share/perl5/site_perl/*
This resulted in a different missing module -
$ sudo rm -rf /usr/lib/perl5/site_perl/* /usr/share/perl5/site_perl/*
$ ~/.shepherd/shepherd --check
ERROR:
Mandatory module 'List::Compare' not found.
I restored the backup and deleted only Data::Dumper but then got a missing XMLTV module -
$ ~/.shepherd/shepherd --check
$[ used in numeric lt (<) (did you mean $] ?) at /usr/share/perl5/site_perl/XML/Twig.pm line 7278.
$[ used in numeric lt (<) (did you mean $] ?) at /usr/share/perl5/site_perl/XML/Twig.pm line 7284.
$[ used in numeric lt (<) (did you mean $] ?) at /usr/share/perl5/site_perl/XML/Twig.pm line 7296.
ERROR:
Mandatory module 'XMLTV' not found.
Following this post, I did the following, which returned no results -
https://bbs.archlinux.org/viewtopic.php?pid=1110598
$ sudo pacman -Qml | awk '/_perl\/auto\/.+\.so$/ { print $1 }' | uniq
I then did the following -
$ sudo rm -rf /usr/lib/perl5/*
$ sudo pacman -S perl
warning: perl-5.16.1-1 is up to date -- reinstalling
resolving dependencies...
looking for inter-conflicts...
Targets (1): perl-5.16.1-1
Total Installed Size: 48.33 MiB
Net Upgrade Size: 0.00 MiB
Proceed with installation? [Y/n]
(1/1) checking package integrity [#######################################################] 100%
(1/1) loading package files [#######################################################] 100%
(1/1) checking for file conflicts [#######################################################] 100%
(1/1) checking available disk space [#######################################################] 100%
(1/1) upgrading perl [#######################################################] 100%
$ ~/.shepherd/shepherd --check
ERROR:
Mandatory module 'XMLTV' not found.
Oops. Then I saw your next post saying not to do that, as you'd lose the vendor modules, which I did. I had backed up, so I restored and still have the vendor stuff.
In case it's of any assistance, this is what my Perl lib and share directories look like -
[ben@htpc ~]$ ls /usr/lib/perl5
core_perl site_perl site_perl.bak vendor_perl
[ben@htpc ~]$ ls /usr/lib/perl5/site_perl
Alien auto Data Encode encoding.pm Filter List Net perlfilter.pod SVN Term Unicode version.pm XML
Attribute Compress Digest Encode.pm FCGI.pm filter-util.pl Math Params Storable.pm Sys Time version version.pod XSLoader.pm
[ben@htpc ~]$ ls /usr/lib/perl5/vendor_perl
Apache Attribute Crypt DateTimePP.pm dbixs_rev.pl GD.pm Image ModPerl Package SNMP.pm Text Unicode YAML
Apache2 auto DateTime DBD Digest gv.pm JSON mod_perl2.pm Params Socket6.pm Tie Win32
APR Bundle DateTime.pm DBI FCGI.pm gv.so Linux Net qd.pl SVN Tk Xfce4
APR.pm Class DateTimePPExtra.pm DBI.pm GD HTML List NetSNMP RRDs.pm Term Tk.pm XML
[ben@htpc ~]$ ls /usr/lib/perl5/core_perl
arybase.pm Compress Cwd.pm DynaLoader.pm Fcntl.pm I18N List ODBM_File.pm perllocal.pod Socket.pm threads.pm
attributes.pm Config_git.pl Data Encode File IO Math Opcode.pm POSIX.pm Storable.pm Tie
auto Config_heavy.pl DB_File.pm Encode.pm Filter IO.pm MIME O.pm re.pm Sys Time
B Config.pm Devel encoding.pm GDBM_File.pm IPC mro.pm ops.pm Scalar Text Unicode
B.pm CORE Digest Errno.pm Hash lib.pm NDBM_File.pm PerlIO SDBM_File.pm threads
[ben@htpc ~]$ ls /usr/share/perl5/
core_perl site_perl vendor_perl
[ben@htpc ~]$ ls /usr/share/perl5/core_perl
AnyDBM_File.pm bignum.pm Config Digest.pm fields.pm IO Object PerlIO.pm Term utf8.pm
App bigrat.pm constant.pm DirHandle.pm File IPC open.pm pod Test vars.pm
Archive blib.pm CPAN Dumpvalue.pm FileCache.pm JSON overload Pod Test.pm version
Attribute bytes_heavy.pl CPANPLUS dumpvar.pl FileHandle.pm less.pm overloading.pm Safe.pm Text Version
autodie bytes.pm CPANPLUS.pm Encode filetest.pm Locale overload.pm Search Thread version.pm
autodie.pm Carp CPAN.pm encoding Filter locale.pm Package SelectSaver.pm Thread.pm vmsish.pm
AutoLoader.pm Carp.pm DBM_Filter English.pm FindBin.pm Log Params SelfLoader.pm Tie warnings
AutoSplit.pm CGI DBM_Filter.pm Env.pm Getopt Math parent.pm sigtrap.pm Time warnings.pm
autouse.pm CGI.pm DB.pm Exporter HTTP Memoize Parse sort.pm Unicode XSLoader.pm
B _charnames.pm deprecate.pm Exporter.pm I18N Memoize.pm Perl strict.pm unicore
base.pm charnames.pm Devel ExtUtils if.pm Module perl5db.pl subs.pm UNIVERSAL.pm
Benchmark.pm Class diagnostics.pm Fatal.pm inc Net perlfaq.pm Symbol.pm User
bigint.pm Compress Digest feature.pm integer.pm NEXT.pm PerlIO TAP utf8_heavy.pl
[ben@htpc ~]$ ls /usr/share/perl5/site_perl
Alien bigint.pm Bundle Class CPANPLUS DateTime ExtUtils HTML IPC Locale LWP.pm Math MythTV Object Try YAML
Archive bignum.pm CGI Compress CPANPLUS.pm Encode File HTTP JSON LWP lwptut.pod Module MythTV.pm Test WWW YAML.pm
Attribute bigrat.pm CGI.pm CPAN Date Env.pm Filter IO List lwpcook.pod Mail Mozilla Net Tie XML
[ben@htpc ~]$ ls /usr/share/perl5/vendor_perl
abbrev.pl bigrat.pl ctime.pl Error Font HTML LWP.pm open2.pl SOAP Time validate.pl
Algorithm cacheout.pl Data Error.pm ftp.pl HTTP Mail open3.pl stat.pl timelocal.pl WWW
Apache CGI Date exceptions.pl getcwd.pl Image Math Package Sub Tree XML
Archive CGI.pm DateTime fastcwd.pl getopt.pl importenv.pl MIME Parse syslog.pl Try XMLRPC
assert.pl chat2.pl Dist File getopts.pl IO Module Perl4 tainted.pl UDDI XMLTV
auto Class dotsh.pl finddepth.pl Git Lingua Mozilla pwd.pl Term Unicode XMLTV.pm
bigfloat.pl complete.pl Email find.pl Git.pm look.pl Net RRDp.pm termcap.pl URI YAML
bigint.pl Convert Encode flush.pl hostname.pl LWP newgetopt.pl shellwords.pl Test URI.pm YAML.pm
Any further assistance would be greatly appreciated.
Thanks
belbo
Last edited by belbo (2012-08-27 13:40:14) -
Data dump import fails with a character set (language set) error
Hi,
I took a data dump on an Oracle 9i machine and imported it into Oracle 10g (the production machine), but it is showing an error: language set error.
Could you tell me how to take a data dump with the correct language set?
Regards,
Suva
Hi PaulM,
Please find the details below.
The development server is a 9i machine (I exported on this machine), and I imported on the production server (Oracle 10g).
When importing on the production server the error occurs; the error log is added below.
Production database (language details)
NLS_LANGUAGE AMERICAN
NLS_TERRITORY AMERICA
NLS_CURRENCY $
NLS_ISO_CURRENCY AMERICA
NLS_NUMERIC_CHARACTERS .,
NLS_CHARACTERSET UTF8
NLS_CALENDAR GREGORIAN
NLS_DATE_FORMAT DD-MON-RR
NLS_DATE_LANGUAGE AMERICAN
NLS_SORT BINARY
NLS_TIME_FORMAT HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZR
NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZR
NLS_DUAL_CURRENCY $
NLS_COMP BINARY
NLS_LENGTH_SEMANTICS BYTE
NLS_NCHAR_CONV_EXCP FALSE
NLS_NCHAR_CHARACTERSET UTF8
NLS_RDBMS_VERSION 10.2.0.1.0
Development database (language details)
NLS_LANGUAGE AMERICAN
NLS_TERRITORY AMERICA
NLS_CURRENCY $
NLS_ISO_CURRENCY AMERICA
NLS_NUMERIC_CHARACTERS .,
NLS_CHARACTERSET UTF8
NLS_CALENDAR GREGORIAN
NLS_DATE_FORMAT DD-MON-RR
NLS_DATE_LANGUAGE AMERICAN
NLS_SORT BINARY
NLS_TIME_FORMAT HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZR
NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZR
NLS_DUAL_CURRENCY $
NLS_COMP BINARY
NLS_LENGTH_SEMANTICS BYTE
NLS_NCHAR_CONV_EXCP FALSE
NLS_NCHAR_CHARACTERSET UTF8
NLS_RDBMS_VERSION 10.2.0.1.0
Log file
Connected to: Oracle Database 10g Release 10.2.0.1.0 - Production
Export file created by EXPORT:V09.02.00 via conventional path
import done in WE8MSWIN1252 character set and UTF8 NCHAR character set
import server uses UTF8 character set (possible charset conversion)
export server uses AL16UTF16 NCHAR character set (possible ncharset conversion)
. importing JW_OR's objects into JW_OR
. importing JW_OS's objects into JW_OS
. importing JW_ADMIN's objects into JW_ADMIN
. importing JW_OR's objects into JW_OR
. . importing table "ACCRXNS" 234671 rows imported
. . importing table "AUTHORLINKS" 790450 rows imported
. . importing table "AUTHORS" 79500 rows imported
. . importing table "CATSOL" 25505 rows imported
. . importing table "CATSOLSYNONYMS" 80045 rows imported
. . importing table "CHAPTERTITLES" 133 rows imported
. . importing table "COMPOUNDLINKS" 601785 rows imported
. . importing table "CONDITIONS" 207445 rows imported
. . importing table "JOURNALS" 2327 rows imported
. . importing table "LANGUAGE" 0 rows imported
. . importing table "MAINDATA" 234659 rows imported
. . importing table "MOLDATA" 721174 rows imported
. . importing table "PLAN_TABLE" 1 rows imported
. . importing table "REFERENCES" 276783 rows imported
. . importing table "ROLES" 2 rows imported
. . importing table "RXNKEYLINKS" 1724404 rows imported
. . importing table "RXNKEYWORDS" 848 rows imported
. . importing table "TABLETITLES" 2400 rows imported
. . importing table "TEMP_TABLE" 165728 rows imported
. . importing table "TEMP_WILEY_MAINDATA" 155728 rows imported
. . importing table "TEMP_WILEY_PDF_MAP" 16672 rows imported
. . importing table "TEMP_WILEY_YEAR_VOL_MAP" 42 rows imported
. . importing table "WEX_ACCRXNS" 3465 rows imported
. . importing table "WEX_AUTHORLINKS" 14183 rows imported
. . importing table "WEX_AUTHORS" 79500 rows imported
. . importing table "WEX_CHAPTERTITLES" 133 rows imported
. . importing table "WEX_COMPOUNDLINKS" 10925 rows imported
. . importing table "WEX_CONDITIONS" 5297 rows imported
. . importing table "WEX_JOURNALS" 2327 rows imported
. . importing table "WEX_LANGUAGE" 0 rows imported
. . importing table "WEX_MAINDATA" 3465 rows imported
. . importing table "WEX_MOLDATA" 10358 rows imported
. . importing table "WEX_REFERENCES" 3795 rows imported
. . importing table "WEX_RXNKEYLINKS" 34540 rows imported
. . importing table "WEX_RXNKEYWORDS" 848 rows imported
. . importing table "WEX_TABLETITLES" 2400 rows imported
. . importing table "WEX_WILEY_HTML_MAP" 17316 rows imported
. . importing table "WEX_WILEY_MAINDATA" 3465 rows imported
. . importing table "WEX_WILEY_PDF_MAP" 23925 rows imported
. . importing table "WEX_WILEY_YEAR_VOL_MAP" 58 rows imported
. . importing table "WILEY_HTML_MAP" 17316 rows imported
. . importing table "WILEY_MAINDATA" 234659 rows imported
. . importing table "WILEY_PDF_MAP" 23925 rows imported
. . importing table "WILEY_YEAR_VOL_MAP" 58 rows imported
. importing JW_OS's objects into JW_OS
. . importing table "ACCRXNS" 7116 rows imported
. . importing table "ATMOSPHERE" 47 rows imported
. . importing table "AUTHORLINKS" 33276 rows imported
. . importing table "AUTHORS" 6555 rows imported
. . importing table "CATSOL" 1463 rows imported
. . importing table "CATSOLSYNONYMS" 9370 rows imported
. . importing table "CHEMICALS" 78197 rows imported
. . importing table "COMPOUNDLINKS" 20799 rows imported
. . importing table "EXPDET" 1 rows imported
. . importing table "FOOTNOTES" 77825 rows imported
. . importing table "JOURNALS" 2 rows imported
. . importing table "LANGUAGE" 2 rows imported
. . importing table "MAINDATA" 7116 rows imported
. . importing table "PATHSTEP" 7199 rows imported
. . importing table "PROCEDURENOTES" 77293 rows imported
. . importing table "ROLES" 2 rows imported
. . importing table "RXNKEYLINKS" 23096 rows imported
. . importing table "RXNKEYWORDS" 1272 rows imported
. . importing table "WEX_ACCRXNS" 135 rows imported
. . importing table "WEX_ATMOSPHERE" 47 rows imported
. . importing table "WEX_AUTHORLINKS" 613 rows imported
. . importing table "WEX_AUTHORS" 6555 rows imported
. . importing table "WEX_CHEMICALS" 0 rows imported
. . importing table "WEX_COMPOUNDLINKS" 497 rows imported
. . importing table "WEX_EXPDET" 1 rows imported
. . importing table "WEX_FOOTNOTES" 2184 rows imported
. . importing table "WEX_JOURNALS" 2 rows imported
. . importing table "WEX_LANGUAGE" 2 rows imported
. . importing table "WEX_MAINDATA" 135 rows imported
. . importing table "WEX_PATHSTEP" 135 rows imported
. . importing table "WEX_PROCEDURENOTES" 2253 rows imported
. . importing table "WEX_RXNKEYLINKS" 695 rows imported
. . importing table "WEX_RXNKEYWORDS" 1272 rows imported
. importing JW_ADMIN's objects into JW_ADMIN
. . importing table "APP_USER" 76 rows imported
. . importing table "AUTHOR" 61874 rows imported
. . importing table "CITATION"
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 9, maximum: 8)
Column 1 10794
Column 2 77
Column 3 1
Column 4 24
Column 5
Column 6 Science of Synthesis
Column 7 Negishi, E.-i.; Takahashi, T. Science of Synthesis...
Column 8 681–848
Column 9 2
Column 10
Column 11 2002
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 10, maximum: 8)
Column 1 10879
Column 2 77
Column 3 1
Column 4 110
Column 5
Column 6 Comprehensive Organic Synthesis
Column 7 Hiemstra, H.; Speckamp, W. N.; Trost, B. M.; Flemi...
Column 8 1047–108
Column 9 2
Column 10
Column 11
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 10, maximum: 8)
Column 1 10880
Column 2 77
Column 3 1
Column 4 111
Column 5
Column 6 Houben-Weyl Methods of Organic Chemistry
Column 7 De Koning, H.; Speckamp, W. N.; Helmchen, G.; Hoff...
Column 8 1953–200
Column 9 E21b
Column 10
Column 11 1995
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 10, maximum: 8)
Column 1 10904
Column 2 77
Column 3 1
Column 4 135
Column 5
Column 6 Houben-Weyl Methods of Organic Chemistry
Column 7 Ryu, I.; Murai, S.; de Meijere, A., Ed. Houben-Wey...
Column 8 1985–204
Column 9 E17c
Column 10
Column 11 1997
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 9, maximum: 8)
Column 1 10905
Column 2 77
Column 3 1
Column 4 136
Column 5
Column 6 The Chemistry of the Cyclopropyl Group
Column 7 Tsuji, T.; Nishida, S.; Patai, S.; Rappoport, Z., ...
Column 8 307–373
Column 9
Column 10
Column 11 1987
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 10, maximum: 8)
Column 1 10906
Column 2 77
Column 3 1
Column 4 137
Column 5
Column 6 The Chemistry of the Cyclopropyl Group
Column 7 Vilsmaier, E.; Patai, S.; Rappoport, Z., Eds. The ...
Column 8 1341–145
Column 9
Column 10
Column 11 1987
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 9, maximum: 8)
Column 1 10952
Column 2 77
Column 3 1
Column 4 183
Column 5
Column 6 Cyclopropane-Derived Reactive Intermediates
Column 7 Boche, G.; Walborsky, H. M. Cyclopropane-Derived R...
Column 8 117–173
Column 9
Column 10
Column 11 1990
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 10, maximum: 8)
Column 1 10958
Column 2 77
Column 3 1
Column 4 189
Column 5
Column 6 Houben-Weyl Methods of Organic Chemistry
Column 7 Klunder, A. J. H.; Zwanenburg, B. Houben-Weyl Meth...
Column 8 2419–243
Column 9 E17c
Column 10
Column 11 1997
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 9, maximum: 8)
Column 1 10995
Column 2 77
Column 3 1
Column 4 226
Column 5
Column 6 Science of Synthesis
Column 7 Cha, J. K. Science of Synthesis 2005, 325–338.
Column 8 325–338
Column 9
Column 10
Column 11 2005
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 10, maximum: 8)
Column 1 17123
Column 2 82
Column 3 1
Column 4 13
Column 5
Column 6 Comprehensive Organometallic Chemistry II
Column 7 Dushin, R. G.; Edward, W. A.; Stone, F. G. A.; Wil...
Column 8 1071–109
Column 9 12
Column 10
Column 11 1995
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 9, maximum: 8)
Column 1 17124
Column 2 82
Column 3 1
Column 4 14
Column 5
Column 6 Modern Carbonyl Olefination
Column 7 Ephritikhine, M.; Villiers, C.; Takeda, T. Ed. Mod...
Column 8 223–285
Column 9
Column 10
Column 11 2004
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 9, maximum: 8)
Column 1 17126
Column 2 82
Column 3 1
Column 4 16
Column 5
Column 6 Transition Metals for Organic Synthesis (2nd Editi...
Column 7 Furstner, A.; Beller, M.; Bolm, C. Eds. Transition...
Column 8 449–468
Column 9
Column 10
Column 11 2004 17712 rows imported
. . importing table "FOOTNOTE" 38 rows imported
. . importing table "GT_STATS_REPORT" 0 rows imported
. . importing table "GT_VALIDATION_REPORT" 0 rows imported
. . importing table "OR_USERS" 1 rows imported
. . importing table "OS_USERS" 1 rows imported
. . importing table "PROCEDURENOTE" 70 rows imported
. . importing table "QC_TRACKING" 539881 rows imported
. . importing table "ROLE" 5 rows imported
. . importing table "SCHEMA" 3 rows imported
. . importing table "TASK_ALLOCATION" 159370 rows imported
. . importing table "USER_LOG" 174488 rows imported
. . importing table "VERSION" 3 rows imported
About to enable constraints...
IMP-00017: following statement failed with ORACLE error 2298:
"ALTER TABLE "AUTHOR" ENABLE CONSTRAINT "FK_AUTHOR_CITATIONID""
IMP-00003: ORACLE error 2298 encountered
ORA-02298: cannot validate (JW_ADMIN.FK_AUTHOR_CITATIONID) - parent keys not found
Import terminated successfully with warnings.
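For what it's worth, the ORA-12899 rejections above are a typical side effect of the character set conversion noted at the top of the log: the data was read in WE8MSWIN1252 and converted to UTF8 on import, so characters such as the en dash in the PAGE values grow from one byte to three and no longer fit a VARCHAR2(8) column defined with BYTE semantics (NLS_LENGTH_SEMANTICS is BYTE on both databases). One common workaround - a sketch using the column named in the log, not something done in this thread - is to widen the column or switch it to character semantics before re-importing that table:
ALTER TABLE JW_ADMIN.CITATION MODIFY (PAGE VARCHAR2(8 CHAR));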
Regards,
Subash -
Import of Oracle 7 database dump to Oracle 6 database
Is there any other way to import an Oracle 7 data dump into an Oracle 6.0.36 database (other than SQL*Loader)?
Can you help me? There are about 350 tables in the dump, and loading them into Oracle 6 individually would consume a lot of time and put data integrity at risk.
This question is not related to Portal Export/Import. Please post it under the RDBMS Migration forum to get appropriate answers.
-
To read dump file without importing dump file
Hi All,
I want to check table data from a dump file without using the import utility.
How can I open a dump file in Oracle 10g Release 2?
Thanks in advance.
user647572 wrote:
Hi All,
I want to check table data from a dump file without using the import utility.
How can I open a dump file in Oracle 10g Release 2?
Thanks in advance.
You cannot open an Oracle dump file directly, but you can view its contents via the import utility (no data is actually imported; it shows the SQL only).
In Data Pump, use the parameter SQLFILE=abc.sql,
e.g. impdp scott/tiger@orcl directory=dpump sqlfile=abc.sql dumpfile=xyz.dmp logfile=xyz_log.log full=y
In case of the original import utility, use the parameter SHOW,
e.g. imp scott/tiger@orcl file=abc.dmp show=y log=abc_log.log full=y
Hope this helps.
--neeraj -
Aperture Video Import Problem - from Lumix GH4: Imported clips have their dates changed to the import date. The files show up on the hard drive with the import date, not the created date, but many of these same files are not showing up in Aperture. Sometimes the clips actually show up with the current import but take on the video information from a previously imported file.
It was suggested I move this question to IPhoto or IMovie which I did.
Well, moving to a different discussion group did not provide an answer to this question either. What I finally did was import one batch of photos and videos into iPhoto for a given day at a time. Working with these I could change the dates and times in order to get them in the original sequence taken. Then I would create an album with that batch, all on the same day (iMovie was closed for this phase). Then I would open iMovie, generate the thumbnails for that album, and select the album I had created. This was necessary because the importing process in iPhoto was using incorrect dates for my videos, so it was a real struggle finding them in iMovie until I developed this approach.
I believe that this whole process was so screwy because I was importing from an external hard drive, not a camera. I had these photos on a PC and did not have the original cameras to import from directly, which I am fairly sure would have made this easier!
When I connect my iPhone to my computer, the "found new hardware" message appears, and then about 3 seconds later my screen turns blue and a data dump happens. I have to restart my computer to get things back up, but I cannot connect my iPhone to my computer without this happening... what do I do?
32 GB for Mac OS X
32 GB for Windows XP (NTFS)
136 GB for data (accessible by both OSes, in FAT32 I guess, or not?)
I wouldn't do that. FAT32 is not very efficient with disk space on volumes much larger than 16 GB; 100 GB is not recommended for FAT32.
I finally decided with the following partitions:
32GB - NTFS Boot Camp
8GB - FAT32 Shared Data
100+GB - OS X (My main storage partition)
I also used the same guide to set up the three partitions without any problem. After installing Windows, just ignore the Linux part, and the shared FAT32 partition becomes available to both OSes immediately.
I created an album from photos taken off three different cameras. I am sorting the photos manually and when I close iPhoto, the photos move back either by date or by camera import time. How can I keep the photos from moving around?
Don't tell us, tell them
http://www.apple.com/feedback/macosx.html is the place for feature requests and feedback -
Changing date format in CSV import
I'm importing banking statements in CSV (comma-separated values) format to Numbers, but the dates of transactions are not being properly formatted. My Apple-->System Pref-->International-->Format is set to Canada so that the 5th of January 2008 can be shown as 5-Jan-08 or 05/01/08, which is my preference. However, the dates in the data from the bank get interpreted as 01-May-08 (the bank is Canadian). Is there a simple way I can reformat the data correctly?
Thanks, hopefully I'm clear enough.
Before importing this data, set your system to USA.
With that setting, the dates will be imported correctly.
Then, reset the setting to Canada.
Numbers will then display the dates using your preferred setting.
You said "imperialism"? You didn't, but it resembles it.
Yvan KOENIG (from FRANCE mercredi 17 septembre 2008 10:54:52) -
SQL Dev 1.5.4+: Scripting DDL and data dumps?
In SQL Dev 1.5.4, can I script a DDL and data dump? If not, what about 2.0? If not 2.0, has anyone requested this functionality so I can vote for it? I find it frustrating that, while doing a Database Export, I can't even pre-declare (e.g. save) the set of objects I want to dump; sometimes, you want to selectively dump, and it's a pain to hunt and peck and select just those you want to dump. Easy if you want to dump everything but 2-3 objects in a large schema. Not so easy if you only need, say, 20 out of 100 objects to be dumped (e.g. for domain or configuration tables--some subset of the whole schema).
I'm really enjoying SQL Developer 1.5.4 by the way. Despite its flaws, I'm pretty happy with it. Looking forward to 2.0 and beyond. Good work, SQL Dev team.
Thanks very much.
Dana
They're all command-line tools, so they can all be wrapped up in a batch or shell script. Bummer you can't access them... Hope you find a better solution.
Thanks K. I should be getting Oracle 10g Express Edition on my desktop soon--critical because we don't have full access to the Development instance. It's like putting changes through a straw over to the DBAs. I'm not sure why Development is locked down to developers, but that's the way it is.
Any chance that Oracle 10g Express Edition comes with scriptable Data Pump binaries? It will still need authorization, but maybe that's one way to go. I hate trying to write my own Data Pump in Python or any other language. It seems a bit absurd to me, but I suppose there are reasons.
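For what it's worth, XE 10g does ship the expdp/impdp clients (the expdp.exe/impdp.exe mentioned in the XE thread above), so a scripted dump can be as small as a one-line batch file. A sketch - the schema, password and the DUMP_DIR directory object are placeholders, not anything from this thread:
REM dump_schema.bat - schedule-friendly Data Pump schema export
expdp scott/tiger@XE SCHEMAS=scott DIRECTORY=DUMP_DIR DUMPFILE=scott.dmp LOGFILE=scott_exp.log
The directory object would be created once with CREATE DIRECTORY and granted READ, WRITE to the exporting user, as in the XE thread above.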
Dana -
Validation rules applied to data migration templates at import
Hi everyone!
First post here for me, so please bear with me if I missed something.
My company has just started the initial implementation of ByDesign. We come from a set of disparate and partially home-grown systems that we outgrew a few years ago.
As part of this initial phase, we are basically re-creating the data on customers, suppliers, etc. since none of our existing systems makes a good source, unfortunately. We will be using the XML templates provided by ByDesign itself to import the relevant data.
It has become clear that ByDesign applies validation rules on fields like postal codes (zip codes), states (for some countries), and other fields.
It would be really helpful if we could get access to the rules that are applied at import time, so that we can format the data correctly in advance, rather than having to play "trial and error" at import time. For example, if you import address data, the first time it finds a postal code in the Netherlands formatted as "1234AB", it will tell you that there needs to be a space in the 5th position, because it expects the format "1234 AB". At that point, you stop the import, go back to the template to fix all the Dutch postal codes, and try the import again, only to run into the next validation issue.
We work with a couple of very experienced German consultants to help us implement ByDesign, and I have put this question to them, but they are unaware of a documented set of validation rules for ByDesign. Which is why I ask the question here.
So just to be very clear on what we are looking for: the data validation/formatting rules that ByDesign enforces at the time the XML data migration templates are imported.
Any help would be appreciated!
Best regards,
Eelco
Hello Eelco,
welcome to the SAP ByDesign Community Network!
The checks performed on postal codes are country specific, and represent pretty much the information that you would find in places like e.g. the "Postal Codes" page in Wikipedia.
I recommend starting with small files of 50-100 records, assembled from a representative set of different records, in order to collect the validation rules that require a reaction to your data in an efficient way. Only once you have caught these generic data issues would I proceed to larger files.
Personally, I prefer to capture such generic work items on my list, then fix the small sample file immediately by editing, and do an immediate re-simulation of the entire file, so that I can drill deeper and collect more generic issues from my data sample. Only once I have harvested all the learnings from my sample file would I apply them to my actual data and create a new file - still not too large, in order to use my time efficiently.
Best regards
Michael -
I need a good data dump for practice
Can anyone refer me to where I can get a really nice data dump for SQL and PL/SQL practice?
Thanks
Michael
Michael,
Read the article Oracle's Sample Schemas: Saying Goodbye to Scott. You should find sample data in the ORACLE_HOME\demo and ORACLE_HOME\rdbms\admin directories.
With kind regards,
Jornica
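For example, the classic SCOTT/TIGER schema can usually be recreated by running the utlsampl.sql script as a DBA (a sketch; the script name and location are the standard ones, but they can vary by version):
SQL> @?/rdbms/admin/utlsampl.sql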