Problem importing a table with BLOBs
Hi all, I'm facing the following situation.
Source DB : 10.2.0.3 (client's DB)
Destination DB (mine): 10.2.0.4
I've a dump file (traditional) of a particular schema.
I'm running import (imp) to import on my DB.
It runs fine until it reaches one particular table. This table has 6 columns, 3 of which are BLOBs.
This table has 260000 rows (checked with export log).
When import reaches row 152352 it stops loading data, but import is still running.
What can I do to get more information about this situation so I can solve the problem?
Any suggestion will be appreciated!
Thanks in advance.
Please identify the source and target OS versions. Are there any useful messages in the alert.log? How long did the export take? A rule of thumb is that import takes about twice as long as export. Have you tried expdp/impdp instead? Also see the following -
How To Diagnose And Troubleshoot Import Or Datapump Import Hung Scenarios (Doc ID 795034.1)
How To Find The Cause of a Hanging Import Session (Doc ID 184842.1)
Import is Slow or Hangs (Doc ID 1037231.6)
Export and Import of Table with LOB Columns (like CLOB and BLOB) has Slow Performance (Doc ID 281461.1)
HTH
Srini
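To see what the stuck imp session is actually doing, a second session can query the v$ views. A hedged sketch (connect strings are placeholders; requires SYSDBA or SELECT privileges on the v$ views):

```shell
sqlplus -s / as sysdba <<'EOF'
set linesize 200
-- What is the imp session waiting on right now?
select s.sid, s.serial#, w.event, w.seconds_in_wait
from   v$session s, v$session_wait w
where  s.sid = w.sid
and    s.program like 'imp%';

-- Progress of any long-running operations for those sessions
select sid, opname, sofar, totalwork,
       round(sofar/totalwork*100, 1) pct_done
from   v$session_longops
where  totalwork > 0
and    sofar <> totalwork;
EOF
```

If the wait event is something like an enqueue or a log-related wait, that usually points at the cause far faster than watching the log file.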
Similar Messages
-
Error while importing a table with BLOB column
Hi,
I have a table with a BLOB column. When I export the table it is exported correctly, but when I import it into a different schema that has a different tablespace, it throws this error:
IMP-00017: following statement failed with ORACLE error 959:
"CREATE TABLE "CMM_PARTY_DOC" ("PDOC_DOC_ID" VARCHAR2(10), "PDOC_PTY_ID" VAR"
"CHAR2(10), "PDOC_DOCDTL_ID" VARCHAR2(10), "PDOC_DOC_DESC" VARCHAR2(100), "P"
"DOC_DOC_DTL_DESC" VARCHAR2(100), "PDOC_RCVD_YN" VARCHAR2(1), "PDOC_UPLOAD_D"
"ATA" BLOB, "PDOC_UPD_USER" VARCHAR2(10), "PDOC_UPD_DATE" DATE, "PDOC_CRE_US"
"ER" VARCHAR2(10) NOT NULL ENABLE, "PDOC_CRE_DATE" DATE NOT NULL ENABLE) PC"
"TFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 STORAGE(INITIAL 65536 FREELISTS"
" 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "TS_AGIMSAPPOLOLIVE030"
"4" LOGGING NOCOMPRESS LOB ("PDOC_UPLOAD_DATA") STORE AS (TABLESPACE "TS_AG"
"IMSAPPOLOLIVE0304" ENABLE STORAGE IN ROW CHUNK 8192 PCTVERSION 10 NOCACHE L"
"OGGING STORAGE(INITIAL 65536 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEF"
"AULT))"
IMP-00003: ORACLE error 959 encountered
ORA-00959: tablespace 'TS_AGIMSAPPOLOLIVE0304' does not exist
I used the import command as follows :
imp <user/pwd@conn> file=<dmpfile.dmp> fromuser=<fromuser> touser=<touser> log=<logfile.log>
What can I do so that this table gets imported correctly?
Also, is the BLOB stored in a different tablespace than the user's default tablespace?
Thanks in advance.
Hello,
You can either:
1) create a tablespace with the same name in the destination database you are importing into, or
2) get the DDL of the table, change the tablespace name to an existing tablespace in the destination, run that DDL in the destination database, and then run your import with the option ignore=y, which will ignore all the create errors.
Regards,
Vinay -
Problem importing an Excel table with photos
When I import an Excel table with photos, the photos don't appear.
Welcome to Project Siena!
If you're seeing an 'x' instead of your photo in the data source it's possible that Project Siena can't access the directory. Try saving your images in another folder such as C:\Users\Public\Pictures.
Others have also reported that if the Library isn't set up correctly in Windows 8 that they've had issues.
Here are some posts to look at while we wait for additional information from you:
Importing local images via an Excel file
Siena Gallery unable to load image
Images in Public Pictures directories showing up as X in Img from URL
(fyi - this last post's screenshot will look different than yours as it was from Beta 1.)
Thor -
Problem importing a table with data in an existing schema
Hi,
I am trying to import a schema and I get an error while importing:
ORA-39083: Object type TABLE failed to create with error:
ORA-01843: not a valid month
Failing sql is:
CREATE TABLE "AMBERSAP"."QUOTE" ("QUOTE_ID" VARCHAR2(15) NOT NULL ENABLE, "CONTRACT_ID" VARCHAR2(12), "QUOTE_OWNER" VARCHAR2(100) NOT NULL ENABLE, "TEMP_CONTRACT" VARCHAR2(1) DEFAULT 'N', "QUOTE_STATUS_CODE" VARCHAR2(1), "QUOTE_INITIATE_DATE" DATE NOT NULL ENABLE, "QUOTE_EXPIRY_DATE" DATE NOT NULL ENABLE, "EXTENDED_EXPIRY_DATE" DATE, "OFFER_DAYS" NUMBER(*,0), "OFFER_CLOSED_DATE" DATE, "BRAND_CODE" VARCHA
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Any feedback on this is highly appreciated.
Thanks in advance.
I've set the NLS_DATE_FORMAT='DD/MM/YYYY'
with export NLS_DATE_FORMAT='DD/MM/YYYY'
but the QUOTE table still fails to create when I try to import the whole schema.
here is the error message
wlgora(oracle)[HDM_D]/db/HDM_D/dba/exp$: file=amberquota1.dmp logfile=impquote1.log TABLE_EXISTS_ACTION=APPEND <
Import: Release 10.2.0.4.0 - 64bit Production on Tuesday, 11 October, 2011 14:52:58
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
With the Partitioning, Real Application Clusters, OLAP, Data Mining
and Real Application Testing options
Master table "AMBER"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "AMBER"."SYS_IMPORT_FULL_01": amber/******** directory=EXPORT dumpfile=amberquota1.dmp logfile=impquote1.log TABLE_EXISTS_ACTION=APPEND
Processing object type TABLE_EXPORT/TABLE/TABLE
ORA-39083: Object type TABLE failed to create with error:
ORA-01843: not a valid month
Failing sql is:
CREATE TABLE "AMBER"."QUOTE" ("QUOTE_ID" VARCHAR2(15) NOT NULL ENABLE, "CONTRACT_ID" VARCHAR2(12), "QUOTE_OWNER" VARCHAR2(100) NOT NULL ENABLE, "TEMP_CONTRACT" VARCHAR2(1) DEFAULT 'N', "QUOTE_STATUS_CODE" VARCHAR2(1), "QUOTE_INITIATE_DATE" DATE NOT NULL ENABLE, "QUOTE_EXPIRY_DATE" DATE NOT NULL ENABLE, "EXTENDED_EXPIRY_DATE" DATE, "OFFER_DAYS" NUMBER(*,0), "OFFER_CLOSED_DATE" DATE, "BRAND_CODE" VARCHAR2(
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
ORA-39112: Dependent object type OBJECT_GRANT:"AMBER" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"AMBER" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"AMBER" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"AMBER" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"AMBER" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"AMBER" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"AMBER" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"AMBER" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"AMBER" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"AMBER" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"AMBER" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"AMBER" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"AMBER" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"AMBER" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
ORA-39112: Dependent object type INDEX:"AMBER"."QT_PK" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
ORA-39112: Dependent object type CONSTRAINT:"AMBER"."QPPS_ACTIVE_CHK" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type CONSTRAINT:"AMBER"."QTC_ACTIVE_CHK" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type CONSTRAINT:"AMBER"."QT_ACTIVE_CHK" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type CONSTRAINT:"AMBER"."QL_ACTIVE_CHK" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type CONSTRAINT:"AMBER"."QP_ACTIVE_CHK" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type CONSTRAINT:"AMBER"."QF_ACTIVE_CHK" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type CONSTRAINT:"AMBER"."QTEXP_ACTIVE_CHK" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type CONSTRAINT:"AMBER"."QNC_ACTIVE_CHK" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type CONSTRAINT:"AMBER"."QT_PK" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type CONSTRAINT:"AMBER"."CON_AUTH_CON_CHECK" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"AMBER"."QT_PK" creation failed
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
ORA-39112: Dependent object type REF_CONSTRAINT:"AMBER"."QUOTE_QUOTESTATUS_FK" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type REF_CONSTRAINT:"AMBER"."QUOTE_BRAND_FK" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
ORA-39112: Dependent object type REF_CONSTRAINT:"AMBER"."QCST_QUOTE_FK" skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"AMBER"."QUOTE" creation failed
Job "AMBER"."SYS_IMPORT_FULL_01" completed with 31 error(s) at 14:53:00
wlgora(oracle)[HDM_D]/db/HDM_D/dba/exp$: sqlp
SQL*Plus: Release 10.2.0.4.0 - Production on Tue Oct 11 14:53:09 2011
Copyright (c) 1982, 2007, Oracle. All Rights Reserved.
Connected to:
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
With the Partitioning, Real Application Clusters, OLAP, Data Mining
and Real Application Testing options
-
Problem while importing table with blob datatype
Hi, I have a 9i database on Windows XP and a dev 9i database on AIX 5.2.
While I can export normal tables and import them successfully, when I try to import a table with a BLOB datatype it throws a "tablespace <tablespace_name> doesn't exist" error.
Here is how I proceeded.
SQL*Plus: Release 9.2.0.1.0 - Production on Mon Oct 8 14:08:29 2007
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Enter user-name: test@test
Enter password: ****
Connected to:
Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
With the Partitioning, OLAP and Oracle Data Mining options
JServer Release 9.2.0.1.0 - Production
SQL> create table x(photo blob);
Table created.
exporting:
D:\>exp file=x.dmp log=x.log tables='TEST.X'
Export: Release 9.2.0.1.0 - Production on Mon Oct 8 14:09:40 2007
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Username: pavan@test
Password:
Connected to: Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
With the Partitioning, OLAP and Oracle Data Mining options
JServer Release 9.2.0.1.0 - Production
Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
About to export specified tables via Conventional Path ...
Current user changed to TEST
. . exporting table X 0 rows exported
Export terminated successfully without warnings.
importing:
D:\>imp file=x.dmp log=ximp.log fromuser='TEST' touser='IBT' tables='X'
Import: Release 9.2.0.1.0 - Production on Mon Oct 8 14:10:42 2007
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Username: system@mch
Password:
Connected to: Oracle9i Enterprise Edition Release 9.2.0.6.0 - 64bit Production
With the Partitioning, Real Application Clusters, OLAP and Oracle Data Mining op
tions
JServer Release 9.2.0.6.0 - Production
Export file created by EXPORT:V09.02.00 via conventional path
Warning: the objects were exported by PAVAN, not by you
import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
import server uses US7ASCII character set (possible charset conversion)
. importing TEST's objects into IBT
IMP-00017: following statement failed with ORACLE error 959:
"CREATE TABLE "X" ("PHOTO" BLOB) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS "
"255 STORAGE(INITIAL 65536 FREELISTS 1 FREELIST GROUPS 1) TABLESPACE "TESTTB"
"S" LOGGING NOCOMPRESS LOB ("PHOTO") STORE AS (TABLESPACE "TESTTBS" ENABLE "
"STORAGE IN ROW CHUNK 8192 PCTVERSION 10 NOCACHE STORAGE(INITIAL 65536 FREE"
"LISTS 1 FREELIST GROUPS 1))"
IMP-00003: ORACLE error 959 encountered
ORA-00959: tablespace 'TESTTBS' does not exist
Import terminated successfully with warnings.
Why is it happening for this table alone? Please help me.
Thanks in advance.
Here is an excerpt from
http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:378418239571
=============================================
Hi Tom,
I have a dump file containing BLOB datatypes. When I import the dump file into a schema, it gives an
error stating that the tablespace for the BLOB datatype does not exist. My question is: how do I import
the dump file into the default tablespace of the importing user?
Followup March 2, 2004 - 7am US/Eastern:
You'll have to precreate the table.
do this:
imp userid=u/p tables=that_table indexfile=that_table.sql
edit that_table.sql, fix up the tablespace references to be whatever you want to be, run that sql.
then imp with ignore=y
for any MULTI-SEGMENT object (iot's with overflows, tables with lobs, partitioned tables, for
example), you have to do this -- imp will not rewrite ALL of the tablespaces in that
multi-tablespace create -- hence you either need to have the same tablespaces in place or precreate
the object with the proper tablespaces.
Only for single tablespace segments will imp rewrite the create to go into the default if the
requested tablespace does not exist.
===================================================
To summarize: precreate target table when importing multi-segment tables -
Import of Tables with CLOB problem.
Hi,
I am having problems with the import of tables with CLOB columns.
A table with a CLOB column forces the use of the tablespace in which it (the table) was created originally; it will not be created in the new schema's default tablespace. Is there a way to get around this problem?
More on this problem can be found here :
Re: Import error 1647, CLOB fields problem, need help!
I am using Oracle 9.2.0.6 standard edition, that is why the transportable tablespace option will not work.
Thanx
Rob
Sorry, I posted the wrong link; it should be:
Re: Import error 1647, CLOB fields problem, need help!
Regards
Rob -
I can't import a table with a BLOB column from one user to another user.
1) I create two user both have connect role,and each has its own tablespace, DDL:
create user d2zd identified by d2zd default tablespace d2zd quota unlimited on d2zd account unlock;
grant connect to d2zd;
create user d3zd identified by d3zd default tablespace d3zd quota unlimited on d3zd account unlock;
grant connect to d3zd;
2) Then connect to Oracle as d2zd, create a table containing a BLOB column, and insert data into the table.
3) export d2zd as follow:
exp d2zd/d2zd file=d2zd.dmp
4) import to d3zd as follow:
imp d3zd/d3zd fromuser=d2zd touser=d3zd file=d2zd.dmp
The problem is that the table with the BLOB column can't be imported;
it says: no privilege on tablespace d2zd.
How can I import a table containing a BLOB column from one user to another user?
Hi - the reason, as our friend already told you, is that a BLOB can be stored outside of the table segment, in another tablespace. This is for performance reasons.
So you would need quota on two tablespaces:
the one which holds the table segment and the one which holds the BLOB segment.
Regards
Carl
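Given the DDL above, a hedged sketch of the fix: give the receiving user quota on the tablespace the LOB segment references (here the source tablespace D2ZD), then re-run the import.

```shell
# Grant quota on the second tablespace as a privileged user, then retry imp.
sqlplus -s / as sysdba <<'EOF'
alter user d3zd quota unlimited on d2zd;
EOF

imp d3zd/d3zd fromuser=d2zd touser=d3zd file=d2zd.dmp
```

The alternative, if d3zd must not touch the d2zd tablespace at all, is to precreate the table in d3zd with the LOB STORE AS clause pointing at d3zd's own tablespace and import with ignore=y.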
-
How to export a table with BLOBs
Hi,
when I have to export a table with its own data, I generally export it by using the exp utility on UNIX.
I've never tried to export a table with a BLOB. Is it possible in the same manner, or is something else needed?
Thanks!
Hi Mark,
Please see below which could be helpful for your issue:
Export and Import of Table with LOB Columns (like CLOB and BLOB) has Slow Performance [ID 281461.1]
Master Note for Data Pump [ID 1264715.1]
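For what it's worth, exp handles a table with LOB columns the same way as any other table; it just always uses the conventional path for it. A minimal sketch (schema and table names are placeholders):

```shell
exp scott/tiger tables=DOC_TABLE file=doc_table.dmp log=doc_table.log
```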
Regards
Helios -
How to import a table under a new name
I want to import a table from a DMP file on UNIX into a schema where a newer version of the table already exists. I don't want to drop the new version, because I only need to move some records from one to the other. Is there an import option to import the table under another name? That is, on the same user where MYTABLE already exists, I want to import an old version of MYTABLE as MYTABLE_OLD, so I can afterwards update MYTABLE with some of the old records from MYTABLE_OLD. Is it possible?
You cannot do this directly.
You can import the table into a different schema, rename the table there, then
export it and import it back into the required schema.
Import DataPump: How to Import Table Data into a Table that has Different Name ? [ID 342314.1]
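A hedged sketch of the scratch-schema workaround (all names are placeholders; assumes the scratch user has been created and the target user can read from it):

```shell
# 1) Import the old version of the table into a scratch schema.
imp scratch/pwd file=old.dmp fromuser=appuser touser=scratch tables=MYTABLE

# 2) Copy it across into the real schema under the new name.
sqlplus -s appuser/pwd <<'EOF'
create table mytable_old as select * from scratch.mytable;
EOF
```

On 11g and later, Data Pump's REMAP_TABLE parameter (covered in the note above) does the rename in one step.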
Regards
Rajesh -
Importing a table with a BLOB column is taking too long
I am importing a user schema from a 9i (9.2.0.6) database into a 10g (10.2.1.0) database. One of the large tables (millions of records) with a BLOB column is taking too long to import (more than 24 hours). I have tried all the tricks I know to speed up the import. Here are some of the settings:
1 - set buffer to 500 MB
2 - pre-created the table and turned off logging
3 - set indexes=N
4 - set constraints=N
5 - I have 10 online redo logs with 200 MB each
6 - Even turned off logging at the database level with _disable_logging=true
It is still taking too long loading the table with the BLOB column. The BLOB field contains PDF files.
For your info:
Computer: Sun v490 with 16 CPUs, solaris 10
memory: 10 Gigabytes
SGA: 4 Gigabytes
Legatti,
I have feedback=10000. However, by monitoring the import, I know that it's loading an average of 130 records per minute, which is very slow considering that the table contains close to two million records.
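One more avenue worth trying on 10g, since the target supports it: Data Pump is usually much faster than imp for LOB-heavy tables. A hedged sketch (directory object, credentials and file names are placeholders):

```shell
# Export from the source with Data Pump, then import with it.
expdp system/pwd schemas=APPUSER directory=DP_DIR \
      dumpfile=app_%U.dmp parallel=4 logfile=app_exp.log

impdp system/pwd schemas=APPUSER directory=DP_DIR \
      dumpfile=app_%U.dmp parallel=4 logfile=app_imp.log
```

Note the caveat: rows of a single LOB table are still loaded serially even with parallel set, but Data Pump's load path generally outperforms imp's row-by-row inserts for LOB data.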
Thanks for your reply. -
Global Temp table with BLOB causing session crash
Hi,
i have a table as follows:
create global temporary table spg_file_import (
name varchar2 (128) constraint sfi_nam_ck not null,
mime_type varchar2 (128),
doc_size number,
dad_charset varchar2 (128),
last_updated date,
content_type varchar2 (128),
content long raw,
blob_content blob
) on commit delete rows;
This is my 9iAS 'document' table that is used to receive uploaded files, which I modified to be global temporary.
What I want to do is:
a) upload a text file (xml)
b) convert that file to a CLOB
c) store that file in a new permanent table as a CLOB
d) commit (hence deleting the temp table rows, as they are no longer necessary)
To test it I have:
CREATE OR REPLACE procedure daz_html
as
begin
htp.p(' <FORM enctype="multipart/form-data" action="daz_fu" method="POST">');
htp.p(' <p>');
htp.p(' File to upload: <INPUT type="file" name="p_file_in"><br>');
htp.p(' <p><INPUT type="submit">');
htp.p(' </form>');
htp.p('</body>');
htp.p('</html>');
end;
CREATE OR REPLACE procedure daz_fu (
p_file_in varchar2
) as
-- BLOB Stream locator
v_raw blob;
v_clob clob;
v_blob_length number;
v_length number;
v_buffer varchar2(32767);
v_pos number := 1;
begin
-- Get xml document from transient 9iAs data store.
select blob_content
into v_raw
from spg_file_import
where name = p_file_in;
-- create temp LOB
dbms_lob.createtemporary(v_clob, false);
-- get BLOB length
v_blob_length := dbms_lob.getlength(v_raw);
loop
-- get length to read. this is set as a max length of 32767
v_length := least((v_blob_length - (v_pos-1)),32767);
-- assign BLOB to a varchar2 buffer in preparation to convert to CLOB
v_buffer := utl_raw.cast_to_varchar2(dbms_lob.substr(v_raw, v_length, v_pos));
-- now write out to the CLOB
dbms_lob.writeappend(v_clob, v_length, v_buffer);
-- increment our position.
v_pos := v_pos + v_length;
-- exit when we are done.
exit when v_pos >= v_blob_length;
end loop;
commit;
htp.p('commit done!');
end;
Now if I upload a small text file (about 5 KB) it works with no problem.
However, if I upload a large text file (say about 1 MB) it crashes Oracle with:
Fri, 26 Jul 2002 11:49:24 GMT
ORA-03113: end-of-file on communication channel
DAD name: spgd1
PROCEDURE : daz_fu
USER : spg
URL : http://www.bracknell.bt.co.uk/pls/spgd1/daz_fu
PARAMETERS :
============
p_file_in:
F22210/Document.txt
This produces a large .trc file. The trace file indicates the crash occurred on the "commit;" line.
Current RBA:[0x4eb0.117.10]
*** 2002-07-26 12:35:11.857
ksedmp: internal or fatal error
ORA-00600: internal error code, arguments: [kcblibr_user_found], [4294967295], [2], [12583564], [65], [], [], []
Current SQL statement for this session:
declare
rc__ number;
begin
owa.init_cgi_env(:n__,:nm__,:v__);
htp.HTBUF_LEN := 255;
null;
daz_fu(p_ref_in=>:p_ref_in,p_type_in=>:p_type_in,p_file_in=>:p_file_in);
if (wpg_docload.is_file_download) then
rc__ := 1;
wpg_docload.get_download_file(:doc_info);
null;
commit;
else
rc__ := 0; null;
commit;
owa.get_page(:data__,:ndata__);
end if;
:rc__ := rc__;
end;
----- PL/SQL Call Stack -----
object line object
handle number name
812b1998 42 procedure SPG.DAZ_FU
819dff90 7 anonymous block
----- Call Stack Trace -----
If I replace the temporary table with a non-temporary table of the same structure, I get no problems whatsoever. Am I doing something I shouldn't be with global temporary tables?
Thanks.
This is on Oracle 8.1.7.2
-
"Error: Document not found (WWC-46000)" with a form based on a table with blob item
Hi all,
I have to manage a table-based form. One of the fields of the table is a BLOB, and the corresponding field of the form is HIDDEN.
When I push the INSERT button all works well, but when I try to UPDATE I get "Error: Document not found. (WWC-46000)".
Have you some suggestions ?
Thanks,
Antonino
p.s. Oracle Portal 3.0.7.6.2 / NT
Sorry, I think I did not explain my problem well.
Imagine this simple table:
key number;
description varchar2(50);
image blob;
I need to make a form that contains the corresponding "key" field and the "description" field but not the "image" one. I don't want to allow the end user to upload images!
When I insert a row the form works well, and an empty BLOB or NULL (now I don't remember which) is stored in the "image" field of the table: that's OK.
Now imagine I want to change the value of the "description" field. I submit a query for the right key, I type a new value for the description, and finally I push the UPDATE button but... an error occurs.
I think this error is related with the Blob item of the table but I'm not sure.
Thanks again,
Antonino
Quote, originally posted by Dmitry Nonkin:
Antonino,
If I understood the problem correctly:
The form's item type for a BLOB column used to upload content cannot be hidden; that's why we have a special item type, FileUpload (binary).
Thanks,
Dmitry
-
Export and Import of Tables with BLOB and RAW Datatypes
Hi Gurus,
I had to export one schema from one database and import it into another schema in another database.
However, the database contains RAW and BLOB datatypes. I exported the whole database with the following command:
exp SYSTEM/manager FULL=y FILE=jbrms_full_19APR2013.dmp log=jbrms_full_19APR2013.log GRANTS=y ROWS=y
My question is whether all the tables with RAW and BLOB columns have been exported properly. One more thing: after taking the export, I imported it into a local DB and checked that the row counts in both environments are the same. I have not tested with the application to confirm.
I am using this version of Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - Production
I am not able to attach the complete log file, but here is the part for the schema JBRMS, which has the BLOB and RAW datatypes.
Please let me know if you see any potential concerns with the export of BLOB and RAW columns.
. about to export JBRMS's tables via Conventional Path ...
. . exporting table FS_FSENTRY 8 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table FS_WS_DEFAULT_FSENTRY 2 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table PM_WS_DEFAULT_BINVAL 60 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table PM_WS_DEFAULT_BUNDLE 751 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table PM_WS_DEFAULT_NAMES 2 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table PM_WS_DEFAULT_REFS 4 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table VERSIONING_FS_FSENTRY 1 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table VERSIONING_PM_BINVAL 300 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table VERSIONING_PM_BUNDLE 11654 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table VERSIONING_PM_NAMES 2 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table VERSIONING_PM_REFS 1370 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
You could see the 'QUESTIONABLE STATISTICS' warning for a couple of reasons. I don't remember them all, but:
1. The target and source character sets are different.
2. System-generated names (I think?).
The best solution, if you don't need the exact statistics that are on your source database, would be to add
statistics=none
to your imp command and then regather statistics when the imp command is done.
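A hedged sketch of that approach (connect strings, file names and the schema name are taken from the thread but are effectively placeholders):

```shell
# Import while discarding the exported statistics.
imp SYSTEM/manager FULL=y FILE=jbrms_full_19APR2013.dmp \
    statistics=none log=jbrms_imp.log

# Regather statistics for the schema afterwards.
sqlplus -s / as sysdba <<'EOF'
begin
  dbms_stats.gather_schema_stats(ownname => 'JBRMS', cascade => true);
end;
/
EOF
```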
Dean -
Hi
I have a problem importing a dump that contains tables with 0 rows.
When I exported from Oracle 11.2 64-bit on Server 2008, I noticed that the log did not list the tables with 0 rows.
When I import into Oracle 11.2 64-bit on another Server 2008, I get a lot of errors on these tables with 0 rows.
In the log I see the same tables with at least 1 row, but none with 0 rows.
I opened my dump in TextPad and I can see it contains the "CREATE ..." statements for these tables.
I don't understand why this happens. I used a full dump by SYS; it didn't help.
This is not the first time I have exported and imported dumps; usually there are no errors.
I'm using the "EXP" and "IMP" commands, and every time it's OK (if that's relevant).
Why does this happen? Any solutions for this issue?
Thanks
I've found (I guess so) a solution to this issue.
Here are two links about this new feature, which is called deferred segment creation.
The reason for this behavior is the 11.2 new feature 'deferred segment creation': the creation of a table segment is deferred until the first row is inserted.
As a result, empty tables are not listed in dba_segments and are not exported by the exp utility.
http://www.nativeread.com/2010/04/09/11gr2-empty-tables-skipped-by-export-deferred-segment-creation/
http://antognini.ch/2010/10/deferred-segment-creation-as-of-11-2-0-2/
And this is what I found in the official Oracle documentation:
Beginning in Oracle Database 11g Release 2, when creating a non-partitioned heap-organized table in a locally managed tablespace, table segment creation is deferred until the first row is inserted. In addition, creation of segments is deferred for any LOB columns of the table, any indexes created implicitly as part of table creation, and any indexes subsequently explicitly created on the table.
The advantages of this space allocation method are the following: A significant amount of disk space can be saved for applications that create hundreds or thousands of tables upon installation, many of which might never be populated. Application installation time is reduced. There is a small performance penalty when the first row is inserted, because the new segment must be created at that time.
To enable deferred segment creation, compatibility must be set to '11.2.0' or higher. You can disable deferred segment creation by setting the initialization parameter DEFERRED_SEGMENT_CREATION to FALSE. The new clauses SEGMENT CREATION DEFERRED and SEGMENT CREATION IMMEDIATE are available for the CREATE TABLE statement. These clauses override the setting of the DEFERRED_SEGMENT_CREATION initialization parameter.
Note that when you create a table with deferred segment creation (the default), the new table appears in the *_TABLES views, but no entry for it appears in the *_SEGMENTS views until you insert the first row. There is a new SEGMENT_CREATED column in *_TABLES, *_INDEXES, and *_LOBS that can be used to verify deferred segment creation.
Note:
The original Export utility does not export any table that was created with deferred segment creation and has not had a segment created for it. The most common way for a segment to be created is to store a row into the table, though other operations such as ALTER TABLE ALLOCATE EXTENTS will also create a segment. If a segment does exist for the table and the table is exported, the SEGMENT CREATION DEFERRED clause will not be included in the CREATE TABLE statement that is executed by the original Import utility. -
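A hedged sketch of the workaround the note above implies: force a segment for every empty table before running exp, so the tables make it into the dump (credentials are placeholders; SEGMENT_CREATED exists in *_TABLES from 11.2 on).

```shell
sqlplus -s appuser/pwd <<'EOF'
set heading off feedback off pagesize 0
spool alloc_extents.sql
-- Generate one ALTER ... ALLOCATE EXTENT per segment-less table
select 'alter table "' || table_name || '" allocate extent;'
from   user_tables
where  segment_created = 'NO';
spool off
@alloc_extents.sql
EOF
```

The cleaner alternatives are to set DEFERRED_SEGMENT_CREATION=FALSE before creating the tables, or simply to use expdp/impdp, which handle segment-less tables correctly.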
Fireworks Imports - Replacing Tables with Divs
Hi folks,
I love Fireworks to bits, but its integration with DW is slowly driving me nuts.
Take a look at the code for www.total-formula1.com
I've imported a nav section, top bar and footer from Fireworks, which (IMHO) look great until you look at the code.
Then it becomes a mess of tables and spacer GIFs.
I've tried importing as CSS with images, but it loses behaviours.
Am I kidding myself? Do I have to split it all up and use DW to perform the rollovers?
Thoughts, help or a loaded shotgun welcome.
Fireworks CSS and Images export does not support JavaScript. Same with States, if I recall: CSS and Images export will not export anything but the current state. You would have to add that code (CSS or JS) in by using Dreamweaver or another editor.
If properly set up, you may be able to export the entire page as CSS and images, then only have to add in the CSS to create the mouseover effect, after they've been exported as graphics.
HTH
Jim Babbage