XML DB: As XML file size doubles validate time more than doubles
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
We have a default XML DB install. I am using schemaValidate() to successfully validate XML files against a registered XSD. For some reason, as the file size doubles, the time to validate the file grows at a faster rate than double. We wanted to generate approximately 18MB files, since they hold 5000 records, but as you can see, it appears I would be better off generating six 3MB files instead. What init.ora or xdbconfig.xml parameter might I change to help with this behavior? Any other ideas on why this is happening? Is this an example of "doesn't scale"?
1.5MB 3 Seconds
3MB 15 Seconds
6MB 1 Minute
12MB 6 Minutes
18MB 16 Minutes
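The measurements above can be summarized with an effective growth exponent, assuming time ~ size**k between consecutive points. The timings are as quoted; the conclusion that validation here behaves roughly quadratically is my inference from these five data points, not something the thread establishes:

```python
import math

# (file size in MB, validate time in seconds) from the measurements above
measurements = [(1.5, 3), (3, 15), (6, 60), (12, 360), (18, 960)]

# Effective exponent k between consecutive points, assuming time ~ size**k
exponents = []
for (s1, t1), (s2, t2) in zip(measurements, measurements[1:]):
    exponents.append(math.log(t2 / t1) / math.log(s2 / s1))

# Every pairwise exponent lands between ~2.0 and ~2.6: clearly super-linear,
# which is consistent with "better off generating six 3MB files instead".
```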
The code is simple:
procedure validate_xml_file (p_dir in varchar2, p_file in varchar2) is
  v_xmldoc xmltype;
begin
  select XMLTYPE(bfilename(p_dir, p_file), NLS_CHARSET_ID('AL32UTF8'))
    into v_xmldoc
    from dual;
  v_xmldoc.schemaValidate();
end validate_xml_file;
If I take the clause off the end of the cview table, this procedure does not work at all; it fails at the schemaValidate call:
Error report:
ORA-19030: Method invalid for non-schema based XML Documents.
ORA-06512: at "SYS.XMLTYPE", line 345
ORA-06512: at line 26
19030. 00000 - "Method invalid for non-schema based XML Documents."
*Cause: The method can be invoked on only schema based xmltype objects.
*Action: Don't invoke the method for non schema based xmltype objects.
Similar Messages
-
Hi, All:
I am wondering why the XML file size changed after a round trip through the DB as a CLOB. The original size is 1KB, but after I use the PUT and GET methods, the size changed to 8KB. There is a lot of whitespace appended to the content.
Thanks a lot!
Can you provide an example of the code you are using for the put/get method? In SQL*Plus I do not see what you are talking about:
SQL> create or replace directory xmltemp as 'c:\temp'
2 /
SQL> drop table xmltest
2 /
SQL> host dir c:\temp\testcase.xml
Volume in drive C has no label.
Volume Serial Number is 8CC2-E429
Directory of c:\temp
02/01/2006 04:48 AM 1,174 testcase.xml
1 File(s) 1,174 bytes
0 Dir(s) 23,780,163,584 bytes free
SQL> --
SQL> host type c:\temp\testcase.xml
<PurchaseOrder xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="http://xfiles:8080/home/SCOTT/poSource/xsd/purchaseOrder.xsd">
<Reference>EABEL-20030409123336251PDT</Reference>
<Actions>
<Action>
<User>EZLOTKEY</User>
</Action>
</Actions>
<Reject/>
<Requestor>Ellen S. Abel</Requestor>
<User>EABEL</User>
<CostCenter>R20</CostCenter>
<ShippingInstructions>
<name>Ellen S. Abel</name>
<address>300 Oracle Parkway
Redwood Shores
CA
94065
USA</address>
<telephone>650 506 7300</telephone>
</ShippingInstructions>
<SpecialInstructions>Counter to Counter</SpecialInstructions>
<LineItems>
<LineItem ItemNumber="1">
<Description>Samurai 2: Duel at Ichijoji Temple</Description>
<Part Id="37429125526" UnitPrice="29.95" Quantity="3"/>
</LineItem>
<LineItem ItemNumber="2">
<Description>The Red Shoes</Description>
<Part Id="37429128220" UnitPrice="39.95" Quantity="4"/>
</LineItem>
<LineItem ItemNumber="3">
<Description>A Night to Remember</Description>
<Part Id="715515009058" UnitPrice="39.95" Quantity="1"/>
</LineItem>
</LineItems>
</PurchaseOrder>
SQL> --
SQL> create table xmltest of xmltype
2 /
SQL> insert into xmltest values (xmltype(bfilename('XMLTEMP','testcase.xml'),nls_charset_id('AL32UTF8'),null,1,1))
2 /
SQL> commit
2 /
SQL> select dbms_lob.getLength(value(x).getClobVal())
2 from xmltest x
3 /
1193
SQL> set long 100000
SQL> set echo off
SQL> set pages 0
SQL> set lines 150
SQL> set heading off
SQL> set feedback off
SQL> set trimspool on
SQL> spool c:\temp\testcase.xml.out
SQL> --
SQL> select object_value
2 from xmltest
3 /
<PurchaseOrder xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="http://xfiles:8080/home/SCOTT/poSource/xsd/purchaseOrder.xsd">
<Reference>EABEL-20030409123336251PDT</Reference>
<Actions>
<Action>
<User>EZLOTKEY</User>
</Action>
</Actions>
<Reject/>
<Requestor>Ellen S. Abel</Requestor>
<User>EABEL</User>
<CostCenter>R20</CostCenter>
<ShippingInstructions>
<name>Ellen S. Abel</name>
<address>300 Oracle Parkway
Redwood Shores
CA
94065
USA</address>
<telephone>650 506 7300</telephone>
</ShippingInstructions>
<SpecialInstructions>Counter to Counter</SpecialInstructions>
<LineItems>
<LineItem ItemNumber="1">
<Description>Samurai 2: Duel at Ichijoji Temple</Description>
<Part Id="37429125526" UnitPrice="29.95" Quantity="3"/>
</LineItem>
<LineItem ItemNumber="2">
<Description>The Red Shoes</Description>
<Part Id="37429128220" UnitPrice="39.95" Quantity="4"/>
</LineItem>
<LineItem ItemNumber="3">
<Description>A Night to Remember</Description>
<Part Id="715515009058" UnitPrice="39.95" Quantity="1"/>
</LineItem>
</LineItems>
</PurchaseOrder>
SQL> spool off
SQL> --
SQL> set echo on
SQL> host dir c:\temp\testcase.xml.out
Volume in drive C has no label.
Volume Serial Number is 8CC2-E429
Directory of c:\temp
06/11/2006 01:53 PM 1,313 testcase.xml.out
1 File(s) 1,313 bytes
0 Dir(s) 23,780,163,584 bytes free
SQL> --
SQL> host type c:\temp\testcase.xml.out
SQL> --
SQL> select object_value
2 from xmltest
3 /
<PurchaseOrder xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="http://xfiles:8080/home/SCOTT/poSource/xsd/purchaseOrder.xsd">
<Reference>EABEL-20030409123336251PDT</Reference>
<Actions>
<Action>
<User>EZLOTKEY</User>
</Action>
</Actions>
<Reject/>
<Requestor>Ellen S. Abel</Requestor>
<User>EABEL</User>
<CostCenter>R20</CostCenter>
<ShippingInstructions>
<name>Ellen S. Abel</name>
<address>300 Oracle Parkway
Redwood Shores
CA
94065
USA</address>
<telephone>650 506 7300</telephone>
</ShippingInstructions>
<SpecialInstructions>Counter to Counter</SpecialInstructions>
<LineItems>
<LineItem ItemNumber="1">
<Description>Samurai 2: Duel at Ichijoji Temple</Description>
<Part Id="37429125526" UnitPrice="29.95" Quantity="3"/>
</LineItem>
<LineItem ItemNumber="2">
<Description>The Red Shoes</Description>
<Part Id="37429128220" UnitPrice="39.95" Quantity="4"/>
</LineItem>
<LineItem ItemNumber="3">
<Description>A Night to Remember</Description>
<Part Id="715515009058" UnitPrice="39.95" Quantity="1"/>
</LineItem>
</LineItems>
</PurchaseOrder>
SQL> spool off
SQL> --
SQL>
SQL>
Also, if I do a WebDAV or FTP put/get, I do not see a problem either:
C:\TEMP>ftp
ftp> open localhost 2100
Connected to mdrake-lap.
220- mdrake-lap
Unauthorised use of this FTP server is prohibited and may be subject to civil and criminal prosecution
220 mdrake-lap FTP Server (Oracle XML DB/Oracle Database) ready.
User (mdrake-lap:(none)): scott
331 pass required for SCOTT
Password:
230 SCOTT logged in
ftp> cd /public/testdir
250 CWD Command successful
ftp> rm testcase.xml
550 /public/testdir/testcase.xml : Not a directory.
ftp> del testcase.xml
250 DELE Command successful
ftp> put testcase.xml
200 PORT Command successful
150 ASCII Data Connection
226 ASCII Transfer Complete
ftp: 1174 bytes sent in 0.02Seconds 73.38Kbytes/sec.
ftp> ls -l
200 PORT Command successful
150 ASCII Data Connection
-rw-r--r-- 1 SCOTT oracle 1174 JUN 11 11:01 testcase.xml
226 ASCII Transfer Complete
ftp: 68 bytes received in 0.01Seconds 4.53Kbytes/sec.
ftp> get testcase.xml testcase.xml.out
200 PORT Command successful
150 ASCII Data Connection
226 ASCII Transfer Complete
ftp: 1174 bytes received in 0.00Seconds 1174000.00Kbytes/sec.
ftp> !dir testcase.xml
Volume in drive C has no label.
Volume Serial Number is 8CC2-E429
Directory of C:\TEMP
02/01/2006 04:48 AM 1,174 testcase.xml
1 File(s) 1,174 bytes
0 Dir(s) 23,780,032,512 bytes free
ftp> !dir testcase.xml.out
Volume in drive C has no label.
Volume Serial Number is 8CC2-E429
Directory of C:\TEMP
06/11/2006 02:01 PM 1,174 testcase.xml.out
1 File(s) 1,174 bytes
0 Dir(s) 23,780,032,512 bytes free
ftp> quit
221 QUIT Goodbye.
C:\TEMP> -
Load and Read XML file size more than 4GB
Hi All
My environment is Oracle 10.2.0.4 on Solaris, and I have PL/SQL processes that work with an XML file as detailed below:
1. I read the XML file over an HTTP port into an XMLTYPE column in a table.
2. I read the value from step 1 and extract it, to insert into another table.
On the test DB everything works, but I get the error below when I use the production XML file:
ORA-31186: Document contains too many nodes
The current XML size is about 100MB, but the procedure must support XML files larger than 4GB in the future.
Below are some parts of my code, for your info.
1. Read the XML in chunks into a variable and insert it into the table:
BEGIN
  LOOP
    UTL_HTTP.read_text(http_resp, v_resptext, 32767);
    DBMS_LOB.writeappend(v_clob, LENGTH(v_resptext), v_resptext);
  END LOOP;
EXCEPTION
  WHEN UTL_HTTP.end_of_body THEN  -- raised when the response is exhausted
    UTL_HTTP.end_response(http_resp);
END;
INSERT INTO XMLTAB VALUES (XMLTYPE(v_clob));
2. Read cell values from the XML column and extract them, to insert into another table:
DECLARE
  CURSOR c_xml IS
    SELECT TRIM(y.cvalue) AS cvalue
      FROM XMLTAB xt,
           XMLTable('/Table/Rows/Cells/Cell' PASSING xt.XMLDoc
                    COLUMNS cvalue VARCHAR2(50) PATH '.') y;
BEGIN
  OPEN c_xml;
  LOOP
    FETCH c_xml INTO v_TempValue;
    EXIT WHEN c_xml%NOTFOUND;
    -- <generate insert statement into another table>
  END LOOP;
  CLOSE c_xml;
END;
And one more problem is performance: when the XML file is big, the first step, loading the XML content into the XMLTYPE column, is slow.
Could you please suggest any solution to read large XML files and improve performance?
Thank you in advance.
Hiko
See Mark Drake's (Product Manager Oracle XML DB, Oracle US) response in this old post: ORA-31167: 64k size limit for XML node
The "in a future release" reference means that this 64K-per-node boundary was lifted in 11g and onwards.
So first of all, if only for the performance improvements, I would strongly suggest upgrading to a database version which is supported by Oracle; see My Oracle Support. In short, Oracle 10.2.x was in extended support up to summer 2013, if I am not mistaken, and is currently not supported anymore.
If you are able to upgrade, please use the much, much more performant XMLType SecureFile binary XML storage option instead of the XMLType (BasicFile) CLOB storage option.
HTH -
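Outside the database, the too-many-nodes problem can also be sidestepped with a streaming parser that never materializes the whole document. A minimal Python sketch of the idea, mirroring the /Table/Rows/Cells/Cell path from the question (this is an illustration of the streaming technique, not a replacement for the in-database approach discussed above):

```python
import io
import xml.etree.ElementTree as ET

def stream_cell_values(xml_stream):
    """Yield Cell text values one at a time without building the full tree."""
    for event, elem in ET.iterparse(xml_stream, events=("end",)):
        if elem.tag == "Cell":
            yield (elem.text or "").strip()
            elem.clear()  # free the subtree we just consumed

# Tiny stand-in document; a multi-GB file would be streamed the same way.
doc = b"<Table><Rows><Cells><Cell> a </Cell><Cell>b</Cell></Cells></Rows></Table>"
values = list(stream_cell_values(io.BytesIO(doc)))
```

Because each consumed element is cleared, memory stays roughly constant regardless of document size.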
How to find the XML file size in the scenarios?
Hi All,
Recently I attended an interview at an MNC.
They asked some real-time questions, like:
1. How do you find the XML document size in a File-to-File scenario?
2. What is mass-assignment replication? etc.
Can anybody tell me the solution to these?
Best regards
Hari Prasad
If the input is a flat file, there is no exact way to calculate the size of the generated XML file, since it depends on many factors (the number of fields in the recordsets, the length of the field names, the number of records, etc.).
As a rule of thumb, people generally use XML file size = 2 x flat file size for sizing calculations. But again, that is just an estimate, not a precise calculation.
Regards,
Henrique. -
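Henrique's 2x rule of thumb, plus the factors he lists, can be turned into a small estimator. Both the factor and the per-field overhead model are assumptions for illustration, not an exact calculation:

```python
def estimate_xml_size(flat_file_bytes, factor=2.0):
    """Rule-of-thumb XML size estimate: ~2x the flat file size.
    The factor is a sizing heuristic, not a precise figure."""
    return int(flat_file_bytes * factor)

def tag_overhead(field_names, num_records):
    """Markup overhead per record: <name></name> adds 2*len(name) + 5 bytes
    per field (two tag names plus angle brackets and a slash), times the
    record count. A rough model of why field names and counts matter."""
    per_record = sum(2 * len(n) + 5 for n in field_names)
    return per_record * num_records
```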
Out of memory error because of XML file size
Hi,
Help me to solve this out-of-memory error: if the XML file size is increased, it does not display anything and shows this out-of-memory error.
Thanking you,
Regards,
Nirmalatha.N
You should avoid loading large XML files in your Flash Lite application. There is a limit on incoming data, and anything beyond that will give an error. My experience has been around 1000 characters in a single stream of incoming text.
A possible solution to your memory problem is to use a middle language like PHP or ASP to stream a single XML data file in parts to your Flash Lite application. This means you avoid loading the XML directly in Flash.
Mariam -
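Mariam's suggestion of streaming the file in parts from a middle layer can be sketched server-side. The 1000-byte limit below is taken from her observation and is only an assumption, not a documented Flash Lite constant:

```python
def chunk_payload(data: bytes, max_chunk: int = 1000):
    """Split a payload into pieces no larger than max_chunk bytes, so a
    middle layer (PHP, ASP, ...) can send the XML to the client in parts."""
    return [data[i:i + max_chunk] for i in range(0, len(data), max_chunk)]

payload = b"x" * 2500          # stand-in for a large XML document
chunks = chunk_payload(payload)  # the client reassembles these in order
```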
XML file error on second time execution.
Suppose I had an XML file, test1, which has the attributes depicted below. After creating a data server for the test1 XML file and doing a selective reverse, the external tables created are:
MAPPING, ENGINE, TRANSMISSION, TRIM, MARKET, LINE, BODY, SERIES.
That means tables are created for the attributes which are empty, or which don't have values, like BODY, ENGINE, etc.,
while the attributes which are not empty are contained as columns inside the MAPPING table; e.g. TOC_ID and MAPPING_ID are column names inside the MAPPINGS table.
<MAPPINGS>
<MAPPING_ID>481</MAPPING_ID>
<TOC_ID>9456</TOC_ID>
<MODEL_YEAR>1997</MODEL_YEAR>
<FAMILY>AB</FAMILY>
<ENGINE/>
<TRANSMISSION/>
<TRIM/>
<MARKET/>
<LINE/>
<BODY/>
<SERIES/>
<XT_ID>481</XT_ID>
<LEVEL>1</LEVEL>
</MAPPINGS>
Suppose on a second load of the test1 XML file I receive data in which the ENGINE column is not null:
<MAPPINGS>
<MAPPING_ID>481</MAPPING_ID>
<TOC_ID>9456</TOC_ID>
<MODEL_YEAR>1997</MODEL_YEAR>
<FAMILY>AB</FAMILY>
<ENGINE>1 </ENGINE>
<TRANSMISSION/>
<TRIM/>
<MARKET/>
<LINE/>
<BODY/>
<SERIES/>
<XT_ID>481</XT_ID>
<LEVEL>1</LEVEL>
</MAPPINGS>
Since the MAPPINGS table was already created without an ENGINE column inside it, on the second load I receive the error.
So it means that if, during the first load of an XML file, one attribute is null, while on the second load it is not null, a mismatch on table or column creation occurs.
Is there any solution?
Thanks
Neeraj singh
Edited by: neeraj_singh on Dec 18, 2008 10:43 PM
Hi,
Did you get a solution to this problem? I am having a 'sort of' similar issue.
I am using an Oracle table as my source and an XML model as the target.
While creating the XML server, instead of giving an XML file, I am giving a DTD file.
All the elements of the DTD file after reversing are "tables", and the attributes within those elements come out as "columns". After reversing the XML model,
ODI inserts some primary key and foreign key columns within elements. I think they come from the "element ref" tag in the DTD.
Now the problems are:
1. How do I populate these PKs and FKs? I tried setting a default value on them, but it's not working. Will it affect my XML file? Moreover, I am not populating any of the PKs and FKs from the source, and I am creating multiple interfaces to populate different elements of just one XML file, as I have a single Oracle table as the source.
2. After populating the XML model, where will my resultant XML file be? Is it the same location/file as the XML schema? (This XML schema was specified while creating the XML data server.)
3. Can you provide some documentation if you have any (other than the User's Guide)?
thanks
CJ -
I am trying to make a PDF of a file and I need the file size to be no bigger than 10MB. Even when I save it with the image quality at minimum, the file size is 10.9 MB. What is the best way to do this?
@Barbara – What purpose is that PDF for? Print? Web?
If web purpose, you could convert CMYK images to sRGB. That would reduce the file size as well.
A final resort to bring down file size is:
1. Print to PostScript
2. Distill to PDF
That would bring the file size down even more, by about 20% compared with the Acrobat Pro method, depending on the images and other contents you are using. If you like, you could send me a personal message so we could exchange mail addresses. I could test it for you. Just provide the high-res PDF without any downsampling and with transparency intact. Best provide a PDF/X-4.
I will place the PDF in InDesign, print to PostScript, distill to PDF.
Uwe -
LPX file size grows every time I save (with almost no changes) - WHY?
Hi all,
I am just now starting to move from Logic 9 to Logic Pro X. I first imported a very simple file from logic nine as a folder project and did not import anything except the file.
- Then I immediately saved it. It took more than 30 seconds on a Core i7 - why so long?
- Next I "saved a copy as" file 1a - file size 700k - nice and small.
- Then I loaded the first Logic X file back in, muted a few things, and saved again.
- I made some other very minor changes, like muting, and re-saved three or four times.
Here is what is strange –
Original file = 1 MB
The main LX file was 2 MB, and each subsequent save gained about 1 MB each time I hit save! So the final file grew to 6 MB after just a few saves!
I checked to see if there was some large edit history file I could purge – but it was empty because all I had done was change some mutes…
Q: What is going on with this large file size growing every time I hit save without any major changes? I would like to know what's going on.
It has to do with undo history, backups, and autosaves. You can see these if you right-click on the file and show package contents. Explore in there and you should see these growing in the Alternatives folder. I would imagine there is some kind of limit to how many of these there are, but I haven't looked too closely at it.
-
Getting adobe form file size at run time
hi,
ya, I need the Adobe form file size at run time. My requirement is to get the xstring and the file size of the Adobe form. The Adobe form is already in xstring format, but how do I get the file size?
Hi,
try the following:
DATA lv_file_content TYPE xstring.
DATA lv_file_size TYPE i.
lv_file_size = XSTRLEN( lv_file_content ).
This will give you the file size in Byte. For KB, just divide by 1024, and once more for MB.
Kind regards
Ole -
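Ole's byte-to-KB/MB conversion (divide by 1024, then once more) is the same in any language. A small sketch; the hex payload is a made-up stand-in for the form's xstring:

```python
def to_kb(size_bytes: int) -> float:
    """Bytes to KB: divide by 1024."""
    return size_bytes / 1024

def to_mb(size_bytes: int) -> float:
    """Bytes to MB: divide by 1024 once more."""
    return size_bytes / (1024 * 1024)

# Hypothetical binary content; len() plays the role of ABAP's XSTRLEN
payload = bytes.fromhex("255044462d312e37")
size_b = len(payload)  # size in bytes
```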
Causes of ORA-01200: actual file size of x is smaller than correct size n ?
Hello everyone
We are running Oracle 11.2.0.3 64-bit E/E on Oracle Linux 6.2 with UEK R2 on X64.
Using Grid and ASM 11.2.0.3 and OMF names.
The database files are all on SAN; the SAN vendor name is not disclosed here, to protect the innocent/guilty 8^)
I have a test database, MYDB (in NOARCHIVELOG mode), and after a normal server reboot (not a crash), the following error occurred on Oracle database startup:
srvctl start database -d MYDB
PRCR-1079 : Failed to start resource ora.mydb.db
CRS-5017: The resource action "ora.mydb.db start" encountered the following error:
ORA-01122: database file 1 failed verification check
ORA-01110: data file 1: '+ASMDATA/mydb/datafile/system.256.787848913'   <-- corrupt file on an ASM disk, the SYSTEM tablespace this time
ORA-01200: actual file size of 94720 is smaller than correct size of 98560 blocks   <-- the error message
The ASM disks are all up and disk groups are mounted OK. The ASM protection level is EXTERNAL.
My understanding is that the only proper recovery from the above error is to use RMAN RESTORE (database/file/tablespace, etc.), and then RMAN RECOVER when in ARCHIVELOG mode.
I do have RMAN disk backups, so I don't need to "patch" the database to recover.
This is not my question at this point in time.
My question is this: what are the most likely causes of such an error?
An Oracle Database bug? An OS bug? A disk driver error? A server hardware failure (bus, memory, etc.)? Or a SAN bug?
I expect that Oracle 11g R2 will always come up with the database "clean" if the server reboots, or if the server crashes (i.e. due to complete power failure), provided the actual storage is not physically damaged.
Our SAN vendor (no names!) says they are of the opinion that it's most likely Oracle database or Oracle Linux 6.x/UEK software bug, or probably Oracle ASM 11.2 bug.
We have opened a support call with Oracle.....
My personal experience dealing with similar database errors on recent releases of Oracle (9i R2, 10g R2, 11g R2), and also MS SQL 2005 and 2008 R2, suggests this kind of problem is most likely related to errors/bugs in storage/drivers/firmware/BIOS and the SAN, and not likely to be a database or O/S bug.
Perhaps you, good people on this forum, can share your experiences, as unbiased as you can?
Many thanks
I've seen ORA-01200 twice, I think, over the years; both times there were disk problems which led to write issues which caused file problems. You've reported no such issues on your side, though, so if that's actually true, I'm thinking bug.
-
ORA-01200: actual file size of 437759 is smaller than correct size of 437760
Hi,
I am getting the following unexpected errors while going to create the CONTROL file, after successful completion of an offline/online Oracle backup RESTORE (of the PRD system) on the Quality system. We are following the database-specific system copy method.
All the required pre- and post-restore activities were carried out. The same RESTORE activities were even performed with several different online/offline backups of the PRD system for this system copy. But the process is stuck at the control file creation step, with the same error seen again and again after every DB restore operation:
SQL> @/oracle/AEQ/saptrace/usertrace/CONTROL.SQL
ORACLE instance started.
Total System Global Area 4714397696 bytes
Fixed Size 2050336 bytes
Variable Size 2365589216 bytes
Database Buffers 2332033024 bytes
Redo Buffers 14725120 bytes
CREATE CONTROLFILE REUSE SET DATABASE "AEQ" RESETLOGS ARCHIVELOG
ERROR at line 1:
ORA-01503: CREATE CONTROLFILE failed
ORA-01200: actual file size of 437759 is smaller than correct size of 437760 blocks
ORA-01110: data file 4: '/oracle/AEQ/sapdata1/sr3_1/sr3.data1'
At OS level, the file size of sr3.data1 is found to be 3586129920 bytes (= 437760 * 8192 bytes).
host1:oraaeq 20> cd /oracle/AEQ/sapdata1/sr3_1
host1:oraaeq 21> ll
total 7004176
-rw-r--r-- 1 oraaeq dba 3586129920 May 11 02:26 sr3.data1
The above-mentioned error occurs for all 294 data files. The reported file size difference is only 1 block in each data file. The DB block size is 8192 bytes.
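The arithmetic in the report can be checked directly. My reading (an assumption, not stated in the thread) is that an Oracle datafile also carries a one-block OS header, so a file whose OS size is exactly blocks * block_size is one data block short from Oracle's point of view:

```python
BLOCK_SIZE = 8192            # DB block size from the post
EXPECTED_BLOCKS = 437760     # what CREATE CONTROLFILE expects
OS_FILE_BYTES = 3586129920   # what `ll` reports for sr3.data1

# The OS file holds exactly 437760 blocks...
blocks_in_file = OS_FILE_BYTES // BLOCK_SIZE

# ...but one of them is the datafile header, so Oracle sees one data
# block fewer than the control file expects -- hence ORA-01200.
data_blocks_seen = blocks_in_file - 1

# A file with the full 437760 data blocks plus the header would need:
required_bytes = (EXPECTED_BLOCKS + 1) * BLOCK_SIZE
```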
Environment: (for SAP QUALITY & PRD systems)
OS: HP_UX ia64 B.11.23
SAP System : SAP ECC 6.0
Database: Oracle 10.2.0.2.0
Your help for this reported issue will be highly appreciated.
Regards,
Bhavik G. Shroff
Hi,
Thanks for your response.
We have already tried everything you mentioned as suggestions in your last post.
We already tried to extend all 294 data files, as mentioned in that Oracle forum link.
It is not the recommended way to play with data files like that, as it can lead to other unnecessary errors.
We saw the following errors after successful creation of the control file by manually extending all those 294 files (it was around a 10-hour job):
Specify log: {<RET>=suggested | filename | AUTO | CANCEL}
auto
ORA-00332: archived log is too small - may be incompletely archived
ORA-00334: archived log: '/oracle/AEQ/oraarch/AEQarch1_268984_629943661.dbf'
Have you also tried restoring the init<SID>.ora file from PRD to the new system?
I think it has no relationship to control file generation; both systems have the same init files with their respective SIDs.
Did you find any other points in your further investigation?
I am thinking of performing a fresh SAP system installation with the same SID (AEQ), and then trying the database restore again with the last offline backup of the AEQ system.
Regards,
Bhavik G. Shroff -
Why does PDF file size increase each time I "save" tagging tasks?
Why does PDF file size increase each time I "save" tagging tasks?
Given:
1) I'm running Acrobat Pro 11.0.07 (this is most current version)
2) My file size starts at 750mb and ends up 15mb when finished tagging.
3) Only certain documents experience this increase, i.e., no visible pattern of document characteristics
4) The PDFs are not full of images... in fact, mostly <H1> <H2> <H3> <P> <Figure> alt text, etc.
5) Occurs with text PDF's and/or fillable forms (again, does not happen to all documents...only some)
6) File increase occurs incrementally as tagging tasks are performed; i.e., each new save increases file size a few megabytes.
7) Occurs whether I "save" or "save as"
8) Difficult to optimize these files without corruption
9) I'm running Mac OS 10.9.4 (this is the most current version)
Thank you so much for responding! I've been experimenting with SAVE AS vs. SAVE for the past few days, and you are correct. It's funny, because I've been tagging files for 2 years and never noticed this. Probably because I use both methods, depending on the tagging tasks; some are more complicated than others and should not be overwritten. In those cases I SAVE AS a new file.
I love this forum...thank you again! -
How can we know that the size of a dimension is more than the fact table?
How can we know that the size of a dimension is more than the fact table?
This was the question asked of me in an interview.
Hi Reddy,
This is the common way of finding the size of a cube, dimensions, or key figures (KF).
Each key figure occupies 10 bytes of memory.
Each characteristic occupies 6 bytes of memory.
In an InfoCube the maximum number of fields is 256, of which 233 are key figures, 16 are dimensions and 6 are special characteristics.
So the maximum capacity of a cube
= 233 (key figures) * 10 + 16 (characteristics) * 6 + 6 (sp. char) * 6 bytes per record
In general, InfoCube size should not exceed 100 GB of data.
Hope it answer your question.
Regards,
Varun -
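Varun's capacity formula works out to a fixed per-record width; a quick check of the arithmetic (the byte counts per field type are as he quotes them):

```python
# Back-of-envelope InfoCube record width, per the rule of thumb above
KEY_FIGURES, KF_BYTES = 233, 10      # 233 key figures at 10 bytes each
CHARACTERISTICS, CHAR_BYTES = 16, 6  # 16 dimension characteristics at 6 bytes
SPECIAL_CHARS, SP_BYTES = 6, 6       # 6 special characteristics at 6 bytes

max_bytes_per_record = (KEY_FIGURES * KF_BYTES
                        + CHARACTERISTICS * CHAR_BYTES
                        + SPECIAL_CHARS * SP_BYTES)
```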
XML file size increases GREATLY with ![CDATA[
I have several .xml files that provide info for a website of mine, and I was quite surprised to discover, while I was uploading them, that some of them were around 60-70kb, while others were 4kb or less. Knowing that the amount of content in all of them doesn't vary that much, I noticed that the biggest file sizes corresponded to the files using <![CDATA[ ]]>.
I included a sample file code below. It's in Portuguese, but I think it's still obvious that this file could not be more than a few kb, and Dreamweaver (CS3) saves it with a whopping 62kb!
I tried saving the exact same file in TextEdit and it resulted in a normal file size.
Has anyone encountered similar situations? I'm guessing this is some sort of bug, that Dreamweaver puts some extra content somewhere because of the CDATA. Is there any sort of reason or fix for this?
Thanks in advance.
OK... embarrassing moment. I just realized that DW CS3 is not guilty. In Leopard, in the file's Get Info panel, I changed the preferred application to open it, and the file's size changed according to the different applications. Reverting back to DW CS3, it still resulted in the 60-70kb size, but deleting the file and saving a new one with the same name solved the problem.
Sorry guys.
Maximum XML file size that can parsed with c++ XML parser
Hi!
What is the maximum file size that can be parsed using the XML parser (version 1) for C++ on Linux?
I'm getting an error (error no. 231) when I try to parse an XML file of 3MB on Red Hat Linux 6.1.
Regards
Anjana
Moving to the XML DB forum.