XML File size changed

Hi, all:
I am wondering why the XML file size changes after a round trip through the database as a CLOB. The original size is 1 KB, but after I use the PUT and GET methods the size changes to 8 KB. There is a lot of whitespace appended to the content.
Thanks a lot!

Can you provide an example of the code you are using for the put/get methods? In SQL*Plus I do not see the behavior you are describing:
SQL> create or replace directory xmltemp as 'c:\temp'
  2  /
SQL> drop table xmltest
  2  /
SQL> host dir c:\temp\testcase.xml
Volume in drive C has no label.
Volume Serial Number is 8CC2-E429
Directory of c:\temp
02/01/2006  04:48 AM             1,174 testcase.xml
               1 File(s)          1,174 bytes
               0 Dir(s)  23,780,163,584 bytes free
SQL> --
SQL> host type c:\temp\testcase.xml
<PurchaseOrder xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="http://xfiles:8080/home/SCOTT/poSource/x
sd/purchaseOrder.xsd">
        <Reference>EABEL-20030409123336251PDT</Reference>
        <Actions>
                <Action>
                        <User>EZLOTKEY</User>
                </Action>
        </Actions>
        <Reject/>
        <Requestor>Ellen S. Abel</Requestor>
        <User>EABEL</User>
        <CostCenter>R20</CostCenter>
        <ShippingInstructions>
                <name>Ellen S. Abel</name>
                <address>300 Oracle Parkway
Redwood Shores
CA
94065
USA</address>
                <telephone>650 506 7300</telephone>
        </ShippingInstructions>
        <SpecialInstructions>Counter to Counter</SpecialInstructions>
        <LineItems>
                <LineItem ItemNumber="1">
                        <Description>Samurai 2: Duel at Ichijoji Temple</Description>
                        <Part Id="37429125526" UnitPrice="29.95" Quantity="3"/>
                </LineItem>
                <LineItem ItemNumber="2">
                        <Description>The Red Shoes</Description>
                        <Part Id="37429128220" UnitPrice="39.95" Quantity="4"/>
                </LineItem>
                <LineItem ItemNumber="3">
                        <Description>A Night to Remember</Description>
                        <Part Id="715515009058" UnitPrice="39.95" Quantity="1"/>
                </LineItem>
        </LineItems>
</PurchaseOrder>
SQL> --
SQL> create table xmltest of xmltype
  2  /
SQL> insert into xmltest values (xmltype(bfilename('XMLTEMP','testcase.xml'),nls_charset_id('AL32UTF8'),null,1,1))
  2  /
SQL> commit
  2  /
SQL> select dbms_lob.getLength(value(x).getClobVal())
  2    from xmltest x
  3  /
                                     1193
SQL> set long 100000
SQL> set echo off
SQL> set pages 0
SQL> set lines 150
SQL> set heading off
SQL> set feedback off
SQL> set trimspool on
SQL> spool c:\temp\testcase.xml.out
SQL> --
SQL> select object_value
  2    from xmltest
  3  /
<PurchaseOrder xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="http://xfiles:8080/home/SCOTT/poSource/x
sd/purchas
eOrder.xsd">
  <Reference>EABEL-20030409123336251PDT</Reference>
  <Actions>
    <Action>
      <User>EZLOTKEY</User>
    </Action>
  </Actions>
  <Reject/>
  <Requestor>Ellen S. Abel</Requestor>
  <User>EABEL</User>
  <CostCenter>R20</CostCenter>
  <ShippingInstructions>
    <name>Ellen S. Abel</name>
    <address>300 Oracle Parkway
Redwood Shores
CA
94065
USA</address>
    <telephone>650 506 7300</telephone>
  </ShippingInstructions>
  <SpecialInstructions>Counter to Counter</SpecialInstructions>
  <LineItems>
    <LineItem ItemNumber="1">
      <Description>Samurai 2: Duel at Ichijoji Temple</Description>
      <Part Id="37429125526" UnitPrice="29.95" Quantity="3"/>
    </LineItem>
    <LineItem ItemNumber="2">
      <Description>The Red Shoes</Description>
      <Part Id="37429128220" UnitPrice="39.95" Quantity="4"/>
    </LineItem>
    <LineItem ItemNumber="3">
      <Description>A Night to Remember</Description>
      <Part Id="715515009058" UnitPrice="39.95" Quantity="1"/>
    </LineItem>
  </LineItems>
</PurchaseOrder>
SQL> spool off
SQL> --
SQL> set echo on
SQL> host dir c:\temp\testcase.xml.out
Volume in drive C has no label.
Volume Serial Number is 8CC2-E429
Directory of c:\temp
06/11/2006  01:53 PM             1,313 testcase.xml.out
               1 File(s)          1,313 bytes
               0 Dir(s)  23,780,163,584 bytes free
SQL> --
SQL> host type c:\temp\testcase.xml.out
SQL> --
SQL> select object_value
  2    from xmltest
  3  /
<PurchaseOrder xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="http://xfiles:8080/home/SCOTT/poSource/x
sd/purchas
eOrder.xsd">
  <Reference>EABEL-20030409123336251PDT</Reference>
  <Actions>
    <Action>
      <User>EZLOTKEY</User>
    </Action>
  </Actions>
  <Reject/>
  <Requestor>Ellen S. Abel</Requestor>
  <User>EABEL</User>
  <CostCenter>R20</CostCenter>
  <ShippingInstructions>
    <name>Ellen S. Abel</name>
    <address>300 Oracle Parkway
Redwood Shores
CA
94065
USA</address>
    <telephone>650 506 7300</telephone>
  </ShippingInstructions>
  <SpecialInstructions>Counter to Counter</SpecialInstructions>
  <LineItems>
    <LineItem ItemNumber="1">
      <Description>Samurai 2: Duel at Ichijoji Temple</Description>
      <Part Id="37429125526" UnitPrice="29.95" Quantity="3"/>
    </LineItem>
    <LineItem ItemNumber="2">
      <Description>The Red Shoes</Description>
      <Part Id="37429128220" UnitPrice="39.95" Quantity="4"/>
    </LineItem>
    <LineItem ItemNumber="3">
      <Description>A Night to Remember</Description>
      <Part Id="715515009058" UnitPrice="39.95" Quantity="1"/>
    </LineItem>
  </LineItems>
</PurchaseOrder>
SQL> spool off
SQL> --
SQL>
Also, if I do a WebDAV or FTP put/get, I do not see a problem either:
C:\TEMP>ftp
ftp> open localhost 2100
Connected to mdrake-lap.
220- mdrake-lap
Unauthorised use of this FTP server is prohibited and may be subject to civil and criminal prosecution
220 mdrake-lap FTP Server (Oracle XML DB/Oracle Database) ready.
User (mdrake-lap:(none)): scott
331 pass required for SCOTT
Password:
230 SCOTT logged in
ftp> cd /public/testdir
250 CWD Command successful
ftp> rm testcase.xml
550 /public/testdir/testcase.xml : Not a directory.
ftp> del testcase.xml
250 DELE Command successful
ftp> put testcase.xml
200 PORT Command successful
150 ASCII Data Connection
226 ASCII Transfer Complete
ftp: 1174 bytes sent in 0.02Seconds 73.38Kbytes/sec.
ftp> ls -l
200 PORT Command successful
150 ASCII Data Connection
-rw-r--r--   1 SCOTT    oracle      1174 JUN 11 11:01 testcase.xml
226 ASCII Transfer Complete
ftp: 68 bytes received in 0.01Seconds 4.53Kbytes/sec.
ftp> get testcase.xml testcase.xml.out
200 PORT Command successful
150 ASCII Data Connection
226 ASCII Transfer Complete
ftp: 1174 bytes received in 0.00Seconds 1174000.00Kbytes/sec.
ftp> !dir testcase.xml
Volume in drive C has no label.
Volume Serial Number is 8CC2-E429
Directory of C:\TEMP
02/01/2006  04:48 AM             1,174 testcase.xml
               1 File(s)          1,174 bytes
               0 Dir(s)  23,780,032,512 bytes free
ftp> !dir testcase.xml.out
Volume in drive C has no label.
Volume Serial Number is 8CC2-E429
Directory of C:\TEMP
06/11/2006  02:01 PM             1,174 testcase.xml.out
               1 File(s)          1,174 bytes
               0 Dir(s)  23,780,032,512 bytes free
ftp> quit
221 QUIT Goodbye.
C:\TEMP>
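The size growth the original poster describes usually comes from the serializer, not from the storage: if the document is pretty-printed (or padded out to the LOB chunk size) on the way out, whitespace is added even though the XML infoset is unchanged. A minimal Python sketch (illustrative only; this is not Oracle's serializer) shows how re-serializing the same document with indentation inflates the byte count:

```python
import xml.dom.minidom as minidom

# A compact one-line document, analogous to the testcase.xml content.
raw = '<PurchaseOrder><Reference>EABEL-20030409123336251PDT</Reference></PurchaseOrder>'

doc = minidom.parseString(raw)

# Pretty-printing re-serializes the same infoset with newlines and
# indentation, so the byte count grows although no data was added.
pretty = doc.toprettyxml(indent="        ")

print(len(raw), len(pretty))
```

The two documents are semantically identical; only the whitespace differs, which is why a byte-for-byte size comparison after a put/get round trip can mislead.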

Similar Messages

  • Load and Read XML file size more than 4GB

    Hi All
My environment is Oracle 10.2.0.4 on Solaris, and I have PL/SQL processes that work with XML files as detailed below:
1. I read an XML file over an HTTP port into an XMLTYPE column in a table.
2. I read the value from step 1 out of the table and extract it to insert into another table.
On the test DB everything works, but I get the error below when I use the production XML file:
     ORA-31186: Document contains too many nodes
The current XML is about 100 MB, but the procedure must support XML files larger than 4 GB in the future.
Below are some parts of my code for your info.
1. Read the XML in chunks into a CLOB and insert it into the table:
    LOOP
        UTL_HTTP.read_text(http_resp, v_resptext, 32767);
        DBMS_LOB.writeappend(v_clob, LENGTH(v_resptext), v_resptext);
    END LOOP;  -- exits via the end_of_body exception raised by read_text
    INSERT INTO XMLTAB VALUES (XMLTYPE(v_clob));
2. Read cell values from the XML column and extract them to insert into another table:
    DECLARE
        CURSOR c_xml IS
            SELECT trim(y.cvalue)
              FROM XMLTAB xt,
                   XMLTable('/Table/Rows/Cells/Cell' PASSING xt.XMLDoc
                            COLUMNS cvalue VARCHAR2(50) PATH '.') y;
    BEGIN
        OPEN c_xml;
        LOOP
            FETCH c_xml INTO v_TempValue;
            EXIT WHEN c_xml%NOTFOUND;
            <Generate insert statement into another table>
        END LOOP;
        CLOSE c_xml;
    END;
One more problem is performance: when the XML file is big, the first step of loading the XML content into the XMLTYPE column is slow.
Could you please suggest any solution for reading large XML files and improving performance?
Thank you in advance.
    Hiko      

    See Mark Drake's (Product Manager Oracle XMLDB, Oracle US) response in this old post: ORA-31167: 64k size limit for XML node
The "in a future release" reference means that this 64K-per-node boundary was lifted in 11g and onwards.
So first of all, if only for the performance improvements, I would strongly suggest upgrading to a database version that is supported by Oracle (see My Oracle Support). In short, Oracle 10.2.x was in extended support up to summer 2013, if I am not mistaken, and is currently not supported anymore.
If you are able to upgrade, please use the much, much better performing XMLType SecureFile Binary XML storage option instead of the XMLType (BasicFile) CLOB storage option.
    HTH
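The UTL_HTTP loop above reads the response in fixed-size chunks and appends each one to a CLOB. The same pattern, sketched in Python for illustration (the stream here is an in-memory stand-in for the HTTP response), keeps per-read memory bounded regardless of total document size:

```python
import io

def read_in_chunks(stream, chunk_size=32767):
    """Yield successive chunks from a stream, mirroring the
    UTL_HTTP.read_text / DBMS_LOB.writeappend loop above."""
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        yield chunk

# Stand-in for a large HTTP response body.
source = io.BytesIO(b"<Table>" + b"<Row/>" * 10000 + b"</Table>")

buf = bytearray()
for piece in read_in_chunks(source, chunk_size=4096):
    buf.extend(piece)

print(len(buf))
```

Only one chunk is held in the read buffer at a time; the accumulated target (here a bytearray, in the PL/SQL version a CLOB) is what grows.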

  • How to find the XML file size in the scenarios?

    Hi All,
Recently I attended an interview at an MNC.
They asked some real-world questions, like:
1. How do you find the XML document size in a File-to-File scenario?
2. What is mass-assignment replication? Etc.
Can anybody tell me the solutions for these?
Best regards,
Hari Prasad

If the input is a flat file, there is no exact way to calculate the size of the generated XML file, since it depends on many factors (number of fields in the record sets, length of the field names, number of records, etc.).
As a rule of thumb, people generally use XML file size = 2 × flat file size for sizing calculations. But again, that is just an estimate, not a precise calculation.
    Regards,
    Henrique.
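To see why the 2× figure is only a rule of thumb, here is a small Python sketch (field names and data invented for illustration) that converts a flat CSV into element-per-field XML and measures the actual expansion ratio:

```python
import csv
import io
from xml.etree.ElementTree import Element, SubElement, tostring

flat = "id,name,qty\n1,widget,3\n2,gadget,7\n"

root = Element("Rows")
for rec in csv.DictReader(io.StringIO(flat)):
    row = SubElement(root, "Row")
    for field, value in rec.items():
        # Each flat-file field becomes <field>value</field>, so long
        # field names inflate the output more than long values do.
        SubElement(row, field).text = value

xml_bytes = tostring(root)
print(len(flat), len(xml_bytes), round(len(xml_bytes) / len(flat), 2))
```

With these short records the ratio comes out well above 2, while wide records with long values would push it below 2; hence "estimate, not a precise calculation".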

  • Out of Memory Error bcoz of xml file size

    Hi,
Help me solve this out-of-memory error: if the XML file size is increased, nothing is displayed and this out-of-memory error appears instead.
Thank you.
    Regards
    Nirmalatha.N

You should avoid loading large XML files in your Flash
Lite application. There is a limit on incoming data, and anything
beyond that will give an error. In my experience the limit has been
around 1000 characters in a single stream of incoming text.
A possible solution to your memory problem is to use a
middle-tier language like PHP or ASP to stream a single XML data
file in parts to your Flash Lite application. This way you avoid
loading the XML directly in Flash.
    Mariam

  • XML DB: As XML file size doubles validate time more than doubles

Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
We have a default XML DB install. I am using schemaValidate() to successfully validate XML files against a registered XSD. For some reason, as the file size doubles, the time to validate the file grows at a faster rate than double. We wanted to generate approximately 18 MB files, since they hold 5000 records, but as you can see below, it appears I would be better off generating six 3 MB files instead. What init.ora or xdbconfig.xml parameter might I change to help with this behavior? Any other ideas about why this is happening? Is this an example of "doesn't scale"?
    1.5MB 3 Seconds
    3MB 15 Seconds
    6MB 1 Minute
    12MB 6 Minutes
    18MB 16 Minutes
The code is simple:
procedure validate_xml_file (p_dir in varchar2, p_file in varchar2) is
    v_xmldoc xmltype;
begin
    select xmltype(bfilename(p_dir, p_file), nls_charset_id('AL32UTF8'))
      into v_xmldoc
      from dual;
    v_xmldoc.schemaValidate();
end validate_xml_file;

If I take the clause off the end of the cview table, this procedure does not work at all; it fails at the schemaValidate call:
    Error report:
    ORA-19030: Method invalid for non-schema based XML Documents.
    ORA-06512: at "SYS.XMLTYPE", line 345
    ORA-06512: at line 26
    19030. 00000 - "Method invalid for non-schema based XML Documents."
    *Cause:    The method can be invoked on only schema based xmltype objects.
*Action:   Don't invoke the method for non schema based xmltype objects.
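Back on the scaling question: the timings posted above make the superlinear growth easy to quantify. A quick calculation (plain Python, using the numbers from the post) shows the time factor outpacing the size factor at every step:

```python
# Size/time pairs transcribed from the post (16 minutes = 960 seconds).
sizes_mb = [1.5, 3, 6, 12, 18]
seconds = [3, 15, 60, 360, 960]

# Linear scaling would mean doubling the size doubles the time;
# here the time factor exceeds the size factor at every step.
for (s1, t1), (s2, t2) in zip(zip(sizes_mb, seconds),
                              zip(sizes_mb[1:], seconds[1:])):
    print(f"{s1} MB -> {s2} MB: size x{s2 / s1:.1f}, time x{t2 / t1:.1f}")
```

If validation were linear in document size, each row would show the same factor on both sides; instead the time factor runs two to three times ahead of the size factor.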

  • XML file size increases GREATLY with ![CDATA[

    I have several .xml files that provide info for a website of
    mine and was quite surprised to discover, while I was uploading
    them, that some of them were around 60-70kb, while others were 4kb
    or less. Knowing that the amount of content in all of them doesn't
    vary that much, I noticed that the biggest file sizes corresponded
    to the files using <![CDATA[ ]]>.
    I included a sample file code below. It's in Portuguese, but
    I think it's still obvious that this file could not be more than a
    few kb, and Dreamweaver (CS3) saves it with a whopping 62kb!
    I tried saving the exact same file in Text Edit and it
    resulted in a normal file size.
    Has anyone encountered similar situations? I'm guessing this
    is some sort of bug, that Dreamweaver puts some extra content -
    somewhere - because of the CDATA. Is there any sort of reason or
    fix for this?
    Thanks in advance.

OK... embarrassing moment. I just realized that DW CS3 is not
guilty. In Leopard, in the file's Get Info panel, I changed the
preferred application for opening it, and the file's size changed
according to the different applications. Reverting back to DW CS3
still resulted in the 60-70 kb size, but deleting the file and
saving a new one with the same name solved the problem.
Sorry, guys.

  • After copying a file from NTFS to HFS volume, file size changed when viewing in Windows

    Hi guys,
I have a MacBook Air running Mavericks on an HFS partition and Windows 7 on a Boot Camp NTFS partition. I have some files that I want to read/write from/to both systems. Since OS X can't write NTFS and Windows can't write HFS either, and I don't want to use any third-party tools/drivers, I have adopted a "stupid" way: in OS X, I copy those files from NTFS to the HFS partition, make changes, then switch to Windows and sync them back to NTFS.
The problem is, after I copied a file from NTFS to HFS in OS X, it seemed OK. But when I switched to Windows, the copied file on the HFS partition had its size changed (bigger), although I hadn't made any changes to it in OS X yet. This happens to almost every file I copy, text and binary. For the text files, I tried opening them with EditPlus in Windows, and EditPlus reports the correct size in the status bar.
How could this happen?

I am not sure if this is what you're seeing, but...
The same unaltered file on two different volumes might use different amounts of disk space. This is because a disk is divided into "blocks", and a block (also historically known as a "sector") has a certain minimum size. So if disk 1 has a block size of 512 bytes and disk 2 has a block size of 1024 bytes, then a file containing just 10 bytes will use up twice as much space on disk 2 as on disk 1, even though it is the exact same file.
Beyond that, Macs can add additional information like Spotlight tags, labels, icons, etc., which makes a file bigger. If you are modifying a file, then presumably that also implies adding additional content, e.g. more text for a Word document, and this will make it bigger. Also, depending on how some programs are configured or designed, "deleting" text may only mark it as deleted and not really delete it. This can apply to older versions of Word, which had a "Fast Save" feature; newer versions have removed this and do a proper delete.
You would have to give more details, like what you are doing to the document, what kind of document it is, and what the two sizes are.
Finally, there is one other potential difference: some systems and manufacturers use 1024 as the unit for measuring file and disk sizes, and some use 1000. It is the same number of bytes in each case, but 1000 bytes in one case would exactly equal 1 KB, and in the other it would be about 0.9766 KB.
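The block-rounding effect described above is simple to compute. A small sketch (block sizes chosen for illustration):

```python
import math

def allocated_size(file_bytes, block_size):
    # A file occupies whole blocks, so its size on disk is rounded up
    # to the next multiple of the volume's block size.
    return math.ceil(file_bytes / block_size) * block_size

print(allocated_size(10, 512))     # the 10-byte file from the example above
print(allocated_size(10, 1024))    # same file, twice the on-disk footprint
print(allocated_size(1174, 4096))  # a ~1 KB file on a 4 KB-block volume
```

The logical size of the file never changes; only the "size on disk" figure depends on the volume's block size.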

  • Does web server bounce required if xml file is changes

Hi all,
A quick question: I have changed the query of a VO, and it doesn't have any row or VO impl class. Only the XML file was moved to the server. I can't see any data. Do I have to bounce the web server? If not, how can I test that my VO object is working fine?
Thanks in advance.
Hitesh

Please be more specific when you say "I have made some changes in the VO object, I can see those changes, and I have also created custom views, but I can't see values from that view".
You don't need to upload server.xml unless you have added a new VO to the AM.
--Shiv

  • Alac file size changes when added to iTunes

    Hello,
    I've converted some FLAC files to ALAC using dbPowerAmp.  The file bit rate remains unchanged.  However, when I add the ALAC files to iTunes, the bit rate shown in iTunes is reduced.
    Example - Alice Cooper, Hello Hurray
    FLAC Bit Rate - 4608 kbs
    Converted ALAC Bit Rate - 4608 kbs
    iTunes ALAC Bit Rate - 3012 kbs
    Any ideas what's going on?
    I'm using iTunes 10.7.0.21
    Thanks.

Nothing has been changed.
It sounds like iTunes and dbPowerAmp figure the bit rate differently:
dbPowerAmp uses the uncompressed file size, and iTunes figures the compressed file size.
Since the compressed file is smaller, the bit rate will have to be lower.
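The two figures are consistent with a single formula applied to different sizes. A sketch of the arithmetic (the byte counts below are invented to reproduce the posted rates for a hypothetical 3-minute track; they are not from the actual file):

```python
def bit_rate_kbps(size_bytes, duration_seconds):
    # bit rate = total bits / duration. Which "size" goes in determines
    # whether you get the uncompressed or the compressed figure.
    return size_bytes * 8 / duration_seconds / 1000

duration = 180                 # hypothetical 3-minute track
uncompressed = 103_680_000     # raw PCM bytes (assumed)
compressed = 67_770_000        # ALAC container bytes (assumed)

print(round(bit_rate_kbps(uncompressed, duration)))  # 4608, the dbPowerAmp-style figure
print(round(bit_rate_kbps(compressed, duration)))    # 3012, the iTunes-style figure
```

Same track, same formula; the only difference is which byte count each program plugs in.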

  • PDF File Size changes

I have just upgraded from CS3 to CS5. I set my documents up for print, and when saving to PDF with the small file size preset, the file sizes have increased dramatically. A typical previous size would have been about 600-700 KB; now the files are coming out at about 1.7 MB. I am getting complaints about this from clients. I have tried Save As Reduced File Size PDF, but it really doesn't make that much difference. Can anyone help?

Try creating your own job options file in Distiller. You can change the job options settings to reduce the file size as you see fit. You may want to make sure you subset fonts, or you can turn off font embedding entirely. You can also make sure your graphics are as small as you feel is appropriate.

  • Maximum XML file size that can parsed with c++ XML parser

Hi!
What is the maximum file size that can be parsed using the XML parser (version 1) for C++ on Linux?
I'm getting an error (error no. 231) when I try to parse an XML file of 3 MB on Red Hat Linux 6.1.
Regards,
Anjana

Moving to the XML DB forum.

  • Xml file automaticaly changes in jdevloper

Hi,
I am having a big problem: whatever I create (AM, VO, VL), its XML file changes after a particular time interval, so I am not able to modify the components, and they throw errors.
I want to know whether this is a bug in Oracle or whether I am doing something wrong.
Thanks

Sounds weird. Use a fresh copy of JDev and test.
--Shiv

  • Why do file sizes change when I use "Process Multiple Images" to add watermarks?

    Hi,
I'm using Elements 11. In order to add watermarks to many JPG pictures at once, I use the "Process Multiple Files" function.
I select a source folder and a destination folder, and add a three-digit serial number to each file. I do NOT tick the checkbox marked "Change picture size". After that I define the watermark I want printed on my pictures and hit OK. All the files in the source folder are processed and saved, with a new name, in the destination folder. Fair and square.
But: the file size of each new file is heavily reduced compared to the original. It goes from 10 MB down to 500 KB (in general). Why is that? Is there any way I can prevent it?
    Regards,
    /Mikael Lindgren

Or you can uncheck the convert files box to save them in their original format, or select one of the lossless formats: PSD, BMP, or TIF. What is most appropriate depends on what you intend to do with the watermarked versions.
    Cheers,
    Neale
    Insanity is hereditary, you get it from your children

  • Variable library xml file size

I'm working with a simple Illustrator (CS4) image that has four variables that are read in from an XML file. Currently I'm pulling in files of 278 KB (~100 data sets) with no issue. But I'm working with a developer to generate larger batch files of these, and I'm curious about size limitations.
Is there a spec somewhere that describes how large our XML files can be?
Does anyone have experience with this sort of functionality?
[I'm also aware that this is probably something I should just script instead of loading it into Illustrator, but that's a project for another day.]
    Many thanks,
    Katy.

XML size has no inherent limit; I don't think there is a spec for the permissible size of an XML file that can be imported. But if you are talking about a very large XML file, say 5-10 MB, you are inviting a rattlesnake to bite you on the forehead: I have seen such large XML files have problems opening even in Notepad itself. Better to be on the safe side and not overload your XML.

  • XML file size restriction

    Hi,
I am using the file adapter to read an XML file. Is there any restriction on the file size? Is this documented anywhere in the Oracle documentation?
Please advise.
    Regards,
    Sreejit

    Hi Sreejit,
Apart from what is documented in the 10.1.3.3 SOA Best Practices Guide: if you have a huge XML file with repeating nodes, you can process them in manageable chunks.
With the attachment feature, you can copy/move large files opaquely from one folder to another.
    Hope this helps!
    Cheers
    Anirudh Pucha
