Suggested size of XML files?

Hello,
I am currently building a catalog website where each page
will display from 1 to 5 products. I am using Spry with XML to
populate the products on the pages, and with that I am using the
master/detail method to display more information about each
product.
For each catalog page I have created an XML file to go along
with it, listing the products and their various features. With 200
product pages, there will be 200 XML files! I believe creating
these by hand is my best way to go; programming in PHP and using
databases would be slow going for me because I am not skilled at all
with those. Using conditionals and the products' unique SKU numbers,
can I create one monster XML file and load the products on each
page using conditionals?
Would a large XML file take too long to parse when checking
the conditionals? Would it be too big to load in an appropriate
time? Once it is loaded, will it be kept in the cache? Or maybe 200
XML files is the way to go when not using PHP...
Thank you,
Braxo

hi Braxo,
A couple points.
First, Spry should be able to handle the XML file. We have
tested files with 1000 records or more. It will take some time to
download, but once downloaded, it will be cached. Performance-wise,
it depends on the final size of the XML.
Second, with 200 XML files, it seems that in the time it
would take to make them, you could learn enough PHP to write one
script that would handle the whole thing. I think if you have a
database, use it. Managing those static files will be a total
headache.
You can do data set filtering to do what you want from one
big file.
Hope this helps.
Don

Similar Messages

  • Maximum size of XML files and number of IDocs for IDoc receiver adapter

    Hi Guys,
    We have an XML file to IDoc scenario where XI picks up an XML file with multiple Customer records in it, does a simple mapping, and creates one DEBMAS06 IDoc per record in the XML file. All IDocs are sent in a single file in XML-IDOC format to the IDoc adapter, which then posts the separate DEBMAS IDocs to R/3.
    a) What is the maximum size of XML files that XI can handle with mapping involved ?
    b) What is the maximum number of IDocs in a single file that the receiver IDoc adapter can handle ?
    The first time this interface runs almost 200,000 Customer records will be exported in one XML file.
    Thank you.

    Hi,
    Well, it is difficult to pin down the maximum size of XML messages that XI can process, or the maximum number of IDocs a receiver IDoc adapter can handle.
    This depends entirely on your production system load, and the limits can only be established by trial and error.
    In my heavily loaded production system, I found that the largest messages successfully processed by XI were around 75 MB (seen in transaction SXMB_MONI), whereas messages of around 100 MB went into error.
    I haven't encountered any such limits with respect to IDocs.
    I would suggest that you divide your data into smaller chunks and send it part by part instead of sending it all at once, since your data size is huge.
    You can vary your batch size as per your system load.
    Regards,
    - Deepak.
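    A rough Java/StAX sketch of that batching idea, splitting the export file before it is handed to XI (this is not SAP code; the element names Customers/Customer, the file names, and the batch size of 5,000 records are assumptions for illustration):
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import javax.xml.stream.XMLEventFactory;
    import javax.xml.stream.XMLEventReader;
    import javax.xml.stream.XMLEventWriter;
    import javax.xml.stream.XMLInputFactory;
    import javax.xml.stream.XMLOutputFactory;
    import javax.xml.stream.events.XMLEvent;

    public class CustomerFileSplitter {
        private static final int BATCH_SIZE = 5000;           // customers per output file (assumption)

        public static void main(String[] args) throws Exception {
            XMLEventReader in = XMLInputFactory.newInstance()
                    .createXMLEventReader(new FileInputStream("customers_full.xml"));
            XMLOutputFactory outFactory = XMLOutputFactory.newInstance();
            XMLEventFactory events = XMLEventFactory.newInstance();

            XMLEventWriter out = null;
            int recordsInFile = 0, fileNo = 0, customerDepth = 0;

            while (in.hasNext()) {
                XMLEvent ev = in.nextEvent();
                // A new <Customer> record starts.
                if (customerDepth == 0 && ev.isStartElement()
                        && "Customer".equals(ev.asStartElement().getName().getLocalPart())) {
                    if (out == null) {                         // open the next batch file
                        out = outFactory.createXMLEventWriter(
                                new FileOutputStream("customers_part" + (++fileNo) + ".xml"), "UTF-8");
                        out.add(events.createStartDocument("UTF-8"));
                        out.add(events.createStartElement("", "", "Customers"));
                    }
                    customerDepth = 1;
                    out.add(ev);
                    continue;
                }
                if (customerDepth > 0) {                       // copy the record's events verbatim
                    if (ev.isStartElement()) customerDepth++;
                    if (ev.isEndElement()) customerDepth--;
                    out.add(ev);
                    if (customerDepth == 0 && ++recordsInFile == BATCH_SIZE) {
                        out.add(events.createEndElement("", "", "Customers"));
                        out.add(events.createEndDocument());
                        out.close();
                        out = null;
                        recordsInFile = 0;
                    }
                }
            }
            if (out != null) {                                 // close the last, partially filled file
                out.add(events.createEndElement("", "", "Customers"));
                out.add(events.createEndDocument());
                out.close();
            }
            in.close();
        }
    }
    Each part file can then go through the existing file-to-IDoc scenario on its own, which keeps every message well below the sizes that caused errors above.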

  • Hi, extract data from xml file and insert into another existing xml file

    I am looking for code to extract data from an XML file and insert it into another existing XML file with a Java program. I understand it is easy to extract data from an XML file; however, rather than creating a new file, we want to insert the extracted data into another existing XML file. Suggestions?
    The first XML file, which has the two lines (text1.xml):
    <?xml version="1.0" encoding="iso-8859-1"?>
    <xs:PrintDataRequest xmlns:xs="http://com.unisys.com/Anid"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://com.unisys.com/Anid file:ANIDWS.xsd">
    <xs:Person>
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://com.unisys.com/Anid file:ANIDWS.xsd">
    These two lines have to be inserted into the other existing XML file (text2.xml), at lines 3 and 4.
    Regards,
    bubbly

    Jadz_Core wrote:
    > RandomAccessFile? If you know where you want to insert it.
    Are you sure about this? With that approach, the bytes you insert would have to exactly match the number of bytes you replace. I'm thinking that you'll likely have to stream through the second XML with a SAX parser and copy information (or insert new information) as you stream with an XML writer of some sort.
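    A minimal sketch of that streaming approach using StAX (the target element name "Person" and the inserted "ExtractedData" element are placeholders; the file names are the poster's):
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import javax.xml.stream.XMLEventFactory;
    import javax.xml.stream.XMLEventReader;
    import javax.xml.stream.XMLEventWriter;
    import javax.xml.stream.XMLInputFactory;
    import javax.xml.stream.XMLOutputFactory;
    import javax.xml.stream.events.XMLEvent;

    public class InsertIntoXml {
        public static void main(String[] args) throws Exception {
            XMLEventReader in = XMLInputFactory.newInstance()
                    .createXMLEventReader(new FileInputStream("text2.xml"));
            XMLEventWriter out = XMLOutputFactory.newInstance()
                    .createXMLEventWriter(new FileOutputStream("text2_merged.xml"), "UTF-8");
            XMLEventFactory events = XMLEventFactory.newInstance();

            while (in.hasNext()) {
                XMLEvent ev = in.nextEvent();
                out.add(ev);                                   // copy the original event
                if (ev.isStartElement()
                        && "Person".equals(ev.asStartElement().getName().getLocalPart())) {
                    // Insert new content right after the chosen opening tag.
                    out.add(events.createStartElement("", "", "ExtractedData"));
                    out.add(events.createCharacters("value taken from text1.xml"));
                    out.add(events.createEndElement("", "", "ExtractedData"));
                }
            }
            out.close();
            in.close();
        }
    }
    Because events are copied one by one into a new file, the output can be longer than the input, which is exactly what an in-place RandomAccessFile edit cannot give you.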

  • EMD upload error: upload XML Files skipped

    Hi,
    I have a problem with an agent uploading data to Grid Control. Any ideas about it?
    Thank you so much.

    Check the status of the agent with emctl status agent and refer to the troubleshooting section of the Installation and Basic Configuration manual, Part V, Appendix A.
    You are looking for things such as: Has the agent been registered in the repository? When was the last successful heartbeat to the OMS (is it talking to it)? Have any of the XML files been uploaded?
    If you have a secure OMS, then it may be as simple as just registering the target via emctl secure agent <password>
    Oracle Enterprise Manager 10g Release 10.2.0.0.0.
    Copyright (c) 1996, 2005 Oracle Corporation. All rights reserved.
    Agent Version : 10.2.0.0.0
    OMS Version : 10.2.0.0.0
    Protocol Version : 10.2.0.0.0
    Agent Home : /scratch/OracleHomes2/agent10g
    Agent binaries : /scratch/OracleHomes2/agent10g
    Agent Process ID : 9985
    Parent Process ID : 29893
    Agent URL : https://stadv21.us.oracle.com:1831/emd/main/
    Repository URL : https://stadv21.us.oracle.com:1159/em/upload
    Started at : 2005-09-25 21:31:00
    Started by user : tthakur
    Last Reload : 2005-09-25 21:31:00
    Last successful upload : (none)
    Last attempted upload : (none)
    Total Megabytes of XML files uploaded so far : 0.00
    Number of XML files pending upload : 2434
    Size of XML files pending upload(MB) : 21.31
    Available disk space on upload filesystem : 17.78%
    Last attempted heartbeat to OMS : 2005-09-26 02:40:40
    Last successful heartbeat to OMS : unknown
    Agent is Running and Ready
    A.6.1 Prerequisite Check Fails With Directories Not Empty Error During Retry
    Good Luck!
    Richard

  • Write XML CLOB 10 MB in size to an XML file

    Hi,
    I am trying to export a table into an XML file. Using various examples, I have learned that it is really easy to handle objects smaller than 32767 bytes (characters).
    The problem is working with big chunks. My XML CLOB is just over 10 MB in size. I can easily break it into smaller pieces, but then I risk writing chr(10) in the middle of an XML tag (which has happened). I have an idea for a way around the problem, but there are two issues:
    1. The DBMS_LOB.instr function returns 0 if the offset grows above approximately 1100.
    2. I tried this in order to avoid the limitation from item 1:
    for c in 1 .. dbms_lob.getlength(MyClob) loop
        MyAmount := 1;
        dbms_lob.read(MyClob, MyAmount, c, MyString);   -- read one character at offset c
        if MyString = chr(10) then
            utl_file.put_line(MyFile, MyLine);
            MyLine := '';
        else
            MyLine := MyLine || MyString;
        end if;
    end loop;
    This way I generate a perfect XML structure, but it takes about an hour of CPU time to create a 2.3 MB file. I tried to run it for a big one, and it had taken just over 7 hours to reach 10.2 MB when I shut it down.
    Does anybody have any suggestions?

    utl_file.put(...) will write the contents of the buffer without a newline
    utl_file.put_line will write the contents of the buffer plus a newline character. So, if you use utl_file.put_line(largebuffer) you will get a newline character after the buffer that may break tags.
    The following should produce the string:
    <tag>
    in a file:
    utl_file.put(fil, '<ta');
    utl_file.put(fil, 'g>');
    the following PL/SQL code will produce the string:
    <ta
    g>
    in a file:
    utl_file.put_line(fil, '<ta');
    utl_file.put_line(fil, 'g>');
    Are you saying that utl_file.put() is putting newlines in a file?
    This can happen because you are not setting max_linesize for the file to 32767 (or >= your buffer size) in FOPEN:
    UTL_FILE.FOPEN (
        location     IN VARCHAR2,
        filename     IN VARCHAR2,
        open_mode    IN VARCHAR2,
        max_linesize IN BINARY_INTEGER)   -- set this to 32767
    RETURN file_type;
    Larry

  • Illustrator - Maximum allowed size of variable library xml file loaded from variables palette?

    Hi-
    One question I can't seem to find the answer to anywhere: is there a maximum size of XML file one can use when making data-driven graphics in Illustrator?
    I can successfully batch when I load a single variable library with about 100 datasets at one time but anything much larger than that in the same xml format gives me a “the incoming variable library is invalid” error. Elsewhere on this discussion forum I have seen answers say there is no limit to the size of the XML file. (http://forums.adobe.com/thread/647934). But my experience with Illustrator CS5 says otherwise.
    I have thousands I need to create. Any help would be most appreciated.
    1. If there is a maximum size either in file size, number of datasets, or number of variables/values, please share what the limit is.
    2. I still need to create many thousands of the final documents. If the limit is close to 100 data sets, any suggestions on either scripting or other methods to get this done?
    Thanks,
    Michael

    Yay!! Now we know 1,000 is not the limit. Tested with 2 variables, 1122 data sets. Here is the tail end of the variable library that worked:
    <v:sampleDataSet  dataSetName="1121">
    <ZIPPER>
    <p>6” ZIPPER</p>
    </ZIPPER>
    <FLY>
    <p>1121</p>
    </FLY>
    </v:sampleDataSet>
    <v:sampleDataSet  dataSetName="1122">
    <ZIPPER>
    <p>6” ZIPPER</p>
    </ZIPPER>
    <FLY>
    <p>1122</p>
    </FLY>
    </v:sampleDataSet>
    </v:sampleDataSets>
    </variableSet>
    </variableSets>
    </svg>
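    If thousands of data sets are needed, generating the repeated <v:sampleDataSet> entries with a small program is easier than editing them by hand. A hedged Java sketch along those lines (it only emits the data-set fragment shown above; the surrounding header and footer of the library, the svg root, variableSets and namespace declarations, should be copied from a file exported by Illustrator, and the count of 5,000 is an arbitrary example):
    import java.io.FileWriter;
    import java.io.PrintWriter;

    public class DataSetGenerator {
        public static void main(String[] args) throws Exception {
            try (PrintWriter out = new PrintWriter(new FileWriter("datasets_fragment.xml"))) {
                for (int i = 1; i <= 5000; i++) {
                    // One <v:sampleDataSet> block per record, mirroring the fragment above.
                    out.println("<v:sampleDataSet  dataSetName=\"" + i + "\">");
                    out.println("<ZIPPER>");
                    out.println("<p>6” ZIPPER</p>");
                    out.println("</ZIPPER>");
                    out.println("<FLY>");
                    out.println("<p>" + i + "</p>");
                    out.println("</FLY>");
                    out.println("</v:sampleDataSet>");
                }
            }
        }
    }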

  • When parsing a large XML file, OutOfMemory exception (code attached)

    Dear all,
    I get an OutOfMemory exception when I parse an XML file (20 MB) in my application (I extract data from the XML file to build my application data). I also found that it takes a long time (more than 2 minutes) even just to search the XML file, and because my application is viewed through a web page, the wait is too long.
    What I use is org.jdom.input.SAXBuilder (JDOM 1.0beta8-dev), and my XML file structure is like:
    <errors>
    <item1>content</item1>
    <item2>content</item2>
    </errors>
    and this is my source code for parsing the XML file:
    import java.io.*;
    import java.text.*;
    import java.util.*;
    import org.jdom.*;
    import org.jdom.input.*;
    import org.jdom.output.*;
    public class XMLProperties {
        private File file;
        private Document doc;
        private Map propertyCache = new HashMap();

        public XMLProperties(String filename) {
            try {
                SAXBuilder builder = new SAXBuilder();
                // Strip formatting
                DataUnformatFilter format = new DataUnformatFilter();
                builder.setXMLFilter(format);
                long time_start = System.currentTimeMillis();
                Runtime run = Runtime.getRuntime();
                doc = builder.build(new File(filename));
                System.out.println("Build doc memory = "
                        + (run.totalMemory() - run.freeMemory()) / 1024 + "K");
                System.out.println("Build doc used time: "
                        + (System.currentTimeMillis() - time_start) / 1000 + " s");
            } catch (Exception e) {
                System.err.println("Error creating XML parser in "
                        + "PropertyManager.java");
                e.printStackTrace();
            }
        }

        public String[] getChildrenProperties(String parent) {
            // Search for this property by traversing down the XML hierarchy.
            Element element = doc.getRootElement();
            element = element.getChild(parent);
            if (element == null) {
                // This node doesn't match this part of the property name, which
                // indicates this property doesn't exist, so return an empty array.
                return new String[] { };
            }
            // We found a matching property; return the names of its children.
            List children = element.getChildren();
            int childCount = children.size();
            String[] childrenNames = new String[childCount];
            for (int i = 0; i < childCount; i++) {
                childrenNames[i] = ((Element) children.get(i)).getName();
            }
            return childrenNames;
        }
    }
    The test main class:
    import java.util.Map;
    import java.util.HashMap;
    import org.jdom.*;
    import org.jdom.input.*;
    public class MyTest {
        public static void main(String[] args) {
            long time_start = System.currentTimeMillis();
            Runtime run = Runtime.getRuntime();
            Map childs = new HashMap();
            System.out.println("Used memory before=" + (run.totalMemory() - run.freeMemory()) / 1024 + "K");
            XMLProperties parser = new XMLProperties("D:\\projects\\edr\\jsp\\status-data\\edr-status-2003-09-01.xml");
            String[] child = parser.getChildrenProperties("errors");
            for (int i = 0; i < child.length; i++) {
                // childs.put(new Integer(i), child);
                if (i % 1000 == 0) {
                    System.out.println("Used memory while=" + (run.totalMemory() - run.freeMemory()) / 1024 + "K");
                }
            }
            System.out.println("child.length=" + child.length);
            System.out.println("Used memory after=" + (run.totalMemory() - run.freeMemory()) / 1024 + "K");
            System.out.println("Time used: " + (System.currentTimeMillis() - time_start) / 1000 + "s");
        }
    }
    The result is : Used memory before=139K
    Used memory while=56963K
    Used memory after=51442K
    child.length=27343
    Time used: 146s
    Is there some way to solve this problem?
    Thanks for your help

    > I met an OutOfMemory exception when I parse an XML file (20M) in my application (I extract data from the XML file to build my application data)...
    Rule of thumb for parsing XML: the memory you need is about 10 times the size of the XML file. So in your case you need 200 MB of free memory.
    > ...but I found that it takes a long time (more than 2 minutes) even just searching the XML file, because my application is viewed through a web page...
    Then you need to redesign your application. Parsing 20 megabytes of XML for every request is, as you can see, impractical. Sorry I can't suggest how, since I have no idea what your application is.
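    As a hedged illustration of one possible redesign: if only the names of the direct children of <errors> are needed, the file can be read once with a streaming parser (StAX here) instead of building a JDOM tree, and the result cached rather than recomputed on every request. The file path is the poster's; whether the names alone are enough depends on the application.
    import java.io.FileInputStream;
    import java.util.ArrayList;
    import java.util.List;
    import javax.xml.stream.XMLInputFactory;
    import javax.xml.stream.XMLStreamConstants;
    import javax.xml.stream.XMLStreamReader;

    public class ErrorChildNames {
        public static void main(String[] args) throws Exception {
            XMLStreamReader r = XMLInputFactory.newInstance()
                    .createXMLStreamReader(new FileInputStream(
                            "D:\\projects\\edr\\jsp\\status-data\\edr-status-2003-09-01.xml"));
            List<String> names = new ArrayList<String>();
            int depth = 0;
            while (r.hasNext()) {
                int event = r.next();
                if (event == XMLStreamConstants.START_ELEMENT) {
                    depth++;
                    if (depth == 2) {
                        names.add(r.getLocalName());           // direct children of <errors>
                    }
                } else if (event == XMLStreamConstants.END_ELEMENT) {
                    depth--;
                }
            }
            r.close();
            System.out.println(names.size() + " children found");
        }
    }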

  • Load and Read XML file size more than 4GB

    Hi All
    My environment is Oracle 10.2.0.4 on Solaris, and I have PL/SQL processes that work with an XML file as detailed below.
    1. I read the XML file over an HTTP port into an XMLTYPE column in a table.
    2. I read the value from step 1 from the table and extract it to insert into another table.
    On the test DB everything works, but I get the error below when I use the production XML file:
         ORA-31186: Document contains too many nodes
    The current XML size is about 100 MB, but the procedure must support XML files of more than 4 GB in the future.
    Below are some parts of my code for your info.
    1. Read XML by line into variable and insert into table
    LOOP
    UTL_HTTP.read_text(http_resp, v_resptext, 32767);
    DBMS_LOB.writeappend (v_clob, LENGTH(v_resptext), v_resptext);
        END LOOP;
        INSERT INTO XMLTAB VALUES (XMLTYPE(v_clob));
    2. Read cell value from XML column and extract to insert into another table
    DECLARE
        CURSOR c_xml IS
            SELECT trim(y.cvalue)
            FROM XMLTAB xt,
                 XMLTable('/Table/Rows/Cells/Cell' PASSING xt.XMLDoc
                          COLUMNS cvalue VARCHAR2(50) PATH '/') y;
    BEGIN
        OPEN c_xml;
        LOOP
            FETCH c_xml INTO v_TempValue;
            EXIT WHEN c_xml%NOTFOUND;
            <Generate insert statement into another table>
        END LOOP;
        CLOSE c_xml;
    END;
    One more problem is a performance issue when the XML file is big: the first step, loading the XML content into the XMLTYPE column, is slow.
    Could you please suggest any solution to read large XML files and improve performance?
    Thank you in advance.
    Hiko

    See Mark Drake's (Product Manager Oracle XMLDB, Oracle US) response in this old post: ORA-31167: 64k size limit for XML node
    The "in a future release" reference, means that this boundary 64K / node issue, was lifted in 11g and onwards...
    So first of all, if not only due to performance improvements, I would strongly suggest to upgrade to a database version which is supported by Oracle, see My Oracle Support... In short Oracle 10.2.x was in extended support up to summer 2013, if I am not mistaken and is currently not supported anymore...
    If you are able to able to upgrade, please use the much, much more performing XMLType Securefile Binary XML storage option, instead of the XMLType (Basicfile) CLOB storage option.
    HTH

  • Hello Anybody, I have a question. Can any of you please suggest how to make an XML file from a database table with all the rows? Note: I have the XSD schema file and the resulting XML file should be in that XSD format only.

    Hello Anybody, I have a question. Can any of you please suggest how to make an XML file from a database table with all the records?
    Note: I have the XSD schema file and the resulting XML file should be in that XSD format only.

    The Oracle documentation has a good overview of the options available
    Generating XML Data from the Database
    Without knowing your version, I just picked 11.2, so you may need to look for that chapter in the documentation for your version to find applicable information.
    You can also find some information in XML DB FAQ
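    As a hedged illustration of the SQL/XML functions that chapter covers, driven from a Java client over JDBC (the table and column names PRODUCTS, SKU and NAME are placeholders, and validating the result against your XSD is a separate step, for example with javax.xml.validation):
    import java.io.FileWriter;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ExportTableAsXml {
        public static void main(String[] args) throws Exception {
            // args: JDBC URL, user, password
            String sql =
                "SELECT XMLSERIALIZE(DOCUMENT "
                + "  XMLELEMENT(\"Products\", "
                + "    XMLAGG(XMLELEMENT(\"Product\", "
                + "      XMLFOREST(p.sku AS \"Sku\", p.name AS \"Name\")))) "
                + "  AS CLOB) FROM products p";
            try (Connection con = DriverManager.getConnection(args[0], args[1], args[2]);
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery(sql);
                 FileWriter out = new FileWriter("products.xml")) {
                if (rs.next()) {
                    out.write(rs.getString(1));                // the whole document as one string
                }
            }
        }
    }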

  • Change font size in an XML file

    I bought a template, and this page is the text on the opening
    page. The title is bold and a bigger font than the next paragraph,
    and the first line of the data is what I really want in the title,
    but the line is a bit too long, so I'm looking to change the size of
    the font. I see no code that makes the title bold and a bigger
    text, so I'm thinking it must be tied to the template. If
    someone can help, it would be greatly appreciated.
    It's been 15 years since I messed with a web page. Also one
    last question. Look at my site dkdaniels.com: at the bottom of the
    first page, left side, where the copyright is, there is a file called
    settings.xml with a place to change that info, which I did.
    But on the right it says "designed by" and you're supposed to be
    able to input your name, and I see no such info in the settings file,
    so does anyone know where one can edit that?
    Many Thanks
    DK

    .oO(dkdaniels)
    > I bought a template, and this page is the text on the opening page. The title
    > is bold and a bigger font than the next paragraph, and the first line of the
    > data is what I really want in the title, but the line is a bit too long, so
    > I'm looking to change the size of the font. I see no code that makes the
    > title bold and bigger, so I'm thinking it must be tied to the template.
    An XML file alone is pretty useless. For the Web there usually has to be
    a script or a stylesheet to turn the XML into HTML which a browser can
    understand. So check that stylesheet or whatever else there might be.
    > It's been 15 years since I messed with a web page. Also one last question.
    > Look at my site dkdaniels.com: at the bottom of the first page, left side,
    > where the copyright is, there is a file called settings.xml with a place to
    > change that info, which I did. But on the right it says "designed by" and
    > you're supposed to be able to input your name, and I see no such info in the
    > settings file, so does anyone know where one can edit that?
    There's no content on that site. All I get is a blank/black
    page.
    Micha

  • Execute an XQuery in an XML file with a size limit

    Hi there,
    I have the following XML file:
    <Fuzzy>
    <Rule>
    <rule_ID>1</rule_ID>
    <event>insert</event>
    <happen>after</happen>
    <condaction>collection('test.dbxml')/Bookstore/Book</condaction>
    </Rule>
    </Fuzzy>
    I use the following code to execute the XQuery in the <condaction> tag:
    String query = "collection('test.dbxml')/Fuzzy/Rule/condaction/text()";
    XmlQueryExpression ue = myManager.prepare(query, context);
    XmlResults results = ue.execute(context);
    while (results.hasNext()) {
        System.out.println("==================================");
        String test = results.next().asString();
        System.out.println(test);
        XmlQueryExpression e = myManager.prepare(test, context);
        XmlResults results1 = e.execute(context);
        while (results1.hasNext()) {
            System.out.println("==================================");
            System.out.println(results1.next().asString());
        }
    }
    The above executes the XQuery and gives me the result. But the same code has problems when I use the XQuery below, which is much longer. Is there any limit on the size of an XQuery?
    <Fuzzy>
    <Rule>
    <rule_ID>1</rule_ID>
    <event>insert</event>
    <happen>after</happen>
    <condaction>if(collection('test.dbxml')/Bookstore/Book/book_ID gt 0 and collection('test.dbxml')/Bookstore/Book/price lt 20) then insert nodes <b4>inserted child</b4> after
    collection('test.dbxml')/Bookstore/Book/title else insert nodes <b4>inserted child</b4> after collection('test.dbxml')/Bookstore/Book</condaction>
    </Rule>
    </Fuzzy>
    The problem I identified is that the 'test' variable only prints part of the XQuery present in the tag:
    ================================
    if(collection('test.dbxml')/Bookstore/Book/book_ID gt 0 and collection('test.dbxml')/Bookstore/Book/price lt 20) then insert nodes
    Please help me identify whether I am doing something wrong or whether there is a limitation causing this problem.
    Thanks.

    FYI..
    XML has a special set of characters that cannot be used in normal XML strings. These characters are:
    1. & - &amp;
    2. < - &lt;
    3. > - &gt;
    4. " - &quot;
    5. ' - &apos;
    I replaced the above ones in my XML tag that holds the XQuery and it worked!!
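    A small Java helper along those lines, for escaping query text before it is written into an element such as <condaction> (note that the ampersand must be replaced first so the other entities are not double-escaped):
    public static String escapeXml(String s) {
        return s.replace("&", "&amp;")
                .replace("<", "&lt;")
                .replace(">", "&gt;")
                .replace("\"", "&quot;")
                .replace("'", "&apos;");
    }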

  • XML DB: As XML file size doubles validate time more than doubles

    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
    We have a default XML DB install. I am using schemaValidate() to successfully validate XML files against a registered XSD. For some reason, as the file size doubles, the time to validate the file grows at a faster rate than double. We wanted to generate approximately 18 MB files, since they hold 5000 records, but as you can see it appears I would be better off generating six 3 MB files instead. What init.ora or xdbconfig.xml parameter might I change to help with this behavior? Any other ideas why this is happening? Is this an example of "doesn't scale"?
    1.5 MB   3 seconds
    3 MB     15 seconds
    6 MB     1 minute
    12 MB    6 minutes
    18 MB    16 minutes
    Code is simple..
    procedure validate_xml_file (p_dir in varchar2, p_file in varchar2) is
    v_xmldoc xmltype;
    begin
    select XMLTYPE(bfilename(p_dir, p_file),NLS_CHARSET_ID('AL32UTF8'))
    into v_xmldoc
    from dual;
    v_xmldoc.schemaValidate();
    end validate_xml_file;

    If I take the clause off the end of the cview table, this procedure does not work at all. It fails at the schemaValidate call...
    Error report:
    ORA-19030: Method invalid for non-schema based XML Documents.
    ORA-06512: at "SYS.XMLTYPE", line 345
    ORA-06512: at line 26
    19030. 00000 - "Method invalid for non-schema based XML Documents."
    *Cause:    The method can be invoked on only schema based xmltype objects.
    *Action:   Don't invoke the method for non schema based xmltype objects.

  • Maximum input payload size(for an XML file) supported by OSB

    Hey Everyone,
    I wanted to know what the maximum payload size is that OSB can handle.
    The requirement is to pass XML files as input to OSB and insert the data from the XML files into Oracle staging tables. OSB will host all the .jca, WSDL, XML, XML schema and other files required to perform the operation.
    The hurdle is to understand the maximum XML file size limit that OSB can allow to pass through without breaking.
    I did some test runs and got the following output:
    Size of the XML file: OSB successfully read a file of size 3176 KB but failed for a file of size 3922 KB, so the OSB breaking point occurs somewhere between 3 and 4 MB, as per the test runs.
    Range of number of lines of XML: 102,995 to 126,787, since OSB was able to consume a file with 102,995 lines and size 3176 KB but broke for a file with 126,787 lines and size 3922 KB.
    Please share your views on the test runs regarding the OSB breaking point, and also kindly share your results if the same test has been performed at your end.
    Thank you very much.

  • Dynamic number of xml files with a specified max size

    Hi all,
    I'm using a custom report to generate XML files (via an ST (Simple Transformation) program and a CALL TRANSFORMATION) containing
    data belonging to IS-U invoices (some are much more complex and richer in data than others).
    I'm asked to generate the minimum number of XML files; the only limit is their maximum size (40 MB).
    How can I dynamically tell when I need to close an XML file and start the next one, if (as far as I know) the
    size is only readable after the transformation?
    Angelo
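    A hedged sketch of the size-check logic only (plain Java rather than ABAP, purely to illustrate the decision): serialize each invoice first, then decide whether it still fits into the current 40 MB file before writing it. Handling of the XML envelope around the records is left out, and all names are made up.
    import java.io.FileOutputStream;
    import java.util.List;

    public class SizeCappedWriter {
        private static final long MAX_BYTES = 40L * 1024 * 1024;   // 40 MB cap per file

        public static void write(List<byte[]> serializedInvoices) throws Exception {
            long written = 0;
            int fileNo = 0;
            FileOutputStream out = null;
            for (byte[] invoice : serializedInvoices) {
                // Roll over to a new file if this record would push us past the cap.
                if (out == null || written + invoice.length > MAX_BYTES) {
                    if (out != null) out.close();
                    out = new FileOutputStream("invoices_part" + (++fileNo) + ".xml");
                    written = 0;
                }
                out.write(invoice);
                written += invoice.length;
            }
            if (out != null) out.close();
        }
    }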

    My problem is that different users may be loading different files.
    As there may be many simultaneous users and the xml document nodes do not get edited in any way, I only want to have one copy of each file in memory .... hence the application scope.
    If I make the var name fixed then a user loading xml2.xml would overwrite any previous file xml1.xml or whatever.
    I could switch to session scope but that would load possibly hundreds of copies of any given file at any given time and the files are not small :-(
    Keith

  • XML file size increases GREATLY with ![CDATA[

    I have several .xml files that provide info for a website of
    mine and was quite surprised to discover, while I was uploading
    them, that some of them were around 60-70kb, while others were 4kb
    or less. Knowing that the amount of content in all of them doesn't
    vary that much, I noticed that the biggest file sizes corresponded
    to the files using <![CDATA[ ]]>.
    I included a sample file code below. It's in Portuguese, but
    I think it's still obvious that this file could not be more than a
    few kb, and Dreamweaver (CS3) saves it with a whopping 62kb!
    I tried saving the exact same file in Text Edit and it
    resulted in a normal file size.
    Has anyone encountered similar situations? I'm guessing this
    is some sort of bug, that Dreamweaver puts some extra content -
    somewhere - because of the CDATA. Is there any sort of reason or
    fix for this?
    Thanks in advance.

    Ok... embarrassing moment. Just realized that DW CS3 is not
    guilty. In Leopard, in the file's Get Info panel, I changed the
    preferred application to open it, and the file's reported size changed
    depending on the application. Reverting back to DW CS3, it
    still resulted in the 60-70kb size, but deleting the file and
    saving a new one with the same name solved the problem.
    Sorry guys.
