Need help in inserting XML Documents as XMLType

I am using an XMLType column in one of my tables to store XML messages that come from a B2B application. My insert takes the form of:
insert into mytable values(
'100',
'unvalidated',
'final',
XMLTYPE('The XML Message ...')
);
This has been working fine for the last several weeks. Over the weekend I was working with a new message type and getting ready to enter some test messages for some new validation functions. When I try to insert the row, I get the following error:
ERROR at line 6:
ORA-01704: string literal too long
Looking it up, the documentation says the error means the string literal is more than 4000 characters long, which my message is.
I can't find anything in the docs on how to insert an XML document of more than 4000 characters into an XMLType column. I'm fine on record size, which is supposed to be 2 GB; my string literal is just too long.
Please help - I need to be testing this today.
Rick Price

Hi Rick,
For a very large string, I use the following anonymous PL/SQL block, which assigns the text to a CLOB variable first. See if it is of any help.
declare
str clob;
begin
str := '<?xml version="1.0" ?>
BIG LARGE STRING ';
insert into stylesheet values (1, 'style_name', XMLTYPE(str));
commit;
end;
/
Cheers
Pashabhai
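For messages bigger than even a PL/SQL string literal can hold (around 32K), another option that comes up later on this page is loading the document from a file through a directory object. A minimal sketch, assuming a directory object XMLDIR pointing at a folder where the message has been saved as msg.xml (both names are hypothetical), with the column values taken from the original insert:
-- one-time setup: create directory xmldir as '/path/to/xml/files';
insert into mytable values (
'100',
'unvalidated',
'final',
XMLTYPE(bfilename('XMLDIR', 'msg.xml'),   -- read the document as a BFILE
nls_charset_id('AL32UTF8'))               -- character set of the file
);
commit;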

Similar Messages

  • Need help in Parsing XML Document in Oracle

    Hello Experts,
    I urgently need your help. We have an XML document on the web, not on the file system. How can I parse an XML document from the web in Oracle?
    Any link, blog post, or sample code would be appreciated.
    Kind Regards,
    Bhavin
    London, UK

    This breaks down to two issues
    1) Getting the XML into Oracle
    2) Parsing the XML inside Oracle
    For #1, the first two options that come to mind are httpuritype and utl_http. Both can be used to get information from a URL that understands HTTP requests.
    For #2, you can treat the XML as a DOMDocument or XMLType to parse it. You could also parse it via a SQL statement using XQuery or XMLTable.
    Many such examples can be found on the {forum:id=34} forum or on [Marco's blog|http://www.liberidu.com/blog/]. A few parsing examples that I've done can be seen at {message:id=3610259}; make sure to also follow the link in there to an earlier example.
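    A rough sketch that combines both steps with httpuritype and XMLTable (the URL, the /root/item structure, and the @name attribute are placeholders for whatever the real document contains; the database server must be able to reach the URL, and newer releases may also need a network ACL that allows outbound HTTP):
    select x.item_name
    from xmltable(
    '/root/item'
    passing httpuritype('http://www.example.com/feed.xml').getxml()
    columns item_name varchar2(100) path '@name'
    ) x;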

  • ORA-00600: inserting xml document

    I am getting this error when inserting an XML document into Oracle.
    The schema and sample XML document below reproduce the problem.
    If I move the element named "theID" from the type "BaseObjectType"
    to "NextObjectType" and remove the extension declaration in that type,
    everything works OK. The complete error is:
    ORA-00600: internal error code, arguments: [qmxConvUnkType], [], [], [], [],
    Schema:
    <?xml version="1.0" encoding="UTF-8"?>
    <xs:schema
    targetNamespace="http://www.x.com/test"
    xmlns:xs="http://www.w3.org/2001/XMLSchema"
    xmlns:tes="http://www.x.com/test">
    <xs:complexType name="BaseObjectType" abstract="true">
    <xs:sequence>
    <xs:element name="theID" type="xs:string">
    </xs:element>
    </xs:sequence>
    </xs:complexType>
    <xs:complexType name="NextObjectType" abstract="true">
    <xs:complexContent>
    <xs:extension base="tes:BaseObjectType">
    <xs:sequence>
    <xs:element name="title" type="xs:string">
    </xs:element>
    </xs:sequence>
    </xs:extension>
    </xs:complexContent>
    </xs:complexType>
    <xs:complexType name="RBaseType" abstract="true">
    <xs:complexContent>
    <xs:restriction base="tes:NextObjectType">
    <xs:sequence>
    <xs:element name="theID" type="xs:string"/>
    <xs:element name="title" type="xs:string"/>
    </xs:sequence>
    </xs:restriction>
    </xs:complexContent>
    </xs:complexType>
    <xs:complexType name="RType">
    <xs:complexContent>
    <xs:extension base="tes:RBaseType">
    <xs:sequence>
    <xs:element name="theName" type="xs:string">
    </xs:element>     
    </xs:sequence>
    </xs:extension>
    </xs:complexContent>
    </xs:complexType>
    <xs:element name="RObject" type="tes:RType"/>
    </xs:schema>
    Sample xml document:
    <?xml version="1.0" encoding="UTF-8"?>
    <RObject xmlns="http://www.x.com/test"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:noNamespaceSchemaLocation="http://localhost:8080:/test.xsd">
    <theID xmlns="">123</theID>
    <title xmlns="">title</title>
    <theName xmlns="">name</theName>
    </RObject>
    Register Schema call:
    BEGIN
    DBMS_XMLSCHEMA.registerSchema(
    SCHEMAURL => 'http://localhost:8080/test.xsd',
    SCHEMADOC => bfilename('TMP3','test.xsd'),
    LOCAL => TRUE,
    GENTYPES => TRUE,
    GENTABLES => TRUE,
    CSID => nls_charset_id('AL32UTF8'));
    END;
    Insert xml document call:
    insert into "RObject566_TAB" values
    (XMLType(bfilename('TMP3','test.xml'),nls_charset_id('AL32UTF8')));
    Changing NextObjectType to this gets insert to work:
    <xs:complexType name="NextObjectType" abstract="true">
    <xs:sequence>
    <xs:element name="theID" type="xs:string"/>
    <xs:element name="title" type="xs:string"/>
    </xs:sequence>
    </xs:complexType>
    Anyone else seen this?
    jcl.

    BTW: This is with Oracle 10.2.0
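    If you want to retest variants of this schema (for example the changed NextObjectType above), the existing registration has to be dropped before test.xsd can be registered again under the same URL. A sketch using the SCHEMAURL from the post; DELETE_CASCADE_FORCE also drops the generated types and default tables:
    BEGIN
    DBMS_XMLSCHEMA.deleteSchema(
    SCHEMAURL => 'http://localhost:8080/test.xsd',
    DELETE_OPTION => DBMS_XMLSCHEMA.DELETE_CASCADE_FORCE);
    END;
    /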

  • Need Help in Inserting first ever record

    I need help in inserting my first ever record from an OAF page.
    I've created an AM 'MasterAM' and added 'MasterVO' to it. I created a page CreatePG which has a submit button with id = Apply.
    Below is the processRequest of CreateCo:
    public void processRequest(OAPageContext pageContext, OAWebBean webBean)
    {
        super.processRequest(pageContext, webBean);
        OAApplicationModule am = pageContext.getApplicationModule(webBean);
        if (!pageContext.isFormSubmission()) {
            am.invokeMethod("createRecord", null);
        }
    }
    and below is processFormRequest
    public void processFormRequest(OAPageContext pageContext, OAWebBean webBean)
    {
        super.processFormRequest(pageContext, webBean);
        OAApplicationModule am = pageContext.getApplicationModule(webBean);
        if (pageContext.getParameter("Apply") != null)
        {
            OAViewObject vo = (OAViewObject)am.findViewObject("MasterVO1");
            am.invokeMethod("apply");
            pageContext.forwardImmediately("OA.jsp?page=/abcd/oracle/apps/per/selfservice/xxdemo/webui/CreatePG",
                                           null,
                                           OAWebBeanConstants.KEEP_MENU_CONTEXT,
                                           null,
                                           null,
                                           true,
                                           OAWebBeanConstants.ADD_BREAD_CRUMB_NO);
        }
    }
    Below are 'createRecord' and 'apply' in MasterAMImpl
    public void createRecord() {
        OAViewObject vo = (OAViewObject)getMasterVO1();
        if (!vo.isPreparedForExecution()) {
            vo.executeQuery();
        }
        Row row = vo.createRow();
        vo.insertRow(row);
        row.setNewRowState(Row.STATUS_INITIALIZED);
    }
    public void apply() {
        getTransaction().commit();
    }
    When I run the page, it opens; I enter some data and press Apply, but it does not insert into the table.
    Could anyone help me out?
    My JDeveloper version is 10.1.3.3.0.3.

    I am facing the same issue.
    Rows get inserted into the table, but only some of the columns have data.
    All the attributes are correctly mapped to the view instance and view attribute.
    My VO has one EO, and I have joined another table to get descriptions of the fields.
    Could that be the problem?
    e.g.:
    select item, desc
    from t, master
    where t.cola = master.colb
    Table t is the custom table I want the data to go into, but only some of the columns are populated after committing.
    Any clues?

  • Error in inserting XML document

    I'm trying to insert into a table from an HTML form.
    Here's the table I'm trying to insert into
    TABLE employee (
    name VARCHAR2(40),
    ssnum NUMBER NOT NULL
    )
    Note: even though SSNUM is a NUMBER, I take it as a string and
    have a before-insert trigger that converts it to a number. It works
    from SQL*Plus if the entry is a number.
    Here's the testxsql.html
    <html>
    <body>
    Input your name and ssnumber ..
    <form action="testxsql.xsql" method="post">
    Name<input type="text" name="name_field" size="30">
    SSNUM<input type = "text" name="ssnum_field" size="10">
    <input type="submit">
    </form>
    </body>
    </html>
    it calls testxsql.xsql which is
    <?xml version="1.0"?>
    <?xml-stylesheet type="text/xsl" href="employee.xsl"?>
    <page connection="demo">
    <insert table="employee" transform="testxsql.xsl"/>
    <content>
    <query tag-case="lower" max-rows="5" rowset-element=""
    row-element="emp" >
    select *
    from employee
    order by ssnum desc
    </query>
    </content>
    </page>
    and testxsql.xsl is
    <?xml version = '1.0'?>
    <ROWSET xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
    <xsl:for-each select="request">
    <ROW>
    <NAME><xsl:value-of select="name_field"/></NAME>
    <SSNUM><xsl:value-of select="ssnum_field"/></SSNUM>
    <SOURCE>User-Submitted</SOURCE>
    </ROW>
    </xsl:for-each>
    </ROWSET>
    I get the following error while trying to insert
    XSQL-015: Error inserting XML Document
    invalid LABEL name in XML doc-
    If I remove the insert tag, the query portion works fine.
    I'm new to XML, so I've already pulled out all the hair I can muster. Thanks.

    Kumar Pandey (guest) wrote:
    : (original post quoted in full)
    I'm using Apache web server with JServ and 8i as the database.

  • Very urgent help needed- Error while passing XML document to Oracle stored

    Hi !
    I have been struggling to call an Oracle 9i stored procedure from ASP.NET, passing the contents of a StringBuilder.
    I am using Visual Studio 2008 Professional, OS: Windows XP, and Oracle 9.2.0.1.0.
    Following is the procedure:
    CREATE or REPLACE PROCEDURE loadCompanyInfo (clobxml IN clob) IS
    -- Declare a CLOB variable
    ciXML clob;
    BEGIN
    -- Store the Purchase Order XML in the CLOB variable
    ciXML := clobxml;
    -- Insert the Purchase Order XML into an XMLType column
    INSERT INTO companyinfotbl (companyinfo) VALUES (XMLTYPE(ciXML));
    commit;
    --Handle the exceptions
    EXCEPTION
    WHEN OTHERS THEN
    raise_application_error(-20101, 'Exception occurred in loadCompanyInfo procedure :'||SQLERRM);
    END loadCompanyInfo ;
    And the following is the ASP.NET code:
    StringBuilder b = new StringBuilder();
    b.Append("<?xml version=\"1.0\" encoding=\"utf-8\" ?>");
    b.Append("<item>");
    b.Append("<price>500</price>");
    b.Append("<description>some item</description>");
    b.Append("<quantity>5</quantity>");
    b.Append("</item>");
    //Here you'll have the Xml as a string
    string myXmlString1 = b.ToString();
    //string result;
    using (OracleConnection objConn = new OracleConnection("Data Source=testdb; User ID=testuser; Password=pwd1"))
    {
        OracleCommand objCmd = new OracleCommand();
        objCmd.Connection = objConn;
        objCmd.CommandText = "loadCompanyInfo";
        objCmd.CommandType = CommandType.StoredProcedure;
        //OracleParameter pmyXmlString1 = new OracleParameter("pmyXmlString1", new OracleString(myXmlString1));
        objCmd.Parameters.Add("myXmlString1", OracleType.Clob);
        objCmd.Parameters.Add(myXmlString1).Direction = ParameterDirection.Input;
        //objCmd.Parameters.Add("result", OracleType.VarChar).Direction = ParameterDirection.Output;
        try
        {
            objConn.Open();
            objCmd.ExecuteNonQuery();
        }
        catch (Exception ex)
        {
            Label1.Text = "Exception: {0}" + ex.ToString();
        }
        objConn.Close();
    }
    When I am trying to execute it, I am getting the following error:
    Exception: {0}System.Data.OracleClient.OracleException: ORA-06550: line 1, column 7: PLS-00306: wrong number or types of arguments in call to 'LOADCOMPANYINFO' ORA-06550: line 1, column 7: PL/SQL: Statement ignored at System.Data.OracleClient.OracleConnection.CheckError(OciErrorHandle errorHandle, Int32 rc) at System.Data.OracleClient.OracleCommand.Execute(OciStatementHandle statementHandle, CommandBehavior behavior, Boolean needRowid, OciRowidDescriptor& rowidDescriptor, ArrayList& resultParameterOrdinals) at System.Data.OracleClient.OracleCommand.ExecuteNonQueryInternal(Boolean needRowid, OciRowidDescriptor& rowidDescriptor) at System.Data.OracleClient.OracleCommand.ExecuteNonQuery() at Default.Button1Click(Object sender, EventArgs e)
    I understand from this that the .NET type is not the correct one, but I am not sure how to correct it. I could not find a proper example in any documentation that I came across. Most of the examples show how to read XML, not how to insert XML into an Oracle table by calling a stored procedure.
    Can you please help me solve this problem?
    Also, can you please give me an example of passing an XML document (XmlDocument) to an Oracle stored procedure?
    In both cases, working code would be of great help.
    Thanks,
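    One way to narrow this down (just a sketch) is to call the procedure directly from SQL*Plus with a CLOB variable. If this succeeds, the procedure itself is fine and the PLS-00306 points at the client-side binding; note that the parameter added in the .NET code is named myXmlString1 while the procedure's only parameter is clobxml:
    declare
    l_xml clob;
    begin
    l_xml := '<?xml version="1.0" encoding="utf-8"?><item><price>500</price><description>some item</description><quantity>5</quantity></item>';
    loadCompanyInfo(clobxml => l_xml);  -- bind by the parameter name the procedure declares
    end;
    /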

    Hi,
    In addition to the above error details, my BPEL code looks like this:
    <process name="BPELProcess1"
    targetNamespace="http://xmlns.oracle.com/Application10/Project10/BPELProcess1"
    xmlns="http://schemas.xmlsoap.org/ws/2003/03/business-process/"
    xmlns:client="http://xmlns.oracle.com/Application10/Project10/BPELProcess1"
    xmlns:ora="http://schemas.oracle.com/xpath/extension"
    xmlns:bpelx="http://schemas.oracle.com/bpel/extension"
    xmlns:bpws="http://schemas.xmlsoap.org/ws/2003/03/business-process/">
    <partnerLinks>
    <partnerLink name="bpelprocess1_client" partnerLinkType="client:BPELProcess1" myRole="BPELProcess1Provider" partnerRole="BPELProcess1Requester"/>
    </partnerLinks>
    <variables>
    <variable name="inputVariable" messageType="client:BPELProcess1RequestMessage"/>
    <variable name="outputVariable" messageType="client:BPELProcess1ResponseMessage"/>
    </variables>
    <sequence name="main">
    <receive name="receiveInput" partnerLink="bpelprocess1_client" portType="client:BPELProcess1" operation="process" variable="inputVariable" createInstance="yes"/>
    <invoke name="callbackClient" partnerLink="bpelprocess1_client" portType="client:BPELProcess1Callback" operation="processResponse" inputVariable="outputVariable"/>
    </sequence>
    </process>
    Kindly help if anyone has faced this issue before.
    Regards,
    Rakshitha

  • Inserting a long XML document into XMLType

    I'm trying to load the following document into an XMLType column in 10.2. I've tried every example I can find, and I can push the data into CLOBs just fine using the Java workaround (http://www.oracle.com/technology/sample_code/tech/java/codesnippet/xmldb/HowToLoadLargeXML.html).
    Can anyone provide a solution, or let me know if there is a limitation, please?
    Given the table;
    SQL> describe xmltable_1
    Name Null? Type
    DOC_ID NUMBER
    XML_DATA XMLTYPE
    How do I load this data into 'XML_DATA'?
    <?xml version="1.0" encoding="UTF-8"?>
    <metadata>
    <idinfo>
    <citation>
    <citeinfo>
    <origin>Rand McNally and ESRI</origin>
    <pubdate>1996</pubdate>
    <title>ESRI Cities Geodata Set</title>
    <geoform>vector digital data</geoform>
    <onlink>\\OIS23\C$\Files\Working\Metadata\world\cities.shp</onlink>
    </citeinfo>
    </citation>
    <descript>
    <abstract>World Cities contains locations of major cities around the world. The cities include national capitals for each of the countries in World Countries 1998 as well as major population centers and landmark cities. World Cities was derived from ESRI's ArcWorld database and supplemented with other data from the Rand McNally New International Atlas</abstract>
    <purpose>606 points, 4 descriptive fields. Describes major world cities.</purpose>
    </descript>
    <timeperd>
    <timeinfo>
    <sngdate>
    <caldate>1996</caldate>
    </sngdate>
    </timeinfo>
    <current>publication date</current>
    </timeperd>
    <status>
    <progress>Complete</progress>
    <update>None planned</update>
    </status>
    <spdom>
    <bounding>
    <westbc>
    -165.270004</westbc>
    <eastbc>
    177.130188</eastbc>
    <northbc>
    78.199997</northbc>
    <southbc>
    -53.150002</southbc>
    </bounding>
    </spdom>
    <keywords>
    <theme>
    <themekt>city</themekt>
    <themekey>cities</themekey>
    </theme>
    </keywords>
    <accconst>none</accconst>
    <useconst>none</useconst>
    <ptcontac>
    <cntinfo>
    <cntperp>
    <cntper>unknown</cntper>
    <cntorg>unknown</cntorg>
    </cntperp>
    <cntpos>unknown</cntpos>
    <cntvoice>555-1212</cntvoice>
    </cntinfo>
    </ptcontac>
    <datacred>ESRI</datacred>
    <native>Microsoft Windows NT Version 4.0 (Build 1381) Service Pack 6; ESRI ArcCatalog 8.1.0.570</native>
    </idinfo>
    <dataqual>
    <attracc>
    <attraccr>no report available</attraccr>
    <qattracc>
    <attraccv>1000000</attraccv>
    <attracce>no report available</attracce>
    </qattracc>
    </attracc>
    <logic>no report available</logic>
    <complete>no report available</complete>
    <posacc>
    <horizpa>
    <horizpar>no report available</horizpar>
    </horizpa>
    <vertacc>
    <vertaccr>no report available</vertaccr>
    </vertacc>
    </posacc>
    <lineage>
    <srcinfo>
    <srccite>
    <citeinfo>
    <title>ESRI</title>
    </citeinfo>
    </srccite>
    <srcscale>20000000</srcscale>
    <typesrc>CD-ROM</typesrc>
    <srctime>
    <timeinfo>
    <sngdate>
    <caldate>1996</caldate>
    </sngdate>
    </timeinfo>
    <srccurr>publication date</srccurr>
    </srctime>
    <srccontr>no report available</srccontr>
    </srcinfo>
    <procstep>
    <procdesc>no report available</procdesc>
    <procdate>Unknown</procdate>
    </procstep>
    </lineage>
    </dataqual>
    <spdoinfo>
    <direct>Vector</direct>
    <ptvctinf>
    <sdtsterm>
    <sdtstype>Entity point</sdtstype>
    <ptvctcnt>606</ptvctcnt>
    </sdtsterm>
    </ptvctinf>
    </spdoinfo>
    <spref>
    <horizsys>
    <geograph>
    <latres>0.000001</latres>
    <longres>0.000001</longres>
    <geogunit>Decimal degrees</geogunit>
    </geograph>
    <geodetic>
    <horizdn>North American Datum of 1927</horizdn>
    <ellips>Clarke 1866</ellips>
    <semiaxis>6378206.400000</semiaxis>
    <denflat>294.978698</denflat>
    </geodetic>
    </horizsys>
    </spref>
    <eainfo>
    <detailed>
    <enttyp>
    <enttypl>
    cities</enttypl>
    </enttyp>
    <attr>
    <attrlabl>FID</attrlabl>
    <attrdef>Internal feature number.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Sequential unique whole numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>Shape</attrlabl>
    <attrdef>Feature geometry.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Coordinates defining the features.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>NAME</attrlabl>
    <attrdef>The city name. Spellings are based on Board of Geographic Names standards and commercial atlases.</attrdef>
    <attrdefs>ESRI</attrdefs>
    </attr>
    <attr>
    <attrlabl>COUNTRY</attrlabl>
    <attrdef>An abbreviated country name.</attrdef>
    </attr>
    <attr>
    <attrlabl>POPULATION</attrlabl>
    <attrdef>Total population for the entire metropolitan area. Values are from recent census or estimates.</attrdef>
    </attr>
    <attr>
    <attrlabl>CAPITAL</attrlabl>
    <attrdef>Indicates whether a city is a national capital (Y/N).</attrdef>
    </attr>
    </detailed>
    <overview>
    <eaover>none</eaover>
    <eadetcit>none</eadetcit>
    </overview>
    </eainfo>
    <distinfo>
    <stdorder>
    <digform>
    <digtinfo>
    <transize>0.080</transize>
    </digtinfo>
    </digform>
    </stdorder>
    </distinfo>
    <metainfo>
    <metd>20010509</metd>
    <metc>
    <cntinfo>
    <cntorgp>
    <cntorg>ESRI</cntorg>
    <cntper>unknown</cntper>
    </cntorgp>
    <cntaddr>
    <addrtype>unknown</addrtype>
    <city>unknown</city>
    <state>unknown</state>
    <postal>00000</postal>
    </cntaddr>
    <cntvoice>555-1212</cntvoice>
    </cntinfo>
    </metc>
    <metstdn>FGDC Content Standards for Digital Geospatial Metadata</metstdn>
    <metstdv>FGDC-STD-001-1998</metstdv>
    <mettc>local time</mettc>
    <metextns>
    <onlink>http://www.esri.com/metadata/esriprof80.html</onlink>
    <metprof>ESRI Metadata Profile</metprof>
    </metextns>
    </metainfo>
    </metadata>
    rtacce>Vertical Positional Accuracy is expressed in meters. Vertical accuracy figures were developed by comparing elevation contour locations on 1:24,000 scale maps to elevation values at the same location within the digital database. Some manual interpolation was necessary to complete this test. The analysis results are expressed as linear error at a 90% confidence interval.</vertacce>
    </qvertpa>
    </vertacc>
    </posacc>
    <lineage>
    <srcinfo>
    <srccite>
    <citeinfo>
    <origin>National Imagery and Mapping Agency</origin>
    <pubdate>1994</pubdate>
    <title>Operational Navigational Chart</title>
    <geoform>map</geoform>
    <pubinfo>
    <pubplace>St.Louis, MO</pubplace>
    <publish>National Imagery and Mapping Agency</publish>
    </pubinfo>
    </citeinfo>
    </srccite>
    <srcscale>1000000</srcscale>
    <typesrc>stable-base material</typesrc>
    <srctime>
    <timeinfo>
    <rngdates>
    <begdate>1974</begdate>
    <enddate>1994</enddate>
    </rngdates>
    </timeinfo>
    <srccurr>Publication dates</srccurr>
    </srctime>
    <srccitea>ONC</srccitea>
    <srccontr>All information found on the source with the exception of aeronautical data</srccontr>
    </srcinfo>
    <srcinfo>
    <srccite>
    <citeinfo>
    <origin>National Imagery and Mapping Agency</origin>
    <pubdate>199406</pubdate>
    <title>Digital Aeronautical Flight Information File</title>
    <geoform>model</geoform>
    <pubinfo>
    <pubplace>St. Louis, MO</pubplace>
    <publish>National Imagery and Mapping Agency</publish>
    </pubinfo>
    </citeinfo>
    </srccite>
    <typesrc>magnetic tape</typesrc>
    <srctime>
    <timeinfo>
    <sngdate>
    <caldate>1994</caldate>
    </sngdate>
    </timeinfo>
    <srccurr>Publication date</srccurr>
    </srctime>
    <srccitea>DAFIF</srccitea>
    <srccontr>Airport records (name, International Civil Aviation Organization, position, elevation, and type)</srccontr>
    </srcinfo>
    <srcinfo>
    <srccite>
    <citeinfo>
    <origin>Defense Mapping Agency</origin>
    <pubdate>1994</pubdate>
    <title>Jet Navigational Chart</title>
    <geoform>map</geoform>
    <pubinfo>
    <pubplace>St.Louis, MO</pubplace>
    <publish>Defense Mapping Agency</publish>
    </pubinfo>
    </citeinfo>
    </srccite>
    <srcscale>2,000,000</srcscale>
    <typesrc>stable-base material</typesrc>
    <srctime>
    <timeinfo>
    <rngdates>
    <begdate>1974</begdate>
    <enddate>1991</enddate>
    </rngdates>
    </timeinfo>
    <srccurr>Publication date</srccurr>
    </srctime>
    <srccitea>JNC</srccitea>
    <srccontr>All information found on the source with the exception of aeronautical data. JNCs were used as source for the Antartica region only.</srccontr>
    </srcinfo>
    <srcinfo>
    <srccite>
    <citeinfo>
    <origin>USGS EROS Data Center</origin>
    <pubdate></pubdate>
    <title>Advance Very High Resolution Radiometer</title>
    <geoform>remote-sensing image</geoform>
    <pubinfo>
    <pubplace>Sioux Falls, SD</pubplace>
    <publish>EROS Data Center</publish>
    </pubinfo>
    </citeinfo>
    </srccite>
    <srcscale>1000000</srcscale>
    <typesrc>magnetic tape</typesrc>
    <srctime>
    <timeinfo>
    <rngdates>
    <begdate>199003</begdate>
    <enddate>199011</enddate>
    </rngdates>
    </timeinfo>
    <srccurr>Publication date</srccurr>
    </srctime>
    <srccitea>AVHRR</srccitea>
    <srccontr>6 vegetation types covering the continental US and Canada</srccontr>
    </srcinfo>
    <procstep>
    <procdesc>For the first edition DCW, stable-based positives were produced from the original reproduction negatives (up to 35 per ONC sheet). These were digitized either through a scanning-raster to vector conversion or hand digitized into vector form. The vector data was then tagged with attribute information using ARC-INFO software. Transformation to geographic coordinates was performed using the projection graticules for each sheet. Digital information was edge matched between sheets to create large regional datasets. These were then subdivided into 5 x 5 degree tiles and converted from ARC/INFO to VPF. The data was then pre-mastered for CD-ROM. QC was performed by a separate group for each step in the production process.</procdesc>
    <procdate>199112</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Environmental Systems Research Institute</cntorg>
    </cntorgp>
    <cntpos>Applications Division</cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>380 New York St.</address>
    <city>Redlands</city>
    <state>CA</state>
    <postal>92373</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>909-793-2853</cntvoice>
    <cntfax>909-793-5953</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdate>199404</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Geonex</cntorg>
    </cntorgp>
    <cntpos></cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>8950 North 9th Ave.</address>
    <city>St. Petersburg</city>
    <state>FL</state>
    <postal>33702</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>(813)578-0100</cntvoice>
    <cntfax>(813)577-6946</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>Transferred digitally directly into the VPF files.</procdesc>
    <procdate>199408</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Geonex</cntorg>
    </cntorgp>
    <cntpos></cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>8950 North 9th Ave.</address>
    <city>St. Petersburg</city>
    <state>FL</state>
    <postal>33702</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>813-578-0100</cntvoice>
    <cntfax>813-577-6946</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>Stable-based positives were produced from the original reproduction negatives (up to 35 per ONC sheet). These were digitized either through a scanning-raster to vector conversion or hand digitized into vector form. The vector data was then tagged with attribute information using ARC-INFO software. Transformation to geographic coordinates was performed using the projection graticules for each sheet. Digital information was edge matched between sheets to create large regional datasets. These were then subdivided into 5 x 5 degree tiles and converted from ARC/INFO to VPF. The data was then pre-mastered for CD-ROM. QC was performed by a separate group for each step in the production process.</procdesc>
    <procdate>199112</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Environmental Systems Research Institute</cntorg>
    </cntorgp>
    <cntpos>Applications Division</cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>380 New York St.</address>
    <city>Redlands</city>
    <state>CA</state>
    <postal>92373</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>909-793-2853</cntvoice>
    <cntfax>909-793-5953</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>Daily AVHRR images were averaged for two week time periods over the entire US growing season. These averaged images, their rates of change, elevation information, and other data were used to produce a single land classification image of the contental US. The VMap-0 data set extended this coverage over the Canadian land mass, however vegetation classification was further subdivided into nine vegetation types.</procdesc>
    <procdate>199402</procdate>
    <srcprod>EROS data</srcprod>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>USGS Eros Data Center</cntorg>
    </cntorgp>
    <cntpos></cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address></address>
    <city>Sioux Falls</city>
    <state>SD</state>
    <postal></postal>
    <country>US</country>
    </cntaddr>
    <cntvoice></cntvoice>
    <cntfax></cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>The Eros data (raster files) were converted to vector polygon, splined (remove stairstepping), thinned (all ploygons under 2km2 were deleted), and tied to existing DCW polygons (water bodies, built-up areas). The resulting file was tiled and converted to a VPF Vegetation coverage for use in the DCW. All processing was performed using ARC-INFO software.</procdesc>
    <procdate>199412</procdate>
    <srcprod>VMap-0 Vegetation Coverage</srcprod>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Geonex</cntorg>
    </cntorgp>
    <cntpos></cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>8950 North 9th Ave.</address>
    <city>St. Petersburg</city>
    <state>FL</state>
    <postal>33702</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>813-578-0100</cntvoice>
    <cntfax>813-577-6946</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>Data was translated from VPF format to ArcInfo Coverage format. The coverages were then loaded into a seamless ArcSDE layer.</procdesc>
    <procdate>02152001</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Geodesy Team, Harvard University</cntorg>
    </cntorgp>
    <cntemail>[email protected]</cntemail>
    </cntinfo>
    </proccont>
    </procstep>
    </lineage>
    </dataqual>
    <spdoinfo>
    <direct>Vector</direct>
    <ptvctinf>
    <sdtsterm>
    <sdtstype>Complete chain</sdtstype>
    </sdtsterm>
    <sdtsterm>
    <sdtstype>Label point</sdtstype>
    </sdtsterm>
    <sdtsterm>
    <sdtstype>GT-polygon composed of chains</sdtstype>
    </sdtsterm>
    <sdtsterm>
    <sdtstype>Point</sdtstype>
    </sdtsterm>
    <vpfterm>
    <vpflevel>3</vpflevel>
    <vpfinfo>
    <vpftype>Node</vpftype>
    </vpfinfo>
    <vpfinfo>
    <vpftype>Edge</vpftype>
    </vpfinfo>
    <vpfinfo>
    <vpftype>Face</vpftype>
    </vpfinfo>
    </vpfterm>
    </ptvctinf>
    </spdoinfo>
    <spref>
    <horizsys>
    <geograph>
    <latres>0.000000</latres>
    <longres>0.000000</longres>
    <geogunit>Decimal degrees</geogunit>
    </geograph>
    <geodetic>
    <horizdn>D_WGS_1984</horizdn>
    <ellips>WGS_1984</ellips>
    <semiaxis>6378137.000000</semiaxis>
    <denflat>298.257224</denflat>
    </geodetic>
    </horizsys>
    <vertdef>
    <altsys>
    <altdatum>Mean Sea Level</altdatum>
    <altunits>1.0</altunits>
    </altsys>
    </vertdef>
    </spref>
    <eainfo>
    <detailed>
    <enttyp>
    <enttypl>
    lc.pat</enttypl>
    </enttyp>
    <attr>
    <attrlabl>FID</attrlabl>
    <attrdef>Internal feature number.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Sequential unique whole numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>Shape</attrlabl>
    <attrdef>Feature geometry.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Coordinates defining the features.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>AREA</attrlabl>
    <attrdef>Area of feature in internal units squared.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Positive real numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>PERIMETER</attrlabl>
    <attrdef>Perimeter of feature in internal units.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Positive real numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>LC#</attrlabl>
    <attrdef>Internal feature number.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Sequential unique whole numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>LC-ID</attrlabl>
    <attrdef>User-defined feature number.</attrdef>
    <attrdefs>ESRI</attrdefs>
    </attr>
    <attr>
    <attrlabl>LCAREA.AFT_ID</attrlabl>
    </attr>
    <attr>
    <attrlabl>LCPYTYPE</attrlabl>
    <attrdef>Land cover poygon type</attrdef>
    <attrdefs>NIMA</attrdefs>
    <attrdomv>
    <edom>
    <edomv>1</edomv>
    <edomvd>Rice Field</edomvd>
    </edom>
    <edom>
    <edomv>2</edomv>
    <edomvd>Cranberry bog</edomvd>
    </edom>
    <edom>
    <edomv>3</edomv>
    <edomvd>Cultivated area, garden</edomvd>
    </edom>
    <edom>
    <edomv>4</edomv>
    <edomvd>Peat cuttings</edomvd>
    </edom>
    <edom>
    <edomv>5</edomv>
    <edomvd>Salt pan</edomvd>
    </edom>
    <edom>
    <edomv>6</edomv>
    <edomvd>Fish pond or hatchery</edomvd>
    </edom>
    <edom>
    <edomv>7</edomv>
    <edomvd>Quarry, strip mine, mine dump, blasting area</edomvd>
    </edom>
    <edom>
    <edomv>8</edomv>
    <edomvd>Oil or gas</edomvd>
    </edom>
    <edom>
    <edomv>10</edomv>
    <edomvd>Lava flow</edomvd>
    </edom>
    <edom>
    <edomv>11</edomv>
    <edomvd>Distorted surface area</edomvd>
    </edom>
    <edom>
    <edomv>12</edomv>
    <edomvd>Unconsolidated material (sand or gravel, glacial moraine)</edomvd>
    </edom>
    <edom>
    <edomv>13</edomv>
    <edomvd>Natural landmark area</edomvd>
    </edom>
    <edom>
    <edomv>14</edomv>
    <edomvd>Inundated area</edomvd>
    </edom>
    <edom>
    <edomv>15</edomv>
    <edomvd>Undifferentiated wetlands</edomvd>
    </edom>
    <edom>
    <edomv>99</edomv>
    <edomvd>None</edomvd>
    </edom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>TILE_ID</attrlabl>
    <attrdef>VPF Format tile ID</attrdef>
    <attrdefs>NIMA</attrdefs>
    </attr>
    <attr>
    <attrlabl>FAC_ID</attrlabl>
    </attr>
    </detailed>
    <overview>
    <eaover>The DCW used a product-specific attribute coding system that is composed of TYPE and STATUS designators for area, line, and point features; and LEVEL and SYMBOL designators for text features. The TYPE attribute specifies what the feature is, while the STATUS attribute specifies the current condition of the feature. Some features require both a TYPE and STATUS code to uniquely identify their characteristics. In order to uniquely identify each geographic attribute in the DCW, the TYPE and STATUS attribute code names are preceded by the two letter coverage abbreviation and a two letter abbreviation for the type of graphic primitive present. The DCW Type/Status codes were mapped into the FACC coding scheme. A full description of FACC may be found in Digital Geographic Information Exchange Standard Edition 1.2, January 1994.</eaover>
    <eadetcit>Entities (features) and Attributes for DCW are fully described in: Department of Defense, 1992, Military Specification Digital Chart of the World (MIL-D-89009): Philadelphia, Department of Defense, Defense Printing Service Detachment Office.</eadetcit>
    </overview>
    </eainfo>
    <distinfo>
    <distrib>
    <cntinfo>
    <cntorgp>
    <cntorg>NIMA</cntorg>
    </cntorgp>
    <cntpos>ATTN: CC, MS D-16</cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>6001 MacArthur Blvd.</address>
    <city>Bethesda</city>
    <state>MD</state>
    <postal>20816-5001</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>301-227-2495</cntvoice>
    <cntfax>301-227-2498</cntfax>
    </cntinfo>
    </distrib>
    <distliab>None</distliab>
    <stdorder>
    <digform>
    <digtinfo>
    <formname>VPF</formname>
    <formverd>19930930</formverd>
    <formspec>Military Standard Vector Product Format (MIL-STD-2407). The current version of this document is dated 28 June 1996. Edition 3 of VMap-0 conforms to a previous version of the VPF Standard as noted. Future versions of VMap-0 will conform to the current version of VPF Standard.</formspec>
    <transize>0.172</transize>
    </digtinfo>
    <digtopt>
    <offoptn>
    <offmedia>CD-ROM</offmedia>
    <recfmt>ISO 9660</recfmt>
    </offoptn>
    </digtopt>
    </digform>
    <fees>Not Applicable</fees>
    </stdorder>
    </distinfo>
    <distinfo>
    <distrib>
    <cntinfo>
    <cntorgp>
    <cntorg>USGS Map Sales</cntorg>
    </cntorgp>
    <cntaddr>
    <addrtype>mailing address</addrtype>
    <address>Box 25286</address>
    <city>Denver</city>
    <state>CO</state>
    <postal>80225</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>303-236-7477</cntvoice>
    <cntfax>303-236-1972</cntfax>
    </cntinfo>
    </distrib>
    <distliab>None</distliab>
    <stdorder>
    <digform>
    <digtinfo>
    <transize>0.172</transize>
    </digtinfo>
    </digform>
    <fees>$82.50 per four disk set</fees>
    <ordering>For General Public: Payment (check, money order, purchase order, or Government account) must accompany order.
    Make all drafts payable to Dept. of the Interior- US Geological Survey.
    To provide a general idea of content, a sample data set is available from the TMPO Home Page at:</ordering>
    </stdorder>
    </distinfo>
    <distinfo>
    <distrib>
    <cntinfo>
    <cntorgp>
    <cntorg>Geodesy Team, Harvard University</cntorg>
    <cntper>Geodesy Team</cntper>
    </cntorgp>
    <cntemail>[email protected]</cntemail>
    </cntinfo>
    </distrib>
    <resdesc>Geodesy layer</resdesc>
    <distliab>None</distliab>
    <stdorder>
    <digform>
    <digtinfo>
    <formname>SHP</formname>
    <transize>0.172</transize>
    </digtinfo>
    <digtopt>
    <onlinopt>
    <computer>
    <networka>
    <networkr>geodesy.harvard.edu</networkr>
    </networka>
    </computer>
    </onlinopt>
    </digtopt>
    </digform>
    <fees>none</fees>
    </stdorder>
    <availabl>
    <timeinfo>
    <sngdate>
    <caldate>1992</caldate>
    </sngdate>
    </timeinfo>
    </availabl>
    </distinfo>
    <metainfo>
    <metd>20010226</metd>
    <metc>
    <cntinfo>
    <cntorgp>
    <cntorg>Geodesy Team</cntorg>
    <cntper>REQUIRED: The person responsible for the metadata information.</cntper>
    </cntorgp>
    <cntvoice>REQUIRED: The telephone number by which individuals can speak to the organization or individual.</cntvoice>
    <cntemail>[email protected]</cntemail>
    </cntinfo>
    </metc>
    <metstdn>FGDC Content Standards for Digital Geospatial Metadata</metstdn>
    <metstdv>FGDC-STD-001-1998</metstdv>
    <mettc>local time</mettc>
    <metextns>
    <onlink>http://www.esri.com/metadata/esriprof80.html</onlink>
    <metprof>ESRI Metadata Profile</metprof>
    </metextns>
    </metainfo>
    </metadata>

    Have you tried the directory and bfile methods? Here is the example for that in the Oracle XML Developer's Guide:
    CREATE DIRECTORY xmldir AS 'path_to_folder_containing_XML_file';
    Example 3-3 Inserting XML Content into an XMLType Table
    INSERT INTO mytable2 VALUES (XMLType(bfilename('XMLDIR', 'purchaseOrder.xml'),
    nls_charset_id('AL32UTF8')));
    1 row created.
    The value passed to nls_charset_id() indicates that the encoding for the file to be read is UTF-8.
    ben
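    For a relational table with an XMLType column, such as xmltable_1 described in the question, the same approach should carry over. A sketch, assuming the XMLDIR directory object from the example above and that the metadata document has been saved there as cities_metadata.xml (a hypothetical file name):
    INSERT INTO xmltable_1 (doc_id, xml_data)
    VALUES (1, XMLType(bfilename('XMLDIR', 'cities_metadata.xml'),
    nls_charset_id('AL32UTF8')));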

  • Need help with Berkeley XML DB Performance

    We need help with maximizing the performance of our use of Berkeley XML DB. I am filling in most of the 29-part questionnaire as listed by Oracle's BDB team.
    Berkeley DB XML Performance Questionnaire
    1. Describe the Performance area that you are measuring? What is the
    current performance? What are your performance goals you hope to
    achieve?
    We are measuring the performance while loading a document during
    web application startup. It is currently taking 10-12 seconds when
    only one user is on the system. We are trying to do some testing to
    get the load time when several users are on the system.
    We would like the load time to be 5 seconds or less.
    2. What Berkeley DB XML Version? Any optional configuration flags
    specified? Are you running with any special patches? Please specify?
    dbxml 2.4.13. No special patches.
    3. What Berkeley DB Version? Any optional configuration flags
    specified? Are you running with any special patches? Please Specify.
    bdb 4.6.21. No special patches.
    4. Processor name, speed and chipset?
    Intel Xeon CPU 5150 2.66GHz
    5. Operating System and Version?
    Red Hat Enterprise Linux Release 4 Update 6
    6. Disk Drive Type and speed?
    Don't have that information
    7. File System Type? (such as EXT2, NTFS, Reiser)
    EXT3
    8. Physical Memory Available?
    4GB
    9. Are you using Replication (HA) with Berkeley DB XML? If so, please
    describe the network you are using, and the number of replicas.
    No
    10. Are you using a Remote Filesystem (NFS) ? If so, for which
    Berkeley DB XML/DB files?
    No
    11. What type of mutexes do you have configured? Did you specify
    --with-mutex=? Specify what you find in your config.log; search
    for db_cv_mutex.
    None. Did not specify --with-mutex during bdb compilation.
    12. Which API are you using (C++, Java, Perl, PHP, Python, other) ?
    Which compiler and version?
    Java 1.5
    13. If you are using an Application Server or Web Server, please
    provide the name and version?
    Oracle Application Server 10.1.3.4.0
    14. Please provide your exact Environment Configuration Flags (include
    anything specified in you DB_CONFIG file)
    Default.
    15. Please provide your Container Configuration Flags?
    final EnvironmentConfig envConf = new EnvironmentConfig();
    envConf.setAllowCreate(true); // If the environment does not
    // exist, create it.
    envConf.setInitializeCache(true); // Turn on the shared memory
    // region.
    envConf.setInitializeLocking(true); // Turn on the locking subsystem.
    envConf.setInitializeLogging(true); // Turn on the logging subsystem.
    envConf.setTransactional(true); // Turn on the transactional
    // subsystem.
    envConf.setLockDetectMode(LockDetectMode.MINWRITE);
    envConf.setThreaded(true);
    envConf.setErrorStream(System.err);
    envConf.setCacheSize(1024*1024*64);
    envConf.setMaxLockers(2000);
    envConf.setMaxLocks(2000);
    envConf.setMaxLockObjects(2000);
    envConf.setTxnMaxActive(200);
    envConf.setTxnWriteNoSync(true);
    envConf.setMaxMutexes(40000);
    16. How many XML Containers do you have? For each one please specify:
    One.
    1. The Container Configuration Flags
              XmlContainerConfig xmlContainerConfig = new XmlContainerConfig();
              xmlContainerConfig.setTransactional(true);
    xmlContainerConfig.setIndexNodes(true);
    xmlContainerConfig.setReadUncommitted(true);
    2. How many documents?
    Every time the user logs in, the current XML document is loaded from
    an Oracle database table and put into the Berkeley XML DB.
    The documents get deleted from the XML DB when the Oracle application
    server container is stopped.
    The number of documents starts at zero initially and
    grows with every login.
    3. What type (node or wholedoc)?
    Node
    4. Please indicate the minimum, maximum and average size of
    documents?
    The minimum is about 2 MB and the maximum could be 20 MB. The average
    is mostly about 5 MB.
    5. Are you using document data? If so please describe how?
    We are using document data only to save changes made
    to the application data in a web application. The final save goes
    to the relational database. Berkeley XML DB is just used to store
    temporary data since going to the relational database for each change
    will cause severe performance issues.
    17. Please describe the shape of one of your typical documents? Please
    do this by sending us a skeleton XML document.
    Due to the sensitive nature of the data, I can provide XML schema instead.
    18. What is the rate of document insertion/update required or
    expected? Are you doing partial node updates (via XmlModify) or
    replacing the document?
    The document is inserted during user login. Any change made to the application
    data grid or other data components gets saved in Berkeley DB. We also have
    an automatic save every two minutes. The final save from the application
    gets saved in a relational database.
    19. What is the query rate required/expected?
    Users will not be entering data rapidly. There will be a lot of think time
    before the users enter/modify data in the web application. This is a pilot
    project, but when we go live with this application, we expect 25 users
    at the same time.
    20. XQuery -- supply some sample queries
    1. Please provide the Query Plan
    2. Are you using DBXML_INDEX_NODES?
    Yes.
    3. Display the indices you have defined for the specific query.
         XmlIndexSpecification spec = container.getIndexSpecification();
         // ids
         spec.addIndex("", "id", XmlIndexSpecification.PATH_NODE | XmlIndexSpecification.NODE_ATTRIBUTE | XmlIndexSpecification.KEY_EQUALITY, XmlValue.STRING);
         spec.addIndex("", "idref", XmlIndexSpecification.PATH_NODE | XmlIndexSpecification.NODE_ATTRIBUTE | XmlIndexSpecification.KEY_EQUALITY, XmlValue.STRING);
         // index to cover AttributeValue/Description
         spec.addIndex("", "Description", XmlIndexSpecification.PATH_EDGE | XmlIndexSpecification.NODE_ELEMENT | XmlIndexSpecification.KEY_SUBSTRING, XmlValue.STRING);
         // cover AttributeValue/@value
         spec.addIndex("", "value", XmlIndexSpecification.PATH_EDGE | XmlIndexSpecification.NODE_ATTRIBUTE | XmlIndexSpecification.KEY_EQUALITY, XmlValue.STRING);
         // item attribute values
         spec.addIndex("", "type", XmlIndexSpecification.PATH_EDGE | XmlIndexSpecification.NODE_ATTRIBUTE | XmlIndexSpecification.KEY_EQUALITY, XmlValue.STRING);
         // default index
         spec.addDefaultIndex(XmlIndexSpecification.PATH_NODE | XmlIndexSpecification.NODE_ELEMENT | XmlIndexSpecification.KEY_EQUALITY, XmlValue.STRING);
         spec.addDefaultIndex(XmlIndexSpecification.PATH_NODE | XmlIndexSpecification.NODE_ATTRIBUTE | XmlIndexSpecification.KEY_EQUALITY, XmlValue.STRING);
         // save the spec to the container
         XmlUpdateContext uc = xmlManager.createUpdateContext();
         container.setIndexSpecification(spec, uc);
    4. If this is a large query, please consider sending a smaller
    query (and query plan) that demonstrates the problem.
    21. Are you running with Transactions? If so please provide any
    transactions flags you specify with any API calls.
    Yes. READ_UNCOMMITED in some and READ_COMMITTED in other transactions.
    22. If your application is transactional, are your log files stored on
    the same disk as your containers/databases?
    Yes.
    23. Do you use AUTO_COMMIT?
         No.
    24. Please list any non-transactional operations performed?
    No.
    25. How many threads of control are running? How many threads in read
    only mode? How many threads are updating?
    We use Berkeley XML DB within the context of a struts web application.
    Each user logged into the web application will be running a bdb transaction
    within the context of a struts action thread.
    26. Please include a paragraph describing the performance measurements
    you have made. Please specifically list any Berkeley DB operations
    where the performance is currently insufficient.
    We are clocking 10-12 seconds to load a document from bdb when
    five users are on the system:
    getContainer().getDocument(documentName);
    27. What performance level do you hope to achieve?
    We would like to get less than 5 seconds when 25 users are on the system.
    28. Please send us the output of the following db_stat utility commands
    after your application has been running under "normal" load for some
    period of time:
    % db_stat -h database environment -c
    % db_stat -h database environment -l
    % db_stat -h database environment -m
    % db_stat -h database environment -r
    % db_stat -h database environment -t
    (These commands require the db_stat utility access a shared database
    environment. If your application has a private environment, please
    remove the DB_PRIVATE flag used when the environment is created, so
    you can obtain these measurements. If removing the DB_PRIVATE flag
    is not possible, let us know and we can discuss alternatives with
    you.)
    If your application has periods of "good" and "bad" performance,
    please run the above list of commands several times, during both
    good and bad periods, and additionally specify the -Z flags (so
    the output of each command isn't cumulative).
    When possible, please run basic system performance reporting tools
    during the time you are measuring the application's performance.
    For example, on UNIX systems, the vmstat and iostat utilities are
    good choices.
    Will give this information soon.
    29. Are there any other significant applications running on this
    system? Are you using Berkeley DB outside of Berkeley DB XML?
    Please describe the application?
    No to the first two questions.
    The web application is an online review of test questions. The users
    log in and then review the items one by one. The relational database
    holds the data in XML. During application load, the application
    retrieves the XML and then saves it to bdb. While the user
    is making changes to the data in the application, it writes those
    changes to bdb. Finally, when the user hits the SAVE button, the data
    gets saved to the relational database. We also have an automatic save
    every two minutes, which takes the bdb XML data and saves it to the
    relational database.
    Thanks,
    Madhav
    [email protected]

    Could it be that you simply have not set up indexes to support your query? If so, you could do some basic testing using the dbxml shell:
    milu@colinux:~/xpg > dbxml -h ~/dbenv
    Joined existing environment
    dbxml> setverbose 7 2
    dbxml> open tv.dbxml
    dbxml> listIndexes
    dbxml> query     { collection()[//@date-tip]/*[@chID = ('ard','zdf')] (: example :) }
    dbxml> queryplan { collection()[//@date-tip]/*[@chID = ('ard','zdf')] (: example :) }
    Verbosity will make the engine display some (rather cryptic) information on index usage. I can't remember where the output is explained; my feeling is that "V(...)" means the index is being used (which is good), but that observation may not be accurate. Note that some details in the setVerbose command could differ, as I'm using 2.4.16 while you're using 2.4.13.
    Also, take a look at the query plan. You can post it here and some people will be able to diagnose it.
    Michael Ludwig

  • Inserting XML data into xmltype column

    Oracle version: 10.1.0.5
    OpenVms Alpha V8.3
    1) I tried this and got the error shown below. I removed the charset and passed a zero instead; same error.
    INSERT INTO xml_demo (xml_data) -- column of xmltype
    VALUES (
    xmltype(
    bfilename('XML_DIR', 'MOL.XML'),
    nls_charset_id('AL32UTF8')
    )
    );
    ORA-22993: specified input amount is greater than actual source amount
    ORA-06512: at "SYS.DBMS_LOB", line 637
    ORA-06512: at "SYS.XMLTYPE", line 283
    ORA-06512: at line 1
    2) This PL/SQL block works. However, the maximum RAW size is around 32K, and the file can be around 100K. Maybe I could load it into a table of RAW and somehow concatenate it on insert, but I'm not sure whether this is possible, and I am sure there must be a simpler way of doing this.
    A subset of the XML file is pasted below.
    set serveroutput on size 1000000
    DECLARE
    file1 bfile;
    v_xml XMLType;
    len1 number(6);
    v_rec1 raw(32000);
    BEGIN
    file1 := bfilename('XML_DIR','MOL.XML');
    DBMS_LOB.fileopen(file1, DBMS_LOB.file_readonly);
    len1 := DBMS_LOB.getLength(file1);
    v_rec1 := dbms_lob.substr(file1,len1,1);
    v_xml := xmltype(UTL_RAW.CAST_TO_VARCHAR2(v_rec1));
    INSERT INTO xml_demo (xml_data) VALUES (v_xml);
    COMMIT;
    DBMS_LOB.fileclose(file1);
    exception
    when others then
    dbms_output.put_LINE (sqlerrm);
    DBMS_LOB.fileclose(file1);
    END;
    <?xml version="1.0" encoding="UTF-8"?>
    <MolDocument DtdVersion="3" DtdRelease="0">
    <DocumentIdentification v="MOL_20100331_1500_1600"/>
    <DocumentVersion v="1"/>
    <DocumentType v="A43"/>
    <SenderIdentification codingScheme="A01" v="17X100Z100Z0001H"/>
    <SenderRole v="A35"/>
    <ReceiverIdentification codingScheme="A01" v="10XFR-RTE------Q"/>
    <ReceiverRole v="A04"/>
    <CreationDateTime v="2010-03-31T14:10:00Z"/>
    <ValidTimeInterval v="2010-03-31T15:00Z/2010-03-31T16:00Z"/>
    <Domain codingScheme="A01" v="10YDOM-1001A001A"/>
    <MolTimeSeries>
    <ContractIdentification v="RTE_20100331_1500_16"/>
    <ResourceProvider codingScheme="A01" v="10XFR-RTE------Q"/>
    <AcquiringArea codingScheme="A01" v="17Y100Z100Z00013"/>
    <ConnectingArea codingScheme="A01" v="10YFR-RTE------C"/>
    <AuctionIdentification v="AUCTION_20100331_1500_1600"/>
    <BusinessType v="A10"/>
    <BidTimeInterval v="2010-03-31T15:00Z/2010-03-31T16:00Z"/>
    <MeasureUnitQuantity v="MAW"/>
    <Currency v="EUR"/>
    <MeasureUnitPrice v="MWH"/>
    <Direction v="A02"/>
    <MinimumActivationQuantity v="50"/>
    <Status v="A06"/>
    <Period>
    <TimeInterval v="2010-03-31T15:00Z/2010-03-31T16:00Z"/>
    <Resolution v="PT60M"/>
    <Interval>
    <Pos v="1"/>
    <Qty v="50"/>
    <EnergyPrice v="50.45"/>
    </Interval>
    </Period>
    </MolTimeSeries>
    </MolDocument>
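    A sketch of another route that sidesteps both the 32K RAW limit and the ORA-22993 raised by the direct bfilename call: stream the BFILE into a temporary CLOB with DBMS_LOB.LOADCLOBFROMFILE and build the XMLType from that (same XML_DIR directory and MOL.XML file as above):
    DECLARE
    l_bfile bfile := bfilename('XML_DIR', 'MOL.XML');
    l_clob clob;
    l_dest integer := 1;
    l_src integer := 1;
    l_lang integer := dbms_lob.default_lang_ctx;
    l_warning integer;
    BEGIN
    dbms_lob.createtemporary(l_clob, TRUE);
    dbms_lob.fileopen(l_bfile, dbms_lob.file_readonly);
    -- copy the whole file; DBMS_LOB works out the actual length
    dbms_lob.loadclobfromfile(
    dest_lob => l_clob,
    src_bfile => l_bfile,
    amount => dbms_lob.lobmaxsize,
    dest_offset => l_dest,
    src_offset => l_src,
    bfile_csid => nls_charset_id('AL32UTF8'),
    lang_context => l_lang,
    warning => l_warning);
    dbms_lob.fileclose(l_bfile);
    INSERT INTO xml_demo (xml_data) VALUES (XMLType(l_clob));
    COMMIT;
    dbms_lob.freetemporary(l_clob);
    END;
    /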

    Marc,
    Thanks. I understand what you are saying. I have been copying files in binary mode from NT servers into VMS. I have to get a proper XML file via FTP from the originating system to investigate further.
    I have one last item I need help on. If anything looks obvious, let me know:
    1) The XSD definition of Qty (type: QuantityType) and EnergyPrice (type: AmountType)
                   <xsd:element name="Qty" type="ecc:QuantityType">
                        <xsd:annotation>
                             <xsd:documentation/>
                        </xsd:annotation>
                   </xsd:element>
                   <xsd:element name="EnergyPrice" type="ecc:AmountType" minOccurs="0">
                        <xsd:annotation>
                             <xsd:documentation/>
                        </xsd:annotation>
                   </xsd:element>
    2) Definition of AmountType and QuantityType in the parent XSD
         <xsd:complexType name="AmountType">
              <xsd:annotation>
                   <xsd:documentation>
                        <Uid>ET0022</Uid>
                        <Definition>The monetary value of an object</Definition>
                   </xsd:documentation>
              </xsd:annotation>
              <xsd:attribute name="v" use="required">
                   <xsd:simpleType>
                        <xsd:restriction base="xsd:decimal">
                             <xsd:totalDigits value="17"/>
                        </xsd:restriction>
                   </xsd:simpleType>
              </xsd:attribute>
         </xsd:complexType>
         <!--_________________________________________________-->
         <xsd:complexType name="QuantityType">
              <xsd:annotation>
                   <xsd:documentation>
                        <Uid>ET0012</Uid>
                        <Definition>(Synonym "qty") The quantity of an energy product. Positive quantities shall not have a sign.</Definition>
                   </xsd:documentation>
              </xsd:annotation>
              <xsd:attribute name="v" type="xsd:decimal" use="required"/>
         </xsd:complexType>
         <!--________________
    3) The data in the XML file:
    <Period>
    <TimeInterval v="2010-03-31T15:00Z/2010-03-31T16:00Z"/>
    <Resolution v="PT60M"/>
    <Interval>
    <Pos v="1"/>
    <Qty v="50"/>
    <EnergyPrice v="50.45"/>
    </Interval>
    4) When I do the load, the EnergyPrice is saved in the XMLType column as <EnergyPrice v="50"/>, losing its decimal value of .45.
    5) When I select as follows:
    **DEV** SQL>> l
    1 SELECT
    2 EXTRACTVALUE(x2.column_value,'/MolTimeSeries/Period/Interval/EnergyPrice/@v') v1,
    3 EXTRACTVALUE(x2.column_value,'/MolTimeSeries/Period/Interval/EnergyPrice') v2,
    4 EXTRACTVALUE(x2.column_value,'/MolTimeSeries/Period/Interval/Qty') v3
    5 FROM balit_mol_xml x,
    6 TABLE(
    7 XMLSEQUENCE(
    8 EXTRACT(x.xml_payload, '/MolDocument/MolTimeSeries')
    9 )
    10 ) x2
    11* WHERE EXISTSNODE(x.xml_payload,'/MolDocument/DocumentIdentification[@v="MOL_20100331_1500_1600"]') = 1
    6) I get the result:
    50
    AmountType479_T(XDB$RAW_LIST_T('1301000000'), 50)
    QuantityType471_T(XDB$RAW_LIST_T('1301000000'), 50)
    7) And both results show that XDB$RAW_LIST_T('1301000000') value.
    Does that tell you what I am doing wrong?
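    One way to see where the decimal is being dropped (a sketch; the generated type names AmountType479_T and QuantityType471_T are taken from the output above and are normally created case-sensitive, so quote them exactly) is to look at the precision and scale that schema registration generated for the "v" attribute. If it comes back as a NUMBER with scale 0 (which a totalDigits facet without a matching fractionDigits facet can produce), that would explain 50.45 being stored as 50:
    SELECT type_name, attr_name, attr_type_name, length, precision, scale
      FROM user_type_attrs
     WHERE type_name IN ('AmountType479_T', 'QuantityType471_T');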

  • Urgent : Need help in parsing XML from Sharepoint and save it into DB

    Hi,
    I am a SharePoint guy and a newbie in Oracle PL/SQL.
    I am using the UTL_DBWS package to call a SharePoint web service, and that was successful. Now the XML has to be parsed and stored into a table. I am facing an issue because the XML has different namespaces, and a normal XPath query is not working.
    Below is the XML; I need help parsing it.
    declare
    responsexml sys.XMLTYPE;
    testparsexml sys.XMLTYPE;
    begin
    responsexml := sys.XMLTYPE('<GetListItemsResponse xmlns="http://schemas.microsoft.com/sharepoint/soap/">
    <GetListItemsResult>
    <listitems xmlns:s="uuid:BDC6E3F0-6DA3-11d1-A2A3-00AA00C14882" xmlns:dt="uuid:C2F41010-65B3-11d1-A29F-00AA00C14882" xmlns:rs="urn:schemas-microsoft-com:rowset" xmlns:z="#RowsetSchema">
    <rs:data ItemCount="2">
    <z:row ows_MetaInfo="1;#" ows__ModerationStatus="0" ows__Level="1" ows_Title="Test Title 1" ows_ID="1" ows_owshiddenversion="1" ows_UniqueId="1;#{9C45D54E-150E-4509-B59A-DB5A1B97E034}" ows_FSObjType="1;#0" ows_Created="2009-09-12 17:13:16" ows_FileRef="1;#Lists/Tasks/1_.000"/>
    <z:row ows_MetaInfo="2;#" ows__ModerationStatus="0" ows__Level="1" ows_Title="Testing Tasks" ows_ID="2" ows_owshiddenversion="1" ows_UniqueId="2;#{8942E211-460B-422A-B1AD-1347F062114A}" ows_FSObjType="2;#0" ows_Created="2010-02-14 16:44:40" ows_FileRef="2;#Lists/Tasks/2_.000"/>
    </rs:data>
    </listitems>
    </GetListItemsResult>
    </GetListItemsResponse>');
    testparsexml := responsexml.extract('/GetListItemsResponse/GetListItemsResult/listitems/rs:data/z:row/@ows_Title');
    DBMS_OUTPUT.PUT_LINE(testparsexml.extract('/').getstringval());
    end;
    The issue is with the rs:data and z:row nodes. Please suggest how to handle these kinds of namespaces in Oracle.
    I need to parse the attribute "ows_Title" and save it into the DB.
    This script generates "Error occured in XML Parsing".
    Help is appreciated, thanks for looking.

    SQL> SELECT *
      FROM XMLTABLE (
              xmlnamespaces ('http://schemas.microsoft.com/sharepoint/soap/' as "soap",
                              '#RowsetSchema' AS "z"),
               'for $i in //soap:*//z:row return $i'
              PASSING xmltype (
                         '<GetListItemsResponse xmlns="http://schemas.microsoft.com/sharepoint/soap/">
    <GetListItemsResult>
    <listitems xmlns:s="uuid:BDC6E3F0-6DA3-11d1-A2A3-00AA00C14882" xmlns:dt="uuid:C2F41010-65B3-11d1-A29F-00AA00C14882" xmlns:rs="urn:schemas-microsoft-com:rowset" xmlns:z="#RowsetSchema">
    <rs:data ItemCount="2">
    <z:row ows_MetaInfo="1;#" ows__ModerationStatus="0" ows__Level="1" ows_Title="Test Title 1" ows_ID="1" ows_owshiddenversion="1" ows_UniqueId="1;#{9C45D54E-150E-4509-B59A-DB5A1B97E034}" ows_FSObjType="1;#0" ows_Created="2009-09-12 17:13:16" ows_FileRef="1;#Lists/Tasks/1_.000"/>
    <z:row ows_MetaInfo="2;#" ows__ModerationStatus="0" ows__Level="1" ows_Title="Testing Tasks" ows_ID="2" ows_owshiddenversion="1" ows_UniqueId="2;#{8942E211-460B-422A-B1AD-1347F062114A}" ows_FSObjType="2;#0" ows_Created="2010-02-14 16:44:40" ows_FileRef="2;#Lists/Tasks/2_.000"/>
    </rs:data>
    </listitems>
    </GetListItemsResult>
    </GetListItemsResponse>')
    columns ows_MetaInfo varchar2(20) path '@ows_MetaInfo',
             ows_Title varchar2(20) path '@ows_Title',
             ows__ModerationStatus varchar2(20) path '@ows__ModerationStatus'
          );
    OWS_METAINFO         OWS_TITLE            OWS__MODERATIONSTATUS
    1;#                  Test Title 1         0                   
    2;#                  Testing Tasks        0                   
    2 rows selected.
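    If the parsing needs to stay inside the original anonymous block, the namespace-map overload of XMLType.extract can be used instead of XMLTABLE. A minimal sketch (substitute the full document literal for the placeholder; the prefix names sp, rs and z are arbitrary, and when several z:row elements match, extract returns the attribute values concatenated, so XMLTABLE above stays the cleaner option for loading rows into a table):
    declare
      responsexml  sys.XMLTYPE;
      testparsexml sys.XMLTYPE;
    begin
      responsexml := sys.XMLTYPE('... the same GetListItemsResponse document as above ...');
      testparsexml := responsexml.extract(
          '/sp:GetListItemsResponse/sp:GetListItemsResult/sp:listitems/rs:data/z:row/@ows_Title',
          'xmlns:sp="http://schemas.microsoft.com/sharepoint/soap/" ' ||
          'xmlns:rs="urn:schemas-microsoft-com:rowset" ' ||
          'xmlns:z="#RowsetSchema"');
      dbms_output.put_line(testparsexml.getStringVal());
    end;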

  • Inserting XML document into XDB fails with can't convert to OPAQUE

    Hi,
    When I try to insert a document using oracle 9.2.0.5 client software into a 9.2.0.5 database, with the following code:
    void insertDocument(String document) throws SQLException {
        XMLType xt = XMLType.createXML(connection, document);
        PreparedStatement ps = connection.prepareStatement("insert into xmldocuments values(?)");
        ps.setObject(1, xt);
        ps.executeUpdate();
    }
    The setObject function always throws an exception (after a very long time) with:
    java.sql.SQLException: Fail to convert to internal representation: OPAQUE()
    This also fails when we use the InputStream and org.w3c.dom.Document variants.
    We use the OCI driver, otherwise we get errors retrieving the documents from XMLDB.
    What is going wrong here?

    David,
    If you search through the historical data in this list, there are previous posts regarding OPAQUE.
    These may be useful. Possibly you're reaching the size limit.
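    If the size limit is indeed the problem, one workaround that avoids binding the XMLType from the client altogether (a sketch only, assuming the single-column xmldocuments table from the question; the procedure name is made up) is to pass the document as a CLOB and let the database build the XMLType:
    CREATE OR REPLACE PROCEDURE insert_xmldocument (p_doc IN CLOB) AS
    BEGIN
      -- XMLTYPE(CLOB) is not subject to the 4000-byte string-literal limit,
      -- so large documents are fine here
      INSERT INTO xmldocuments VALUES (XMLTYPE(p_doc));
    END;
    /
    From JDBC the document can then be bound as an ordinary CLOB parameter (for example via setCharacterStream) in a call to this procedure, which sidesteps the OPAQUE conversion in the driver.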

  • Need help with this xml gallery !!!

    I have built a gallery but it is very simple... it takes images from an XML file.
    I have attached all the files in a zip.
    I just want two things, if anyone can help.
    First, when I press the next button it goes to the next image but with no effect; it just displays the next image. I want to incorporate a sliding effect when the image changes to the next one.
    Second, I want to use an autoplay feature:
    as soon as the SWF starts, the images come one by one, a few seconds apart.
    Thanks in advance... I really need help with this!

    You're welcome.
    I don't have an example to offer for the autorun.  You should be able to think it through.  One key, as I mentioned, is to preload all of the images first; that will allow for smooth playing of the show, with no waiting for images to load between changes.  You can load them into empty movie clips and hide them (_visible = false) until they are needed.  You could load them when called for, but then you would have to put conditions on displaying things until each image loads, so I recommend just loading them all first.
    For the timing you can use setInterval.  If something is going to be allowed to interrupt the autorun, then you will need to make use of the clearInterval function as well, so that you stop the clock.
    Since you will be wanting to know when things are loaded, you will need to use the MovieClipLoader.loadClip method for loading the images instead of using loadMovie.  This is because the MovieClipLoader class supports having an event listener.  If you look in the help documents in the MovieClipLoader.addListener section, there is an example there that provides a fairly good complete overview of using the code.  The only difference is you'd be looking for the onLoadComplete event rather than the onLoadInit event.

  • Want help in converting XML Document to a postscript file

    I want to convert an XML document to a PostScript file in Java, not from the command line. Any help is appreciated.

    Well, you first need an XML parser, which is easy to come by.
    Then you need to either write the PostScript out bit by bit or use a Java API that writes PostScript for you.
    It's really simple; the trick is HOW you decide to write out the PS.

  • Need help with INSERT and WITH clause

    I wrote a SQL statement which works correctly, but how do I use this statement with an INSERT query? NEED HELP. When I write the INSERT I see the error "ORA-32034: unsupported use of with clause".
    with t1 as(
    select a.budat,a.monat as period,b.vtweg,
    c.gjahr,c.buzei,c.shkzg,c.hkont, c.prctr,
    c.wrbtr,
    c.matnr,
    c.menge,
    a.monat,
    c.zuonr
    from ldw_v1.BKPF a,ldw_v1.vbrk b, ldw_v1.bseg c
    where a.AWTYP='VBRK' and a.BLART='RV' and a.BUKRS='8431' and a.awkey=b.vbeln
    and a.bukrs=c.bukrs and a.belnr=c.belnr and a.gjahr=c.gjahr and c.koart='D'
    and c.ktosl is null and c.gsber='4466' and a.gjahr>='2011' and b.vtweg='01'
    ), t2 as (
    select a.BUKRS,a.BELNR, a.GJAHR,t1.vtweg,t1.budat,t1.monat from t1, ldw_v1.bkpf a
    where t1.zuonr=a.xblnr and a.blart='WL' and bukrs='8431'
    ), tcogs as (
    select t2.budat,t2.monat,t2.vtweg, bseg.gjahr,bseg.hkont,bseg.prctr,
    sum(bseg.wrbtr) as COGS,bseg.matnr,bseg.kunnr,sum(bseg.menge) as QUANTITY
    from t2, ldw_v1.bseg
    where t2.bukrs=bseg.bukrs and t2.belnr=bseg.BELNR and t2.gjahr=bseg.gjahr and BSEG.KOART='S'
    group by t2.budat,t2.monat,t2.vtweg, bseg.gjahr,bseg.hkont,bseg.prctr,
    bseg.matnr,bseg.kunnr
    ), t3 as (
    select a.budat,a.monat,b.vtweg,
    c.gjahr,c.buzei,c.shkzg,c.hkont, c.prctr,
    case when c.shkzg='S' then c.wrbtr*(-1)
    else c.wrbtr end as NTS,
    c.matnr,c.kunnr,
    c.menge*(-1) as Quantity
    from ldw_v1.BKPF a,ldw_v1.vbrk b, ldw_v1.bseg c
    where a.AWTYP='VBRK' and a.BLART='RV' and a.BUKRS='8431' and a.awkey=b.vbeln
    and a.bukrs=c.bukrs and a.belnr=c.belnr and a.gjahr=c.gjahr and c.koart='S'
    and c.ktosl is null and c.gsber='4466' and a.gjahr>='2011' and b.vtweg='01'
    ), trevenue as (
    select t3.budat,t3.monat,t3.vtweg, t3.gjahr,t3.hkont,t3.prctr,
    sum(t3.NTS) as NTS,t3.matnr,t3.kunnr,sum(t3.QUANTITY) as QUANTITY
    from t3
    group by t3.budat,t3.monat,t3.vtweg, t3.gjahr,t3.hkont,t3.prctr,t3.matnr,t3.kunnr
    )
    select NVL(tr.budat,tc.budat) as budat,
    NVL(tr.monat,tc.monat) as monat,
    NVL(tr.vtweg,tc.vtweg) as vtweg,
    NVL(tr.gjahr, tc.gjahr) as gjahr,
    tr.hkont as NTS_hkont,
    tc.hkont as COGS_hkont,
    NVL(tr.prctr,tc.prctr) as prctr,
    NVL(tr.MATNR, tc.MATNR) as matnr,
    NVL(tr.kunnr, tc.kunnr) as kunnr,
    NVL(tr.Quantity, tc.Quantity) as Quantity,
    tr.NTS as NTS,
    tc.COGS as COGS
    from trevenue TR full outer join tcogs TC
    on TR.BUDAT=TC.BUDAT and TR.MONAT=TC.MONAT and TR.GJAHR=TC.GJAHR
    and TR.MATNR=TC.MATNR and TR.KUNNR=TC.KUNNR and TR.QUANTITY=TC.QUANTITY
    and TR.VTWEG=TC.VTWEG and TR.PRCTR=TC.PRCTR

    Without seeing what you tried it is hard to say what you did wrong, but this is how it would work
    SQL> create table t ( n number );
    Table created.
    SQL> insert into t
      2  with test_data as
      3    (select 1 x from dual union all
      4     select 2 x from dual union all
      5     select 3 x from dual union all
      6     select 4 x from dual)
      7  select x from test_data;
    4 rows created.
    SQL>
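    Applied to the posted statement, the fix is purely one of placement: the whole WITH ... clause goes after the INSERT INTO <table> (<column list>) line and before the final SELECT, exactly as in the example above. A self-contained sketch with two chained CTEs (the table and column names here are made up):
    CREATE TABLE result_t (gjahr VARCHAR2(4), total NUMBER);
    INSERT INTO result_t (gjahr, total)
    WITH t1 AS (
           SELECT '2011' AS gjahr, 10 AS wrbtr FROM dual
           UNION ALL
           SELECT '2011', 15 FROM dual
         ),
         t2 AS (
           SELECT gjahr, SUM(wrbtr) AS total
           FROM   t1
           GROUP  BY gjahr
         )
    SELECT gjahr, total FROM t2;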

  • Need help in insert statment query

    Hi,
    I have a table T1 with values like
    col1
    1
    2
    3
    4
    I need to write an insert statement for t2 along the lines of
    insert into t2(col1,col2) values ('AA', select col1 from t1);
    The output of T2 should be
    col1 col2
    AA 1
    AA 2
    AA 3
    AA 4
    Any help in modifying the query.
    Ashish

    What is wrong with this?
    INSERT INTO id_own_dw.id_t_dw_org_dq_tgt
           (cost_center_cod_vc_old,
            cost_center_cod_vc_new)
         SELECT '3016052',  -- <<< this you can replace with your variable in a PL/SQL block
                cost_center_cod_vc
           FROM id_own_dw.id_t_dw_msl_org_cctr2mkdiv
          WHERE busi_unit_cod_vc = '3016496';
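    Translated to the t1/t2 tables in the question, the same pattern is simply an INSERT ... SELECT with the literal in the select list (the VALUES form cannot wrap a multi-row subquery like that):
    INSERT INTO t2 (col1, col2)
    SELECT 'AA', col1
      FROM t1;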
