Using DB Adapter with XMLDB and CLOB

Hi,
With Oracle XML DB I can store XML documents in LOB or structured storage (see Oracle® XML DB Developer's Guide 10g Release 2 (10.2), B14259-02).
Now my question:
I want to use a database adapter in the ESB to trigger several BPEL processes. Is there a way to transform such a CLOB into XML so that I can apply XSLTs to it?
The DB adapter only maps each column to a single XML tag...
If I try to use a table with XMLType, the creation of the database adapter fails with:
"Some tables contain columns that are not recognized by the database adapter, so they will be captured as strings:
sys_nc_rowinfo$ (sys.xmltype)
Change the type of the above columns to that of the closest supported type..."

Hi,
To get this working you have to create an AQ table with payload type XMLType. The XMLType payload type is available on RDBMS 10g and up; if you are on 9i or lower, use the RAW type instead. Create a queue inside the AQ table. I'm not quite sure what the best way is for you to fill the queue with data, but I would imagine you could create a database trigger on the table you're talking about and enqueue the XML data (the payload) to the AQ table, thus using the AQ table as a staging area for the ESB system.
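To make the staging idea concrete, here is a minimal sketch (all object names are placeholders, and it assumes a relational source table SOURCE_XML_TAB with an XMLType column XML_DOC rather than an XMLType table):

BEGIN
  DBMS_AQADM.CREATE_QUEUE_TABLE(
    queue_table        => 'XML_STAGING_QT',
    queue_payload_type => 'SYS.XMLTYPE',
    multiple_consumers => FALSE);
  DBMS_AQADM.CREATE_QUEUE(
    queue_name  => 'XML_STAGING_Q',
    queue_table => 'XML_STAGING_QT');
  DBMS_AQADM.START_QUEUE(queue_name => 'XML_STAGING_Q');
END;
/

-- Enqueue every newly inserted XML document into the staging queue
CREATE OR REPLACE TRIGGER SOURCE_XML_AI_TRG
AFTER INSERT ON SOURCE_XML_TAB
FOR EACH ROW
DECLARE
  l_enq_opts DBMS_AQ.ENQUEUE_OPTIONS_T;
  l_props    DBMS_AQ.MESSAGE_PROPERTIES_T;
  l_msgid    RAW(16);
BEGIN
  DBMS_AQ.ENQUEUE(
    queue_name         => 'XML_STAGING_Q',
    enqueue_options    => l_enq_opts,
    message_properties => l_props,
    payload            => :NEW.XML_DOC,
    msgid              => l_msgid);
END;
/

The AQ Adapter mentioned below would then simply dequeue from XML_STAGING_Q.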
Next, create an ESB system using an AQ Adapter to dequeue from the queue (just as you did with the DB Adapter). The adapter can handle both single-consumer and multi-consumer queues and all payload types (RAW, XMLType and object types).
Messages are generally dequeued within a second after you commit the transaction.
Kind regards,
H

Similar Messages

  • Problem with Send using JMS Adapter with Websphere MQ

    Hi,
    We have two scenarios using JMS Adapter with MQ:
    1. File->PI->MQ
    This works fine and drops the file into MQ correctly.
    2. MQ->PI->File
    This gives an error when reading from the queue (see the highlighted entry in the log below). It connects to the queue correctly, but then fails with the message "Could not begin a AF transaction".
    Our PI version is 7.0 SP 13
    Websphere MQ Version 6.0.
    Any idea what could be wrong?
    Cluster node details for channel CC_JMS_MQ_Sender (short log, last 4 hours, Server 0 15_92786):
    Successfully connected to destination 'queue:///MMPP.PLM.FGH41?CCSID=37&targetClient=1'
    Processing details for cluster node Server 0 15_92786 (Time Stamp | Message ID | Explanation):
    9/2/08 2:27:28 PM | 2e6206f0-7925-11dd-bc02-0003bae50b4d | Error while processing message '2e6206f0-7925-11dd-bc02-0003bae50b4d'; detailed error description: com.sap.aii.adapter.jms.api.channel.filter.MessageFilterException: Could not begin a AF transaction: TxManagerException: Unable to open transaction: com.sap.engine.services.ts.exceptions.BaseSystemException at com.sap.aii.adapter.jms.core.channel.filter.TxManagerFilter.filterSend(TxManagerFilter.java:103) ...
    9/2/08 2:27:28 PM | 2e6206f0-7925-11dd-bc02-0003bae50b4d | XI message ID corresponding to JMS message with ID 'ID:414d512071736431202020202020202047d9462024028b02' will be created as a new GUID with value '2e6206f0-7925-11dd-bc02-0003bae50b4d'
    Amith Dharmasiri

    Checked the drivers installed in:
    /usr/sap/<SID>/DVEBMGS<SYSNO>/j2ee/cluster/server0/bin/ext/com.sap.aii.af.jmsproviderlib
    They are available and properly added to aii_af_jmsproviderlib.sda.
    The drivers are:
    CL3Export.jar
    CL3Nonexport.jar
    com.ibm.mq.jar
    com.ibm.mqjms.jar
    connector.jar
    dhbcore.jar
    rmm.jar
    These were installed per OSS note 747601.
    Any other suggestions as to why retrieving from MQ doesn't work?

  • How to use File Adapter with hierarchical structure?

    Hi,
    How do I use the File Adapter with a hierarchical structure like the following?
    Data:
    -- Header details
    -- Line item details
    I am getting a flat file laid out hierarchically as shown below.
    Header Details :1
    Line Item a
    Line Item b
    Header Details :2
    Line Item c
    Line Item d
    Kishore

    Hey Kishore,
    In order to create a structure you need to use the sender communication channel of the file adapter with content conversion mode enabled.
    Check the link below for the required configuration parameters.
    If the structure is more complex you can use the Content Master (CM) from Itemfield, which allows you to create XML files from complex flat files and more.
    <a href="http://help.sap.com/saphelp_erp2005/helpdata/en/0d/5ab43b274a960de10000000a114084/frameset.htm">File sender adapter</a>
    If you have any questions I'll be more than happy to assist.
    Nimrod Gisis
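    For reference, the content conversion parameters for a header/line-item file like the one above typically look roughly like this (record and field names below are placeholders; the exact values depend on your separators and key fields):
    Recordset Structure:    Header,1,Item,*
    Key Field Name:         recordType
    Header.fieldSeparator:  ,
    Header.fieldNames:      recordType,headerNumber
    Header.keyFieldValue:   Header Details
    Item.fieldSeparator:    ,
    Item.fieldNames:        recordType,itemText
    Item.keyFieldValue:     Line Item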

  • Printer is overprinting old jobs on top of new jobs. Using a Mac with OS X and an HP Photosmart for over a year. This just started happening.

    Printer is overprinting old jobs on top of new jobs. Using a Mac with OS X and an HP Photosmart for over a year. This just started happening.

    Reset the printing system again and then restart in Safe Mode. This will clear some caches. To do this, hold down the Shift key when you hear the startup tone until a progress bar appears; after it has fully booted, restart normally and add the printer.

  • Using iPhone 4s with iOS 5 and my Music icon has disappeared. How do I get it back? Please.

    Using iPhone 4s with iOS 5 and my Music icon has disappeared. Please help me to get it back.

    It should be in your applications folder.  Locate it; click and hold onto it then drag it back to where it was.

  • C++: Is it possible to use a callback function with ncacn_http and an RPC proxy server?

    I have a remote procedure and I can call it using RPC over HTTP. I go through an RPC proxy server to reach my RPC server.
    But I cannot invoke a callback function on my client from inside the server function.
    Is it possible to use a callback function with ncacn_http and an RPC proxy server?
    We are using IIS on Windows Server 2008 R2, with the RPC server and the client on the same PC as the RPC proxy.
    If I use ncacn_ip_tcp everything works fine.
    Thanks
    Gianluca

    Hi,
    For development questions, please post to the MSDN Developer Network forum:
    http://social.msdn.microsoft.com/Forums/en-US/home?forum=WAVirtualMachinesVirtualNetwork&filter=alltypes&sort=lastpostdesc
    Thanks for your understanding and support.
    We are trying to better understand customer views on the social support experience, so your participation in this interview project would be greatly appreciated if you have time.
    Thanks for helping make community forums a great place.

  • I am from India, using my iPhone 4 with Vodafone and firmware version 5.0.1 (9A405). I am unable to access 3G from my device as the CELLULAR DATA NETWORK option in the settings is missing. Any methods or options for fixing this?

    I am from India, using my iPhone 4 with Vodafone and firmware version 5.0.1 (9A405). I am unable to access 3G from my device as the CELLULAR DATA NETWORK option in the settings is missing. Any methods or options for fixing this?


  • Frequent disconnects using PEAP WPA2 with AES and TKIP

    I am getting frequent disconnects for wireless users using PEAP WPA2 with AES and TKIP.
    My network is set up with:
    - Wireless LAN Controller 4404
    - ACS 4.0
    - 28 access points (1131G)
    - PEAP authentication against Active Directory on Windows 2003
    - Windows XP clients - MSCHAPv2 with AES/TKIP
    When I enable only AES on the wireless controller 4404, the network users are able to work in a stable condition.

    This might be similar to the bug where wireless phones don't associate if WPA2 is configured with both AES and TKIP. In that case, try upgrading the controller.

  • PC used to work with XP and AirPort Extreme Base Station but not with Vista

    Please help: my PC used to work with XP and the AirPort Extreme Base Station, but not now with Vista. I have a MacBook Pro and also a Mac Pro connected to the network and they still work. I have put the latest software on the PC and the firmware is up to date. The only thing I can see on the network from the PC is the base station hard disk (shared also to the Macs), but I cannot connect with the password. I have tried everything in my knowledge, including formatting the machine, reinstalling the OS and changing security settings, to no avail. Any ideas, anyone?

    With regard to your printer problem - take a look at this discussion:
    http://discussions.apple.com/thread.jspa?messageID=6312413&tstart=0

  • Encoding problem with convert and CLOB involving UTF8 and EBCDIC

    Hi,
    I have a task that requires me to call a procedure with a CLOB argument containing a string encoded in EBCDIC. This did not go well so I started narrowing down the problem. Here is some SQL to illustrate it:
    SQL> select * from v$version;
    BANNER
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
    PL/SQL Release 10.2.0.4.0 - Production
    CORE 10.2.0.4.0 Production
    TNS for Solaris: Version 10.2.0.4.0 - Production
    NLSRTL Version 10.2.0.4.0 - Production
    SQL> select value from v$nls_parameters where parameter = 'NLS_CHARACTERSET';
    VALUE
    AL32UTF8
    SQL> select convert(convert('abc', 'WE8EBCDIC500'), 'AL32UTF8', 'WE8EBCDIC500')
    output from dual;
    OUT
    abc
    SQL> select convert(to_char(to_clob(convert('abc', 'WE8EBCDIC500'))), 'AL32UTF8', 'WE8EBCDIC500') output from dual;
    OUTPUT
    ╒╫¿╒╫¿╒╫¿
    So converting to and from EBCDIC works fine when using varchar2, but (if I am reading this right) fails when involving CLOB conversion.
    My question then is: can anyone demonstrate how to put correct EBCDIC into a CLOB, and maybe even explain why the examples behave the way they do?

    In order to work successfully with XML DB it is recommended that you use 9.2.0.4 or above; it seems you have a lower version.
    Now, related to the problem: if the data you want to pass to the attributes is not longer than 32767 characters, you can use the PL/SQL VARCHAR2 data type to hold the data rather than a CLOB, and so avoid this problem.
    Here is a sample; use a function with the PL/SQL below to return the desired output.
    SQL> declare
      2   l_clob     CLOB := 'Hello';
      3   l_output   CLOB;
      4  begin
      5    select  xmlelement("test", xmlattributes(l_clob AS "a")).getclobval()
      6      into l_output from dual;
      7  end;
      8  /
      select  xmlelement("test", xmlattributes(l_clob AS "a")).getclobval()
    ERROR at line 5:
    ORA-06550: line 5, column 44:
    PL/SQL: ORA-00932: inconsistent datatypes: expected - got CLOB
    ORA-06550: line 5, column 3:
    PL/SQL: SQL Statement ignored
    SQL> declare
      2   l_vchar     varchar2(32767) := 'Hello';
      3   l_output   CLOB;
      4  begin
      5    select  xmlelement("test", xmlattributes(l_vchar AS "a")).getclobval()
      6      into l_output from dual;
      7    dbms_output.put_line(l_output);
      8  end;
      9  /
    <test a="Hello"></test>
    PL/SQL procedure successfully completed.
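    Coming back to the original EBCDIC question: a CLOB always holds text in the database character set (AL32UTF8 here), so EBCDIC-encoded bytes cannot meaningfully live in a CLOB; a BLOB is the natural container. A minimal, untested sketch using DBMS_LOB.CONVERTTOBLOB (assuming WE8EBCDIC500 as the target character set):
    DECLARE
      l_src      CLOB := 'abc';   -- text in the database character set
      l_ebcdic   BLOB;            -- will hold the EBCDIC-encoded bytes
      l_dest_off INTEGER := 1;
      l_src_off  INTEGER := 1;
      l_lang_ctx INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
      l_warning  INTEGER;
    BEGIN
      DBMS_LOB.CREATETEMPORARY(l_ebcdic, TRUE);
      DBMS_LOB.CONVERTTOBLOB(
        dest_lob     => l_ebcdic,
        src_clob     => l_src,
        amount       => DBMS_LOB.LOBMAXSIZE,
        dest_offset  => l_dest_off,
        src_offset   => l_src_off,
        blob_csid    => NLS_CHARSET_ID('WE8EBCDIC500'),
        lang_context => l_lang_ctx,
        warning      => l_warning);
      -- at this point l_ebcdic contains 'abc' encoded in WE8EBCDIC500
      DBMS_LOB.FREETEMPORARY(l_ebcdic);
    END;
    /
    Going the other way, DBMS_LOB.CONVERTTOCLOB can turn EBCDIC bytes held in a BLOB back into a proper CLOB in the database character set.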

  • How to copy a table with LONG and CLOB datatype over a dblink?

    Hi All,
    I need to copy a table from an external database into a local one. Note that this table has both LONG and CLOB datatypes included.
    I have taken 2 approaches to do this:
    1. Use the CREATE TABLE AS....
    SQL> create table XXXX_TEST as select * from XXXX_INDV_DOCS@ext_db;
    create table XXXX_TEST as select * from XXXX_INDV_DOCS@ext_db
    ERROR at line 1:
    ORA-00997: illegal use of LONG datatype
    2. After reading some threads I tried to use the COPY command:
    SQL> COPY FROM xxxx/pass@ext_db TO xxxx/pass@target_db REPLACE XXXX_INDV_DOCS USING SELECT * FROM XXXX_INDV_DOCS;
    Array fetch/bind size is 15. (arraysize is 15)
    Will commit when done. (copycommit is 0)
    Maximum long size is 80. (long is 80)
    CPY-0012: Datatype cannot be copied
    If my understanding is correct the 1st statement fails because there is a LONG datatype in XXXX_INDV_DOCS table and 2nd one fails because there is a CLOB datatype.
    Is there a way to copy the entire table (all columns including both LONG and CLOB) over a dblink?
    Would greatly appreciate any workaround or ideas!
    Regards,
    Pawel.

    Hi Nicolas,
    There is a reason I am not using export/import:
    - I would like to have a one-script solution for this problem (meaning execute one script on one machine)
    - I am not able to make an SSH connection from the target DB to the local one (although the other way around it works fine), which means I cannot copy the dump file from the target server to the local one.
    - with export/import I need to have an SSH connection on the target DB in order to issue the exp command...
    Therefore, I am looking for a solution (or a workaround) which will work over a DBLINK.
    Regards,
    Pawel.
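    One possible workaround to sketch (untested; table and column names are placeholders): a LONG column cannot be selected across a database link, but it can be converted to a CLOB with TO_LOB on the source side, and the resulting staging table can then be pulled over the dblink (CTAS / INSERT ... SELECT of LOB columns across a dblink is supported in reasonably recent releases). This still needs one statement run on the source database, so it is only a partial answer to the one-script requirement:
    -- Step 1: on the source database, stage the LONG column as a CLOB
    CREATE TABLE XXXX_INDV_DOCS_STG AS
      SELECT doc_id,
             TO_LOB(long_col) AS long_col_clob,
             clob_col
        FROM XXXX_INDV_DOCS;

    -- Step 2: on the target database, copy the staging table over the dblink
    CREATE TABLE XXXX_TEST AS
      SELECT * FROM XXXX_INDV_DOCS_STG@ext_db;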

  • $200 reward to solve problem with JDBC and CLOB.getCharacterOutputStream

    I'm trying to update a CLOB with getCharacterOutputStream as suggested in the example code. It works with a US7ASCII DB instance but not with instances in UTF8.
    I've been browsing through all the Oracle doc's and found some rather confusing statements:
    In the page at http://oradoc.photo.net/ora816/java.816/a81354/oralob2.htm#1043220
    it says: [When writing to or reading from a CLOB, the JDBC drivers perform all character set conversions for you.]
    also: [The oracle.sql.CLOB class supports all the character sets that the Oracle data server supports for CLOB types.]
    So far so good.
    In the page at http://oradoc.photo.net/ora816/java.816/a81354/oraint3.htm#1012518
    it says [The oracle.sql package supports these datatypes in several ways: CLOBs point to large fixed-width character data items (that is, characters that require a fixed number of bytes per character) and are supported by the oracle.sql.CLOB class.]
    Ooh no! Is this for real? UTF8 is variable width and does this mean it is not supported?
    Any way to get around this?
    In the page at http://oradoc.photo.net/ora816/java.816/a81358/03_pub2.htm#36009
    says [6.The mappings to oracle.sql classes are optimal because they preserve data formats and require no character-set conversions (apart from the usual network conversions). Those classes are especially useful in applications that "shovel" data between SQL and Java.]
    "No character set conversion"? Very confusing!
    I've been hammering on this CLOB/JDBC/UTF8 problem for more than a week now and I really appreciate some solutions, workarounds, or whatever help I can get. I'm running java stored procedure in 8.1.6 on Linux RH6.2.
    For your trouble, I'd pay $200 for the first guy who come up with a verifiable solution.

    These are just some findings based upon your comments:
    Please refer to document Oracle8i National Language Support Guide
    Release 2 (8.1.6) from Oracle Documentation Library, Release 8.1.7
    Chapter 6 Java,
    There it is clearly mentioned that:
    "Oracle JDBC drivers provide globalization support by allowing users to retrieve data from or insert data into a database in any character set that Oracle supports. Because Java strings are UCS2 encoded (16-bit Unicode) for JDBC programs, the target character set on the client is always UCS2. Character set conversion is required to convert data from the database character set (Db Charset) to UCS2. This applies to CHAR, LONG, CLOB, and VARCHAR2 data types; RAW data is not converted. "
    Also, please refer to this:
    "oracle.sql.CLOB's method getCharacterStream() returns the contents of a CLOB as a Unicode stream."
    "The techniques that Oracle's drivers use to perform character set conversion for Java applications depend on the character set the database uses. The simplest case is where the database uses a US7ASCII or WE8ISO8859P1 character set. In this case, the driver converts the data directly from the database character set to UCS2,which is used in Java applications. "
    "If you are working with databases that employ a non-US7ASCII or non-WE8ISO8859P1 character set (for example, Japanese or Korean), then the driver converts the data, first to UTF8, then to UCS2. "
    In my case the character set of the database is WE8ISO8859P1 and for security reasons I can't change it, but my feeling is that if you are updating the CLOB from the Java client you are forming a reference to a CLOB on the client, which is UCS2 on the Java side. When you then populate the CLOB through a java.io.Writer and call the procedure, passing it the CLOB reference, I believe JDBC will convert the UCS2 CLOB data to UTF8 in the database.
    You can try out the code snippet:
    package ServletGDC;
    import java.io.*;
    import java.util.*;
    import javax.servlet.*;
    import javax.servlet.http.*;
    import java.sql.*;
    import oracle.sql.*;
    import oracle.jdbc.driver.*;
    import ClassesGDC.*;   // MultipartParser, Part, FilePart and the skip exception are assumed to come from here

    public class testUpload extends HttpServlet {

        private String m_strMessage = "";   // message uploaded along with the document
        private Connection conn = null;

        public void doPost(HttpServletRequest req, HttpServletResponse res)
                throws ServletException, IOException {
            String strContent = "";
            res.setContentType("text/html");
            PrintWriter out = res.getWriter();
            try {
                CallableStatement cmt = null;
                ByteArrayOutputStream byteoutput = null;
                oracle.sql.CLOB tempClob = null;
                String strPassedFileName = "";     // file name passed in the request
                long lngSizeOfFileUploaded = 0;    // size of the file that was uploaded
                String strQuery = "";

                HttpSession session = req.getSession(true);
                if (req.getContentLength() > 20 * 1024 * 1024) {
                    throw new skip("The size of the posted content is more than 20 MB.");
                }

                byteoutput = new ByteArrayOutputStream();
                MultipartParser mp = new MultipartParser(req, 20 * 1024 * 1024); // upload size limit
                Part part;   // each part is either a request parameter or a file
                while ((part = mp.readNextPart()) != null) {
                    String strParamName = part.getName();   // name of the current part
                    if (part.isParam()) {
                        // parameter parts are not used here
                    } else if (part.isFile()) {
                        FilePart filePart = (FilePart) part;
                        strPassedFileName = filePart.getFileName();
                        strContent = filePart.getContentType();
                        if (strPassedFileName != null && !strPassedFileName.trim().equals("")) {
                            // write the uploaded file into the in-memory byte stream
                            lngSizeOfFileUploaded = filePart.writeTo(byteoutput);
                        } else {
                            throw new skip("The file name is null or empty; files in such format are not accepted.");
                        }
                    }
                }

                if (lngSizeOfFileUploaded == 0) {
                    // nothing was written, so the file was missing or corrupted
                    throw new skip("The file could not be uploaded; it may be null or corrupted.");
                }

                String strbyte = byteoutput.toString();
                byteoutput.flush();

                Class.forName("oracle.jdbc.driver.OracleDriver");
                conn = DriverManager.getConnection("jdbc:oracle:thin:@pc-p32670:1521:GDCDBI", "gdc_user", "myuser");

                // create a session-duration temporary CLOB and fill it from the uploaded data
                tempClob = oracle.sql.CLOB.createTemporary(conn, true, oracle.sql.CLOB.DURATION_SESSION);
                tempClob.open(oracle.sql.CLOB.MODE_READWRITE);
                java.io.Writer tempClobWriter = tempClob.getCharacterOutputStream();
                tempClobWriter.write(strbyte);
                tempClobWriter.flush();
                tempClobWriter.close();
                tempClob.close();

                // pass the temporary CLOB to the stored procedure as an IN/OUT parameter
                strQuery = "{call INSERT_CLOB(?,?)}";
                cmt = conn.prepareCall(strQuery);
                cmt.setString(1, strPassedFileName);
                cmt.setClob(2, tempClob);
                cmt.registerOutParameter(2, java.sql.Types.CLOB);
                cmt.execute();
                cmt.close();

                tempClob.freeTemporary();
            } catch (Exception e) {
                out.println("<font color=blue> Error is :" + e.getMessage() + "</font>");
            } finally {
                try {
                    if (conn != null) {
                        conn.close();
                    }
                } catch (Exception e) {
                    out.println("<font color=blue> Error is :" + e.getMessage() + "</font>");
                }
            }
        } // end of doPost
    } // end of class
    In the procedure you will be inserting/updating the CLOB in a table, using the reference CLOB in the OUT parameter of the procedure.
    Thanks.

  • Using OleDbDataAdapter Update with InsertCommands and getting blocking locks on Oracle table

    The following code snippet shows the use of OleDbDataAdapter with InsertCommands. This code is producing many inserts on the Oracle table and is now suffering from contention, all on the same table. How does the OleDbDataAdapter produce inserts from a DataSet? What characteristics do these inserts inherit in terms of batch behavior, or do they naturally contend for the same resource?
    oc.Open();
    for (int i = 0; i < xImageId.Count; i++)
    // Create the oracle adapter using a SQL which will not return any actual rows just the structure
    OleDbDataAdapter da =
       new OleDbDataAdapter("SELECT BUSINESS_UNIT, INVOICE, ASSIGNMENT_ID, END_DT, RI_TIMECARD_ID, IMAGE_ID, FILENAME, BARCODE_LABEL_ID, " +
       "DIRECT_INVOICING, EXCLUDE_FLG, DTTM_CREATED, DTTM_MODIFIED, IMAGE_DATA, PROCESS_INSTANCE FROM sysadm.PS_RI_INV_PDF_MERG WHERE 1 = 2", oc);
    // Create a data set
    DataSet ds = new DataSet("documents");
    da.Fill(ds, "documents");
    // Loop through invoices and write to oracle
    string[] sInvoices = invoiceNumber.Split(',');
    foreach (string sInvoice in sInvoices)
        // Create a data set row
        DataRow dr = ds.Tables["documents"].NewRow();
        ... map the data
        // Populate the dataset
        ds.Tables["documents"].Rows.Add(dr);
    // Create the insert command
    string insertCommandText =
        "INSERT /*+ append */ INTO PS_table " +
        "(SEQ_NBR, BUSINESS_UNIT, INVOICE, ASSIGNMENT_ID, END_DT, RI_TIMECARD_ID, IMAGE_ID, FILENAME, BARCODE_LABEL_ID, DIRECT_INVOICING, " +
        "EXCLUDE_FLG, DTTM_CREATED, DTTM_MODIFIED, IMAGE_DATA, PROCESS_INSTANCE) " +
        "VALUES (INV.nextval, :BUSINESS_UNIT, :INVOICE, :ASSIGNMENT_ID, :END_DT, :RI_TIMECARD_ID, :IMAGE_ID, :FILENAME,  " +
        ":BARCODE_LABEL_ID, :DIRECT_INVOICING, :EXCLUDE_FLG, :DTTM_CREATED, :DTTM_MODIFIED, :IMAGE_DATA, :PROCESS_INSTANCE)";
    // Add the insert command to the data adapter
    da.InsertCommand = new OleDbCommand(insertCommandText);
    da.InsertCommand.Connection = oc;
    // Add the params to the insert
    da.InsertCommand.Parameters.Add(":BUSINESS_UNIT", OleDbType.VarChar, 5, "BUSINESS_UNIT");
    da.InsertCommand.Parameters.Add(":INVOICE", OleDbType.VarChar, 22, "INVOICE");
    da.InsertCommand.Parameters.Add(":ASSIGNMENT_ID", OleDbType.VarChar, 15, "ASSIGNMENT_ID");
    da.InsertCommand.Parameters.Add(":END_DT", OleDbType.Date, 0, "END_DT");
    da.InsertCommand.Parameters.Add(":RI_TIMECARD_ID", OleDbType.VarChar, 10, "RI_TIMECARD_ID");
    da.InsertCommand.Parameters.Add(":IMAGE_ID", OleDbType.VarChar, 8, "IMAGE_ID");
    da.InsertCommand.Parameters.Add(":FILENAME", OleDbType.VarChar, 80, "FILENAME");
    da.InsertCommand.Parameters.Add(":BARCODE_LABEL_ID", OleDbType.VarChar, 18, "BARCODE_LABEL_ID");
    da.InsertCommand.Parameters.Add(":DIRECT_INVOICING", OleDbType.VarChar, 1, "DIRECT_INVOICING");
    da.InsertCommand.Parameters.Add(":EXCLUDE_FLG", OleDbType.VarChar, 1, "EXCLUDE_FLG");
    da.InsertCommand.Parameters.Add(":DTTM_CREATED", OleDbType.Date, 0, "DTTM_CREATED");
    da.InsertCommand.Parameters.Add(":DTTM_MODIFIED", OleDbType.Date, 0, "DTTM_MODIFIED");
    da.InsertCommand.Parameters.Add(":IMAGE_DATA", OleDbType.Binary, System.Convert.ToInt32(filedata.Length), "IMAGE_DATA");
    da.InsertCommand.Parameters.Add(":PROCESS_INSTANCE", OleDbType.VarChar, 10, "PROCESS_INSTANCE");
    // Update the table
    da.Update(ds, "documents");

    Here is what Oracle is showing as blocking locks, and the SQL that has been identified with each of the SIDs. Not sure why there is contention; there are no triggers or joined tables in this piece of code.
    Here is the SQL all of the SIDs below are running:
    INSERT INTO sysadm.PS_RI_INV_PDF_MERG (SEQ_NBR, BUSINESS_UNIT, INVOICE, ASSIGNMENT_ID, END_DT, RI_TIMECARD_ID, IMAGE_ID, FILENAME, BARCODE_LABEL_ID, DIRECT_INVOICING, EXCLUDE_FLG, DTTM_CREATED, DTTM_MODIFIED, IMAGE_DATA, PROCESS_INSTANCE) VALUES (SYSADM.INV_PDF_MERG.nextval,
    :BUSINESS_UNIT, :INVOICE, :ASSIGNMENT_ID, :END_DT, :RI_TIMECARD_ID, :IMAGE_ID, :FILENAME, :BARCODE_LABEL_ID, :DIRECT_INVOICING, :EXCLUDE_FLG, :DTTM_CREATED, :DTTM_MODIFIED, :IMAGE_DATA, :PROCESS_INSTANCE)
    SID 1452 (BTSUSER,BIZTPRDI,BTSNTSvc64.exe) in instance FSLX1 is blocking SID 1150 (BTSUSER,BIZTPRDI,BTSNTSvc64.exe) in instance FSLX1
    SID 1452 (BTSUSER,BIZTPRDI,BTSNTSvc64.exe) in instance FSLX1 is blocking SID 1452 (BTSUSER,BIZTPRDI,BTSNTSvc64.exe) in instance FSLX1
    SID 1452 (BTSUSER,BIZTPRDI,BTSNTSvc64.exe) in instance FSLX1 is blocking SID 1156 (BTSUSER,biztprdi,BTSNTSvc64.exe) in instance FSLX3
    SID 1452 (BTSUSER,BIZTPRDI,BTSNTSvc64.exe) in instance FSLX1 is blocking SID 6 (BTSUSER,BIZTPRDI,BTSNTSvc64.exe) in instance FSLX2
    SID 1452 (BTSUSER,BIZTPRDI,BTSNTSvc64.exe) in instance FSLX1 is blocking SID 1726 (BTSUSER,BIZTPRDI,BTSNTSvc64.exe) in instance FSLX2
    SID 1452 (BTSUSER,BIZTPRDI,BTSNTSvc64.exe) in instance FSLX1 is blocking SID 2016 (BTSUSER,biztprdi,BTSNTSvc64.exe) in instance FSLX2
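    For reference, a blocking report like the one above can usually be reproduced with a query along these lines (a minimal sketch assuming access to GV$SESSION on a RAC database):
    SELECT inst_id, sid, serial#, username, program,
           blocking_instance, blocking_session, sql_id, event
      FROM gv$session
     WHERE blocking_session IS NOT NULL;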

  • Using a converter/adapter with the MacBook extension cord

    I'm taking my MacBook to the U.K. and want to know if I can just use an adapter with the MagSafe extension cord. I already have an adapter for the MagSafe cord but may need to use the extension cord as well.

    Yes you can just use a plug adapter. It will work fine either way.

  • RE: Rollback while using DB Adapter with TopLink

    Hi All,
    I am using the DB Adapter's TopLink option to insert data into an MS SQL table.
    Suppose I am inserting 10 records and the insertion fails after the 6th record.
    In that scenario, will the previously inserted records be rolled back or not?
    Is there anything we need to do in BPEL, or will this be taken care of by TopLink?
    My version of BPEL: 10.1.3.4.0
    Thanks in advance.
    Regards

    Hi Pavan,
    Thanks for your response. My process is async. I did one PoC and tested it: if the data of one record is wrong, it throws an error and the data is not inserted into the database.
    I tested with 4 records; if the data of any record is wrong, then nothing is inserted.
    Is there any other good way to handle this.
    Regards
