JDBC-Adapter - Are these Properties Correct?

Hi,
I am trying to execute a stored procedure on a Microsoft MSSQL Server via the JDBC Adapter.
The adapter is installed as described in the driver manual, and I configured the adapter according to the driver's user manual, but it doesn't work.
Are these properties correct?!
http://www.devilstorm.com/downloads/jdbc.jpg
Thank you for your help
Thomas

1. Did you check whether your communication channel is active and in green status in the adapter monitor?
2. If it is in error status, what is the error? If it is in green status, does it show the date and time it last polled?
3. It should not be "call ZZ_COPA_ZCHECK"; it should be "EXEC ZZ_COPA_CZECH".
One other thing: you do not have a namespace. Click on Advanced Mode (Erweiterter Modus) and add
Name = docNamespace
Value = <your namespace>
regards
SKM
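For what it's worth, here is a minimal sketch of what the statement configured in the channel might look like when calling a stored procedure on SQL Server (the procedure name is taken from the reply above; the parameter in the last line is purely hypothetical):
    EXEC ZZ_COPA_CZECH
    -- or, if the procedure expects an input parameter (hypothetical name):
    EXEC ZZ_COPA_CZECH @p_date = '20080101'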

Similar Messages

  • HT1751 Are these instructions correct?  They don't mention the ITL file...

    Are these instructions correct?  They don't mention the ITL file...

    I'd say they are generally correct, however they don't account for the circumstances in which, following advice given elsewhere in Apple's support documents, the media folder is no longer contained inside the main iTunes folder. This can happen when the media has been consolidated elsewhere, e.g. an external drive, due to space constraints. This creates what I call a "split library". In such cases the media folder and iTunes folder (holding the .itl database) each need to be backed up and, when restoring, the media folder must be restored to exactly the same path or iTunes will have trouble connecting to the media.
    Should you have created a split library that you'd like to reconnect see make a split library portable.
    There is some odd numbering on the support doc. page so I can't give a section number, but the bit on where to find your iTunes folder doesn't make it clear that the "iTunes" (or "library") folder is normally the parent folder of the "iTunes Media" folder shown in the dialog box, and that they are not the same thing. I suspect this is one of the reasons that some users manage to migrate or restore their media only to find they have lost playlists, ratings, play counts etc.
    Another issue with the article is that it provides a way to back up the library once, but not to efficiently maintain that backup. Deleting the entire backup to regenerate it would be a waste of time and potentially risky. See this iTunes backup tip for a suggested strategy.
    tt2

  • HT2463 Are these ports correct?

    Are these ports correct? NTP more usually uses UDP as does DNS. Also, we found that we needed to open TCP5223 as well.

    Don't worry, the numbers are correct.
    In Intel's calculation, the FSB is calculated as quad rate, so you'll have to multiply the reported FSB by 4, which is 200 MHz x 4 = 800 MHz.

  • Creating RMAN catalog. Are these steps correct?

    Are these steps to set up the RMAN catalog correct?
    I. The following two databases are the target databases that need to be backed up:
         1. an 11.1.0.7.0 database
         2. a 10.2.0.1.0 database
    II. Catalog schema in a 10.2.0.4.0 database called catdb.
    III. I am going to use the RMAN client (10.2.0.1.0) on my laptop (Windows Server 2003).
    CATALOG schema creation
    =======================
    To create the Catalog Database (10.2.0.4.0), this is what I am going to do in catdb:
         a. create tablespace CAT_TBS
         b.     CREATE USER rman_cat_schema
              IDENTIFIED BY rman_cat_schema
              DEFAULT TABLESPACE cat_tbs
              TEMPORARY TABLESPACE temp
              QUOTA UNLIMITED ON cat_tbs;
         c.      GRANT create session TO rman_cat_schema;
         d.      GRANT recovery_catalog_owner TO rman_cat_schema;
         e.      GRANT execute ON dbms_stats TO rman_cat_schema;
         f. Invoke the RMAN binary from my laptop:
                  C:\>set ORACLE_SID=catdb
               C:\>rman catalog rman_cat_schema/rman_cat_schema@catdb log=catalog.log
                Hopefully the catalog will get created in catdb.
    Registering the database
    =========================
    To register the database, this is what I am going to do:
         a. Add the TNS entry for catdb to the tnsnames.ora file of the target database
         b. Log in to the target database's machine and invoke the RMAN executable on the target database.
            $  export ORACLE_SID=oraprod314
            $  rman TARGET / CATALOG rman_cat_schema/rman_cat_schema@catdb
         c. Then I can give the REGISTER DATABASE command, like:
            RMAN> register database;
            Edited by: Citizen_2 on 14-Sep-2009 11:46

    Yes, you can connect to the RMAN catalog using a lower-version client. See this:
    SQL> conn rman/rman@mycatdb
    Connected.
    SQL> select * from rcver;
    VERSION
    11.01.00.06
    SQL>
    rman
    Recovery Manager: Release 9.2.0.7.0 - 64bit Production
    Copyright (c) 1995, 2002, Oracle Corporation. All rights reserved.
    RMAN> connect target /
    connected to target database: STSTDB (DBID=2045697105)
    RMAN> connect catalog rman/rman@abcd.
    connected to recovery catalog database
    RMAN>
    You can take backups and do recoveries. But if you want to use certain recovery catalog features of higher versions to back up or recover your database, you should use an RMAN client that supports those features.
    It is always advisable to use an RMAN client of the same version as the database you want to back up, to avoid any confusion.
    Hope it is helpful.
    Regards
    Edited by: vasu77 on Sep 15, 2009 10:01 AM
    Edited by: vasu77 on Sep 15, 2009 2:11 PM
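    One step the list above leaves implicit: connecting to catdb as the catalog owner does not by itself create the catalog; it still has to be created explicitly before any target database can be registered. A minimal sketch, reusing the tablespace name from the post:
         RMAN> CREATE CATALOG TABLESPACE cat_tbs;
    After that, the REGISTER DATABASE step can proceed as described.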

  • Are these steps correct for ALE configuration?

    1. For an ALE configuration we have to do the following steps:
    a) Define logical system (SALE -> Sending and Receiving Systems -> Logical Systems -> Define Logical System)
    b) Assign client to the logical system (SALE -> Sending and Receiving Systems -> Logical Systems -> Assign Client to Logical System)
    c) We have to define a port (WE21)
    d) Generate partner profiles (WE20)
    e) Define target system for RFC calls (SM59)
    f) Create the distribution model (BD64)
    g) WE42 for creating a process code
    h) Send data (BD10)
    i) Get data (BD11)
    j) BD51
    k) WE57
    We have to repeat the first 6 steps for each of the sender and receiver, and those are compulsory. I am not yet clear about the last 5 steps.
    E.g. our sender is client 200 and the receiver is client 400,
    so for each of them we have to do these steps twice.

    Follow these links:
    http://searchsap.techtarget.com/general/0,295582,sid21_gci1130337,00.html
    /people/kevin.wilson2/blog/2006/11/13/ale-scenario-development-guide
    which will give you the steps for ALE configuration.
    The steps are:
    1. ALE->Sending and Receiving Systems->Logical Systems->Define Logical System. a. Here you add the name of the logical system for the distributed system. To make them easy to find I like to name them based on instance/client, so in the case of the system listed above I would call it DM7200. This naming convention makes it easy to follow in various systems.
    2. ALE->Sending and Receiving Systems->Logical Systems->Assign Client to Logical System.
    a. In this step you assign your logical system to a particular client. WARNING: DO NOT CHANGE THIS IF IT IS ALREADY CONFIGURED!!! That holds especially true on an FI system as it cannot be changed once it is set. Make sure to discuss this with the Basis team before entering the data.
    3. ALE->Sending and Receiving Systems->Systems in Network->Synchronous Processing->Determine Target System for RFC Calls
    a. This area allows you to tell SAP where to make RFC calls based on the logical system. That means that when I want to look up Vendor information in cases like infotype 194 and 195 where a function module is called that I can tell the system where to go to find that information. To configure this you should make sure that the system where the financial information will be kept is listed.
    4. ALE->Modelling and Implementing Business Processes->Maintain Distribution Model and Distribute Views
    a. In this step we will define where the FI data comes from and what data we expect to receive. We will also define the system that will be receiving the payroll posting and what data we will be sending that system.
    b. SAP Recommends the following list of messages to be received from the financial system:
    i. COACTV-IDoc for cost center/activity type
    ii. COAMAS-Master activity type
    iii. COELEM-Cost Element Master Data
    iv. COGRP1-Cost Center Groups
    v. COGRP2-Cost Element Groups
    vi. COGRP5-Activity Type Groups
    vii. COSMAS-Master Cost Centers
    viii. CRECOR-Core Vendor master data distribution
    ix. CREMAS-Vendor master data distribution
    x. GLCORE-Master data G/L accounts (CORE IDoc)
    xi. GLMAST-Master data G/L accounts (master IDoc)
    xii. PRCMAS-Profit center master record
    xiii. RCLROL-Reconciliation ledger rollup
    c. To add these messages you can follow the steps listed here:
    i. Go into change mode by clicking on the pencil/glasses button or Distribution model->Switch Processing Modes via the menu
    ii. Click on the Create Model View Button
    Fill in the data for your model. Below is my example.
    iii. Now it is as simple as adding message types. Click on the Add Message Type button and enter the required data. The sender is the FI system and the receiver is the Payroll system. Tip: once you add the first message type, if you select it before adding the next message you will only need to fill out the message portion and not the sender and receiver.
    iv. Now we need to build the partner profiles so that the systems know where the distribution model needs to get or put things. To do this go to Environment->Generate Partner Profiles.
    Enter the partner system you want to use and press execute or F8.
    This screen shows the successful creation of the partner profiles. If you want to see them can go to transaction WE20.
    v. Now we must send the model to the other system. To do this use menu path Edit->Model View->Distribute
    Now I should be able to see this model in my FI system, in this case, DM3. I can take a look at this via transaction BD64 in the other system
    You'll notice that the coloring on the messages and models is in gray. This indicates that the model is maintained in another system.
    d. We also have to configure the model for posting to the other system. In this case we will be using BAPIs and not message types. SAP recommends the following BAPIs:
    i. AcctngServices.DocumentDisplay, Accounting: Display Method for Follow-On Document Display
    ii. AcctngEmplyeeExpnses.Post, Accounting: Post G/L Acct Assignment for HR Posting
    iii. AcctngEmplyeeRcvbles.Post, Accounting: Post Cust. Acct Assignment for HR Posting
    iv. AcctngEmplyeePaybles.Post, Accounting: Post Vendor Acct Assignement for HR Posting
    e. For the most part the steps are the same as above, but it is slightly different because you use the Add BAPI button instead of the Add Message Type button. BAPIs are case-sensitive, so you'll need to be exact.
    Edited by: Alvaro Tejada Galindo on Feb 22, 2008 2:27 PM

  • Are these the correct linked libraries inside of the AppKit?

    I was wondering, I was having strange issues with a program and DTrace was showing this error:
    dtrace: error on enabled probe ID 2 (ID 1371: pid353:libobjc.A.dylib:objc_msgSend:entry): invalid address (0x7fff9410a22a) in action #3 at DIF offset 24
    dtrace: error on enabled probe ID 2 (ID 1371: pid353:libobjc.A.dylib:objc_msgSend:entry): invalid address (0x7fff9410a22a) in action #3 at DIF offset 24
    dtrace: error on enabled probe ID 2 (ID 1371: pid353:libobjc.A.dylib:objc_msgSend:entry): invalid address (0x7fff9410a22a) in action #3 at DIF offset 24
    Which ultimately showed up belonging to AppKit in LLDB, like so:
    (lldb) image lookup -v -a 0x7fff9410a22a
          Address: AppKit[0x0000000000acb22a] (AppKit.__TEXT.__objc_methname + 502802)
          Summary: "_inspectorBarView"
           Module: file = "/System/Library/Frameworks/AppKit.framework/Versions/C/AppKit", arch = "x86_64"
    Anyways, using otool, I saw a whole bunch of dynamic libraries linked with the AppKit binary and would like to confirm that these are all legit (Any help would be appreciated, thanks!):
    jburnetts-Mac-Pro:~ jburnett$ otool -L /System/Library/Frameworks/AppKit.framework/Versions/C/AppKit
    /System/Library/Frameworks/AppKit.framework/Versions/C/AppKit:
              /System/Library/Frameworks/AppKit.framework/Versions/C/AppKit (compatibility version 45.0.0, current version 1265.20.0)
          /System/Library/PrivateFrameworks/RemoteViewServices.framework/Versions/A/RemoteViewServices (compatibility version 1.0.0, current version 1.0.0)
          /System/Library/Frameworks/ApplicationServices.framework/Versions/A/ApplicationServices (compatibility version 1.0.0, current version 48.0.0)
              /System/Library/Frameworks/AudioToolbox.framework/Versions/A/AudioToolbox (compatibility version 1.0.0, current version 492.0.0)
              /System/Library/Frameworks/AudioUnit.framework/Versions/A/AudioUnit (compatibility version 1.0.0, current version 1.0.0)
              /System/Library/Frameworks/CoreData.framework/Versions/A/CoreData (compatibility version 1.0.0, current version 481.1.0)
          /System/Library/PrivateFrameworks/DataDetectorsCore.framework/Versions/A/DataDetectorsCore (compatibility version 1.0.0, current version 354.4.0)
          /System/Library/PrivateFrameworks/DesktopServicesPriv.framework/Versions/A/DesktopServicesPriv (compatibility version 1.0.0, current version 738.3.1)
              /System/Library/Frameworks/Foundation.framework/Versions/C/Foundation (compatibility version 300.0.0, current version 1056.13.0)
          /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/HIToolbox.framework/Versions/A/HIToolbox (compatibility version 1.0.0, current version 698.0.0)
              /System/Library/Frameworks/QuartzCore.framework/Versions/A/QuartzCore (compatibility version 1.2.0, current version 1.8.0)
              /System/Library/Frameworks/Security.framework/Versions/A/Security (compatibility version 1.0.0, current version 55471.14.2)
          /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/SpeechRecognition.framework/Versions/A/SpeechRecognition (compatibility version 1.0.0, current version 1.0.0)
              /usr/lib/libauto.dylib (compatibility version 1.0.0, current version 1.0.0)
              /usr/lib/libicucore.A.dylib (compatibility version 1.0.0, current version 51.1.0)
              /usr/lib/libxml2.2.dylib (compatibility version 10.0.0, current version 10.9.0)
              /usr/lib/libz.1.dylib (compatibility version 1.0.0, current version 1.2.5)
              /System/Library/PrivateFrameworks/CoreUI.framework/Versions/A/CoreUI (compatibility version 1.0.0, current version 231.0.0)
              /System/Library/Frameworks/CoreAudio.framework/Versions/A/CoreAudio (compatibility version 1.0.0, current version 1.0.0)
              /System/Library/Frameworks/DiskArbitration.framework/Versions/A/DiskArbitration (compatibility version 1.0.0, current version 1.0.0)
              /usr/lib/liblangid.dylib (compatibility version 1.0.0, current version 1.0.0)
          /System/Library/PrivateFrameworks/MultitouchSupport.framework/Versions/A/MultitouchSupport (compatibility version 1.0.0, current version 245.13.0)
              /System/Library/Frameworks/IOKit.framework/Versions/A/IOKit (compatibility version 1.0.0, current version 275.0.0)
              /usr/lib/libDiagnosticMessagesClient.dylib (compatibility version 1.0.0, current version 1.0.0)
              /System/Library/Frameworks/CoreServices.framework/Versions/A/CoreServices (compatibility version 1.0.0, current version 59.0.0)
          /System/Library/PrivateFrameworks/PerformanceAnalysis.framework/Versions/A/PerformanceAnalysis (compatibility version 1.0.0, current version 1.0.0)
          /System/Library/PrivateFrameworks/GenerationalStorage.framework/Versions/A/GenerationalStorage (compatibility version 1.0.0, current version 160.2.0)
              /System/Library/Frameworks/OpenGL.framework/Versions/A/OpenGL (compatibility version 1.0.0, current version 1.0.0)
              /System/Library/PrivateFrameworks/Sharing.framework/Versions/A/Sharing (compatibility version 1.0.0, current version 132.2.0)
              /usr/lib/libobjc.A.dylib (compatibility version 1.0.0, current version 228.0.0)
              /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1197.1.1)
              /System/Library/Frameworks/ImageIO.framework/Versions/A/ImageIO (compatibility version 1.0.0, current version 1.0.0)
              /System/Library/Frameworks/CoreText.framework/Versions/A/CoreText (compatibility version 1.0.0, current version 1.0.0)
              /System/Library/Frameworks/CoreGraphics.framework/Versions/A/CoreGraphics (compatibility version 64.0.0, current version 600.0.0)
              /System/Library/Frameworks/CoreFoundation.framework/Versions/A/CoreFoundation (compatibility version 150.0.0, current version 855.14.0)

  • Are these the correct memory cards?

    Hi, I'd love it if someone could help me out here. I have previously posted about speeding up my old Mac G5. It is only a single processor 1.8 Ghz.
    I was recommended OWC, in order to buy memory, but I'm living in Ireland and have been recommended this product from this company:
    http://www.komplett.ie/k/ki.aspx?sku=310789
    Does anyone know if this is the one to buy? Here are my specs. Thanks!
    Machine Name: Power Mac G5
    Machine Model: PowerMac7,2
    CPU Type: PowerPC 970 (2.2)
    Number Of CPUs: 1
    CPU Speed: 1.8 GHz
    L2 Cache (per CPU): 512 KB
    Memory: 512 MB
    Bus Speed: 900 MHz
    Boot ROM Version: 5.1.5f2

    Thanks for this info. Much appreciated.
    Can I ask why you recommend purchasing a hard drive? On this G5 we're speaking about, I have a 'Macintosh HD' drive (of which there are only 6 Gigs left -- although this is probably because the computer is clogged up with crap. I will clear this up soon).
    There is also a Movie Drive (70 Gigs left on it), which I just rebuilt using Disk Warrior 3 after it disappeared on me.
    It might sound like a stupid question, but if I buy another drive, can I insert that into the G5 and leave the other two in it as well, or are you saying to replace a damaged drive or something? Would a third drive speed up the computer? Sorry, but I don't understand. Thanks again.

  • Table name in Receiver JDBC Adapter

    Hi All,
    I am using a receiver JDBC adapter.
    But the table name contains a double-quote character, like BPC."#II".
    After mapping, the table name becomes BPC.&quot;#II&quot;.
    I am getting an error while pulling data from the DB.
    Is the table name creating the problem?
    Please suggest a remedy if you have come across such a scenario.
    Regards
    Piyush

    Hi Piyush,
    All that I could get from the SAP Help regarding JDBC adapters are these links, where I never found much regarding the table name. Anyway, just go through these links and see if you find anything useful.
    http://help.sap.com/bp_bpmv130/Documentation/Planning/XIUnicodeGuide030411.pdf
    Configuring the Receiver JDBC Adapter: http://help.sap.com/saphelp_nw04/helpdata/en/b0/676b3c255b1475e10000000a114084/content.htm
    Mapping Lookups: http://help.sap.com/saphelp_nw04/helpdata/en/cf/406642ea59c753e10000000a1550b0/content.htm
    JDBC Adapter: http://help.sap.com/saphelp_nw04/helpdata/en/22/b4d13b633f7748b4d34f3191529946/content.htm
    Configuring the Receiver JDBC Adapter - part 2: http://help.sap.com/saphelp_nw04/helpdata/en/64/ce4e886334ec4ea7c2712e11cc567c/content.htm
    This is all i got from the help files, anyways you can also go through those links and see if you find anything else useful.
    - Escape Symbol for Apostrophe
    The apostrophe character (') is a reserved character in SQL syntax and is therefore replaced by an escape character if it occurs within value strings. This replacement character can be database-specific. Typical replacement characters are \' or '' (default value). If a character occurs that is invalid for the database being used, the adapter triggers an error message (an SQL exception) concerning the SQL syntax that is generated by the database.
    - Column Name Delimiter
    Depending on the database being used, column names can be enclosed by a special delimiter character, for example, if the names can contain special characters (such as "). This character can be specified at this point. The default setting is no delimiter character. If a character occurs that is invalid for the database being used, the adapter triggers an error message (an SQL exception).
    Also check if there are notes in the service market place related to the same.
    Regards,
    abhy
    Message was edited by: Abhy Thomas
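    For illustration only, the statement the adapter ultimately has to send for such a table looks roughly like this (the column names here are invented for the example); whatever the channel settings, the table name must reach the database as BPC."#II" with literal double quotes, not in the escaped &quot; form shown in the mapping output:
         INSERT INTO BPC."#II" ("FIELD1", "FIELD2") VALUES ('value1', 'value2');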

  • JDBC adapter taking too much time for inserting

    We have given an "update insert" query to be used in the JDBC adapter to insert these records into the database. It consists of 3 tables, with 29 fields in the first and 4 fields in each of the other two.
    While a message in XI gets processed in just 1-3 seconds, the database processing time goes up to as much as 8 minutes for one record to get inserted.
    As an immediate solution, is there any way we can have the JDBC adapter process these messages faster? These messages get queued up, and hence all the other messages also get queued up, delaying the other interfaces. We have a central adapter engine...
    Also, is there any way we can get an alert when the status is "Processing/To Be Delivered/Delivering" and the message count exceeds a certain number, say 1000?

    I am using only one receiver JDBC channel.
    We have been inserting into three different tables by using 3 different statement tags, i.e. statement1 (for table1), statement2 (for table2), statement3 (for table3).
    My structure is:
    <MessageTypeName>
        <Statement1>
            <tag1>
                <action>UPDATE_INSERT</action>
                <table>Table1</table>
                <access>
                    <field1/>
                    <field2/>
                    <field28/>
                </access>
                <key>
                    <MatNumber/>
                </key>
            </tag1>
        </Statement1>
        <Statement2>
            <tag1>
                <action>UPDATE_INSERT</action>
                <table>Table2</table>
                <access>
                    <field1/>
                    <field2/>
                    <field4/>
                </access>
                <key>
                    <MatNumber/>
                </key>
            </tag1>
        </Statement2>
        <Statement3>
            <tag3>
                <action>UPDATE_INSERT</action>
                <table>Table3</table>
                <access>
                    <field1/>
                    <field2/>
                    <field4/>
                </access>
                <key>
                    <MatNumber/>
                </key>
            </tag3>
        </Statement3>
    </MessageTypeName>
    You can see we are also using the key element. In the first table we have 28 fields; in the second and third we have 4.
    Edited by: rajesh shanmugasundaram on Jul 31, 2008 11:08 AM
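    For what it's worth, my understanding is that UPDATE_INSERT makes the adapter first run an UPDATE with the key condition and, only if no row was updated, an INSERT, so per record that is up to two statements against each table. A rough SQL sketch of what gets executed for Table1 (column and key names follow the structure above):
         UPDATE Table1 SET field1 = ?, field2 = ?, ..., field28 = ? WHERE MatNumber = ?;
         -- only if the UPDATE matched no rows:
         INSERT INTO Table1 (field1, field2, ..., field28, MatNumber) VALUES (?, ?, ..., ?, ?);
    If MatNumber is not indexed on the database side, every one of those UPDATEs is a full table scan, which by itself can explain processing times of minutes per record.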

  • Sender JDBC adapter SELECT / UPDATE issue - updates more rows than selected

    Hi,
    We have configured a Sender JDBC Adapter to poll records from an Oracle table based on a flag field and then update the flag for the selected records. When tested in DEV and QA environments (where test data comes in intermittently and not in huge volumes), it's working fine.
    Both SELECT and UPDATE queries written in the Sender JDBC adapter are getting properly executed and are changing the status of the flag for the selected records from Y to N once read from the database.
    select * from <table> where flag = 'N'.
    update <table> set flag = 'Y' where flag = 'N'.
    But in the PROD environment (with records getting updated in the database every second), after XI executes the SELECT query and just before the UPDATE query is executed, new records come into the Oracle table with status flag 'N'. So when the UPDATE query runs just after the SELECT query, these unselected records also get updated to 'Y'. Thus these records never get into the result set, never reach XI, and so remain unprocessed.
    So when XI does a SELECT and UPDATE on the Oracle DB table and concurrently there is an INSERT happening into the table from the other end, the JDBC sender adapter is picking up a certain number of records but updating the status of more records than it picked up.
    So how does XI deal with such a common scenario without dropping records?
    Thanks,
    Vishak

    The condition being checked is the same for both SELECT and UPDATE statements.
    Initially I tried setting transaction isolation levels on the database to repeatable_read and serializable but it was throwing me a java.sql.SQLException error saying that these transaction levels were not valid.
    I asked for these transaction level permissions for the XI user from my DBA but the DB I am accessing provides only a view into other databases and so it's not possible.
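    One pattern that is sometimes suggested for this race (a sketch, and it assumes the channel's transaction isolation level can be raised to serializable, which the poster above could not get granted): let the SELECT lock the rows it reads, so that the UPDATE in the same transaction only touches those rows and records inserted after the poll started keep their 'N' flag for the next cycle:
         SELECT * FROM <table> WHERE flag = 'N' FOR UPDATE;
         UPDATE <table> SET flag = 'Y' WHERE flag = 'N';
    Where that is not possible, a common fallback is to move the mark-and-read logic into a database-side stored procedure that the channel calls, so that selecting and flagging happen atomically inside the database.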

  • Adapter Engine Prioritization for JDBC Adapter

    Hi ,
    I am currently working on an IDoc to JDBC scenario. The JDBC adapter is used to push the messages to three different legacy systems through three different scenarios. I need to prioritize the messages of the legacy systems (e.g. Legacy System A is more critical than Legacy System B) in the adapter engine or in the adapter itself.
    Regards,
    UserPI

    Hi
    We are having the same problem. Once one of the interfaces using a JDBC communication channel is processing a larger number of messages, the JDBC adapter seems to be blocked exclusively for that interface. All other interfaces using the JDBC adapter are on hold (status "Scheduled" in Adapter Engine Message Monitoring). I set up message prioritization in RWB (Component: Adapter Engine), but it doesn't help (nothing different from before).
    Does anybody have a hint on that?
    Thanks
    Philippe

  • Messages are in hold state due to one message at receiver JDBC adapter

    Hello,
    I am using a receiver JDBC adapter and trying to send an XML file which has an insert query to insert some data into the database i.e., Oracle 9i.
    Here at the receiver side, due to one message (which is in "To Be Delivered" state), all the other messages are put on hold and have been waiting for a long time.
    I am getting exceptions like this:
    JDBC Adapter processing failed with Error processing request in sax parser: Error when executing statement for table/stored proc. 'FSASMGR.XTBL_KL06_IINQUIRY' (structure 'REC01'): java.sql.SQLException: ORA-12899: value too large for column "FSASMGR"."XTBL_KL06_IINQUIRY"."CASE_TITLE" (actual: 81, maximum: 80)
    Exception caught when executing statement for table/stored proc. 'FSASMGR.XTBL_KL01_ISTAFF_MST' (structure 'REC1'):
    java.sql.SQLException: ORA-00001: unique constraint (FSASMGR.XTBL_KL01_ISTAFF_MST) violated
    Can anyone help me out in solving this issue?
    THanks,
    Soorya

    Hi Soorya,
    The JDBC channel retries a request n times (the number of retries); if that request fails, it will take up the following requests. It is not the case that it will process one request forever.
    "To Be Delivered" results if the receiver is down or the network between XI and the database server is bad.
    Try to ping the database server from the XI host system.
    If everything is fine, then look at the channel configuration -> Advanced -> number of retries of the database transaction.
    Just reduce the count to 1 and recheck.
    Regards,
    rama Krishna
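    As a side note, the first exception already names the root cause: the value mapped into CASE_TITLE is 81 characters while the column allows only 80, so the mapping has to shorten the value (or the column has to be widened). A quick way to confirm the column definition on the Oracle side (a sketch; ALL_TAB_COLUMNS is a standard dictionary view, and the owner, table and column names are taken from the error message):
         SELECT column_name, data_type, data_length
           FROM all_tab_columns
          WHERE owner = 'FSASMGR'
            AND table_name = 'XTBL_KL06_IINQUIRY'
            AND column_name = 'CASE_TITLE';
    The second exception (ORA-00001) means the same key is being inserted twice, so either the source data contains duplicates or the records were already inserted in an earlier attempt.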

  • What are the Batch mode parameters for Receiver JDBC Adapter

    Hi All,
         Could someone please tell me how to set batch mode in the receiver JDBC adapter, what its parameters are, and how to configure them? I believe there is something like a max count parameter, etc.
    Regards,
    Xier

    Hi,
    Check this for more info
    http://help.sap.com/saphelp_nw04s/helpdata/en/64/ce4e886334ec4ea7c2712e11cc567c/frameset.htm
    Regards
    Seshagiri

  • JDBC adapter disconnects if there is no message processing for 2 days

    Hi All,
    I have a sender JDBC adapter configured, which polls for messages every 3 minutes.
    XI is constantly polling the database (SQL Server), and since there are no messages on weekends, after a certain interval it disconnects, giving a null exception.
    I have checked the advanced option "Disconnect from Database After Processing Each Message"; still it is giving this error. I am on SP 12.
    Can someone please let me know what could be the reason for this null exception? Why does it disconnect if there is no message processing for a long interval?
    Any help would be appreciated.
    Regards,
    Amit

    It means we don't support that file type for doing imports. It sounds like a reasonable request to add to the Exchange (sqldeveloper.oracle.com), though.

  • JDBC adapter: greek characters

    Hi All,
    We are reading and sending data from an MS SQL database towards our R/3 system via PI (JDBC adapter).
    I am having some problems related to exotic (Greek) characters. Reading these characters from the database works fine without any additional config, but writing these characters gives problems.
    When I check the message logging inside XI, the characters look fine, but the result in the database is not.
    ex:
      XI       = Martine ΑΤΤΙΚΟ ΘΕΡΑΠΕΥΤΗΡΙΟ
      dbase = Martine ?????? T???????????
    I already tried changing the charset of the communication channel to "iso-8859-7", but that did not fix the problem.
    Anyone an idea?
    Kindest regards
    Joris

    In note 831162 (FAQ: XI 3.0 / PI 7.0 JDBC Adapter) is mentioned:
    "2. Unicode Handling
    Q: I am inserting Unicode data into a database table or selecting Unicode data from a table. However, the data inserted into or retrieved from the table appears garbled. Why doesn't the JDBC Adapter handle Unicode correctly?
    A: While the JDBC Adapter is Unicode-aware, many JDBC drivers and/or database management systems aren't by default and need a codepage or Unicode-awareness to be configured explicitly. For the respective JDBC drivers, this codepage setting is often configured via the driver URL. For details, refer to the documentation of your JDBC driver or database management system.
    Note: For the Lotus Domino Driver for JDBC, there does not seem to be any documented method to enable support for Unicode strings."
    Does this help you?
    Regards
    Stefan
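    As an illustration only (the exact property names depend on the JDBC driver in use, so treat these as assumptions to check against your driver's documentation), such codepage settings are typically appended to the connection URL configured in the channel, for example:
         jdbc:sqlserver://dbhost:1433;databaseName=MYDB;sendStringParametersAsUnicode=true
         jdbc:jtds:sqlserver://dbhost:1433/MYDB;charset=ISO-8859-7
    Independently of the driver, the target column usually has to be an NVARCHAR (or the database collation has to cover Greek) for the characters to survive on the database side.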
