Performance problem in RFC to JDBC interface

Hello everybody!
I'm working with SAP PI 7.1.
We have defined some RFC - PI - JDBC (SQL Server) interfaces, but we have performance problems.
If there are many rows to write to the table, the interface ends in a timeout:
Synchronous timeout exceeded.
Returning to application. Exception: com.sap.engine.interfaces.messaging.api.exception.MessageExpiredException: Message 1d1f00b0-fecf-11de-8738-0015600446f0(OUTBOUND) expired.
I read the PI tuning document and tried to apply the Advanced Adapter Engine configuration, but without result.
Now we want to change the timeout in Visual Admin, and maybe that will make the error go away, but I'm asking myself:
Is it normal that writing 1,500 rows into a table takes more than 4 minutes?
Is it possible to speed this process up? After go-live we will write messages with more than 50,000 rows.
Can somebody help me?
PS: please, no links to the tuning guide or to notes about increasing the timeout parameter.

This could be because your database system (the JDBC server) is taking a long time to insert. The problem is not on the PI side but on the receiving system side. Try inserting the same number of rows on the database server itself and check the time taken for execution. Adding indexes on your database table solves the issue a lot of times.
Here PI is not the culprit, but definitely the receiver system.
VJ
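
One thing worth checking on the PI side, besides the database itself, is how the target message is structured. In the standard JDBC receiver document format, repeated <access> blocks under a single statement describe several rows for the same table, so the whole message can be passed to the adapter as one statement block instead of one statement element per row. A rough sketch of such a payload (the message type, Statement and MYTABLE names are placeholders, not taken from this thread):

   <ns0:MT_Insert xmlns:ns0="urn:example:jdbc">
      <Statement>
         <MYTABLE action="INSERT">
            <table>MYTABLE</table>
            <access>
               <COL1>value 1a</COL1>
               <COL2>value 1b</COL2>
            </access>
            <access>
               <COL1>value 2a</COL1>
               <COL2>value 2b</COL2>
            </access>
            <!-- one <access> block per row to insert -->
         </MYTABLE>
      </Statement>
   </ns0:MT_Insert>

Whether this actually speeds things up depends on the channel settings and on the database, so treat it only as something to try alongside the checks on the SQL Server side.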

Similar Messages

  • Entire JDBC communication stopped if problem with one single JDBC interface

    Hello,
    Will the entire JDBC communication stop if there is a problem with one single JDBC interface?
    Thanks,
    Soorya.

    Hi Surya,
    This will happen if you use "Maintain Order at Runtime" in the interface determination.
    Just uncheck this option if you don't need EOIO.
    If you are getting the problem with EO, then the cause might be using the same JDBC channel for all interfaces.
    If each interface is expected to carry a high load, it is better to go for dedicated channels per interface,
    e.g. INTERFACE A should use JDBC channel A and INTERFACE B should use JDBC channel B.
    Thanks & Regards,
    Rama Krishna

  • JDBC communication stopped if problem with one single JDBC interface

    Hello,
    Can you please explain this phrase
    "JDBC communication stopped if problem with one single JDBC interface"
    Thanks,
    Soorya

    If you are having a problem with a JDBC interface (let's consider this to be a communication channel), then communication via that channel is stopped (only in the case of EOIO).
    Hope this clarifies.
    Cheers,
    Sarath
    Award if helpful.

  • Mapping problem in message response (RFC - XI - JDBC)

    Hi all,
    I'm using the following scenario:
    RFC <-> XI <-> JDBC
    It's a synchronous interface.
    On the JDBC side, the message does a SELECT on the database and returns the selected rows in the message response.
    The message response transfers the selected rows to RFC.response in the mapping.
    It seems to work, but no data is transferred to the RFC.
    In SXMB_MONI I can see the rows selected from the database.
    There is no error in the JDBC or RFC adapter. Below is the message response in SXMB_MONI:
      <ns1:MT_jbdc_select_response xmlns:ns1="http://braskem.com.br/xi/sapxi03">
      <t7_productionorder_response>
      <row>
      <PRODUCT>Nafta Media</PRODUCT>
      </row>
      <row>
      <PRODUCT>Nafta Media</PRODUCT>
      </row>
      </t7_productionorder_response>
      </ns1:MT_jbdc_select_response>
    I think there is an error in the message mapping between RFC.response <-- message_response, because I'm only mapping the fields <row> and <PRODUCT>; if I make a link between <t7_productionorder_response> and the return table of the RFC, a short dump happens.
    Below is the structure of the MESSAGE RESPONSE:
       <t7_productionorder_response>
          <row>
             <PRODUCT/>
          </row>
       </t7_productionorder_response>
    Below is the structure of the RFC RESPONSE:
    <RFC.response>
      <t_return>
         <item>
             <product/>
         </item>
      </t_return>
    </RFC.response>
    When I make a link in the mapping between
    <t7_productionorder_response> -> <t_return>
    a short dump happens in the RFC call in R/3.
    Could anyone help me with this problem?
    Thanks in advance
    Regis Ferrato

    Hi,
    "because if I make a link between <t7_productionorder_response> and the return table of the RFC, a short dump happens"
    After doing this mapping, did you test it in the IR mapping editor? Was it successful?
    Please specify the occurrence of all the headers so we can help you better.
    Regards,
    Smitha.

  • Synchronous - asynchronous interface (RFC - XI - JDBC)

    Hello everybody!
    I want to configure a synchronous-asynchronous interface (RFC - PI - JDBC) with my PI 7.1.
    This RFC sends 2 output tables to PI/XI, and I want to write them into 2 different tables in a SQL Server database.
    How can I do it?
    I tried with one mapping to a single outbound data type, but only the header table was inserted and not the rows.
    I tried with 2 receiver interfaces, but I get the following error in the mapping:
    Multiple inbound interfaces not supported for synchronous calls
    So??
    Is there another possibility?
    BPM?
    Please help me.
    Thanks
    Alessandro

    Hi,
    I have a similar problem; I have a JMS - PI - RFC synchronous scenario. I will receive a message from JMS and, depending on the data received, I should call either of two RFCs (NOT BOTH). The BAPI return message should then be sent back to JMS.
    Hence, I have one "Sender/Outbound" message interface and one receiver determination for that message interface, in which there is one receiver (the backend SAP system).
    In the "Interface Determination" I specified the two BAPIs as two interfaces and defined conditions based on "context objects", since I need to determine the interface mapping dynamically.
    However, I get the error "Multiple inbound interfaces not supported for synchronous calls". Any suggestions on how to proceed?
    I am posting this question here as it is a related issue. Hope that is fine.
    Thanks,
    Archana

  • Performance Problems with Web Layouts in web interface

    Hello Gurus,
    We have a BPS web interface tool which has the following design:
    1> A web interface with several tabs
    2> Each tab has around 3-4 input layouts which are dependent on each other
    3> In all, there are 120-140 layouts that the tool uses
    My questions are in terms of performance:
    1> Is there a limit to how many web layouts you can use per page/tab/view, or does SAP recommend a specific number of web layouts per page/tab/view?
    2> If there is a limitation, our intention is to convert all the display layouts into BW reports so as to increase the performance of the tool.
    3> I would also like to know the restriction on the number of users who can log into the tool at a specific point in time. We may have a minimum of 50-60 users using this tool.
    I would appreciate your help in this regard.
    Thanks in advance

    Hello Rashmi,
    Have you had a chance to look at the performance guide and the SAP notes on BPS performance? If not, here are the details:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/documents/a1-8-4/performance guide - sap sem bw bps.pdf
    Enclosed are a few SAP notes related to improving performance:
    358921 - Oracle database parameterization for SEM
    459897 - SEM-CPM: Performance when reading transaction data
    566713 - Required information for the analysis of performance problems
    560369 - Proposals BW aggregates for SEM-BPS
    180605 - Oracle database parameter settings for BW
    124361 - Oracle parameterization (R/3 >= 4.x, Oracle 8.x / 9.x)
    358529 - Overview of performance notes
    350011 - Technical performance: Using the business content
    340246 - Techn. performance: Overview of statistics   
    417091 - Optimize execution time of planning functions
    Some of them are Oracle-specific; ignore them if you are not on an Oracle database. Hope this helps.
    Thanks,
    Praveen
    PS. Don't forget to reward points.

  • Problem in back-to-back interface calls using JDBC adapter

    Hi,
    I am facing a problem while using the JDBC adapter from XI to Oracle for back-to-back interface calls.
    My scenario is as follows: I am doing a synchronous call with the first interface, which is working fine. Immediately after the synchronous call I am doing an asynchronous call, which is shown as successful in XI, but it is not reaching the Oracle side at all, although it is supposed to update a table in Oracle using a stored procedure.
    Any idea what could be the problem?
    Shravan

    Hi Shravan,
    It would be helpful if you could tell us the error thrown by your JDBC adapter in RWB --> Adapter Monitoring.
    Also, if possible, can you give us the message being passed to your JDBC adapter?
    Meanwhile, just check your JDBC adapter document format against the formats in this link:
    http://help.sap.com/saphelp_nw04/helpdata/en/2e/96fd3f2d14e869e10000000a155106/content.htm
    For the execution of a stored procedure using the JDBC adapter, make sure that the action attribute is EXECUTE and that the table name provided is the name of the stored procedure.
    Regards,
    Bhavesh
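    As a rough illustration of that format (the message type, statement and parameter names below are placeholders, not taken from this thread, and the parameter attributes have to match the procedure's signature), a stored procedure call in the receiver JDBC adapter payload looks roughly like this:
       <ns0:MT_ExecuteProc xmlns:ns0="urn:example:jdbc">
          <Statement>
             <MY_STORED_PROC action="EXECUTE">
                <table>MY_STORED_PROC</table>
                <!-- sample parameters; isInput/isOutput and type depend on the procedure -->
                <PARAM1 isInput="true" type="INTEGER">100</PARAM1>
                <PARAM2 isInput="true" type="VARCHAR">ABC</PARAM2>
             </MY_STORED_PROC>
          </Statement>
       </ns0:MT_ExecuteProc>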

  • Performance problems in XI using RFC

    Hi All,
    I have some doubts about XI performance:
    Does anybody know if there are any performance restrictions on making RFC calls to XI?
    What is the best-performing solution in XI: IDoc adapter or RFC adapter, File adapter or RFC adapter?
    Is it possible that something in the mapping causes a performance problem?
    Thanks

    Hi
    You need to create a Type H RFC destination in the PI system pointing to ECC, and the reverse in the ECC system pointing to PI.
    Create the RFC destination and test it. When you test the connection it will show an error because of the empty message, but it will work fine for the proxy with data.
    Thanks
    Gaurav

  • JDBC performance problem with Blob

    Hi,
    I have a performance problem while inserting BLOBs into an Oracle 8i database. I'm using JVM 1.17 on HP-UX 11.0 with the Oracle thin client.
    The table I use contains only two columns: an integer (the primary key) and a BLOB.
    First I insert a row into the table with an empty_blob().
    Then I select this row back to get the BLOB,
    and finally I fill in the BLOB with the data.
    But it takes an average of 4.5 seconds to insert a BLOB of 47 KB.
    Am I doing something wrong?
    Any suggestion or hint will be welcome.
    Thanks in advance
    Didier

    Don S. (guest) wrote:
    : In our testing, if you use blob.putBytes() you will get better performance. The drawback we found was the 32k limit we ran into. We had to chunk anything larger than that and make calls to the append method. I was disappointed in Oracle's phone support on what causes the 32k limit. In addition, getBytes() for retrieval doesn't seem to work; you'll have to use the InputStream for that. Oh, and FYI: we ran into a 2k limit on putChars() for CLOBs.
    The thin drivers currently use the package "dbms_lob" behind the scenes, while the JDBC OCI 8.1.5 drivers and higher use the native OCI calls, which makes them much faster.
    There is also a 32k limit on PL/SQL stored procedures/functions for parameters; you may have run into that.

  • Reg error in interface mapping in RFC to JDBC scenario

    Hi Techies,
    I am trying to map an RFC to JDBC scenario.
    In the configuration window I am testing the configuration.
    When I test it, it gives the error:
    " com.sap.aii.utilxi.misc.api.BaseRuntimeException thrown during application mapping com/sap/xi/tf/_mm_mapping_: Parsing an empty source. Root element expected! "
    My mapping is as follows (RFC source fields on the left, mt_receiver target fields on the right; the target root is mt_receiver / statement / ROW with action = insert):
        ZHRT002 ---- TABLE
        item ---- access
        MANDT
        PERNR ---- PERNR
        PERID ---- PERID
        SHOPN ---- SHOPN
    Sender
    <?xml version="1.0" encoding="UTF-8"?>
    <ns0:ZTEST_PI_ZHRT002 xmlns:ns0="urn:sap-com:document:sap:rfc:functions">
       <ZHRT002>
          <item>
             <MANDT/>
             <PERNR/>
             <PERID/>
    </item>
       </ZHRT002>
    </ns0:ZTEST_PI_ZHRT002>
    Receiver
    <?xml version="1.0" encoding="UTF-8"?>
    <ns1:mt_receiver xmlns:ns1="urn:sap-com:RFC2JDBC">
       <Statement1>
          <ROW action="INSERT">
             <TABLE>ZHRT002</TABLE>
             <access>
                <PERNR></PERNR>
                <PERID></PERID>
             </access>
          </ROW>
       </Statement1>
    </ns1:mt_receiver>
    Please give suggestions on how to go about this.
    Thanks in advance,
    Regards,
    Kiran

    Hi,
    Please check the payload you are using to test the configuration.
    The structure seems to be incorrect.
    Regards,
    Deepak
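    For reference, a filled test instance of the sender structure shown above could look like the following (the field values here are invented). The "Parsing an empty source. Root element expected!" error usually means the source payload used for the test was empty, so a complete instance like this should be pasted into the test tool:
       <?xml version="1.0" encoding="UTF-8"?>
       <ns0:ZTEST_PI_ZHRT002 xmlns:ns0="urn:sap-com:document:sap:rfc:functions">
          <ZHRT002>
             <item>
                <!-- sample values only -->
                <MANDT>100</MANDT>
                <PERNR>00001234</PERNR>
                <PERID>AB123456</PERID>
             </item>
          </ZHRT002>
       </ns0:ZTEST_PI_ZHRT002>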

  • RFC-XI-JDBC Scenario: Help with RFC code

    Hi,
    I am doing an RFC-XI-JDBC scenario, where I have to poll the contents of my Z-table in SAP to Oracle.
    The approach I am using here is:
    1. Created an FM with import parameters for the fields of my Z-table, without any export parameters or source code.
    2. Created a report to call that FM in the background. The code of my report is as follows:
    ************************REPORT***********************************************************
    Data:       it_zrfc_read_table type table of zrfc_read_table,
                wa_zrfc_read_table like line of it_zrfc_read_table.
    PARAMETERS: tab_name like DD02L-TABNAME.
    Data:       l_tabname type DD02L-TABNAME.
    At selection-screen.
    select single tabname from DD02L into l_tabname  where tabname = tab_name.
    if sy-subrc <> 0.
      message 'incorrect table name' type 'E'.
    endif.
    start-of-selection.
    select * from (tab_name) into corresponding fields of table it_zrfc_read_table.
    loop at it_zrfc_read_table into wa_zrfc_read_table.
    CALL FUNCTION 'ZRFC_READ_TABLE2XI'
    IN BACKGROUND TASK DESTINATION 'ORACLEGIS_RFC_SENDER'
      EXPORTING
        valve_id       = wa_zrfc_read_table-valve_id
        equnr          = wa_zrfc_read_table-equnr
        ernam          = wa_zrfc_read_table-ernam
        invnr          = wa_zrfc_read_table-invnr
        groes          = wa_zrfc_read_table-groes
        elief          = wa_zrfc_read_table-elief
        gwlen          = wa_zrfc_read_table-gwlen
        gwldt          = wa_zrfc_read_table-gwldt
        serge          = wa_zrfc_read_table-serge
        typbz          = wa_zrfc_read_table-typbz.
    endloop.
    commit work.
    Now my problem is that although I am able to send a table with a single record, when there is more than one record it is not able to poll.
    Please guide me on what can be altered in the code, or give any other suggestion.
    Thanks,
    Puneet

    Hi,
    Instead of calling the RFC many times, I would:
    1) create an RFM with only one import parameter,
    typed with your table type
    2) call the RFM only once in the report:
      CALL FUNCTION 'ZRFCNAME'
        IN BACKGROUND TASK
        DESTINATION 'RFCDEST'
        EXPORTING
          pt_table = i_table.
      COMMIT WORK AND WAIT.
    This will also improve the performance.
    Regards,
    Jakub

  • Performance problem when running a personalization rule

    We have a serious performance problem when running a personalization rule.
    The rule is defined like this:
    Definition
    Rule Type: Content
    Content Type: LoadedData
    Name: allAnnouncements
    Description: all announcements of types: announcement, deal, new release,
    tip of the day
    If the user has the following characteristics:
    And when:
    Then display content based on:
    (CONTENT.RessourceType == announcement) or (CONTENT.RessourceType == deal)
    or (CONTENT.RessourceType == new release) or (CONTENT.RessourceType == tip
    of the week)
    and CONTENT.endDate > now
    and CONTENT.startDate <= now
    END---------------------------------
    and is invoked in a JSP page like this:
    <%String customQuery = "(CONTENT.language='en') && (CONTENT.Country='nl'
    || CONTENT.Country='*' ) && (!(CONTENT.excludeIds like '*#7#*')) &&
    (CONTENT.userType ='retailer')"%>
    <pz:contentselector
    id="cdocs"
    ruleSet="jdbc://com.beasys.commerce.axiom.reasoning.rules.RuleSheetDefinitio
    nHome/b2boost"
    rule="allAnnouncements"
    sortBy="startDate DESC"
    query="<%=customQuery%>"
    contentHome="<%=ContentHelper.DEF_DOCUMENT_MANAGER_HOME%>" />
    The customQuery is constructed at runtime from user information and cannot be constructed with the rules administration interface.
    When I turn on debugging mode, I can see that the rule is parsed and a SQL
    query is generated, with the correct parameters.
    This is the generated query (with the substitutions):
    select
    WLCS_DOCUMENT.ID,
    WLCS_DOCUMENT.DOCUMENT_SIZE,
    WLCS_DOCUMENT.VERSION,
    WLCS_DOCUMENT.AUTHOR,
    WLCS_DOCUMENT.CREATION_DATE,
    WLCS_DOCUMENT.LOCKED_BY,
    WLCS_DOCUMENT.MODIFIED_DATE,
    WLCS_DOCUMENT.MODIFIED_BY,
    WLCS_DOCUMENT.DESCRIPTION,
    WLCS_DOCUMENT.COMMENTS,
    WLCS_DOCUMENT.MIME_TYPE
    FROM
    WLCS_DOCUMENT
    WHERE
    ((((WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'RessourceType'
    AND WLCS_DOCUMENT_METADATA.VALUE = 'announcement'
    )) OR (WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'RessourceType'
    AND WLCS_DOCUMENT_METADATA.VALUE = 'deal'
    )) OR (WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'RessourceType'
    AND WLCS_DOCUMENT_METADATA.VALUE = 'new release'
    )) OR (WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = ''
    AND WLCS_DOCUMENT_METADATA.VALUE = 'tip of the week'
    )) OR (WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'RessourceType'
    AND WLCS_DOCUMENT_METADATA.VALUE = 'press release'
    AND (WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'endDate'
    AND WLCS_DOCUMENT_METADATA.VALUE > '2001-10-22 15:53:14.768'
    AND (WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'startDate'
    AND WLCS_DOCUMENT_METADATA.VALUE <= '2001-10-22 15:53:14.768'
    AND ((WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'language'
    AND WLCS_DOCUMENT_METADATA.VALUE = 'en'
    AND ((WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'Country'
    AND WLCS_DOCUMENT_METADATA.VALUE = 'nl'
    )) OR (WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'Country'
    AND WLCS_DOCUMENT_METADATA.VALUE = '*'
    AND NOT (WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'excludeIds'
    AND WLCS_DOCUMENT_METADATA.VALUE LIKE '%#7#%' ESCAPE '\'
    AND (WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'userType'
    AND WLCS_DOCUMENT_METADATA.VALUE = 'retailer'
    At this moment, the server makes the user wait more than 10 min for the
    query to execute.
    This is what I found out about the problem:
    1) When I run the query on an Oracle SQL client (we are using Oracle 8.1.7.0), it takes 5-10 seconds.
    2) If I remove the second term of (CONTENT.Country='nl' || CONTENT.Country='*' ) from the custom query, thus restricting to CONTENT.Country='nl', the performance is OK.
    3) There are currently more or less 130 records in the DB that have Country='*'.
    4) When I run the page on our QA server (Solaris), which is at the same time our Oracle server, the response time is OK, but if I run it on our development server (W2K), the response time is ridiculously long.
    5) The problem also happens if I add the term (CONTENT.Country='nl' || CONTENT.Country='*' ) to the rule definition and remove this part from the custom query.
    Am I missing something? Am I using the personalization server correctly?
    Is this performance difference between QA and DEV due to differences in the
    OS?
    Thank you,
    Luis Muñiz

    Luis,
    I think you are working through Support on this one, so hopefully you are in good
    shape.
    For others who are seeing this same performance issue with the reference CM implementation,
    there is a patch available via Support for the 3.2 and 3.5 releases that solves
    this problem.
    This issue is being tracked internally as CR060645 for WLPS 3.2 and CR055594 for
    WLPS 3.5.
    Regards,
    PJL
    "Luis Muniz" <[email protected]> wrote:
    We have a serious performance problem when running a personalization
    rule.
    The rule is defined like this:
    Definition
    Rule Type: Content
    Content Type: LoadedData
    Name: allAnnouncements
    Description: all announcements of types: announcement, deal, new release,
    tip of the day
    If the user has the following characteristics:
    And when:
    Then display content based on:
    (CONTENT.RessourceType == announcement) or (CONTENT.RessourceType ==
    deal)
    or (CONTENT.RessourceType == new release) or (CONTENT.RessourceType ==
    tip
    of the week)
    and CONTENT.endDate > now
    and CONTENT.startDate <= now
    END---------------------------------
    and is invoked in a JSP page like this:
    <%String customQuery = "(CONTENT.language='en') && (CONTENT.Country='nl'
    || CONTENT.Country='*' ) && (!(CONTENT.excludeIds like '*#7#*')) &&
    (CONTENT.userType ='retailer')"%>
    <pz:contentselector
    id="cdocs"
    ruleSet="jdbc://com.beasys.commerce.axiom.reasoning.rules.RuleSheetDefinitio
    nHome/b2boost"
    rule="allAnnouncements"
    sortBy="startDate DESC"
    query="<%=customQuery%>"
    contentHome="<%=ContentHelper.DEF_DOCUMENT_MANAGER_HOME%>" />
    The customQuery is constructed at runtime from user information, and
    cannot
    be constructed with rules
    administration interface.
    When I turn on debugging mode, I can see that the rule is parsed and
    a SQL
    query is generated, with the correct parameters.
    This is the generated query (with the substitutions):
    select
    WLCS_DOCUMENT.ID,
    WLCS_DOCUMENT.DOCUMENT_SIZE,
    WLCS_DOCUMENT.VERSION,
    WLCS_DOCUMENT.AUTHOR,
    WLCS_DOCUMENT.CREATION_DATE,
    WLCS_DOCUMENT.LOCKED_BY,
    WLCS_DOCUMENT.MODIFIED_DATE,
    WLCS_DOCUMENT.MODIFIED_BY,
    WLCS_DOCUMENT.DESCRIPTION,
    WLCS_DOCUMENT.COMMENTS,
    WLCS_DOCUMENT.MIME_TYPE
    FROM
    WLCS_DOCUMENT
    WHERE
    ((((WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'RessourceType'
    AND WLCS_DOCUMENT_METADATA.VALUE = 'announcement'
    )) OR (WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'RessourceType'
    AND WLCS_DOCUMENT_METADATA.VALUE = 'deal'
    )) OR (WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'RessourceType'
    AND WLCS_DOCUMENT_METADATA.VALUE = 'new release'
    )) OR (WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = ''
    AND WLCS_DOCUMENT_METADATA.VALUE = 'tip of the week'
    )) OR (WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'RessourceType'
    AND WLCS_DOCUMENT_METADATA.VALUE = 'press release'
    AND (WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'endDate'
    AND WLCS_DOCUMENT_METADATA.VALUE > '2001-10-22 15:53:14.768'
    AND (WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'startDate'
    AND WLCS_DOCUMENT_METADATA.VALUE <= '2001-10-22 15:53:14.768'
    AND ((WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'language'
    AND WLCS_DOCUMENT_METADATA.VALUE = 'en'
    AND ((WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'Country'
    AND WLCS_DOCUMENT_METADATA.VALUE = 'nl'
    )) OR (WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'Country'
    AND WLCS_DOCUMENT_METADATA.VALUE = '*'
    AND NOT (WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'excludeIds'
    AND WLCS_DOCUMENT_METADATA.VALUE LIKE '%#7#%' ESCAPE '\'
    AND (WLCS_DOCUMENT.ID IN (
    SELECT
    WLCS_DOCUMENT_METADATA.ID
    FROM
    WLCS_DOCUMENT_METADATA
    WHERE
    WLCS_DOCUMENT_METADATA.ID = WLCS_DOCUMENT.ID
    AND WLCS_DOCUMENT_METADATA.NAME = 'userType'
    AND WLCS_DOCUMENT_METADATA.VALUE = 'retailer'
    At this moment, the server makes the user wait more than 10 min for the
    query to execute.
    This is what I found out about the problem:
    1)When I run the query on an Oracle SQL client (We are using Oracle 8.1.7.0)
    , it takes 5-10 seconds.
    2)If I remove the second term of (CONTENT.Country='nl' ||
    CONTENT.Country='*' ) in the custom query,
    thus retricting to CONTENT.Country='nl', the performance is OK.
    3)There are currently more or less 130 records in the DB that have
    Country='*'
    4)When I run the page on our QA server (solaris), which is at the same
    time
    our Oracle server,
    the response time is OK, but if I run it on our development server (W2K),
    response time is ridiculously long.
    5)The problem happens also if I add the term (CONTENT.Country='nl' ||
    CONTENT.Country='*' )
    to the rule definition, and I remove this part from the custom query.
    Am I missing something? Am I using the personalization server correctly?
    Is this performance difference between QA and DEV due to differences
    in the
    OS?
    Thank you,
    Luis Muñiz

  • Performance Problem with File Adapter using FTP Connection

    Hi All,
    I have a pool of 19 interfaces that send data from R/3 using the RFC adapter, and these interfaces generate 30 TXT files on a target server. I'm using File adapters as the receiver communication channels. This is causing a serious performance problem. In the File adapter I'm using an FTP connection with the "Permanently" connection mode. Does anybody know if the permanent connection is the cause of the performance problem?
    These interfaces will run once a day with a total of 600 messages.
    We are still using a test server with few messages.

    Hi Regis,
    We also faced the same problem. What happens is that when the FTP session is initiated by the File adapter, it is done from the XI server, so the memory of the server is also eaten up. Why don't you give "Per File Transfer" a try?
    If the folder you are connecting to is within your XI server's network, then you can mount (or map) that drive to the XI server and use it with the NFS protocol of the File adapter, thereby increasing the performance.
    Cheers
    JK

  • RFC to JDBC Sync. Scenario

    Hello Guys,
    I'm trying to set up an RFC - XI - JDBC scenario, where the RFC sends some parameters and the JDBC side performs a SELECT query.
    I have an RFC that was already imported with the following definition:
      IMPORTING
         VALUE(LOGIN) TYPE  STRING
         VALUE(VIGENCIA) TYPE  STRING
      EXPORTING
         VALUE(LOGIN_RES) TYPE  STRING
         VALUE(ID_USUARIO) TYPE  STRING
         VALUE(FECHA_CREACION) TYPE  STRING
         VALUE(VIGENCIA_RES) TYPE  STRING
         VALUE(PERFIL) TYPE  STRING
    I already defined the JDBC request structure as:
       <Statement_Select>
          <TableName Action="SELECT">
             <Table>USUARIO_WEB2</Table>
             <Access>
                <ID_USUARIO_WEB/>
                <LOGIN/>
                <FECHA_CREACION/>
                <VIGENCIA/>
                <PERFIL/>
             </Access>
             <Key>
                <LOGIN>value1</LOGIN>
                <VIGENCIA>value2</VIGENCIA>
             </Key>
          </TableName>
       </Statement_Select>
    After testing the RFC I'm getting the following error in the CC monitor:
    Message processing failed. Cause: com.sap.aii.af.ra.ms.api.RecoverableException: Error processing request in sax parser: Error when executing statement for table/stored proc. 'null' (structure 'LOGIN'): java.lang.IndexOutOfBoundsException: Index: 1, Size: 1
    Can anyone please help me with this? I'm not sure where the error might be.
    Thanks!
    Felipe

    Hi Agasthuri,
    Yes, I'm not passing any value to <LOGIN/> because it is a SELECT statement. This is the request SELECT structure that I'm using. Is there any problem there?
    <Statement_Select>
       <TableName Action="SELECT">
          <Table>USUARIO_WEB2</Table>
          <Access>
             <ID_USUARIO_WEB/>
             <LOGIN/>
             <FECHA_CREACION/>
             <VIGENCIA/>
             <PERFIL/>
          </Access>
          <Key>
             <LOGIN>value1</LOGIN>
             <VIGENCIA>value2</VIGENCIA>
          </Key>
       </TableName>
    </Statement_Select>
    I still have the problem, please help.
    Felipe

  • RFC to JDBC Empty tables parameters in response

    Hello,
    I've got a problem with an XI query (RFC to JDBC, UPDATE_INSERT) which contains a "tables" parameter. When I get the RFC's response in se37 (in the calling system of course), the "tables" parameter is empty. I need the values for further processing. What can I do to get back the values?
    Thanks in advance for your answers.
    Pierre Lejeune

    Hi Sarvesh.
    Thanks for your answer. The XML response in SXI_MONITOR tells me the table is empty.
    My scenario is very simple. I have a Receiver Determination, an Interface Determination, a Sender Agreement and a Receiver Agreement, a sender CC to the R/3 system and a receiver one to JDBC.
    Maybe it's a mapping problem. In the response mapping, I don't know what to link to the "tables" parameter.
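    For what it's worth, with an UPDATE_INSERT action the JDBC receiver's response normally contains only statement counts rather than table data, roughly like this (Statement1 stands for whatever statement name is used in the request):
       <Statement1_response>
          <update_count>0</update_count>
          <insert_count>1</insert_count>
       </Statement1_response>
    If that is the case here, there are no row values in the response that could be mapped back into the RFC's "tables" parameter, which would explain why it comes back empty.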
