Issue while executing the query for pagination using ROWNUM with LIKE

We are facing an issue while executing a pagination query using ROWNUM with LIKE.
The database is Oracle 11g.
The Oracle table contains 8-9 lakh (800,000-900,000) records.
1) SQL equal (=)
SELECT /*+ FIRST_ROWS(n) */ ROWNUM RNUM, A.* FROM LINE A
WHERE A.REFERENCE = 'KMF22600920'
Execution Time: 0.00869245 seconds
Returns 2 rows
2) SQL like (one %)
SELECT /*+ FIRST_ROWS(n) */ ROWNUM RNUM, A.* FROM LINE A
WHERE A.REFERENCE like 'KMF22600920%'
Execution Time: 0.01094301 seconds
Returns 2 rows
3) SQL like (two %)
SELECT /*+ FIRST_ROWS(n) */ ROWNUM RNUM, A.* FROM LINE A
WHERE A.REFERENCE like '%KMF22600920%'
Execution Time: 6.43989658 seconds
Returns 2 rows
For pagination, we are using a modified version of query 3) with ROWNUM, as shown below:
4) SELECT * FROM (
SELECT /*+ FIRST_ROWS(n) */ ROWNUM RNUM, A.* FROM LINE A
WHERE REFERENCE like '%KMF22600920%' AND ROWNUM <= 20 ) WHERE RNUM > 0
Execution Time: the query never completes (effectively infinite)
Result set: none, as the query never completes
a) If we use = instead of like in the above query, it returns the 2 rows (execution time 0.02699282 seconds).
b) If we use a single % instead of two, for example REFERENCE like 'KMF22600920%', it returns the 2 rows (execution time 0.03313019 seconds).
Issue: when two % are used in the like, i.e. REFERENCE like '%KMF22600920%' AND ROWNUM <= 20, the query runs indefinitely.
Could you please let us know what the issue is with the combination of a double-% like and ROWNUM?
5) Modified version of query 4) (the ROWNUM condition moved to the outer query as AND RNUM <= 20)
SELECT * FROM (
SELECT /*+ FIRST_ROWS(n) */ ROWNUM RNUM, A.* FROM LINE A
WHERE REFERENCE like '%KMF22600920%' ) WHERE RNUM > 0 AND RNUM <= 20
Execution Time: 7.41368914 seconds
Returns 2 rows
Is the above query the best-optimized query to use for pagination, or can it still be improved?
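For comparison, the ROWNUM pagination pattern most often recommended for Oracle nests the filtered and ordered query one level deeper, assigns ROWNUM one level up, and only then applies the lower bound. The following is only a sketch: it assumes the rows should be ordered by REFERENCE (use whatever column actually defines your page order), and that the n placeholder in the FIRST_ROWS hint is replaced by the real page size (for example FIRST_ROWS(20)); with a literal n the hint is not valid and is simply ignored.

SELECT *
FROM (
    SELECT /*+ FIRST_ROWS(20) */ ROWNUM RNUM, A.*
    FROM (
        -- innermost query: filter and define the page order first
        SELECT *
        FROM LINE
        WHERE REFERENCE like 'KMF22600920%'
        ORDER BY REFERENCE
    ) A
    WHERE ROWNUM <= 20      -- upper bound of the page
)
WHERE RNUM > 0              -- lower bound of the page

Without a deterministic ORDER BY in the innermost query, ROWNUM is assigned in whatever order the rows happen to be read, so the same page request can return different rows from one execution to the next.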

This would be easier to diagnose if there were explain plans posted for the 'good' and 'bad' queries. Generally speaking, using '%' on both sides of the pattern precludes the use of any normal B-tree index.
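To capture the plans the reply above asks for, something along these lines can be run in SQL*Plus or SQL Developer for both the fast and the slow statement (a sketch; the statement inside EXPLAIN PLAN FOR is query 4 from the post):

EXPLAIN PLAN FOR
SELECT * FROM (
    SELECT /*+ FIRST_ROWS(20) */ ROWNUM RNUM, A.*
    FROM LINE A
    WHERE A.REFERENCE like '%KMF22600920%' AND ROWNUM <= 20
) WHERE RNUM > 0;

-- display the plan that was just explained
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

If the double-wildcard search is a hard requirement, one option worth evaluating is an Oracle Text CONTEXT index, which can answer substring-style searches that a normal B-tree index cannot. This is only a sketch, assuming a hypothetical index name LINE_REF_CTX and that the extra maintenance (the index must be synchronized after DML) is acceptable:

-- one-time setup: create an Oracle Text index on the searched column
CREATE INDEX LINE_REF_CTX ON LINE (REFERENCE)
    INDEXTYPE IS CTXSYS.CONTEXT;

-- substring-style search served by the Text index instead of a full scan
SELECT /*+ FIRST_ROWS(20) */ ROWNUM RNUM, A.*
FROM LINE A
WHERE CONTAINS(A.REFERENCE, '%KMF22600920%') > 0
  AND ROWNUM <= 20;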

Similar Messages

  • While executing the query in the web template I am facing the below issue

    Hello SAP geniuses,
    Please help me on my issue  ,
    While executing the query in the web template I am facing the below issue:
    The variable for the characteristic (region) appears, but the characteristic (region) itself does not appear in the free characteristics area. However, when we execute the query without the web template, both the variable and the free characteristic are shown.
    Can anybody help us identify what the issue is with the web template?
    Thanks
    Alok

    Hi,
    Please check your report, execute it in the Query Designer, note its technical name, and run it in RSRT. Log out, log back in, and try again.
    If that does not help, check your authorization.
    Regards....KP

  • While executing the query the output is completely different from the query

    Dear friends,
    We built a query and loaded the data for that infoprovider.
    While executing the query, the output is completely different from the query we built.
    The output has no relevance to the query.
    Kindly help the same.
    Regards,
    Sathya

    HI Satya,
    In my opinion the issue is with your front end.
    Results should be identical in RSRT, BEx Analyzer (workbooks) and the ABAP Web runtime, since all three frontend tools use the ABAP runtime layer.
    It may differ in the Portal, where the new Java runtime layer (Java Web) is used.
    This is described in OSS Note 1296576 - Different Query Behaviour: ABAP - Java.
    So I think the problem lies in your Analyzer. If you are using Analyzer: BI Add-On 7.X (based on 7.10), try to get Support Package 8.
    Hope this helps.
    Thanks,
    Rahul

  • Serious system error while executing the query: java.lang.OutOfMemoryError

    From ALSB, we are trying to insert records into a table by calling the ALDSP web service. It works fine when the XML given as input to the ALDSP web service is small, but we get the following error when the input XML is large.
    <ALDSP> <BEA-000000> <Product> <Serious system error while executing the query:
    {ld:ABC/Test}createTest:1
    java.lang.OutOfMemoryError: Java heap space
    We do not want to increase the heap size. Is there any other way we can solve this problem?

    In a logical data service of ALDSP we have created a procedure called createTest, which is used to insert multiple rows into the table. We have created a web service for that logical data service.
    Using ALSB, we call the web service's createTest operation and pass XML as input to that createTest function.
    Input xml:
    <ns1:createTest>
      <ns1:createTemps>
        <ns0:createTemp>
          <ns0:field1>1</ns0:field1>
          <ns0:field10>test1</ns0:field10>
        </ns0:createTemp>
        <ns0:createTemp>
          <ns0:field1>2</ns0:field1>
          <ns0:field10>test2</ns0:field10>
        </ns0:createTemp>
      </ns1:createTemps>
    </ns1:createTest>
    Each ns0:createTemp represents a row that needs to be inserted into the table.
    When the number of ns0:createTemp elements is small (i.e. few rows need to be inserted), there is no problem and the rows are inserted properly. But when there are many ns0:createTemp elements, we get the following error:
    <ALDSP> <BEA-000000> <Product> <Serious system error while executing the query:
    {ld:ABC/Test}createTest:1
    java.lang.OutOfMemoryError: Java heap space

  • Issue while posting the invoice in background using the WF-BATCH user

    Hi Friends,
    I am facing an issue while posting the invoice in the background using the WF-BATCH user. I am using an invoice approval workflow: when the approver approves the invoice, the invoice document gets posted by a background method that uses BO FIPP and method POST, and I return the message text to my workflow container from this method. In the log, an exception is raised from this method with the error message "V004: You are not authorized to change this document", even though WF-BATCH has the SAP_ALL and SAP_NEW authorizations. If I post the invoice using the method from my own user ID, it gets posted. What could be the issue? Please advise.

    Hi Sapient,
    The parameters and roles would be different for the login user and WF-BATCH, so ask your administrator
    to set the roles and parameters of WF-BATCH similar to those of the login user.
    For further reference, check SU01 for the login user and then for WF-BATCH; you will
    find the difference.
    Hope this helps.
    Good luck
    Narin

  • Error while executing the Job for Objects :null  Batch Risk Analysis

    Hi All,
    We've recently upgraded Virsa to version 5.3_14. I'm encountering a problem when executing the Batch Risk Analysis job for users, roles and profiles. The job does not complete for some objects; the failures seem to be sporadic and show this error:
    Background Job History: job id=395, status=2, message=Error while executing the Job for Object(s) :ABROWN:null                                                                               
    I've attached the log for your review.
    Thanks in advance for your help.                                                                               
    Linda Lewis                                                                               
    Feb 9, 2011 1:47:53 PM com.virsa.cc.xsys.meng.ObjAuthMatcher <init>
    FINEST: ObjAuthMatcher constructed: 4ms, #singles=2141, #ranges=0, #super=0
    Feb 9, 2011 1:47:54 PM com.virsa.cc.xsys.riskanalysis.AnalysisEngine riskAnalysis
    WARNING:  Job ID:395 : Failed to run Risk Analysis
    java.lang.StringIndexOutOfBoundsException at java.lang.String.substring(String.java:1019)
    at com.virsa.cc.xsys.util.RuleLoader.getPermRule(RuleLoader.java:573)
    at com.virsa.cc.xsys.riskanalysis.AnalysisEngine.performActPermAnalysis(AnalysisEngine.java:1609)
    at com.virsa.cc.xsys.riskanalysis.AnalysisEngine.riskAnalysis(AnalysisEngine.java:321)
    at com.virsa.cc.xsys.bg.BatchRiskAnalysis.performBatchRiskAnalysis(BatchRiskAnalysis.java:1166)
    at com.virsa.cc.xsys.bg.BatchRiskAnalysis.performBatchSyncAndAnalysis(BatchRiskAnalysis.java:1464)
    at com.virsa.cc.xsys.bg.BgJob.runJob(BgJob.java:560)
    at com.virsa.cc.xsys.bg.BgJob.run(BgJob.java:363)
    at com.virsa.cc.xsys.riskanalysis.AnalysisDaemonBgJob.scheduleJob(AnalysisDaemonBgJob.java:375)
    at com.virsa.cc.xsys.riskanalysis.AnalysisDaemonBgJob.start(AnalysisDaemonBgJob.java:92)
    at com.virsa.cc.comp.BgJobInvokerView.wdDoModifyView(BgJobInvokerView.java:444)
    at com.virsa.cc.comp.wdp.InternalBgJobInvokerView.wdDoModifyView(InternalBgJobInvokerView.java:1236)
    at com.sap.tc.webdynpro.progmodel.generation.DelegatingView.doModifyView(DelegatingView.java:78)
    at com.sap.tc.webdynpro.progmodel.view.View.modifyView(View.java:337)
    at com.sap.tc.webdynpro.clientserver.cal.ClientComponent.doModifyView(ClientComponent.java:481)
    at com.sap.tc.webdynpro.clientserver.window.WindowPhaseModel.doModifyView(WindowPhaseModel.java:551)
    at com.sap.tc.webdynpro.clientserver.window.WindowPhaseModel.processRequest(WindowPhaseModel.java:148)
    at com.sap.tc.webdynpro.clientserver.window.WebDynproWindow.processRequest(WebDynproWindow.java:335)
    at com.sap.tc.webdynpro.clientserver.cal.AbstractClient.executeTasks(AbstractClient.java:143)
    at com.sap.tc.webdynpro.clientserver.session.ApplicationSession.doProcessing(ApplicationSession.java:332)
    at com.sap.tc.webdynpro.clientserver.session.ClientSession.doApplicationProcessingStandalone(ClientSession.java:741)
    at com.sap.tc.webdynpro.clientserver.session.ClientSession.doApplicationProcessing(ClientSession.java:694)
    at com.sap.tc.webdynpro.clientserver.session.ClientSession.doProcessing(ClientSession.java:253)
    at com.sap.tc.webdynpro.clientserver.session.RequestManager.doProcessing(RequestManager.java:149)
    at com.sap.tc.webdynpro.serverimpl.defaultimpl.DispatcherServlet.doContent(DispatcherServlet.java:62)
    at com.sap.tc.webdynpro.serverimpl.defaultimpl.DispatcherServlet.doGet(DispatcherServlet.java:46)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:740)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
    at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.runServlet(HttpHandlerImpl.java:401)
    at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.handleRequest(HttpHandlerImpl.java:266)
    at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:386)
    at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:364)
    at com.sap.engine.services.httpserver.server.RequestAnalizer.invokeWebContainer(RequestAnalizer.java:1039)
    at com.sap.engine.services.httpserver.server.RequestAnalizer.handle(RequestAnalizer.java:265)
    at com.sap.engine.services.httpserver.server.Client.handle(Client.java:95)
    at com.sap.engine.services.httpserver.server.Processor.request(Processor.java:175)
    at com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:33)
    at com.sap.engine.core.cluster.impl6.session.MessageRunner.run(MessageRunner.java:41)
    at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
    at java.security.AccessController.doPrivileged(AccessController.java:207)
    at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:104)
    at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:176)
    Feb 9, 2011 1:47:54 PM com.virsa.cc.xsys.util.Lock lock
    FINEST: Lock:1004
    Feb 9, 2011 1:47:54 PM com.virsa.cc.xsys.util.Lock unlock
    FINEST: Unlock:1004
    Feb 9, 2011 1:47:54 PM com.virsa.cc.xsys.bg.BatchRiskAnalysis performBatchRiskAnalysis
    WARNING: Error: while executing BatchRiskAnalysis for JobId=395 and object(s):ABROWN: Skipping error to continue with next object: null Feb 9, 2011 1:47:54 PM com.virsa.cc.xsys.bg.BgJob updateJobHistory
    FINEST: --- @@@@@@@@@@@ Updating the Job History -
    2@@Msg is Error while executing the Job for Object(s) :ABROWN:null
    Feb 9, 2011 1:47:54 PM com.virsa.cc.xsys.bg.dao.BgJobHistoryDAO insert
    INFO: -
    Background Job History: job id=395, status=2, message=Error while executing the Job for Object(s) :ABROWN:null
    Feb 9, 2011 1:47:54 PM com.virsa.cc.xsys.util.Lock lock
    FINEST: Lock:1004
    Feb 9, 2011 1:47:54 PM com.virsa.cc.xsys.util.Lock unlock
    FINEST: Unlock:1004
    Feb 9, 2011 1:47:54 PM com.virsa.cc.xsys.bg.BatchRiskAnalysis performBatchRiskAnalysis
    INFO: --- BKG User Permission Analysis (System: P20:020) completed ---  elapsed time: 4522 ms
    Feb 9, 2011 1:47:54 PM com.virsa.cc.xsys.util.Lock lock

    Hi,
    Was a solution found for this error?
    Thanks,
    Glen

  • Authorization issue while opening the query in Bex Analyzer

    Hi All,
    I get the below error message while executing a report. I executed the same query in RSRT and I get the same message there.
    I created some roles and added them to a test user for testing purposes, and when I log in as the test user I get the below message.
    I am not sure what I am missing in the roles. When I execute the query with my own ID there are no issues.
    No authority for the node from characteristic 0EMPLOYEE__0MAST_CCTR, hier
    Your user master record is not sufficiently maintained for object Authori
    Kindly let me know how to overcome this.
    Thanks in advance.
    Regards,
    Kumar

    Hi Harish,
    Just check the query this way:
    Use transaction RSECADMIN -> Analysis -> Execute as -> in "Execute as user" enter your test user ID, tick "With log", select RSRT, and then use "Start transaction".
    That may give you more details about this issue for resolution.
    Also ensure that you have done the user comparison while assigning the role to the test user ID.
    Regards,
    Abdullah

  • Dump while executing the query on MProvider on Stock and Billing

    Hi,
    I am getting this dump while executing a query on a MultiProvider. The MultiProvider is built on a billing cube (Z cube) and 0IC_C03. In the report I have key figures from both cubes. I am sure I never faced this type of problem in version 3.5. It seems we are going backwards day by day because of version 7.0.
    A:
    1) I am not getting any error if I take key figures of only the billing cube in the
    report on the MultiProvider.
    2) But I am getting this problem when taking only a key figure from 0IC_C03 on this MultiProvider.
    3) Also, this problem does not arise in any queries (of course of the same type) on those 2 cubes individually.
    4) Here the fiscal year variant is a constant filter, and the user enters the fiscal year.
    5) In the columns area I have taken 12 selections, each one restricted with one CALMONTH2.
    6) In the rows part I have 3 selections; 2 selections have a billing KF and the other selection has the blocked stock quantity KF.
    B:
    The above dump does not occur if I take 12 selections on columns with a restriction on 0CALMONTH with a customer exit variable, but with different offsets in the selections. Here the customer exit variable gets April of the user-entered financial year as its value. But the problem is that for all columns it gives the same value for the stock KF even though we have different values in the system. Surprisingly, the sales figures come out correctly.
    I am on version 7.0 and patch 13, and I am using Query Designer 3.5.
    Dump message: GETWA_NOT_ASSIGNED
    Short text:
        Field symbol has not yet been assigned.
    What happened?
        Error in the ABAP Application Program
        The current ABAP program "RSDRC_NCUM_ROUTINES" had to be terminated because it has come across a statement that unfortunately cannot be executed.
    Keywords:
    "GETWA_NOT_ASSIGNED" " "
    "RSDRC_NCUM_ROUTINES" or "RSDRC_NCUM_ROUTINES"
    "FILTER_SELDR"
    Thanks and Regards,
    Anil Kumar Sharma .P

    OK,
    I come back to my question: you said "But the problem is that for all columns it gives the same value for the stock KF even though we have different values in the system."
    Can you say which period this value corresponds to?
    It looks like you are missing some characteristic identification in your MultiProvider.
    Can you check that?
    Do you have the same time periods in both cubes? I mean, do you have at least the periods used in your RKF / query in both cubes? Would you mind pasting your identification for time InfoObjects from table RSDICMULTIIOBJ?
    please see hereunder my identification for a sales and stock multicube
    INFOCUBE                       OBJVERS IOBJNM                         PARTCUBE                       PARTIOBJ                                                                               
    ZMC021                         A       0CALDAY                        ZICRT_C07                      0CALDAY                      
    ZMC021                         A       0CALMONTH                      ZICRT_020                      0CALMONTH                    
    ZMC021                         A       0CALMONTH                      ZICRT_C07                      0CALMONTH                    
    ZMC021                         A       0CALMONTH2                     ZICRT_020                      0CALMONTH2                   
    ZMC021                         A       0CALMONTH2                     ZICRT_C07                      0CALMONTH2                   
    ZMC021                         A       0CALQUART1                     ZICRT_020                      0CALQUART1                   
    ZMC021                         A       0CALQUART1                     ZICRT_C07                      0CALQUART1                   
    ZMC021                         A       0CALQUARTER                    ZICRT_020                      0CALQUARTER                  
    ZMC021                         A       0CALQUARTER                    ZICRT_C07                      0CALQUARTER                  
    ZMC021                         A       0CALWEEK                       ZICRT_C07                      0CALWEEK                     
    ZMC021                         A       0CALYEAR                       ZICRT_020                      0CALYEAR                     
    ZMC021                         A       0CALYEAR                       ZICRT_C07                      0CALYEAR                     
    ZMC021                         A       0FISCPER                       ZICRT_020                      0FISCPER                     
    ZMC021                         A       0FISCPER                       ZICRT_C07                      0FISCPER                     
    ZMC021                         A       0FISCPER3                      ZICRT_020                      0FISCPER3                    
    ZMC021                         A       0FISCVARNT                     ZICRT_020                      0FISCVARNT                   
    ZMC021                         A       0FISCVARNT                     ZICRT_C07                      0FISCVARNT                   
    ZMC021                         A       0FISCYEAR                      ZICRT_020                      0FISCYEAR                    
    ZMC021                         A       0FISCYEAR                      ZICRT_C07                      0FISCYEAR                    
    note that here ZICRT_020 is a monthly stock cube.
    hope that helps...
    Olivier.

  • Issue while executing the report

    Hello experts,
    I have an issue related to executing a BI report. For one report I get a timeout, even though the report is executed for a single day with a single restriction. I also executed the same report in RSRT, and there it gives a short dump. I checked the data in the related component for the same selection parameter values; the data is there, but the report runs into the issue when executed in RSRT, the Analyzer or EP.
    I am not able to understand what is happening. Please advise.
    I also searched SDN but did not find a relevant thread.
    Thanks in advance
    Neha

    Hi,
    The query does not include any RKF, CKF or complex calculations. As I said before, the query is based on a MultiProvider which is built on a single cube. There are no navigational attributes used for selection, and the number of records is not more than 300 for that particular selection parameter.
    There is only one InfoObject which creates the issue, and it is a master-data InfoObject. If I restrict this InfoObject in the query and then execute it, it gives the error message; if I use a variable for the input parameter and restrict the value there, the query also ends with the error message.
    But if the query is not restricted with any value for that InfoObject, it executes within a second.
    Thanks
    neha

  • Error while executing the query

    Hi all,
    I am facing two errors while executing my query, namely:
    1)The validity interval has initial value as lower limit.
    2)Diagnosis
    Query  uses non-cumulative key figures. It requires calculation for non-cumulative values for more than 2000 periods in time. The most common reason for this is that the non-cumulative InfoCube has a very large validity interval and the query has no time restrictions. With exception aggregation AVERAGE (AV1 or AV2) in particular, this lack of time restriction would result in considerable memory and performance problems.
    System Response
    Query  cannot be executed in this way.
    Please suggest a possible reason for these errors.
    Thanks and Regards.
    Mudit Khanna

    Hi,
    Refer to SAP Notes 496638, 571364, 210432, 528202, 1318741 and 208546, and also see this similar thread:
    Problem getting data from BF cube in report (Non cummulative keyfigures)
    Thanks
    Reddy

  • While executing the query sales org not given in Selection screen but still

    Hi ,
    While executing a query, even though I am not giving any sales organization input in the selection screen, the query runs and a popup window displays 'You do not have authorization to read Z sales organization'.
    The same query I executed a few days ago worked fine, but now it is not working. Please advise.

    Hi,
    You might have made the sales org InfoObject authorization-relevant, hence it is giving that error.
    Either disable the authorization-relevance flag in the InfoObject maintenance on the BEx Explorer tab,
    or create an authorization variable for the InfoObject and use it in the query.
    Regards,
    Rangz

  • Receiving SQL Error: INTERNAL_ERROR  while executing the query

    Dear All,
    I am receiving the below error while executing a query.
    SQL Error: INTERNAL_ERROR
    Diagnosis
    The database system registered an SQL error. As available, the error number and a description are included in the short text. Possible causes for SQL errors include:
    Overflow of database objects such as buffers, temporary tablespaces, rollback segments or data containers/tablespaces.
    ->These problems can generally be eliminated by system administrators.
    Missing generated database objects such as tables or views based on inconsistent or inactive InfoCubes or InfoObjects. Examples of this include the view of the fact table for an InfoCube or the attribute SID table (X/Y table) of a characteristic.
    -> These problems can generally be eliminated by a BW administrator.
    SQL requests with incorrect content.
    -> Problems of this type are generally programming errors.
    System Response
    Procedure
    Contact your system administrator.
    Procedure for System Administration
    If there is no error description, look for one in the reference from the database producer.
    Decide on the correct category for the SQL error and check if it can be eliminated. Generally the error can also be found in the syslog (transaction sm21). From there, transactions sm51 and sm50, the developer's trace log of the work process can be determined and the erroneous statement can be viewed in the log. This procedure is described in SAP Note 568768.
    Notification Number DBMAN 257 
    I verified the tablespaces and also ran the RSRV repair of InfoObjects and InfoCubes and didn't find any error. But I am receiving this error for one particular month; the remaining months execute fine. Any ideas why I am receiving this error?
    Regards
    Ravi Y

    OSS Note 1305568:
    Symptom
    A data mart query that
    you want to execute within a data transfer process (DTP), for example, terminates with the
    SQL error -1013 "Too many order columns".
    The following is displayed in the monitor log of the data transfer process:
    Error while extracting from source xxxxxx (type InfoProvider)
    Message number RSBK 242
    Exception CX_SQL_EXCEPTION occurred (program:
    CL_SQL_STATEMENT==============CP, include:
    CL_SQL_STATEMENT==============CM004, line: 32).
    Message number RS_EXCEPTION 000
    SQL error: POS(3306) Too many order columns
    Message number DBMAN 257
    Error reading the data of InfoProvider xxxxxx
    Message number DBMAN 305
    You have already implemented Note 1065380.
    Other terms
    CX_SQL_EXCEPTION, message number, RS_EXCEPTION 000, DBMAN 257,  RSBK 242,
    RS_EXCEPTION 000
    Reason and Prerequisites
    The MaxDB internal limit of 4016 bytes or 128 fields for GROUP BY columns was exceeded.
    Solution
    When you implement these corrections, no aggregation is performed for the data (GROUP BY) if the limits of the MaxDB database have been exceeded.
    The limit values for the aggregation behavior can also be manually reduced if there are problems with the default values.
    Two RSADMIN parameters are provided for this.
    MAXDB_MAX_GROUP_BY_FLDS is the maximum number of GROUP BY fields. The default value is 128.
    MAXDB_MAX_GROUP_BY_LEN is the maximum total length of the GROUP BY fields. The default value is 4000.
    SAP NetWeaver BI 7.00
               Import Support Package 21 for SAP NetWeaver BI 7.00 (SAPKW70021) into your BI system. The Support Package is available when Note 1270629 "SAPBINews NW 7.00 BI Support Package 21", which describes this Support Package in more detail, is released for customers.

  • Issue while executing the discovery command from target nodes

    Hi Experts.
    I had to create a two-node cluster using Openfiler. After successfully creating the LUN and the associated partitions from all the nodes, I changed the IP address of the Openfiler.
    After changing the IP on the Openfiler:
    A) Openfiler version:
    [root@san ~]# uname -r
    2.6.26.8-1.0.11.smp.pae.gcc3.4.x86.i686
    B) Linux Oel5:-
    [root@rac1 ~]# uname -r
    2.6.18-194.el5xen
    [root@rac1 ~]#
    1) I am able to ping and ssh, etc., from any node to the Openfiler.
    However, while executing the command below I am facing the following issue:
    service iscsi restart
    Stopping iSCSI daemon:
    iscsid dead but pid file exists [  OK  ]
    Starting iSCSI daemon: [  OK  ]
    [  OK  ]
    Setting up iSCSI targets: iscsiadm: No records found!
    [  OK  ]
    [root@rac1 ~]#
    Moreover, I tried to discover the targets, but unfortunately no output is displayed after executing the command below:
    [root@rac1 ~]# iscsiadm -m discovery -t sendtargets -p 192.168.37.200
    [root@rac1 ~]#
    A quick response would be appreciated, as my whole test case is down at the moment due to storage issues.
    Thanks,
    Arch.

    Are you running a firewall that needs to be adjusted to support your changed Openfiler IP network?

  • Database Alert Macros issue while executing the macros in Background

    Hi All,
    I am facing some problems while executing the Database alert macros in Background/Process Chain.
    There are two macros for which the problem exists.
    1.Excess Projected Inventory above Max
    The logic here is, the alert should work for Only Fixed Lot size Procedure.
    If the Stock on hand (projected EA) > (Safety Stock (EA) + Full SOQ (EA)) then alert = "Projected inventory is XX% above MAX".
    XX is the Percent above Max. 
    Note: SOQ => fixed lot size.
    2.Excess Actual Days of Supply
    The Logic here is, the alert should work for all Lot size Procedures except for "Fixed lot size".
    Actual Days Supply >=180 days. (current -> future buckets)
    -  For every receipt cell check the Actual Days Supply - if >= 180 days.
    The macros are working perfectly as expected when run interactively, but not when run in the background.

    Hi Abhi,
    Hope you are doing good.
    Yes exactly, the macros are working in foreground/Interactively but not in the Background via Process chain. Let me send the details again.
    Issue :
    I am facing some problems while executing the Database alert macros in Background/Process Chain.
    There are two macros for which the problem exists.
    1.Excess Projected Inventory above Max
    The logic here is, the alert should work for Only Fixed Lot size Procedure. But in Background the alerts are getting created for Lot for Lot and other Planning procedures..
    If the Stock on hand (projected EA) > (Safety Stock (EA) + Full SOQ (EA)) then alert = "Projected inventory is XX% above MAX".
    XX is the Percent above Max. 
    Note: SOQ => fixed lot size.
    2.Excess Actual Days of Supply
    The Logic here is, the alert should work for all Lot size Procedures except for "Fixed lot size". But in Background the alerts are getting created for the Fixed Lot size procedures too..
    Actual Days Supply >=180 days. (current -> future buckets)
    -  For every receipt cell check the Actual Days Supply - if >= 180 days.
    The macros work perfectly as expected in the foreground/interactively, but the same does not happen when executing the macros in the background/process chain.
    I have tried running these macros in different sequences (Default/Start/Macro) but could not resolve the issue.
    Thanks in Advance,
    Jay.

  • Problem while executing the query

    Hi,
    I have a query; when I am trying to execute it, it gives the following error:
    Error Summary
    Exception occured while processing the current request; this exception cannot be handled by the application or framework
    If the information on this page does not help you locate and correct the cause of the problem, contact your system administrator
    To facilitate analysis of the problem, keep a copy of this error page Hint: Most Web browsers allow you to select all content, and copy and paste it into an empty document (such as in an email or simple text file)
    Root Cause
    The initial exception that caused the request to fail was:
         Select a valid Date (OR) Fiscal Period (OR) Calendar Month         
    com.sap.mw.jco.JCO$Exception: (104) RFC_ERROR_SYSTEM_FAILURE:      Select a valid Date (OR) Fiscal Period (OR) Calendar Month         
    at com.sap.mw.jco.MiddlewareJRfc.generateJCoException(MiddlewareJRfc.java:455)
    at com.sap.mw.jco.MiddlewareJRfc$Client.execute(MiddlewareJRfc.java:1452)
    at com.sap.mw.jco.JCO$Client.execute(JCO.java:3979)
    at com.sap.mw.jco.JCO$Client.execute(JCO.java:3416)
    at com.sap.ip.bi.base.application.service.rfcproxy.impl.jco640.Jco640Proxy.executeFunction(Jco640Proxy.java:267)
    I am not sure why I am not able to execute this query. I am able to execute and run other queries.
    Please help me..
    thanks in advance

    Hi,
    Please check whether you have used any formulas or function modules to derive the fiscal period/month from the calendar day. Maybe the date being assigned is not in the expected format, which is why the error (select a valid date or calendar month or fiscal period) is being generated.
    Regards,
    Manoj
