Error: Governor limit exceeded in cube generation (Maximum data records exceeded.)

Hi
I have created a report that throws this error.
Governor limit exceeded in cube generation (Maximum data records exceeded.)
Error Details
Error Codes: QBVC92JY
After going through various blogs, I found that I need to change the maximum value for the pivot table view in the instanceconfig.xml file. I have set it to 500000.
But it still throws the same error.
Ashok

Hi Ashok,
There are a number of settings that work in parallel. Have a look here:
http://obiee101.blogspot.com/2008/02/obiee-controling-pivot-view-behavior.html
regards
John
http://obiee101.blogspot.com
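
For reference, here is a minimal sketch of the two levels at which these limits live in instanceconfig.xml (the values are illustrative placeholders, not recommendations; the file sits under \OracleBIData\web\config, and the Presentation Server must be restarted after editing):
<WebConfig>
  <ServerInstance>
    <!-- Cube limits: direct children of ServerInstance -->
    <CubeMaxRecords>50000</CubeMaxRecords>
    <CubeMaxPopulatedCells>200000</CubeMaxPopulatedCells>
    <PivotView>
      <!-- Pivot display limits: children of PivotView -->
      <MaxVisibleRows>50000</MaxVisibleRows>
      <MaxVisibleColumns>1024</MaxVisibleColumns>
    </PivotView>
  </ServerInstance>
</WebConfig>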

Similar Messages

  • Governor limit exceeded in cube generation (Maximum data records exceeded.)

    There are similar posts which didn't help in my situation.
    I had the error: Governor limit exceeded in cube generation (Maximum data records exceeded.). The query returns about 64000 rows.
    I've changed the instanceconfig and set exaggerated values, and then also the registry, but I still get the error: Governor limit exceeded in cube generation (Maximum data records exceeded.)
    instanceconfig:
    <?xml version="1.0"?>
    <WebConfig>
      <ServerInstance>
        <CredentialStore>
          <CredentialStorage type="file" path="C:\OracleBIData\web\config\credentialstore.xml"/>
        </CredentialStore>
        <CubeMaxRecords>5000000</CubeMaxRecords>
        <CubeMaxPopulatedCells>10000000</CubeMaxPopulatedCells>
        <ResultRowLimit>5000000</ResultRowLimit>
        <PivotView>
          <MaxVisibleRows>5000000</MaxVisibleRows>
          <MaxVisibleColumns>1024</MaxVisibleColumns>
          <MaxVisiblePages>1024</MaxVisiblePages>
          <MaxVisibleSections>1024</MaxVisibleSections>
        </PivotView>
      </ServerInstance>
    </WebConfig>
    Also added
    <CubeMaxRecords>5000000</CubeMaxRecords>
    <CubeMaxPopulatedCells>10000000</CubeMaxPopulatedCells>
    to the registry.
    But still got the error :(
    Thanks for your help

    I suggest disabling the cache. Setting the max rows to a very high number and disabling the cache is the way to go when you are querying an Oracle database :)
    (For those who haven't already done this.)
    In NQSConfig.INI:
    # Query Result Cache Section
    [ CACHE ]
    ENABLE = NO;
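
    For context, ENABLE lives inside the [ CACHE ] section of NQSConfig.INI next to the other cache parameters, and the BI Server must be restarted for changes to take effect. A minimal sketch of that section with illustrative values (the path and sizes below are assumptions, not recommendations):
    # Query Result Cache Section
    [ CACHE ]
    ENABLE = NO;
    # The entries below only matter when ENABLE = YES;
    DATA_STORAGE_PATHS = "C:\OracleBIData\cache" 500 MB;
    MAX_ROWS_PER_CACHE_ENTRY = 100000;
    MAX_CACHE_ENTRY_SIZE = 1 MB;
    MAX_CACHE_ENTRIES = 1000;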

  • Pivot view error "Governor Limit exceeded in cube generation"

    Hi Gurus,
    My Instanceconfig.xml contains this section.
    <PivotView>
    <MaxCells>600000</MaxCells>
    <CubeMaxRecords>600000</CubeMaxRecords>
    <CubeMaxPopulatedCells>600000</CubeMaxPopulatedCells>
    </PivotView>
    I restarted all the services but I am still getting
    "Governor Limit exceeded in cube generation & the maximum data records exceeded."
    If the data is below 40K it works fine.
    I saw the previous posts on this error and already tried all the suggestions.
    Did I miss anything?
    Please help me.
    Thanks

    I ran into this issue and didn't see the resolution here or on any other public forums, so I'll add a detail that I found in Metalink, in Doc ID 494163.1.
    1. The xml nodes you care about are CubeMaxRecords and CubeMaxPopulatedCells, at least as of OBI 10.1.3.4.
    2. These are both child nodes of ServerInstance. Their defaults are 40,000 and 150,000, respectively. That is, your baseline instanceconfig.xml file should look like:
    <WebConfig>
    <ServerInstance>
    ...
    <CubeMaxRecords>40000</CubeMaxRecords>
    <CubeMaxPopulatedCells>150000</CubeMaxPopulatedCells>
    </ServerInstance>
    </WebConfig>
    3. You only want to tune these a little at a time, in 1000/10000 increments at most. They act as governors on how the presentation server uses memory, and just slapping an extra zero on the end of each number will probably cause your presentation server to crash. This is much worse than the QBVC92JY error, which at least comes with a nice little graphic.
    4. If you can't find a tuning on these limits that eliminates the error without causing your server to crash, the report author should be re-thinking what he/she is trying to do. It's likely that the report is returning far more data than you really want, and that it needs to be filtered in some way.
    Good luck,
    -Eric
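
    Following Eric's point 3, a first tuning step over the 40,000/150,000 defaults might look like the sketch below (values are illustrative; the right numbers depend on your presentation server's memory):
    <CubeMaxRecords>41000</CubeMaxRecords>
    <CubeMaxPopulatedCells>160000</CubeMaxPopulatedCells>
    If the error persists, raise the values again in 1000/10000 steps and retest, rather than jumping straight to a very large number.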

  • Governor limit exceeded in cube generation error in OBIEE 10g

    Hi Gurus,
    one of my OBIEE reports is throwing the error "Governor limit exceeded in cube generation (Maximum data records exceeded.)"
    I am using OBIEE 10g. Could you please advise? The report is also running for a long time: to get one day's data it runs for 30 minutes to pick up 9000 rows.
    Regards,
    SK

    Hello,
    Try altering the values in instanceconfig.xml (under \OracleBIData\web\config):
    <CubeMaxRecords>200000</CubeMaxRecords>
    <CubeMaxPopulatedCells>200000</CubeMaxPopulatedCells>
    If that does not solve the issue, also refer to: OBIEE 10g: Error: "Governor Limit Exceeded In Cube Generate. Error Codes: QBVC92JY" or "Maximum number of allowed pages in Pivot Table exceeded...Error Detail" When Displaying a Pivot View or Chart [ID 1092854.1].
    Hope this helps. Please mark if it does.
    Thanks,
    SVS

  • Error: QBVC92JY. Maximum data records exceeded

    Hi all,
    I've got a big problem with this error: Governor limit exceeded in cube generation (Maximum data records exceeded.).
    I've changed the configuration in instanceconfig.xml as follows:
    <PivotView>
    <MaxCells>1000000</MaxCells>
    <CubeMaxRecords>1000000</CubeMaxRecords>
    <CubeMaxPopulatedCells>1000000</CubeMaxPopulatedCells>
    </PivotView>
    but I still have the same problem.
    Can someone help me?
    Steve

    stesappo:
    Put <CubeMaxRecords> and <CubeMaxPopulatedCells> outside <PivotView>.
    Try this in instanceconfig.xml:
    <PivotView>
      <MaxCells>4000000</MaxCells>
      <MaxVisibleColumns>5000</MaxVisibleColumns>
      <MaxVisiblePages>2500</MaxVisiblePages>
      <MaxVisibleRows>50000</MaxVisibleRows>
      <MaxVisibleSections>3000</MaxVisibleSections>
      <ResultRowLimit>20000</ResultRowLimit>
    </PivotView>
    <CubeMaxRecords>1000000</CubeMaxRecords>
    <CubeMaxPopulatedCells>1000000</CubeMaxPopulatedCells>
    Gabriel.

  • Query0;Runtime error time limit exceeded,with parallel processing via RFC

    Dear experts,
    I have created a report on the 0cca_c11 cube, and when I run it with a cost center group that contains many cost centers, the report executes for a long time and finally gives the messages
    "Error while reading data;navigation is possible" and
    "Query0;Runtime error time limit exceeded,with parallel processing via RFC"
    Please tell me what the problem is and how I can solve it.
    Regards
    Shweta

    Hi,
    Execute the query in RSRT with the Execute and Debug option.
    Select the SQL statements option to see where exactly it's taking time.
    Let us know the details once you are done.
    Reg
    Pra

  • Runtime error (time limit exceeded) after executing select query

    Dear experts, whenever I execute the select query in this Z program I get a runtime error saying the time limit was exceeded. I am using an inner join and INTO TABLE, and I still get the error. How can I resolve it?
    SELECT LIKP~VBELN LIKP~WADAT_IST LIKP~VEHICLE_NO LIKP~TRNAME
              LIKP~VEHI_TYPE LIKP~LR_NO LIKP~ANZPK LIKP~W_BILL_NO
              LIKP~SEALNO1                                       " Seal NO1
              LIKP~SEALNO2                                       " Seal NO2
              LIPS~LFIMG
              VBRP~VBELN VBRP~VGBEL VBRP~MATNR VBRP~AUBEL VBRP~FKIMG
              VBAK~AUART
              VBRK~FKART VBRK~KNUMV VBRK~FKSTO
              FROM LIKP INNER JOIN LIPS ON LIKP~VBELN EQ LIPS~VBELN
                        INNER JOIN VBRP ON LIKP~VBELN EQ VBRP~VGBEL
                        INNER JOIN VBAK ON VBRP~AUBEL EQ VBAK~VBELN
                        INNER JOIN VBRK ON VBRP~VBELN EQ VBRK~VBELN
              INTO TABLE  I_FINAL_TEMP
              WHERE LIKP~VSTEL = '5100' AND
                 LIKP~WADAT_IST IN S_WADAT  AND
                    VBRP~AUBEL IN S_AUBEL AND
                    VBAK~AUART IN ('ZJOB','ZOR') AND
                    VBRK~FKART IN S_FKART AND
    *               VBRK~FKART IN ('ZF8','ZF2','ZS1') AND
                    VBRK~FKSTO NE 'X'.
    When I debug the select query, the cursor does not go to the next step; after 15-20 minutes I get the runtime error (time limit exceeded).
    How can I resolve this scenario?

    It looks like you are trying to fetch the whole SD flow in a single query.
    First, check that the database statistics for these tables are up to date in the system (check with the Basis team), especially if this query was working fine earlier.
    Most of the tables involved are high-volume tables, and here they are queried without any primary key.
    Is there a secondary index created for LIKP on VSTEL and WADAT_IST?
    My suggestion would be to split the selection queries and make use of a primary or existing secondary index to fetch the desired result, if possible; a sketch of such a split follows below. For testing purposes, split the queries and find which one takes more time and which needs an index by taking an SQL trace in ST05.
    Also take an ST05 trace of this query in the debugger (New debugger -> Special Tools -> Trace -> ST05/SE30).
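
    To illustrate the split, here is a minimal sketch (table and field names are taken from the query above, and S_WADAT/S_AUBEL are the select-options from your program; the internal table definitions and the reduced field lists are assumptions). Step 1 reads the deliveries using the selective VSTEL/WADAT_IST conditions, which a secondary index on LIKP could support; step 2 then fetches only the matching billing items:
    TYPES: BEGIN OF ty_del,
             vbeln     TYPE likp-vbeln,
             wadat_ist TYPE likp-wadat_ist,
             lfimg     TYPE lips-lfimg,
           END OF ty_del,
           BEGIN OF ty_bill,
             vbeln TYPE vbrp-vbeln,
             vgbel TYPE vbrp-vgbel,
             matnr TYPE vbrp-matnr,
             aubel TYPE vbrp-aubel,
             fkimg TYPE vbrp-fkimg,
           END OF ty_bill.
    DATA: lt_del  TYPE STANDARD TABLE OF ty_del,
          lt_bill TYPE STANDARD TABLE OF ty_bill.
    " Step 1: restrict deliveries first (index-friendly WHERE on VSTEL/WADAT_IST)
    SELECT likp~vbeln likp~wadat_ist lips~lfimg
      FROM likp INNER JOIN lips ON likp~vbeln = lips~vbeln
      INTO TABLE lt_del
      WHERE likp~vstel = '5100'
        AND likp~wadat_ist IN s_wadat.
    " Step 2: fetch billing items only for the deliveries found above.
    " FOR ALL ENTRIES must be guarded against an empty driver table.
    IF lt_del IS NOT INITIAL.
      SELECT vbeln vgbel matnr aubel fkimg
        FROM vbrp
        INTO TABLE lt_bill
        FOR ALL ENTRIES IN lt_del
        WHERE vgbel = lt_del-vbeln
          AND aubel IN s_aubel.
    ENDIF.
    Trace both steps separately in ST05 to see which one needs the index.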

  • APO jobs cancelled due to error - time limit exceeded

    Dear All,
    Three jobs are scheduled daily to transfer data from R/3 to APO.
    These are cancelled due to the error "Time limit exceeded".

    Hi Pankaj,
    There is a specific time allocated for each queue to clear and to get transferred across the systems. The probable reasons are:
    1. The queue itself is taking more than the allocated time to clear due to system load. Each queue requires specific free memory from the server; in case of overload the system is unable to allocate the desired memory, hence the error might appear.
    2. If the very first entry in a queue is stuck for whatever reason, the queues following it automatically go into "timeout" and then "system fail".
    Proposed solution:
    1. Analyze the first entry functionally. If it is OK, then ask the Basis team to clear that particular entry.
    2. After the timeout the queue will go to "SYSFAIL". Double-clicking on this will reveal the reason. Based on the technical reason I can suggest the relevant OSS notes to be applied.
    For detailed technical analysis check t-code CFG1.
    Regards,
    Ankur

  • Error "Contract limit exceeded"

    Hello All,
    We got the error "Contract limit exceeded" when trying to create a PO.
    I checked the limit in the contract header, but that did not help. Please provide a table name where I can check the release values of each line item of the contract across the various purchase orders,
    or a path where I can check the used contract limit.
    Thanks in advance to all
    Kalyani
    Hyderabad

    It seems the contract limit is exceeded. Go to ME33K, select the item, then click on Release Documentation (Ctrl+Shift+F12); it will show the release POs created against the contract.
    Also check the contract limit in the header details.
    Regards,
    Ashok

  • PI 7.0: IDOCs stuck in IDX5 with error "Time Limit Exceeded".

    Hi All,
    We have a File to IDOC scenario in PI 7.0. After mapping, the IDOCs are posted from PI to the ECC system.
    On a normal day this interface works fine. Yesterday we received a huge file which resulted in the creation of about 25000 IDOCs from one single file. The mapping went fine; however, the IDOCs created were not posted into the ECC system. When we monitored the IDOCs using transaction code IDX5 in the PI system, we found the error message "Time limit exceeded"; the user shown was "PIAFUSER". To overcome this error, we increased the time limit of PIAFUSER from the default to about 1500 seconds.
    Now I want to push these IDOCs from PI into the ECC system. Could you please let us know how to push these IDOCs sitting in the PI system to ECC?
    We do not want to reprocess the file from the beginning. Please let us know whether it is possible to push the IDOCs without reprocessing the file, and if yes, how.
    Thanks in advance.
    Regards,
    Manohar Dubbaka.

    Hi,
    The help documentation is as follows:
    Check the tRFC Status
    Use
    tRFC calls which transfer IDocs use the function module IDOC_INBOUND_ASYNCHRONOUS at reception (before release 4.0: INBOUND_IDOC_PROCESS).
    If an IDoc in the sending system has been passed to tRFC (IDoc status "03"), but has not yet been input in the receiving system, this means that the tRFC call has not yet been executed.
    Activities
    To check the status of the tRFC calls, choose Tools -> IDoc Interface/ALE -> Administration -> Monitoring -> Troubleshooting -> RFC Queue (SM58) and specify any additional selection criteria.
    The program RSARFCEX restarts unsuccessful tRFC calls.
    You cannot choose the option "is being executed" in background processing.
    Best Regards,
    Erik Hubers

  • TRFC error "time limit exceeded"

    Hi Prashant,
    No reply to my below thread...
    Hi Prashant,
    We are facing this issue quite often, as I stated in my previous threads.
    I have already followed all the steps you mentioned, and furnished the job log and tRFC details for reference long back.
    I posted this issue one month back with full details, including what we temporarily do to execute this element successfully.
    A number of times I have stated that I need to know the root cause and a permanent solution, as the log clearly states that it is due to stuck LUWs (source system).
    Even after executing the LUWs manually the status is the same (request still running and the status in yellow).
    I have no idea why this is happening to this element in particular, as we have sufficient background jobs.
    Do we need to change some settings, like increasing or decreasing the data package size, or something else, to resolve the issue permanently?
    For you I am giving the details once again:
    Data flow: Standard DS --> PSA --> Data Target (DSO)
    In the process monitor screen the request is in yellow. No clear error message is given there; under update it shows 0 records updated and a missing message in yellow, and apart from this the status against each log entry is green.
    Job log: job finished with a TRFCSSTATE=SYSFAIL message
    tRFCs: time limit exceeded
    What I do to resolve the issue: make the request green and manually update from PSA to the data target, and the job then completes successfully.
    Can you please tell me how to proceed in this scenario to resolve the issue, as I have been waiting for this for a long time now?
    Till now I haven't got any clue; whatever I have investigated, I get replies up to that point and nothing further.
    with regards,
    musai

    Hi,
    You have mentioned that you already checked the LUWs, so the problem is not there now.
    In the source system, go to WE02 and check for IDocs of type RSRQST and RSINFO. If any of them are in yellow status, take them to BD87 and process them. If the IDoc processed is of type RSRQST, it will then create the job in the source system for carrying out the data load. If it is of type RSINFO, it will finish the data load on the SAP BI side as well.
    If any are in red, check the reason.

  • Error "time limit Exceed"?"

    Hi Experts,
    What should we do when a load fails with "time limit exceeded"?

    Hi,
    Time outs can be due to many reasons. You will need to find out the cause for your specific build. Some of the common ones are:
    1. You could have set a large packet size. See in the monitor whether the number of records in one packet seems inordinately large, say 100,000. Reduce the number in steps and see which one works for you.
    2. The target may have a large number of fields; even then you can receive a time out, as the size of the packet may become large. Same solution as point 1.
    3. You may have built an index on the target ODS, which may impact your write speed. Remove any indexes and run with the same package size; if it works, then you know the index is the problem.
    4. There is a Basis setting for the time out. Check that it is set as per the SAP recommendation for your system.
    5. Check the transactional RFCs in the source system. It may have choked due to a large number of errors or hung queues.
    Cheers...

  • Maximum Data records in a Segment?

    Hi Folks,
    I wanted to know if there is a limit on the number of data records in an IDoc segment. The requirement is to generate one segment for each material number, but the problem is that whenever a material number contains more than 1000 records, it is split into two segments.
    The IDoc is an extension of PROACT01 IDoc.
    Thanks in Advance.
    -Abhishek

    Hi Abhishek,
    The maximum number of records that can be created in one segment is 9999999999.
    It all depends on the attributes of the segment.
    Regards,
    Vineesh B    

  • Error: Time limit exceeded...

    Hello Experts,
    I am currently modifying a report that exceeds the time allotted for programs to run
    on our production server, which is 10 minutes. I debugged the report and found out that it fetches
    around 550k to 700k records from tables VBAK and VBAP. After the select, the internal table
    is looped over, and this is where the report stops, since it cannot process such a huge volume
    of records (550k++). I need help on how to fix this, since I can't think of any way to limit the records
    being fetched. Below is the select statement:
    SELECT a~vbeln b~posnr a~auart
           a~vkorg a~vtweg a~kunnr
           b~matnr b~pstyv b~spart
      FROM vbak AS a
      INNER JOIN vbap AS b
        ON a~vbeln = b~vbeln
      APPENDING TABLE lt_vbap
      WHERE a~auart IN lr_auart
        AND a~vbeln IN s_vbeln
        AND a~vbeln IN s_vbill
        AND a~vbtyp EQ p_vbtyp.
    LOOP AT lt_vbap ASSIGNING <wa_vbap>.
    ENDLOOP.
    Hope you can help me out here guys. Thank you and take care!

    Vitraylab,
    SELECT a~vbeln b~posnr a~auart
           a~vkorg a~vtweg a~kunnr
           b~matnr b~pstyv b~spart
      FROM vbak AS a
      INNER JOIN vbap AS b
        ON a~vbeln = b~vbeln
      INTO TABLE lt_vbap
      PACKAGE SIZE 30000
      WHERE a~auart IN lr_auart
        AND a~vbeln IN s_vbeln
        AND a~vbeln IN s_vbill
        AND a~vbtyp EQ p_vbtyp.
    * Put the functionality you had between LOOP and ENDLOOP here, so that
    * each package of 30000 rows is processed before the next one is fetched.
    ENDSELECT.
    Adjust the package size so the program stays within your 10-minute run time limit.
    Don't forget to reward if useful.

  • Error in File: Database Connector Error "Time limit exceeded"

    Hello Experts,
    I am getting the following error while fetching data from SAP BW using an MDX connection.
    I am using Crystal Reports 2011 and saving the report in SAP BO 4.1 SP2.
    I also have an old BO system; the report runs successfully there using the same (MDX) connection in Crystal Reports 2008.
    Let me know if you have faced the same issue and can provide a solution.
    Thanks In Advance.
    Regards,
    Dhaval Dave.

    Hi Dave,
    Check the note below for BOE 3.1:
    https://websmp130.sap-ag.de/sap%28bD1lbiZjPTAwMQ==%29/bc/bsp/sno/ui_entry/entry.htm?param=69765F6D6F64653D3030312669765F…
    or
    https://websmp130.sap-ag.de/sap/support/notes/convert2pdf/0001434564?sap-language=EN
    Regards,
    dj
