Query is allocating too large memory error ( > 4GB) in Essbase 11.1.2

Hi All,
Currently we are preparing dashboards in OBIEE from Hyperion Essbase ASO (11.1.2) cubes. When we try to retrieve data with more attributes, we get the error below:
"Odbc driver returned an error (SQLExecDirectW).
Error Details
Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 96002] Essbase Error: Internal error: Query is allocating too large memory ( > 4GB) and cannot be executed. Query allocation exceeds allocation limits. (HY000)"
Currently our data file size is less than 2GB, so we are using "Pending Cache Size=64MB".
Please let me know which memory setting I have to increase to resolve this issue.
Thanks,
SatyaB

Hi,
Do you have any dynamic hierarchies? What is the size of the data set?
Thanks,
Nathan

Similar Messages

  • Query is allocating too large memory error in OBIEE 11g

    Hi ,
    We have one pivot table (A) in our dashboard displaying revenue against an Entity hierarchy (i.e. we have 8 levels under the hierarchy), and another pivot table (B) displaying revenue against a customer hierarchy (3 levels under it).
    Both tables run fine in our OBIEE 11.1.1.6 environment (Windows).
    After deploying the same code (RPD & catalog) on a Unix OBIEE 11.1.1.6 server, it throws the error below while populating pivot table A:
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
    *State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 96002] Essbase Error: Internal error: Query is allocating too large memory ( > 4GB) and cannot be executed. Query allocation exceeds allocation limits. (HY000)*
    But pivot table B is running fine. Help, please!
    Data source used: Essbase 11.1.2.1
    Thanks
    sayak

    Hi Dpka,
    Yes! We are hitting a separate Essbase server from the Linux OBIEE environment.
    I'll execute the query in Essbase and get back to you!
    Thanks
    sayak

  • Query is allocating too large memory

    I’m building an Analysis in OBIEE against an ASO cube and am seeing the following error:
    Query is allocating too large memory ( > 4GB) and cannot be executed. Query allocation exceeds allocation limits
    The report we're trying to build is intended to show information from eight dimensions. However, when I try to add just a few of the dimensions, we get the "Query is allocating too large memory" error. Even if I filter the data down so that there are only 1 or 2 rows in the Analysis, I still get the error. It seems like something is causing our queries to become bloated. We're using OBIEE 11.1.1.6.0.
    Any help would be appreciated.

    950121 wrote:
    I’m building an Analysis in OBIEE against an ASO cube and am seeing the following error:
    Query is allocating too large memory ( > 4GB) and cannot be executed. Query allocation exceeds allocation limits
    The report we're trying to build is intended to show information from eight dimensions. However, when I try to add just a few of the dimensions, we get the "Query is allocating too large memory" error. Even if I filter the data down so that there are only 1 or 2 rows in the Analysis, I still get the error. It seems like something is causing our queries to become bloated. We're using OBIEE 11.1.1.6.0.
    Any help would be appreciated.

    Hi,
    This sounds like a known Bug 13331507 : RFA - DEBUGGING 'QUERY IS ALLOCATING TOO LARGE MEMORY ( > 4GB)' FROM ESSBASE.
    Cause:
    A filter has been added on several lines in the 'Data Filters' tab of the 'Users Permissions' screen in the Administration Tool (click on the Manage and then Identity menu items). This caused the MDX filter statement to be added several times to the MDX issued to the underlying database, which in turn caused too much memory to be used in processing the request.
    Refer to Doc ID: 1389873.1 for more information on My Oracle Support.

  • TDS buffer length too large & Protocol error in TDS stream

    Hi,
    While copying an HFM application from Production to Development (using the copy utility), I have repeatedly noticed the following errors in the log file:
    11-24-2009 01:06:37 : 157 : Error : Number=(-2147467259)(80004005) Source=(Microsoft OLE DB Provider for SQL Server) Description=(TDS buffer length too large) SQLState=(HY000)NativeError=(0)
    11-24-2009 01:06:37 : 157 : Error : Number=(-2147467259)(80004005) Source=(Microsoft OLE DB Provider for SQL Server) Description=(Protocol error in TDS stream) SQLState=(HY000)NativeError=(0)
    I would like to know why this error appeared; any help would be greatly appreciated.
    Thanks.
    Regards,
    Ravi Shankar S

    I have seen this when the server doesn't respond in time, pulling data from SQL Server via ODBC using Excel. If so, the fix is fairly simple:
    sp_configure 'remote query timeout (s)', 3600   -- timeout in seconds; 0 means no timeout
    GO
    RECONFIGURE
    GO
    JCEH

  • (413) Request Entity too large intermittent error

    I have a page in a SharePoint 2013 website which is viewed over HTTPS. The page has several input fields, including two file upload controls. I am trying to upload a sample picture of less than 1MB for each of the controls.
    I am calling a BizTalk WCF service on Submit. Sometimes when I try to submit I get '413 Request Entity Too Large'. The error is intermittent: if I submit the same data a number of times, it fails sometimes and works other times.
    The binding settings for the service are set in code (not in Web.Config) as shown below ...
    var binding = RetrieveBindingSetting();
    var endpoint = RetrieveEndPointAddress("enpointAddress");
    var proxy = new (binding, endpoint);   // the proxy/client type was not named in the post
    proxy.Create(request);

    public BasicHttpBinding RetrieveBindingSetting()
    {
        var binding = new BasicHttpBinding
        {
            MaxBufferPoolSize = 2147483647,
            MaxBufferSize = 2147483647,
            MaxReceivedMessageSize = 2147483647,
            MessageEncoding = WSMessageEncoding.Text,
            ReaderQuotas = new System.Xml.XmlDictionaryReaderQuotas
            {
                MaxDepth = 2000000,
                MaxStringContentLength = 2147483647,
                MaxArrayLength = 2147483647,
                MaxBytesPerRead = 2147483647,
                MaxNameTableCharCount = 2147483647
            },
            Security =
            {
                Mode = BasicHttpSecurityMode.Transport,
                Transport = { ClientCredentialType = HttpClientCredentialType.Certificate }
            }
        };
        return binding;
    }
    I have also set uploadReadAheadSize in the applicationHost.config file on IIS by running the command below, as suggested here ...
    appcmd.exe set config "sharepoint" -section:system.webserver/serverruntime /uploadreadaheadsize:204800 /commit:apphost
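    As a sanity check (a hedged suggestion, not something from the original post), reading the section back should confirm the change actually committed to applicationHost.config for the "sharepoint" site:
    rem should echo the serverRuntime section with uploadReadAheadSize="204800"
    appcmd.exe list config "sharepoint" -section:system.webserver/serverruntime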
    Nothing I do seems to fix this issue so I was wondering if anyone had any ideas?
    Thanks

    Sounds like it's not a SharePoint problem. Does the page work correctly if you don't call the BizTalk WCF service? And what happens if a console app calls the WCF service independently of SharePoint; does it fail then as well? In both cases it would narrow the troubleshooting scope to the WCF service, which gets you a step further.
    Kind regards,
    Margriet Bruggeman
    Lois & Clark IT Services
    web site: http://www.loisandclark.eu
    blog: http://www.sharepointdragons.com

  • Sharing large folio 700mb too large? (error downloading)

    Hi, I've created a 700MB folio just to share with a client. However, when they try to download it using the content viewer, they get an "error during download - Could not download file" message.
    Can anyone explain why this happens? Is it simply a file size limit issue?
    They're using a Galaxy tablet, in case this matters.
    Thanks

    It's large, but that shouldn't cause the error. Could be a hiccup in the servers.
    I've seen that error on a 5MB folio.
    Bob

  • Query throwing Exception as 'Query too large'

    Hi,
    I am working on BI Content 7.
    I have created a query, and the query name is TEST_RA_0064.
    When I execute this query in the Analyzer, it throws the errors mentioned below:
    1. Query TEST_RA_0064 is too large.
    2. Program error in class SAPMSSY1 method:UNCAUGHT_EXCEPTION.
    Please help me.
    Thanks,
    Rajesh janardanan

    Execute this query via RSRT for investigation purposes, and check for the detailed error message there.
    Hope it Helps
    Chetan
    @CP..

  • "result too large" error when accessing files

    Hi,
    I'm attempting to make a backup copy of one of my folders (using tar from the shell). For several files, I got a "Read error at byte 0, reading 1224 bytes: Result too large" error message. It seems those files are unreadable; whatever application attempts to access them gets the same error.
    The files reside on a volume that I created a day ago. It's a non-journaled HFS+ volume on an external hard drive. They are part of an Aperture Vault that I wanted to archive and store offsite. Aperture was closed (not running) when I was creating the archive.
    This means two things: the onsite backup of my photos is broken, obviously (some of the files are unreadable), and my offsite backup is broken, since it doesn't contain those files.
    I've searched the net and found a couple of threads on some mailing lists describing the same problem, but no answer. A couple of folks on those mailing lists suggested it might point to a full disk. However, in my case, there is some 450GB of free space on the volume I was getting read errors on (the destination volume had about 200GB free, and the system drive had about 50GB free, so there was plenty of space all around the system too).
    File system corruption?
      Mac OS X (10.4.9)  

    Here's the tar command with the output:
    $ tar cf /Volumes/WINNIPEG\;TOPORKO/MacBackups/2007-05-27/aperture.tar Alex\ -\ External\ HD.apvault
    tar: Alex - External HD.apvault/Library/2003.approject/2007-03-24 @ 08\:17\:52 PM - 1.apimportgroup/IMG0187/Thumbnails/IMG0187.jpg: Read error at byte 0, reading 3840 bytes: Result too large
    tar: Alex - External HD.apvault/Library/2006.approject/2007-03-24 @ 08\:05\:07 PM - 1.apimportgroup/IMG2088/IMG2088.jpg.apfile: Read error at byte 0, reading 1224 bytes: Result too large
    tar: Alex - External HD.apvault/Library/Jasper and Banff 2006.approject/2007-03-25 @ 09\:41\:41 PM - 1.apimportgroup/IMG1836/IMG1836.jpg.apfile: Read error at byte 0, reading 1224 bytes: Result too large
    tar: Alex - External HD.apvault/Library/Old Scanned.approject/2007-03-24 @ 12\:42\:55 AM - 1.apimportgroup/Image04_05 (1)/Info.apmaster: Read error at byte 0, reading 503 bytes: Result too large
    tar: Alex - External HD.apvault/Library/Old Scanned.approject/2007-03-24 @ 12\:42\:55 AM - 1.apimportgroup/Image16_02/Info.apmaster: Read error at byte 0, reading 499 bytes: Result too large
    tar: Alex - External HD.apvault/Library/Vacation Croatia 2006.approject/2007-03-25 @ 09\:47\:17 PM - 1.apimportgroup/IMG0490/IMG0490.jpg.apfile: Read error at byte 0, reading 1224 bytes: Result too large
    tar: Error exit delayed from previous errors
    Here's the "ls -l" output for one of the files in question:
    $ ls -l IMG_0187.jpg
    -rw-r--r-- 1 dijana dijana 3840 Mar 24 23:27 IMG_0187.jpg
    Accessing that file (or any other from the above list) gives the same or a similar error. The wording differs from command to command, but basically it's the same thing (read error, result too large, or both combined). For example:
    $ cp IMG_0187.jpg ~
    cp: IMG_0187.jpg: Result too large
    The console log doesn't show any related errors.
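    Not from this thread, but given the file system corruption suspicion above, one hedged next step is to verify the volume that holds the vault. The volume name below is only a placeholder (use `diskutil list` to find the real one), and on systems where diskutil lacks this verb, Disk Utility's "Verify Disk" does the same job:
    $ diskutil verifyVolume "/Volumes/Source Volume"   # placeholder volume name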

  • BPC 7M SP4 EVDRE missing rows - Error is "1004 - Selection is too large"

    Hello,
    At a customer site running BPC 7 MS SP4, I see the following error in the client exception log:
    ===================[System Error Tracing]=====================
    [System Name]   : BPC_ExcelAddin
    [Job Name]         : clsExpand::applyDataRangeFormula
    [DateTime]          : 2009-07-17 09:44:13
    [Exception]
           Detail Msg        : {1004-Selection is too large}
    ===================[System Error Tracing End ]=====================
    When I see this error, there is an EVDRE input schedule expanding with some of the row descriptions missing.
    That is, there is a gap of missing row header formulas starting from the second row to somewhere in the middle of the report.
    If I reduce the number of resulting rows for the same EVDRE input schedule, the results are OK.
    Do you know some setting to fix this?
    I tried increasing the Maximum Expansion Limit for rows and columns in Workbook Options without success.
    Thank you.

    Hi all,
    I have the same problem with 7.0 MS SP07: with 59 expanded members it works, with 60 expanded members it fails.
    Mihaela, could you explain to us the purpose of the parameters you mention?
    thanks,
    Romuald

  • Error when executing IDB: "bind(): Result too large"

    I'm trying to use USB debugging on the iPad, as per this guide:
    http://help.adobe.com/en_US/air/build/WS901d38e593cd1bac7b2281cc12cd6bced97-8000.html
    But when I try to execute "idb.exe -forward 7936 7936 1" (1 being my iPad handle), I get the error message:
    "bind(): Result too large"
    What's happening?
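    A hedged guess rather than an answer from the thread: bind() failures on a local forwarding port are often just the port being busy, so it may be worth checking whether anything is already listening on 7936 before digging further:
    netstat -ano | findstr 7936          # Windows (idb.exe suggests a Windows machine)
    $ lsof -i :7936                      # Mac OS X equivalent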

  • "The document can't be opened because it's too large."

    I copied a Numbers '09 file I made at work (and can still open there) to my MBP, and when I try to open it I get a "The document can't be opened because it's too large." error. It's only 1.4MB. I tried reinstalling iWork and re-saving and copying the file, to no avail. Has anyone seen this or have any ideas?

    I am having a similar problem. I am evaluating Numbers 09 for the first time and would like to get rid of Excel, which seems incredibly slow to open. I'm ready for a change and I like the look and feel of Numbers.
    I am working with an export file of data from FileMaker Pro - 3900 rows by 336 columns, so over 1.3 million potentially filled cells. My MacBook Pro has 2 GB of RAM and currently shows 185 MB free, 495 MB inactive, 557 MB wired, and 804 MB active.
    When I try to open the txt file (8.3 MB), I get the error message referred to above. When I open it in Excel, it opens immediately. I can then save it as an Excel file. When I try to open this Excel file in Numbers, it will open, but I get a message stating that only 255 or fewer columns are supported.
    Could this possibly be right? Granted, who in their right mind would create a data table with that many columns, but when working with legacy data (which is why I have a job) this is often the case and it's my job to clean it up.
    Any assistance with both of my errors would be appreciated... even if the advice is to stick with Excel.

  • Query Error Information: Result set is too large; data retrieval ......

    Hi Experts,
    I have a problem with my query. When I execute my report and drill in my navigation panel, instead of a table with values the message "Result set is too large; data retrieval restricted by configuration" appears. I have already applied "Note 1127156 - Safety belt: Result set is too large": I imported Support Package 13 for SAP NetWeaver 7.0 BI Java (BIIBC13_0.SCA / BIBASES13_0.SCA / BIWEBAPP13_0.SCA) and executed the program SAP_RSADMIN_MAINTAIN (in transaction SE38) with the object and value the note specifies, but the problem still appears.
    What could I be missing? How can I fix this issue?
    Thank you very much for helping me out. (Any help would be rewarded.)
    David Corté

    You may ask your Basis administrator to increase the ESM buffer (rsdb/esm/buffersize_kb). Did you check the system's memory?
    Did you try checking the error dump using ST22 (runtime error analysis)?
    Edited by: ashok saha on Feb 27, 2008 10:27 PM

  • Bex Report Error -- Query is Too Large

    Hello, I am using hierarchies in rows and in columns (company-wise, quarter/month-wise values).
    I am using 310 rows and 100 columns. As this is a summary report, I cannot use filters and I cannot
    decrease the key figures. Is there any solution for this? Please give me your valuable suggestions for this query.

    If the query is too large and you are running out of memory, then either run the report with a smaller selection (maybe year-wise) or increase your server parameters, such as memory.

  • Requested buffer too large - but data is already in memory

    Hello all,
    I am writing a program that generates sound and then uses the Java Sound API to play it back over the speakers. Until recently, using Clips has not led to any problems. On two computers I can play the sound without a hitch. However, on the newest computer (which also has the largest specs, and especially more RAM), I am getting an error while trying to play back the sound. The exception that is thrown is:
    javax.sound.sampled.LineUnavailableException: Failed to allocate clip data: Requested buffer too large.
    I find this odd because the buffer already exists in memory: I don't have to read in a .wav file or anything, because I am creating the audio during the course of my program's execution (this is also why I use Clips instead of streaming; the values are saved as doubles during the calculations and then converted into a byte array, which is the buffer used in the clip.open() method call). It has no problems allocating the double array or the byte array, or populating the byte array. The exception is only thrown during the clip.open() call. I also find it strange that it works on two other computers, both of which have less RAM (it runs fine on machines with 512MB and 2GB of RAM, both XP 32-bit). The only difference is that the computer with the issue is running Windows 7 (the RTM build), 64-bit with 6GB of RAM. I am running it through NetBeans 6.7.1 with memory options set to use up to 512MB, but it's never gone up that far before. And I've checked the size of the buffer on all three computers and they are all the same.
    Does anyone know what the issue could be or how to resolve it? I am using JDK6 if that matters. Thank you for your time.
    Edited by: Sengin on Sep 18, 2009 9:40 PM

    Thanks for your answer. I'll try that.
    I figured it had something to do with Windows 7, since it technically hasn't been released yet (however, I have the RTM version thanks to a group at my university, in cahoots with Microsoft, which allows some students to get various Microsoft products for $12).
    Edit: I just changed the Clip to a SourceDataLine (with the few other necessary changes, like changing the way the DataLine.Info object is created), wrote the whole buffer into it, drained the line and then closed it. It works fine. I'll mark the question as answered; however, that may not be the "correct" answer (perhaps it does have something to do with Windows 7 not being completely tested yet). Thanks.
    Edited by: Sengin on Sep 21, 2009 8:44 PM
    Edited by: Sengin on Sep 21, 2009 8:46 PM

  • XCode 3 download gives "file too large" error msg

    When I tried to download Xcode 3 (Xcode 3.2.6 and iOS SDK 4.3) from the developer website, the download manager displayed an error message just after the file finished downloading, saying that the file was too large for my hard drive (insufficient memory). Although it was a 4GB file, I had plenty of disk space. This happened twice with two different disk drives (a 250GB Iomega and an 8GB flash drive). Any recommendations?
    Thanks,
    Josh

    I had plenty of disk space.
    That's a technical term way over my head.
    What is the total capacity of your hard drive in GB?
    What is the exact amount of unused space on your hard drive?
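    Not part of the original reply, but one quick way to answer both questions above is from Terminal; df reports each mounted volume's capacity and free space:
    $ df -h /                            # startup disk
    $ df -h /Volumes/*                   # external drives, if mounted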
