BW System QBW: Result too large

Hi Everyone,
When I run a Webi report, it throws an error.
We are using BO 4.0, connecting to BW.
BW System QBW: Result too large (917 070 cells);
data collection is limited by configuration
(maximum - 500000 cells) (WIS 00000)
Please suggest a solution for this issue.
Regards,
G

Hi Bhargav G,
If you are still facing this issue, check the link below:
BO 4 webi reports on Bex Query (BI 7.3)
It might help you.
Regards,
Anish

Similar Messages

  • "ERROR: Could not read block 64439 of relation 1663/16385/16658: Result too large"

    Hi,
    I've already archived a lot of assets in my final cut server but since one week there is a message appearing when I click on an asset and choose "Archive". The pop-up says: "ERROR: Could not read block 64439 of relation 1663/16385/16658: Result too large"
Does anyone know what the problem is and/or have any suggestions to solve it? I can't archive anymore since this message first appeared.
    What happened before?
-> I archived some assets via FCS and then transferred the original media to offline storage. That system worked fine for the last few months and my main server stays quite small in storage use. But now, after I added some more new productions and let FCS generate the assets, it doesn't work anymore...
    It's not about the file size - I tried even the smallest file I found in some productions.
    It's not a particular production - I tried some different productions.
    It's not about the storage - there's a lot of storage left on my server.
So, if someone knows how to get this server back on the road - let me know.
    THNX!
    Chris

    I would really appreciate some advice re: recent FCS search errors.
We're having similar issues to C.P.CGN's two-year-old post; it has only developed for us in the last few weeks.
    Our FCS machine is running 10.6.8 mac os and 1.5.2 final cut server with the latest
    OS 10.6.x updates.
FCS is still usable for 6 of 8 offliners, but on some machines, searching assets presents "ERROR: could not read block 74012 of relation 1663/16385/16576: Input/output error."
    Assuming the OS and/or data drives on the FCS machine were failing, I cloned the database drive today and will clone the OS drive tomorrow night, but after searching the forums and seeing similar error messages I'm not so sure.
    FCS has been running fine for last 4 years, minus the recent Java security issues.
    Thanks in advance, any ideas appreciated!
    cheers,
    Aaron Mooney,
    Post Production Supervisor.
    Electric Playground Daily, Reviews On The Run Daily, Greedy Docs.
    epn.tv

  • "result too large" error when accessing files

    Hi,
I'm attempting to make a backup copy of one of my folders (using tar from the shell). For several files, I got the error message "Read error at byte 0, reading 1224 bytes: Result too large". It seems those files are unreadable; any application that attempts to access them gets the same error.
    The files reside on the volume that I created a day ago. It's a non-journaled HFS+ volume on external hard drive. They are part of an Aperture Vault that I wanted to make an archive copy and store offsite. Aperture was closed (not running) when I was creating the archive.
    This means two things. The onsite backup of my photos is broken, obviously (some of the files are unreadable). My offsite backup is broken, since it doesn't contain those files.
I've searched the net and found a couple of threads on some mailing lists describing the same problem, but no answer. A couple of folks on those mailing lists suggested it might point to a full disk. However, in my case, there is some 450GB of free space on the volume I was getting read errors on (the destination volume had about 200GB free, and the system drive had about 50GB free, so there was plenty of space all around the system too).
    File system corruption?
      Mac OS X (10.4.9)  

    Here's the tar command with the output:
    $ tar cf /Volumes/WINNIPEG\;TOPORKO/MacBackups/2007-05-27/aperture.tar Alex\ -\ External\ HD.apvault
    tar: Alex - External HD.apvault/Library/2003.approject/2007-03-24 @ 08\:17\:52 PM - 1.apimportgroup/IMG0187/Thumbnails/IMG0187.jpg: Read error at byte 0, reading 3840 bytes: Result too large
    tar: Alex - External HD.apvault/Library/2006.approject/2007-03-24 @ 08\:05\:07 PM - 1.apimportgroup/IMG2088/IMG2088.jpg.apfile: Read error at byte 0, reading 1224 bytes: Result too large
    tar: Alex - External HD.apvault/Library/Jasper and Banff 2006.approject/2007-03-25 @ 09\:41\:41 PM - 1.apimportgroup/IMG1836/IMG1836.jpg.apfile: Read error at byte 0, reading 1224 bytes: Result too large
    tar: Alex - External HD.apvault/Library/Old Scanned.approject/2007-03-24 @ 12\:42\:55 AM - 1.apimportgroup/Image04_05 (1)/Info.apmaster: Read error at byte 0, reading 503 bytes: Result too large
    tar: Alex - External HD.apvault/Library/Old Scanned.approject/2007-03-24 @ 12\:42\:55 AM - 1.apimportgroup/Image16_02/Info.apmaster: Read error at byte 0, reading 499 bytes: Result too large
    tar: Alex - External HD.apvault/Library/Vacation Croatia 2006.approject/2007-03-25 @ 09\:47\:17 PM - 1.apimportgroup/IMG0490/IMG0490.jpg.apfile: Read error at byte 0, reading 1224 bytes: Result too large
    tar: Error exit delayed from previous errors
    Here's the "ls -l" output for one of the files in question:
    $ ls -l IMG_0187.jpg
    -rw-r--r-- 1 dijana dijana 3840 Mar 24 23:27 IMG_0187.jpg
Accessing that file (or any other from the above list) gives the same or a similar error. The wording differs from command to command, but basically it's the same thing (read error, or result too large, or both combined). For example:
    $ cp IMG_0187.jpg ~
    cp: IMG_0187.jpg: Result too large
    The console log doesn't show any related errors.
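A note on the error text itself (general errno background, not anything specific to this setup): "Result too large" is the BSD/macOS C-library wording for errno ERANGE, which is why completely unrelated tools - tar, cp, sysctl, quotacheck - can all print the same phrase. A quick sketch in Python:

```python
import errno
import os

# ERANGE is errno 34 on both macOS and Linux; the C library renders it as
# "Result too large" on BSD/macOS and "Numerical result out of range" on glibc.
# Tools like tar and cp simply pass the kernel's errno through strerror(),
# which is why very different failures end up printing the same message.
print(errno.ERANGE)
print(os.strerror(errno.ERANGE))
```

So the message identifies the errno being surfaced, not the underlying cause - here it most likely wraps a lower-level read failure on the HFS+ volume.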

  • Setting kern.ipc.maxsockbuf above MB errors with "Result too large"

    Hi
    I'm running Snow Leopard 10.6.4 on a machine that has 16GB memory.
When I attempt to increase kern.ipc.maxsockbuf above 4MB I get an error that states "Result too large". I don't have this problem on my 10.5 machines. Anyone know where this limitation comes from? Is it some hard limit in Snow Leopard?
    Regards - Tim
    # setting to 4MB works fine.
    sudo sysctl -w kern.ipc.maxsockbuf=4194304
    kern.ipc.maxsockbuf: 500000 -> 4194304
    # setting to 1 above 4MB starts the error.
    sudo sysctl -w kern.ipc.maxsockbuf=4194305
    kern.ipc.maxsockbuf: 4194304
    sysctl: kern.ipc.maxsockbuf: Result too large

Firstly, there's no such thing as Apache 9.3; there's Apache 1 (and subversions) and Apache 2 (and subversions). Your error message -
Oracle-HTTP-Server/1.3.28
- shows you're using Apache 1.3.28.
Secondly, I'm confused by your comment -
"I do not have Apache 9.3 or higher but I think oracle should offer this in its companion CD"
Oracle does offer the Apache server; if you're saying you didn't get it from Oracle, then where did your Apache server come from?
Thirdly, I notice from your config file -
ErrorLog "|E:\oracle\product\10.1.0\Companion\Apache\Apache\bin\rotatelogs logs/error_log 43200"
- that you're piping the logs through rotatelogs. Are you sure the logfiles haven't just been renamed?

  • Error when executing IDB: "bind(): Result too large"

    I'm trying to use USB debugging in the iPad, as per this guide:
    http://help.adobe.com/en_US/air/build/WS901d38e593cd1bac7b2281cc12cd6bced97-8000.html
    But when I try to execute "idb.exe -forward 7936 7936 1" (1 being my iPad handle), I get the error message:
    "bind(): Result too large"
    What's happening?


  • Quotacheck: searchfs Result too large: Result too large

    Aside from a 2006 post regarding this issue, I'm unsure how to resolve my scenario. We're using OSX server's time machine AFP goodies, but we needed to enable quotas for users. Simple? Maybe, but not mac style... so you head into the terminal, read some old posts on outdated forums... use repquota, quotacheck, and quotaon...
And everything seemed to work, until you add a user (through edquota) whose quota isn't in fstab and who can't be found in repquota...
    sigh...
I turned off quota checking and tried starting from scratch... what do I get but an error whose last mention on this forum is from 2006:
    sudo quotacheck -a -v
    * Checking user and group quotas for /dev/rdisk4 (/Volumes/ColdStorage)
    34
    quotacheck: searchfs Result too large: Result too large
    Any ideas of ways around? The 2006 posts seem to indicate that after attempting variations of quotacheck, I might eventually break through!

    Hello,
I've run into the same issue on our setup as well. (Xserve G5 10.4.8; the data is on an Xserve RAID, level 5, 1TB, used for home directories.) I'm working with Apple to see if there is a solution to this issue or if it is a bug. In the meanwhile, they recommended running quotacheck with the path to the share rather than -a:
    sudo quotacheck -v /Volumes/OURDATA
    Using the command this way seems to work about half of the time for me, the other half still giving the same error message. I'm hoping this is a cosmetic issue with quotacheck, and not a hint of a problem with our setup.
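Since the per-volume invocation only succeeds about half the time, one pragmatic workaround is to retry it until it exits cleanly. A generic sketch (plain Python; the quotacheck command line is the one from the reply above, and whether repeated runs are harmless on your setup is an assumption worth verifying first):

```python
import subprocess
import time

def run_with_retries(cmd, attempts=5, delay=2.0):
    """Run `cmd` until it exits 0, up to `attempts` times, sleeping between tries."""
    result = None
    for _ in range(attempts):
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode == 0:
            return result
        time.sleep(delay)
    return result  # last (failed) attempt

# e.g.: run_with_retries(["sudo", "quotacheck", "-v", "/Volumes/OURDATA"])
```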
    I'll be sure to post if I find anything else out.
    Matt Bryant
    ACTC
    Husson College and the New England School of Communications

  • Final Cut Server Error Result Too Large

In Final Cut Server, when trying to check out a project or add files to a project, I get the following error:
    Error: could not read block 5469422 of relation 1663/16385/16653: Result too large
I have seen similar posts but no resolution.
    Final Cut Server on Xserve 10.5.8
    XSAN 2.2
    Present on all 12 client machines

Does this occur only on specific projects or all projects? If it only happens on a specific project, it's likely an invalid asset, in which case deleting and re-cataloging it would be the simplest solution. If this happens with any asset you try to check out, then there may be a larger database issue and you may consider restoring from a backup.

  • BUG: Web service returns request XML as response when result too large

    Hi,
    sorry for cross-posting, but the Web Services forum seems to be quite abandoned and this is an urgent issue for me.
    I have a web service returning some records of a given type (created using JDeveloper 10.1.3.3). The running environment and the service implementation do not seem to make any difference, as the situation is the same whether running it in embedded OC4J or in AS 10.1.3.1, and whether it is generated from a PL/SQL procedure or a method of a plain Java class.
The problem is that if the result of this web service is too large (contains a lot of records), then processing halts in some Oracle class in a web service library - that is, not in the debuggable generated web service source or in the service implementation itself.
    I think that the XML processing halts because of a "java.lang.OutOfMemoryError: Java heap space".
    Then a more serious problem follows: the service doesn't return a fault message but the original request XML as a response. Obviously, this can lead to some really unexpected errors.
    To reproduce this error:
    1. Create a Java class with a method returning an array of an arbitrary type, of the size specified in an input parameter.
    2. Create a web service from this class.
    3. Call it multiple times increasing the size parameter in every call until you get back the request as response or any error message.
    For example:
    - if you test the web service using the web page generated to access the endpoint, then you can see the response XML - in case you don't get an Internal Server Error (Java heap space).
    - if you use a generated web service proxy for testing, then it will give an error saying "unexpected element name: expected={namespace}someOperationResponseElement
    actual={namespace}someOperationElement".
    Any ideas how to locate / solve this problem?
    Regards,
    Patrik
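Step 3 of the reproduction above is essentially a threshold probe. A minimal, service-agnostic sketch (the `fake_service` below is a made-up stand-in for one call through the generated proxy, since the real call depends on your deployed endpoint):

```python
def find_failure_threshold(call, start=1, factor=2):
    """Grow the requested result size until `call` fails.

    Returns (last_ok, first_failed): the largest size that succeeded and
    the smallest size that raised an exception.
    """
    size = start
    last_ok = 0
    while True:
        try:
            call(size)
        except Exception:
            return last_ok, size
        last_ok = size
        size *= factor

# Stand-in for the web service: pretend the server heap gives out above
# 100,000 records, mirroring the "Java heap space" failure described above.
def fake_service(n):
    if n > 100_000:
        raise MemoryError("Java heap space")

print(find_failure_threshold(fake_service))  # (65536, 131072)
```

Bracketing the failure this way gives support a concrete size to reproduce with, rather than "a lot of records".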

    Patrik,
the usual recommendation is to try 10.1.3.3 instead of 10.1.3.1, to rule out that you are hunting an already-fixed issue. From what you describe, the error seems less related to JDeveloper than to OC4J or OracleAS.
So in case it reproduces in 10.1.3.3, I suggest creating a testcase and opening a service request with support, or trying the OC4J forum in case it's known there.
    Frank

  • BW Web Report Issue - Result set too large

    Hi,
When I execute a BEx Query on the Web I get "Result set too large; data retrieval restricted by configuration (maximum = 500000 cells)".
From my search on SDN I understood we can remove this restriction either globally across the BW system or for a specific query in the WAD template.
In my 7.x Web template I am trying to increase the default maximum number of rows parameter, as per the inputs below from SAP Note 1127156.
But I can't find the parameter "Size Restriction for Result Sets" for any of the web items (Analysis/Web Template properties/Data Provider properties) in the WAD Web template.
Please advise where/how I can locate the properties.
    Instructions provided in SAP Note…
    The following steps describe how to change the "safety belt" for Query Views:
    1. Use the context menu Properties / Data Provider in a BEx Web Application to maintain the "safety belt" for a Query View.
    2. Choose the register "Size Restriction for Result Sets".
    3. Choose an entry from the dropdown box to specify the maximum number of cells for the result set.
                  The following values are available:
    o Maximum Number
    o Default Number
    o Custom-Defined Number
                  Behind "Maximum Number" and "Default Number" you can find the current numbers defined in the customizing table RSADMIN (see below).
    4. Save the Query View and use it in another Web Template.
    Thanks in advance

    Hi Yasemin,
Thanks for all the help... I was off for a couple of days.
"To activate it I can suggest to create a dummy template, add your query in it, add a menu bar component, add an action to save the query view. Then you run the template and change the size restriction for the result set, then you can save it by the menu."
Can you please elaborate on the solution provided? I created a dummy template with an Analysis and a Menu Bar item... but I wasn't able to configure the menu bar item...
    Thanks in advance

  • Query Error Information: Result set is too large; data retrieval ......

    Hi Experts,
I have a problem with my query. When I execute my report and drill down in the navigation panel, instead of a table with values the message "Result set is too large; data retrieval restricted by configuration" appears. I have already applied "Note 1127156 - Safety belt: Result set is too large". I imported Support Package 13 for SAP NetWeaver 7.0 BI Java (BIIBC13_0.SCA / BIBASES13_0.SCA / BIWEBAPP13_0.SCA) and executed the program SAP_RSADMIN_MAINTAIN (in transaction SE38) with the object and the value as Note 1127156 says... but the problem still appears.
What could I be missing? How can I fix this issue?
    Thank you very much for helping me out..... (Any help would be rewarded)
    David Corté

You may ask your Basis guy to increase the ESM buffer (rsdb/esm/buffersize_kb). Did you check the system's memory?
    Did you try to check the error dump using ST22 - Runtime error analysis?
    Edited by: ashok saha on Feb 27, 2008 10:27 PM

  • Result Set Too Large : Data Retrieval restricted by configuration

    Hi Guys,
    I get the above error when running a large dataset, with a hierarchy on - but when I run without a hierarchy I am able to show all data.
    The Basis guys increased the ESM buffer (rsdb/esm/buffersize_kb) but it still causes an issue.
    Anyone any ideas when it comes to reporting large volumes with a hierarchy?
    Much appreciated,
    Scott

    Hi there
I logged a message on the service marketplace and got this reply from SAP:
    ' You might have to increase the value for parameters
    BICS_DA_RESULT_SET_LIMIT_DEF and BICS_DA_RESULT_SET_LIMIT_MAX as it
    seems that the result set is still too large. Please check your
    parameters as to how many data cells you should expect and set the
    parameter accordingly.
The cells are the number of data points that would be sent from ABAP
to Java. The zero suppression or parts of the result suppression are
    done afterwards. As a consequence of this, the number of displayed
    data cells might differ from the threshold that is effective.
    Starting with SPS 14 you get the information how many data cells are
    rejected. That gives you better ways to determine the right setting.
    Currently you need to raise the number to e.g. 2.000.000 to get all
    data.
If BICS_DA_RESULT_SET_LIMIT_MAX is set to a lower value than
BICS_DA_RESULT_SET_LIMIT_DEF, it would automatically cut the value of
BICS_DA_RESULT_SET_LIMIT_DEF down to its own value.
    Please note that altough this parameter can be increased via
    configuration, you should do a proper system sizing according to note
    927530 to ensure that the system can handle the number of users and
    resultset sizes you are expecting."
Our Basis team have subsequently applied these changes, and I will be testing today.
    Thx
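The interplay of the two parameters quoted in that reply can be sketched as follows - an illustration of the documented behavior only, not SAP code; the function and variable names are made up to mirror the RSADMIN objects:

```python
DEFAULT_CELL_LIMIT = 500_000  # the shipped default visible in the error messages

def effective_limit(limit_def=DEFAULT_CELL_LIMIT, limit_max=DEFAULT_CELL_LIMIT):
    # Per the SAP reply: if MAX is set lower than DEF, DEF is cut down to MAX.
    return min(limit_def, limit_max)

def check_result_set(rows, cols, limit_def=DEFAULT_CELL_LIMIT, limit_max=DEFAULT_CELL_LIMIT):
    cells = rows * cols
    limit = effective_limit(limit_def, limit_max)
    if cells > limit:
        raise ValueError(f"Result set too large ({cells} cells); data retrieval "
                         f"restricted by configuration (maximum = {limit} cells)")
    return cells

# The 917,070-cell report from the first post passes once both limits are
# raised to the 2,000,000 suggested by SAP:
print(check_result_set(91_707, 10, limit_def=2_000_000, limit_max=2_000_000))  # 917070
```

Note the check counts raw cells sent from ABAP to Java, before zero suppression, so the displayed cell count can be smaller than what trips the limit.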

  • WAD : Result set is too large; data retrieval restricted by configuration

    Hi All,
When trying to execute the web template with fewer restrictions we get the below error:
    Result set is too large; data retrieval restricted by configuration
    Result set too large (758992 cells); data retrieval restricted by configuration (maximum = 500000 cells)
But when we increase the number of restrictions it produces output. For example, if we give fiscal period, company code and Brand we get output. But if we give fiscal period alone, it throws the above error.
Note: We are on SP18.
Do we need to change some setting in the configuration? If yes, where do we need to change it, or what else do we need to do to remove this error?
    Regards
    Karthik

    Hi Karthik,
the standard setting for web templates is to display a maximum of 50.000 cells. The less you restrict your query, the more data will be displayed in the report. If you want to display more than 50.000 cells the template will not execute correctly.
In general it is advisable to restrict the query as much as possible: the more data you display, the worse your performance will be. If you have to display more data and you execute the query from Query Designer, or if you use the standard template, you can individually set the maximum number of cells. This is described over [here|Re: Bex Web 7.0 cells overflow].
However, I do not know if (and how) you can set the maximum number of cells differently as a default setting for your template. This should be possible somehow, I think; if you find a solution for this please let us know.
    Brgds,
    Marcel

  • SAP Note: 1127156 Result set is too large

    Hi,
Note 1127156 explains how to resolve the issue by removing the message "Result Set is too large." I want to check the settings in the template; the note gives the following steps, but I cannot find the settings on the Analysis web item of the data provider.
Can someone please help.
    Query View
    The following steps describe how to change the "safety belt" for Query Views:
    1. Use the context menu Properties / Data Provider in a BEx Web Application to maintain the "safety belt" for a Query View.
    2. Choose the register "Size Restriction for Result Sets".
    3. Choose an entry from the dropdown box to specify the maximum number of cells for the result set.
                   The following values are available:
         Maximum Number
         Default Number
         Custom-Defined Number
                   Behind "Maximum Number" and "Default Number" you can find the current numbers defined in the customizing table RSADMIN (see below).
    4. Save the Query View and use it in another Web Template.
    Many thanks

    Hi,
I was facing the same issue in BEx Analyzer 7.0. The query was working fine in the BW 3.5 Analyzer but gave an error in BEx 7.0.
When executed on the web, it gave the error "Result set is too large".
The maximum number of cells that can be displayed is 750000.
Please check the result set returned by the query. Also check SAP Note 1040454 - Front-end memory requirement of the BEx Analyzer.
Some enhancements are planned to be delivered in enhancement package 1.
    Regards,
      Niraj

  • Result set is too large; data retrieval restricted by configuration

    Hi,
While executing a query for a given period, the message 'Result set is too large; data retrieval restricted by configuration' is displayed. I searched SDN and referred to the following link:
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/d047e1a1-ad5d-2c10-5cb1-f4ff99fc63c4&overridelayout=true
    Steps followed:
    1) Transaction Code SE38
    2) In the program field, entered the report name SAP_RSADMIN_MAINTAIN and Executed.
    3) For OBJECT, entered the following parameters: BICS_DA_RESULT_SET_LIMIT_MAX
    4) For VALUE, entered the value for the size of the result set, and then executed the program:
    After the said steps, the below message is displayed:
    OLD SETTING:
    OBJECT =                                VALUE =
    UPDATE failed because there is no record
    OBJECT = BICS_DA_RESULT_SET_LIMIT_MAX
    Similar message is displayed for Object: BICS_DA_RESULT_SET_LIMIT_DEF.
    Please let me know as to how to proceed on this.
    Thanks in advance.

    Thanks for the reply!
    The objects are not available in the RSADMIN table.

  • Result set too large

    Hi Experts,
When I run a BI report on the SAP portal, it shows me the message "Result set too large...". I have changed BICS_DA_RESULT_SET_LIMIT_MAX and BICS_DA_RESULT_SET_LIMIT_DEF, but I don't know how (and where) to configure the "safety belt". Can somebody show me? Thanks a lot!

    Note 1127156 - Safety belt: Result set is too large
    Oops too late...!!! did not see earlier reply
    Edited by: Arun Varadarajan on Jul 6, 2010 9:31 PM
