Query throwing exception: 'Query too large'

Hi,
I am working on BI Content 7.
I have created a query named TEST_RA_0064.
When I execute this query in the Analyzer, it throws the errors mentioned below:
1. Query TEST_RA_0064 is too large.
2. Program error in class SAPMSSY1, method UNCAUGHT_EXCEPTION.
Please help me.
Thanks,
Rajesh Janardanan

Execute this query via transaction RSRT for investigation purposes and check the detailed error message there.
Hope it helps,
Chetan

Similar Messages

  • Weblogic 10 - application deployment error: Exception is: "File too large"

    I posted this in WebLogic -> General, but I realise it should really have gone here, as it's about admin server/deployment services setup and configuration.
    I am using WebLogic Application Server 10 in a WebLogic clustered environment.
    I am trying to deploy an application to a managed server when it starts up; all goes well and I can see it deploying the WAR files to the managed server.
    It hits a certain WAR and fails with the exception:
    ####<Nov 19, 2011 2:03:59 PM BRST> <Error> <Deployer> <devnode01> <managedserver2> <[ACTIVE] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <1321718639109> <BEA-149205> <Failed to initialize the application 'test_war' due to error weblogic.management.DeploymentException: Exception occured while downloading files.
    weblogic.management.DeploymentException: Exception occured while downloading files
    at weblogic.deploy.internal.targetserver.datamanagement.AppDataUpdate.doDownload(AppDataUpdate.java:43)
    at weblogic.deploy.internal.targetserver.datamanagement.DataUpdate.download(DataUpdate.java:56)
    at weblogic.deploy.internal.targetserver.datamanagement.Data.prepareDataUpdate(Data.java:97)
    at weblogic.deploy.internal.targetserver.BasicDeployment.prepareDataUpdate(BasicDeployment.java:682)
    at weblogic.deploy.internal.targetserver.BasicDeployment.stageFilesForStatic(BasicDeployment.java:725)
    at weblogic.deploy.internal.targetserver.AppDeployment.prepare(AppDeployment.java:104)
    at weblogic.management.deploy.internal.DeploymentAdapter$1.doPrepare(DeploymentAdapter.java:39)
    at weblogic.management.deploy.internal.DeploymentAdapter.prepare(DeploymentAdapter.java:187)
    at weblogic.management.deploy.internal.AppTransition$1.transitionApp(AppTransition.java:21)
    at weblogic.management.deploy.internal.ConfiguredDeployments.transitionApps(ConfiguredDeployments.java:233)
    at weblogic.management.deploy.internal.ConfiguredDeployments.prepare(ConfiguredDeployments.java:165)
    at weblogic.management.deploy.internal.ConfiguredDeployments.deploy(ConfiguredDeployments.java:122)
    at weblogic.management.deploy.internal.DeploymentServerService.resume(DeploymentServerService.java:173)
    at weblogic.management.deploy.internal.DeploymentServerService.start(DeploymentServerService.java:89)
    at weblogic.t3.srvr.SubsystemRequest.run(SubsystemRequest.java:64)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
    Caused By: java.io.IOException: [DeploymentService:290066]Error occurred while downloading files from admin server for deployment request "0". Underlying error is: "[DeploymentService:290065]Deployment service servlet encountered an Exception while handling the deployment datatransfer message for request id "0" from server "managedserver2". Exception is: "File too large"."
    at weblogic.deploy.service.datatransferhandlers.HttpDataTransferHandler.getDataAsStream(HttpDataTransferHandler.java:86)
    at weblogic.deploy.service.datatransferhandlers.DataHandlerManager$RemoteDataTransferHandler.getDataAsStream(DataHandlerManager.java:153)
    at weblogic.deploy.internal.targetserver.datamanagement.AppDataUpdate.doDownload(AppDataUpdate.java:39)
    at weblogic.deploy.internal.targetserver.datamanagement.DataUpdate.download(DataUpdate.java:56)
    at weblogic.deploy.internal.targetserver.datamanagement.Data.prepareDataUpdate(Data.java:97)
    at weblogic.deploy.internal.targetserver.BasicDeployment.prepareDataUpdate(BasicDeployment.java:682)
    at weblogic.deploy.internal.targetserver.BasicDeployment.stageFilesForStatic(BasicDeployment.java:725)
    at weblogic.deploy.internal.targetserver.AppDeployment.prepare(AppDeployment.java:104)
    at weblogic.management.deploy.internal.DeploymentAdapter$1.doPrepare(DeploymentAdapter.java:39)
    at weblogic.management.deploy.internal.DeploymentAdapter.prepare(DeploymentAdapter.java:187)
    at weblogic.management.deploy.internal.AppTransition$1.transitionApp(AppTransition.java:21)
    at weblogic.management.deploy.internal.ConfiguredDeployments.transitionApps(ConfiguredDeployments.java:233)
    at weblogic.management.deploy.internal.ConfiguredDeployments.prepare(ConfiguredDeployments.java:165)
    at weblogic.management.deploy.internal.ConfiguredDeployments.deploy(ConfiguredDeployments.java:122)
    at weblogic.management.deploy.internal.DeploymentServerService.resume(DeploymentServerService.java:173)
    at weblogic.management.deploy.internal.DeploymentServerService.start(DeploymentServerService.java:89)
    at weblogic.t3.srvr.SubsystemRequest.run(SubsystemRequest.java:64)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
    The error appears to be stating that the physical file is too big to be deployed.
    I'm running the managed servers with a heap size of 3 GB and this managed server is running with 2 GB - I know these are large, but they were being used for debugging.
    I can't find any documentation on the 'File too large' error, or how to resolve it.
    DeploymentService:290065 says to look in the log (details are above), and DeploymentService:290066 says the error will be explained in its description, which it is: "file size too big". It doesn't say where to see or set the maximum file size. There is plenty of disk space, so I can only assume it's a setting for the deployment service that needs to be increased, but I cannot find any information on this.

    I don't think this would help, but would using the nostage option for deployment change this behaviour? I don't think it would, as that is for disk-based problems rather than transfer-size issues.
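    For completeness, if anyone wants to test the nostage suggestion: staging mode can be chosen per deployment with the standard weblogic.Deployer tool. This is only a sketch - the admin URL, credentials, application name and path below are placeholders, not values from this thread:

        java weblogic.Deployer -adminurl t3://adminhost:7001 -username weblogic -password <password> \
             -deploy -nostage -name test_war -targets managedserver2 /shared/apps/test.war

    With -nostage the managed server reads the archive from the given path itself instead of having the admin server copy it over, so that path must be reachable from the managed server.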

  • SQL query throwing Exception.

    Hi,
    I am facing a problem when using a date as search criteria. In the JDO class the field type is java.util.Date. The search works in the local environment, whose locale date setting is M/d/YYYY (Windows 2003, English version). The search does not work in the Chinese OS environment, where the locale date setting is YYYY-M-d. In the Chinese environment, if we alter the session NLS settings in Toad, we are able to execute the query.
    We are using Kodo 2.5.8.
    SELECT t0.JDO_ID, t0.JDO_CLASS, t0.BAD_NAMES, t0.CONCAT_NAME, t0.DDATE, t0.FIRST_NAME, t0.LAST_NAME, t0.MAX_SCN, t0.NAME,
    t0.NATURALID, t0.PARTYID, t0.PHONE, t0.ROLE FROM PARTY_VBO t0 WHERE (t0.ROLE LIKE '%ICR%' AND t0.DDATE = '17-APR-1974' AND ROWNUM <= 25) AND
    t0.JDO_CLASS = 2 ORDER BY t0.PARTYID ASC)
    This query is throwing ORA-01843 (not a valid month).
    Where does the session opened by Java in Oracle take its date format from?

    Please help.
    One of my SQL queries is throwing an exception during execution. I do not know which one (there are several of them). Is there a way to display or extract the last query from the SQLException class?
    Any help will be highly appreciated.
    Thanks,
    Indrasish.

    No, there is not. You should do some debugging, such as logging something like "Executing query: SELECT blahblahblah", then execute the query, then log something like "Query completed". Then find in your log which one didn't complete.
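    As an illustration of that logging advice, here is a minimal JDBC sketch; the helper class itself is hypothetical, and the table and column names are borrowed from the query above. Binding the date as a parameter instead of embedding a string literal also removes the NLS date-format dependency that causes ORA-01843:

        import java.sql.Connection;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import java.sql.SQLException;
        import java.util.logging.Logger;

        public class LoggedQuery {
            private static final Logger LOG = Logger.getLogger(LoggedQuery.class.getName());

            // Logs the statement before and after execution, so a failing query can be
            // spotted in the log, and binds the date rather than formatting it as text.
            public static int countParties(Connection con, java.util.Date ddate) throws SQLException {
                String sql = "SELECT COUNT(*) FROM PARTY_VBO t0 "
                           + "WHERE t0.DDATE = ? AND t0.ROLE LIKE '%ICR%'";
                LOG.info("Executing query: " + sql);
                try (PreparedStatement ps = con.prepareStatement(sql)) {
                    ps.setDate(1, new java.sql.Date(ddate.getTime())); // bind, don't format
                    try (ResultSet rs = ps.executeQuery()) {
                        rs.next();
                        int n = rs.getInt(1);
                        LOG.info("Query completed, count: " + n);
                        return n;
                    }
                }
            }
        }

    The same idea applies in Kodo/JDO: log the query or filter string (or enable the driver's SQL trace) before executing, and pass the date as a query parameter rather than concatenating it into the statement.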

  • ODI Datastore length differs from the DB length - IKM throws 'value too large'

    The ODI datastore, when reverse-engineered, shows a different length from the data length in the actual DB.
    ODI Datastore column details: char(44)
    Target db column : varchar2(11 char)
    The I$ table inserts char(44) into varchar2(11 char) in the target. As the source column value is empty, ODI throws
    "ORA-12899: value too large for column (actual: 44, maximum: 11)".

    Yes, I have reverse-engineered the target as well.
    Source datatype: varchar2(11 char)
    After reverse engineering,
    ODI datastore datatype (source): char(44)
    Target datatype: varchar2(11 char)
    After reverse engineering,
    ODI datastore datatype (target): char(44)
    Since the target datastore is char(44) in the ODI datastore and the values in the source column are null/spaces, the IKM inserts them into the target column, which is varchar2(11 char), and the above-mentioned 'value too large' error occurs.
    There are no junk values in the column; I tried substr(column,1,7) and trim functions too, and they do not help.

  • File is too large for attachment - BO Integration Error

    I developed Crystal Reports (using a Universe) in CR XI R2 and deployed them to the BO XI R2 repository.
    The reports database is DB2.
    I am able to preview the reports in CMC/InfoView.
    But when we integrate our application (called GBS) with BO and I try to open the CR reports from the BO repository, it throws the error "File is too large for attachment".
    The report is not even that big; it has only 100 records.
    I am not sure why it is throwing this error. Has anyone ever faced the same error?
    Please let me know any resolution for this, as it is blocking me.
    Is it something related to the application (GBS), or a Page Server/Cache Server issue?
    I will wait for any response.
    Nitin

    Has this ever been resolved?  We are having a similar issue.
    Thanks!

  • Query is allocating too large memory error in OBIEE 11g

    Hi ,
    We have one pivot table (A) in our dashboard displaying revenue against an entity hierarchy (with 8 levels under the hierarchy), and another pivot table (B) displaying revenue against a customer hierarchy (3 levels under it).
    Both tables run fine in our OBIEE 11.1.1.6 environment (Windows).
    After deploying the same code (RPD & catalog) to a Unix OBIEE 11.1.1.6 server, it throws the error below while populating pivot table A:
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 96002] Essbase Error: Internal error: Query is allocating too large memory ( > 4GB) and cannot be executed. Query allocation exceeds allocation limits. (HY000)
    But pivot table B runs fine. Help please!
    Data source used: Essbase 11.1.2.1
    Thanks
    sayak

    Hi Dpka,
    Yes, we are hitting a separate Essbase server with the Linux OBIEE environment.
    I'll execute the query in Essbase and get back to you.
    Thanks
    sayak

  • Query Error Information: Result set is too large; data retrieval ......

    Hi Experts,
    I have a problem with my query. When I execute my report and drill into my data in the navigation panel, instead of a table with values the message "Result set is too large; data retrieval restricted by configuration" appears. I already applied Note 1127156 - "Safety belt: Result set is too large": I imported Support Package 13 for SAP NetWeaver 7.0 BI Java (BIIBC13_0.SCA / BIBASES13_0.SCA / BIWEBAPP13_0.SCA) and executed the program SAP_RSADMIN_MAINTAIN (in transaction SE38) with the object and value that Note 1127156 specifies, but the problem still appears.
    What could I be missing? How can I fix this issue?
    Thank you very much for helping me out. (Any help will be rewarded.)
    David Corté

    You may ask your Basis administrator to increase the ESM buffer (rsdb/esm/buffersize_kb). Did you check the system's memory?
    Did you try checking the error dump using ST22 (runtime error analysis)?

  • Bex Report Error -- Query is Too Large

    Hello, I am using hierarchies in rows and in columns, with company-wise and quarter/month-wise values.
    I am using 310 rows and 100 columns. As this is a summary report, I cannot use filters and I cannot reduce the number of key figures. Is there any solution for this? Please give me your valuable suggestions for this query.

    If the query is too large and you are running out of memory, then either run the report with a smaller selection (maybe year-wise) or increase your server parameters, such as memory space.

  • Is there a reliable method for detecting that a query is too large?

    I am writing some code (that uses OCI) to properly detect when a query string is too long for OCI and/or the Oracle database server. I can't find any specific error-code information in the docs, so I just started firing off large queries to see what would happen.
    The queries I am sending are >2 MB in size, up to 16 MB. If the queries are above ~10 MB, I get the error "ORA-03113 'End-of-file on communications channel'" after a fairly short amount of time (i.e. not enough for a timeout to expire). If the queries are below ~10 MB but above ~2.5 MB, it just sits there and does not do anything (for more than 15 hours). So watching for ORA-03113 when executing large queries does not seem like a very reliable method for detecting queries that are too large.
    Does anyone know of a reliable way of detecting that a query is too large for the OCI client and/or the Oracle database server?
    I am using version 10.2.0.1 for both the OCI client and the Oracle database server, but I'm getting similar errors for combinations of 10.2.0.1 and 9.2.0.7 on both the client and server sides.
    These large queries need to be handled properly (i.e. distinguished from some generic failure) because the server handles requests from users, which could be programs that generate SQL queries (and have constructed huge ones in the past).
    Thanks for any information!

    ORA-03113 means that the Oracle server process has died trying to satisfy your request. In almost all cases it is strictly correct to call that a bug, and we shouldn't easily forgive the server process when it happens. But in the case of multi-megabyte statements my anger and disappointment turns to sympathy, for in my heart I can't bring myself to blame it. Can you?
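    One pragmatic workaround, not from this thread, is to enforce your own limit on generated statements before they are handed to the client library, so oversized queries fail with a clear application-level error instead of ORA-03113 or a hang. A minimal sketch in Java; the 2 MB threshold is purely illustrative and would need to be tuned to what your server actually tolerates:

        import java.nio.charset.StandardCharsets;

        // Pre-flight guard: reject generated SQL above a configured byte limit before
        // it is ever sent, rather than relying on ORA-03113 or a hang to signal failure.
        public class QuerySizeGuard {
            private final long maxBytes;

            public QuerySizeGuard(long maxBytes) {
                this.maxBytes = maxBytes;
            }

            public void check(String sql) {
                long size = sql.getBytes(StandardCharsets.UTF_8).length;
                if (size > maxBytes) {
                    throw new IllegalArgumentException("Generated query is " + size
                        + " bytes, which exceeds the configured limit of " + maxBytes
                        + " bytes; refusing to send it to the server.");
                }
            }

            public static void main(String[] args) {
                QuerySizeGuard guard = new QuerySizeGuard(2L * 1024 * 1024); // ~2 MB, illustrative
                guard.check("SELECT 1 FROM DUAL"); // passes; a 16 MB generated statement would not
            }
        }

    The same check can be written in C in front of the OCI prepare/execute calls; the point is that the limit is enforced by the application with a distinguishable error, rather than discovered via a dropped connection.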

  • Query is too large

    Hi,
    Can anyone give me a solution for developing this report?
    I have 12 columns under each vertical, and I have 12 verticals for my gross margin report.
    That makes 12 * 12 = 144 columns in total.
    But BI BEx allows only a limited number of columns. When I execute the report, I get the error that the query is too large. How can I overcome this?
    Note points:
    Vertical 1: ITG (20 profit centers)
    Vertical 2: UTG (60 profit centers)
    Vertical 3: SBI (50 profit centers)
    Vertical 4: SERVICEDIVISIONS (10 profit centers)
    Vertical 5: ENGG (50 profit centers)
    Vertical 6: HITECH (10 profit centers)
    • Data before eliminations is the total of all the verticals' data.
    • Eliminations contain values for each month for any of the above profit centers.
    • Data after eliminations is data before eliminations + eliminations data.
    I searched SDN but had no luck.
    Thanks

    Hi Sony,
    One way to go about this is to increase the "Size Restriction for Result Sets". This is pretty tricky; I wouldn't really recommend it, since it can have performance implications.
    You can increase the size of result sets by running the ABAP report SAP_RSADMIN_MAINTAIN. Again, be careful when you do this.
    Follow the steps below.
    1) Go to transaction SA38 or SE38.
    2) In the program field, enter the report name SAP_RSADMIN_MAINTAIN and choose Execute.
    3) For OBJECT, enter one of the following parameters:
            BICS_DA_RESULT_SET_LIMIT_DEF
            BICS_DA_RESULT_SET_LIMIT_MAX
    4) For VALUE, enter the value for the size of the result set, and then execute the program.
    Also, for your information, the default setting for the result set of Web applications is 500,000 cells. If the result set exceeds the size specified using the report, a message is displayed to inform the user.
    I would again ask you to be careful when you modify these values.
    It would be best to check the table RSADMIN to see whether you already have the above entries (BICS_*); if you don't, you can create them in the system via the above-mentioned ABAP report (see the illustrative entries below).
    Hope this helps.
    Regards,
    Sree.
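    For illustration only, the RSADMIN entries maintained by that report end up looking roughly like this; the 500,000 figure is the documented default mentioned above, while the 1,000,000 figure is just a made-up example of a raised limit, not a recommendation:

        OBJECT: BICS_DA_RESULT_SET_LIMIT_DEF    VALUE: 500000
        OBJECT: BICS_DA_RESULT_SET_LIMIT_MAX    VALUE: 1000000

    Whatever values you pick, test the query afterwards and watch memory consumption, because the safety belt exists precisely to protect the application server.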

  • Query is allocating too large memory

    I’m building an Analysis in OBIEE against an ASO cube and am seeing the following error:
    Query is allocating too large memory ( > 4GB) and cannot be executed. Query allocation exceeds allocation limits
    The report we’re trying to build is intended to show information from eight dimensions. However, when I try to add just a few of the dimensions, we get the “Query is allocating too large memory” error. Even if I filter the information down so that I have only 1 or 2 rows in the Analysis, I get the error. Something seems to be causing our queries to become very bloated. We're using OBIEE 11.1.1.6.0.
    Any help would be appreciated.

    950121 wrote: (the question quoted above)

    Hi,
    This sounds like a known bug: Bug 13331507 - RFA: DEBUGGING 'QUERY IS ALLOCATING TOO LARGE MEMORY ( > 4GB)' FROM ESSBASE.
    Cause:
    A filter has been added on several lines in the 'Data Filters' tab of the 'User Permissions' screen in the Administration Tool (click the Manage and then Identity menu items). This causes the MDX filter statement to be added several times to the MDX issued to the underlying database, which in turn causes too much memory to be used in processing the request.
    Refer to Doc ID 1389873.1 on My Oracle Support for more information.

  • Query is allocating too large memory error (> 4GB) in Essbase 11.1.2

    Hi All,
    Currently we are preparing dashboards in OBIEE from Hyperion Essbase ASO (11.1.2) cubes. When we try to retrieve data with more attributes, we face the error below:
    "Odbc driver returned an error (SQLExecDirectW).
    Error Details
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 96002] Essbase Error: Internal error: Query is allocating too large memory ( > 4GB) and cannot be executed. Query allocation exceeds allocation limits. (HY000)"
    Currently our data file size is less than 2 GB, so we are using "Pending Cache Size=64MB".
    Please let me know which memory setting I have to increase to resolve this issue.
    Thanks,
    SatyaB

    Hi,
    Do you have any dynamic hierarchies? What is the size of the data set?
    Thanks,
    Nathan

  • Reg: Query is too large

    Dear all,
    I am facing an error when trying to save the query, "Error: query is too large", and below is the help text given in the documentation.
    Diagnosis
    Query ZYYYYYYY contains 8192 differing selection cells. However, only 8191 selection cells can be processed in a query.
    Procedure
    Please simplify query definition ZYYYYYYYY.

    Hi,
    This happens when the maximum number of key figures is used in the rows. You might be using only one selection or a formula in the rows of the query and still face the same error; the reason is that this selection or formula in turn contains a lot of RKFs or CKFs.
    To put it clearly, say you use a selection named ZTEST in the rows, and the definition of this selection is something like ZTEST = CKF1 * CKF2 * CKF3,
    and CKF1 = CKF4 * CKF5 * CKF6 + CKF7,
    and CKF2 = CKF8 * CKF9 * CKF10 / CKF11,
    and CKF3 = CKF12 * CKF13,
    and further, if you look at CKF4, CKF5, ..., CKF13, each will again be a combination of multiple CKFs or RKFs. This goes on, and finally ZTEST becomes very complex indeed and uses more than 8191 selections. (A small worked example follows below.)
    I hope this is the issue you are facing.
    Shamee
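    To put rough numbers on that (purely for illustration, not from this thread): if a structure has 100 elements and each element is a formula that ultimately expands into about 82 basic selections, the query already needs roughly 100 * 82 = 8,200 selection cells, which is over the 8,191 limit, even though the definition looks small in the Query Designer.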

  • SAP BI Query 'too large'

    Hi,
    I have a problem creating a special query in Query Designer. The query has two structures: one structure has 100 selections, the second structure has 132 selections. This query is to show 10 years of plan data per period and a detailed account structure. The customer is not willing to reduce the number of rows.
    When I try to check the query or execute the query, I get an error message:
    Query is too large. Query contains 8192 differing selection cells. However, only 8191 selection cells can be processed in a query.
    In fact the query has about 13,000 selection cells (100 x 132 = 13,200)!
    I cannot set up the query in a different way by using a drilldown by a period attribute or something similar, because the query should show actuals up to a certain period in a certain year, and from that period on, the rest of the ten years should show plan data for an entered plan version. The row structure should not show all accounts and nodes of a hierarchy but only certain nodes, so I cannot use a hierarchy here.
    Could you please tell me if there is a way to exceed the limit of 8,191 selection cells in a query?
    Many thanks in advance for your help,
    Arndt Fritzsche-Marx

    Hi Arndt,
    I have never had this problem before. However, you can customize the number of cells in a query yourself. Have you tried editing this number (as explained below)? Perhaps it helps. As I do not have access to a BI 7.0 system right now, I cannot check whether there are more settings that can be customized.
    You can customize the maximum number of rows that can be displayed on the web. This setting can be changed as follows in BEx Web Analyzer for the query (execute your query from Query Designer and it will open in BEx Web Analyzer):
    1. Settings --> Data Provider
    2. Size Restriction for Result Sets
    3. Set the setting to "Custom-Defined No. of Cells" and enter your required "Number of cells"
    However, be aware that displaying such a large number of rows has a significant impact on the performance of the query.
    Brgds,
    Marcel

  • Financial Reports 9.3.1 - Query is too large and cannot be executed

    Hi,
    I'm trying to pull a Financial Reporting report from one of my ASO cubes (version 9.3.1) and received the following error:
    "Query is too large and cannot be executed. The product of member counts across all dimensions in the query exceeds 2^64..."
    The report pulls from a combination of 6 different cross-dimensions and drills down to the bottom level 0. I have read a couple of blogs that suggest changing the buffer and sort buffer size from 10 KB to something like 1000 KB in the database settings, as well as the pending cache size limit from 32 MB to 64 MB in the application settings. I've tried all combinations, but nothing seems to work.
    However, I was able to retrieve the same combinations through the Essbase Excel Add-in by adding MAXFORMULACACHESIZE to the Essbase config settings, but this didn't work for Financial Reports. I know that version 11.1.2 has the option to enable MDX queries for reports, which may help with the query. Unfortunately, that MDX query option is not available in version 9.3.1.
    Please help.
    Thanks

    John,
    The database has data that is rolled up to all levels.
    I can open the report for editing in Financial Reporting Studio. This allows me to click on members in the grid; however, when I search for members I am unable to do so. I can drill down to a member and select it, but when I run the report in FR Studio, I get the same error message, 'Error: Invalid Report Object'.
    Thanks and regards,
    Kunal Tripathi
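    For reference, the MAXFORMULACACHESIZE setting mentioned above is a line in essbase.cfg, with the value in KB; the number below is only an illustration, not a recommendation from this thread, and the Essbase server needs a restart to pick it up:

        MAXFORMULACACHESIZE 10240

    As noted above, this helped the Excel Add-in retrieval but not Financial Reports, so it may not be sufficient on its own.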
