BEx: Too many calculations in a formula (). No more than 2000 allowed

Hi Experts,
I am working on a really huge BEx report with 150+ rows and 10 columns. I have written an enormous number of formulas as well as selections inside cells. Now, while generating the report in RSRT (with the technical information), I am receiving this particular error.
Too many calculations in a formula (). No more than 2000 allowed.
Message no. BRAIN538
Now my question is: will it be OK if I simply reduce the number of formulas being used, or is the error message talking about the number of variables used in all formulas together? Please suggest.
Regards,
Arun.

Hi,
Note 1394944 doesn't solve the issue, but it gives information on what actually happened. In my case the note was already applied in our BI version. The issue is that BEx allows a maximum of 2000 variables across the formulas in a query, so I had to restructure the whole query.
Regards,
Arun.

Similar Messages

  • My mac will not copy more than one file at a time and gets locked up if the file is too large, my mac will not copy more than one file at a time and gets locked up if the file is too large

    my mac will not copy more than one file at a time and gets locked up if the file is too large, my mac will not copy more than one file at a time and gets locked up if the file is too large

    So now that you have repeated the same thing three times that doesn't make things any clearer at all.
    You are copying files from where to where?
    How are you attempting to copy files, software or click and drag?
    Any other detail would be helpful.
    Allan

  • Depreciation calculated on credit memo is more than its own value

    Hi,
    We have noticed that the depreciation calculated on the credit memo is more than its own value.
    Please refer to the screenshot below:
    24.05.2010     108,044.64     120     Goods receipt     108,044.64-     0     USD
    24.05.2010     108,044.64     120     Goods receipt     108,044.64-     0     USD
    24.05.2010     108,044.64     120     Goods receipt     108,044.64-     0     USD
    09.06.2010     128,064.75     120     Goods receipt     128,064.75-     0     USD
    22.06.2010     110,601.38     120     Goods receipt     110,601.38-     0     USD
    22.06.2010     7,346.35-     105     Credit memo in acquis. year     45,337.47     0     USD
    22.06.2010     7,346.35     100     External asset acquisition     45,337.47-     0     USD
    Please let us know the possible reasons...
    Thanks a lot..

    Hi,
    Here are the details:
    Depreciation start date: 21.11.2004
    Operating readiness: 23.12.2004
    Asset value date: 16.08.2010
    useful life: 01 period
    Kindly let me know in case you need further info.
    Thanks..
    Edited by: FI User on Oct 21, 2010 9:47 AM

  • Run the same procedure many times in parallel but no more than 10 at a time

    I have a table filled with individual search criteria and want to run a stored PL/SQL procedure that performs a search on an external server for each of them. The stored procedure deals with the API, and the external server can handle the concurrency.
    I want to be able to run the multiple searches concurrently (i.e. run the stored procedure concurrently), but no more than 10 concurrent searches at a time (there are 20,000 searches to complete).
    I'm thinking this should involve AQ whose messages call "run once" event based jobs from the Scheduler (I'm running on 10g) but I am very new to Oracle and hence not sure where to begin.
    If somebody could fill in my obvious blanks and give me an approach that will actually achieve my end I would be extremely grateful!
    Cheers, Kim.

    > I'm hoping that each of my "spawned threads" (Jobs) will insert all outputs into the same table (I don't mind them waiting on each other as it is the searching that is the real bottleneck). Is this going to be a spanner in the works?
    Inserts into the same table are not a problem. The typical problem is spawning multiple copies of the same PL/SQL proc to update the same table. Somehow each PL/SQL proc needs to find itself a unique set of rows to lock and process - without impacting its fellow processes. Using ROWNUM ranges, for example, does not work in this case.
    The search implies I/O. This is good and bad.
    Good in that the only real way to speed up a ton of I/O (assuming that the ton of I/O is non-negotiable and must happen) is to perform the I/O in parallel. I/O is the slowest operation on a db platform due to I/O latency. And this parallel I/O processing is exactly what Oracle does with Parallel Query (PQ).
    The bad is that more processes means a bigger resource footprint and one needs to figure out just what the optimal size is.
    Oracle PQ has a very simplistic mechanism - it allows you to specify the number of PQ processes that can be spawned (and if on a cluster, on how many nodes) per table. It also allows you to specify a PQ process pool ceiling.
    But this is fairly primitive as you still need to figure out (in some way) what the optimal PQ sizes are per table.
    The bad is that you either need to do something primitive and similar to PQ - which means manual monitoring and tuning - or you need to try and automate it. That can be very interesting in itself, but complex to develop and often quite difficult to get right.
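    To illustrate the Scheduler route mentioned in the question, here is a minimal, untested sketch that keeps at most 10 one-off jobs running at a time. The queue table SEARCH_QUEUE, its columns, the procedure RUN_SEARCH and the SEARCH_JOB_ name prefix are all hypothetical placeholders, not objects from this thread:

    DECLARE
      l_running PLS_INTEGER;
    BEGIN
      FOR r IN (SELECT criteria_id FROM search_queue WHERE status = 'PENDING') LOOP
        -- Wait until fewer than 10 of our jobs are running.
        LOOP
          SELECT COUNT(*)
            INTO l_running
            FROM user_scheduler_jobs
           WHERE job_name LIKE 'SEARCH_JOB_%'
             AND state = 'RUNNING';
          EXIT WHEN l_running < 10;
          DBMS_LOCK.SLEEP(5);   -- needs EXECUTE on DBMS_LOCK
        END LOOP;

        -- One-off job; auto_drop => TRUE removes it once it has run.
        DBMS_SCHEDULER.CREATE_JOB(
          job_name   => 'SEARCH_JOB_' || r.criteria_id,
          job_type   => 'PLSQL_BLOCK',
          job_action => 'BEGIN run_search(' || r.criteria_id || '); END;',
          enabled    => TRUE,
          auto_drop  => TRUE);

        UPDATE search_queue SET status = 'SUBMITTED' WHERE criteria_id = r.criteria_id;
        COMMIT;
      END LOOP;
    END;
    /

    AQ with event-based jobs, as suggested, would let the database push the work instead of this polling loop; either way the key point from the reply above still applies: each job must pick up its own distinct rows.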

  • Bex Query: Too many table names in the query The maximum allowable is 256

    Hi Experts,
    I need your help. I'm working on a query using a MultiProvider over 2 DataStores. I need to work with cells to assign specific account values to specific rows and columns, so I was creating a structure with elements from a hierarchy, but I get this error when I'm halfway through the structure:
    "Too many table names in the query. The maximum allowable is 256.Incorrect syntax near ')'.Incorrect syntax near 'O1'."
    Any idea what is happening? Is it possible to fix it? Do I need to ask for a modification of my InfoProviders? Someone told me it is possible to combine 2 queries, is that true?
    Thanks a lot for your time and patience.

    Hi,
    The maximum allowable limit of 256 holds true. It is the maximum number of characteristics and key figures that can be used on the column side. While creating a structure, you create key figures (restricted or calculated), formulas, etc. The objects used to create these should not number more than 256.
    http://help.sap.com/saphelp_nw70/helpdata/EN/4d/e2bebb41da1d42917100471b364efa/frameset.htm
    Not sure if a combination of 2 queries is possible. You can use RRI, or have a workbook with 2 queries.
    Hope it helps.

  • "java.io.IOException: Too many open files"  in LinuX

    Hi Developers,
    * I am continuously running and processing more than 2000 XML files using SAX and DOM.
    * My process is as follows:
    - Converting each XML file into a Document object via DOM.
    - That DOM is then used while creating the log file report, which is created after all the XML files have been processed.
    * After processing approx 1000 files, it throws *"java.io.IOException: Too many open files"* on the Linux system.
    * I have googled a lot, on all sites including the Sun forum, but they only say to increase the system limit with ULIMIT in Linux. If I increase that, it runs fine without the exception.
    * My question is: is it possible to handle this from the Java code itself, or with other VM arguments like -Xms512m and -Xmx512m?
    * Please let me know if you have any idea.
    Thanks And Regards,
    JavaImran

    Doh! I forgot to post my little code sample...
    package forums.crap;
    import java.io.*;
    import java.util.*;
    public class TooManyFileHandles
    {
      private static final int HOW_MANY = 8 * 1024;
      public static void main(String[] args) {
        List<PrintWriter> writers = new ArrayList<PrintWriter>(HOW_MANY);
        try {
          try {
            // Open far more files than the default per-process handle limit allows.
            for (int i = 1; i <= HOW_MANY; i++) {
              writers.add(new PrintWriter("file" + i + ".txt"));
            }
          } finally {
            // Always release the handles, even if opening failed part-way through.
            for (PrintWriter w : writers) {
              if (w != null) w.close();
            }
          }
        } catch (Exception e) {
          e.printStackTrace();
        }
      }
    }
    ... and the problem still isn't OOME ;-)
    Cheers. Keith.

  • Send/receive issue after adding too many additional mailboxes

    Hello Everyone,
    I am trying to solve a problem with Outlook 2010, together with Exchange 2007 (SBS 2008 server). I am not sure if I should post this in the Exchange forum, or the Outlook 2010 section.
    The inbox (or actually all the folders) is not updating when I add too many additional mailboxes via Account Settings > More Settings > Advanced tab.
    When I keep it down to 4 mailboxes, all seems fine. When I add a fifth mailbox, it no longer syncs automatically. The user has to manually select Send/Receive in order to receive mail.
    What i tried:
    1) Fixing OST
    2) Removing OST, create a new one.
    3) Create a new outlook profile
    4) Tried /cleanviews and /safe switches
    5) Checked all Office updates
    6) Tried different Send/Receive group configurations (although it is my understanding this should not apply to Exchange)
    7) Disabling cache (not an option, Outlook crashes after a couple of minutes because of the large size of the additional mailboxes)
    8) Disabling "Download Shared Folders" in Account Settings>More Settings>Advanced Tab
    Could anyone assist with this problem? Thanks in advance!
    Regards, Anton

    Hi Anton,
    As far as I know, Outlook 2010 allows up to 5 Exchange accounts to be configured in a single mail profile by default. But the limit can be changed, up to 15 accounts, by adding a registry key:
    HKEY_CURRENT_USER\Software\Policies\Microsoft\Exchange
    DWORD: MaxNumExchange
    Value: a number between 1 and 15
    Please try adding this key and let me know the result.
    Regards,
    Steve Fan
    TechNet Community Support
    It's recommended to download and install the Office Configuration Analyzer Tool (OffCAT), which is developed by Microsoft Support teams. Once the tool is installed, you can run it at any time to scan for hundreds of known issues in Office programs.

  • HTTP-400 Too many arguments passed

    Hello There,
    I installed a fresh 10g and applied patchset 2 on it. I get the following error:
    HTTP-400 Too many arguments passed upper limit is 2000
    Please help

    Hello Qiang,
    Little more detail for your reference.
    README for 3355915
    MODPLSQL VERSION 9.0.2.6.0A ONE-OFF PATCH
    Date: January 6, 2004
    o What is in this patch?
    -> FIXES FOR:
    Bug#3355915 : BACKPORT BUG#3329645
    Bug#3116388 : TASK: MOD_PLSQL DOES NOT ALLOW PASSING MORE THAN 2000 PARAMETERS
    Bug#3024540 : NON-SUCCESS CODES WITH NO CONTENT SHOULD SERVE OHS ERRORDOCUMENT
    Bug#2023253 : NLS:FILE IS NOT PROPERLY UPLOADED IF IT'S FILENAME CONTAINS 0X5C
    Bug#3329645 : MODPLSQL IS UNABLE TO PASS PARAMETER LESS THAN 32K BYTES
    Bug#3264502 : DOC PATH REQUEST WITH PERPACKAGE AUTH CALL WRONG AUTH FUNCTION
    Bug#2802064 : ORA-01036 ILLEGAL VAR NAME/NUMBER USING VARIABLE NAMED P_ARRAY
    o What version can I apply this patch on?
    -> iAS 9.0.2.x.x

  • We can't sign you in. Looks like you're already signed in on too many devices

    Every time I open the Lync 2013 client on mobile I get this message:
    "We can't sign you in. Looks like you're already signed in on too many devices"
    I tried to increase the maximum endpoints allowed for the users, but I still get this message for all the users who are using Lync 2013 on mobile. Sometimes we get the same message for the Lync client for PC.
    Does anyone know how to stop this? Because every time this message pops up, the Lync client signs out!

    Hi,
    Please try to restart the Lync FE server and BE server to check the issue again.
    Sean Xiao
    TechNet Community Support

  • How can I return files in Trash, more than 1 at a time? I can place many files into Trash at a time.

    I placed many unix.exe files into Trash, more than one at a time. I recently updated QuickTime and the Mac version of Silverlight (the name might not be totally correct). I don't know if removing these files can impede my iMac 2008 running OS 10.6.8 (updated Snow Leopard).
    I know I can return one file from Trash at a time. When I try to return more than one (mouse click and drag over 2 or more files), the 'return' option isn't listed. Is there a way I can either 'return all' or return more than one file at a time?

    Only one at a time. But where are they from? Did you delete system files, or are these Windows files (.exe is a Windows extension)?

  • Value too large for "BPM_AUDIT_QUERY"."AUDIT_LOG"(real: 2077, max: 2000)

    Hello everybody:
    I really need some help. I've modeled and implemented a BPMN process. I'm trying to test it but I got the following error.
    ORA-12899: value too large for column "DEV1_SOAINFRA"."BPM_AUDIT_QUERY"."AUDIT_LOG" (actual: 2077, maximum: 2000)
    I went to the database and looked at the mentioned column ("DEV1_SOAINFRA"."BPM_AUDIT_QUERY"."AUDIT_LOG"); it is of RAW type and no size was shown. I suppose it should be 2000 by default.
    I don't know how to increase the size of that column to more than 2000.
    Any help, or advice, will be welcomed.
    Regards,
    isabelbernely

    Rob
    Looks like a bug to me, but these two may give some insight...
    Unicode problem and ORA-12899 error!
    Re: Callback failure..
    Pete
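    For what it's worth, one way to see what that column is really declared as (rather than guessing at a 2000-byte default) is to ask the data dictionary. A minimal sketch, assuming your user can see the DEV1_SOAINFRA schema:

    -- Show the actual data type and declared length behind the ORA-12899 message.
    SELECT owner, table_name, column_name, data_type, data_length
      FROM all_tab_columns
     WHERE owner       = 'DEV1_SOAINFRA'
       AND table_name  = 'BPM_AUDIT_QUERY'
       AND column_name = 'AUDIT_LOG';

    If it really is RAW(2000), widening it would mean altering a product (SOAINFRA) schema, so it is probably safer to treat this as the bug suggested above and raise it with Oracle Support rather than change the column directly.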

  • Please allow more than 13 digits in the financial report calculation

    Dear SAP development team,
    I would like to suggest that the programming code or query (or whatever else is used to generate the financial reports - TB, P&L, BS) allow the calculation of debits or credits greater than 9223372036854. It made me laugh that this is not allowed.
    Why do I say so? Here are the reasons:
    1. The report data is not stored in a specific table.
    2. If the report calculates a debit or credit greater than 9223372036854 while report processing is taking place, the report should still appear, but with zero shown in any debit or credit amount that exceeds 9223372036854.
    I expect that we could still use the report even if a money overflow happened.
    John

    Hi, Yang
    My colleagues came across that problem recently. I advised them to split the transaction and create several transactions instead, thereby avoiding amounts of more than 13 digits.
    Another way that came to my mind is to create a new currency with an exchange rate of (say) 1:1000 and create the transaction using this currency.
    Can't say that either solution is perfect, but I am afraid these are the only ones.
    Konstantin.
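    For what it's worth, that oddly specific ceiling looks like a 64-bit storage limit rather than a report-layout choice (an assumption about the internals, not something stated in this thread):

    2^63 = 9,223,372,036,854,775,808
    2^63 / 10^6 = 9,223,372,036,854.775808  (i.e. 9,223,372,036,854 once six decimal places are reserved)

    So amounts appear to be held as 64-bit integers with six implied decimal places, which is why splitting the transaction, or scaling it through a 1:1000 currency as suggested above, works around the limit.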

  • Function error:  Too many records

    I am writing a function that needs to return the total count of a SQL statement. It will divide two calculated columns to get an average. I have two versions. Version 1 compiled successfully, and I am trying to either run it in Reports or in the database and call it. I get an error stating that the function returns too many records. I understand that is a rule for stored functions, but how can I modify the code to get it to return one value each time it is called?
    Here is the main calculation. SUM(date1-date2) / (date1-date2) = Avg of Days
    version1:
    CREATE OR REPLACE FUNCTION CALC_OVER_AGE
      RETURN NUMBER
    IS
      days_between NUMBER;
      days_over    NUMBER;
    BEGIN
      SELECT (determination_dt - Filed_dt), SUM(determination_dt - Filed_dt)
        INTO days_between, days_over
        FROM w_all_cases_mv
       WHERE (determination_dt - Filed_dt) > 60
         AND ;
      RETURN (days_between / days_over);
    END CALC_OVER_AGE;
    version2:
    CREATE OR REPLACE FUNCTION CALC_OVER_AGE (pCaseType VARCHAR2)
      RETURN PLS_INTEGER
    IS
      v_days_between W_ALL_CASES_MV.DAYS_BETWEEN%TYPE;
      v_total        NUMBER;
      days_over      NUMBER;
      i              PLS_INTEGER;
    BEGIN
      SELECT COUNT(*)
        INTO i
        FROM tab
       WHERE case_type_cd = pCaseType
         AND determination_dt - Filed_dt > 60;
      IF i <> 0 THEN
        SELECT SUM(determination_dt - Filed_dt), days_between
          INTO v_total, v_days_between
          FROM tab
         WHERE determination_dt - Filed_dt > 60;
        RETURN v_total / v_days_between;
      ELSE
        RETURN 0;
      END IF;
    EXCEPTION
      WHEN OTHERS THEN
        RETURN 0;
    END CALC_OVER_AGE;

    Table Structure:
    WB_CASE_NR NUMBER(10)
    RESPONDENT_TYPE_CD VARCHAR2(10)
    INV_LOCAL_CASE_NR VARCHAR2(14)
    CASE_TYPE_CD VARCHAR2(10)
    FILED_DT DATE
    FINAL_DTRMNTN_DT DATE
    REPORTING_NR VARCHAR2(7)
    INVESTIGATOR_NAME VARCHAR2(22)
    OSHA_CD VARCHAR2(5)
    FEDERAL_STATE VARCHAR2(1)
    RESPONDENT_NM VARCHAR2(100)
    DAYS_BETWEEN NUMBER
    LAST_NM VARCHAR2(20)
    FIRST_NM VARCHAR2(20)
    DETERMINATION_DT DATE
    DETERMINATION_TYPE_CD VARCHAR2(2)
    FINAL_IND_CD VARCHAR2(1)
    DESCRIPTION VARCHAR2(400)
    DETERMINATION_ID NUMBER(10)
    ALLEGATION_CD VARCHAR2(1)
    ALGDESCRIPTION VARCHAR2(50)
    The output is for Reports. I am trying to get the last calculation, which is the Average Days. The report is grouped on Case Type and has several bucket counts for each, like this.
    Case Type   Count All   Completed   Pending   Over Age   Avg Days
    A                   5           3         4          2         15
    Z                  10           7         6          3         30
    4                   8           3         5          4         22
    To get the Avg Days, the Over Age calculation is used as well as the Days Between (Determination_Dt - Filed_Dt). That is the (date1-date2) example that I gave in my first post. So the calculation would be SUM(Days_Between) / Days_Between.
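    For what it's worth, here is a minimal, untested sketch of one way to get a single value back per call, based only on the W_ALL_CASES_MV columns listed above and on one reading of the requirement. AVG() already divides the summed day counts by the number of qualifying rows, so the SELECT INTO only ever fetches a single row:

    CREATE OR REPLACE FUNCTION CALC_OVER_AGE (pCaseType IN VARCHAR2)
      RETURN NUMBER
    IS
      v_avg_days NUMBER;
    BEGIN
      -- Average days between filing and determination for over-age cases
      -- (more than 60 days) of the given case type.
      SELECT AVG(determination_dt - Filed_dt)
        INTO v_avg_days
        FROM w_all_cases_mv
       WHERE case_type_cd = pCaseType
         AND determination_dt - Filed_dt > 60;
      RETURN NVL(v_avg_days, 0);
    END CALC_OVER_AGE;
    /

    It can then be called once per case type from the report query, e.g. SELECT case_type_cd, CALC_OVER_AGE(case_type_cd) FROM w_all_cases_mv GROUP BY case_type_cd, or used as a formula column in Reports.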

  • Result set does not fit; it contains too many rows

    Dear All,
    We are on BI7 and running reports in Excel 2007. Even though the row limit in Excel 2007 is more than 1 million, when I try to execute a report with more than 65k records of output, the system generates output only for 65k rows, with the message "Result set does not fit; it contains too many rows".
    Our Patch levels:
    GUI - 7.10
    Patch level is 11
    Is there any way to generate more than 65000 rows in BEx?
    Thanks in advance...
    regards,
    Raju
    Dear Gurus,
    Could you please shed some light on this issue?
    thanks and regards,
    Raju
    Edited by: VaraPrasadraju Potturi on Apr 14, 2009 3:13 AM

    Vara Prasad,
    This has been discussed on the forums - for reasons of backward compatibility I do not think BEx supports more than 65000 rows. I am still not sure about this, since I have not tried a query with more than 65K rows on Excel 2007, but I think it is not possible...

  • Too many lines

    When opening an Excel file I received as an attachment and want to open it in Numbers, I either get major lag viewing and editing the workbook, or I receive a message that an error occurred because there are "too many lines" and the workbook won't even import into Numbers.
    I have a "new" iPad (3rd gen).
    Apple told me it is because I only have 512 MB of RAM, but I understood that the new iPad had 1 GB?
    Is Numbers just not strong enough?
    Also, when I do get files to import, fonts, formulas and formats are commonly not supported, and it negates being able to use the iPad to work on a workbook remotely and send it forward.
    Are there any fixes?

    This is the Oracle Forms forum. Your chances increase by going to the right forum.
    Gerd
