Data load and calc script

Hi friends,
In my cube I have one dimension in which:
1) all level 0 members have the consolidation operator ~
2) level 1 members also have the consolidation operator ~
3) but level 3 members have the consolidation operator +

If I use the following calc script, will it affect the dimension above?

SET UPDATECALC OFF;
SET AGGMISSG ON;
CALC DIM (Product);

Product is another dimension in the cube. My questions are:
1) If I use SET AGGMISSG ON, will it affect other dimensions that are not calculated in that calculation script?
2) I am loading data at both upper and lower levels of the Led dimension, so does the above calc script have any impact on the Led dimension?

Hi,
Have you tried it? What happened?

In my opinion, "agg missing" has nothing to do with the consolidation operator (~). When (~) is used, the first child's value is carried up to the parent. So whether the consolidation operator is (~) or (+), the parent value will be overwritten when agg missing is used.

Also bear in mind that if, in the same dimension, data is loaded at both child and upper-level parent, the parent value will likewise be overwritten when agg missing is used.

I suggest you create a simple sample with only 2 dimensions and try it out!

By the way, I always use SET UPDATECALC OFF; as the first command in a calc script, to turn off intelligent calculation.

Hope this helps,
Grofaty
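For reference, here is a minimal sketch of the kind of two-dimension test suggested above, using the documented command spellings (SET AGGMISSG, CALC DIM). The dimension name Product comes from the question itself; the comments state only what the documentation says these commands do:

SET UPDATECALC OFF;   /* turn off intelligent calculation */
SET AGGMISSG ON;      /* parents are overwritten by the rollup even when all children are #Missing */
CALC DIM (Product);   /* only Product is consolidated; dimensions not listed here are not calculated */

Running this once with AGGMISSG ON and once with it OFF against the same loaded data, then retrieving the upper-level values of the Led dimension, will show the overwrite behaviour Grofaty describes.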

Similar Messages

  • After loading and calc'ing the database, error in data preview

    I am doing my development on my laptop and then transferring the objects to the server.
    I have loaded and calc'd about 4 years of data on both my laptop and the server.
    When I try to preview the data on my laptop, I get this error:
    Duplicate Member names with data preview are not supported
    Is this because I have clicked Allow Duplicate Members? Because I do not have any duplicates in my outline.
    And on the server, the error I am getting is:
    Cannot connect to the OLAP service
    Cannot connect to Essbase Server
    Essbase Error 1051293
    Login Failed
    But I am connected and logged in to the server.
    Please advise

    I tried to build a new application and did not click on the duplicate member option!
    I guess the duplicate member tag is connected to the properties of the .OTL file, even if I do not click on Duplicate Members when creating an application.
    Is there a way to bring over the .OTL and change the property?
    Please advise

  • Optimizing data load and calculation

    Hi,
    I have a cube that takes more than 2 hours to load and more than 3 hours to calculate (at its fastest build). There are times when my cube loads and calculates for more than 8 hours. My calculation only uses CALC ALL. I am very new to Essbase and couldn't find a way to minimize the build time of my cube.
    Can anybody help? Here are some stats about my cube. I hope this helps.
    Dimension Name      Type    Declared Size  Actual Size  Tags
    ============================================================
    ALL_ACCOUNTS        DENSE   7038           6141         Accounts <5> (Dynamic Calc)
    ALL_LEDGERS         SPARSE  4              3            <1> (Label Only)
    ALL_YEARS           SPARSE  3              1            <1> (Label Only)
    ALL_MONTHS          SPARSE  22             22           Time <7> (Active Dynamic Time Series Members: Y-T-D, Q-T-D)
    ALL_FUNCTIONS       SPARSE  55             54           <9>
    ALL_AFFILIATES      SPARSE  715            696          <4>
    ALL_BUSINESS_UNITS  SPARSE  452            440          <3>
    ALL_MCC             SPARSE  1557           1536         <3>
    Any suggestions would be greatly appreciated.
    Thanks!
    Joe

    Joe,
    There are too many potential optimizations to list and not enough detail to make any one or two suggestions. I can see some potential areas from improvemt, but your best bet is to bring in a knowledgable consultant for a couple of days to review the cube and make changes. For example, at one client, I made changes that brought a calculation down from 4 + hours to 5 minutes. It took changes to load rules, calc scripts and how they loaded their data. So it was not one thing, but mutiple changes.
    If you look at Jason's Hyperion Blog http://www.jasonwjones.com/?m=200908 , he describes taking a calculation down from 20 minutes to a few seconds. Again, nat a single change, but a combination.
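    To give a flavour of the kind of change involved: going by the statistics above, ALL_ACCOUNTS is dense with a Dynamic Calc top, and ALL_LEDGERS and ALL_YEARS are Label Only, so a targeted aggregation of the remaining sparse dimensions is often much faster than CALC ALL. This is only a sketch under those assumptions, not a drop-in replacement; note that AGG ignores member formulas, so any dimension that carries them would need CALC DIM instead:

    /* Sketch: aggregate only the sparse dimensions that need a rollup. */
    SET UPDATECALC OFF;
    SET CACHE HIGH;
    /* ALL_ACCOUNTS (dense, Dynamic Calc at the top) and the Label Only */
    /* dimensions ALL_LEDGERS and ALL_YEARS are deliberately left out.  */
    AGG (ALL_MONTHS, ALL_FUNCTIONS, ALL_AFFILIATES, ALL_BUSINESS_UNITS, ALL_MCC);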

  • Hyperion business rules and calc scripts

    Hi, can anyone differentiate between HBR and calc scripts? What advantage does HBR have over calc scripts? Replies will be highly appreciated.

    Hi,
    There are many; you can easily get the answer by reading through the documentation.
    The major difference is the runtime prompt in HBR, which differentiates it from a calc script.
    However, I recently learned that you can put runtime prompts in calc scripts using VBA macros.
    Good luck.
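    As a side note for readers on later releases: Essbase 11.1.2.2 and newer also let you declare runtime prompts directly in a calc script with SET RUNTIMESUBVARS, with no VBA involved. A minimal sketch, with hypothetical variable and member names:

    /* Requires Essbase 11.1.2.2 or later; all names here are made up. */
    SET RUNTIMESUBVARS
    {
       rtpScenario = "Actual";   /* default value; can be overridden at launch */
    };
    FIX (&rtpScenario)
       CALC DIM (Product);
    ENDFIX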

  • Selective data load and transformations

    Hi,
    Can you all please clarify this for me:
    1. Selective data load and transformations can be done in:
        A.     Data package
        B.     Source system
        C.     Routine
        D.     Transformation Library-formulas
        E.     BI7 rule details
        F.     Anywhere else?
    If the above is correct, what is the order, performance-wise?
    2. Can anyone tell me why not all the fields appear in the data package data selection tab, even though many are included in the DataSource and data target?
    Tks in advance
    Suneth

    Hi Wijey,
    1. If you are talking about selective data loads, you need to write an ABAP program in the InfoPackage for the field on which you want to select. The other way is to write a start routine in the transformations and delete all the records which you do not want. In the second method, you get all the data but delete the unwanted data, so that you process only the required data. Performance-wise, you need to observe: if the selection logic is complicated and takes a lot of time, the second option is better. Try both and decide for yourself which is better.
    2. Only the fields that are marked as available for selection in the DataSource are available as selections in the data package. That is how the system is.
    Thanks and Regards
    Subray Hegde

  • Load and calc - faster w/o attributes?

    Hello, I'm trying to optimize a cube that has 9 attribute dimensions. The largest base dimension, Customer, has almost 8000 stored members and has 6 attributes. Since I've heard of attributes slowing down performance, and not just on retrievals, I was thinking of having the automated dim build remove the attributes before each load and calc, then re-add them afterwards. Anyone have any thoughts on this? Thanks, Jared

    I have not noticed a measurable difference for a load and calc. I had heard the same thing, but my testing has shown no difference. Rich Sullivan - Beacon Analytics.

  • SQL LOADER and SHELL SCRIPT ISSUE

    Hello Guys,
    I know this is not the right forum, but I am not sure where I should post this.
    Please help.
    I am running a shell script which is giving me this error:
    Username:SQL*Loader-128: unable to begin a session
    ORA-01017: invalid username/password; logon denied
    SQL*Loader: Release 10.2.0.4.0 - Production on Thu Nov 19 13:02:04 2009
    Copyright (c) 1982, 2007, Oracle.  All rights reserved.
    SQL*Plus: Release 10.2.0.4.0 - Production on Thu Nov 19 13:02:06 2009
    Copyright (c) 1982, 2007, Oracle.  All Rights Reserved.
    Enter user-name:
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production With the Partitioning, OLAP, Data Mining and Real Application Testing options
    SQL>
    0 rows updated.
    Commit complete.
    SQL> SQL> Disconnected from Oracle Database 10g Enterprise Edition
    SQL> SQL> Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Thu Nov 19 13:02:06 EST 2009
    In the shell script I have used the same username and password which I am using from the command line.
    The shell script is calling SQL*Loader to load the file, and for that also the username and password are the same.
    I am able to run the sqlldr command from the command line; I don't know why it's giving an error here.
    Here is my shell script:
    set -a
    . $HOME/.edw.env
    . $admlib/checklib.sh
    LOGDIR=$admsrc/sigma6/ppadala/copg
    LOGFILE=${LOGDIR}/log/test`date '+%m%d'`.xtr
    DB_USER=copg
    DB_PWD=copg
    set +a
    cd $LOGDIR
    # Opening parenthesis restored: the body below is a subshell whose
    # output is redirected at the matching ")" near the end; the "(" was
    # evidently lost when the script was posted.
    (
    if test ! -f $admsrc/sigma6/ppadala/copg/DM_Daily_EFolderCloseCancel_Report_11192009.txt
    then
       echo "Error: DM_Daily_EFolderCloseCancel_Report_11192009.txt does not exist and/or is not a regular file." >> ${LOGFILE}
       exit 1
    fi
    echo 'End of Checking for the existence of the file - Successful'>> ${LOGFILE}
    # Note the trailing backslash: without it, the userid=... line runs as a
    # separate command, so sqlldr starts with no credentials, prompts for a
    # username and fails with ORA-01017 -- the exact error in the log above.
    sqlldr control=$admsrc/sigma6/ppadala/copg/Close_Cancle.ctl log=$admsrc/sigma6/ppadala/copg/Close_cancle.log \
    userid=${DB_USER}/${DB_PWD} silent=\(HEADER,FEEDBACK,DISCARDS\) >> ${LOGFILE} 2>&1
    case $? in
    0) :;;
    1|3) echo "Error: SQL Loader" >> ${LOGFILE}
         exit 1;;
    esac
    sqlplus << EOD
    ${DB_USER}/${DB_PWD}
    @Close_Cancle.sql
    EOD
    if [ $? -ne 0 ]
    then
        echo "Error: SQL Plus for script Processing" >> ${LOGFILE}
        echo "Resi Unit Scheduling Report Refresh failed" >> ${LOGFILE}
    fi
    ) > ${LOGFILE} 2>&1
    echo `date` >> ${LOGFILE}
    if [ -f ${LOGFILE} ]
    then
    mail -s "Resi Unit Scheduling" "[email protected]" < ${LOGFILE}
    sleep 3
    `ck_error ${LOGFILE}`
    fi
    Please help, guys.
    thanks

    Thanks for the reply.
    In Close_cancle.log it is also the same message which I posted:
    logon denied..............
    And this is the log file content when I do set -x on:
    + cd /u2144009/src/sigma6/ppadala/copg
    + test ! -f /u2144009/src/sigma6/ppadala/copg/DM_Daily_EFolderCloseCancel_Report_11192009.txt
    + echo End of Checking for the existence of the file - Successful
    + 1>> /u2144009/src/sigma6/ppadala/copg/log/test1119.xtr
    + sqlldr control=/u2144009/src/sigma6/ppadala/copg/Close_Cancle.ctl log=/u2144009/src/sigma6/ppadala/copg/Close_cancle.log
    Username:SQL*Loader-128: unable to begin a session
    ORA-01017: invalid username/password; logon denied
    SQL*Loader: Release 10.2.0.4.0 - Production on Thu Nov 19 17:32:17 2009
    Copyright (c) 1982, 2007, Oracle.  All rights reserved.
    + userid=copg/copg silent=(HEADER,FEEDBACK,DISCARDS)
    + 1>> /u2144009/src/sigma6/ppadala/copg/log/test1119.xtr 2>& 1
    + :
    + sqlplus
    + 0<<
    copg/copg
    @Close_Cancle.sql
    SQL*Plus: Release 10.2.0.4.0 - Production on Thu Nov 19 17:32:58 2009
    Copyright (c) 1982, 2007, Oracle.  All Rights Reserved.
    Enter user-name:
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production With the Partitioning, OLAP, Data Mining and Real Application Testing options
    SQL>
    0 rows updated.
    Commit complete.
    SQL> SQL> Disconnected from Oracle Database 10g Enterprise Edition
    SQL> SQL> Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    + [ 0 -ne 0 ]
    Thu Nov 19 17:32:59 EST 2009

  • Error while running Business Rules and Calc scripts

    We are trying to run a calc script in a report, and we received the error saying "An error occurred while running the specified calc script. Check the log for details"; we received a similar error when working with business rules.
    This started when we tried to point JVMMODULELOCATION in essbase.cfg to null from its existing link to a Java path.
    This error persists even after we changed the config file back to its original one.
    Hyperion 9.3.1
    Oracle 10g
    Weblogic 9
    Solaris 10 64 bit
    John, please help. I will provide you with the exact error messages. Please tell me what this means: "This started when we tried to point JVMMODULELOCATION in essbase.cfg to null from its existing link to a Java path."
    What is JVMMODULELOCATION for?
    Thank you,
    Rciky

    Hi,
    Here are details on what JVMMODULELOCATION is: http://download.oracle.com/docs/cd/E10530_01/doc/epm.931/html_esb_techref/techref.htm
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Flash xml data loading and unloading specs

    Hi, I am trying to get specification information that I cannot find anywhere else.
    I am working on a large Flash project, and I would like to load XML data into the same swf object/movieclip repeatedly.
    As I do not want the previously loaded items to unload, I need to know whether doing this will unload the items from the swf, or just keep them in the library so they can be reposted without reloading.
    I cannot find any supporting documentation either way that tells me whether, if I load new content into a clip (I am aware levels overwrite), it will or will not unload this content.
    Thanks in advance.
    mk

    This is awful for me: I can't even get the clip to duplicate, and I thought this would be the simplest solution to keeping everything cached for one page before and one page after the current one in the project.
    I have used a simpler clip to test the code and see if I am insane:
    duplicateMovieClip(_root.circle, "prv", 5);
    prv._x = 300;
    prv._y = 300;
    prv._visible = true;
    prv.startDrag();
    This ALWAYS works when I use the _root.circle clip of a simple green circle,
    BUT
    when I change it to my main movie clip (which is loaded AND on screen), it just doesn't duplicate at all! I've even triggered it to go play frame 2, JUST IN CASE.
    I've even set visibility to true, JUST IN CASE.
    I.e., all I do is change _root.circle to _root.cur
    and .... nada.
    AND _root.cur IS DEFINITELY on the screen, and all XML components have been loaded into it. (It is a slide with a dynamic picture and dynamic type, and it 100% works.)
    Has anyone had this insanity happen before?
    Is this an error where Flash cannot attach a movie or duplicate a clip that has dynamic contents???

  • Increase No of BGP while data load and how to bypass the DTP in Process Chain

    Hello All,
    We want to improve the performance of the loads. Currently we are loading the data from an external database through a DB link. Just to mention, we are on a BI 7 system. We are bypassing the PSA to load the data quickest. Unfortunately we cannot use the PSA, because load times are longer when we use it. So we are directly accessing views on the external database. The external database is also indexed as per our requirement.
    Currently our DTP is set to run on 10 parallel processes (in the DTP settings for batch, Batch Manager with job class A). Even though we set it to 10, we can see loads running on only 3 or 4 background parallel processes. Not sure why. Does anyone know why it behaves like that and how to increase them?
    I want to split the load into three (different DTPs with different selections), and all three will load the data into the same InfoProvider in parallel. We have a routine in the selection that looks at a table to get the respective selection conditions, and all three DTPs will kick off in parallel as part of the process chain.
    But in some cases we only get data for one or two DTPs (depending on the selection conditions). In this case, is there any way in a routine or in the process chain to say that if there is no selection for a DTP, then ignore that DTP or set it to success, and the process chain should continue?
    Really appreciate your help.

    Hi
    Sounds like a nice problem...
    Here is a response to your questions.
    Before I start, I just want to mention that I do not understand how you are bypassing the PSA if you are using a DTP. Be that as it may, I will respond regardless.
    When looking at performance, you need to identify where your problem is.
    First, execute your view directly on the database. Ask the DBA if you do not have access. If possible, perform a database explain on the view (this can also be done from within SAP... I think). This step is required to ensure that the view is not the cause of your performance problem. If it is, we need to implement steps to resolve that.
    If the view performs well, consider the following SAP BI ETL design changes:
    1. Are you loading deltas or full loads? When you have performance problems, the first thing to consider is making use of the delta queue (or changing the extraction to just send deltas to BI).
    2. Drop indexes before the load and re-create them after the load.
    3. Make use of the BI 7.0 write-optimized DSO. This allows for much faster loads.
    4. Check if you do ABAP lookups during the load. If you do, consider loading the DSO that you are selecting on into memory, and change the lookup to refer to the table in memory instead. This will save tremendous time in terms of DB I/O.
    5. This will have cost implications, but the BI Accelerator will allow for much faster loads.
    Good luck!

  • Write privileges on Essbase outline, substitution variables and calc scripts

    Hi,
    What exact privileges need to be provisioned to the user/group to have write access to the outline, substitution variables and calc scripts, without provisioning them as Database Manager?
    Thanks.

    To view the content of a calc script, you'll have to edit it, and editing needs Database Manager privileges.
    I don't think you can achieve what you are looking for out of the box. So how about writing a bat file which copies the calc scripts to a shared folder where you can provide read access to the users, and they can view the calc scripts there. (They can use a text editor to view the script contents.)
    Regards
    Celvin
    http://www.orahyplabs.com

  • BUG in data grid and run script (V1 1467)

    Hi,
    I found a weird bug in the data grid when displaying data defined as NUMBER(10,0).
    We have a table with a column defined as NUMBER(10,0). Whenever this column contains data > 5000000000, the data in the row shows up as null (an empty cell) in the data grid.
    I played around a bit and found that running a script like select * from testtab where a > 5000000000; with F5 stops at the first number > 5000000000. Running the script with F9 shows all rows correctly.
    I'm running on WIN2000 using SQL Developer V1 (Build 1467).
    The error was not present in Build 1422.

    I tried the SQL you supplied (by the way, the table name helps in the INSERT statement) and then went to the DATA tab to view the data.
    Interesting results.
    Even though all of the insert statements executed successfully, only 2 of the rows had any data displaying in columns A and C. All rows had data in column B.
    Now for the real fun: by selecting one of the empty cells, the "missing" data would appear. Try a commit (yeah, I know nothing has been updated, but SQL Developer doesn't know this) and check out the UPDATE statements that are generated in the log. The cells I selected are being updated with the "missing" data. Refresh the display and the data goes into hiding again.
    I don't even want to get into the dangers of a tool updating data when all you've done is give focus to a field. I've already seen it wipe out the time portion of date/time stamps simply because the default setting for the date/time format does not include the time.
    Good start on a tool, guys, but it's not quite ready for prime time.
    BTW, I'm using Oracle 9i, running SQL Developer on Windows 2000.

  • Data load and output problem

    Hi Experts,
    I am going through a peculiar problem. I have data flowing from R/3 to the master data InfoProvider 0Requi. I have validated the data between R/3 and the master data InfoProvider; it is found to be good. This data is being loaded from 0Requi to a staging write-optimized DSO. There are routines in the transformations for the DSO. When I check and compare the data, I find that out of 625 records in the source only 599 are available in the target, and of those 599 records, 17 are duplicates and 29 records have not been populated from source to target.
    Any help to solve the issue will be highly appreciated and thanked with suitable points.
    Thanks and Regards
    SHL

    Thank you very much, Jen. Full points to you.
    There was nothing in the error stack. Sy_Subrc in the routine was giving the problem. It has been rectified and the data is loading fine in the development system.
    Now I am in another peculiar situation.
    The routines, after debugging, are working well in the development system, but after transporting to the Quality system for testing, they are failing there. I am facing the same old problem there again. The transports were checked; they were done properly. The ABAPer is satisfied with the transported code. If you can, please guide me. I am closing this thread and opening a new thread with the subject "In different behavior of Routines".
    Thank you once again, Jen. Full points assigned to you.
    Kind Regards
    SHL

  • Essbase Studio data load and other

    Hi There,
    I got my cube built from my data mart, with one fact table and a bunch of dimensions, and deployed it successfully to the Essbase server. The questions I have are:
    1. I have another fact table with the same dimensions, and I need to load its data into the cube I built. How do I load the data from Essbase Studio? Should I add the new fact table into my schemas? I know I can load the data through EAS, but that seems to defeat the purpose of Essbase Studio.
    2. Is there any way I can specify from Essbase Studio, for certain account-level members for example, time balance as TB Last or Avg? It seems you have to apply TB Last, etc. to the whole level from the Essbase Model Properties.
    Thanks

    Donny wrote:
    > 1. I have another fact table with the same dimensions, and I need to load its data into the cube I built. How do I load the data from Essbase Studio?
    Add the second fact table to your minischema with the proper joins.
    > 2. Is there any way I can specify from Essbase Studio, for certain account-level members for example, time balance as TB Last or Avg?
    You should have columns in your account table for the property values (the time balance values are F, L and A, for first, last and average respectively). Then in the Essbase properties, you would specify to use an external source and give it the proper column name. Same thing for the skip values, variance reporting, consolidation, etc.

  • Create Document from Data Load and Link to Transaction Record for Long Text

    Hi,
    I have a DBConnect Oracle datasource which contains a large text field.  I would like to build a process that will, as part of the load, create a text file from the content of this large field, upload the file into BW and create the document association with the transaction record.
    Is anyone aware of a HOW-TO to create the BW document entries and upload the files using ABAP?  I thought that I had seen a HOW-TO or instructions approx a year ago, but cannot locate them now.
    Thanks in advance,
    Mel W.

    Hi,
    I hope this is the how to document you were looking for:
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/8046aa90-0201-0010-5e99-962948c83331
    -Vikram
