Clear data in ASO - Dynamic Members

Folks,
I've been trying to clear partial data in my ASO cube but encountered this error -
ERROR 1013358 - Dynamic members are not allowed in data clear region specification
There are a few dimensions tagged as Dynamic, and a few members have formulas on them. Is it possible to exclude those members in the MaxL script, or is there a better way of doing this?
I read a post where Glenn mentioned he used an exclude clause, but I'm not familiar with the usage.
Thank you!
gugler

Why do you want to clear dynamic members? Clear out the stored members that your dynamic members are based on.
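If you do end up needing a MaxL partial clear, the region specification simply has to avoid the dynamic members. Here is a minimal sketch only; the application, database, dimension, and member names are placeholders, and it assumes the level 0 Products members are all stored. The point is that the MDX set must resolve to stored members, so anything carrying a formula or sitting in a dynamic hierarchy is left out:
alter database 'appname'.'dbname' clear data in region
    'CrossJoin({[FY12]}, CrossJoin({[Jan]}, Descendants([Products], Levels([Products], 0))))' physical;
The alternative, as the replies below describe, is to restructure the outline or tag members with a UDA so the dynamic members are easy to exclude from such a set.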

Similar Messages

  • Clear data in ASO 9.3.1.3

    How can I clear data for some combinations of members?
    What I have done is run a report script for those combinations, then delete the values and load the data with a rules file.
    Is this the right way, or is there another way to clear the data?

    Hi,
    Be extremely careful if you are doing this..
    One way is to make use of calculation script:
    FIX(dim1, dim2) /*dim1 & dim2 are those members which form the combination where you donot want the data*/
    CLEARBLOCK ALL; /*Clears and removes all blocks*/
    ENDFIX;
    NOTE:
    1. Take backup before executing. Check the number of blocks before and after execution to get fair idea of whats happening.
    2. If you use CLEARBLOCK within a FIX command, Essbase clears only the cells within the fixed range, and not the entire block.
    3. Choose the FIX members cautiously.
    Hope it helps!
    Neha
    Sorry, this is for BSO only!!
    Edited by: NehaGahtori on Mar 26, 2010 5:32 AM

  • Clear ASO region with dynamic members

    I have a problem clearing a region in an Oracle Essbase ASO cube; it results in an error like:
    dynamic members are not allowed in data clear region specification
    when I try to execute this query:
    alter database 'aso'.'db' clear data in region '{CrossJoin(CrossJoin(CrossJoin(CrossJoin(CrossJoin(CrossJoin(CrossJoin(CrossJoin(CrossJoin({[var1]},{[var2]}),{[var3]}),{[var4]}),{[var5]}),{[a],[b],[c],[d]}),{Descendants([node],Levels([dim1],0))}),{Descendants([dim2],Levels([dim3],0))}),{Descendants([dim4],Levels([node1],0))}),{Descendants([dim5],Levels([node2],0))})}' physical;
    In my case there are level 0 dynamic members in dim5.
    As you can see, I tried the "physical" approach found in some Oracle forums, but without success. I want to work at leaf level for each node, excluding the members marked as dynamic. How can I achieve this? How can I exclude dynamic members in the hierarchy from the query and, as in my case, only the dynamic leaves?
    Help is very appreciated.
    Thank you
    Edited by: user9289301 on Feb 23, 2013 12:46 PM

    There are two things I have done:
    1. Move all of the dynamically calculated members under a parent such as Statistical; then for Dim5 I would take the level zero members except the Statistical parent.
    2. Put a UDA on each of the members with formulas, then exclude them (a rough MaxL sketch follows below). You could conversely put a UDA on the members without formulas and select only those.
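    As an illustration of the second (UDA) approach, this is a minimal sketch only, assuming a hypothetical UDA named "DYNCALC" has been assigned to the formula members of dim5; the application, database, and member names are placeholders:
    alter database 'appname'.'dbname' clear data in region
        'CrossJoin({[var1]}, Except(Descendants([dim5], Levels([dim5], 0)), UDA([dim5], "DYNCALC")))' physical;
    Except() removes the UDA-tagged dynamic leaves from the level 0 set before the clear is applied, which is what keeps the region specification free of dynamic members.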

  • MDX query performance on ASO cube with dynamic members

    We have an ASO cube and we are using MDX queries to extract data from that cube. We are doing some performance testing on the MDX data extract.
    Recently we made around 15-20 Account dimension members dynamic in the ASO cube, and now the query takes around an hour and a half to run on an empty cube. Earlier, the query ran in 1 minute on the empty cube when there were no dynamic members.
    I am not clear on why it takes so much time to extract data via MDX from an empty cube when there is nothing to extract. Performance has also degraded when extracting from the cube with data in it.
    Do dynamic members in the outline affect MDX performance? Is there a way to exclude dynamic members from the MDX extract?
    I appreciate any insights on this issue.

    I guess it depends on what the formulas of those members in the dynamic hierarchy are doing.
    As an extreme example, I can write a member formula that counts every unique member combination in the cube and assigns it to multiple members; regardless of whether I have any data in the database, that function is going to resolve itself when you query it, and it is going to take a lot of time. You are probably somewhere between that and a simple function that doesn't require any overhead. So without seeing the MDX it is hard to say what might be causing the issue.
    As far as excluding members goes, there are various functions in MDX to narrow down the set you are querying:
    Filter(), Contains(), Except(), Is(), Subset(), UDA(), etc. (see the sketch below).
    Keep in mind you did not make members dynamic, you made a hierarchy dynamic; that is not the same thing, and it does affect the way Essbase internally optimizes the database based on stored vs. dynamic hierarchies. So that alone can have an impact as well.
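    For example, a minimal MDX sketch, assuming a hypothetical "DYNCALC" UDA on the formula members and placeholder dimension and cube names, that restricts the extract to stored level 0 Account members:
    SELECT
        Except(Descendants([Account], Levels([Account], 0)), UDA([Account], "DYNCALC")) ON COLUMNS,
        Descendants([Entity], Levels([Entity], 0)) ON ROWS
    FROM [MyAsoApp].[MyAsoDb];
    Filter() with IsUda() would give the same result if you prefer a predicate-style expression.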

  • How can I clear a portion of data and load the new data in ASO App 9.3.1?

    Hi
    I want to delete a portion of data and reload a new version of that portion in an ASO application, version 9.3.1.
    Can anyone please share a good procedure?
    Thanks
    Kranthi

    Have a read of :- Clear Data by Fix
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Maxl to clear partial data in ASO applications

    Hi ,
    Is there a maxl statement to clear partial data from ASO applications?
    Thanks
    Kannan.

    Hi,
    Another solution, close to what garycris suggested, is to create a report script that exports the subset of data you want to delete to a text file, then load this text file using a load rule that replaces your data column with a missing (#MI) column.
    You can then call the report script and the load rule from a calc script.
    The problem is you also need a BSO database to do that. If that's the case, then you're good.
    If not, you can probably do it from a MaxL script (a sketch follows below).
    Good practice would be to have a single CLRASO load rule for all your clear processes, and then make sure your different clear report scripts fit its format. You end up with n clear report scripts, n clear calc scripts or MaxL scripts, and 1 clear load rule.
    Cyril
    Edited by: user635693 on 12 janv. 2009 23:25
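    A minimal MaxL sketch of this approach, assuming a hypothetical report script named CLREXP that extracts the region to be cleared and a load rule named CLRASO that replaces the data column with #MI; the application, database, and file names are placeholders:
    export database 'appname'.'dbname' using server report_file 'CLREXP' to data_file 'clr_region.txt';
    import database 'appname'.'dbname' data from server text data_file 'clr_region.txt'
        using server rules_file 'CLRASO' on error write to 'clr_region.err';
    Whether the reload actually overwrites the existing values with #Missing depends on the load rule and database settings, so test this on a copy first.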

  • Retrieve data from a dynamic page via loadURL

    Hello.
    I would like to ask you how it is possible to retrieve data
    from a dynamic page (asp classic in my case) using the loadURL
    method.
    I would like to create an html authentication form (with
    username and password fields). The loadURL method should call an
    asp page and then pass to the usual function 'DoIfSucceded' the
    results of the elaboration.
    Of course I'm going to have a switch in the function in order
    to make different actions depending from the results of the asp
    page (authentication succeded or failed).
    I had a look to the examples at this page:
    Adobe
    samples
    Is there anyone who can explain clearly how the results data
    must be written by the asp page and how the success function can
    retrieve them ?
    I thank you in advance for your help.

    loadURL() uses the XMLHttpRequest object, so if the content you return is XML, you have 2 choices for accessing your data. You can either access it as a text string via the XMLHttpRequest object's responseText property, or as a DOM document via the XMLHttpRequest object's responseXML property:
    function MySuccessCallback(req)
    {
        // If my data is XML, I can access the data that was sent from the server
        // as DOM elements via the XMLHttpRequest's responseXML property.
        var xmlDOM = req.xhRequest.responseXML;
        // OR, you can access the data sent back from the server as text via
        // the XMLHttpRequest object's responseText property.
        var xmlAsAString = req.xhRequest.responseText;
    }
    var req = Spry.Utils.loadURL("GET", "/app/book.php?id=1&code=54321", true, MySuccessCallback);
    If your server-side script wants to use some other format as a response, like JSON or plain text, then you can only access the data as text via the responseText property.
    --== Kin ==--

  • Clear Data package - "The data file is empty." Error

    Hi,
    When I run the Clear package in Data Manager, I get the error "The data file is empty". We are running BPC 7.5 SP3. There are no calculated members selected. Any suggestions?
    Thanks.
    Tom

    Hi Tom,
    If I'm not mistaken, when we run the Clear data package, BPC inserts negative values for the combination of dimensions selected.
    For example, if you have selected time as 2010.DEC, it would reverse out all those values by inserting negative values, which would then total up to zero.
    So I still think it could be a problem with the Transformation file.
    P.S. Maybe you can try using the Clear from Fact table package.
    Santosh

  • Clear data taking double the usual time...

    Hello,
    We are on 11.1.1.3 and force-archive the database to an ABC.arc file. All of a sudden we noticed the ABC.arc file had doubled in size (from 3GB to 6GB), and the calc script to clear data is taking double the time it did previously (earlier it was 30 mins, now it's 1 hour).
    But the daily data files are the same size, about 90MB, and the other 2 metadata files are 80MB and 60KB.
    There was also no change in the time to archive the database, calculate the aggregation of the whole database, etc.
    Only the clear data script itself is taking double the time. From the EAS logs we noticed that the number of blocks to be cleared is the same (3000) on all days, and the number of fixed Account members is also the same on all days.
    There was nothing suspicious in the log while the clear data script was running; it is the same as before.
    So, does anyone have an idea of what might be wrong, and what artifacts in the .arc file might cause its size to nearly double?
    Any help would be appreciated.
    Thanks,
    Edited by: user11150227 on Dec 24, 2012 7:24 AM

    If it was taking 30 mins to clear data before, something else is wrong to begin with. Also, FIXing on dense members with calc commands (CLEARDATA, DATACOPY) degrades performance severely.
    For what it's worth, you will probably get more replies with a more professional post, where "are" is not abbreviated to "r" and "would" is not abbreviated to "wud".
    -Matt

  • Clearing data

    Hi
    I am using the CLEARDATA command, clearing the data values for the previous years.
    What is the use of the CLEARBLOCK command, and where can I use it?
    What is the benefit of doing so?

    Clearblock:
    - Will remove blocks as long as you are not fixed on dense members
    - Is somewhat less flexible to use based on syntax
    Cleardata:
    - Will never remove blocks (on its own)
    - Is somewhat more flexible
    In practice, there are few situations where your first thought will be to use the clearblock method. Instead, you will be thinking about clearing data and hence only switch over to clearblock if you think it will net you something (and it can, but usually doesn't do much for you).
    Of course, on those occasions where it does do something for you, it's fairly obvious. One example is to clear the upper level blocks so you can do a more efficient consolidation calc.
    Edited by: DougWare on Sep 24, 2008 6:53 PM

  • Creating a SCOM group with dynamic members takes about 10 minutes!

    In our SCOM 2012 SP1 (CU3) environment we have about 800 Windows agents.
    The OperationsDB is on a Windows cluster (2 physical servers, each with 2 six-core processors); the data warehouse is on a separate cluster.
    When I create a group with dynamic members, it takes about 10 minutes. During this period all the consoles are busy and freezing.
    Is that normal?
    Regards
    Lehugo

    On the management server I got the following event log error during this time:
    OpsMgr Management Configuration Service failed to execute 'ConfigStoreStatsUpdate' engine work item due to the following exception
    Microsoft.EnterpriseManagement.ManagementConfiguration.DataAccessLayer.DataAccessException: Data access operation failed
       at Microsoft.EnterpriseManagement.ManagementConfiguration.DataAccessLayer.DataAccessOperation.ExecuteSynchronously(Int32 timeoutSeconds, WaitHandle stopWaitHandle)
       at Microsoft.EnterpriseManagement.ManagementConfiguration.SqlConfigurationStore.ConfigurationStore.ExecuteOperationSynchronously(IDataAccessConnectedOperation operation, String operationName)
       at Microsoft.EnterpriseManagement.ManagementConfiguration.SqlConfigurationStore.ConfigurationStore.WorkItemCompleted(IConfigServiceEngineWorkItemHandle workItemHandle, IConfigServiceEngineWorkItemResult workItemResult)
       at Microsoft.EnterpriseManagement.ManagementConfiguration.Interop.SharedWorkItem.ExecuteWorkItem()
       at Microsoft.EnterpriseManagement.ManagementConfiguration.Interop.ConfigServiceEngineWorkItem.Execute()
    System.Data.SqlClient.SqlException (0x80131904): Sql execution failed. Error 50000, Level 16, State 1, Procedure WorkItemMarkCompleted, Line 61, Message: Failed to report work item completion. Work item with id 1888748 is not assigned to service instance 'XXXXXX\Default'
       at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
       at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning()
       at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
       at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString)
       at System.Data.SqlClient.SqlCommand.CompleteAsyncExecuteReader()
       at System.Data.SqlClient.SqlCommand.EndExecuteNonQuery(IAsyncResult asyncResult)
       at Microsoft.EnterpriseManagement.ManagementConfiguration.DataAccessLayer.NonQuerySqlCommandOperation.SqlCommandCompleted(IAsyncResult asyncResult)

  • Clear Data Manager Package Error "The data file is empty."

    Hi,
    When I run the Clear data package in Data Manager, I receive the error "The data file is empty." I selected a very specific set of dimension values (none are calculated) and am on BPC 7.5 SP3. I subsequently turned on debugging to troubleshoot, but do not see any obvious issues leading to the error message. The log file with debugging turned on is below. Any help would be greatly appreciated!
    Thanks.
    Tom
    TOTAL STEPS  3
    1. Export_Zero:        completed  in 1 sec.
    2. Load Cube:          Failed  in 0 sec.
    3. Clear:              completed  in 0 sec.
    [Selection]
    ENABLETASK= Yes
    CHECKLCK= Yes
    (Member Selection)
    Category: ACTUAL
    Time: 2010.C_SEP
    Affiliate: az_swhd
    Account: Donor_DART_ID_1
    Functional: Benchmark_F
    Report: Cons
    Restriction: AnyRestricted
    [Messages]
    The data file is empty. Please check the data file and try again.
    [EvModifyScript Detail]
    12-28-2010  17:30:05 - Debug turned ON
    INFO(%TEMPFILE%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)
    TASK(EXPORT_ZERO, APPSET, ESMetrics)
    TASK(EXPORT_ZERO, APP, CONSOLIDATED)
    TASK(EXPORT_ZERO, USER, NESSGROUP\tbardwil)
    TASK(EXPORT_ZERO, FILE, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)
    TASK(EXPORT_ZERO, SQL,
    select [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM ( SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTWBCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')  UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFAC2CONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')) as ZeroTable  group by [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID] OPTION(MAXDOP 1)
    TASK(EXPORT_ZERO, DATATRANSFERMODE, 2)
    TASK(LOAD CUBE, APPSET, ESMetrics)
    TASK(LOAD CUBE, APP, CONSOLIDATED)
    TASK(LOAD CUBE, USER, NESSGROUP\tbardwil)
    TASK(LOAD CUBE, FILE, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)
    TASK(LOAD CUBE, DATATRANSFERMODE, 4)
    TASK(LOAD CUBE, DMMCOPY, 0)
    TASK(LOAD CUBE, PKGTYPE, 0)
    TASK(LOAD CUBE, CHECKLCK, 1)
    TASK(CLEAR COMMENTS, APPSET, ESMetrics)
    TASK(CLEAR COMMENTS, APP, CONSOLIDATED)
    TASK(CLEAR COMMENTS, USER, NESSGROUP\tbardwil)
    TASK(CLEAR COMMENTS, DATATRANSFERMODE, 0)
    TASK(CLEAR COMMENTS, SELECTIONORFILE, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)
    TASK(CLEAR COMMENTS, ENABLETASK, 1)
    TASK(CLEAR COMMENTS, CHECKLCK, 1)
    INFO(%ENABLETASK%, 1)
    INFO(%CHECKLCK%, 1)
    INFO(%SELECTION%, [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED'))
    INFO(%TOSELECTION%, [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED'))
    INFO(%APPSET%, ESMetrics)
    INFO(%APP%, CONSOLIDATED)
    INFO(%CONVERSION_INSTRUCTIONS%, )
    INFO(%FACTCONVERSION_INSTRUCTIONS%, )
    INFO(%SELECTIONFILE%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\FROM_51_.TMP)
    INFO(%TOSELECTIONFILE%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\TO_51_.TMP)
    INFO(%DEFAULT_MEASURE%, PERIODIC)
    INFO(%MEASURES%, Periodic,QTD,YTD)
    INFO(%OLAPSERVER%, ETSCSAP047940.EASTER-SEALS.ORG)
    INFO(%SQLSERVER%, ETSCSAP047940.EASTER-SEALS.ORG)
    INFO(%APPPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\)
    INFO(%DATAPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\DataFiles\)
    INFO(%DATAROOTPATH%, C:\BPC\Data\WebFolders\)
    INFO(%SELECTIONPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\SelectionFiles\)
    INFO(%CONVERSIONPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\ConversionFiles\)
    INFO(%TEMPPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\)
    INFO(%LOGICPATH%, C:\BPC\Data\WebFolders\ESMetrics\Adminapp\CONSOLIDATED\)
    INFO(%TRANSFORMATIONPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\TransformationFiles\)
    INFO(%DIMS%, [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[Time])
    INFO(%FACTDIMS%, [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID])
    INFO(%CATEGORY_DIM%, [Category])
    INFO(%TIME_DIM%, [Time])
    INFO(%ENTITY_DIM%, [Affiliate])
    INFO(%ACCOUNT_DIM%, [Account])
    INFO(%CURRENCY_DIM%, )
    INFO(%APP_LIST%, Consolidated,ES_INC,GrantMgmt,LegalApp,LRate,Ownership,Rate)
    INFO(%ACCOUNT_SET%, DONOR_DART_ID_1)
    INFO(%AFFILIATE_SET%, AZ_SWHD)
    INFO(%CATEGORY_SET%, ACTUAL)
    INFO(%FUNCTIONAL_SET%, BENCHMARK_F)
    INFO(%REPORT_SET%, CONS)
    INFO(%RESTRICTION_SET%, ANYRESTRICTED)
    INFO(%TIME_SET%, 2010.C_SEP)
    INFO(%ACCOUNT_TO_SET%, DONOR_DART_ID_1)
    INFO(%AFFILIATE_TO_SET%, AZ_SWHD)
    INFO(%CATEGORY_TO_SET%, ACTUAL)
    INFO(%FUNCTIONAL_TO_SET%, BENCHMARK_F)
    INFO(%REPORT_TO_SET%, CONS)
    INFO(%RESTRICTION_TO_SET%, ANYRESTRICTED)
    INFO(%TIME_TO_SET%, 2010.C_SEP)
    INFO(DATAMGRGLOBALBPU, )
    INFO(DATAMGRGLOBALCLIENTMACHINEID, ETSCWLT048794)
    INFO(DATAMGRGLOBALERROR, )
    INFO(DATAMGRGLOBALPACKAGEINFOR, )
    INFO(DATAMGRGLOBALPACKAGENAME, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\PackageFiles\System Files/Clear.dtsx)
    INFO(DATAMGRGLOBALSEQ, 51)
    INFO(DATAMGRGLOBALSITEID, )
    INFO(MODIFYSCRIPT, DEBUG(ON)<BR>PROMPT(SELECTINPUT,[CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED'),,"SELECT THE MEMBERS TO CLEAR",[Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[Time])<BR>PROMPT(RADIOBUTTON,1,"DO YOU WANT TO CLEAR COMMENTS ASSOCIATED WITH DATA REGIONS IN BPC?",1,{"YES","NO"},{"1","0"})<BR>PROMPT(RADIOBUTTON,1,"SELECT WHETHER TO CHECK WORK STATUS SETTINGS WHEN DELETING COMMENTS.",1,{"YES, DELETE COMMENTS WITH WORK STATUS SETTINGS","NO, DO NO DELETE COMMENTS WITH WORK STATUS SETTINGS"},{"1","0"})<BR>INFO(C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp,C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempfdla_51_.tmp)<BR>TASK(EXPORT_ZERO,APPSET,ESMetrics)<BR>TASK(EXPORT_ZERO,APP,CONSOLIDATED)<BR>TASK(EXPORT_ZERO,USER,NESSGROUP\tbardwil)<BR>TASK(EXPORT_ZERO,FILE,C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)<BR>TASK(EXPORT_ZERO,SQL,
    select [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM ( SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTWBCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')  UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFAC2CONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')) as ZeroTable  group by [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID] OPTION(MAXDOP 1)
    )<BR>TASK(EXPORT_ZERO,DATATRANSFERMODE,2)<BR>TASK(LOAD CUBE,APPSET,ESMetrics)<BR>TASK(LOAD CUBE,APP,CONSOLIDATED)<BR>TASK(LOAD CUBE,USER,NESSGROUP\tbardwil)<BR>TASK(LOAD CUBE,FILE,C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)<BR>TASK(LOAD CUBE,DATATRANSFERMODE,4)<BR>TASK(LOAD CUBE,DMMCOPY,0)<BR>TASK(LOAD CUBE,PKGTYPE,0)<BR>TASK(LOAD CUBE,CHECKLCK,1)<BR>TASK(CLEAR COMMENTS,APPSET,ESMetrics)<BR>TASK(CLEAR COMMENTS,APP,CONSOLIDATED)<BR>TASK(CLEAR COMMENTS,USER,NESSGROUP\tbardwil)<BR>TASK(CLEAR COMMENTS,DATATRANSFERMODE,0)<BR>TASK(CLEAR COMMENTS,SELECTIONORFILE,C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)<BR>TASK(CLEAR COMMENTS,ENABLETASK,1)<BR>TASK(CLEAR COMMENTS,CHECKLCK,1)<BR>BEGININFO(
    select [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM ( SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTWBCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')  UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFAC2CONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')) as ZeroTable  group by [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID] OPTION(MAXDOP 1)
    )<BR><BR><BR><BR><BR><BR><BR>SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 AS SIGNEDDATA FROM ( SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 AS SIGNEDDATA FROM TBLFACTCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 AS SIGNEDDATA FROM TBLFACTWBCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')  UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 AS SIGNEDDATA FROM TBLFAC2CONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')) AS ZEROTABLE  GROUP BY [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID] OPTION(MAXDOP 1)<BR><BR><BR><BR>ENDINFO<BR><BR><BR>)
    Edited by: Tom Bardwil on Dec 28, 2010 5:20 PM

    You can greatly improve your chance of receiving a helpful answer to your question if you state the version (MS or NW) and the release (5.1, 7.0, 7.5) of BPC which you are using.
    Also notice the sticky note "Please do not post BPC, SSM or FI/CO questions here!" at the top of this forum, where we announced new dedicated forums for BPC; those are the proper place to post your BPC questions in the future so you can reach the right audience.
    Thanks and best regards,
    [Jeffrey Holdeman|http://wiki.sdn.sap.com/wiki/display/profile/Jeffrey+Holdeman]
    SAP Labs, LLC
    BusinessObjects Division
    Americas Applications Regional Implementation Group (RIG)

  • Clearing Data in Essbase

    Does anyone know of a quick way to clear data for one dimension? I used the CLEARDATA command but that process took several minutes. I also used the Clear Data Combinations option in my load rule but that is still going (6 minutes later). I need to clear all the data for the parent "Revenue Statistics" before I re-load the Revenue Statistics. Any other way I can tackle this?

    I'm not 100% sure what your calc is doing, but when I design calcs, I try to make it so that no clearing is required beforehand, as the process to clear large amounts of data adds time to calcs. You then have to visit those blocks again to recalculate the values.
    The way I do it is I try to avoid calcs that add to existing values - for any calculated value I try to use values that exist in OTHER members, rather than take the value that's already there and use it somehow.
    Second, I try to build the logic to set #Missing values where it needs to, if necessary, rather than clearing ALL the data and then calc'ing again.
    Of course there are dozens of situations where you have to clear data, and I don't mean to oversimplify the problem, but I try to keep that kind of stuff in mind when I am designing / re-designing calcs.
    Why do you have to clear the value before you calc? If you run a calc, any values that are there already should be replaced.
    Regards,
    Jade
    ---------------------------------------
    Jade Cole
    Senior Business Intelligence Consultant
    Clarity
    [email protected]

  • EIS 7.1 - Load Data in ASO - Performance tuning

    Hi,
    Right now I have an EIS integration that loads members and data into an ASO cube using 2 separate metaoutlines. It takes a long time to load the data into the ASO cube. To improve data loading performance, I'm planning to take the following actions. Can you please let me know if you have any better ideas for doing this in EIS/Essbase 7.1?
    1. Load the data using free-form loading or my own rules file, while the members are still built with the EIS metaoutline, so that drill-through reports won't be affected.
    2. Use custom SQL to fetch the data from the relational database, and do the required transformations in the SQL scripts themselves while building the data file.
    3. Split the source data file into multiple files and use a data load buffer to load the data into the ASO cube (see the MaxL sketch at the end of this thread).
    Thanks in advance!

    Hi,
    You should try the Essbase forum; you will probably get more responses :- Essbase
    Cheers
    John
    http://john-goodwin.blogspot.com/
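    Regarding point 3 above, a minimal MaxL sketch of staging several files through a single ASO load buffer, assuming your 7.1.x release supports load buffers; the application, database, rule, and file names are placeholders:
    alter database 'appname'.'dbname' initialize load_buffer with buffer_id 1;
    import database 'appname'.'dbname' data from server text data_file 'file1.txt'
        using server rules_file 'ldrule' to load_buffer with buffer_id 1 on error write to 'file1.err';
    import database 'appname'.'dbname' data from server text data_file 'file2.txt'
        using server rules_file 'ldrule' to load_buffer with buffer_id 1 on error write to 'file2.err';
    import database 'appname'.'dbname' data from load_buffer with buffer_id 1;
    Committing the buffer once at the end materializes all the staged data in one pass, which is usually faster than running several separate incremental loads.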

  • Firefox for Android becomes painfully slow over time. Clearing data fixes it, but erases bookmarks. Where is the "Reset" function?

    After a fresh install, Firefox loads websites very fast with almost no lag. After a few weeks, it becomes slower and slower, getting to the point where it will take about 20 seconds to display the Top Sites/Bookmarks window. Then sometimes when opening a page, the loading circle stops moving and it freezes the phone. After about 15 seconds everything works again. The only way I have found to fix this horrendous problem is to go into the control panel for Firefox and Clear Data. I don't like doing that because I lose all of my bookmarks. The Cache always shows "0 bytes", so there is never anything to clear out. I would like to know where this "Reset" function is for Android. I see they have it for Windows versions, but what about Android?
    My phone is an HTC Rezound running stock Android 4.0.3
    Thanks in advance.

    Bruce, whilst waiting for Roland to get back, you may wish to try this:
    Enter about:memory into the address bar; you will get interactive, dynamic information about memory usage. You could also try about:compartments.
    That should at least help you see where the memory is going.
