Cubes in process variant Aggr/BIA ROLLUP: processed serially or in parallel?

Hello,
Are cubes added to the process variant Aggr/BIA ROLLUP processed serially or in parallel?
We want to avoid interference due to shared characteristics.
Thanks

Hi Axel,
Please have a look at the wiki content here:
http://wiki.sdn.sap.com/wiki/display/BI/GlobalParametersforBIAIndexing
-Vikram

Similar Messages

  • Creating a cube compression process chain

    Hi,
    I want to create a cube compression process chain so that it automates compressing the cube.
    Could anyone please provide the steps to create a process chain that automates cube compression.
    Regards,
    Prem.
    Please search the forum

    Hi,
    You can modify your process chain in RSPC.
    Switch to change mode in the planning view; in the left-hand pane you will find the different process types, among them the Compression process.
    Drag it to the right-hand section, create a variant for it, and enter the cube name. Then place it after the cube load process (if there is a rollup, it should come after the rollup; otherwise directly after the cube load).
    Once all changes are done, activate the chain and collect it in a transport to move the changes to the next system.
    Hope this helps.
    Murali

  • Problem with delete index of cube in process chain

    Hi, I have a problem with my process chain: a Delete Index process has turned red, but when I go into its display messages I see the message "Indexes for InfoCube SMS_C06 have been deleted successfully".
    Since the process is red, the next process is not being triggered, but I have to continue the process chain.
    Can anybody shed some light on this? It is very urgent.

    Hi,
    I ran into the same situation with one of my process chains, which is part of a metachain. The Delete Index process turned red, but the messages show that the indexes on all the cubes were deleted successfully.
    As suggested, I did a repeat on the Delete Index variant, but no luck; it turned red again.
    Is anyone familiar with this issue?
    Thanks,
    Briston

  • Planning Cube Load process

    Hi Gurus,
    How do I load data into DP cubes from a BW 3.5 system?

    If the BW 3.5 system is an external system, you need to build an InfoCube in the APO data mart and first load the data from the external BW system into APO BW. Then create the planning area, POS, CVCs, etc., and load the planning data from the APO InfoCube into the planning area using transaction /SAPAPO/TSCUBE. DP cubes are virtual ones with no physical existence. Please check this [DP Process|http://help.sap.com/saphelp_scm50/helpdata/en/8f/9d6937089c2556e10000009b38f889/frameset.htm] and also go through the [Best Practices|http://help.sap.com/bp_scmv250/BBLibrary/HTML/DPL_EN_DE.htm]; open the link and click on the configuration guide. It will walk you through the process step by step.

  • Error in generating/deleting the index of the cube in Process chain

    Hi All,
    I am getting an error in the process chain during the deletion/generation of the index on the cube.
    The error says: "Deleting/reconstructing indexes for InfoCube 0CSAL_C01 is not permitted".
    I need your input on this.
    Thanks & Regards,
    Anup

    Hi Anup,
    Try to analyse the error in the Manage tab of that cube, and try to reconstruct the indexes manually.
    Good day!!!
    Regards
    Arun S

  • Data Warehouse Cubes Not Processing

    We are with a customer now who is having data warehouse problems, the main issues being:
    - ETL jobs run fine (except MPSync, which finishes with only 179/180 jobs complete)
    - Cube processes are stuck in a RUNNING loop; they never complete or fail out, and all show a last run time of 1/10 and a next run time of 1/11
    We have scoured the internet for a solution and come across various blogs addressing the issue. We have tried manually disabling the jobs and then manually processing the cubes, restarting the DW server, restarting SSAS, etc., to no avail.
    The latest solution we tried was to unregister the DW and re-register it; however, when we went to unregister the DW we received the following error:
    "Failed to unregister from the data warehouse"
    Title: Error
    Text: System.ArgumentException: SM data source not found.
    at
    Microsoft.EnterpriseManagement.ServiceManager.UI.Administration.DWRegistration.Unregistration.DWUnregistrationHelper.AcceptChanges(WizardMode wizardMode)
    So our next step was to unregister the data sources and then re-register them individually. Of the two data sources, we were able to unregister and re-register the DW data source (DW_COMPANY01), but when we tried to unregister the operational data source (COMPANY01) we got the following error:
    Title: An error was encountered while running the task.
    Text: Exception has been thrown by the target of an invocation.
    Based on the two errors shown above, I assume we cannot unregister this data source, or the DW as a whole, because it cannot find that operational data source. A couple of things about this data source and the environment to shed some light on the situation:
    The customer upgraded all servers to UR4 around the January time frame.
    The customer promoted a new primary management server around January 20th.
    Currently, when reviewing the details of the data source, the SDK Server references the computer name of the OLD primary management server.
    Looking at the event logs on the DW MS (with SSRS/SSAS), an error shows that the MPSync job failed to finish, referencing the OLD primary management server.
    We are looking for any guidance on this issue. We have taken many steps to troubleshoot the problem, to no avail. Please let me know if you have any questions or need any more information.

    Hi!
    It was probably a problem with the primary keys, because when MPSyncJob does not complete, something will fail in the subsequent ETL jobs.
    The solution would have been here: http://technet.microsoft.com/en-us/library/jj614520.aspx
    At this stage, frankly speaking, if you still have all the data in your CMDB and nothing has been lost to grooming, uninstall all DWH components (DW management server, databases) and reinstall the DW completely.
    R.

  • Can an OLAP cube be processed into RAM? Can this be done with traditional BI, i.e. SSAS? If we can place the entire SSAS cube in RAM for better performance, how can it be done?

    I'm trying to increase the performance of my OLAP cube, and I thought placing the cube entirely in RAM might solve this problem. Can anyone please tell me whether this can be done using traditional BI, i.e. SSAS, and how?

    Hi Nagarjuna,
    I do not believe you can load the entire cube into RAM, and even if you were able to do so, I don't think it would solve your performance issues.
    Here is a thread with the same discussion; it has some good links you can visit to learn more about where the performance issues are emanating from:
    How to cache all SSAS 2008 cube processed data into RAM
    Also, please take a look at the following guide: Analysis Services Performance Guide
    Hope this helps.
    Faisal Muhammed My Blog

  • Delete request in ODS and Cube from process chain

    Hi Experts,
    I have the following situation: each month a process needs to extract data with a selection on fiscal period and fiscal year, and only one data request with this selection may be kept. For this, I need to delete the old request in the ODS and the cube when extracting new data with the same selection criteria.
    Is it possible?
    Thank you very much for your collaboration.
    Regards,
    Jeysi

    Hi Jeysi,
    A DSO works on the 'overwrite' principle.
    You need to check your data source (the images it sends).
    Even if you are updating the key figures using the addition method, the data source will send the correct images and the correct delta to keep your DSO a reflection of the latest image of your source system.
    Hence you would not need to delete a request with the same selection in the DSO.
    Moreover, if you have a cube downstream of the DSO, loading deltas from the DSO to the cube alone would be sufficient, and you would not need to delete overlapping requests.
    However, if you are loading full requests into the cube from the data source with selections on fiscal year and fiscal period, then yes, you do require deletion of overlapping requests.
    In the 3.x data flow, this was configured in the InfoPackage: on the data target tab there is a column for automatic deletion, where you could set the conditions under which the overlapping request in the data target (the cube, in this case) should be deleted.
    In a process chain, you can use the process type 'Delete Overlapping Requests' (found under the application process node, Load Process and Post-Processing). This can be configured for both InfoPackages and DTPs.
    Hope this helps,
    Regards,
    Sunmit.

  • Automate the Cube loading process using script

    Hi,
    I have created an Essbase cube using Hyperion Essbase Studio 11.1.1, and my data source is Oracle.
    How can I automate the data loading process into the Essbase cubes using .bat scripts?
    I am very new to Essbase. Can anyone help me on this in detail?
    Regards
    Karthi

    You could automate the dimension building and data loading using Esscmd/MaxL scripts and then call them via .bat scripts (for example, a batch file whose single line invokes esscmd with your script file).
    Various threads related to this are available. Anyway, you could follow these steps.
    In any script, first provide the login credentials and select the database:
    LOGIN server username password;
    SELECT Applic_name DB_name;
    To build a dimension:
    BUILDDIM location rulobjName dataLoc sourceName fileType errorLog
    E.g.: BUILDDIM 2 rulfile 4 username password 4 err_file;
    For a data load:
    IMPORT numeric dataFile fileType y/n ruleLoc rulobjName y/n [ErrorFile]
    E.g.: IMPORT 4 username password 2 "rulfile_name" "Y";
    Regards,
    Cnee

  • SSAS Cube Dimension Processing Fails Reporting: File system error: A FileStore error from WriteFile occurred

    Hi,
    I have an SSAS cube database whose processing was stopped back in 2012, but as the requirements have come up again, the cube needs to be processed on a daily basis again to refresh it with new incoming data.
    As I was testing the cube processing steps before re-activating cube processing in production, I came across a weird error while manually processing a dimension:
    File system error: A FileStore error from WriteFile occurred. Physical file: \\?\D:\MSAS10.MSSQLSERVER\OLAP\Data\Citi.184.db\Dim Extended Data.0.dim\54.Extended Data.ksstore. Logical file: . . Errors in the OLAP storage engine: An error occurred while the 'Extended Data' attribute of the 'ExtendedData' dimension from the 'Citi' database was being processed.
    The error occurred while I was doing a Process Update on the dimension. I even tried a Process Full for the same dimension, which did not help. I then did an Unprocess followed by a Process Full, which didn't work either.
    Can anyone please help me with this issue? The cube is hosted on SQL Server 2008; processing used to work fine earlier, but now it fails with this error.
    Thanks

    This looks like the 4 GB string store limit. If you can upgrade to SQL Server 2012 Analysis Services, you can change a simple setting (the StringStoresCompatibilityLevel property) and grow beyond that 4 GB string store limit.
    http://technet.microsoft.com/en-us/library/gg471589(SQL.110).aspx
    If you can't upgrade, what data is in that ExtendedData attribute? If it's a meaningless surrogate key that you don't need users to see, you can use this trick to avoid the 4 GB string store limit:
    http://cwebbbi.wordpress.com/2012/08/10/storage-and-the-namecolumn-and-keycolumns-properties/
    But if it's something the users need to see, there's no easy way to fix it without upgrading to SSAS 2012.
    http://artisconsulting.com/Blogs/GregGalloway

  • Loading from ODS to Cube in process chain

    Hi Experts,
    How can I do a full load from an ODS to a cube when using further processing in a process chain? Your help is much appreciated.
    Thanks,
    Bill

    Hi,
    You can use a DTP for this:
    Create a transformation between the DSO and the cube.
    Create a DTP and run it.
    Related threads:
    Loading data from one cube to another cube
    Cube to cube data loading
    How to upload data from cube to cube
    Can we pull data from one cube to another cube
    Data load steps:
    Reading data from another cube
    Hope this helps.
    Thanks,
    JituK

  • Automating the cube load process

    Hi All,
    I have created an Essbase cube using Hyperion Integration Services 9.3.1; my data source is a star schema residing in an Oracle DB. I can successfully load the cube and see the results in Smart View.
    I want to know how I can automate the data loading process (both dimensions and data) into the Essbase cubes using either Unix scripts or Windows .bat scripts.
    Your inputs are Appreciated. Thanks in Advance.
    Regards
    vr

    What everyone has shown you is the use of Esscmd or MaxL; however, you stated you are using EIS. Until you get to 11.1 and use Essbase Studio, those won't help you. What you need to do is go into EIS and select your metadata outline. Pretend you are going to build the cube manually: start scheduling it, and when you get to the screen that lets you schedule it or create a file, create the file only. You will want the name to end in .cbs. (Hint: if you select both outline and data load, you will get all the statements in one script.)
    After you are done, if you look on the EIS server under arborpath\EIS\Batch, you should see the .cbs file you created. It is a regular text file, so you can look at it. If you open it, you will see loaddata and/or loaddimensions statements; this is what does the work.
    To run it, create a batch file with the command: olapicmd -fscriptFileName > logfilename
    For more information about the Integration Services Shell, look at http://download.oracle.com/docs/cd/E10530_01/doc/epm.931/eis_sysadmin.pdf; Chapter 5 tells you everything you need to know.

  • Cube Compression & Process Chains

    Hello Friends
    Few Questions as I am a beginner.
    1) What is the concept behind cube compression? Why is it preferred for delta uploads and not for full uploads?
    2) What do we mean by deleting and creating indexes using process chains?
    3) What is meant by the process chain step "DB Statistics Refresh"? Why do we need it?
    Any help is appreciated. Points will be generously assigned.
    Thanks and Regards
    Rishi

    Hello Rishi,
    As you may know, an InfoCube consists of fact tables and dimension tables. The fact tables hold all key figures and the corresponding dimension keys; the dimension tables map dimension keys to InfoObject values.
    Now, there is not just one fact table but two: the F table and the E table. From a technical point of view, the difference is just one InfoObject: 0REQID, the request number. This InfoObject is missing in the E table. As a result, different records in the F table can be aggregated into one record in the E table if they have the same key but were loaded by different requests.
    As you may know, you can delete any request from an InfoCube by selecting its request number. And here is the disadvantage of the E table: as there is no request number, you cannot delete a request from this table.
    When data is loaded into an InfoCube, it is stored in the F table. By compressing the InfoCube, records are transferred into the E table. Because of the E table's disadvantage, you can define per InfoCube if and when data is to be transferred.
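    To make this concrete, here is an illustrative example with made-up values: two F-table records that share the same dimension keys but stem from different requests are aggregated into a single E-table record during compression.
    F table (with 0REQID):           E table (after compression):
    0REQID   DIMKEY   AMOUNT         DIMKEY   AMOUNT
    1001     42       5              42       8
    1002     42       3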
    More information can be found here: http://help.sap.com/saphelp_nw70/helpdata/en/ca/aa6437e7a4080ee10000009b38f842/frameset.htm
    An index is a database mechanism to accelerate access to single records within a table. In BW, indexes are used to increase reporting speed.
    Whenever data in a table is added or deleted (in our case, loaded), the index has to be modified. Depending on the amount of change in the table, it can be less time-consuming to delete the index, load without an existing index, and rebuild the index afterwards. This can be done in process chains.
    DB statistics are something specific to an Oracle database. As far as I know (I do not work with Oracle), they are used to optimize the SQL statements needed for BW reports.
    I hope that these explanations are helpful.
    Kind regards,
    Stefan

  • Delete Requests (Except Compressed ones) in Cube in Process Chain's Routine

    Hi all,
    We load an InfoCube on weekdays and we compress the month-end load request. My requirement is to keep the latest SEVEN requests and DELETE all other requests EXCEPT the compressed ones; I must not delete the COMPRESSED requests. I want to accomplish this through an ABAP routine in a process chain.
    In the example below, I should delete request IDs 2 and 3 only and keep request IDs 1 and 4 to 10.
    Request ID    Compression status
    10            -
    9             -
    8             -
    7             -
    6             -
    5             -
    4             -
    3             -
    2             -
    1             YES
    I have looked on SDN, which says to use the function modules RSSM_ICUBE_REQUEST_GET and RSSM_DELETE_REQUEST, or to look at program RSDRD_DELETE_FACTS for sample code, but I didn't find anything matching my requirement. Could you PLEASE send sample ABAP code for it?
    Thanks in advance.
    Regards,
    Venkat.

    Hi,
    http://help.sap.com/saphelp_nw04/helpdata/EN/f8/e5603801be792de10000009b38f842/content.htm
    Automated Deletion of Infocube Request:
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a7ba9523-0b01-0010-18ab-cf8b3b8c374a
    Thanks
    Hema
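    A rough sketch of such a routine, using only the function modules already named in this thread. The parameter names, the result fields, and the cube name ZMYCUBE below are assumptions for illustration (verify the real interfaces in SE37); treat this as a starting point, not tested code:

    " Sketch only: the FM names come from this thread, but their
    " parameter names and the structure fields are assumptions --
    " verify the real interfaces in SE37 before using this.
    TYPES: BEGIN OF ty_request,
             rnr   TYPE c LENGTH 30,  "request number (take real type from FM)
             compr TYPE c LENGTH 1,   "compression flag (assumed)
           END OF ty_request.
    DATA: lt_requests TYPE STANDARD TABLE OF ty_request,
          ls_request  TYPE ty_request,
          lv_counter  TYPE i.

    " Fetch all requests of the cube together with their compression
    " status (assumed to be part of this FM's output).
    CALL FUNCTION 'RSSM_ICUBE_REQUEST_GET'
      EXPORTING
        i_infocube  = 'ZMYCUBE'       "hypothetical cube name
      TABLES
        e_t_request = lt_requests.

    SORT lt_requests BY rnr DESCENDING. "newest request first

    LOOP AT lt_requests INTO ls_request.
      lv_counter = lv_counter + 1.
      CHECK lv_counter > 7.           "always keep the latest seven requests
      CHECK ls_request-compr <> 'X'.  "never delete compressed requests

      CALL FUNCTION 'RSSM_DELETE_REQUEST'
        EXPORTING
          request  = ls_request-rnr   "assumed parameter names
          infocube = 'ZMYCUBE'.
    ENDLOOP.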

  • Process type "Delete overlapping request in cube " in process chain

    Hi
    Can somebody suggest where I need to place the process type "Delete Overlapping Requests" for an InfoCube?
    Regards,

    Hi,
    When you use the corresponding process in the process chain maintenance screens to specify the conditions for automatic deletion, these conditions are applied to all the InfoCubes for the selected InfoPackages.
    Overlapping: if you set this indicator, existing requests are also deleted from the InfoCube if the selection criteria of the new request partially or wholly overlap the selection criteria of the request to be deleted.
    Same or Comprehensive: if you set this indicator, requests are only deleted from the InfoCube if the selection conditions of the new request are the same as, or more comprehensive than, the selection conditions of the request to be deleted.
    For example, if an existing request covers fiscal period 002.2010 and the new request selects 001.2010 to 003.2010, the old request is deleted under either setting; but if the old request covers 001.2010 to 002.2010 and the new one selects only 002.2010 to 003.2010, it is deleted under Overlapping only.
    Thanks
    Bhagesh
