Compression in Process Chain BI-7.0

Hello Forum.
I have added a process to compress the last 7 days' requests of a cube in a BI 7.0 environment.
The chain completes successfully, but the requests do not get compressed in the cube.
However, when I create an aggregate and include a rollup step in the chain before the compression process, the chain rolls up and also compresses the cube.
I am aware of the option to compress during rollup, but with that setting the InfoCube gets compressed along with the aggregates.
Can anyone suggest why the compression process on its own does not compress the data in the cube?
Am I skipping anything?
Thank you in advance
Regards

hi,
check your compression variant in the process chain

Similar Messages

  • Compression through process chain

    Hi Gurus,
    I have 76 cubes in my production system.
    I want to compress all the cubes through a process chain every weekend,
    adding all cubes in one process chain...
    Please suggest.
    thanks, alex

    When using the 'Compression of the InfoCube' process variant, you don't have the capability of giving a date range or year/month of requests to compress. It only supports 'older than n days'.
    So you'd have to create two process variants, one with marker update and one without, for each month of requests to be compressed. Since you haven't compressed for two years, the values to put in 'Collapse only those requests that were loaded XXX days ago' would be:
    730
    700
    670
    640
    610
    580
    550
    So in total, you're going to end up with something like 50 process variants (25 with marker and 25 without). You'd want to execute them in inverse order (730 first, then 700, then 670, and so on) so that each one does only 30 days' worth of work.
    Then once you have finished compressing the InfoCubes, you can remove all of these process variants and keep just one with marker and one without, set to however many days old you wish (SAP recommended to us not to compress requests that are less than 21 days old, in case there's an issue with a request - once you compress a request you can no longer identify and delete data by request).
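    For what it's worth, the day-offset list above can be generated rather than typed out by hand. A small sketch in plain Python (not ABAP - just to show the arithmetic, using the 730-day start, 30-day step, and 21-day floor mentioned in this thread):

```python
# Generate the day-offset schedule for staggered compression variants:
# start two years (730 days) back and step forward 30 days at a time,
# pairing each offset with a "without marker" and a "with marker" variant.
def compression_schedule(oldest_days=730, step=30, min_days=21):
    """Yield (days_old, marker_update) pairs, oldest offset first."""
    days = oldest_days
    while days >= min_days:
        for marker in (False, True):
            yield days, marker
        days -= step

schedule = list(compression_schedule())
print(schedule[:4])  # starts with the 730-day pair, oldest requests first
```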

  • Delete Requests (Except Compressed ones) in Cube in Process Chain's Routine

    Hi all,
    We load an InfoCube on weekdays and we compress the month-end load request. My requirement is to
    keep the latest SEVEN requests and DELETE all other requests EXCEPT the compressed ones. I must not delete the COMPRESSED requests. I want to accomplish this through an ABAP routine in a process chain.
    In the example below, I should delete only request IDs 2 and 3 and keep request IDs 1 and 4 to 10.
    Request Id---Compression Status
    10---
    9---
    8---
    7---
    6---
    5---
    4---
    3---
    2---
    1---YES
    I have looked on SDN and it says to use the function modules RSSM_ICUBE_REQUEST_GET and RSSM_DELETE_REQUEST, or to run program RSDRD_DELETE_FACTS for sample code, but I didn't find anything matching my requirement. Could you PLEASE send sample ABAP code for it?
      Thanks in advance.
    Regards,
    Venkat.

    Hi,
    http://help.sap.com/saphelp_nw04/helpdata/EN/f8/e5603801be792de10000009b38f842/content.htm
    Automated Deletion of InfoCube Requests:
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a7ba9523-0b01-0010-18ab-cf8b3b8c374a
    Thanks
    Hema
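    The retention rule Venkat describes reduces to a small piece of selection logic. Sketched in Python purely for illustration (the real routine would be ABAP, reading the request list via something like the RSSM_ICUBE_REQUEST_GET function module mentioned above - the function below and its inputs are hypothetical):

```python
# Keep the latest seven requests and every compressed request;
# everything else is a deletion candidate.
def requests_to_delete(requests, keep_latest=7):
    """requests: list of (request_id, is_compressed) tuples, any order."""
    by_newest = sorted(requests, key=lambda r: r[0], reverse=True)
    protected = {rid for rid, _ in by_newest[:keep_latest]}   # latest N
    protected |= {rid for rid, comp in requests if comp}      # compressed
    return sorted(rid for rid, _ in requests if rid not in protected)

# The example from the question: requests 1..10, only request 1 compressed.
reqs = [(i, i == 1) for i in range(1, 11)]
print(requests_to_delete(reqs))  # -> [2, 3]
```

    This reproduces the expected outcome from the post: requests 4-10 survive as the latest seven, request 1 survives because it is compressed, and only 2 and 3 are deleted.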

  • Urgent: Compression process in Process chain failed

    Hello Experts,
    In one of the process chains, Compression process of a cube has failed. The status in the data target for the last two requests is as follows:
    Compression status in Cube : unflagged
    Compression status in aggregate : flagged
    Roll up of aggregate : flagged.
    Now, can I repeat the compression process to resolve this, or should I take any other steps?
    Please reply.
    It's my duty to assign the points.
    thanks in advance,
    Raj

    Hi Lakshman and Henry,
    The display message says:
    collapse terminated, system error: condense facttable 5 error 30036. ORA-30036
    As per your reply I repeated the step, but it failed again saying, "LOCK IN LOCK MANAGER COULD NOT BE SET"
    Lock in lock manager could not be set -> long text
    Message no. RSENQ009
    Diagnosis
    The lock in the lock manager could not be set. The system is trying to set the lock in Include LRSENQF00 with the form set_enqueue.
    Name of lock table: RSENQ_PROT_ENQ.
    An attempt was made to lock the following object:
    User/Object = USER/CUBE NAME
    Object_typ/Action = DATATARGET/COND_DTA
    Subobject = COND_DTA
    SubSubObject = CUBE NAME
    Could you please help me resolve this?
    thanks in advance,
    raj
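    The RSENQ009 message above usually just means another process (a rollup, attribute change run, or another compression) still holds the data-target lock, and the standard cure is to wait and repeat the step. The behaviour can be sketched as a generic retry loop - the run_compression callable here is a placeholder, not a real SAP API:

```python
import time

# Retry a step that can fail transiently on a lock conflict.
def repeat_until_unlocked(run_compression, attempts=5, wait_seconds=300):
    """run_compression returns (ok, message). Retries only lock errors."""
    for _ in range(attempts):
        ok, message = run_compression()
        if ok:
            return True
        if "lock" not in message.lower():
            raise RuntimeError(message)  # not a lock problem - investigate
        time.sleep(wait_seconds)         # let the locking process finish
    return False
```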

  • Cube Compression & Process Chains

    Hello Friends
    Few Questions as I am a beginner.
    1) What is the entire concept behind cube compression? Why is it preferred for delta uploads and not for full uploads?
    2) What do we mean by deleting and creating indexes using process chains?
    3) What is meant by the process chain step "DB Statistics Refresh"? Why do we need it?
    Any help is appreciated. Points will be generously assigned.
    Thanks and Regards
    Rishi

    Hello Rishi,
    As you may know, an InfoCube consists of fact tables and dimension tables. The fact table holds all key figures and the corresponding dimension keys; the dimension tables map dimension keys to InfoObject values.
    Now, there is not just one fact table but two - the F table and the E table. The difference from a technical point of view is just one InfoObject: 0REQID, the request number. This InfoObject is missing in the E table. As a result, different records in the F table can be aggregated into one record in the E table if they have the same key but were loaded by different requests.
    As you may know, you can delete any request from an InfoCube by selecting its request number. And here is the disadvantage of the E table: as there is no request number, you cannot delete a request from this table.
    When data is loaded into an InfoCube it is stored in the F table. By compressing the InfoCube, records are moved into the E table. Because of this disadvantage of the E table, you can define per InfoCube whether and when data is moved.
    More information can be found here: http://help.sap.com/saphelp_nw70/helpdata/en/ca/aa6437e7a4080ee10000009b38f842/frameset.htm
    An index is a database mechanism to accelerate access to single records within a table. In BW, indexes are used to increase reporting speed.
    Whenever data in a table is added or deleted - in our case, loaded - the index has to be modified. Depending on the amount of changes to the table, it can be less time-consuming to delete the index, load without an existing index, and rebuild the index afterwards. This can be done in process chains.
    DB statistics are something specific to an Oracle database. As far as I know (I do not work with Oracle), they are used by the optimizer for the SQL commands that BW reports need.
    I hope that these explanations are helpful.
    Kind regards,
    Stefan
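    Stefan's E/F table explanation in miniature: compression discards the request id and sums key figures for rows that then share the same key. A toy sketch (illustrative Python, with made-up row data - real fact tables obviously hold dimension keys, not names):

```python
from collections import defaultdict

# Toy F-table rows: (request_id, dimension_key, key_figure)
f_table = [
    (100, "MAT_A", 10.0),
    (101, "MAT_A", 5.0),   # same key as above once the request id is gone
    (101, "MAT_B", 7.0),
]

def compress(rows):
    """Drop the request id (0REQID) and aggregate - the move from F to E."""
    e_table = defaultdict(float)
    for _req, key, amount in rows:
        e_table[key] += amount
    return dict(e_table)

print(compress(f_table))  # -> {'MAT_A': 15.0, 'MAT_B': 7.0}
```

    This is also why a compressed request can no longer be deleted by request id: after the aggregation there is simply no request id left to select on.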

  • Cube compression process chain creating

    Hi,
    I want to create Cube compression process chain so that it will automate the process to compress the cube.
    Could any please provide me the steps to create Process chain to automate the cube compression.
    Regards,
    Prem.
    Please search the forum
    Edited by: Pravender on Aug 27, 2010 11:54 AM

    Hi,
    You can modify your process chain in RSPC.
    Get into change mode in the planning view and, on the left side, check the different process types; there you will find the compression process.
    Drag it to the right-hand section, create a variant for it, and give the cube name. Once done, put it after the cube load process (if there is a rollup, it should come after the rollup; if not, after the cube load process).
    Once all changes are done, activate the chain and collect it in a transport to move the changes to the next level.
    Hope this helps.
    Murali

  • Compress aggregates via a process chain

    I am rolling up aggregates but not compressing the aggregate. I would like to do this in a process chain for anything older than 10 days. Can anyone advise on how I can do this?
    Thanks

    S B Deodhar wrote:hi,
    Thanks for the input.
    >
    > My scenario is this:
    >
    > We have had to drop and reload the contents of a cube because it was quicker than dropping specific requests that had been rolled up and compressed
    I guess you can't delete a request which is already compressed; the system doesn't allow you to delete those requests. Regarding a problematic request, you can do a selective deletion if required.
    >
    > So what I would like to know is as follows:
    >
    > 1. Can I only compress an aggregate at the same time as I carry out a rollup, i.e. if I do not check the compress flag for a request at the time I roll up, am I not able to compress that request going forward?
    > 2. If I choose to compress data in the cube, is it specifically the cube, or will it also take into consideration the compression of requests in aggregates which are not compressed?
    If an InfoCube is compressed, then keeping the aggregates uncompressed won't help you, as the request ID will be lost.
    Also, you can try Collapse -> select the radio button 'Calculate Request IDs' -> 'Only compress those requests that are older than' a certain number of days. Please note that it will compress that request and all the requests below it.
    hope it helps
    regards
    laksh

  • Regarding "Delete Index" process in the process chain.

    Hi Gurus
    In the process chain, I have Delete Index -> Load InfoPackage -> Create Index, in that order.
    I am loading only a few records, so there is no need to delete the indexes every time. Can I remove the "Delete Index" process from the chain without deleting the Create Index process, or do I have to delete both?
    Could you please clarify my doubt if possible in detail.
    Thanks,
    Regards,
    aarthi

    With the numbers you provided, you probably don't need those steps. With Oracle (not sure about all the other DB flavors) there are some other considerations:
    Not having the indexes when loading the data can improve the load time, since bitmap indexes are not very efficient with respect to inserts/updates. The more dimensions you have, the more indexes there will be, and the more noticeable the impact of having the indexes present when loading.
    The drop index process only drops the F fact table indexes. If you compress your cubes regularly, so that you don't have many uncompressed requests, the index rebuild time will remain small; but if you have many uncompressed requests in your cube, the index rebuild time can begin to exceed whatever time you might save on the load by deleting the indexes.
    With bitmap indexes present, you can also occasionally receive an ORA-00060 deadlock error during the load process; that can be prevented by dropping the indexes before loading, or by choosing the load option 'packet by packet' so that two packets are not trying to update the same index block at the same time.
    Another concern, in shops where reporting on the cube might occur during the load process: if you drop indexes prior to a load, anyone trying to run a query on the cube could see poor query performance, since all the indexes on the F fact table will be missing. Again, this becomes more apparent the more data you have in the uncompressed fact table.
    So it really comes down to your environment, but certainly drop the indexes any time you have large loads.
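    The trade-off described above can be phrased as a rough rule of thumb. The thresholds below are invented purely for illustration (they would have to be tuned per system): a big load favours dropping the F-table indexes, while a large uncompressed F table makes the rebuild step expensive.

```python
# Sketch of the drop-indexes-before-load decision, with made-up thresholds.
def should_drop_indexes(load_rows, uncompressed_rows,
                        big_load=1_000_000, costly_rebuild=50_000_000):
    """Return True if dropping F-table indexes before the load seems worth it."""
    if uncompressed_rows >= costly_rebuild:
        return False   # index rebuild would likely cost more than we save
    return load_rows >= big_load

print(should_drop_indexes(5_000_000, 2_000_000))   # -> True
print(should_drop_indexes(5_000_000, 80_000_000))  # -> False
```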

  • Questions on process chains

    Hi all,
             I have two questions on process chains.
    1. I created a process chain for master data and a process chain for data loads (with ODS and cube). As the master data has to be loaded before the transaction data, can I include the master data process chain as a local process chain right after the start process, so that the system proceeds to the transaction data only after the master data is loaded?
    2. I designed a process chain with aggregation and compression of cube data. I forgot to check the option 'delete overlapping requests' in the InfoPackage that loads the InfoCube. I ran the process chain and now there are two requests with duplicate documents. I want to delete the recent request, check the 'delete overlapping requests' option in the InfoPackage, and rerun. The problem is that the system is not allowing me to delete the recent request.
    I appreciate any kind of help.
    Thanks.

    Hi Bhanu,
                Thanks for the reply. I am scheduling the request for deletion in Manage InfoCube, but the request is not getting deleted. Every time I click refresh, the delete icon disappears and the screen goes back to how it was before scheduling the deletion. I checked after some time, and it's still the same...
    I wonder, can collapsed requests not be deleted?

  • Error while running the process chain

    Hi, I am getting following error while running the process chain.
    "Lock NOT set for: Deleting the data completely from a data target"
    Please suggest.
    amit

    hi,
    This is due to a lock on the objects which are being used by your load.
    The master data used by the load may also be locked by an attribute change run; wait until that attribute change run finishes and then repeat.
    If the target itself is locked by some other process, like a rollup or compression, this error would also appear. In that case too, repeat the step after the locking process finishes.
    Some general locks during loading:
    Sometimes parallel processing in the DTP can lead to this; it is better to set the DTP and InfoPackage to serial processing. In this case, change the settings and then repeat the process.
    There is a specific case in which a serial DTP gets locked by itself; in this case, repeat it until it succeeds.
    You can identify that scenario as follows (applicable only if the DTP's processing mode is serial):
    if you look in SM37, the load will have three DTP processes running,
    when it should have only two.
    In this case, cancel the process and repeat it until you get it right.
    hope this helps u
    regard,
    ranjith
    Edited by: Ranjith  Kumar Murugesan on Sep 12, 2008 10:44 AM
    Edited by: Ranjith  Kumar Murugesan on Sep 12, 2008 10:46 AM

  • Error while running FICO process chain

    Hi Experts,
    While monitoring the FICO-related process chain, I got an error due to which the loading in the process chain stopped.
    The error occurred at the DTP level, and the displayed error messages are like this:
    Aggregation behaviour of InfoObjects could not be read ()
    Operation could not be carried out for
    Exception xxxxxxxxxx locked.
    Can anyone help me figure out how to fix this problem?
    Can any one help me out of how to fix this problem
    Waiting for reply
    Thanks.
    Regards,
    Mohan

    hi,
    This is due to a lock on the objects which are being used by your load.
    The master data used by the load may also be locked by an attribute change run; wait until that attribute change run finishes and then repeat.
    If the target itself is locked by some other process, like a rollup or compression, this error would also appear. In that case too, repeat the step after the locking process finishes.
    Some general locks during loading:
    Sometimes parallel processing in the DTP can lead to this; it is better to set the DTP and InfoPackage to serial processing. In this case, change the settings and then repeat the process.
    There is a specific case in which a serial DTP gets locked by itself; in this case, repeat it until it succeeds.
    You can identify that scenario as follows (applicable only if the DTP's processing mode is serial):
    if you look in SM37, the load will have three DTP processes running,
    when it should have only two.
    In this case, cancel the process and repeat it until you get it right.
    hope this helps u
    regard,
    ranjith
    Edited by: Ranjith  Kumar Murugesan on Sep 12, 2008 10:44 AM
    Edited by: Ranjith  Kumar Murugesan on Sep 12, 2008 10:46 AM

  • Error msg in process chain

    hi guys,
    I appended a process chain with the process type Compress; it checks and activates fine, but while running it shows the error "Job BI_PROCESS_LOADING could not be activated
    (return code 8)". Kindly let me know if this is a known error.
    Regards
    Kishore

    Hi,
    This error code 8 comes when there is an error while scheduling a job. Check the background user that schedules the job and its properties. Also look into the error log and see what it says. The background user should have the USER TYPE set to SYSTEM.
    Hope it helps. Assign points if helpful.
    thanks
    Laura.

  • Error in process chain variant when executing BI IP planning sequence

    Hi All,
    I am trying to execute a planning sequence as a background job using a process chain. One of the variants failed, giving the error: Overflow occurred when calculating <Key figure name> with value 6.5231044352058409E18+.
    I tried executing that process chain variant separately using the ABAP program RSPLS_PLSEQ_EXECUTE as a background job, but it is still not working.
    Finally I tried executing the individual planning step in the planning modeler, but got the same error again.
    Previously such a case was resolved by compressing the planning cube requests and executing the failed variant using the ABAP program, but this time even compression is not helping.
    Can anyone please help on this issue ?

    Hi Andrey,
    This key figure is created with the data type CURR (stored as a decimal) for amounts. The length of CURR is maintained by SAP as 17 with 2 decimal places, so do you mean that I have to switch to another data type to hold the large value that is being generated?
    My concern is: the error occurs for just one of the variants of the chain and not for the others. If it had something to do with the key figure's data type, then surely all the other variants would have thrown the same error.
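    A quick sanity check on the numbers here: a CURR field of length 17 with 2 decimals leaves 15 integer digits, so its ceiling is 999,999,999,999,999.99. The value in the error message is several thousand times beyond that, which is consistent with the overflow regardless of which variant produced it. Checked in Python:

```python
from decimal import Decimal

# Maximum value of a DDIC CURR field of length 17 with 2 decimal places:
# 15 integer digits plus .99.
curr_max = Decimal("9" * 15 + ".99")             # 999999999999999.99
failing = Decimal("6.5231044352058409E18")       # value from the error message

print(failing > curr_max)      # -> True
print(failing / curr_max)      # roughly how far over the ceiling it is
```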

  • Process type in Process chain

    HI all;
    I want to cancel a process type if it takes, let's say, more than 2 hours. The reason is that sometimes the rollup job doesn't finish and just hangs forever - more than 12 hours - and we have to cancel the job manually in the morning when we come to the office. This affects the runstats job that we run after the process chain completes.
    So, is there any setting that would automatically cancel a process type - say, the rollup for a cube - if it doesn't finish within 2 hours?
    thank you.

    Bhanu,
    they are SEM cubes, and there are several situations where the rollup gets stuck - for example, if some users do planning in the same cube, or locks, etc. We have tried to tell all the planners not to do any activities during the daily process chain window. We do a rollup and compress every day for the SEM cubes.
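    The behaviour being asked for is a watchdog: poll the running job and cancel it once it exceeds a time limit. Sketched generically below - the is_finished and cancel_job callables are placeholders for whatever job-monitoring mechanism is actually used, not real SAP calls:

```python
import time

# Cancel a long-running job once it exceeds a time limit (two hours here).
def watchdog(is_finished, cancel_job, limit_seconds=2 * 3600, poll=60):
    """Poll until the job finishes; cancel it if the limit is exceeded."""
    start = time.monotonic()
    while not is_finished():
        if time.monotonic() - start > limit_seconds:
            cancel_job()
            return "cancelled"
        time.sleep(poll)   # check again after the polling interval
    return "finished"
```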

  • Process dependencies in process chain

    Hi All,
    in one of my process chains I have, at the end:
    delete index -> Load DTP -> Create index -> Compress cube -> generate DB stats
    but the system is showing this warning:
    A type "Construct Database Statistics" process cannot follow process "Compression of the InfoCube"
    and in table RSPCTYPESDEP there is an entry stating that
    generating DB stats after a compression of the cube "makes no sense but is not completely wrong".
    Can you explain why generating DB stats after a compress makes no sense?
    Regards.

    Hi Anns,
    It is always recommended to put the 'create DB stats' step before compression. The advantage of DB stats is improved query performance as well as loading performance.
    check the below links for more information
    /thread/2026714 [original link is broken]
    http://help.sap.com/saphelp_nw04/helpdata/en/d5/e80d3dbd82f72ce10000000a114084/content.htm
    Regards,
    Venkatesh
