Aggregates after compression

hi,
here is my scenario: I have a cube with data from 2010, 2011, 2012, 2013 and 2014. Every time we load data into the cube, it is compressed after some time. The client has asked me to create an aggregate on the 2012 data. How can we create aggregates in this scenario?
Thanks,
Ramesh.

Hi
Create an aggregate on calyear: expand the dimension of that node, select the characteristic calyear, then right-click on it, choose "Fixed value" and select 2012. Then activate the aggregate.
For more clarity, see the attachment below.
Hope you got it,
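
A small conceptual sketch may help picture what that aggregate ends up holding (plain Python with made-up rows, not SAP code): an aggregate with a fixed value of 2012 on calyear is essentially a pre-summarized copy of the cube restricted to that year, which queries on 2012 can then read instead of the full cube.

    # Conceptual illustration only: hypothetical fact rows, not an SAP API.
    from collections import defaultdict

    # Pretend fact rows: (calyear, material, amount)
    fact_rows = [
        (2010, "M1", 100), (2012, "M1", 40), (2012, "M1", 60),
        (2012, "M2", 25), (2014, "M2", 75),
    ]

    def build_aggregate(rows, fixed_year):
        """Sum the key figure per material, keeping only the fixed calyear value."""
        agg = defaultdict(int)
        for year, material, amount in rows:
            if year == fixed_year:
                agg[material] += amount
        return dict(agg)

    print(build_aggregate(fact_rows, 2012))   # {'M1': 100, 'M2': 25}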

Similar Messages

  • "get all new data request by request" after compressing source Cube

    Hi
    I need to transfer data from one InfoCube to another and use the delta request by request.
    I tried this when the data in the source InfoCube was not compressed, and it worked.
    Afterwards some requests were compressed, and now the delta request by request transfers all the information to the target InfoCube in only one request.
    Do you know if this is normal behavior?
    Thanks in advance

    Hi
    The objective of compression is to delete all the requests in your F table and move the data to the E table; after compression you no longer have request-by-request data.
    This is the reason you are getting all the data in a single request.
    "Get data request by request" works only if you do not compress the data in your cube (see the sketch after this reply).
    If you want to know more about compression, check the link below:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c035d300-b477-2d10-0c92-f858f7f1b575?QuickLink=index&overridelayout=true
    Regards,
    Venkatesh.
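
    A rough sketch of why this happens (illustrative Python with made-up rows, not SAP code): the F table keeps rows per request ID, and compression moves everything into the E table under the single request ID 0, summing duplicates, so nothing is left to extract request by request.

        # Hypothetical sketch of F-table vs. E-table behaviour; not SAP code.
        from collections import defaultdict

        # F table: key = (request_id, characteristic value), value = key figure
        f_table = {
            (1, "2012"): 10,
            (2, "2012"): 5,
            (3, "2013"): 7,
        }

        def compress(f_rows):
            """Move all requests to the E table: request id becomes 0, duplicates are summed."""
            e_table = defaultdict(int)
            for (_request_id, char_value), key_figure in f_rows.items():
                e_table[(0, char_value)] += key_figure
            return dict(e_table)

        print(compress(f_table))   # {(0, '2012'): 15, (0, '2013'): 7}
        # Only request id 0 remains, so a "request by request" extraction can
        # hand the data over in one request at most.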

  • Bad reporting performance after compressing infocubes

    Hi,
    As I learned, we should compress requests in our InfoCubes. And since we're using Oracle 9.2.0.7 as the database, we can use partitioning on the E fact table to further increase reporting performance. So far all theory...
    After getting complaints about worse reporting performance, we tested this theory. I created four InfoCubes (same data model):
    A - no compression
    B - compression, but no partitioning
    C - compression, one partition for each year
    D - compression, one partition for each month
    After loading 135 requests and compressing the cubes, we get this amount of data:
    15.6 million records in each cube
    Cube A: 135 partitions (one per request)
    Cube B:   1 partition
    Cube C:   8 partitions
    Cube D:  62 partitions
    Now I copied one query onto each cube and tested the performance with it (transaction RSRT, without aggregates and cache, comparing the database times QTIMEDB and DMTDBBASIC). In the query I always selected one month, some hierarchy nodes and one branch.
    With this selection on each cube, I expected that cube D would be fastest, since we only have one (small) partition with relevant data. But reality shows a different picture:
    Cube A is fastest with an avg. time of 8.15, followed by cube B (8.75, +8%), cube C (10.14, +24%) and finally cube D (26.75, +228%).
    Does anyone have an idea what's going wrong? Are there some DB parameters to "activate" the partitioning for the optimizer? Or do we have to do some other customizing?
    Thanks for your replies,
    Knut

    Hi Björn,
    thanks for your hints.
    1. After compressing the cubes I refreshed the statistics in the InfoCube administration.
    2. Cube C is partitioned using 0CALMONTH, cube D is partitioned using 0FISCPER.
    3. Here we are: all queries are filtered using 0FISCPER. Therefore I could increase the performance on cube C, but still not on D. I will change the query on cube C and retest at the end of this week.
    4. The loaded data comes from 10 months. The records are nearly equally distributed over these 10 months.
    5. Partitioning was done for the period 01.2005 - 14.2009 (01.2005 - 12.2009 on cube C). So I have 5 years - the 8 partitions on cube C are the result of a slight miscalculation on my side: 5 years + 1 partition before + 1 partition after => I set the max. no. of partitions to 7, not thinking of BI, which always adds one partition for the data after the requested period... So each partition on cube C does not contain one full year but roughly 8 months.
    6. Since I tested the cubes one after another without much time in between, the system load should be nearly the same (on top of that, it was a Friday afternoon...). Our BI is clustered with several other SAP installations on a big Unix server, so I cannot see the overall system load. But I did several runs with each query, and the quoted times are averages over all runs - and the average shows the same picture as the single runs (cube A is always fastest, cube D always the worst).
    Any further ideas?
    Greets,
    Knut
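
    One hedged way to picture what Knut is seeing (illustrative Python only, no Oracle specifics, made-up rows): partition pruning can only skip partitions when the query filters on the column the table is partitioned by; filtering on a different characteristic forces every partition to be read, and many small partitions can then cost more than a few large ones.

        # Toy model of partition pruning; the rows and column names are hypothetical.
        partitions = {                  # cube partitioned by calendar year
            "2005": [{"calyear": "2005", "fiscper": "001.2005", "amount": 10}],
            "2006": [{"calyear": "2006", "fiscper": "004.2006", "amount": 20}],
            "2007": [{"calyear": "2007", "fiscper": "007.2007", "amount": 30}],
        }

        def query(filter_col, filter_val, partition_col="calyear"):
            """Return (partitions scanned, matching rows); prune only on the partition key."""
            scanned, hits = 0, []
            for key, rows in partitions.items():
                if filter_col == partition_col and key != filter_val:
                    continue            # partition pruned away, never read
                scanned += 1            # partition has to be read
                hits += [r for r in rows if r[filter_col] == filter_val]
            return scanned, hits

        print(query("calyear", "2006"))      # 1 partition touched
        print(query("fiscper", "004.2006"))  # all 3 partitions scanned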

  • How to delete the duplicate requests in a cube after compression.

    Hi experts,
        1. How do I delete duplicate requests in a cube after compression?
        2. How do I show a characteristic and a key figure side by side in a BEx query output?
    Regards,
    Nishuv.

    Hi,
    You cannot delete the request once it is compressed, as all the data will have been moved to the E table.
    If you have duplicate records you can use selective deletion.
    Check this thread:
    How to delete duplicate data from compressed requests?
    Regards,
    shikha

  • Incorrect results after compressing a non-cumulative InfoCube

    Hi Gurus,
    In BI 7.0, after compressing the non-cumulative InfoCube, it is showing incorrect reference points. 2LIS_03_BF (stock movements) is showing as the reference points (opening balance) after compressing with no marker update. Due to this it is showing incorrect results in reporting. Please suggest.
    Thanks
    Naveen

    Hi Nirajan,
    First of all, as I understood it, 2LIS_03_BX is the initial upload of stocks, so there is no need for a delta load for this DataSource; it collects data from the MARC and MARD tables when running the stock setup in R/3, and you only have to load it once.
    If between the delta loads of 2LIS_03_BF you're loading full updates, you are duplicating material movements data. The idea of compression with marker update is that these movements affect the stock value in the query; that is why, when you load the delta init, you compress it without marker update: those movements are already contained in the opening stock loaded with 2LIS_03_BX, so you don't want them to affect the stock calculation.
    You can refer to the "How to Handle Inventory Management Scenarios in BW" paper for more detail on the topic.
    I hope this helps,
    Regards,
    Carlos.
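
    A rough, non-SAP sketch of the marker idea Carlos describes (hypothetical quantities, illustrative Python only): the reference point holds the current stock set by the 2LIS_03_BX opening balance, historical movements from the 2LIS_03_BF init are compressed without marker update so they do not change it, and loading that same history with marker update would double-count the stock.

        # Hypothetical sketch of the non-cumulative marker ("reference point"); not SAP code.
        class StockCube:
            def __init__(self):
                self.marker = 0        # reference point: current stock level
                self.movements = []    # (day, quantity) rows kept for historical queries

            def load_opening_balance(self, qty):
                # 2LIS_03_BX-style load: sets the reference point directly.
                self.marker = qty

            def compress_movements(self, rows, marker_update=True):
                # 2LIS_03_BF-style load; compressing WITH marker update also adds
                # the quantities to the reference point.
                self.movements += rows
                if marker_update:
                    self.marker += sum(qty for _day, qty in rows)

        cube = StockCube()
        cube.load_opening_balance(100)       # stock today is 100
        # The historical movements are already inside that 100 -> no marker update.
        cube.compress_movements([(1, +60), (2, +40)], marker_update=False)
        print(cube.marker)                   # 100, the correct current stock
        # With marker_update=True the marker would become 200 and reporting would be wrong.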

  • ASM space increased after compression of tables

    Hi all,
    I have compressed some of my huge tables in a data warehouse database. The table sizes are reduced after compression, while the ASM space usage has increased.
    The database is 10.2.0.4 (64-bit) and the OS is AIX 5.3 (64-bit).

    I have now checked the tablespaces of the compressed tables, and they show huge free space:
    Tablespace size in GB    Free space in GB
    658                      513
    682                      546
    958                      767
    686                      551

  • Delete a particular request from an InfoCube after compression in SAP BI 7

    HI ,
    Issue: I want to delete a particular request from an InfoCube after compression in SAP BI 7.0 (not in SAP BW 3.x).
    Can anyone suggest how to do this? Please give me the possible solutions.
    Regards,
    EdK...

    Hi,
    You can delete the particular request from the InfoCube by using table 'RSICCONT'.
    Go to SE16, enter 'RSICCONT', then select your cube and execute.
    It will display the list of requests; you can select the particular request and delete it from the cube.

  • Video has noise after compression

    Hi, does anyone know why my videos are showing up wavy after compression is complete?
    The following is what's going on:
    Video files were originally imported to FCPX at 1920x1080p. I then realized I should have imported them at 720p, per what has been suggested for exporting videos for web use. I figured I could merely change the settings in compression, so I selected the Export > Send to Compressor option. I have my video set for 640x360. Other information I have is size 1450 bytes, 100 milliseconds duration. The video frame rate is currently 29.97 fps and the key frame interval is 30. I've been teaching myself FCPX and digital editing on the job, so a lot of this is a foreign language to me.
    My video's resolution looks fine before I send it to the Compressor, but after the Compressor successfully completes the job, my video wiggles and waves.
    Suggestions?
    Thanks!

    By web use, do you mean uploading to a site like YouTube? If so, there are web sharing presets in Compressor that work well for sites like Vimeo and YouTube. Export as a QuickTime movie and then bring that file into Compressor and use the presets.
    If not, describe how you're going to use the video.
    Russ

  • Is it possible to delete the request after compression

    Hi,
    Is it possible to delete a request after compression? If yes, how can I delete the request?

    Hi,
    You cannot delete the request individually, but you can do one of the following:
    1. Do a selective deletion from the cube. RSA1 -> Cube -> Contents -> selective deletion.
    2. Delete all the data in the cube and then reconstruct only the required request ids. This would work only if you have the PSA available for all the requests.
    3. Reverse posting is another possibility.
    Bye
    Dinesh

  • Getting A PAL error in DVD studio after compression

    OK, so I have no idea what is going on here, as I have never run into this problem.
    I have a client who brought in an older Sony Cybershot camera. Since the camera has no output, I pulled the memory stick and brought the footage into my computer from that.
    Everything went fine up to and including compression. I have tried this multiple times on multiple machines and get the same result. After compression, while completing the DVD process, when I import the video into assets in DVD Studio I get an error.
    "PAL assets cannot be imported into NTSC projects."
    The client assures me that the camera and memory stick were purchased here in the US.
    Why am I being told that this is PAL video after compression?
    I'm also posting this in the DVD Studio Pro forum; I figured this one gets more views.

    Unfortunately the video format of older Cybershot cameras is neither PAL nor NTSC, but rather something called MPEG VX. It's 640 x 480 and only runs at 15 fps.
    DVD Studio is probably getting confused by the frame rate and throwing an error because the software is expecting 30 fps (29.97).
    JC

  • Space left on DVD after compression

    Workflow:
    1.Edit project in FCP7
    2. Export to QuickTime (not self-contained)
    3. Use Bit Budget to create compression settings, maxed out at average 6.8, maximum 7.8
    4. Drag to compressor, change settings to match above, 7.8 changes to 8
    5. After compression build in DVDSP, burn in Toast
    6. Toast says there is 1.2 GB left on the DVD.
    Question: Why wouldn't there be less compression? The video looks OK, but with 1.2 GB left on the DVD it seems there could be less compression for a better-looking DVD.
    Thanks in advance for all tips and help.

    As hanumang said, there are limits on the rates. If your movie is one minute long, there will be a ton of room left.
    http://dvdstepbystep.com/faqs_3.php
    The maximum rate that a DVD should output is 10.08 Mbps (page 43, DVD SP 4.1.2 PDF manual). Of the 10.08, the maximum rate for the video stream is 9.8 Mbps. Video, audio and subtitle streams all count toward the maximum of 10.08 Mbps (page 44 of the PDF). Note that some players do not handle the theoretical rates (meaning the rates they are supposed to be able to handle) well, more so on discs burned from your computer. This rate is not related to how long or short a track is; it relates to how much data is being put out by the DVD.
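
    For context, the bit-budget arithmetic behind those numbers is simple to sketch (illustrative Python; the constants are approximate and the overhead figure is an assumption): divide the space you want the video to fill by the running time, cap it at the roughly 9.8 Mbps video ceiling, and anything you do not spend simply stays free on the disc, like the 1.2 GB Toast reports.

        # Rough bit-budget arithmetic; constants are approximate, overhead is assumed.
        DISC_CAPACITY_GB = 4.37        # usable capacity of a single-layer DVD-5
        MAX_VIDEO_MBPS   = 9.8         # video-stream ceiling from the DVD spec

        def average_video_rate(minutes, audio_mbps=0.192, overhead=0.07):
            """Average video bit rate (Mbps) that would fill the disc for a given running time."""
            seconds = minutes * 60
            budget_mbit = DISC_CAPACITY_GB * 8 * 1024 * (1 - overhead)  # GB -> Mbit, minus overhead
            rate = budget_mbit / seconds - audio_mbps
            return min(rate, MAX_VIDEO_MBPS)

        # A 60-minute programme could average roughly 9 Mbps and still fit,
        # so encoding at an average of 6.8 naturally leaves space on the disc.
        print(round(average_video_rate(60), 2))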

  • Final video pixelated after compression

    Hello,
    I am sort of new to this but seem to have found a system that works, but for some reason my latest video is pixelated after compression for Vimeo.
    I am following the exact same steps as I do for all of my videos and I have never had a problem.
    I shot it on a Canon 5D Mark III / 1920x1080 / 25 fps.
    I convert it to Apple ProRes 422 for progressive material.
    I drag it into Final Cut.
    I export it as a QuickTime file.
    I then drag it into Compressor and use the Vimeo-recommended settings that I use on all of my videos.
    The final video is very pixelated.
    I will attach the process in screen shots below and hopefully someone can point out where I am going wrong.
    Thank you!

    It is kind of hard to tell what is going on from your screen shots, but a couple of things:
    It's been a while since I used that version of Compressor, but you are resizing (correctly) the 1080 source to 720, and I believe there are settings for "Best" available to make the resize as high quality as possible.
    The 5 Mbit/s data rate seems low to me. The Vimeo guidelines specify a rate between 5 and 10 Mbit/s for 720p; I'd set it to average at 8 and peak at 10. The scene with the ocean is a really tough scene to compress, because there is so much motion throughout the frame that the encoder might be lowering the quality there to keep to the 5 Mbit/s target.
    Compression works by finding areas of the image where there is little or no difference from frame to frame - so a locked-off shot of a person giving a speech at a lectern is much easier to compress than a scene with an ocean where there is movement and change on a frame-by-frame basis.
    MtD
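
    To make the frame-to-frame point concrete, here is a toy sketch (made-up pixel values in plain Python; not how Compressor works internally): the fewer pixels that change between frames, the less new information the encoder has to spend its data rate on.

        # Toy illustration of temporal compression using 1-D "frames"; not a real codec.
        def changed_fraction(prev_frame, next_frame, threshold=5):
            """Fraction of pixels that differ enough to need re-encoding."""
            changed = sum(1 for a, b in zip(prev_frame, next_frame) if abs(a - b) > threshold)
            return changed / len(prev_frame)

        lectern_a = [120, 121, 119, 120, 118, 120, 121, 119]   # static talking-head shot
        lectern_b = [120, 122, 119, 121, 118, 120, 120, 119]   # almost nothing moves

        ocean_a = [40, 90, 200, 60, 150, 30, 180, 70]          # waves: every pixel moves
        ocean_b = [90, 200, 60, 150, 30, 180, 70, 40]

        print(changed_fraction(lectern_a, lectern_b))  # 0.0 -> easy at 5 Mbit/s
        print(changed_fraction(ocean_a, ocean_b))      # 1.0 -> needs a higher data rate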

  • 0IC_C03 issue: after compression the data is still in the F table - what is happening?

    Dears,
      After I performed compression on the InfoCube 0IC_C03, all the queries on this InfoCube either don't execute or run with horribly low performance.
      Any suggestions are appreciated.
    B.R
    Gerald

    Hi Gerald,
    I think there is no connection between the compression of requests and the low performance of the queries. In fact, as far as I know, queries run faster after compression. Do some other checks to get better performance.
    Regards,
    Krish

  • How can we delete a request after compression? Is it possible? If so, how?

    How can we delete a request after compression? Is it possible? If so, how?

    Hi,
    You basically have 3 options:
    1. Use selective deletion and delete the error records.
    2. Do reverse posting and negate the error records.
    3. This is my preference. Delete all the data from the cube: RSA1 -> cube -> right click -> delete data -> choose "fact and dim" from the pop-up. Now reconstruct all the requests that you need, i.e. ignore the error request. But before all this, make sure you have the PSAs for all the requests.
    Bye
    Dinesh

  • Create indices before or after compression

    Hello
    I usually delete and recreate indices before big uploads. Do you recommend creating indices before or after compression?

    After thinking about this a little more, I think Dinesh has got the sequence correct. I haven't mentally visited the issue in a while, since generally we don't drop the indices: we support a 24-hour query environment and like to keep the secondary indices in place.
    One thing you should take away from this thread is that for load performance you would drop the indices before a large load, load and compress the data, then build the indices. There is no point in dropping and rebuilding before the load, which is what I think your original post suggests you are doing. A rough outline of that ordering follows below.
    For some good info on this issue, and on when you might not want to drop indices, or use techniques to drop just the E fact table indices rather than both the E and F fact table indices, check note 407260.
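
    A minimal sketch of that ordering in plain Python pseudocode (the function names and cube name are placeholders for illustration, not SAP calls):

        # Placeholder sketch of the recommended ordering; the bodies only print what
        # the real steps would do, and none of these names are actual SAP APIs.
        def drop_secondary_indices(cube): print(f"drop secondary indices on {cube}")
        def load_requests(cube):          print(f"load requests into {cube} (index-free F table)")
        def compress_requests(cube):      print(f"compress {cube}: move requests from F table to E table")
        def rebuild_indices(cube):        print(f"rebuild indices on {cube} over the compressed data")

        def nightly_load(cube):
            drop_secondary_indices(cube)   # big loads run faster without index maintenance
            load_requests(cube)
            compress_requests(cube)        # compress first, so indices are built only once
            rebuild_indices(cube)

        nightly_load("ZSALES01")           # hypothetical cube name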
