Essbase cube/pag file size reduction-bitmap compression

We are seeing some huge pag files in our Essbase cube, and Oracle has suggested changing to bitmap compression to see if that will help. (Currently, we have 'no compression'.)
That said, what is the difference between using RLE versus bitmap encoding for BSO cubes?
Would like to hear others' experiences.

"(currently, we have 'no compression')" ^^^ You are going to be very happy. Very, very happy.
You can read the Database Administrator's Guide -- just search for "compression".
Here it is: http://download.oracle.com/docs/cd/E17236_01/epm.1112/esb_dbag/dstalloc.html
There are a bunch of caveats and rules re the "better" compression method -- having said that, I almost always use RLE, as it seems the most efficient with respect to disk space. This makes sense, because with RLE Essbase examines each block and applies the most efficient method for it: RLE, bitmap, or index value pair. I can't see a calculation-time difference between bitmap and RLE.
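To make that per-block choice concrete, here is a toy sketch in plain Python (my illustration, not Essbase's actual implementation) of why RLE wins on blocks dominated by repeated values such as #Missing or zeros, while bitmap always pays one presence bit per cell plus the stored values:

```python
def rle_encode(cells):
    """Run-length encode a block: list of (value, run_length) pairs."""
    runs = []
    for v in cells:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([v, 1])       # start a new run
    return [tuple(r) for r in runs]

# A toy 12-cell "block": mostly missing (None), a few real values.
block = [None, None, None, None, 7.5, None, None, None, 0.0, 0.0, 0.0, None]

print(rle_encode(block))  # [(None, 4), (7.5, 1), (None, 3), (0.0, 3), (None, 1)]

# Bitmap, by contrast, stores one presence bit per cell
# plus each non-missing value in full:
bitmap = [1 if v is not None else 0 for v in block]
values = [v for v in block if v is not None]
print(bitmap, values)
```

Five short runs versus twelve bits plus four values: the sparser and more repetitive the block, the better RLE does, which matches the DBAG's guidance.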
How on earth did you get into the position of no compression in the first place?
Regards,
Cameron Lackpour
Edited by: CL on Jul 18, 2011 10:48 AM
Whoops, a couple of things:
1) After you apply the change in compression in EAS or via MaxL, you must stop and then restart the database for the setting to take effect.
2) You must export the database, clear it out, and reload it using the new compression method -- the change in #1 only affects new blocks, so you could have a mix of compressed and uncompressed blocks in your db, which won't do anything for storage space.
3) The only way to really and truly know which method is more efficient -- for both calculation (think writing to disk, which could be a bit slower than it is now) and compression -- is to benchmark it. Try it as bitmap and write down sample calc times and IND and PAG file sizes, then try it as RLE and do the same.

Similar Messages

  • Essbase cube data file size

    Hi,
Why do I see different numbers for my data file size in EAS > database > edit > properties > storage versus a full database export?
    Thanks,
    KK

Tim, in all seriousness, I am not a stalker. Honestly. You just post about things that interest me, and we share some of the same skills. Alternatively: Glenn stalks me, I stalk you -- it's time for you to become a sociopath too and stalk someone else.
Okay, with that insanity out of the way, the other thing that could have a big impact on export size is if the OP did a level-zero or input-level export as opposed to a full export. In BSO databases in particular, that can lead to a very, very large set of .PAG (and to some extent .IND) files and a rather small export file, as the calculated values aren't getting written out.
    If the export is done through EAS I think the default is level zero.
    Regards,
    Cameron Lackpour
    Edited by: CL on Sep 23, 2011 2:38 PM
    Bugger, the OP wrote "full database export". Okay, never mind, I am a terrible stalker. I agree with everything Tim wrote re compression.
In an effort to add something useful: if you use the new-ish database archive feature in MaxL, you will essentially get .PAG files + .IND files combined into a single binary file. I used to be a big fan of full restructures, clears, and reloads to do defrags, but now go with the restructure command. Despite the fact that it's single-threaded, in my limited testing (only done it at one client) it was faster than the export-all, clear, and reload. If you combine that with the archive mode you might have a better approach.
    Regards,
    Cameron Lackpour

  • Index file increase with no corresponding increase in block numbers or Pag file size

    Hi All,
    Just wondering if anyone else has experienced this issue and/or can help explain why it is happening....
    I have a BSO cube fronted by a Hyperion Planning app, in version 11.1.2.1.000
The cube is in its infancy, but already contains 24M blocks, with a PAG file size of 12GB.  We expect this to grow fairly rapidly over the next 12 months or so.
After performing a simple Agg script that aggregates the sparse dimensions, the Index file sits at 1.6GB.
    When I then perform a dense restructure, the index file reduces to 0.6GB.  The PAG file remains around 12GB (a minor reduction of 0.4GB occurs).  The number of blocks remains exactly the same.
    If I then run the Agg script again, the number of blocks again remains exactly the same, the PAG file increases by about 0.4GB, but the index file size leaps back to 1.6GB.
    If I then immediately re-run the Agg script, the # blocks still remains the same, the PAG file increases marginally (less than 0.1GB) and the Index remains exactly the same at 1.6GB.
    Subsequent passes of the Agg script have the same effect - a slight increase in the PAG file only.
    Performing another dense restructure reverts the Index file to 0.6GB (exactly the same number of bytes as before).
    I have tried running the Aggs using parallel calcs, and also as in series (ie single thread) and get exactly the same results.
    I figured there must be some kind of fragmentation happening on the Index, but can't think of a way to prove it.  At all stages of the above test, the Average Clustering Ratio remains at 1.00, but I believe this just relates to the data, rather than the Index.
    After a bit of research, it seems older versions of Essbase used to suffer from this Index 'leakage', but that it was fixed way before 11.1.2.1. 
    I also found the following thread which indicates that the Index tags may be duplicated during a calc to allow a read of the data during the calc;
    http://www.network54.com/Forum/58296/thread/1038502076/1038565646/index+file+size+grows+with+same+data+-
    However, even if all the Index tags are duplicated, I would expect the maximum growth of the Index file to be 100%, right?  But I am getting more than 160% growth (1.6GB / 0.6GB).
And what I haven't mentioned is that I am only aggregating a subset of the database, as my Agg script fixes on only certain members of my non-aggregating sparse dimensions (i.e. only 1 Scenario & Version).
    The Index file growth in itself is not a problem.  But the knock-on effect is that calc times increase - if I run back-to-back Aggs as above, the 2nd Agg calc takes 20% longer than the 1st.  And with the expected growth of the model, this will likely get much worse.
    Anyone have any explanation as to what is occurring, and how to prevent it...?
    Happy to add any other details that might help with troubleshooting, but thought I'd see if I get any bites first.
    The only other thing I think worth pointing out at this stage is that we have made the cube Direct I/O for performance reasons. I don't have much prior exposure to Direct I/O so don't know whether this could be contributing to the problem.
    Thanks for reading.

    alan.d wrote:
    The only other thing I think worth pointing out at this stage is that we have made the cube Direct I/O for performance reasons. I don't have much prior exposure to Direct I/O so don't know whether this could be contributing to the problem.
    Thanks for reading.
I haven't tried Direct I/O for quite a while, but I never got it to work properly. Not exactly the same issue that you have, but it would spawn tons of .pag files in the past. You might try duplicating your cube, changing it to buffered I/O, running the same processes, and seeing if it does the same thing.
    Sabrina

  • RE: File Size Reduction by Hiding Layers

    I've come across the trick of hiding layers to reduce file size, however am wondering exactly what is happening.
    Here's my situation.
    20.6mb PSD
    Hiding the layers reduces the file to 13.1mb
    I've been told that the file reduction is because preview images are no longer being generated.
    Could preview images really be accounting for nearly 40% of the file size?
    To test this I saved this file as a JPG at 100% quality (not for web) and yielded a 1.34mb file.
    This leaves me with roughly 6.6mb of file reduction unaccounted for.
Is something else being silently compressed, compatibility settings not included, or other info being excluded from the file when layers are hidden?
    NOTE:
    I need to make sure my files have maximum compatibility for opening them in Lightroom and other programs that open PSDs like Painter - so compatibility settings are important.
I need to make sure that everything will still be set up properly for print and that no compression is occurring.
Could someone please tell me what exactly is excluded from a file to lower the file size 40% when hiding layers?
    Thanks!

    Similar, yes... but that still doesn't answer my question
    2400x1600px image
    only a background layer
    RGB 8bits/channel 72dpi
    19.2mb
    ...hide layer = 9.91mb
    .. white layer on top = 10.1mb (similar results as you said)
    DIFFERENCE: 9.29mb
    So, back to the question. Is it REALLY just the thumbnail/preview image taking up 50% of the file size in this case? Seriously? I would think that Adobe would be a bit more efficient with saving preview images. So...back to my earlier test to try and get a comparison of what a full size/full quality preview image would weigh, I've saved this image as:
    high res JPG (not for web) = 1.55mb
    compressed (layered) PNG = 3.64mb
    compressed (layered) TIFF = 3.99mb
    I also tried saving different combinations of layered/compressed versions of the PNG and TIFF files (although I think JPGs would be a more accurate representation of preview image file size). Uncompressed PNGs and TIFFs were far too large for the 9.29mb difference.
    These don't even come within a couple of mb of the file size reduction from hiding the layers....
    So, my question:
If hiding a layer is reducing the file size by supposedly rendering less pixel data to the preview images, then
    - why don't any of these "mock" preview image situations come close to accounting for the size in reduction?
      Even a layered lossless compressed TIFF still leaves an unexplained 5mb+  (9.29mb total file size reduction - 3.99mb TIFF)
I'm curious, and need this for a specific (yet recurring) situation at the workplace.
    It really seems like there's something else that is happening when you save a file with layers hidden.
Are there any other possible scenarios (generating multiple thumbnail sizes/sets, metadata based on pixel data, other information based on pixel data, silent compression, settings changing, etc.) that could possibly explain this?

  • Reduce pages file size while maintain picture resolution

How do you keep a Pages file size small -- under 10 MB -- while still keeping the pictures in the doc at high resolution?

    The two are rather mutually exclusive.
Quality, Number, Size -- pick 2.
    You have not told us how many images, how large in dimensions and whether you are going to commercial print or convert it to pdf and email/post it.
    So it is hard to know the target.
    You can trim away the unmasked portions of the images in Preview and bring them back into Pages.
You can ensure there is exactly the resolution you need at final size in the document: 150 dpi for a high-quality PDF for screen, or 300 dpi for print. Preview can adjust resolution.
    If this is meant for screen and not print you can resave the images as jpegs as lower quality in Preview. Be aware that this is a compromise, smaller size is lower quality and ultimately noticeable. The process is lossy.
    Vector (drawn) objects remain sharp and are very small compared to bitmap images, so use pdf or font graphics where possible.
    Peter
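Peter's dpi guidance boils down to simple arithmetic: pixels needed along a dimension = print size in inches × dpi. A quick sketch (generic Python, nothing Pages-specific):

```python
def required_pixels(inches, dpi):
    """Pixels needed along one dimension for a target resolution."""
    return round(inches * dpi)

# A 4 x 6 inch photo at 300 dpi for print vs. 150 dpi for a screen PDF:
print(required_pixels(4, 300), "x", required_pixels(6, 300))  # 1200 x 1800
print(required_pixels(4, 150), "x", required_pixels(6, 150))  # 600 x 900
```

Anything beyond those pixel counts is wasted file size at that output resolution, which is why downsampling before placing images shrinks the document without visible loss.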

  • Amazingly inefficient file size reduction

I have some Texinfo documents that are regularly updated and from which HTML and PDF documentation is then produced. The PDF as it comes from Texinfo is about 850 pages, contains a lot of small graphics, and is ~50Mb in size. When I use Acrobat (v9.5.1 running on Windows 7 x64) to reduce the file size, the process takes in excess of 12 hours to complete, and when I look at processes in Task Manager after the file size reduction has completed, I can see that the acrobat.exe process has read 61Gb and written 11Gb. At the end of the process, the PDF has been reduced from 50Mb to around 30Mb.
The machine on which Acrobat is running is not at all underpowered -- it has an Intel Core i7 CPU and 12Gb of RAM. Any suggestions on possibilities for speeding up the file size reduction? The excessively long processing time is not a one-off aberration -- I've run the file size reduction half a dozen times now over a few months, and it takes this long every time.

    Hi jamesfb,
You can do so by resizing your image. It will be under the More menu > Image Size.

  • Pages file sizes in Lion

    Anyone have any idea how big files will get using AutoSave/Versions in Lion or if there are any safeguards to prevent the file size from spiraling out of control?
    For example, I write short stories and novels with Pages (I used to use Word 2004 pre-Lion, but that's a Rosetta app), and the files I deal with are just text and formatting. I have one .DOC file from before my Pages adoption that is only 120K, and I have a .pages file that is much shorter in word count than the 120K .DOC file, but as I've been working on it for the past two weeks, that .pages file has grown to 236K, nearly twice the size of its longer .DOC counterpart.
    What worries me is when I end up working on the same file for a couple of years (think 150,000 words or more), how big is it going to be then? Does AutoSave/Versions act like Time Machine and only keep the most recent X months' worth of edits to keep the file at a reasonable size? Or am I going to end up with a text file that has ballooned up to 20 or 30 MB in size because of the several years' worth of versions it has kept that whole time?

    Hi Matt,
The problem is that a Pages 'file' isn't a file at all: it's a folder full of smaller files. Your Finder is probably set up NOT to display folder sizes (I believe this is the default, since calculating them all uses up system resources/CPU).
Go to Finder > View > Show View Options, then check the box at the bottom: 'Calculate all sizes'. If your Pages docs are all in one folder, you can limit this preference to 'this window only' and save the system from calculating folder sizes everywhere. The Finder will now display your Pages 'file' sizes (you may need to close the Finder window and reopen it).
    Hope this makes your yinyang feel better.

  • 16x9, 300ppi, 75mb tiff file, LR converts to 1mb jpg.  Export in LR 5 is being done at 100%, no file size reduction.

I've got a 16x9, 300ppi, 75mb tiff file that LR converts to a 1mb jpg.  Export in LR 5 is being done at 100%, with no file size reduction.  Can't figure out why it is downsizing so much.  Even upsized to 420ppi in PS, the export was still only 2mb. The stock agency wants a 3mb .jpg minimum. Any help appreciated.  Thanks.

    Using PS CS6 with NO changes applied to the original TIFF the JPEG file size is 8.686 MB. The slightly larger file size is due to metadata differences between LR and PS.
    Both Adobe applications (PS CS6 and LR 5.71) are producing near identical and much larger highest quality JPEG files. PS 12 Quality is the same as LR 100.
    SUGGESTION:
    1) Close LR and rename your LR Preferences file by adding the extension .OLD to it:
    Mac OS X
    Preferences
    /Users/[user name]/Library/Preferences/com.adobe.Lightroom5.plist.OLD
    Windows 7 & 8
    Preferences
    C:\Users\[user name]\AppData\Roaming\Adobe\Lightroom\Preferences\Lightroom 5 Preferences.agprefs.OLD
    Reopen LR and it will create a new Preferences file. Try the JPEG Export again using the same settings as I have posted.
    2) If still no change I suggest uninstalling LR, delete the new LR Preferences file created in step #1 above, keep the .OLD Preferences file, and reinstall LR 5.71.
3) If all is well now, close LR and try restoring your original Preferences file by renaming the new Preferences file something like .OLD.OLD and removing .OLD from the original file.

  • Slow Snow Leopard - esp. Parallels or VMWare - 140-500gb VM Page File Size!

    I know that the VM page file size is only the hypothetical maximum VM size that could be used, but immediately after startup on my recently installed (not upgraded) Snow Leopard my VM Size is 140gb. It is not running anything at that point, other than Vuze which starts up at login.
    Before restart, when I was trying to install Windows XP under VMWare, it was over 500gb!
    My MBP 2.4 unibody has been feeling sluggish generally with plenty of beachballs, grinding to a halt under VMWare or Parallels in particular, and I wonder if the VM Size is an indicator of some problem? Examples of Parallels and VMWare problems were that it would take hours to install Windows XP or Windows 7, and eventually just ground to a halt with the install almost finished.
    I have used Disk Utility to check the disk and repair permissions, but maybe bigger guns are needed? I took a quick look at my Xbench results, and the hard drive results were about 40% of the baseline... I have a feeling the hard drive might be churning away on something, slowing everything else down.
    For example, surely it shouldn't take a few seconds to actually show all the applications in the applications folder when you open the folder (when no other programs are running)...
    B
    Message was edited by: Bingggo

    Sorry, 2gb RAM. When I restarted, the VM Page File Size was 140gb, the free memory was over 1gb, and there were no page in/outs from memory.

  • Essbase pag file size had a sudden increase

We have a BSO cube that used to have 1.8 gigs of exported data and about 2.5 gigs of pag files. However, they both suddenly increased to 5 gigs each (in just one day). I checked our logs: there were only a few data uploads/changes, and the outline member additions we had that day were all dynamic calc members. We do an export - clear - reload of the data every night, but the exported files and page files are still at 5 gigs.
    Does anyone know what could have caused this issue?
    Thank you.

    Same type of thing happened here. Overnight, 3 hour agg calc now taking upwards of 8 hours. They had implemented hierarchy additions, nothing drastic.
    By chance, were any of these Dynamic Calc members created in sparse dimensions? Do they roll up to stored members? It's bad ju-ju to have that situation.
    This was the biggest design offender I found, and am redesigning cube for them right now.
I have taken the 8-10 hour agg (and 41 pag files) down to 36 minutes (and 15 pag files) simply by changing the dense and sparse settings of their cube. One dimension had dynamic calc members rolling up to stored members; I made that one dense and made all its upper levels dynamic. (Of course I had to change some of their other dense dimensions to sparse to accommodate this.) But the end results on time and page files tell me that having dynamic calc members roll up to stored members is really bad.
    Robert
    Edited by: RobertR3 on Apr 18, 2011 10:02 AM

  • Illustrator file size reduction

    Illustrator 11.0
    Win XP Pro SP2
    Adobe Acrobat 6.0
Why are .ai files so much larger than the PDFs they generate? For example, I cut and pasted an Excel spreadsheet into Illustrator. No bitmaps or rasterized data.
    ai file is 2.5Mb (acrobat compatible, compression on)
    ai file is 7 Mb (compression off) yikes!
    saved as pdf 2.2Mb
    printed to adobe acrobat 6.0 24kb (maintains vector output with great zoom)
    these are all vector files and there is a 100 to 1 size ratio between them.
If I take the small PDF, use the edit file command in Acrobat, and then re-save it as an .ai file, it is still over 2Mb.
    I have a long framemaker document that will have 60 some line drawings and text and I need the ai files to be smaller due to memory considerations.
What can I do to shrink these .ai files? I don't want any bitmaps or rasterized illustrations in my end-product PDF file.

    Interesting results here. All tests conducted with AI 12 (don't have AI 11). All .ai files were saved with compression and no profile embedded. All PDFs were created via 'save as' or Distiller (print to PDF) using Acrobat 7 compatibility and Text and Line Art preset.
    Downloaded [Test1 copy from xcell.ai], opened it in AI 12, stripped unused swatches, styles, brushes, etc. then saved as .ai with no PDF compatibility. Resulting file: 456 KB
Same as above but saved with PDF compatibility. Resulting .ai file: 417 KB (!)
Same as above but saved as v11 .ai with, then without, PDF compatibility. Resulting .ai files: 1,011 KB regardless of PDF compatibility (!)
    As above but saved as v10 .ai without PDF compatibility. Resulting file: 412 KB
    Again v10 but with PDF compatibility: 374 KB (!)
    Saved as PDF with .ai editing preserved: 999 KB
    Saved as PDF without .ai editing preserved: 133 KB
    Distilled PDF (printed to Adobe PDF): 56 KB
    Downloaded Excel file and printed to PDF from Excel. PDF file: 53 KB
    That PDF file then opened in AI 12 and saved as v12 .ai with no PDF compatibility: 282 KB

  • Delete Pages, file size, PDF Optimizer Clean Up

    Using Acrobat 9 Pro (for Windows, if that makes any difference).
    I have a 148 page PDF of a magazine, about 58 MB in size.
    I Deleted 40 pages of ads, leaving just the useful content.
    The file size is still about 58 MB.
    When I run Advanced -> PDF Optimizer,
    the file size goes down to about 47 MB.
    The Optimizer settings don't seem to matter.
    Even with just "Clean Up" checked, all its options UNchecked,
    and Compression set to "Leave Compression Unchanged"
    the file size is still reduced.
    I have tried several other PDF editors and tools,
    and none of them reduce the file size after deleting the 40 pages,
    even using their "optimize" or "compress" functions.
    Can someone explain why deleting pages does not reduce the file size,
    and what the Optimizer is doing (that the other tools don't do)
    to reduce the file size ?
    Thanks,
    -- Glenn

This may be helpful:
http://forums.adobe.com/message/4568414#4568414
In Acrobat 9 Pro on Windows, I found the best was:
Document -> Optimize Scanned PDF (defaults)... though you could play with the slider bar for even smaller.

  • How do I reduce a Pages file size so I can create a pdf for e-mailing?

I recently upgraded from Tiger to OS X Leopard. In Tiger, I was able to reduce file size and create a 700 to 800kb file by going to Print > ColorSync > Quartz Filter > Reduce File Size > Save as PDF. But in Leopard those options can't be accessed from my Pages print menu. Any suggestions? Thanks, TH

    Hi t.h.leeds,
    Welcome to Pages discussions.
Another way to reduce the PDF file size is: Print > PDF > Compress PDF. That's what I do when I create a PDF for e-mailing, and it works well. 99.99% of folks don't print PDFs -- they either file them in a folder or delete them, and viewed on a monitor the compressed PDF appears as it should.
A side note: by using this discussion's Search feature, in the box that reads "Search Discussions", you could have found the answer to your question. Doing that is faster than waiting for an end user such as myself to respond.
These Discussions are users helping users, not Apple employees answering questions. Questions get answered when a user who knows the answer finds the time and desire to respond.
    Again, welcome to Pages Discussions, have fun here.
    Sincerely,
    RicD

  • How to achieve small file size after editing compressed video?

I received a video in .wmv format with a file size of 6 MB. In Premiere its info is
    320 x 240 (1.0)
    00;03;17;01, 30.00 fps
    44100 Hz - Stereo
    I want to remove a few parts from the video with simple cuts and make video file with an appropriate file size and format for playing on a web page.
    I made a simple test by importing and placing the video in a sequence and without any editing exported the sequence to FLV|F4V format with the preset: F4V - Same as source(Flash 9.0r115 and higher). This created a new video with the same quality as the original but the file size went to about 40 MB which I think will make it much slower than the original 6 MB file when played back on a web page.
I tried things like reducing the frame rate to 10 fps and lowering the sound quality. This didn't noticeably affect the quality, but the file size was reduced by only about 5 MB, to a total of 35 MB, which is still very big for a web video.
I'm aware that editing highly compressed video can't produce the same quality at the same size, but I will greatly appreciate any tips for lowering the file size while retaining some usable quality in a case like this.
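Web video sizing comes down to bitrate × duration, so it helps to work backwards from the file size you want. A rough sketch of the arithmetic (plain Python, not a Premiere API):

```python
def target_bitrate_kbps(size_mb, duration_s):
    """Total (video + audio) bitrate needed to hit a target file size."""
    return size_mb * 8 * 1024 / duration_s

# The clip is 3:17 (197 s). To land near the original 6 MB:
print(round(target_bitrate_kbps(6, 197)))   # ~250 kbps total
# The 40 MB export implies the encoder used roughly:
print(round(target_bitrate_kbps(40, 197)))  # ~1663 kbps
```

So to get anywhere near the original size, the F4V export's bitrate has to come down by a factor of six or so; frame rate and audio tweaks alone won't get there.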

    Since wmv is Microsoft's format, you might want to use their free editor
    Added Link
    http://www.microsoft.com/windows/windowsmedia/forpros/encoder/default.mspx

  • Reducing pages file size

    Hi,
I've googled around on the question of Pages having large file sizes, but the discussion centres around documents that have images. I'm wondering if anyone knows why an identical text-only document saved in Pages is so much larger than, say, one saved in MS Word or NeoOffice.
    e.g. on an identical 2 page document :-
    1. Just save as Pages document = 225kb
2. Untick the "Include preview page" box in the save menu & it is reduced to 131kb
    3. Save copy as Word document & it is the more familiar 41kb size.
    Are there any other settings to reduce the size of a simple text document down to something comparable to Word. I realise I'm talking about small size examples, but if I start saving 100's of files that are 20 or 30 pages in size, it all starts to add up.
    thanks,
    Terry

Another explanation may be the way Pages stores characters.
Most of the characters available in the old ASCII set are stored as-is, using a single byte.
Some of them, "&" for instance, are stored as a descriptive string requiring several bytes:
& is stored as &amp;
Other characters require six to eight bytes:
é is stored as &#xE9;
œ is stored as &#x153;
ᴂ is stored as &#x1D02;
    Yvan KOENIG (VALLAURIS, France) dimanche 21 août 2011 12:35:22
    iMac 21”5, i7, 2.8 GHz, 4 Gbytes, 1 Tbytes, mac OS X 10.6.8 and 10.7.0
    My iDisk is : <http://public.me.com/koenigyvan>
    Please : Search for questions similar to your own before submitting them to the community
    To be the AW6 successor, iWork MUST integrate a TRUE DB, not a list organizer !
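Yvan's point is that the Pages format stores text as XML, where reserved and non-ASCII characters expand into entity references. Python's standard library shows the same effect (it emits decimal references such as &#233; where Pages uses hex &#xE9; -- same mechanism, same size class):

```python
from xml.sax.saxutils import escape

# Reserved XML characters become multi-byte entities:
print(escape("&"))                                 # &amp;
# Non-ASCII characters become numeric character references:
print("é".encode("ascii", "xmlcharrefreplace"))    # b'&#233;'
print("œ".encode("ascii", "xmlcharrefreplace"))    # b'&#339;'
```

One accented character ballooning to six or more bytes, across a whole document, goes some way toward explaining the Pages-versus-Word size gap for text-only files.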
