Can anybody share their data load methodologies?

Is it possible (without giving away any proprietary information, of course) for anyone currently using OLAP in Oracle 11g to share how they maintain their data on a regular basis? Also, can you share your data volumes (i.e. millions of records in fact tables, number of dimensions, and rough total dimension members) if possible?
Do you use OLAP DML or do you use DBMS_CUBE?
Is there any documentation available anywhere other than the Oracle-provided User Guides?
Thanks for any help,
Thomas

Regarding "selected portions of the cube", you can find some documentation relating to this in the AWM help under "Maintenance Wizard: Data processing options" for 10g cubes and "Maintenance Wizard: Data refresh methods" for 11g cubes.
In 10g, this was possible by choosing the option "Aggregate the cube for only the incoming data values":
Select this option to speed processing when the new, incoming data does not invalidate the stored aggregates already present in the analytic workspace.
In 11g, this is possible using any of the following refresh methods: Fast, Force, Partition Change Tracking, and Fast Solve.
Refresh Method: The method used to refresh and aggregate the data. Cubes that have been enabled as cube materialized views use different methods than other cubes. The list contains only those methods applicable to a particular cube. Analytic Workspace Manager uses the selected method, but will revert to a different method if necessary to complete the build. Choose from among these methods:
* Complete: Clears all data from the cube, then loads and aggregates all the data from the source tables. All cubes can use this method.
* Fast: Uses the materialized view log tables to identify, load, and aggregate only the new and changed data from the source tables. Cubes defined as a fast refresh or a rewrite materialized view can use this method.
* Force: Uses the Fast method if possible; otherwise, the Complete method.
* Partition Change Tracking: Clears, loads, and aggregates only the values from an altered partition in the source tables.
* Fast Solve: Loads all the detail data from the source tables, then aggregates only the new values. Compressed cubes and cube materialized views can use this method.
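For reference, these AWM refresh methods map onto the `method` parameter of DBMS_CUBE.BUILD. Here is a small, illustrative Python helper that emits the corresponding anonymous PL/SQL block (the single-character method codes are from my reading of the 11g DBMS_CUBE documentation, so verify them against your release; the helper name itself is hypothetical):

```python
# Map AWM refresh-method names to the single-character codes accepted by
# the `method` parameter of DBMS_CUBE.BUILD (codes per my reading of the
# Oracle 11g OLAP docs; verify against your release).
METHOD_CODES = {
    "complete": "C",
    "fast": "F",
    "force": "?",
    "partition_change_tracking": "P",
    "fast_solve": "S",
}

def build_call(cube, method):
    """Return an anonymous PL/SQL block that refreshes `cube`
    with the chosen method. Execution is left to the caller."""
    code = METHOD_CODES[method]
    return (
        "BEGIN\n"
        f"  DBMS_CUBE.BUILD(script => '{cube}', method => '{code}');\n"
        "END;"
    )
```

You would pass the returned block to whatever Oracle client you use; the helper only shows which code each wizard option corresponds to.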

Similar Messages

  • How can i share downloaded data!!!

    hi everybody..
I want to write a program like a download accelerator, which uses 2 or more simultaneous downloads. How can I do this? If I use multithreading, how can I share bytes among threads, i.e., how do I assign which part of the file to which thread?
Every reply will be helpful.

Read up on the HTTP specification. It's possible to send a header that requests a specific range of bytes of the file to download. You could get the file size through an HTTP request (the Content-Length header), then assign byte ranges to specific threads and let each download its own part into a separate buffer. Let the threads write their data in sequence to a file (so some of them will have to wait until another finishes; this should not be a big deal when you assign byte ranges efficiently).
    http://www.w3.org/Protocols/rfc2616/rfc2616.html
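The approach in the reply above can be sketched as follows: a minimal, illustrative Python helper that carves a file of known size into per-thread byte ranges and builds the matching Range header (helper names are made up; the actual threaded download code is omitted):

```python
def byte_ranges(size, parts):
    """Split `size` bytes into `parts` contiguous (start, end) ranges,
    end-inclusive, as used by an HTTP Range header (bytes=start-end)."""
    base, rem = divmod(size, parts)
    ranges, start = [], 0
    for i in range(parts):
        # Spread the remainder over the first `rem` ranges.
        length = base + (1 if i < rem else 0)
        ranges.append((start, start + length - 1))
        start += length
    return ranges

def range_header(start, end):
    """Build the request header asking for one byte range."""
    return {"Range": f"bytes={start}-{end}"}
```

Each thread would issue its own GET with its `range_header` and write into its slice of the output file.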

  • Can I share the data on my phone?

    If I get an ipad and I have an iphone that has a data plan, can I just get the ipad with wifi and just turn on my personal hot spot in my phone to share the data or must I purchase the ipad with wifi and cellular?

    Eva_Yang wrote:
    If I get an ipad and I have an iphone that has a data plan, can I just get the ipad with wifi and just turn on my personal hot spot in my phone to share the data
Yes, if your cell carrier offers the hotspot feature for your iPhone.

  • One person buys Premium but can both share their s...

    Hi everyone, 
We are a group of 3 individual Skypers from around the world, working on a project, and have recently thought about paying for a Premium account to get Screen Sharing and Video. If only one person pays for it, does it mean all three can share their screens, or only the person with the Premium account?
    Thank you, 

    Each subscriber needs their own iTunes library on their own computer. They will not be able to share their music.
    jim

  • Can someone share their Cisco PIX config?

    Running a Cisco PIX 515 with a front-end/back-end Exchange server set up. Problem I am having is that the calendar will sync once and then never again. I delete the account, recreate it and sometimes it will show up but for the most part it will not.
    I have no issue with OWA, BlackBerrys or Windows Mobile devices. They all do everything perfect. Just the iPhone giving me grief. And that has only been really bad since the upgrade to 2.0.1!
I have checked everything except comparing my PIX config with someone else's. Can someone post their "timeout" lines from a PIX 5XX running 6.X.X?
    Thanks so much!

    Oh trust me, I am in the same exact boat as most people. One man's configuration might work for him, but doesn't work for me. It's really, really bizarre that there does not seem to be any consistency.
    Anyway, here's our timeout - good thing I checked!
    timeout conn 4:00:00 half-closed 0:10:00 udp 0:02:00 rpc 0:10:00 h225 1:00:00
    -Sam

  • Can anybody share with me software products that block **** etc. that are compatible with Apple computers? My boys have a MacBook

    Can anyone share with me which 3rd party software that blocks **** is compatible with Apple computers?

Each child should have his own account, and understand that others' accounts are not to be tampered with.
Parental Controls is already built into your Mac. All you have to do is configure it. This video explains the basics.
    Mac OS X: Parental Controls
Pørn may give your child an odd view of human interactions and be repugnant to you, but that is trivial compared to the real threat on the Internet: unsupervised chat. Pørn will not arrange to meet your child after school ...

  • DasyLab Timebases - Can anybody explain the various clocking methodologies ?

    Hi All.
I've been using DasyLab for at least 10 years and, as you'd expect, have become quite proficient.
    Mostly I use Measurement Computing USB pods or DataShuttles.
    But the exact homology of the Time Bases, their physical methods of generation, and particularly the optimum logical choices,
    still remains mysterious and obscure. I can always get a program working you understand, but selecting DasyLab clocking feels like playing Tetris.
    Any insight would be welcomed.  Does anybody see the 'Highest Governing Concept ' here ?
    Thanks and respect.
    DaveTheRave

    Generally speaking, you want to use the hardware based clocking, since the hardware usually has the highest resolution clock.
    When we introduced the Time Base concept, it was to address the new feature that allowed more than one DAQ device to be connected to the PC. Each device may have its own clocking, and some allow you to connect and synchronize multiple devices (Master/Slave clocking).
    One major driver, the IOtech driver, decided to stay within the original Driver API, which allows only one time base, but they made changes in the driver to handle master/slave and sample rate decisions. 
    Other drivers embraced the Time Base concept, and were developed using the "Extension DLL API", starting with the National Instruments driver, and picked up by most driver developers, including Measurement Computing, UEI, instruNet, and many others. 
    So, DASYLab now shows you timing sources from at least two devices -- 
    Driver Timebase is associated with the installed driver - Sound Card, Demo, or IOtech, typically. 
    DASYLab Timebase is a software timed source, usually used for slower (less than 100 samples/sec) timing
    As you add drivers, you will be exposed to that manufacturer's choice of timing:
    National Instruments NI-DAQmx - timing is set in NI Measurement & Automation (MAX)
    You have choices of 1 sample on demand, n samples, continuous
    Each task has its own timing, so one board may have a timebase for Analog Input, Analog Output, Digital Input, Digital Output, etc.
    Measurement Computing MCC-DRV 
    You have two timebases available for most devices, Hardware clocked or software clocked. 
    Some devices only allow hardware clocking for one subsystem - the other subsystems would then use the software clock
    Some devices only allow a single sample rate (USB-TC, USB-TEMP are 2 s/sec)
    Some devices use the settings in instaCal to determine allowed sample rates (USB-2416, USB-2408); individual channel choices may limit the overall  (aggregate) sample rate.
    Analog/Digital output timing can be configured "using the input timing" or with driver based timing - many devices only support software clocked ("slow") timing on outputs.
    UEI - time bases are not created until you create modules that require them
    InstruNet - time base is determined by a mix of the DASYLab time base and the individual channel settings
    If you open a worksheet and you see a time base that is in (parentheses) - the driver or hardware was not found, but the worksheet remembers it.
    Many modules are software data generators - the Generator, the Switch, the Slider, and they offer an option to select one of the available time bases. If you use a switch, for example, to control a relay, the switch timing should match the other inputs of the relay. The switch and the slider also offer a "with input" choice, which allows you to wire an input to them, to force the module timing to match the input timing. 
    Some Input/Output modules have to use software timing on the data - the RS232 Input, for example. It has to assume that the input samples are not equidistantly spaced, and tags the timing with "triggered". You can force it to use hardware timing in the Options, and fill the data block with old data until new data is received. 
    Confusing? Yep. 
    What's the right choice? 
    Hardware vs. software clocking - for sample rates above 100 samples/second, you will want hardware clocking. For slower rates, you may be forced to use software clocking. 
    DASYLab vs. Driver vs. Timebase A HW vs. Timebase B SW
    I would use the device's clock whenever feasible. It's more reliable and accurate than the PC clock.
    So, for Measurement Computing USB "pods" (I like that word!) - use the MCC-DRV timebases when ever possible. 
    Datashuttle... are you in the UK? The original Datashuttle is long gone, and only used the Driver timebase.
    The UK sold (sells?) a version of the IOtech PersonalDAQ 54/55/56 products as a Datashuttle.
Funny that you should mention Tetris... someone may have written a Tetris module for DASYLab as an exercise.
    - cj
    Measurement Computing (MCC) has free technical support. Visit www.mccdaq.com and click on the "Support" tab for all support options, including DASYLab.
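cj's rule of thumb above (hardware clocking above roughly 100 samples/second, software clocking otherwise) condenses to a trivial chooser. This is purely illustrative; the names are invented, nothing here is part of DASYLab:

```python
def pick_timebase(sample_rate_hz, hw_clock_available):
    """Apply the rule of thumb from the reply: above ~100 S/s you want
    the device's hardware clock; otherwise software timing will do."""
    if sample_rate_hz > 100 and hw_clock_available:
        return "hardware"
    return "software"
```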

  • How can I increase the data loading speed

    Hi Expert,
I need to populate a table across a dblink from another database each night. There are about 600,000 rows. It takes about two hours. These are the steps:
    delete from target_table;
    insert into target_table
    select * from source_table@dblink;
    commit;
    I can't use truncate in case the loading fails that I still have the old data.
How can I increase the loading speed?
    Thanks!

    DELETE and INSERT /*+ APPEND */ aren't a good combination, as the high water mark will keep going up.
    With a trivial number of rows like this I would not expect the delete or insert to be the problem. (How long does the delete take anyway?) It's more likely to be to do with the query over the db link. Is it just one table or is that a simplified example? Can you check the equivalent insert over on the remote database? If it's much faster I would investigate the network connection.

  • Can anyone explain me - data load failure

    hi guys..
I'm loading data using LO extraction (full update) into the target, and there are 10 lakh (1,000,000) records in R/3. During loading, about 90%, say 900,000 records, are transferred, and the load fails at this stage.
How will I push the remaining records into the target? Is there any way to do that, or do I simply have to delete the request and reload the entire content once again (I think this is not the best option)? Please explain in detail...
And please also tell me how to handle the same scenario for a delta upload.
    thanks in advance
    regards
    viru

    Hi,
Here I would suggest you break up the load into different sets. Though 10 lakh records is not much, it is better to split if you are doing a full load through the LO extractors.
    Selections can be done in the InfoPackages.
Even while loading data from the LO extractors in our client system we faced a similar problem, and when we broke the data up into selections, all the loads ran fine.
    Hope this helps,
    Pradip Parmar.
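The selection-based split Pradip suggests can be sketched generically: given a numeric key range, produce non-overlapping selection intervals, one per InfoPackage run (illustrative Python only; nothing here is SAP-specific):

```python
def selection_ranges(lo, hi, chunk):
    """Split the inclusive key range [lo, hi] into non-overlapping
    (start, end) intervals of at most `chunk` keys each."""
    ranges = []
    start = lo
    while start <= hi:
        end = min(start + chunk - 1, hi)
        ranges.append((start, end))
        start = end + 1
    return ranges
```

Each interval would become the selection criterion of one smaller full load, so a failure costs only one chunk rather than the entire request.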

  • I am thinking of using a Mac pro for share trading. Can anyone share their experience and suggest specs for 30 -40 monitors etc

    Hi
    I am thinking of using a Mac Pro for share trading.
    I am proposing to start small but want to be able to expand up to 30+ monitors.
    It must be super quick.
    Any help with hardware would be appreciated.
    Software is secondary at this point , but open to advice.
I have an iPhone and iPad, so I feel it will integrate better using Apple.
    regards
    David

    There is no desktop machine that I know of that will support 30+ monitors as is.
Fill the 3 or 4 PCIe slots of any logic board with three-port graphics cards and you have hit the physical cap without additional hardware.
    Software is secondary at this point
    No, without software, how can you control more than the monitor limit of the OS?
    How can you set up the monitors to display the various programs that you will wish to use simultaneously?
    Software is a major consideration.
    Matrox (and others) have solutions for additional monitors beyond the physical capabilities of generally offered graphics cards.
    The caveat here, however, is support software is typically reserved for Windows and Linux systems.
    Matrox has a single 8 display card:
    http://www.matrox.com/graphics/en/products/graphics_cards/m_series/m9188pciex16/
    Matrox has solutions specifically targeting the finance market:
    http://www.matrox.com/graphics/en/solutions/trading_analyst/
    Though the Mac Pro has more than adequate processing power, I think that you will find (through research) that building a solid machine capable of using several multi monitor cards is going to be what you will need to support more than 8-10 monitors with conventionally offered hardware.
    Fun stuff:
    http://www.digitaltigers.com/zenview.asp

  • Can anybody share the steps involved in Rebuildable in oracle eAM

    Hi,
I am unable to work out a flow for Rebuildables in Oracle eAM. Could anybody share the steps involved in Rebuildables and Preventive Maintenance?
It would help me a lot.
    Advanced thanks
    [email protected]

    What are the steps involved in configuring Oracle EBS R12 after installation is complete?
    We have an empty oracle EBS R12 installed and up and running. We do not have anything configured.
Please direct me to the documents that can be followed to do the initial steps for the application functionality to work.
All Oracle EBS Docs 11i/R12 can be found at:
    Oracle Applications Documentation
    http://www.oracle.com/technetwork/documentation/applications-167706.html
    Thanks,
    Hussein

  • Can anybody share printer connected to mac running 10.3.9?

I used to print wirelessly all the time using printer sharing from my iBook running 10.3.9 to my eMac running 10.3.9 with a Canon inkjet printer plugged into it. I just realized I can't print from my new MacBook to my eMac. The printer doesn't show up. Has anybody got this to work? I found a few threads with people having problems printer sharing with computers not running Leopard.

    This article may help:
    http://docs.info.apple.com/article.html?artnum=306984

  • Can someone share their IBM X31 xorg.conf with radeon driver

I have tried the one in the wiki and it does not start my X.
Also my sound is not working. If any of you X31 users have any tips, please share.
    Thanks

KingYes wrote: Hey,
I don't have OpenGL in this command.
Paste your glxinfo output via pastebin.com; dmesg may be useful too...
    dmesg
    glxinfo

  • How to use incremental data load in OWB? can CDC be used?

    hi,
I am using Oracle 10g Release 2 and OWB 10g Release 1.
I want to know how I can implement incremental data load in OWB.
Does OWB have such a feature built in, like Informatica?
Can I use the CDC concept for this? Is it viable and compatible with my environment?
What could be other possible ways?

    Hi ,
As such, the current version of OWB does not provide the functionality to directly use the CDC feature. You have to come up with your own strategy for incremental loading. For example, use the update dates if available on your source systems, or use CDC packages to pick up the changed data from your source systems.
    rgds
    mahesh
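The update-date strategy mahesh describes is a watermark pattern: remember the highest change timestamp you have loaded, and on the next run pull only rows above it. A minimal, generic sketch (it assumes each source row carries a comparable `updated_at` value; names are illustrative, not OWB API):

```python
def incremental_extract(rows, watermark):
    """Return the rows changed since `watermark` together with the
    new watermark to persist for the next load cycle."""
    changed = [r for r in rows if r["updated_at"] > watermark]
    new_mark = max((r["updated_at"] for r in changed), default=watermark)
    return changed, new_mark
```

In a real mapping the filter would be pushed into the source query (WHERE updated_at > :watermark) rather than applied in memory.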

  • How to tune data loading time in BSO using 14 rules files ?

    Hello there,
    I'm using Hyperion-Essbase-Admin-Services v11.1.1.2 and the BSO Option.
In a nightly process using MAXL I load new data into one Essbase cube.
In this nightly update process, 14 account members are updated by running 14 rules files one after another.
These rules files connect 14 times via SQL connection to the same Oracle database and the same table.
I use this procedure because I cannot load 2 or more data fields using one rules file.
It takes a long time to load 14 accounts one after the other.
    Now my Question: How can I minimise this data loading time ?
    This is what I found on Oracle Homepage:
    What's New
    Oracle Essbase V.11.1.1 Release Highlights
    Parallel SQL Data Loads- Supports up to 8 rules files via temporary load buffers.
    In an Older Thread John said:
    As it is version 11 why not use parallel sql loading, you can specify up to 8 load rules to load data in parallel.
    Example:
    import database AsoSamp.Sample data
    connect as TBC identified by 'password'
    using multiple rules_file 'rule1','rule2'
    to load_buffer_block starting with buffer_id 100
    on error write to "error.txt";
    But this is for ASO Option only.
Can I also use it in my MaxL for BSO? Is there a sample?
What else is possible to tune up the nightly update time?
    Thanks in advance for every tip,
    Zeljko

    Thanks a lot for your support. I’m just a little confused.
    I will use an example to illustrate my problem a bit more clearly.
    This is the basic table, in my case a view, which is queried by all 14 rules files:
    column1 --- column2 --- column3 --- column4 --- ... ---column n
    dim 1 --- dim 2 --- dim 3 --- data1 --- data2 --- data3 --- ... --- data 14
    Region -- ID --- Product --- sales --- cogs ---- discounts --- ... --- amount
    West --- D1 --- Coffee --- 11001 --- 1,322 --- 10789 --- ... --- 548
    West --- D2 --- Tea10 --- 12011 --- 1,325 --- 10548 --- ... --- 589
    West --- S1 --- Tea10 --- 14115 --- 1,699 --- 10145 --- ... --- 852
    West --- C3 --- Tea10 --- 21053 --- 1,588 --- 10998 --- ... --- 981
    East ---- S2 --- Coffee --- 15563 --- 1,458 --- 10991 --- ... --- 876
    East ---- D1 --- Tea10 --- 15894 --- 1,664 --- 11615 --- ... --- 156
    East ---- D3 --- Coffee --- 19689 --- 1,989 --- 15615 --- ... --- 986
    East ---- C1 --- Coffee --- 18897 --- 1,988 --- 11898 --- ... --- 256
    East ---- C3 --- Tea10 --- 11699 --- 1,328 --- 12156 --- ... --- 9896
    Following 3 out of 14 (load-) rules files to load the data columns into the cube:
    Rules File1:
    dim 1 --- dim 2 --- dim 3 --- sales --- ignore --- ignore --- ... --- ignore
    Rules File2:
    dim 1 --- dim 2 --- dim 3 --- ignore --- cogs --- ignore --- ... --- ignore
    Rules File14:
    dim 1 --- dim 2 --- dim 3 --- ignore --- ignore --- ignore --- ... --- amount
    Is the upper table design what GlennS mentioned as a "Data" column concept which only allows a single numeric data value ?
In this case I can't tag two or more columns as “Data fields”. I can only tag one column as the “Data field”; the other data columns I have to tag as “ignore fields during data load”. Otherwise, when I validate the rules file, an error occurs: “only one field can contain the Data Field attribute”.
Or may I skip this error message and just try to tag all 14 fields as “Data fields” and load the data?
    Please advise.
Am I right that the other way is to reconstruct the table/view (and the rules files) as follows, to load all of the data in one pass:
    dim 0 --- dim 1 --- dim 2 --- dim 3 --- data
    Account --- Region -- ID --- Product --- data
    sales --- West --- D1 --- Coffee --- 11001
    sales --- West --- D2 --- Tea10 --- 12011
    sales --- West --- S1 --- Tea10 --- 14115
    sales --- West --- C3 --- Tea10 --- 21053
    sales --- East ---- S2 --- Coffee --- 15563
    sales --- East ---- D1 --- Tea10 --- 15894
    sales --- East ---- D3 --- Coffee --- 19689
    sales --- East ---- C1 --- Coffee --- 18897
    sales --- East ---- C3 --- Tea10 --- 11699
    cogs --- West --- D1 --- Coffee --- 1,322
    cogs --- West --- D2 --- Tea10 --- 1,325
    cogs --- West --- S1 --- Tea10 --- 1,699
    cogs --- West --- C3 --- Tea10 --- 1,588
    cogs --- East ---- S2 --- Coffee --- 1,458
    cogs --- East ---- D1 --- Tea10 --- 1,664
    cogs --- East ---- D3 --- Coffee --- 1,989
    cogs --- East ---- C1 --- Coffee --- 1,988
    cogs --- East ---- C3 --- Tea10 --- 1,328
    discounts --- West --- D1 --- Coffee --- 10789
    discounts --- West --- D2 --- Tea10 --- 10548
    discounts --- West --- S1 --- Tea10 --- 10145
    discounts --- West --- C3 --- Tea10 --- 10998
    discounts --- East ---- S2 --- Coffee --- 10991
    discounts --- East ---- D1 --- Tea10 --- 11615
    discounts --- East ---- D3 --- Coffee --- 15615
    discounts --- East ---- C1 --- Coffee --- 11898
    discounts --- East ---- C3 --- Tea10 --- 12156
    amount --- West --- D1 --- Coffee --- 548
    amount --- West --- D2 --- Tea10 --- 589
    amount --- West --- S1 --- Tea10 --- 852
    amount --- West --- C3 --- Tea10 --- 981
    amount --- East ---- S2 --- Coffee --- 876
    amount --- East ---- D1 --- Tea10 --- 156
    amount --- East ---- D3 --- Coffee --- 986
    amount --- East ---- C1 --- Coffee --- 256
    amount --- East ---- C3 --- Tea10 --- 9896
And the third way is to adjust the essbase.cfg parameters DLTHREADSPREPARE and DLTHREADSWRITE (and DLSINGLETHREADPERSTAGE).
    I just want to be sure that I understand your suggestions.
    Many thanks for awesome help,
    Zeljko
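The reshaping in Zeljko's second layout is a plain unpivot of the 14 measure columns into an Account dimension, so every wide row becomes 14 narrow rows. A generic sketch of that transformation (illustrative Python, not Essbase-specific; in practice you would do this in the source view's SQL):

```python
def unpivot(rows, dims, measures):
    """Turn each wide row (one column per measure) into one narrow row
    per measure, tagging the measure name in a new 'Account' column."""
    out = []
    for row in rows:
        for m in measures:
            rec = {"Account": m}
            rec.update({d: row[d] for d in dims})
            rec["data"] = row[m]
            out.append(rec)
    return out
```

With the data in this shape, a single rules file can map Account, Region, ID, and Product as dimension fields and the last column as the one allowed Data field.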
