Best Practices for Loading Large Sets of Data

Just a general question regarding an initial load with a large set of data.
Does it make sense to use a materialized view to help with load times for an initial load, or do I simply let the query run for as long as it takes?
Just looking for advice on the common approach here.
Thanks!

Hi GK,
What I have normally seen is:
1) Data is extracted from the APO Planning Area to an APO cube (for backup purposes), weekly or monthly, depending on how much data change you expect and how critical it is for the business. Backups are mostly monthly for DP.
2) Data is extracted from the APO Planning Area directly to a DSO in the BW staging layer, and then to BW cubes for reporting. For DP this runs monthly, for SNP daily.
You can also use option 1 that you mentioned below. In this case, the APO cube is the backup cube, while the BW cube is the one you could use for reporting, and this BW cube gets its data from the APO cube.
The benefit in this case is that data has to be extracted from the Planning Area only once, so the Planning Area is available to jobs/users for more time. However, the backup and the reporting extraction get mixed in this flow, so an issue in the flow could impact both the backup and the reporting. We have used this scenario recently and have yet to see the full impact.
Thanks - Pawan

Similar Messages

  • Best practices for loading apo planning book data to cube for reporting

    Hi,
    I would like to know whether there are any best practices for loading APO planning book data to a cube for reporting.
    I have seen 2 types of design:
    1) The planning book extractor data is loaded first to a cube within the APO BW system, and then transferred to the actual BW system. Reports are run from the actual BW system cube.
    2) The planning book extractor data is loaded directly to a cube within the actual BW system.
    We do these data loads during evening hours, once a day.
    Rgds
    Gk

    Hi GK,
    What I have normally seen is:
    1) Data is extracted from the APO Planning Area to an APO cube (for backup purposes), weekly or monthly, depending on how much data change you expect and how critical it is for the business. Backups are mostly monthly for DP.
    2) Data is extracted from the APO Planning Area directly to a DSO in the BW staging layer, and then to BW cubes for reporting. For DP this runs monthly, for SNP daily.
    You can also use option 1 that you mentioned below. In this case, the APO cube is the backup cube, while the BW cube is the one you could use for reporting, and this BW cube gets its data from the APO cube.
    The benefit in this case is that data has to be extracted from the Planning Area only once, so the Planning Area is available to jobs/users for more time. However, the backup and the reporting extraction get mixed in this flow, so an issue in the flow could impact both the backup and the reporting. We have used this scenario recently and have yet to see the full impact.
    Thanks - Pawan

  • Best practice for loading config params for web services in BEA

    Hello all.
    I have deployed a web service using a java class as back end.
    I want to read in config values (like init-params for servlets in web.xml). What
    is the best practice for doing this in the BEA framework? I am not sure how to use
    the web.xml file in the WAR file, since I do not know the name of the underlying
    servlet.
    Any useful pointers will be very much appreciated.
    Thank you.
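    Whatever the container, the usual alternative to servlet init-params is to externalize the settings into a small config file packaged alongside the service. A minimal sketch of that idea in Python (the file name `service.ini` and the keys are invented for illustration, not part of any BEA API):

```python
import configparser
from io import StringIO

# Hypothetical config file contents; in practice this text would live in
# a file (e.g. service.ini) deployed next to the web service.
CONFIG_TEXT = """
[service]
endpoint_timeout_seconds = 30
max_retries = 3
"""

def load_config(text):
    """Parse config text and return the [service] section as typed values."""
    parser = configparser.ConfigParser()
    parser.read_file(StringIO(text))
    section = parser["service"]
    return {
        "endpoint_timeout_seconds": section.getint("endpoint_timeout_seconds"),
        "max_retries": section.getint("max_retries"),
    }

config = load_config(CONFIG_TEXT)
print(config)
```

    The service code then reads the dict at startup instead of depending on the (unknown) generated servlet's web.xml entries.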

    It doesn't matter whether the service is invoked as part of your larger process or not; if it performs any business-critical operation, it should be secured.
    The idea of SOA / designing services is to have the services available so that they can be orchestrated as part of any other business process.
    Today you may have secured your parent services, and tomorrow you could come up with a new service which uses one of the existing lower-level services.
    If all the services are in one application server, you can make the configuration/development environment a lot easier by securing them using the Gateway.
    A typical problem with any gateway architecture is that the service is available without any security enforcement when accessed directly.
    You can enforce rules at your network layer to allow access to the app server only from the Gateway.
    When you have the liberty to use OWSM or any other WS-Security product, I would stay away from any extensions. Two things to consider:
    The next BPEL developer in your project may not be aware of the security extensions.
    Centralizing security enforcement keeps your development and security operations loosely coupled and addresses scalability.
    Thanks
    Ram

  • Best practices for loading swf's

    Greetings,
    Using CS5 AS2
    I'm creating a website in Flash (all the files will be in one directory/folder on SharePoint) and want to make sure that what seems to be working fine is actually best practice.
    I have an index.swf with many buttons which will take the user to landing pages/content/other swfs. On these different buttons I have the script...
    on (release) {loadMovieNum("name.swf", 0);}                I could also do just {loadMovie("name.swf", 0);} ??
    The movie transitions nicely to name.swf and on this page I have a button that returns the user to the index.swf...
    on (release) {loadMovieNum("index.swf", 0);}   Things move back to index.swf nicely and user can chose to go to another landing page.
    It looks like I'm on the right track, because nothing is going awry, but I want to check: am I following best practices for moving from one swf to another within a website?
    Thanks for help or confirmation!!

    Loading into _level0 (and you should use loadMovieNum, not loadMovie, when loading into a level) undermines some of the benefits of a Flash application: all the assets must load with each display change, and the user sees a flash instead of a seamless transition from one display to the next.

  • Best Practices for Loading Master Data via a Process Chain

    Currently, we load attributes, text, and hierarchies before loading the transactional data. We have one meta chain. Loading the master data takes more than 2 hours, and most of the master data loads are full loads. We've noticed that a lot of the master data, especially text, has not changed or has changed very little since we implemented 18 months ago. Is there a precedent or best practice to follow, such as removing these processes from the chain? If so, how often should they be run? We would really like to reduce the master data loading time. Is there any documentation that I can refer to? What are other organizations doing to reduce the time it takes to load master data?
    Thanks!
    Debby

    Hi Debby,
    I assume you're loading master data from a BI system? The forums here are related to SAP NetWeaver MDM, so maybe you should ask this question in a BI forum.
    Nevertheless, if your data doesn't change much, maybe you could use a delta mechanism for extraction. This would send only the changed records instead of all the unchanged ones every time. But this depends on your master data and, of course, on your extractors.
    Cheers
    Michael
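    The delta idea Michael describes boils down to filtering on a last-changed marker instead of re-sending everything. A rough sketch of the principle (the records and field names are invented; real BW delta queues track this inside the extractor):

```python
from datetime import datetime

# Invented sample master-data records with a last-changed timestamp,
# standing in for what a delta-capable extractor would track.
records = [
    {"id": "MAT001", "text": "Pump",  "changed_on": datetime(2024, 1, 5)},
    {"id": "MAT002", "text": "Valve", "changed_on": datetime(2024, 3, 20)},
    {"id": "MAT003", "text": "Motor", "changed_on": datetime(2024, 3, 25)},
]

def delta_extract(records, last_run):
    """Return only the records changed since the previous extraction run."""
    return [r for r in records if r["changed_on"] > last_run]

# A full load would move all 3 records; the delta moves only the 2 changed ones.
delta = delta_extract(records, last_run=datetime(2024, 3, 1))
print([r["id"] for r in delta])
```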

  • Best Practices for Loading Data in 0SD_C03

    Hi, Gurus, I want to know the best practice for reporting on sales, billing, and delivery information. I know the following DataSources exist:
    • Sales Order Item Data - 2LIS_11_VAITM
    • Billing Document Data: Items - 2LIS_13_VDITM
    • Billing Document Header Data - 2LIS_13_VDHDR
    • Sales-Shipping: Allocation Item Data - 2LIS_11_V_ITM
    • Delivery Header Data - 2LIS_12_VCHDR
    • Delivery Item Data - 2LIS_12_VCITM
    • Sales Order Header Data - 2LIS_11_VAHDR
    Do I have to load all these DataSources into InfoCube 0SD_C03, or do I have to create copies of 0SD_C03 to match each DataSource?

    Hi.
        If you just want statistics on the amounts or quantities in the sales process, I suggest you create 3 cubes and then use a MultiProvider to integrate them. For example:
        2LIS_11_VAITM -> ZSD_C01
        2LIS_12_VCITM -> ZSD_C02
        2LIS_13_VDITM -> ZSD_C03
    In this scenario, you can enhance 2LIS_12_VCITM and 2LIS_13_VDITM with sales order data, such as requested delivery date, and then create a MultiProvider such as ZSD_M01.
    Best Regards
    Martin Xie
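    Conceptually, the MultiProvider in Martin's layout is just a union view over the three item-level cubes. A toy sketch of that relationship (the sample records and field names are invented for illustration):

```python
# Invented sample records per cube, following the DataSource -> cube
# mapping above (order, delivery, and billing items).
zsd_c01 = [{"doc": "ORD1", "type": "order",    "qty": 10}]  # 2LIS_11_VAITM
zsd_c02 = [{"doc": "DLV1", "type": "delivery", "qty": 8}]   # 2LIS_12_VCITM
zsd_c03 = [{"doc": "BIL1", "type": "billing",  "qty": 8}]   # 2LIS_13_VDITM

def multiprovider(*cubes):
    """A MultiProvider exposes all underlying cubes as one union of rows."""
    rows = []
    for cube in cubes:
        rows.extend(cube)
    return rows

# ZSD_M01 reports across all three document types at once.
zsd_m01 = multiprovider(zsd_c01, zsd_c02, zsd_c03)
print(len(zsd_m01))
```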

  • Best practices for a large application

    I have an existing application that is written in character
    based Oracle Developer and has many forms and reports. A few years
    ago I converted an inquiry portion of the system to standard ASP,
    this portion has about 25 pages. Initially I want to rewrite the
    Inquiry portion in Flex (partly to teach myself flex), but
    eventually (soon) I will need to convert the remainder of the
    application, so I want a flexible and robust framework from the
    beginning.
    So far for fun I wrote a simple query and I like what I have
    done but I realized that trying to write the entire application in
    a single script with hundreds of states would be impossible. The
    application is a fairly traditional type app with a login script, a
    horizontal menu bar and panels that allow data entry, queries and
    reports. In Oracle and ASP each "panel" is a separate program and
    is loaded as needed. I see advantages in this approach, and would
    like to continue it (creating as much reusable stuff as possible).
    So where can I find documentation, and or examples on the
    best practice in laying out a framework for what will eventually be
    a sizeable app?
    BTW, what about the ever present problem of writing reports?
    Paul

    As a matter of fact, modules are exactly what you're looking
    for! :)
    Flex's doc team has a good intro document here:
    http://blogs.adobe.com/flexdoc/2007/01/modules_documentation_update.html
    And this gentleman has a quick example in his blog:
    http://blog.flexexamples.com/2007/08/06/building-a-simple-flex-module/
    The app I'm working on now is primarily based on modules, and
    they're a good way to keep things organized and keep loading times
    low. Documentation on them has been so-so, but it's getting better.
    There are little undocumented gotchas that you'll undoubtedly run
    into as you develop, but they've been covered here and in the
    flexcoders Yahoo group time and again, so you'll readily be able to
    find help.
    When creating a module, you can either place them in the
    project that they'll eventually be used with, or create a stand-alone
    project just for that module (the preferred and more
    organized method). Creating a stand-alone module project is easy:
    just change the "mx:Application" tag in your main mxml file to a
    "mx:Module" tag instead.
    If you're using Cairngorm as your app's framework, you'll
    have to tinker with things a bit to get it all working smoothly,
    but it's not that hard, and I believe the doc team is working on a
    definitive method for using modules in Cairngorm based apps.
    Hope this helps!

  • Best practice for load balancing on SA540

    Is there a 'best practice' guide for configuring outbound load balancing on the SA540?
    I've got 2 ADSL lines and would like the device to automatically manage outgoing traffic. Any ideas?
    Regards

    Hi,
    The SA500 today implements a flow-based round-robin load-balancing scheme.
    In the case of two WAN links (over ADSL), by default the traffic should be roughly equally distributed.
    So in general, users should have no need to configure anything further for load balancing.
    The SA500 also supports protocol binding (~PBR) over the WAN links. This mechanism offers more control over how traffic flows.
    For example, if one ADSL line offers higher throughput than the other, you can consider binding bandwidth-hungry apps to the WAN link connected to the faster ADSL line and the less bandwidth-hungry apps to the other one. The remaining traffic can continue to round-robin. This way you won't saturate the low-bandwidth link, and users get a better application experience.
    Regards,
    Richard
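    "Flow-based round robin" means each new flow is pinned to the next link in turn, while packets of an existing flow stay on their assigned link. A minimal sketch of that scheme (link names and flow IDs are invented; this is an illustration, not the SA500's implementation):

```python
import itertools

def make_flow_balancer(links):
    """Round-robin balancer: a new flow takes the next link in the cycle;
    subsequent packets of the same flow stick to that link."""
    cycle = itertools.cycle(links)
    flow_table = {}

    def route(flow_id):
        if flow_id not in flow_table:
            flow_table[flow_id] = next(cycle)  # assign new flows in turn
        return flow_table[flow_id]

    return route

route = make_flow_balancer(["ADSL1", "ADSL2"])
# flowA is assigned once and stays put; flowB and flowC rotate through links.
assignments = [route(f) for f in ["flowA", "flowB", "flowA", "flowC"]]
print(assignments)
```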

  • Best practice for BI-BO 4.0 data model

    Dear all,
    we are planning to upgrade BOXI 3.1 to BO 4.0 next year and would like to know if a best practice exists for the BI data model. We found some general BO 4.0 presentations, and it seems that enhancements and changes have been implemented: our goal is to better understand which BI data model best fits the BO 4.0 solution.
    Have you found documentation or links to BI-BO 4.0 best practices to share?
    Thanks in advance

    Have a look in this document:
    http://www.sdn.sap.com/irj/sdn/index?rid=/library/uuid/f06ab3a6-05e6-2c10-7e91-e62d6505e4ef#rating
    Regards
    Aban

  • Best Practice for Buy in Set and Dismantle for Sales

    Hi All SAP Masters,
    We have a scenario where an item is purchased as a "set" which contains a few components (something like a material BOM). For example, a machine which comes with several parts. However, when the user receives this set from the supplier, they may dismantle certain parts from the set/"machine" and sell them separately to the customer as single items.
    What is the best practice for this process in SAP?
    Please help. Thank you.
    Warmest Regards,
    Edwin

    If your client has the PP module, then follow these steps.
    Consider A the purchased material, to be dismantled into B and C:
    1) Create a BOM for material B,
        assign the header material A as a consumed component with +ve qty,
        and add component C as a by-product with -ve qty in the BOM.
    2) Maintain the backflush indicator for A & C in the material master MRP2 view.
    3) Create a routing for B and maintain auto GR for the final operation.
    4) Create a production order for B.
    5) Confirm the order in CO11n: A will be consumed with movement 261, C will be received with movement 531,
    and B will be received with movement 101.
    Once the stock is posted to unrestricted, you can sell B & C.
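    The net stock effect of the confirmation in step 5 can be sketched as three postings against a simple stock ledger (quantities are invented; the movement types are the ones named above):

```python
def post(stock, material, qty):
    """Apply a goods movement to the stock ledger (negative qty = issue)."""
    stock[material] = stock.get(material, 0) + qty
    return stock

stock = {"A": 1}  # one purchased set A in unrestricted stock

post(stock, "A", -1)  # movement 261: set A consumed by the production order
post(stock, "C", +1)  # movement 531: by-product C received against the order
post(stock, "B", +1)  # movement 101: finished component B received (auto GR)

print(stock)
```

    After confirmation the set A is gone and both B and C sit in stock, ready to be sold separately.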

  • Best Practice for Oil and Gas Field Data Capture

    Hi All
    What is the best practice for capturing volume data of oil and gas commodities in SAP? There is a solution [FDC|http://help.sap.com/miicont-fdc] that addresses the requirements; however, what parameters it asks for, what the process is, and how it calculates different variables isn't documented in the resource center.
    Appreciate any response.
    Regards
    Nayab

    Hi Zack,
    It will be easier for you to maintain the data in a single application. Every application must have the category dimension, so you can use this dimension to maintain both the actual and the plan data.
    Hope this helps.

  • Best Practice for carrying forward monthly forecast data??

    Dear all:
    I am in the process of building a demo for a monthly forecast. I have 12 forecast categories, one for each period of the fiscal year. However, I am wondering whether there is a best practice for carrying forward data from the current forecast (month) category into the next forecast (month) category, besides running a standard copy at every month end.
    For instance, let's say I am in June, and I would forecast in a category named "JunFcst" for the period 2010.06, 2010.07, 2010.08... 2010.12. For the example purpose let's say I enter a value of 100 for each time period.
    When next month comes, I would select "JulFcst", which is supposed to initially carry the numbers I have entered in JunFcst. In other words, 2010.07 ~ 2010.12 in JulFcst should show a value of 100. There is no need to carry forward data prior 2010.07 because these months have the actual already.
    I know I can easily carry forward data from the JunFcst to the JulFcst category by running a standard copy data package, or perhaps using some allocation logic. But if I choose this approach, I will have to run the copy or allocation package manually at every month end, or create/schedule 12 unique copy data packages, one for each month. Another approach is to use default logic to copy the data on the fly from JunFcst to JulFcst. I am less inclined to this because of the performance impact.
    Is there anyway to create just one script logic that is intelligent enough to copy from one forecast category to another based on the current month? Is there a system function that can grab the current month info within the script logic?
    Thanks a bunch!
    Brian
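    The "intelligent" part of the copy Brian asks about is just deriving the source and target category names from the current date; the month-end package then only needs that pair. A sketch of that derivation, using the JunFcst/JulFcst naming convention from the post (the function itself is invented, not BPC script logic):

```python
from datetime import date

MONTHS = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]

def forecast_categories(today):
    """Return (source, target) category names for the month-end copy,
    e.g. at the end of June the JunFcst data seeds JulFcst."""
    src = MONTHS[today.month - 1] + "Fcst"
    tgt = MONTHS[today.month % 12] + "Fcst"  # wraps Dec -> Jan
    return src, tgt

print(forecast_categories(date(2010, 6, 30)))
```

    In BPC this date-derived pair would typically come from a system time function inside the script logic, so a single package can be scheduled for every month end.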

    Which business rule (i.e. transformation, carry-forward) do you suggest? I looked into carry-forward but didn't think it would work, because it carries forward within the same category. Also, because we currently do not use carry-forward, we do not have a sub-dimension created...
    Thank you!

  • Best practice for loading from mysql into oracle?

    Hi!
    We're planning to migrate our software from MySQL to Oracle, so we need a migration path for moving each customer's data from MySQL to Oracle. The installation and the data migration/transfer have to run in various customers' environments, so migration approaches like installing the Oracle gateway and connecting to MySQL, for example via ODBC, are not an option, because they make the installation process more complicated... Also, the installation with a preconfigured Oracle database has to fit on a 4.6 GB DVD...
    I would prefer the following:
    - spool mysql table data into flat files
    - create oracle external tables on the flat files
    - load data with insert into from external tables
    Are there other "easy" ways of doing such migrations, or what do you think about the preferred approach above?
    Thanks
    Markus
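    The first step of Markus's plan, spooling table data into flat files that an Oracle external table can read, can be sketched like this (the sample rows are invented, and a real spool would iterate a MySQL cursor instead of a literal list):

```python
import csv
import io

# Invented sample rows standing in for the result of a MySQL SELECT;
# in a real migration these would come from a database cursor.
rows = [
    (1, "Alice", "alice@example.com"),
    (2, "Bob", "bob@example.com"),
]

def spool_to_flat_file(rows, out):
    """Write rows pipe-delimited, a shape an Oracle external table
    (ORGANIZATION EXTERNAL ... FIELDS TERMINATED BY '|') can read."""
    writer = csv.writer(out, delimiter="|", lineterminator="\n")
    for row in rows:
        writer.writerow(row)

buf = io.StringIO()
spool_to_flat_file(rows, buf)
print(buf.getvalue())
```

    With the flat files in place, the Oracle side is a CREATE TABLE ... ORGANIZATION EXTERNAL definition over each file, followed by INSERT INTO target SELECT * FROM external_table.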

    Hi!
    Hasn't anyone else had this requirement for migrations? I have tested with the MySQL SELECT ... INTO OUTFILE clause. It seems to work for simple data types - we're now testing with BLOBs...
    Markus

  • Best Practice for SCVMM2012 setup in 3 data centers

    We are currently setting up SCVMM for our 2012 Hyper-V clusters. We have 3 data centers, each with one two-node 2012 Hyper-V cluster. The data centers are separated by a 10MB up and 50MB down pipe. There are three campuses.
    We plan to use SQL 2012 to house the VMMManagerDB.
    Is there any advantage to replicate this database to the other sites?
    Should we point the SCVMM to a single VMMManagerDB?
    In the past, for redundancy, we have VMM (2008) installed at each of the three campuses. We were planning to do the same for SCVMM 2012.
    Should we replicate the DB to the other 2 sites and have SCVMM point to the local replica?
    Thanks in advance

    For best practices on SCVMM design, you can refer to the link below:
    http://blogs.technet.com/b/scvmm/archive/2012/07/24/now-available-infrastructure-planning-and-design-guide-for-system-center-2012-virtual-machine-manager.aspx
    I also recommend asking this question in the VMM forum.
    Mai Ali

  • Best Practice for very large itunes and photo library..using Os X Server

    OK, the setup...
    One iMac, one new MacBook Pro, one MacBook, all on Leopard. Wired and wireless, all AirPort Extremes and Expresses.
    I have purchased a Mac mini plus a FireWire 800 2TB RAID drive.
    I have a 190GB, ever-increasing music library (I rip one-to-one, no compression) and a 300GB photo library.
    So, the question: will it be easier to set up OS X Server on the mini and access my iTunes library via that?
    Is it easy to do so?
    I only rip via the iMac, so the library is connected to that and shared to the laptops... how does one go about making the iMac automatically connect to the music if I transfer all the music to the server?
    The photo bit can wait, depending on the answer about the music.
    Many thanks
    Adrian

    I have a much larger iTunes collection (500GB / 300k songs), a lot more photos, and several terabytes of movies. I share them out via a Linux server. We use Apple TV for music/video, and the bottleneck appears to be the Mac running iTunes in the middle. I have all of the laptops (MacBook Pros) set up with their own "instance" of iTunes that just references the files on the server. You can enable sharing in iTunes itself, but with a library this size, performance on things like loading cover art and browsing the library is not great. Please note also that I haven't tried 8.x, so there may be some performance enhancements that have improved things.
    There is a lag of a second or so when accessing music/video on the server. I suspect this is due to the speed of the Mac accessing the network shares, but it's not bad, and you never notice it once the music or video starts. Some of this on the video front may be down to the codec settings I used to encode the video.
    I suspect that as long as you are doing just music, this isn't going to be an issue for you with a mini. I also suspect that you don't need OS X Server at all. You can just do a file share in OS X, give each machine a local iTunes instance pointing back at the files on the server, and have a good setup.
