InfoCubes & liveCache

Hi there,
Could you please explain the significance of InfoCubes and liveCache?
My understanding, after going through the material on help.sap.com, is that we store planning data in both ......
Regards
Kaushik

Hi Kaushik,
INFOCUBE
Please read the help for this: http://help.sap.com/saphelp_nw70/helpdata/en/de/113037530b7774e10000009b38f839/frameset.htm
An InfoCube is an InfoProvider that contains data. Technically, an InfoCube describes (from an analysis point of view) a self-contained dataset, for example for a business-oriented area, and you analyze this dataset in a BEx query.
An InfoCube is a set of relational tables arranged according to the star schema: a large fact table in the middle surrounded by several dimension tables. An InfoCube stores data from R/3 or from other sources and is used for reporting.
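To make the star schema concrete: for a hypothetical custom cube ZSALES (the cube name and the derived table names below are assumptions for illustration, not objects from this thread), the database layout looks roughly like this:
* Hypothetical custom InfoCube ZSALES - all names below are assumptions:
*   /BIC/FZSALES   fact table (key figures plus one DIMID per dimension)
*   /BIC/DZSALES1  a characteristic dimension (DIMID -> SIDs of e.g. 0CUSTOMER)
*   /BIC/DZSALESP  package dimension, /BIC/DZSALEST time dimension
* Counting the rows of the (uncompressed) fact table:
DATA lv_rows TYPE i.
SELECT COUNT(*) INTO lv_rows FROM /bic/fzsales.
WRITE: / 'Fact table rows:', lv_rows.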
Livecache:
http://help.sap.com/saphelp_nw04s/helpdata/en/42/db70ae3f382cede10000000a1553f7/frameset.htm
http://help.sap.com/saphelp_nw04/helpdata/en/f2/0271f49770f0498d32844fc0283645/frameset.htm
Hope this helps a bit
Reward if useful
Cheers
Kripa Rangachari.

Similar Messages

  • Multiple users uploading data to Live cache

    Sorry posted in wrong thread. Please find this thread in Master data and general.
    Closing this one
    Hello,
    Does APO have the capability to handle flat file uploads to live cache by 5 different users simultaneously, or does it throw a message asking the other users to wait until one upload is finished?
    We have a situation where users upload flat files, and based on these files the CVC creation process kicks off (if there are any new combinations). But every time more than one user tries to upload a file, the system does not allow the other users to upload.
    Any help or input is highly appreciated.
    Thank you,
    Sai
    Edited by: st.sapscm on Jan 11, 2012 12:01 PM

    Thanks a lot for the prompt reply.
    Let me be more specific. The users upload sales data to APO (an APO InfoCube) in work week versions. So the planners usually don't upload the same CVCs, but sometimes they do upload all at once, say once every week or month.
    So when you say "object", the object in my case is the APO BW InfoCube. The second part of your response supports the idea that it might not be possible to do a simultaneous upload; correct me if I am wrong. I am a little confused by the very first statement ("Yes it is possible...") and the second part of your response about the locking.
    Can you please explain what would happen in each case when the recipient is 1. an InfoCube 2. a planning area?
    Thanks a lot.
    Sai
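    One way to let parallel uploads fail gracefully instead of blocking each other is to serialize them yourself before the CVC creation is triggered. A rough sketch only, assuming a hypothetical custom lock object EZ_DP_UPLOAD with a VERSION lock field (the lock object and its generated function modules ENQUEUE_EZ_DP_UPLOAD / DEQUEUE_EZ_DP_UPLOAD would first have to be created in SE11; they do not exist in the standard system):
    DATA lv_version TYPE c LENGTH 10 VALUE 'WW_01'.  " hypothetical version key
    * Try to take an exclusive lock for the planning version being loaded
    CALL FUNCTION 'ENQUEUE_EZ_DP_UPLOAD'
      EXPORTING
        version        = lv_version
      EXCEPTIONS
        foreign_lock   = 1
        system_failure = 2
        OTHERS         = 3.
    IF sy-subrc <> 0.
      MESSAGE 'Another upload is already running - please try again later' TYPE 'I'.
      RETURN.
    ENDIF.
    * ... upload the flat file and trigger CVC creation here ...
    CALL FUNCTION 'DEQUEUE_EZ_DP_UPLOAD'
      EXPORTING
        version = lv_version.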

  • Steps involved in Selective Deletion of Data from Info Cube

    Please search...this is much discussed in different threads out here
    Hi Experts
    I would be grateful if you could give me the detailed steps involved in selective data deletion from an InfoCube, as I need to delete data selectively from an InfoCube in a LIVE system, so there is no second chance for me.
    Please update in detail
    Thanks in advance
    Edited by: Moderator

    Not sure how you were searching, but did you come across any of these...
    https://forums.sdn.sap.com/search.jspa?threadID=&q=selective+deletion&objID=f131&dateRange=all&numResults=15
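    In the Data Warehousing Workbench the usual route is: manage the InfoCube -> Contents tab -> Selective Deletion, enter the characteristic restriction, and schedule the deletion as a job. If you need to script it, function module RSDRD_SEL_DELETION is commonly used for this; the sketch below is from memory and heavily simplified, so please verify the exact interface and structure names in SE37/SE11 on your release and test outside production first (ZCUBE01 and the 0CALDAY value are made-up examples):
    * Delete all records of the (hypothetical) cube ZCUBE01 for 0CALDAY = 01.01.2012
    DATA: l_thx_sel TYPE rsdrd_thx_sel,
          l_sx_sel  TYPE rsdrd_sx_sel,
          l_s_range TYPE rsdrd_s_range,
          l_t_msg   TYPE rs_t_msg.
    l_s_range-sign   = 'I'.
    l_s_range-option = 'EQ'.
    l_s_range-low    = '20120101'.
    l_s_range-keyfl  = 'X'.
    APPEND l_s_range TO l_sx_sel-t_range.
    l_sx_sel-iobjnm = '0CALDAY'.
    INSERT l_sx_sel INTO TABLE l_thx_sel.
    CALL FUNCTION 'RSDRD_SEL_DELETION'
      EXPORTING
        i_datatarget      = 'ZCUBE01'
        i_thx_sel         = l_thx_sel
        i_authority_check = 'X'
      CHANGING
        c_t_msg           = l_t_msg.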

  • Is it possible to use the same data source for two info cubes

    Hi,
    My problem is that in BW we cannot have the value of material at storage location level. In R/3 also, the value is maintained at plant level.
    Then we searched and found a how-to document for a summarized display of stock values at storage location level.
    The problem is that we went live last December and we are using "0AFMM_C02", which contains around 18,126,000 records, and according to the note we have to use
    "0IC_C03".
    Both cubes use the same data sources. So, how do we get the data into "0IC_C03",
    and how do we delete the data of the existing InfoCube? And is it possible to delete data selectively from the InfoCube?
    Pls. help.
    Regards,
    viren.

    Hi,
    You can't create update rules from the PSA. You can create them from the InfoSource, from an ODS, from cube to cube, or from ODS to ODS.
    In your scenario, what you can do is create update rules from the ODS to the new cube and then transfer the data from there. Or, from the InfoSource, create update rules to the new data target, upload the full data, and then set up the delta.
    The third option is to create update rules from the existing cube to the new cube and load all the data once. Then you can deactivate the update rules, as they were needed only for a one-time data transfer.
    Cheers,
    Kedar

  • Data movement from R/3 DB to APO DB and to Live cache

    Dear APO Experts,
    I have a few questions on liveCache and how it works. I understand that liveCache is a memory-resident database. Below are my questions.
    1) What sort of data moves from the R/3 DB to the APO DB and then to the liveCache DB?
    2) When data moves from the APO DB to liveCache, for how long does that data stay in liveCache?
    3) How does data get pulled into liveCache from the APO DB?
    4) Why do we require liveCache logs if data never gets committed to hard disk?
    5) Do we ever add a data file to the liveCache DB?
    Any info that you provide on this will be really helpful to me.
    Thanks,
    Chetan

    Hello Chetan,
    As you know, this is the MaxDB/liveCache forum.
    What is the version of your system?
    The documentation is available at:
       SAP liveCache technology
    < Please review the document "What is SAP liveCache technology?" >
    SAP documents at service.sap.com/scm -> Technology:
    "liveCache overview" and "Integration overview"
    For SAP liveCache documentation, also see SAP Note 767598.
    Go to the SAP link Best Practices for Solution Management: mySAP SCM at SAP liveCache technology ...
    and review the document "Manage APO Core Interface in SAP APO (3.x) / mySAP SCM (4.x/5.0)".
    1. As you saw in the reference documents, the data is transferred from the connected R/3 system to APO.
    Transactional data can be uploaded to the liveCache if you download it to the APO database cluster tables first. These procedure steps run during a system upgrade (for example, from SCM 5.0 to SCM 7.0) - report /SAPAPO/OM_LC_UPGRADE_70 steps. They are also used when you want to migrate the liveCache to another operating system or convert your system to Unicode, and therefore want to back up the liveCache data first so that you can reload it into the liveCache afterwards - see SAP Note No. 632357.
    2. Could you elaborate more on this question? Could you give examples?
    3. "And how does data get pulled into liveCache from the APO DB?"
    The data is not pulled from the APO DB into liveCache.
    When you change APO data in the system, LCA procedures are called from ABAP. The objects stored in the class containers in liveCache can be accessed and manipulated only via LCA routines. The registration of the LCA routines is done automatically when the liveCache is started via LC10. The LCA procedures in the liveCache are written in C++
    and shipped to customers as binary LCA libraries (LCA build) together with the liveCache.
    < See more details in SAP Note No. 824489, or 1278897 as of SCM 7.0 >
    4. See the SAP notes:
                   869267     FAQ: SAP MaxDB LOG area
                  1377148     FAQ: SAP MaxDB backup/recovery
    5. You could run the Quick Sizer and estimate how much data you are planning to have in liveCache.
    The amount of data can also grow through the creation of new data in APO by users, or for other reasons ...
    In general you add a data volume to solve or prevent a DB_FULL issue;
    see "17. What do I do if the data area is full?" in SAP Note 846890.
    Thank you and best regards, Natalia Khlopina

  • BW reporting from APO live cache

    It has been 4 years since I worked on a BW/APO project. Back then, I know that in APO you could back up live cache and then use the backup as the data source for reporting in BW. Is this still the same, or can you go directly against live cache now?
    Thanks,
    Dean

    Hi Somnath,
    Can you please give me the reference where running reports directly from the internal APO BW is recommended by SAP?
    We have a similar situation with our client. It is possible to do reporting from APO cubes directly, but we are not sure, since I found this in one of the SAP presentations:
    SAP recommends that customers prepare separate systems for APO and BW.
    Reasons:
    The APO system will be tuned for optimal APO functionality performance, such as calculating forecasts in demand planning or deriving ATP values.
    The BW system will be tuned for optimal query performance in order to return results in a timely fashion.

  • ( very urgent ) read data from bw to live cache

    hi,
    I am very new to APO liveCache. I need to read data from BW into liveCache for forecasting. I know the transactions LC10 and LC11, but the problem is I need to know where I have to supply the BW cubes or BW information to liveCache.
    Would appreciate your help.
    RGS,
    SUGUNA.S

    Try transaction /SAPAPO/TSCUBE or program /SAPAPO/RTSINPUT_CUBE (they load data from an InfoCube into the planning area).
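    For a recurring load you can also run it in the background. A minimal sketch, assuming a variant ZDP_LOAD has already been saved for the program (the variant name is made up for illustration):
    * Run the InfoCube-to-planning-area load with a pre-defined variant
    SUBMIT /sapapo/rtsinput_cube
      USING SELECTION-SET 'ZDP_LOAD'
      AND RETURN.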
    Regards
    Krishna Chaitanya.P

  • What is Live Cache in APO

    Hi Gurus,
    I am working on an APO datasource in SNP.
    How does the live cache get filled up? I have built a cube on top of a custom datasource (9AZSNP); everything works fine, and I want to know what is behind the live cache - is it directly a user-entry planning version template, or are there underlying tables?
    Please advise.
    Thanks. Gowda

    Hi,
    The datasource will use only the planning version template, and data will transfer from the planning area to BW. Regarding liveCache, you should talk to the APO guys; it stores data and it is dynamic.
    Check
    Feeding Cube from Live Cache
    Re: Livecache for SCM/APO
    Re: what is the use of Livecache,why can't we go with relation databases...
    Thanks
    Reddy
    Edited by: Surendra Reddy on Mar 7, 2009 5:39 AM

  • Approach when the used Live cache data area crosses the threshold

    Hi,
    Could any of you please let me know the detailed approach to follow when the used live cache data area crosses the threshold in an APO system?
    The approach I have as of now is:
    1) When it is identified that data cache usage is nearly 100%, check the hit rate for OMS data in the data cache in LC10, because generally the hit rate for OMS data in the data cache should be at least 99.8% and data cache usage should be well below 100%.
    2) To monitor unsuccessful accesses to the data cache, choose Refresh and compare the value now and before; unsuccessful accesses result in physical disk I/O and should generally be avoided.
    3) The number of OMS data pages (OMS Data) should be much higher than the number of OMS history pages (History/Undo); a ratio of 4:1 is desirable. If OMS history is nearly the same size as OMS data, use Problem Analysis -> Performance -> OMS Versions to find out whether named consistent views (versions) have been open for a long time. The maximum age should be 8 hours.
    4) If consumption of OMS heap and data cache is large, one reason may be a long-running transactional simulation that accumulates heap memory and prevents the garbage collector from releasing old object images.
    5) To display existing transactional simulations in LC10, use Problem Analysis -> Performance -> OMS Versions and SM04 to find the user of the corresponding transaction; it may be necessary to cancel the session, after contacting the user, if the version has been open for a long time.
    Please help me by providing additional information on the issue.
    Thanks,
    Varada Reddy.

    Hi Mayank, sorry, one basic question - are you using some selection criteria during extraction? If yes, then try extraction without the selection criteria.
    If you maintain a selection based on, let's say, material, you need to use the right number of leading zeros as a prefix (based on how you have defined the characteristic for material); otherwise no records would be selected.
    Is this relevant in your case?
    One more option is to try to repair the datasource. In the planning area, go to the extraction tools, select the datasource, and then choose the option to repair the datasource.
    If you need more info, pls let me know.
    - Pawan
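    The leading zeros mentioned above come from the ALPHA conversion exit on the material characteristic. A small sketch of how to convert an external material number into the internal, zero-padded format before using it in a selection (the variable names are just illustrative):
    DATA: lv_matnr_ext TYPE c LENGTH 18 VALUE '4711',
          lv_matnr_int TYPE c LENGTH 18.
    * Pads the value to the internal format, e.g. '000000000000004711'
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
      EXPORTING
        input  = lv_matnr_ext
      IMPORTING
        output = lv_matnr_int.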

  • Transporting Info areas and info cubes with all the contents

    Hello Guru's,
    I want to transport InfoAreas and InfoCubes with all their contents from one system to another. How can this be achieved?
    Background: these belong to the local package ($TMP). I already tried assigning them to a package and transporting them to the target system, but only the definitions have been copied over; there is no master data for any characteristic (i.e. every characteristic is empty), there are no hierarchies for any characteristic, and there is no transaction data ===> effectively only the definitions are copied, but everything is empty.
    Regards, Jatin

    There is no way to transport contents along with the structure from one system to another.
    Different systems contain different data.
    The DEV system has only rough data, so developed objects are tested there with rough data for consistency of the objects.
    The quality system has some amount of original data. After transporting the developed objects from DEV to quality, we need to test the objects with some original data in quality.
    The production or live system has only live data. After testing is completed in DEV and quality (with rough data and some amount of original data), if the objects give the desired values as per the requirements, we transport them to the production system.
    Considering the above, there is no point in transporting data from one system to another.

  • How to make sure the live cache information is getting reflected in BEx?

    Hi All,
    When the user enters some data in the planning book and saves it, it should get reflected in the cube, shouldn't it? But when I run the query, it doesn't get reflected.
    What technical settings should I make to reflect those changes in APO version 3.5?
    Thanks
    Pooja

    Hi Pooja, Rafael,
    You can actually have a BEx report with live data.
    For that you need to use a remote cube, and not a basic one.
    One comment though: you access only the information stored in the live cache, so for example you will not get the projected stock easily.
    By default the projected stock is not stored; but even if you store it in a time series key figure, you still need to run the calculation... so any changes in orders will not be reflected until the next stock calculation...
    So getting the projected stock in real time is more complex and requires some ABAP coding... getting the orders in real time is quite easy to do with a remote cube.
    Good luck
    Julien

  • End routine to populate Info-cube.

    Hi ,
    Is it possible to load fields of an InfoCube using end routines in the following scenarios?
    1. Loading fields of an InfoCube by referencing/using a master data table in the end routine.
    2. Loading fields of an InfoCube by referencing/using DSO fields in the end routine.
    3. Loading fields of an InfoCube by referencing/using fields of another InfoCube in the end routine.
    Please advise.

    Hi Stalin,
    Before answering your question, you need to understand something about the end routine and the expert routine.
    End routine:
    - Result_fields and Result_package are available.
    - The end routine contains only those fields available in the data target.
    Start routine:
    - Source_fields and Source_package are available.
    - The start routine contains only those fields coming from the source.
    Expert routine:
    - Source_fields, Source_package, Result_fields and Result_package are available.
    So now, if you want to write look-up code against some other cube and the look-up condition needs source fields, the expert routine is the only option.
    For example:
    my data target contains fields x, y and z (they become result fields)
    the source contains field a (it becomes a source field)
    Now if I want to write look-up code like "select fields x, y and z from the other cube where my field a = the other cube's field a", I am accessing both source fields and result fields, so the only option is the expert routine.
    If you only need the result fields, then the end routine is enough. See the sketch below.
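    For scenario 1 (a master data look-up in the end routine), a minimal sketch could look like the following. All names are assumptions for illustration (0MATERIAL master data table /BI0/PMATERIAL, target field MATL_GROUP); the code goes into the end routine of the transformation, where only RESULT_PACKAGE is available:
    * End routine body - fill the material group from 0MATERIAL master data
    TYPES: BEGIN OF ty_lookup,
             material   TYPE /bi0/oimaterial,
             matl_group TYPE /bi0/oimatl_group,
           END OF ty_lookup.
    DATA: lt_lookup TYPE SORTED TABLE OF ty_lookup
                    WITH UNIQUE KEY material,
          ls_lookup TYPE ty_lookup.
    FIELD-SYMBOLS: <ls_result> TYPE _ty_s_tg_1.
    IF result_package IS NOT INITIAL.
    * Read the reference data once per data package
      SELECT material matl_group
        FROM /bi0/pmaterial
        INTO TABLE lt_lookup
        FOR ALL ENTRIES IN result_package
        WHERE material = result_package-material
          AND objvers  = 'A'.
    ENDIF.
    * Fill the target field from the look-up table
    LOOP AT result_package ASSIGNING <ls_result>.
      READ TABLE lt_lookup INTO ls_lookup
           WITH TABLE KEY material = <ls_result>-material.
      IF sy-subrc = 0.
        <ls_result>-matl_group = ls_lookup-matl_group.
      ENDIF.
    ENDLOOP.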
    Thanks,
    Gowd

  • Data in ODS, InfoCube and MultiProvider (LISTCUBE) are in sync.

    Hi,
    My query is built on a MultiProvider. The data flow is data source -> ODS, then ODS -> InfoCube, and the MultiProvider contains the InfoCube only.
    Data in the ODS, InfoCube and MultiProvider (checked with LISTCUBE) are in sync.
    However, the query results do not tie up with the ODS, InfoCube and MultiProvider (LISTCUBE).
    Can anyone let me know why this is happening and how I can resolve it?
    Regards,
    Sharma.

    Hi,
    Thanks for the help.
    I resolved the issue on my own.
    Regards,
    Sharma.

  • Info cube data doesn't match with R/3

    Hi,
    We are using "0AFMM_C02" Info cube for Inventory data.
    For some records it is not fetching data for the distribution channel.
    I checked the update routine for Distribution Channel.
     PROGRAM UPDATE_ROUTINE.
     *$$ begin of global - insert your declaration only below this line  -
     * TABLES: ...
     DATA: BEGIN OF it_data OCCURS 0,
             material LIKE /bic/cs2af_mm_inv_1-material,
             plant    LIKE /bic/cs2af_mm_inv_1-plant,
             val_type LIKE /bic/cs2af_mm_inv_1-val_type,
           END OF it_data.
     DATA: lt_mrpods TYPE TABLE OF /bic/ascm_d0500.
     INCLUDE rs_bct_retail_update_rules.
     *$$ end of global - insert your declaration only before this line   -
     FORM compute_key_field
       TABLES   MONITOR STRUCTURE RSMONITOR "user defined monitoring
       USING    COMM_STRUCTURE LIKE /BIC/CS2AF_MM_INV_1
                RECORD_NO LIKE SY-TABIX
                RECORD_ALL LIKE SY-TABIX
                SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
       CHANGING RESULT LIKE /BI0/V0AFMM_C02T-DISTR_CHAN
                RETURNCODE LIKE SY-SUBRC
                ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
     *$$ begin of routine - insert your code only below this line        -
     * fill the internal table "MONITOR", to make monitor entries
       DATA: DISTR_CHAN LIKE /BI0/MCUST_SALES-DISTR_CHAN.
     * Look up the distribution channel from the 0CUST_SALES master data;
     * if no entry is found, keep the value delivered by the source record
       SELECT SINGLE DISTR_CHAN INTO DISTR_CHAN FROM /BI0/MCUST_SALES
         WHERE CUST_SALES = COMM_STRUCTURE-CUST_SALES.
       IF SY-SUBRC EQ 0.
         RESULT = DISTR_CHAN.
       ELSE.
         RESULT = COMM_STRUCTURE-DISTR_CHAN.
       ENDIF.
     * result value of the routine
     * RESULT = .
     * if the returncode is not equal zero, the result will not be updated
       RETURNCODE = 0.
     * if abort is not equal zero, the update process will be canceled
       ABORT = 0.
     *$$ end of routine - insert your code only before this line         -
     ENDFORM.
     Can anybody tell me for which distribution channel this routine will get data?
     Please help; it's very urgent.
    Regards,
    Viren.

    Hi,
     I have a better suggestion: try writing this code in the start routine of the update rule.
     DATA:  S_DATA TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
            WITH HEADER LINE
            WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
     DATA:  ITEM_TABLE TYPE STANDARD TABLE OF /BI0/MCUST_SALES
            WITH HEADER LINE
            WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
     * start of modification
     * Read the 0CUST_SALES master data once, then derive the distribution
     * channel for every record of the data package
     SELECT * FROM /BI0/MCUST_SALES INTO TABLE ITEM_TABLE.
     LOOP AT DATA_PACKAGE INTO S_DATA.
       READ TABLE ITEM_TABLE WITH KEY CUST_SALES = S_DATA-CUST_SALES.
       IF SY-SUBRC = 0.
         MOVE ITEM_TABLE-DISTR_CHAN TO S_DATA-DISTR_CHAN.
       ENDIF.
     * Keep the record even when no master data entry exists
       APPEND S_DATA.
     ENDLOOP.
     DATA_PACKAGE[] = S_DATA[].
    Hope this works...
    Regards,
    San!
