Loading data to an ODS or InfoCube

Hi Experts,
I have been reading the forum for a couple of days to prepare for real-time work. I have a few questions for which I did not find an exact answer, or perhaps did not fully understand the answers I found. The questions are:
1. What is the exact difference between a full update and an initialization without data transfer? I can see that data is loaded in both cases. I know that an initialization with data transfer locks the R/3 system so that no further postings can be made during the load, and that an initialization without data transfer is done to initialize the delta. But even when we initialize without data transfer, data is loaded.
How exactly does this work?
2. If our query runs for a long time over millions of records, that is a performance issue. How can I stop the query in the middle of its run without any side effects, and what steps should I follow to get the best results without using aggregates?
3. I know that when data is loading from R/3 to BW we can check the load in different transactions, such as SM37, BD87, SM66, SM21, SM58, SM50, SM51, SM22 for dumps, and SM30. But I want to know, starting from R/3, the sequence in which to check these transactions to follow my data flow and its status.
I know these are many questions, and I have also searched the forums about them, but it is still confusing to me.
I will award points for the best answer.
Please reply. Thanks in advance.

Hi Manasa,
What is the exact difference between a full update and an initialization without data transfer? Data seems to load in both cases, so how exactly does it work?
The difference between an init load and a full load: both fetch the complete data set, but a full load on its own does not enable deltas.
Full update
A full update requests all data that corresponds to the selection criteria you set in the scheduler.
Initializing the delta process
To request deltas, you first need to have initialized the delta process.
For test purposes, you can initialize the delta process without transferring data. On the Update Parameters tab page, select the corresponding indicator. When the initialization simulation runs, the system creates the necessary help entries in all required tables, but does not load data. Afterwards, the monitor displays one loaded record. The init simulation request appears in all data targets into which data would have been updated, but it only contains these help entries.
You can also run an init without data transfer after a full load to enable deltas.
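A quick way to verify whether a delta initialization actually exists for a DataSource is to look at the init status in the source system. The report below is only a minimal sketch of that idea: it assumes the standard table ROOSPRMSC (one entry per successful delta init, per DataSource and receiving BW system) and the field names OLTPSOURCE and RLOGSYS; verify these names in SE11 before relying on them.

REPORT z_check_delta_init.
* Hedged sketch: runs in the source (R/3) system and reports whether a
* delta initialization exists for a DataSource / BW target system pair.
* Table and field names (ROOSPRMSC, OLTPSOURCE, RLOGSYS) are assumptions.
PARAMETERS: p_ds    TYPE roosourcer,   " DataSource name
            p_rlsys TYPE logsys.       " receiving BW logical system

DATA lv_count TYPE i.

SELECT COUNT( * ) FROM roosprmsc
  INTO lv_count
  WHERE oltpsource = p_ds
    AND rlogsys    = p_rlsys.

IF lv_count > 0.
  WRITE: / 'Delta initialization found for DataSource', p_ds.
ELSE.
  WRITE: / 'No delta initialization found for DataSource', p_ds.
ENDIF.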
Query Performance
How can I increase query performance other than by creating aggregates?
How to improve query performance?
Query performance - benchmarking
/people/prakash.darji/blog/2006/01/27/query-creation-checklist
/people/prakash.darji/blog/2006/01/26/query-optimization
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3f66ba90-0201-0010-ac8d-b61d8fd9abe9
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/cccad390-0201-0010-5093-fd9ec8157802
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/ce7fb368-0601-0010-64ba-fadc985a1f94
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/c8c4d794-0501-0010-a693-918a17e663cc
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/064fed90-0201-0010-13ae-b16fa4dab695
How to improve Query Performance
Sequence of checks
/people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
Regards,
Vijay

Similar Messages

  • Init Data Upload from ODS to Info cube

    Hi,
    When I try to upload the data from the ODS to the InfoCube using the initial upload option in the InfoPackage, I am not able to see the selection fields.
    However, using the same ODS and InfoCube, if I change the option to full upload in the InfoPackage, I get all the selection fields.
    What could be the reason for this?
    Please help.
    Rajiv

    Somnath,
    Let me explain the complete scenario:
    1) We have an existing ODS that populates an InfoCube on a daily basis (delta upload).
    2) We have created a new InfoCube that fetches data from the same ODS, so two InfoCubes are populated from one ODS.
    3) When we tried to initialize the new InfoCube from the existing ODS, the system asked us to delete the initialization options (Scheduler - initialization options for the source system), and we did so. But by doing this, the delta upload option was removed from the old InfoCube's upload InfoPackage, and the initial upload for the new InfoCube is not displaying any selection fields.
    4) Now I have two questions:
    - How can I reactivate the delta for the old InfoCube without losing any data?
    - How can I upload data into the new InfoCube with selection fields and then start the delta?
    Regards
    Rajiv

  • Is it possible to use the same DataSource for two InfoCubes

    Hi,
    My problem is that in BW we cannot get the value of material at storage location level; in R/3, too, the value is maintained at plant level.
    We then searched and found a how-to document for a summarized display of stock values at storage location level.
    The problem is that we went live last December and we are using 0AFMM_C02, which contains around 18,126,000 records, and according to the note we have to use 0IC_C03.
    Both cubes use the same DataSources, so how do we get the data for 0IC_C03?
    And how do we delete the data of the existing InfoCube? Is it possible to delete data selectively from the InfoCube?
    Please help.
    Regards,
    viren.

    Hi,
    You can't create update rules from the PSA. You can create them from the InfoSource, from an ODS, from cube to cube, or from ODS to ODS.
    In your scenario, you can create update rules from the ODS to the new cube and then transfer the data from there. Or, from the InfoSource, create rules to the new data target, upload the full data, and then set up the delta.
    A third option is to create update rules from the existing cube to the new cube and load all the data once. Then you can deactivate those update rules, since they were needed only for the one-time data transfer.
    Cheers,
    Kedar

  • Open orders are negative in ODS and InfoCube

    Hi,
    Our ODS gets its data from sales order item data, and from the ODS the data goes to an InfoCube.
    The problem is that for a few sales orders, some open orders are deleted in R/3, but the same records come in reversing the old records; because of this, the sales order quantity in the ODS and InfoCube is negative even though the orders themselves are not negative.
    Could anyone let me know how we can resolve this?
    Regards,
    Sharma. IVN

    Hi Sharma,
    You should consider checking the link below:
    ROCANCEL field in R3 extraction program can't catch the LOEKZ (deletion indicator)
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes/sdn_oss_bw_bct/~form/handler%7b5f4150503d3030323030363832353030303030303031393732265f4556454e543d444953504c4159265f4e4e554d3d383836373136%7d
    This may help in this case.
    Regards,
    Pietro

  • Start routine from ODS to InfoCube in BW 3.5

    Hi Folks,
    We have code for a start routine from the ODS to the InfoCube.
    We are doing a FULL load to the ODS (0CRM_QUT1), and from the ODS to the InfoCube we are also doing a FULL load.
    I am planning to change to a DELTA load from the ODS to the InfoCube to improve performance.
    Is there any situation in which, because we have written a start routine from the ODS to the InfoCube, we can't do a DELTA load?
    Please clarify.

    Improving performance is a good thing, and using the delta mechanism is also good for performance, since loads are faster and less time-consuming. Nevertheless, analyse carefully how the solution was built, to understand why a full load was required when the solution was designed. Maybe the full load was used for reasons that are still valid, or the change would require more than just switching from a full to a delta load. In conclusion, it is always good to use delta, but in your case look carefully at the existing coding: it is possible that the lookup is coded to insert records or do other processing that assumes a full load.
    Hope this helps.
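    For illustration, a BW 3.x update-rule start routine that only enriches the records in the data package (a pure lookup) normally stays valid when you switch from full to delta loads, whereas routines that insert extra records or rebuild the target usually do not. The sketch below shows the enrichment pattern; every object and field name in it is hypothetical.
    * Hypothetical enrichment-only start routine: read a status from a
    * lookup ODS and write it into each record of the data package.
    * No records are added or deleted, so the logic behaves the same
    * for full and for delta packages.
      DATA: lt_status TYPE TABLE OF /bic/azstatus00,
            ls_status TYPE /bic/azstatus00.

      IF NOT data_package[] IS INITIAL.
        SELECT * FROM /bic/azstatus00
          INTO TABLE lt_status
          FOR ALL ENTRIES IN data_package
          WHERE doc_number = data_package-doc_number.
      ENDIF.

      LOOP AT data_package.
        READ TABLE lt_status INTO ls_status
          WITH KEY doc_number = data_package-doc_number.
        IF sy-subrc = 0.
          data_package-/bic/zstatus = ls_status-/bic/zstatus.
          MODIFY data_package.
        ENDIF.
      ENDLOOP.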

  • Loading Master Data into ODS

    Hi,
         Can anyone tell me how to load the master data into the ODS?
    Thanks
    Yadav
    Edited by: yadav on Apr 12, 2008 8:04 PM

    Dear Yadav,
    You can load master data into your ODS.
    The process is the same as loading transaction data into an ODS, but:
    1. You need to use a flexible update instead of a direct update.
    2. Maintain all the attributes in the DATA FIELDS part.
    3. The field on which the update depends should reside in the KEY FIELDS section.
    There is, however, a difference between loading master data into an InfoObject and into an ODS: if it is loaded into InfoObjects it is reusable, but in an ODS it is not.
    Hope this helps.
    Assign points if helpful.
    Best Regards,
    VVenkat

  • Error while loading the data from ODS to InfoCube

    Hi,
    I am trying to load data from an ODS to an InfoCube for a particular year,
    but it says there is a source system problem.
    Why is that?
    Please tell me.
    I will assign the points.
    Rizwan

    Hi Rizwan,
    you didn't mention the error message in detail. There are a few places to check:
    - check whether the BW (Myself) source system is active and intact, and reactivate it if necessary
    - check whether the update rules are active, and reactivate them if necessary
    - check whether the ODS is active, and reactivate it if necessary
    Regards,
    Lilly

  • How is the data stored in an InfoCube, and what happens in the back end?

    Hi Experts,
    How is the data stored in an InfoCube and a DSO, and what happens in the back end?
    I mean, a cube contains a fact table and dimension tables; how is the data stored, and what happens in the back end?
    Regards,
    Swetha.

    Hi,
    Please check :
    How is data stored in DSO and Infocube
    InfoCubes are made up of a number of InfoObjects. All InfoObjects (characteristics and key figures) are available independent of the InfoCube. Characteristics refer to master data with their attributes and text descriptions.
    An InfoCube consists of several InfoObjects and is structured according to the star schema. This means there is a (large) fact table that contains the key figures for the InfoCube, as well as several (smaller) dimension tables which surround it. The characteristics of the InfoCube are stored in these dimensions.
    An InfoCube fact table only contains key figures, in contrast to a DataStore object, whose data part can also contain characteristics. The characteristics of an InfoCube are stored in its dimensions.
    The dimensions and the fact table are linked to one another using abstract identification numbers (dimension IDs) which are contained in the key part of the particular database table. As a result, the key figures of the InfoCube relate to the characteristics of the dimension. The characteristics determine the granularity (the degree of detail) at which the key figures are stored in the InfoCube.
    Characteristics that logically belong together (for example, district and area belong to the regional dimension) are grouped together in a dimension. By adhering to this design criterion, dimensions are to a large extent independent of each other, and dimension tables remain small with regards to data volume. This is beneficial in terms of performance. This InfoCube structure is optimized for data analysis.
    The fact table and dimension tables are both relational database tables.
    Characteristics refer to the master data with their attributes and text descriptions. All InfoObjects (characteristics with their master data as well as key figures) are available for all InfoCubes, unlike dimensions, which represent the specific organizational form of characteristics in one InfoCube.
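    To make the DIMID link concrete, the small Open SQL lookup below joins a fact table to one of its dimension tables. It is purely illustrative: it assumes a hypothetical InfoCube ZSALES whose fact table would be /BIC/FZSALES and whose customer dimension table would be /BIC/DZSALES1, following the standard BW naming pattern; none of these objects or field names come from the thread.
    * Illustration only - object and field names are assumptions.
    TYPES: BEGIN OF ty_row,
             sid_customer TYPE i,             " SID of the customer characteristic
             amount       TYPE p DECIMALS 2,  " key figure from the fact table
           END OF ty_row.
    DATA lt_rows TYPE STANDARD TABLE OF ty_row.

    SELECT d~sid_0customer f~/bic/zamount
      FROM /bic/fzsales AS f
      INNER JOIN /bic/dzsales1 AS d
        ON f~key_zsales1 = d~dimid
      INTO TABLE lt_rows
      UP TO 100 ROWS.
    A query on the cube effectively performs joins like this between the fact table, the dimension tables and the SID/master data tables, which is why small, well-designed dimensions help performance.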
    http://help.sap.com/saphelp_nw04s/helpdata/en/4c/89dc37c7f2d67ae10000009b38f889/frameset.htm
    Check the threads below:
    Re: about Star Schema
    Differences between Star Schema and extended Star Schema
    What is the difference between Fact tables F & E?
    Invalid characters errors
    -Vikram

  • Error loading the data into ODS - Message no. BRAIN060?

    Hi,
    I am getting the following error while loading data from a flat file. The data loaded successfully from the flat file to the PSA, but I got the following error while updating the data from the PSA to the data target:
    Value '010384 javablue' (hex. '30003100300033003800340020006A0061007600610062006C') of characteristic 0PO_NUMBER contains invalid characters
    Message no. BRAIN060
    Diagnosis
    The following standard characters are valid in characteristic values as default:
    !"%&''()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ
    Characteristic values are not allowed if they only consist of the character "#" or begin with "!". If the characteristic is compounded, this also applies to each partial key.
    You are trying to load the invalid characteristic value 1 (hexadecimal representation 30003100300033003800340020006A0061007600610062006C).
    I am trying to load the value '010384 javablue' into the 0PO_NUMBER InfoObject of the ODS, in one row together with some other data.
    Any idea or any input to resolve this issue?
    Thanks in advance for any input.
    Steve

    Thanks Soumya. I have maintained the upper-case letters, but I am loading a mix of upper and lower case into the PO number, and it is not working. What is the solution to this? If I set the lowercase property on the PO number InfoObject, it won't accept upper case; if I uncheck lowercase, it won't accept lower-case letters. I cannot add both upper- and lower-case letters in RSKC, because it accepts only up to 72 characters and I already have more than 60 (special characters, numbers and the 26 upper-case letters).
    I have also tried a transfer routine, but there you can only convert everything to lower or to upper case, which doesn't work; we need both upper and lower case for the PO number, and R/3 accepts it. Why doesn't BW accept both?
    Any idea what can be done?
    Thanks in advance for your help.
    Steve
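    For reference, the transfer-routine workaround mentioned above usually amounts to forcing one case on the value before the character check runs, for example converting everything to upper case. The fragment below is only a sketch of that idea (the 3.x-style parameter names and the source field name are assumptions); as Steve points out, it discards the original case, so it does not help when mixed case has to be preserved.
    * Hypothetical transfer-routine fragment for 0PO_NUMBER:
    * pass the value through and force upper case so that the
    * standard character check (BRAIN060) no longer rejects it.
      RESULT = TRAN_STRUCTURE-po_number.
      TRANSLATE RESULT TO UPPER CASE.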

  • Backing up the data to an InfoCube

    Hi sir,
    I have created a planning area (PA) and a time series for the PA, and then created a planning book (PB) and a data view (DV).
    Now I want to add a key figure to the planning area. How?
    If I deactivate the planning area, I will lose the time series data.
    How can I back up the time series data to an InfoCube? What is the path and procedure for backing up the time series data to an InfoCube?

    Hi,
    If you are on SCM 5.1 or above, I believe it is not required to de-initialize the planning area.
    If it is SCM 4.1 or below, we do need to de-initialize the planning area, as you expected.
    Assuming you are on 4.1 or a lower version:
    As I understand from your words, you have created the planning area newly, so there will be no DataSource generated on it yet.
    To take the backup, we need a backup cube and the DataSource generated for the planning area.
    You can do this yourself from the planning area, without help from the BW team:
    Go to MSDP_ADMIN and, from the menu, go to Extras and choose Generate DataSource to generate the DataSource; use the other option, Data Extraction Tools, to create a cube. Please refer to the SAP Help for more detailed steps.
    Once the InfoCube is ready for backup:
    1. Take the backup and do a basic validation against the planning area data.
    2. De-initialize the planning area and add the key figure to the planning area.
    3. Create the time series again and do the consistency check.
    4. Load the data back to the planning area using /SAPAPO/TSCUBE.
    Hope this information helps.
    Yarala

  • Error while loading data from DSO to InfoCube

    Hi All,
    When I run the DTP from the DSO to the cube, I get these errors:
    1. Data package processing terminated
         Message no. RSBK229
    2. 'Processed with Errors'
         Message no. RSBK257
    3. Error while updating to target 0FIGL_C10 (type INFOCUBE)
         Message no. RSBK241
    These are the errors I am getting. I am new to BI, so how can I solve them?
    All the transformations are active with no errors.
    Thanks in advance.
    Regards,
    Mrigesh.
    Edited by: montz2006 on Dec 7, 2009 8:45 AM

    Hi All,
    I have now deleted the data from the DSO and the DataSource and run the DTP again.
    This time I got the data from the DataSource into the DSO, but when I run the DTP between the DSO and the cube I do not find any error.
    However, when I go to the cube and look at the data, the request has arrived but no data has been transferred.
    The request status is green in the cube, but no data is coming in.
    Regards,
    Mrigesh.

  • Loading consolidated data from two Excel files / cubes into one InfoCube

    Hi Friends,
    I am receiving data from two sources:
    Source 1:
    Customer   Product   Location   KeyFig (Budget)
    C1         P1        L1         100
    C2         P1        L1         200
    C1         P2        L1         300
    Source 2:
    Product   Location   KeyFig (Actual)
    P1        L1         320
    P2        L1         350
    I want to combine the data from the two sources (or cubes) into one cube as follows:
    Customer   Product   Location   Budget   Actual
    C1         P1        L1         100      320
    C2         P1        L1         200      320
    C1         P2        L1         300      350
    I tried creating multiple DataSources / InfoSources / transformations / update rules, and also tried with both a DSO and a cube. But the records always end up like this:
    Customer   Product   Location   Budget   Actual
    C1         P1        L1         100      320
    C2         P1        L1         200      320
    C1         P2        L1         300      350
               P1        L1                  320
               P2        L1                  350
    Can you help me figure out whether this is possible? If yes, how can I do it?
    Thanks a lot in advance.

    Hi,
    Please use the approach below.
    Load the budget data into ODS1.
    Load the actual data into ODS2.
    Create an ODS3 with the same structure as ODS1 plus an additional key figure for Actuals; it gets its data from ODS1. Here, add a lookup based on product and location to populate the actuals.
    Start routine:
    * Read the actuals once per data package from the active table of
    * ODS2 (written here as /BIC/AODS200); itab and watab are declared
    * in the routine's global part with the structure of that table.
      SELECT * FROM /bic/aods200 INTO TABLE itab
        FOR ALL ENTRIES IN SOURCE_PACKAGE
        WHERE product  = SOURCE_PACKAGE-product
          AND location = SOURCE_PACKAGE-location.
    Transformation routine (for the Actual key figure):
    * Look up the actual value for the current product/location.
      READ TABLE itab INTO watab
        WITH KEY location = <source_fields>-location
                 product  = <source_fields>-product.
      IF sy-subrc = 0.
        RESULT = watab-actual.
      ENDIF.
    -Vikram

  • Regarding loading the data into ODS

    Hi all,
    I have a situation where I had filled an ODS with data. Now a few fields have to be added to the ODS for reporting purposes. I have added the fields, but I am unsure how to fill just those fields in the ODS so that the data can be shown in reports on that ODS. Are there any prerequisites or precautions that have to be taken?
    Regards
    YJ

    Hi,
    Just write a program and execute it to fill the added field, for example:
    * int_tab is assumed to have been filled beforehand with the key
    * fields of the ODS plus the value for the added field; /bic/aODS00
    * stands for the active table of the ODS.
    if sy-subrc <> 0.
      message s000 with 'No records selected for the specified criteria.'.
    else.
      loop at int_tab.
        update /bic/aODS00 set
                  added_field = int_tab-added_field
              where condition.
        if sy-subrc = 0.
          counter = counter + 1.
        endif.
      endloop.
    endif.
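    Put together, a complete fill program could look roughly like the sketch below. Everything in it is an assumption for illustration: the ODS active table /BIC/AZSALES00, the key field, the new field, and the source table ZSRC_VALUES that supplies the values. Note that it updates the ODS active table directly, so the filled values will not appear in the change log.
    REPORT z_fill_new_ods_field.
    * Hedged sketch: fill a newly added field in the active table of an ODS.
    TYPES: BEGIN OF ty_src,
             doc_number TYPE c LENGTH 10,   " ODS key field (assumed)
             new_field  TYPE c LENGTH 20,   " value for the added field
           END OF ty_src.
    DATA: int_tab TYPE STANDARD TABLE OF ty_src,
          wa_src  TYPE ty_src,
          counter TYPE i.

    * Fill int_tab from wherever the new values come from
    * (here a made-up source table ZSRC_VALUES).
    SELECT doc_number new_field FROM zsrc_values INTO TABLE int_tab.

    IF int_tab IS INITIAL.
      WRITE: / 'No records selected for the specified criteria.'.
    ELSE.
      LOOP AT int_tab INTO wa_src.
        UPDATE /bic/azsales00
           SET /bic/znewfld = wa_src-new_field
         WHERE doc_number   = wa_src-doc_number.
        IF sy-subrc = 0.
          counter = counter + 1.
        ENDIF.
      ENDLOOP.
      WRITE: / counter, 'records updated.'.
    ENDIF.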

  • 0IC_C03 info cube loading issue

    Hi,
    While loading the 0IC_C03 (Inventory) InfoCube from the 2LIS_03_BF
    extractor, the job takes a long time, both while loading up to the PSA and
    while loading the data from the PSA to the InfoCube through the DTP.
    When I checked the same with the other two extractors, 2LIS_03_BX and
    2LIS_03_UM, it worked fine. I checked the process monitor
    in SM50 in BI, and it showed ACTION: Direct Read, TABLE: NRIV.
    Regards,
    Prabhakar.

    Oops, you must be in the interval definition.
    Execute transaction SNRO.
    Enter your number range object ID.
    Switch the number range definition to change mode (the pencil icon).
    From the menu, choose Goto -> Buffer -> Memory.
    A new field should then be displayed, something like "Buffer size". 500 is a good start (not too small, not too big).
    Regards,
    Fred

  • How to update DSO data in Data targets (ODS data into Data Targets)

    Hi,
    In the 3.5 version we have the option <b>Update ODS Data in Data Targets</b>
    to transfer data from an ODS to an InfoCube.
    In 7.0, how do I transfer DataStore object data to an InfoCube? I can't find
    an Update DSO Data in Data Targets option.
    Can anybody help me solve this problem?
    Thanks
    prasanna

    Hi Prasanna,
    0RECORDMODE is an automatically generated InfoObject that needs to be mapped if you are going for a delta update from your DSO; otherwise you can proceed without mapping 0RECORDMODE and ignore the system-generated warning.
    Check this for the <b>creation of data targets</b>:
    •     In the left panel, select InfoProvider.
    •     Select the created InfoArea and right-click to create a DataStore object (ODS) or a cube.
    •     Specify a name for the DSO or cube and click Create.
    •     From the template window, select the required characteristics and key figures and drag and drop them into the DATA FIELDS and KEY FIELDS sections.
    •     Click Activate.
    •     Right-click the DSO or cube and select Create Transformation.
    •     In the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is being stored.
    •     Activate the created transformation.
    •     Create a Data Transfer Process (DTP) by right-clicking the master data attributes.
    •     In the Extraction tab, specify the extraction mode (full).
    •     In the Update tab, specify the error handling (request green).
    •     Activate the DTP, and in the Execute tab click the Execute button to load data into the data targets.
    cheers
    Sunil
