Loading InfoCube 0PUR_C03

Hi,
When I install InfoCube 0PUR_C03 from Business Content, update rules are created for DataSource 2LIS_02_S015. The DataSource returns no data when I test it in RSA3.
Is there any customizing I should do to get data?
thanks in advance
JP

Hi JP,
I do not think there is a new extractor replacing 2LIS_02_S015; I do not see one in the documentation.
http://help.sap.com/saphelp_bic735/helpdata/en/7e/09693724f38932e10000009b38f8cf/content.htm
For some of the other old DataSources there are new DataSources, and these are listed in the documentation.
http://help.sap.com/saphelp_bic735/helpdata/en/c5/a26f37eb997866e10000009b38f8cf/content.htm
I guess you could use 2LIS_02_S015 by following the steps below.
1. In transaction SBIW in ECC, go to Settings for Application-Specific DataSources (PI) --> Logistics --> Managing Transfer Information Structures --> Application-Specific Setup of Statistical Data --> Activate/Deactivate Update. Select info structure S015 and click the details button above. Choose Asynchronous update / Asynchronous collective update; the second one, I guess, is like the V3 update we schedule between the extraction queue and the delta queue. Flag the 'Activate delta update' checkbox.
2. In the same path as above, choose Perform Setup - Purchasing --> PURCHIS Statistical Setup. Enter info structure S015 and run it in the background from the Program menu.
3. Go back and choose LIS Setup Log to check whether documents have been filled.
4. Use RSA3 to see if you get data.
Best Regards,
Murali.

Similar Messages

  • What is the use of parallelization in loading an InfoCube with the help of a DTP

    What is the use of parallelization in loading an InfoCube with the help of a DTP?

    An index can improve read performance when data is searched by values of fields contained in the index.
    Whether the index is actually used depends on the database optimizer, which decides after taking the following into consideration:
    - the size of the table
    - the fields in the index compared to the fields in the statement
    - the quality of the index (its clustering factor)
    One drawback of indexes is that they decrease write performance, since they must be maintained during every write to the table.
    Hope it helps you,
    Gilad
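The optimizer behaviour Gilad describes can be observed directly in any SQL database. A minimal illustrative sketch using SQLite (not SAP-specific; the table and index names are invented):

```python
# Illustrative sketch: the query planner scans the table until an index
# exists, after which it can seek -- but the index must then be maintained
# on every write.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE fact (doc_no INTEGER, amount REAL)")
con.executemany("INSERT INTO fact VALUES (?, ?)",
                [(i, i * 1.5) for i in range(1000)])

# Without an index the optimizer has to scan the whole table.
plan = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM fact WHERE doc_no = 500").fetchall()
print(plan[0][3])  # e.g. "SCAN fact"

# With an index the optimizer can seek directly -- at the cost of
# maintaining the index on every subsequent INSERT/UPDATE/DELETE.
con.execute("CREATE INDEX idx_doc ON fact (doc_no)")
plan = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM fact WHERE doc_no = 500").fetchall()
print(plan[0][3])  # e.g. "SEARCH fact USING INDEX idx_doc (doc_no=?)"
```

The exact plan wording varies between SQLite versions, but the switch from a scan to an index search is the point being illustrated.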

  • "****Error from PSA****" error message when loading InfoCube

    Hi all,
    I am getting the following error message when loading transaction data to an InfoCube.
    ***Error from PSA***
    This is not MD or Hierarchies (like previous messages in the forum)
    Please advise.
    Thanks,
    Miki

    Hi Miki,
    Before loading the data into the InfoCube, check the PSA table and look at the data in it.
    Then you may find the solution.
    But as Ronald said, this should not be the complete error message.
    Bye.

  • Loading Infocube OSD_C03

    Hi,
    I am practicing BI 7 using a standard InfoCube and DataSource 2LIS_12_VCITM. So far I am familiar with BW 3.5, but I am new to BI 7. When I log on to BI 7 I see DataSources with a small square in front of them; I learned that these are BW 3.x DataSources in BI 7. Can I use these DataSources to load data? And can anyone give me a step-by-step procedure for activating the cube and DataSources and for creating transformations and DTPs? My understanding is that an InfoPackage loads data from the DataSource to the PSA, and DTPs load data from the PSA to the data target. Is my understanding correct?
    Please can anyone help me.
    Thanks,
    Raghavendra Edara.

    Hi
    Check these links
    http://help.sap.com/saphelp_nw04s/helpdata/en/b2/e50138fede083de10000009b38f8cf/frameset.htm
    Regards,
    Sumit

  • Help with loading infocube from flat file

    Hi:
    I am using SCM 5.1 and the BI 7.0 component that comes with it. I am trying to create an infocube and load it with flat file data but getting errors.
    1. First I created the infoobjects and the infocube.
    2a. I first tried to create and activate only the DataSource. This was not successful.
    2b. I then tried to create the application component, InfoSource, etc. (as one does in BW 3.0), but was still not successful.
    If you have any pointers, could you please let me know? Thanks.
    Satish

    You can go through the following Blog for flat file loads.
    https://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/60debae1-84dd-2b10-e7bf-bdedf1eabdf9
    If you are getting any specific error, then you can post it here.
    Regards,
    Gaurav

  • APO DP - loading InfoCube data into a planning area

    I am using APO DP V5.
    In my planning object structure I have planning characteristics, some of which have 'navigation attribute' characteristics. For example, my planning characteristic 'product' has navigation attribute 'product group'.
    In the standard transaction to load data from an InfoCube to a planning area, can the characteristics used in this transaction be navigation attribute characteristics, or do they have to be planning characteristics?
    So, for example, could I load data at 'product group' level?
    Thanks,
    Bob Austin, Atos Origin

    I am not sure you can load data from the cube to a navigation attribute (product group).
    You might need to maintain it in the Administrator Workbench and also assign values there. I am not sure you will get the attribute characteristics on your TSCUBE loading screen.

  • Error During Load Infocube from ODS

    Hi guys, I have an InfoCube and an ODS. I load data into the ODS directly from SAP and it is OK, but I get an error when I try to load data from the ODS to the InfoCube. I created an update rule from the ODS to the InfoCube, but the load request fails with "Load Request is invalid". I have tried loading from the ODS to the InfoCube in several ways (Full, Init, Delta) but get the same error in all of them. Do I need to do something first, or change the data targets in some way?
    I hope somebody can help me with this.

    Hi,
    Activate the data in your ODS. Then right-click your ODS and choose "Create Export DataSource". Now right-click your ODS, go to "Manage", and make sure that the QM status is green and the data mart status is checked with a green mark. Then carry on with the load from the ODS to the InfoCube.
    Hope this helps... let me know.
    Regards,
    R.Ravi

  • Error loading infocube

    Hi all,
    I am getting the following error when I try to load the InfoCube from a flat file.
    Error message when processing in the Business Warehouse
    Diagnosis
    An error occurred in the SAP BW when processing the data. The error is documented in an error message.
    System response
    A caller 01, 02 or equal to or greater than 20 contains an error message.
    Further analysis:
    The error message(s) was (were) sent by:
    Update
    Procedure
    Check the error message (pushbutton below the text).
    Select the message in the message dialog box, and look at the long text for further information.
    Follow the instructions in the message.
    This is the error message i am getting:
    Data records for package 1 selected in PSA - 1 error(s)
    Error 4 in the update     
    The update rules and transfer rules are correct and active and all infoobjects are active.
    Thanks a lot.

    hi Bhanu/all,
    the data looks as follows in the psa.
    RED sts     1     1     s1     300
    @5B@     1     2     s2     400
    @5B@     1     3     s3     500
    @5B@     1     4     s4     600
    @5B@     1     5     s5     700
    @5B@     1     6     s6     800
    @5B@     1     7     s7     900
    @5B@     1     8     s8     1000
    @5B@     1     9     s9     1100
    @5B@     1     10     s10     1200
    This is the PSA maintenance screen, and I cannot see any pencil icon to edit the first record. There is no special character or space that I can see here, but when I click on the error status (red status) it states: Value 's1 ' (hex. '7331 ') of characteristic ZSID contains invalid characters, BRAIN 60.
    To edit PSA data, do we need to select only PSA in the processing options? Let me know.
    thanks.
    Message was edited by: Prathibha E
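The hex dump quoted in the BRAIN 60 message can be decoded to see exactly which characters BW is rejecting; by default only upper-case letters, digits and a few symbols are permitted unless more are declared via transaction RSKC. A small illustrative sketch (the allowed set below is a simplified stand-in, not the exact BW default):

```python
# Decode the hex dump quoted in the BRAIN 60 message.
raw = bytes.fromhex("7331").decode("ascii")
print(repr(raw))  # 's1'

# BW only accepts characters declared as permitted (transaction RSKC);
# lower-case letters are not in the default set. This set is simplified.
ALLOWED = set("ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 -_")
bad = [c for c in raw if c not in ALLOWED]
print(bad)  # ['s'] -- the lower-case 's' is what BW rejects, not a hidden space
```

So the fix is either to convert the value to upper case in the transfer rules, or to add lower-case letters to the permitted characters in RSKC.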

  • Process Chain taking long time in loading data in infocube

    Dear Expert,
    We are loading data through a process chain into the AR cube; it takes data from
    PSA -> DSO -> Activation -> Index Deletion -> DTP (load InfoCube) -> Index Creation -> Create Aggregates.
    The index creation step takes a long time every day, around 9 to 10 hours. When we go into RSRV and repair the InfoCube, the data load happens fast, so we run RSRV every day. In DB02 we have seen that 96% of the tablespace is used.
    Please suggest a permanent solution, and tell us whether this is a BI issue or a Basis issue.
    Regards,
    Ankit

    Hi ,
    We are loading data through a process chain into the AR cube; it takes data from
    PSA -> DSO -> Activation -> Index Deletion -> DTP (load InfoCube) -> Index Creation -> Create Aggregates.
    In the steps above, instead of Create Aggregates it should be the roll-up process for aggregates.
    You can ask the Basis team to check the tablespace in transaction DB02OLD/DB02.
    Check whether there is a long-running job in SM66/SM50 and kill that job.
    Check that there are enough batch processes to perform the steps.
    Hope this helps.
    Br
    Alok

  • BI 7 creation and loading of ods ,infocube in BI 7

    Hi all, I am a beginner in SAP BI 7. Can anyone send me easy steps to create and load an InfoCube and an ODS in BI 7? Thanks in advance.
    Tom Jogy

    HI Tom,
    Check the below links:
    DSO Creation in BI 7:
    http://help.sap.com/saphelp_nw04s/helpdata/en/4a/e71f39488fee0ce10000000a114084/content.htm
    InfoCube Creation in BI 7:
    http://help.sap.com/saphelp_nw04s/helpdata/en/80/1a643fe07211d2acb80000e829fbfe/content.htm
    Hope these are useful to you, step by step.
    Thanks
    Rajesh

  • Loading 2 data Sources in One Infocube

    Hello,
    I need to load an InfoCube using two DataSources, 2LIS_11_VAHDR and 2LIS_11_VAITM, in BI 7.0. Is this possible?
    I need to take 10 fields from 2LIS_11_VAHDR and 8 fields from 2LIS_11_VAITM and load them into the InfoCube.
    Do I need to create two transformations under the InfoCube?
    I searched SDN but couldn't find any blog on this.
    If anybody has a document, please share it; I will assign full points.
    Thanks

    How is this possible? Let me give an example.
    Data in 2LIS_11_VAHDR is as follows:
    Doc no:    12345
    CCode:     cp123
    Data in my DataSource 2LIS_11_VAITM is as follows:
    MatDesc: Finish goods
    In my first transformation, Doc no and CCode will be mapped to the InfoCube.
    In the second transformation, MatDesc will be mapped to the same InfoCube.
    My doubt is: how do we know that MatDesc is the correct material description for Doc no 12345?
    Where is the relationship between the two DataSources?
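The relationship asked about here is the document number, which both extractors carry: rows written by the two transformations belong together in the cube only because they share that characteristic. A hypothetical sketch with invented field names:

```python
# Hypothetical header (VAHDR-like) and item (VAITM-like) records;
# field names are invented for illustration only.
headers = [{"doc_no": "12345", "comp_code": "cp123"}]
items = [{"doc_no": "12345", "item": "10", "mat_desc": "Finish goods"}]

# Each transformation writes its own fields, but records relate to each
# other through the shared document-number characteristic.
by_doc = {h["doc_no"]: h for h in headers}
for it in items:
    row = {**by_doc.get(it["doc_no"], {}), **it}
    print(row)  # header and item fields combined for document 12345
```

In the cube itself no such join is executed; a query simply aggregates both sets of rows over the common document-number characteristic.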
    Thanks

  • InfoCube Load

    Hi,
    I loaded the InfoCube with historical data from the years 2000 to 2006; this is a one-time load.
    Every month I have to load the current year and month data, for example Jan 2007.
    The initial load should stay in the cube unchanged, but every time I load the current year and month, the current month's data should be overwritten.
    For example:
    In Jan 2007 I will load the Jan 2007 data.
    In Feb 2007 I will load Jan and Feb 2007; the system should delete the Jan 2007 data and load the Jan and Feb 2007 data.
    My final output will be the initial load plus the Jan and Feb 2007 data.
    Is this possible?

    Hello Roy,
    You can achieve this using process chains.
    Create a process chain with the process type "Delete Overlapping Requests from InfoCube".
    Inside that process, select the deletion conditions (e.g., if the InfoSource is the same and the InfoPackage is the same, delete the previous request).
    You have to load your initial data for the years 2000 to 2006 manually.
    Cheers
    Praveen
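The monthly overwrite Praveen describes can be sketched as follows; the request structure and the containment rule are simplified illustrations, not the actual process-type logic:

```python
# Sketch of "Delete Overlapping Requests" for the monthly reload:
# a new request whose selection covers an earlier request replaces it;
# the one-time historical load is never touched.
requests = [
    {"id": 1, "selection": ("2000", "2006")},        # initial historical load
    {"id": 2, "selection": ("2007-01", "2007-01")},  # Jan 2007 load
]

def load(requests, new_req):
    """Drop prior requests whose selection lies inside the new one, then append."""
    lo, hi = new_req["selection"]
    kept = [r for r in requests
            if not (lo <= r["selection"][0] and r["selection"][1] <= hi)]
    return kept + [new_req]

# The February run reloads Jan+Feb; the old Jan-only request is deleted.
requests = load(requests, {"id": 3, "selection": ("2007-01", "2007-02")})
print([r["id"] for r in requests])  # [1, 3]
```

The historical request survives because its selection falls outside the new request's range, which is exactly the behaviour needed here.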

  • Problems with delete overlapping requests from InfoCube in PC

    Hi guys,
    I'm using Delete Overlapping Requests from InfoCube in process chains, but I'm not able to adjust it to my specific requirement.
    For example:
    I execute a DTP to load InfoCube XPTO with fiscal years 2008 and 2009. After this I have to load again, but only for 2009.
    In this specific example I want my process chain to delete the 2009 data from my first load, because it is overlapped, and to leave the 2008 data.
    Is this possible? If yes how?
    Thanks in advance
    João Arvanas

    It will not work that way.
    It checks whether the selections are the same: if they are, it deletes the earlier request; if not, it does not perform the deletion.
    The overlapping setting you may have chosen deletes overlapping requests only for identical selections.
    So in this case the selections are different, and hence it is not possible.
    Thanks
    Murali
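Murali's rule can be stated compactly: the process type compares the selections and deletes only on a match, so a narrower reload does not carve data out of a wider earlier request. A simplified illustration (selection handling is reduced to a plain equality check):

```python
# Simplified rule: "Delete Overlapping Requests" removes an earlier
# request only when its selections equal those of the new request.
def should_delete(old_selection, new_selection):
    return old_selection == new_selection

first_load = {"0FISCYEAR": {"2008", "2009"}}   # first DTP load
second_load = {"0FISCYEAR": {"2009"}}          # reload of 2009 only

# Selections differ, so the 2008/2009 request stays in the cube --
# the 2009 portion is not deleted automatically.
print(should_delete(first_load, second_load))                   # False
print(should_delete(first_load, {"0FISCYEAR": {"2008", "2009"}}))  # True
```

To get the behaviour João wants, the loads would have to be split so that each year is its own request with its own selection.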

  • Inventory Cube Loading - Questions....

    This is the process I intend to follow to load InfoCube 0IC_C03 and the questions therein. I have the "how to handle inventory scenarios" document, so please don't suggest I read that.
    1A) Delete Set-up tables for application 03
    1B) Lock Users
    1C) Flush LBWQ and delete entries in RSA7 related to application 03 (optional - only if needed)
    2A) Fill set up tables for 2LIS_03_BX. Do Full load into BW Inventory Cube with "generate initial status" in InfoPackage.
    2B) Compress request with a marker update
    3A) Fill set up table for 2LIS_03_BF with posting date 01/01/2006 to 12/31/2007 (Historical) and load to Inventory Cube with a full update
          QUESTION1: Does this need a marker update?
          QUESTION2: Is this a good strategy  - do movement documents that old get updated once posted?
          QUESTION3: Does the posting dates restriction depend on on how far back in history we want to go and look at stock values?
    3B) Fill set up table for 2LIS_03_BF with posting date 01/01/2008 to 9/9/9999 (Current) and load to Inventory Cube with a delta init with data transfer.
    3C) Compress load in 3B without a marker update
    4A) Fill set up table for 2LIS_03_UM  and load to Inventory Cube with a delta init
    4B) Compress load in 4A without a marker update
          QUESTION4: How should we select the posting date criteria? Do I need to load from 01/01/2006 to 9/9/9999 since that's the range used for BF?
    5) Start V3 update jobs via Job Control
    6) Initiate subsequent delta loads from BF and UM and compress with marker update
    QUESTION 5: Is the sequence of loading BX first, then BF and UM fixed? or can I do BF and then BX, UM
    QUESTION 6: Any tips on minimizing downtime in this particular scenario? Please don't suggest generic steps. If you can suggest something specific to this situation, that'd be great.
    I hope you can help with the 6 questions I asked above.
    Regards,
    Anita S.

    Hi Anita,
    Please find my answers below. I have worked enough with this scenario and hence feel that these would be worth considering for your scenario.
    3A) Fill set up table for 2LIS_03_BF with posting date 01/01/2006 to 12/31/2007 (Historical) and load to Inventory Cube with a full update
    QUESTION1: Does this need a marker update?
    In this step we dont need marker update while compressing.
    QUESTION2: Is this a good strategy - do movement documents that old get updated once posted?
    I am not able to understand the question clearly.
    QUESTION3: Does the posting dates restriction depend on on how far back in history we want to go and look at stock values?
    Yes. We need to start from the latest data and then go back as far as we want to see the stock values. This holds true when we are using non-cumulative key figures.
    4B) Compress load in 4A without a marker update
    QUESTION4: How should we select the posting date criteria? Do I need to load from 01/01/2006 to 9/9/9999 since that's the range used for BF?
    No need to provide any selection criteria for UM while loading to BW, as this would fetch the same data filled in the setup of revaluations. Unless you are looking only for some history, you can fill the setup for the company code list and bring the whole data set to BW with a single full load, as the data will not be as huge as for BF.
    6) Initiate subsequent delta loads from BF and UM and compress with marker update
    QUESTION 5: Is the sequence of loading BX first, then BF and UM fixed? or can I do BF and then BX, UM
    This is fixed in terms of compression with marker updates.
    QUESTION 6: Any tips on minimizing downtime in this particular scenario? Please don't suggest generic steps. If you can suggest something specific to this situation, that'd be great.
    Yes. The most time-consuming activity is filling the BF history; it is negligible for BX and UM by comparison.
    Either try multiple selective BF setup fills based on posting date, depending on the available background processes, or fill the setup for the open months during the downtime and the rest of the history once you unlock the system for postings. The reason for this is that we do not expect any postings for history.
    Feel free to ask any further questions about this.
    Naveen.A
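The marker rules in the steps above (compress BX and later deltas with marker update, historical BF without) follow from how non-cumulative key figures are evaluated: current stock is kept as a reference point ("marker") and past values are derived by rolling it back through movements. A simplified numeric sketch, with all values invented:

```python
# Simplified sketch of the non-cumulative "marker" logic behind the
# BX / BF compression rules above.
marker = 0      # the stock marker (reference point) in the cube
movements = []  # (posting_date, quantity) movement records

# 2A/2B: the BX snapshot is compressed WITH marker update -> sets the marker.
marker = 120

# 3A-3C: historical BF movements are compressed WITHOUT marker update:
# they explain the past but must not change the current reference point.
movements += [("2006-03-01", +50), ("2007-06-01", -30)]

# Step 6: subsequent deltas are compressed WITH marker update.
delta = [("2008-02-01", +10)]
movements += delta
marker += sum(q for _, q in delta)
print(marker)  # current stock: 130

# Stock at an earlier date = marker rolled back through all movements
# posted after that date.
as_of = "2006-12-31"
stock_then = marker - sum(q for d, q in movements if d > as_of)
print(stock_then)  # 150
```

This is why the BX-first sequence is fixed for marker compression: without the snapshot in place, there is no reference point for the deltas to update or for history to roll back from.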

  • Inventory Management loading

    hi xperts,
    I am loading InfoCube 0IC_C03 from the InfoSource 2LIS_03_BX with Business Content update rules.
    But no data is coming into the cube. I set the option GENERATE INITIAL STATUS, and I checked RSA3 in the source system; the data is there.
    Please, can somebody help me with this Inventory Management load?

    Hi,
    You have to run MCNB in R/3 to perform the stock initialization.
    You do this via SBIW - Settings for Application-Specific DataSources - Logistics - Settings: Inventory Controlling - Stock Initialization.
    Kind regards
    /martin
