Error when loading from ODS to Cube

Hello Friends,
I am having trouble loading data from an ODS to an InfoCube in my 2004s system. When loading data I get this message:
07/03/2007     13:10:25     Data target 'ODSXYZ ' removed from list of loadable targets; not loadable.
07/03/2007     13:28:42     Data target 'ODSXYZ ' is not active or is incorrect; no loading allowed     
I checked for ODSXYZ among my data targets, but there is nothing by that name. Even the InfoPackage doesn't have it. What needs to be done? Please help.
Thanks.

It's expected behavior. When you migrate your DataSource, the InfoPackages associated with it grey out all the data targets they were feeding before; that applies to any InfoPackage you create even after the migration. You won't be able to delete it.
That said, this shouldn't impact your loads from ODS to cube, since those are handled by your DTPs rather than your InfoPackages.
A few questions:
How are you loading your cube?
Did the data get through fine to the PSA with the InfoPackage in question?
How did you load your DSO (assuming the load was successful)?

Similar Messages

  • Error while loading from ODS to CUBE

    Hi guys,
I am loading data from the source system to an ODS and from the ODS to a cube. The data came through successfully from the source system to the ODS, but the load from ODS to cube shows an error at data packet 13. In the cube, I loaded the data using "Update ODS data to data target"; it offered two options, FULL UPDATE and INIT UPDATE, and I selected FULL UPDATE. In the Processing tab, only one option was enabled, "Only data target", and I loaded with that. Now, after getting the error, where can I correct it? There is no PSA option in monitoring.
    Otherwise, how can I enable the PSA option in the Processing tab of the InfoPackage? I know that when we load data from one target to another, the only option available in the Processing tab is "Only data target". How can I change that option and correct the error?
    Thanks
    Rajesh

    Hi,
I solved my question as follows:
    Go to the monitor for the cube load and select the option to read everything manually; it opens another screen for correcting the records. Correct all the records and load the data again.
    Thanks
    Rajesh

  • Loading from ODS to Cube in process chain

    Hi Experts,
How can I do a full load from ODS to cube when using further processing in a process chain? Your help is much appreciated.
    Thanks,
    Bill

    Hi,
You can use a DTP for this:
    Create a transformation between the DSO and the cube.
    Create a DTP and run it.
    Some earlier threads on cube-to-cube loading that may also help:
    - Loading data from one cube to another cube
    - Cube to cube data loading
    - How to upload data from cube to cube
    - Can we pull data from one cube to another cube
    - Data load steps: reading data from another cube
    Hope this helps.
    Thanks,
    JituK

  • Can I do Parallel Full loads from ODS to Cube.

Hi,
       Usually I do a full update load from ODS to cube. To speed up the process, can I do parallel full-update loads from ODS to cube?
    Please advise.
    Thanks, Vijay,

    Assuming that the only connection we are talking about is between a single ODS and a single Cube.
I think the only time you could speed anything up is in a full drop-and-reload scenario. You could create multiple InfoPackages based on selections and execute them simultaneously.
    If the update is a delta there is really no way to do it.
    How many records are we talking about? Is there logic in the update rule?
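The "multiple InfoPackages based on selection" idea above can be sketched in plain Python (the field and the value range are hypothetical): split the full-load key range into contiguous, non-overlapping intervals, one per InfoPackage, so the parallel loads never overlap or miss a record.

```python
# Sketch: partition a numeric selection range into non-overlapping chunks,
# one per parallel full-load InfoPackage. Values here are made up.
def partition_range(low, high, n_packages):
    """Split [low, high] into n_packages contiguous, non-overlapping intervals."""
    size, rem = divmod(high - low + 1, n_packages)
    chunks, start = [], low
    for i in range(n_packages):
        # spread any remainder over the first `rem` chunks
        end = start + size - 1 + (1 if i < rem else 0)
        chunks.append((start, end))
        start = end + 1
    return chunks

# e.g. four InfoPackages over document numbers 1..1000000
selections = partition_range(1, 1_000_000, 4)
```

Each tuple would become the selection range of one InfoPackage; together they cover the full interval exactly once.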

  • Index for loads from ODS To Cube

The load from ODS to cube is taking a long time. In the start routine, another ODS is looked up; the lookup keys are, say, X and Y.
    There is already an index on keys X, Y and Z.
    Will this index be used for the select on that ODS, or do I need to create a new index with only the X and Y keys?
    Thanks

When you run the start routine, run an SQL trace (transaction ST05); that will tell you whether the index is being used.
    Arun
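On the index question itself: a composite index on (X, Y, Z) can generally serve a selection on X and Y alone, because those are the index's leading columns. A quick illustration using SQLite (any database with composite B-tree indexes behaves similarly; the table and column names are made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE lookup_ods (x INT, y INT, z INT, val TEXT)")
con.execute("CREATE INDEX idx_xyz ON lookup_ods (x, y, z)")

# Filter on only the leading columns (x, y) of the (x, y, z) index
plan = con.execute(
    "EXPLAIN QUERY PLAN SELECT val FROM lookup_ods WHERE x = 1 AND y = 2"
).fetchall()
plan_text = " ".join(str(row[-1]) for row in plan)
# the plan names idx_xyz: the composite index is usable for the (x, y) prefix
```

A separate index on X and Y alone would only be worth creating if the existing (X, Y, Z) index were ordered differently, e.g. (Z, X, Y), where X and Y are not a leading prefix; an ST05 trace is the way to confirm either way.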

  • Problem when loading from ODS to the CUBE

    Hi Experts,
    I am facing an unusual problem when loading data from ODS to the cube.
    I have a first level ODS where the delta postings are updated. I have checked the active table of the ODS and the data is accurate.
    I have deleted the entire data from the Cube and trying to do a full load from ODS to the CUBE.
I am sure that when I run a full load, the data comes from the active table of the ODS.
    After the full load, the key figure values are 0. I tried testing by loading a couple of sales documents, and the key figure values are still 0.
    When I do a full load, the data should be picked up exactly as it is in the active table of the ODS.
    I also don't have any fancy routines in the update rules.
    Please help me in this regard.
    Regards
    Raghu

    Hi,
Check whether you followed the procedure exactly. Here are the basic steps; let me know how it goes:
    o     First prepare a flat file in Microsoft Excel for the master data and the transaction data, save it in .CSV format, and close it.
    o     Select the InfoObjects option and create an InfoArea, then an InfoObject catalog, then the characteristics and key figures. Create 'id' as a characteristic with name as an attribute and activate it; under key figures, create 'no' and activate it.
    o     Under InfoSources, create an application component, then an InfoSource with direct update for master data and flexible update for transaction data. For the master-data flat file, create an InfoPackage and execute it.
    o     For transaction data, go to the InfoProvider, right-click and choose Create ODS Object. Give the ODS a name, and on the next screen drag the characteristics to the key fields.
    o     Activate it. The update rules will then give an error; go to the communication structure, add 0RECORDMODE and activate it, then go back to the update rules and activate them.
    o     Go to the InfoSource, create an InfoPackage, maintain the external data, processing, update and data target tabs, schedule it, click Start, and open the monitor.
    o     If you cannot see the records, go back to the InfoProvider, right-click the ODS, and choose the option to activate the data in the ODS.
    o     Select the QM status (it will be yellow), double-click it, set it to green, click Continue, then save and close.
    o     Go back to the InfoSource and the InfoPackage, right-click the package, schedule it without changing any of the tabs, then click the Monitor tab and refresh until the request turns green.
    o     Once it is green, click the data target and check that the file was loaded.
    o     Now go to the InfoProvider, right-click and create an InfoCube. Give it a name and drag the key figures, time characteristics, and characteristics to their respective sides, then activate it.
    o     Create the dimensions, assign them, and activate the cube. Then right-click the ODS, choose Create update rules, select the ODS option, give the name, and activate.
    o     Come back to the main screen and refresh. When you see the generated '8...' export InfoSource, you will know that the data mart concept is in use.
    o     Right-click the ODS and choose the option to update the ODS data to the data target. Another screen opens with two options, full update and initial update; select initial update.
    o     In the InfoPackage that opens (external data, data targets, scheduler tabs), select the option to start later in the background, choose Immediate, click Save (the screen closes automatically), then click Start.
    o     Finally, open the monitor and check the contents to verify the loaded data.
    regards
    ashwin

  • Data loading from ODS to CUBE

    Hi All,
I have loaded data from an ODS to a cube. Now I have a requirement to add some fields to the standard cube, so for testing purposes I created a copy of the original cube and created a transformation. Now when I try to load data from the ODS, it says no more data is available, although the data is still there in the ODS.
    What should I do now? I don't want to delete data from the original cube. Is there any other way to load the data through the transformation?
    Regards,
    Komik Shah

    Hi,
Check the DTP of the old cube and see whether it is delta. If it is, check whether either of the following is checked:
    1) Get delta only once
    2) Get data by request
    If 1) is checked, the delta won't come through for the second cube, since the setting says to get the delta only once and that delta is already in one of the cubes.
    Generally both should be unchecked, but this can vary as per requirements.
    For your new DTP, I don't think it will allow you to change to FULL. If it does allow you to select FULL, select it, choose extraction from the active table, try the load, and see.
    regds,
    Shashank

  • Automatic loading from ODS to Cube in 3.5

    Hi All
I was under the impression that in version 3.5, in order to load a delta from ODS to cube, you had to run the 8-series InfoPackage.
    However, I have recently noticed that this InfoPackage runs automatically after a delta load into the ODS, even when the load is not via a process chain.
    Can somebody tell me where and how this setting is maintained?
    Regards
    A

    Hi,
Go to the ODS in display mode and check whether "Update Data Automatically" is ticked in Settings.
    Regards,
    Kams

  • Loading from ODS to Cube

    I have a process chain for GL.
Data was first loaded to the ODS and then to the cube.
    Now no update to the cube is scheduled, so I removed the 'Further Update' process from the process chain.
    During loading I get the following warning message:
    There must be a type "Update ODS Object Data (Further Update)" process behind process "Activate ODS Object Data" var. ACTIVATE_ODS
    Please suggest a way to remove this warning message.
    Thanx,
    Vishal

    Hi Vishal,
since you removed the 'Further Update' process from the process chain, this message is coming. I take it that instead of that process you are loading data from ODS to cube using an InfoPackage in the process chain.
    In the ODS settings, uncheck the further update to data targets option (again assuming you are loading data from ODS to cube using an InfoPackage in the process chain).
    Best Regards.

  • Data loads from ODS TO CUBE

    Hi,
I have delta loads coming into the ODS. I do a full update from ODS to the cube by date range for material movements. The last time I loaded the data, only a few records loaded for the date range; the rest did not load and are sitting in the ODS. This is a full load and I tried to load again. Any suggestions?
    sp

    Hi Srinivas,
Check your update rules between the ODS and the cube to see whether they are mapped properly (and check your date range for the cube load).
             Do an init load and then do the delta load.
    Hope this helps.

  • ARFCSTATE = SYSFAIL ??????? loading from ODS to cube

    Hi,
I am loading a delta initialization from ODS to a cube. The load runs but never ends. I looked in transaction SM37; the job has finished, but with these messages at the end:
tRFC: Data Package = 6, TID = 0A11010C066C45438FC70221, Duration = 01:01:03, ARFCSTATE = SYSFAIL
    tRFC: Start = 28.10.2006 12:10:34, End = 28.10.2006 13:11:37
    tRFC: Data Package = 7, TID = 0A11010C066C45438FE00222, Duration = 01:01:38, ARFCSTATE = SYSFAIL
    tRFC: Start = 28.10.2006 12:10:59, End = 28.10.2006 13:12:37
    tRFC: Data Package = 8, TID = 0A11010C066C45438FE90223, Duration = 01:01:32, ARFCSTATE = SYSFAIL
    tRFC: Start = 28.10.2006 12:11:07, End = 28.10.2006 13:12:39
    tRFC: Data Package = 10, TID = 0A11010C04BC454390020000, Duration = 01:01:12, ARFCSTATE = SYSFAIL
    tRFC: Start = 28.10.2006 12:11:32, End = 28.10.2006 13:12:44
    Synchronized transmission of info IDoc 5 (0 parallel tasks)                                     
    tRFC: Data Package = 0, TID = , Duration = 00:00:03, ARFCSTATE =                                
    tRFC: Start = 28.10.2006 13:12:45, End = 28.10.2006 13:12:48                                    
    Job finished                                                                               
What does this ARFCSTATE = SYSFAIL mean, and how can I correct it?
    Thanks for your help!!!

    Hi Miguel Sanchez,
I hope you are loading with the PSA option in the InfoPackage. If not, run with the PSA option, go to the Details tab of the monitor, expand the Extraction node, and check for the message 'Data selection ended'.
    If you find this message, you can update from the PSA.
    Otherwise, go for a repeat of the delta.
    Also refer to note 516251.
    The error you are reporting could have several causes, so please check the following:
    1) check the RFC destinations in SM59 for both your BW and your source system
    2) in SM59, for the BW source system, go to menu path "Test"
       -> "Connection" and test this for errors
    3) in SM59, for your BW source system, go to menu path "Test"
       -> "Authorization" and check that the user and password are OK
    4) check the port definitions in WE20 and WE21
    5) finally, check that you have sufficient DIA processes defined in
       your BW and R/3 source systems (you should have at least one more
       DIA process than all other work processes combined; this is described
       in more detail in notes 561880 and 74141).
    Please check whether this solves the problem for you.
    Please check if this solves the problem for you.
    Hope it helps.
    Regards,
    Srikanth.
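A side observation on the job log above: every failed package ran just over one hour, which is the kind of uniform pattern a fixed timeout tends to produce. A small sketch that pulls the SYSFAIL durations out of the log text (abbreviated from the post) makes the pattern explicit:

```python
import re

# Log text as posted above (TIDs copied verbatim, HTML markup stripped)
LOG = """\
tRFC: Data Package = 6, TID = 0A11010C066C45438FC70221, Duration = 01:01:03, ARFCSTATE = SYSFAIL
tRFC: Data Package = 7, TID = 0A11010C066C45438FE00222, Duration = 01:01:38, ARFCSTATE = SYSFAIL
tRFC: Data Package = 8, TID = 0A11010C066C45438FE90223, Duration = 01:01:32, ARFCSTATE = SYSFAIL
tRFC: Data Package = 10, TID = 0A11010C04BC454390020000, Duration = 01:01:12, ARFCSTATE = SYSFAIL
"""

def failed_durations(log):
    """Return the duration in seconds of every SYSFAIL tRFC package."""
    pattern = r"Duration = (\d{2}):(\d{2}):(\d{2}), ARFCSTATE = SYSFAIL"
    return [int(h) * 3600 + int(m) * 60 + int(s)
            for h, m, s in re.findall(pattern, log)]

secs = failed_durations(LOG)
# every failure sits just past the one-hour mark
```

All four failures landing between 1:01:03 and 1:01:38 is consistent with the checks above: a dialog work-process or RFC timeout somewhere around the one-hour mark, rather than four independent data errors.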

  • Error while loading from PSA to Cube. RSM2-704 & RSAR -119

    Dear Gurus,
I have to extract about 1.13 million records from the setup tables to the MM cube 0IC_C03. While doing so, the following two errors occurred while loading from the PSA to the cube:
    Error ID RSM2, error no. 704: Data records in the PSA were marked for data package 39. There were 2 errors; the system wrote information or warnings for the remaining data records.
    Error ID RSAR, no. 119: The update delivered the error code 4.
    (Data records for package 39 selected in PSA - 2 error(s). Record 5129: No SID found for value 'EMN ' of characteristic 0BASE_UOM. Record 5132: No SID found for value 'EMN ' of characteristic 0BASE_UOM)
    I tried to change the defective records in the PSA by deleting the erroneous field value EMN and tried to load again, but it failed.
    Now, my questions are:
    (i) How can I resolve the issue?
    (ii) How do I rectify an erroneous record? Should we only delete the erroneous field value, or delete the entire record from the PSA?
    (iii) How do I delete a record from the PSA?
    Thanks & regards,
    Sheeja.

    Hi,
Data records for package 39 selected in PSA - 2 error(s). Record 5129: No SID found for value 'EMN ' of characteristic 0BASE_UOM. Record 5132: No SID found for value 'EMN ' of characteristic 0BASE_UOM.
    The issue is with record nos. 5129 and 5132.
    In the PSA, display only the erroneous records, edit them as required, and try reloading into the cube.
    Deleting a single record is not possible.
    Let us know if you still have any issues.
    Reg
    Pra
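Worth noting about the error value itself: 'EMN ' in the message carries a trailing space, and a unit value with stray whitespace will not find a SID even when the trimmed value exists in master data. A sketch of the kind of pre-reload check one can run over extracted records (the master-data set and the record layout here are made up for illustration; whether 'EMN' itself is a valid unit in the real system is not known from the post):

```python
# Hypothetical stand-in for the 0BASE_UOM master-data / SID table
KNOWN_UOMS = {"EA", "KG", "EMN"}

def find_suspect_uoms(records, known=KNOWN_UOMS):
    """Flag (record_no, uom) pairs whose unit has stray whitespace or is unknown."""
    suspects = []
    for rec_no, uom in records:
        if uom != uom.strip():
            suspects.append((rec_no, uom, "whitespace"))
        elif uom not in known:
            suspects.append((rec_no, uom, "unknown value"))
    return suspects

# Record numbers taken from the error message; values are illustrative
suspects = find_suspect_uoms([(5128, "EA"), (5129, "EMN "), (5132, "EMN ")])
```

If the check flags whitespace, editing the PSA record to the trimmed value (rather than deleting the field) is usually enough for the SID lookup to succeed on the reload.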

  • Error when loading from Flat File

I get the message below when trying to load from a flat file. The change that I made to the transfer structure was to move the items. How can I resolve this? Thanks
    Error 8 when compiling the upload program: row 227, message: Data type /BIC/CCABSMMIMH1 was found in a newer ve

Hi Niten,
    Check if this helps:
    "A newer version of the data type" error when loading

  • Delta load from ODS to cube failed - Data mismatch

    Hi all
We have a scenario where the data flow is:
    R/3 table -> DataSource -> PSA -> InfoSource -> ODS -> Cube.
    The cube has an additional field called "monthly version", and since it is a history cube, it is supposed to hold snapshots of all the data in the current cube for each month.
    We are facing the problem that the data for the current month is in the history ODS but not in the cube. In the ODS -> Manage -> Requests tab, I can see only 1 red request, and that with 0 records.
    However, in the cube -> Manage -> Reconstruction tab, I can see 2 red requests with the current month's date. Could these red requests be the reason for the data mismatch between the ODS and the cube?
    Please guide me on how to solve this problem.
    thanks all
    annie

    Hi
    Thanks for the reply.
The load to the cube is a delta and runs daily.
    The load to the ODS is a full load on a daily basis.
    Can you help me sort out this issue? I have to work directly in the production environment, so the fix has to be safe and foolproof.
    Thanks
    annie

  • Buffer error when loading from EAS??

    All,
We're on Essbase 7.1.2 and having trouble loading an ASO cube through EAS. We keep getting error 1270040: Data load buffer [2] does not exist.
    I did some research on this, and it should only happen when running through MaxL, not in EAS; from what I read, EAS is supposed to handle this transparently.
    Does anyone have any ideas why this might be happening?
    Additional info: it only seems to happen when using an EAS client on a remote computer, not on the server itself.
    Thanks,

This is one of those error messages that does not mean what you think it means. I have had this error come up when something was wrong in my load rule: a misspelled dimension or member name, a missing dimension, etc. Check your load rule carefully; sometimes the errors are hard to spot.
