Loading from cube to same cube failing

Hi All,
we are facing a situation wherein we have to load some data from a custom cube into the same cube, with some selections.
But we are facing a problem while loading: the load fails with the error "No active update rules exist", although I have already activated the update rules. The same load works fine in the production system, but in the dev system it throws this error.
Also, in SM58 the tRFCs are not getting processed.
In SM37, the data packets show ARFCSTATE = SYSFAIL (see the sketch after the list below).
I have done the following checks:
1) The extractor works fine with the selections.
2) Loaded into the PSA and then tried to load the data packets one by one, but that failed.
3) Activated the update rules again and tried the load; it failed again in the same way as described above.
4) In SM58, executed F6; the data packets were processed, but when I ran the load again it failed and the data packets were again in "Transaction executing" state.
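For reference, a quick way to count the stuck LUWs beyond scrolling SM58 is to read the standard tRFC state table ARFCSSTATE directly; a minimal report sketch, where filtering on 'SYSFAIL' is an assumption based on the ARFCSTATE value mentioned above:

```abap
REPORT ztrfc_sysfail_check.
" Count outbound tRFC LUWs whose state is SYSFAIL, as shown in SM58.
" ARFCSSTATE is the standard tRFC state table; the literal 'SYSFAIL'
" mirrors the ARFCSTATE value seen in the SM37 job log.
DATA lv_count TYPE i.

SELECT COUNT(*)
  FROM arfcsstate
  INTO lv_count
  WHERE arfcstate = 'SYSFAIL'.

WRITE: / 'tRFC LUWs in SYSFAIL state:', lv_count.
```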
Kindly let me know if there is a way out.
Regards,
Dola

Hi Gareth,
Many thanks for the response.
Firstly, yes, the same update rules etc. are working fine in production, except for one thing: in production we compress the cube every 10 days, whereas in dev we are loading it freshly, so there is no compression and no E table.
Secondly, the status of the update rules is "active, executable" and the colour is green.
Thirdly, we are using BW 3.5, so we are loading through an InfoPackage.
Fourthly, it is actually a full load, so when I schedule the load it simply hangs (goes on and on). In SM58 I can see the tRFCs related to the load in "executable" state for more than 12 hours; after that they go to "time limit exceeded" state. When I then click on an individual tRFC and press F6, it disappears without any short dump.
And in the details tab in RSMO (data load), under "Processing data packets" -> "Update rules", the message "Start of processing update rules for data target" is green, but after that, for the update rules, we get "Missing messages: update rules finished for data target".
I have checked the update rules; they are the same as the production ones.
Any idea why it is behaving like this?
Regards,
Dola

Similar Messages

  • Can data be loaded from Cube to DSO

    Hello,
    Yes, it can be loaded. All you need to do is follow the same source-and-target procedure as in the normal flow; there is no difference with a cube as the data source. You can even schedule a delta from the cube to the DSO.
    Thanks
    Ajeet

  • DTP load to cube failed with no errors

    Hi,
    I executed a DTP load from cube to cube with a huge data volume, approx. 25 million records. It is a full load. The DTP load failed without any errors in any of the packages. The load ran for 1 day and 33 minutes, and the request just turned red without any error messages. I have checked ST22 and the SM37 job logs. No clue.
    I restarted this load, and again it failed at approximately the same time, without any error messages. Please throw some light on this.
    Thanks

    Hi,
    Thanks for the response. There are no abnormal messages in the SM37 job logs. This process runs fine in production; we are making a small enhancement.
    08/22/2010 15:36:01  Overlapping check with archived data areas for InfoProvider Cube  RSDA  230  S
    08/22/2010 15:36:42  Processing of data package 000569 started                          RSBK  244  S
    08/22/2010 15:36:47  All records forwarded                                              RSM2  730  S
    08/22/2010 15:48:58  Overlapping check with archived data areas for InfoProvider Cube  RSDA  230  S
    08/22/2010 15:49:48  Processing of data package 000573 started                          RSBK  244  S
    08/22/2010 15:49:53  All records forwarded                                              RSM2  730  S
    08/22/2010 16:01:25  Overlapping check with archived data areas for InfoProvider Cube  RSDA  230  S
    08/22/2010 16:02:13  Processing of data package 000577 started                          RSBK  244  S
    08/22/2010 16:02:18  All records forwarded                                              RSM2  730  S
    08/22/2010 16:10:46  Overlapping check with archived data areas for InfoProvider Cube  RSDA  230  S
    08/22/2010 16:11:12  Job finished                                                       00    517  S
    As shown here, the process ended with package 577, and package 578 never started. I could work around this by executing multiple DTPs, but this same process runs fine in production. There are no error messages or short dumps to start troubleshooting from.
    Thanks,
    Mathi

  • Load from cube to planning area

    Hi,
    We are facing a problem in loading data from a cube to a planning area: we need to distinguish between zero and blank values in the planning area when loading the data from the cube.
    The scenario is like this:
    For a CVC 'A', the key figure value is blank on day D, zero on day D+1, and 2 on day D+2, and I want the same to appear in my planning area. If I load this data into the cube, my day D value will appear as zero (but it was blank), D+1 as zero, and D+2 as 2 in the cube.
    I am using the standard transaction /SAPAPO/TSCUBE to load data from the cube to the planning area. The standard transaction has an "ignore zero" option, but if I use it, it also stops the actual zero value on day D+1 (which I need) from coming into the planning area, which I don't want.
    I want my planning area data to be blank on day D, zero on D+1 and 2 on D+2 after loading the data from the cube to the planning area.
    Can anyone shed some light and help me out of this issue?
    With regards,
    Sreerama

    Hi Sreerama,
    I am not sure I completely understand the issue: is the problem in the cube or in the planning area?
    In order to differentiate a blank from a zero in the planning area, you need to set the "zero allowed" flag in the planning area settings (on the key figure tab, click Details; there, for each key figure, you can select the "zero allowed" flag or not).
    When you load the data, you should indeed not set the "ignore zero value" flag.
    If the issue is in your cube (between the file and the cube), then it is another matter...
    Kind Regards,
    Julien

  • DTP - Load from cube to cube - Setting "Use aggregates"

    Hi all,
    what exactly does the setting "Use aggregates" on the extraction tab of a DTP that loads data from cube to cube mean? The F1 help didn't really get me any further...
    Thanks to any answers in advance!
    Kind regards,
    Philipp

    Hi,
    You can update data from one data target to another through data marts.
    Right-click the ODS or cube from which you are updating data to the other target and select "Generate Export DataSource". This creates a DataSource named 8 followed by your data target's name. After that, assign this DataSource to your InfoSource, create update rules between the targets, and choose "Update data to other data target" on the source.
    Select the type of update, and then load it from your InfoPackage.
    regards-
    MM

  • Load from cube on one system to same cube on other system

    If I have the same cube on two different BW systems, how can I load data from one cube to the other?
    thanks

    The first thing is to create an export DataSource. Right-click on the cube and choose "Generate Export DataSource" in the additional functions of the context menu. It will generate a DataSource prefixed with 8, like 8ZTESTCUBE.
    Then go to the other system, where you want to load the data. In RSA1 --> Source Systems --> choose your source system --> Replicate. You will find the DataSource you generated there.
    Now create a transformation, do the mapping, activate it, and create a DTP. Load the data.
    Pravender

  • Load OLAPTRAIN cube failed

    Hi, I followed every step at this link: http://www.oracle.com/technology/obe/olap_cube/buildicubes.htm
    First, when I tried to maintain the TIME dimension and load data, I got the ORA-01843 error 'not a valid month'.
    I'm Italian, and I found that the error was in the mapping window, in the ALL_YEARS END_DATE field: instead of TO_DATE('2008-DEC-31', 'YYYY-MON-DD') I had to type DIC (the Italian abbreviation) for the month. After that I could maintain the TIME dimension and view all the data through AWM.
    But when I try to maintain the SALES_CUBE, I keep getting this error:
    NI: XOQ-01600: Error DML OLAP "ORA-01843: not a valid month
    " during DML Execution "SYS.AWXML!R11_LOAD_MEASURES('SALES_CUBE.CUBE' SYS.AWXML!___R11_LONG_ARG_VALUE(SYS.AWXML!___R11_LONG_ARG_DIM 1) SYS.AWXML!___R11_LONG_ARG_VALUE(SYS.AWXML!___R11_LONG_ARG_DIM 2) SYS.AWXML!___R11_LONG_ARG_VALUE(SYS.AWXML!___R11_LONG_ARG_DIM 3) 'NO')", Generico in TxsOqStdFormCommand::execute
    at oracle.olapi.data.source.DataProvider.callGeneric(Unknown Source)
    at oracle.olapi.data.source.DataProvider.callGeneric(Unknown Source)
    at oracle.olapi.data.source.DataProvider.executeBuild(Unknown Source)
    at oracle.olap.awm.wizard.awbuild.UBuildWizardHelper$1.construct(Unknown Source)
    at oracle.olap.awm.ui.SwingWorker$2.run(Unknown Source)
    at java.lang.Thread.run(Unknown Source)
    How can I solve this error? Can anyone help me?

    I have NLS_LANGUAGE=ITALIAN, and my Windows XP is in Italian. I don't understand, because, as I said, the TIME dimension works with the DEC -> DIC correction in the dates, but the SALES_CUBE does not... Do I have to check the data in the fact table?
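    For what it's worth, ORA-01843 here stems from month abbreviations being parsed with the session's NLS_DATE_LANGUAGE, so instead of renaming DEC to DIC in every mapping, the month language can be pinned explicitly via the standard third (nlsparam) argument of TO_DATE; a minimal sketch:

    ```sql
    -- Parse the English month abbreviation regardless of the session's
    -- NLS_DATE_LANGUAGE (Italian here) by passing it explicitly.
    SELECT TO_DATE('2008-DEC-31', 'YYYY-MON-DD',
                   'NLS_DATE_LANGUAGE=AMERICAN') AS end_date
      FROM dual;
    ```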

  • HT1338 Number, Keynote and Pages app updates fail. Says try from Purchases. Same result - fails to download, Why??

    I got an update notice for Pages, Numbers and Keynote on my MacBook. When I try to install the updates, a "failed" message appears.
    I have tried several times.
    Why?

    I have just purchased Keynote 6.2 from the App Store and the download keeps failing. I have an iMac running Mavericks 10.9.2. Any advice on what could be done to get my purchase would be great. Thanks very much.

  • Delta from cube

    Hi,
    Is it possible to have a delta load from a cube?
    Erick.

    Hi,
    Yes, it's possible. The delta will be based on the request.
    For the navigational steps in loading data from cube to cube, refer to the thread "what are the navigational steps in loading data from cube to cube?". In short:
    3.x: see the thread "how to copy the one cube to another cube".
    7.x: create a transformation between the cubes, then create a DTP and run it.
    See also these threads:
      • Loading data from one cube to another cube
      • Cube to cube data loading
      • how to upload data from cube to cube
      • Can we pull data from one cube to another cube
      • Data Load Steps: Reading data from another cube
    JituK

  • How is data loaded from Infocube to Planning area - need technical info

    I would like to find out how data is loaded from a cube to a planning area.
    I know that a cube has tables, but how does the data get loaded into the planning area? Where is the mapping with which I can compare the data in the cube against the same data that reaches the planning area?
    Say, for example, I have the following values in the InfoCube:
    Prod1 --> Loc1 --> available provisioning qty
    AAA        AB90     100
    Then where do I check in the planning area tables (are there any mapped tables) after running the TSINPUT program? I know the data can be checked in the planning book, but I want to check in the planning area tables, if they exist.

    Hi,
    The data is loaded from the InfoCube to the planning area using update rules. The issue you have mentioned seems to be that two requests contain data for the same CVCs in the cube.
    Example: for the same CVC, two requests are available in the InfoCube. When you copy the data using transaction TSCUBE, whatever data is available in the cube for that CVC gets copied into the planning area.
    CVC1 - cube - old request - 100
    CVC1 - cube - actual request - 200
    CVC1 - planning area = 300 (the value is supposed to be 200, but since the old request also contains data in the cube for the CVC, 300 gets copied)
    Issue: two requests might contain data for the same CVC.
    Solution: 1. Check the data in the cube using transaction LISTCUBE. 2. Delete the old request and check the data again. 3. If it matches your requirement, run TSCUBE again.
    Please let me know if you need additional information.
    Thanks,
    Jeysraj

  • Data load from DSO to cube fails

    Hi Gurus,
    The data loads failed last night, and when I dig into the process chains I find that the cube which should be loaded from the DSO is not getting loaded.
    The DSO has been loaded without errors from ECC.
    The error message says: "The unit/currency 'source currency 0CURRENCY' with the value 'space' is assigned to the key figure 'source key figure ZARAMT'".
    I looked in the PSA; it has about 50,000 records,
    and all data packages have a green light and all amounts have 0CURRENCY assigned.
    I went into the DTP and looked at the error stack; it had nothing in it. Then I changed the error handling option from 'no update, no reporting' to 'valid records update, no reporting (request red)' and executed; the error stack then showed 101 records.
    The ZARAMT field has a blank 0CURRENCY for all of these records.
    I tried to assign USD to them; the changes were saved, and I tried to execute again, but then a message says that the request ID should be repaired or deleted before the execution. I tried to repair it, but it says it cannot be repaired, so I deleted it and executed. The load fails, and the error stack still shows 101 records; when I look at the records, the changes I made no longer exist.
    If I delete the request ID before making the changes and then try to save them, they don't get saved.
    What should I do to resolve this issue?
    thanks
    Prasad

    Hi Prasad,
    The error stack is request-specific. Once you delete the request from the target, the data in the error stack also gets deleted.
    Actually, in this case what you are supposed to do is:
    1) Change the error handling option to 'valid records update, no reporting (request red)' (as you have already done) and execute the DTP; all the erroneous records will accumulate in the error stack.
    2) Then correct the erroneous records in the error stack.
    3) Then in the DTP, on the "Update" tab, you will find the option "Error DTP". If it has not been created yet, you will see "Create Error DTP"; click there and execute the error DTP. The error DTP fetches the records from the error stack and creates a new request in the target.
    4) Then manually change the status of the original request to green.
    But did you check why the value of this field is blank? If these records come again as part of a delta or full load, your load will fail again. Check in the source system and fix it there for a permanent solution; if you only need a stopgap in BW, see the sketch below.
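    A minimal sketch of such a stopgap, written as an end routine in the transformation; the field name CURRENCY and the default 'USD' are assumptions to verify against the actual target structure:

    ```abap
    " End routine body (inside the generated BW transformation class):
    " default the missing source currency so that the key figure ZARAMT
    " passes the unit/currency check. 'USD' is an assumed default.
    FIELD-SYMBOLS <fs_result> TYPE _ty_s_tg_1.
    LOOP AT result_package ASSIGNING <fs_result>
         WHERE currency IS INITIAL.
      <fs_result>-currency = 'USD'.
    ENDLOOP.
    ```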
    Regards,
    Debjani......

  • Delta load from ODS to cube failed - Data mismatch

    Hi all
    We have a scenario where the data flow is:
    R/3 table -> DataSource -> PSA -> InfoSource -> ODS -> Cube.
    The cube has an additional field called "monthly version", and since it is a history cube, it is supposed to hold monthly snapshots of all the data in the current cube.
    We are facing the problem that the data for the current month is there in the history ODS but not in the cube. In the ODS -> Manage -> Requests tab I can see only one red request, and that with 0 records.
    However, in the Cube -> Manage -> Reconstruction tab, I can see two red requests with the current month's date. Could these red requests be the reason for the data mismatch between the ODS and the cube?
    Please guide me as to how I can solve this problem.
    thanks all
    annie

    Hi
    Thanks for the reply.
    The load to the cube is a delta and runs daily.
    The load to the ODS is a full load on a daily basis.
    Can you help me sort out this issue? I have to work directly in the production environment, so the fix has to be safe and foolproof.
    Thanks
    annie

  • Data load from DSO to Cube in BI7?

    Hi All,
    We just migrated a dataflow from 3.5 to 7 in development and moved it to production. Until now, the data loads in production happened using InfoPackages:
    a. InfoPackage1 from the DataSource to the ODS, and
    b. InfoPackage2 from the ODS to the cube.
    Now, after transporting the migrated dataflow to production, I load the same InfoProviders using:
    1. An InfoPackage to load the PSA.
    2. DTP1 to load from the PSA to the DSO.
    3. DTP2 to load from the DSO to the cube.
    Steps 1 and 2 work fine, but when I run DTP2 it gets terminated. Yet when I try step b (above), the cube loads fine using the InfoPackage. So I am unable to understand why the DTP fails while the InfoPackage load is successful. In order to use the DTP, do we need to do any cleanup when using it for the first time? Please let me know if you have any suggestions.
    Please note that the DSO already has data loaded using an InfoPackage. (Is this causing the problem?)
    Thanks,
    Sirish.

    Hi Naveen,
    Thanks for the reply. The creation of a DTP is not possible without a transformation.
    The transformation has been moved to production successfully.

  • Full load from a DSO to a cube processes less records than available in DSO

    We have a scenario where every Sunday I have to make a full load from a DSO with on-hand stock information to a cube, in which I register a counter at material and store level if stock is available.
    The DTP has no filters at all and has a semantic group on 0MATERIAL and 0PLANT.
    The key in the DSO is:
    0MATERIAL
    0PLANT
    0STOCKTYPE
    0STOR_LOC
    0BOM
    of which only 0MATERIAL, 0PLANT and 0STOR_LOC are later used in the transformation.
    As we had a growing number of records, we decided to delete, in the START routine, all records where the inventory is not greater than zero, thus eliminating zero and negative inventory records.
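    A minimal sketch of such a START routine filter (BW 7.x transformation); ZSTOCK_QTY is a placeholder for the actual on-hand key figure field:

    ```abap
    " Start routine body: drop every row whose inventory is not greater
    " than zero, so zero and negative stock records are never processed.
    " ZSTOCK_QTY stands in for the real on-hand stock key figure.
    DELETE source_package WHERE zstock_qty LE 0.
    ```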
    Now comes the funny part of the story:
    Prior to these changes I would (in a test system just copied from PROD) read some 33 million records and write out the same number. Of course, after the change we expected to write out fewer. To my total surprise, I was now reading 45 million records with the same unchanged DTP, while writing out the expected smaller number.
    When checking the number of records in the DSO I found the 45 million, but I cannot explain why the earlier loads retrieved only some 33 million from the same unchanged set of records.
    When checking in PROD, same result: we have some 45 million records in the DSO, but when we do the full load from the DSO to the cube, the DTP processes only some 33 million.
    What am I missing? Is there some compression going on? Why would the number of records in a DSO differ from the number of records processed in the data packages when I am making a FULL load without any filter restrictions and with only a semantic grouping in place on part of the DSO key?
    ANY idea, thought is appreciated.

    Thanks Gaurav.
    I did check whether any further loads were done in between; there were none in the test system. As I mentioned, it was a new copy from PROD to TEST, and I compared the number of entries in the DSO: it seems to match between TEST and PROD (OK, a few more in PROD, but they can be accounted for). In TEST I loaded the day before the changes were imported, to have a comparison, and between that load and the one after the changes were imported, nothing in the DSO changed.
    Both DTPs, in TEST and PW2, load from the active DSO table (without archive). The DTPs have not been changed in quite a while, so I ruled that out. Same with the activation of data in the DSO: this DSO gets loaded and activated daily in PROD via a process chain, and we load daily deltas into the cube in question. Only on Sundays, for the beginning of the new week/fiscal period, do we need to make a full load to capture all materials per site with inventory. The deltas loaded during the week are less than 1 million records, but the difference between the number of records in the DSO and the number processed in the data packages is more than 10 million per full load, even in PROD.
    I really appreciate the knowledgeable answer; I just wish you had pointed out something I had missed.

  • Delta records are not loading from DSO to info cube

    My query is about delta loading from a DSO to an InfoCube (a filter is used in the selection).
    Delta records are not loading from the DSO to the InfoCube. I have tried all the options available in the DTP, but no luck:
    Selected "Change log" and "Get one request only" and ran the DTP, but 0 records were updated in the InfoCube.
    Selected "Change log" and "Get all new data request by request", but again 0 records were updated.
    Selected "Change log" and "Only get the delta once"; in that case all the delta records were loaded into the InfoCube as they were in the DSO, but the load gave the error message "Lock Table Overflow".
    When I run a full load using the same filter, the data loads from the DSO into the InfoCube.
    Can anyone please help me get the delta records from the DSO into the InfoCube?
    Thanks,
    Shamma

    The data loads in the case of a full load with the same filter, so I don't think the filter is the issue.
    When I follow the sequence below, I get the lock table overflow error:
    1. Full load from the active table, with or without archive.
    2. Then, with the same settings, if I run an init, the final status remains yellow, and when I change the status to green manually, it gives the lock table overflow error.
    When I change the settings of the DTP to an init run:
    1. Selecting change log and "get only one request" and running the init completes successfully with green status.
    2. But when I run the same DTP for delta records, it does not load any data.
    Please help me to resolve this issue.
