Data load for target after target enhancement

Dear all,
We are using BI 7.00, and in one of our designs the data is first loaded to an ODS and then to a cube. We now want to capture the value of one more field. This field is already available in the DataSource, and its data is flowing up to the PSA. I have added that field to the ODS. My problem starts here: I want the field to be filled for all the previous requests in the ODS and the cube (those loaded prior to the enhancement) without disturbing the data loads.
Kindly provide step-by-step instructions. I do not want the earlier data to be deleted; the new field's value should also be populated for the previous loads in the ODS and the cube.
Regards,
M.M

Hi,
The overwrite option is available in the update mode of the key figures;
it is the key figures' update mode that determines whether the DSO is in overwrite or addition mode.
Go to the mapping of each InfoObject in the transformation from the DataSource to the DSO and check, for each key figure, whether its mapping type is overwrite or not.
To avoid reloading the cube, one option is to add the field to both the cube and the DSO, and then activate the transformations, mappings, DTPs etc.
After that, load the data to the DSO and then schedule the delta from the DSO to the cube. This will bring all the changes in the DSO into the cube.
However, this delta may be a huge one and may fail, depending on the volume of data in the DSO.
You can try this, and if it does not work you may have to delete the cube contents and reload them.
Thanks
Ajeet

Similar Messages

  • Master Data Load for New Attribute

    Hi Users,
    We had to implement a separate load flow for a new field coming from R/3. This field was to be added to an existing master data object.
    I added a new display attribute to the existing 0GL_ACCOUNT master data object.
    This new attribute, along with some other existing fields, gets its data from another master data object, with an InfoSource in between, because two transformations cannot be created for the same source and target.
    When I load the data, I do not see data being populated for this new field. I ran the attribute change run, checked the keys, etc.
    The source object has data, but after executing the DTP no data arrives in this attribute. There are no routines or anything.
    Please suggest.
    Regards
    Zabi

    Hi,
    The situation is:
    Field X from the source maps to
    1. field Y (the existing field), and also to
    2. field Z, which is the new attribute.
    Field Y has to be updated for company code 10, for example, and field Z for company code 30.
    Now, if I use the same flow and map field X to both Y and Z, overwriting happens: if code 10 has no value for X and code 30 does, the result is not good.
    So if I use a separate flow with an InfoSource, I map only X to Z. Then, if no value went to Y in the first DTP for code 10, and code 30 has a value for X, only Z is updated and Y remains empty.
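    If I wanted to stay on a single flow instead, a conditional field routine on Z could be another option. A minimal sketch, assuming the company code and field X are available as source fields of the transformation; the component names COMP_CODE and FIELD_X are illustrative, and I am quoting the skip-value exception class from memory:

        " Field routine body for attribute Z in the BI 7.0 transformation.
        IF source_fields-comp_code = '30'.
          result = source_fields-field_x.
        ELSE.
          " Skip the value instead of overwriting Z with an initial one.
          RAISE EXCEPTION TYPE cx_rsrout_skip_val.
        ENDIF.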

  • Master data loads for Attributes and texts failing

    Hello
    Master data loads for attributes and texts are failing. The errors are:
    1. Lock not set for loading master data attributes
    2. Table /BI0/YAccount does not exist (this error is for the 0Account master data attribute load)
    3. Error 1 in the update
    We faced this error a few days ago and rebooting the server resolved it, but it has recurred after 4 days.
    RS12 and SM12 do not show any locks. Activating the InfoObject has also not resolved the error.
    Any insight is appreciated.
    Thanks
    Inder


  • Need to skip the data load for one of the sub-chains

    Hi All,
    In our project we have one meta chain with many sub-chains inside it. Today we do not want to run one of the sub-chains. As we are on BW 7.0, the skip option is not available. Can you please help with how to stop the data load for that particular sub-chain? Please suggest.
    Thanks.

    Hi Jalina,
    If this is a frequent request, you can create a custom ABAP program and use the simple logic below to skip or run the sub-chain. You will also have to change the link between the sub-chains to "Always" instead of "Successful", so that the next chain proceeds regardless of whether the dependent chain was successful. If you are comfortable with events, you can also use an event to achieve this.
    Program logic: create a custom table in which you maintain meaningful values (indicator/description). The program reads the data from this table and, based on the values, either succeeds or fails, as in the sketch below.
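    A minimal sketch of such a program; the control table ZBW_CHAIN_CTRL and its fields CHAIN_ID and RUN_FLAG are hypothetical names for illustration:

        REPORT z_check_subchain_flag.
        " Succeeds or fails depending on a flag in a custom control table,
        " so the ABAP process step in front of the sub-chain runs or errors out.
        PARAMETERS p_chain TYPE char25 OBLIGATORY.

        DATA lv_flag TYPE char1.

        SELECT SINGLE run_flag FROM zbw_chain_ctrl
          INTO lv_flag
          WHERE chain_id = p_chain.

        IF lv_flag <> 'X'.
          " A type E message ends the step in error, so the steps linked to
          " it with "Successful" (the sub-chain) do not run, while the
          " "Always" link lets the rest of the meta chain continue.
          MESSAGE e398(00) WITH 'Sub-chain' p_chain 'flagged to skip' ''.
        ENDIF.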
    Thanks
    Abhishek Shanbhogue

  • Master Data Loading for Prices and Conditions in CRM - "/SAPCND/GCM"

    Hi,
    Could anyone give me some input on master data loading for prices and conditions in CRM?
    The transaction code is /SAPCND/GCM.
    I need to load data from a file (extracted from 4.6) for service contracts.
    I tried LSMW: for this transaction, recording does not work.
    I am trying to load through IDocs (LSMW), but that is not really working either.
    Do we require some custom development for this, or is some standard SAP functionality available?
    Can anyone provide some valuable input on this?
    Would appreciate your responses.

    Hi Tiest,
    Thanks for responding.
    You are right, our client is upgrading from 4.6 to ECC.
    So, as per the client's requirements, we are maintaining all the configuration for services in CRM.
    The services data that was in 4.6 is being pulled out into flat files, which need to be loaded into CRM, so the middleware would not be able to do this.
    What I am looking for is some standard upload program.
    LSMW recording does not work.
    With the IDoc "CRMXIF_COND_REC_SLIM_SAVE_M" I am able to load a single record, but I cannot find out how to make this work for multiple entries.
    Normally, when loading master data through IDocs, we map the values to the standard fields that are available in the IDoc.
    But in this particular IDoc there is a common field for which I need to define a field name and a field value.
    So far I am only able to define just one field name and one field value.
    I want this to work for multiple entries.
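    If LSMW cannot repeat that field name / field value pair, one fallback I am considering is a small report that builds the IDoc directly and appends one data record per entry. A rough sketch only: the message type, basic type, and segment name below are assumptions, and the real names would have to come from WE81/WE60 for this interface:

        REPORT z_cond_idoc_multi.
        " Build one IDoc with several repeated data segments and hand it
        " to the ALE layer.
        DATA: ls_control TYPE edidc,
              lt_comm    TYPE STANDARD TABLE OF edidc,
              ls_data    TYPE edidd,
              lt_data    TYPE STANDARD TABLE OF edidd.

        ls_control-mestyp = 'CRMXIF_COND_REC_SLIM_SAVE_M'.   " assumed message type
        ls_control-idoctp = 'CRMXIF_COND_REC_SLIM_SAVE_M01'. " assumed basic type

        DO 3 TIMES.  " one pass per field name / field value pair
          ls_data-segnam = 'Z1COND_ENTRY'.  " placeholder segment name
          " In a real program, fill the segment structure and move it to
          " SDATA; a flat literal stands in here to keep the sketch short.
          ls_data-sdata  = 'FIELDNAME FIELDVALUE'.
          APPEND ls_data TO lt_data.
        ENDDO.

        CALL FUNCTION 'MASTER_IDOC_DISTRIBUTE'
          EXPORTING
            master_idoc_control        = ls_control
          TABLES
            communication_idoc_control = lt_comm
            master_idoc_data           = lt_data.

        COMMIT WORK.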
    Hope you get my point.
    Thanks

  • Master Data Load for Substance Creation: Using BAPI in LSMW?

    Hi,
    Has anyone done a master data load for creating a substance in SAP RM? If so, please let me know the best possible option. I have been trying to figure out an option to load substances and their IDENTIFIER / MATERIAL ASSIGNMENT from my legacy system, but I am not getting a clue on how to use LSMW (direct or batch input).
    I am not seeing any BAPI for uploading it through LSMW except BUS1077 (method SAVEREPMUL), and I guess this is used along with an IDoc.
    Appreciate your help and suggestion.
    David

    Thanks John.
    I will try the option you have suggested. The business team also told me to look at CG33, but they mentioned some format issue; they probably meant the same thing you said, but they were not aware of the workaround, so your information is interesting.
    The other option given to me was to create a recording for each and every tab, like substance header / identifier / material assignment etc., separately. Technically that does not feel right to me. Do you think that is a reasonable alternate approach?
    With Respect & Regards
    David

  • Master data load for 0COSTCENTER (Cost Center) failing

    Hi Experts
    I have a master data load for 0COSTCENTER (Cost Center). The load started failing at the DTP a couple of days ago with this status:
    (R) Filter Out New Records with the Same Key
          (G) Starting Processing...
          (R) Dump: ABAP/4 processor: DBIF_RSQL_SQL_ERROR
    I am unable to understand the reason for this failure. I tried loading one cost center at a time and it still fails, so I doubt it is an internal table storage issue as suggested by the dump.
    Could you help me with this one, please?
    Regards
    Akshay Chonkar

    Thanks all,
    I have got the issue resolved.
    The error stack for the DTP had accumulated so many entries that it was unable to process further,
    so I deleted the entries in the error stack using SE14 and then executed my DTP again.
    Everything is fine now; thanks for your help.
    Regards
    Akshay Chonkar

  • Data load for VK11

    Hi All,
    Can anyone suggest an approach for the data load for VK11? I mean, which method would be easier?
    I would appreciate it if you could send me the developed custom code.
    thanks.

    Hi Kiran,
    You can also use the BAPI BAPI_PRICES_CONDITIONS.
    Please check this link for sample code:
    Re: Sample code for  BAPI_PRICES_CONDITIONS
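    A rough sketch of the call, in case the link goes stale; I am writing the table parameter names from memory, so please verify the exact interface in SE37 before using it:

        " Sketch only: fill the condition tables from your upload file first.
        DATA: lt_condct TYPE STANDARD TABLE OF bapicondct, " key fields
              lt_condhd TYPE STANDARD TABLE OF bapicondhd, " condition headers
              lt_condit TYPE STANDARD TABLE OF bapicondit, " condition items
              lt_return TYPE STANDARD TABLE OF bapiret2.

        " ... append one set of entries per VK11 condition record ...

        CALL FUNCTION 'BAPI_PRICES_CONDITIONS'
          TABLES
            ti_bapicondct = lt_condct
            ti_bapicondhd = lt_condhd
            ti_bapicondit = lt_condit
            to_bapiret2   = lt_return.

        " BAPIs do not commit on their own.
        CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'.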
    Hope this will help.
    Regards,
    Ferry Lianto

  • How to run the EBS adapter data load for 1 year using DAC

    Hi,
    I am trying to run the EBS adapter data load for 1 year for Procurement and Spend. Please let me know the parameters to set in DAC.
    The last extract date is set as Custom Format (@DAC_SOURCE_PRUNED_REFRESH_TIMESTAMP).
    Thanks

    You need to set $$INITIAL_EXTRACT_DATE to a year ago. LAST_EXTRACT_DATE is used for incremental loads; you do not set it manually.
    If this helps, mark the reply as correct or helpful.

  • After recent patch upgrade, BW delta data loads for 2LIS_02_ITM failing

    Hi All
    We have patched our BW 3.5 system from support package 16 to 25. After applying the patches, I am getting delta load issues.
    The 2LIS_02_ITM and 2LIS_02_SCL delta loads are taking a very long time. Deltas are failing even for 3,000 records, sometimes taking 14 hours. The delta loads fail and I get the message below. Please advise.
    I am still getting the dump below, and I could not find any errors in ST22.
    Processing in Warehouse timed out; processing steps missing
    Diagnosis
    Processing the request in the BW system is taking a long time and the
    processing step "Update rules" has still not been executed.
    System response
    Caller is still missing.
    Procedure
    Check in the process overview in the BW system whether processes are
    running under the background user.
    If this is not the case, check the short dump overview in the BW system.
    Thanks
    Pramod

    Hello Pramod,
    Is this load going to the initial-level ODS? If yes, I understand that you have an issue with the load itself, not with the activation of the ODS? Normally this kind of issue comes up during ODS activation.
    1. If you are getting this during the data loads, try loading only to the PSA first and then from the PSA to the target, and check how much time each step takes; this helps to track whether the issue is in the source or in BW.
    2. If the load is fine up to the PSA, try loading from there to the ODS and check whether the issue is still present.
    3. At the same time, check the system load and the number of available processes on the source side.
    4. Did you drain the delta queues before the stack upgrade in R/3? Normally this needs to be done as a prerequisite; if not, you will have issues extracting the data.
    5. Finally, this can be avoided by doing an init without data transfer and then performing the delta loads. For this, you need to first extract all the data from the source to BW, then take a short downtime in R/3 and perform this activity so that the init timestamp is reset properly; the deltas should then work fine.
    Do get back to us with more details.
    Thanks
    Murali M

  • Data loading for newly enhanced fields

    Hi friends,
    We have a cube that has been updated with data daily for the last year. Now we need to add two fields to that DataSource. If we do so, how can we update the data for those newly added fields in our data target for the previous year?
    Thanks in advance
      venkat.p

    Hi,
    Go to the InfoPackage screen and select the two fields you want in the Data Selection tab; there you can give the values for those fields in the From Value and To Value columns. Then go to the Update tab, choose "Initialize delta process", and schedule the data load.
    Hope this helps.

  • Outbound queue won't fetch data to inbound queue after DS enhancement

    I enhanced the 2LIS_02_ITM extractor with a field from the EKKN table and filled this field in CMOD. I transported all of this to the TEST and then to the PROD system. In the DEV system everything is OK, but in TEST and PROD my outbound delta queues are filling up and not passing the data to the inbound queues! When I launch the job in LBWE, it cancels itself, and the short dumps say something about a modified structure. Any idea how I can fetch the data from one queue to the other?

    Hi,
    First, you have to make sure that there are no entries in LBWQ, the delta queue, or the setup tables before you do the transports.
    So the process has to be repeated: once the transports are done, you will have to do the whole setup again.
    Do the following after the transports:
    1) Delete the setup tables.
    2) Do an init without data transfer from the InfoPackage.
    3) Fill the setup tables (you can schedule a background job to fill the whole 7,000,000 records at the same time).
    4) Run full repair loads to BW (either in parts, scheduling different loads at the same time through different InfoPackages with different selections, or one InfoPackage for all the data).
    Doing it in parts is better: if a load of all the data fails, you lose everything and have to repeat the process.
    You can schedule the setup-table fill job and the full repair InfoPackage at the same time for different selections: fill the setup table with selection A, and once that job is finished, schedule the InfoPackage with selection A while you fill the setup table with selection B, and so on.
    This way you make sure the jobs do not fail because of a huge data volume, and at the same time the overall runtime is shorter.
    As such, there is no limit on the amount of data; the limitation is the duration of the process. For a very long-running process, the chances are that you run out of background processes or miss some IDocs.
    Thanks
    Ajeet

  • Data loads for the copied Infoobject?

    Dear Bwers,
    We need to make modifications to the standard content InfoObject 0MATL_GROUP, and we chose to copy it (ZMATL_GRP) and make the changes to the copied InfoObject.
    How do I load this new InfoObject with data? Do I create an export DataSource from 0MATL_GROUP and use it for the new InfoObject ZMATL_GRP, or is there another way to do the daily master data loads to this new InfoObject? Please give your thoughts on this.
    Thanks
    Raj

    You have a few other options.
    If you are on 7.0, you can use a DTP.
    If you are on an older version, then:
    1. Make 0MATL_GROUP a data target and load from it to your Z object.
    OR
    2. Bring the data of 0MATL_GROUP to an ODS, include all your new attributes, and from the ODS load to both 0MATL_GROUP and the Z object. While doing so, you can set "no update" for the attributes that you do not want in 0MATL_GROUP.
    I would prefer the first approach.
    Ravi Thothadri

  • BPC data load for consolidation

    Once we have gone live, how frequently will the data be loaded each month: on a daily basis, or only at month end?
    I understand that if the data is loaded only at the end of the month, all errors will surface only at month end and will delay the consolidation process.

    Hi,
    Consolidation happens monthly and quarterly only.
    Consolidation is normally run on YTD (year-to-date) figures.
    If you take the transaction data in the middle of the month and start the execution, your profit will show wrong figures and the eliminations may go wrong.
    You should plan your consolidation process after the close of the source system's books for the particular month.

  • IMP - Data load for 0BBP_BIGURE_HIER in SRM 5.0 to BI 7.0

    I have been trying to load hierarchy data from the DataSource 0BBP_BIGURE_HIER from SRM to my BI system, but the monitor shows the request remaining yellow for hours. The data comes in fine up to the PSA but does not get loaded into the InfoObject 0BBP_BIGURE. The text data has been loaded successfully for 0BBP_BIGURE, and the InfoObject has no attributes to be loaded.
    Can someone provide some information about how to load 0BBP_BIGURE_HIER from SRM and what the possible errors could be?
    Regards,
    Joy

    Sony,
    You need to implement the following note for SRM 5.0
    Note 931998 - Release Enhancement for note 501836
    Praveen
