Data loading for newly enhanced fields

Hi friends,
We have a cube that has been updated with data daily for the last year. Now we need to add two fields to its data source. If we do so, how can we update the data for those newly added fields in our data target for the previous year?
Thanks in advance,
Venkat.P

Hi,
Go to the InfoPackage screen and, on the Data Selection tab, select the two fields you want; you can enter values for those fields in the From Value and To Value columns. Then go to the Update tab, choose Initialize Delta Process, and schedule the data load.
Hope this helps.

Similar Messages

  • Historical data load for newly added fields

    Hi,
    We are using cube 0UCSA_C01 with delta update and it has gone live. The requirement now is to add 2 new fields to this cube, but the problem is how to load the previous data for these fields, since that data has already been updated. Please let me know how this can be achieved and the procedure to follow.
    Thanks, any help will be appreciated.

    Hi,
    These references explain how to populate historical data into a newly added field in an InfoProvider:
    /people/dinesh.lalchand/blog/2006/02/07/loopback-process--follow-up
    http://sap.ittoolbox.com/documents/popular-q-and-a/adding-characteristics-to-infocubes-2138
    Thanks,
    JituK

  • Reg: Loading historic data for the enhanced field

    Hello All,
    We need to add a new field 0VENDOR to our datasource 0FI_GL_4. This field is available in our BSEG table. Hence, we are planning to go ahead with datasource enhancement.
    Now, please advise on how to update the historical data for this newly added field. I have heard there is BW functionality/a program to do so without deleting the entire data. Kindly advise on the possible solutions.
    Thanks & Regards
    Sneha Santhanakrishnan
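
    For the enhancement itself, the appended field is usually filled in the transaction data user exit, include ZXRSAU01 of EXIT_SAPLRSAP_001. A rough sketch, assuming the field is appended to the extract structure DTFIGL_4 as ZZVENDOR and that the vendor is read from the vendor line (account type 'K') of the same document:

      " Sketch only: ZZVENDOR is an assumed append field on DTFIGL_4,
      " the extract structure of 0FI_GL_4.
      TYPES: BEGIN OF ty_bseg,
               lifnr TYPE lifnr,
               koart TYPE koart,
             END OF ty_bseg.
      DATA: l_s_glin TYPE dtfigl_4,
            lt_bseg  TYPE STANDARD TABLE OF ty_bseg,
            ls_bseg  TYPE ty_bseg.

      CASE i_datasource.
        WHEN '0FI_GL_4'.
          LOOP AT c_t_data INTO l_s_glin.
            " Read the document lines by key fields only (BSEG is a
            " cluster table) and pick the vendor line.
            SELECT lifnr koart FROM bseg
              INTO TABLE lt_bseg
              WHERE bukrs = l_s_glin-bukrs
                AND belnr = l_s_glin-belnr
                AND gjahr = l_s_glin-gjahr.
            LOOP AT lt_bseg INTO ls_bseg WHERE koart = 'K'.
              l_s_glin-zzvendor = ls_bseg-lifnr.
              EXIT.
            ENDLOOP.
            MODIFY c_t_data FROM l_s_glin.
          ENDLOOP.
      ENDCASE.

    In a productive exit you would buffer the lookups (for example with FOR ALL ENTRIES into an internal table) instead of selecting per record; the version above is only meant to show the pattern.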

    Hi Sneha,
    Using the remodeling option you will be able to do that, i.e. load historical data for a new attribute without deleting the existing data. The limitation is that in remodeling you can only assign a constant, the value of another attribute, or values determined in a customer exit.
    When you load data from the source system and need the historical data as well, the best practice is to delete the existing data and reload it from the source system.
    If you don't want to do that, here is one trick, though I am not sure whether it will work: populate the historical values for 0VENDOR using the customer exit option of remodeling. To get the historical values into the customer exit you need that data in a BW table, so consider creating a generic extractor that stores every document and its respective vendor; when you load it from the source system you get the historical values as well.
    Then read that table in the customer exit and populate the vendor value. This is a one-time process to populate the historical values (see the sketch below).
    Regards,
    Durgesh.
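
    If you go the remodeling route, the core of the customer exit is just a lookup against that staging table. A minimal sketch, assuming a custom table ZFI_DOC_VENDOR (filled once via the generic extractor) keyed by company code, document number and fiscal year:

      " Lookup done inside the remodeling customer exit (the generated
      " exit frame is omitted). ZFI_DOC_VENDOR is an assumed custom
      " table holding one vendor per accounting document.
      DATA: lv_bukrs  TYPE bukrs,     " taken from the record being remodeled
            lv_belnr  TYPE belnr_d,
            lv_gjahr  TYPE gjahr,
            lv_vendor TYPE lifnr.

      SELECT SINGLE vendor FROM zfi_doc_vendor
        INTO lv_vendor
        WHERE bukrs = lv_bukrs
          AND belnr = lv_belnr
          AND gjahr = lv_gjahr.
      IF sy-subrc <> 0.
        CLEAR lv_vendor.              " no historical vendor for this document
      ENDIF.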

  • No data coming into enhanced fields of extractor

    I added fields twice to the existing enhanced business content extractor 0FI_GL_4. I wrote the transaction data exit and ran the extraction, and it finished with a green light. But when I checked the data in the ODS, I don't see any data for the enhanced fields, including the earlier enhanced fields which used to be populated before I did this enhancement.
    When I look at the load in the monitor it is all green, but when I go to the scheduler to do another test load it gives me the following message:
    Conv. exit PERI7 for field FISCPER not found in BW; Exit deleted
    Message no. RSM1149
    > Is this the reason for no data in my enhanced fields? If so, how do I get rid of it?
    > Do I need to delete and recreate the DataSource so that all the enhanced fields lie in the same append structure? If so, how do I recreate the DataSource?
    > Please also help me debug my enhancement: I tried RSA3 and it does not show an error, and I tried a breakpoint, but no use.
    Please give me detailed instructions.
    Thanks in advance; points will be assigned for inputs.

    Do I not need to add the exit code even though I used an append structure on the DataSource (the help only talks about an include structure)?
    Do you know how to add an include structure to the DataSource instead of an append?
    I tried putting an infinite loop in the exit and running a data load; the load completed successfully without an error, and when I checked the data the enhanced fields are still empty, so I concluded that the exit is not even being called. Any idea how to fix this? Please give me your mail ID and I will send some screenshots.
    Regarding the conversion exit error, this is all I get; when I click on the error there is nothing but the same description:
    Conv. exit PERI7 for field FISCPER not found in BW; Exit deleted     RSM1     149
    > Is there any way to look it up by error message and number?
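
    One quick way to confirm whether the exit is reached at all is to hard-code a test value for one of the appended fields and check it in RSA3. The transaction data exit is usually implemented in include ZXRSAU01 of EXIT_SAPLRSAP_001 and looks roughly like the skeleton below; the appended field name ZZFIELD1 is only a placeholder for your own field:

      " Minimal skeleton of the transaction data enhancement exit
      " (include ZXRSAU01 of EXIT_SAPLRSAP_001). DTFIGL_4 is the
      " extract structure of 0FI_GL_4; ZZFIELD1 stands for the
      " appended field (assumed to be character-like here).
      DATA: l_s_glin TYPE dtfigl_4.

      CASE i_datasource.
        WHEN '0FI_GL_4'.
          LOOP AT c_t_data INTO l_s_glin.
            " Hard-coding a value is a cheap check (via RSA3) that the
            " exit is really called for this DataSource.
            l_s_glin-zzfield1 = 'TEST'.
            MODIFY c_t_data FROM l_s_glin.
          ENDLOOP.
      ENDCASE.

    If the field is still empty in RSA3 after this, the exit is not being processed for the DataSource (for example because the CMOD project is not active); if it is filled in RSA3 but empty in the ODS, the problem is in the transfer rules or mapping instead.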

  • Master Data Loading for Prices and Conditions in CRM - "/SAPCND/GCM"

    Hi,
    Could anyone give me some inputs on Master Data Loading for Prices and Conditions in CRM.
    T. Code is:  /SAPCND/GCM
    I need to load data on a file (extracted from 4.6) for service contracts.
    I tried LSMW : for this transaction, recording does not work.
    I am trying loading thru Idocs (LSMW). But that too is note really working.
    Do we require some custom development for this , or is some SAP standard funcntionality available ??
    Can anyone provide some valuable inputs one this.
    Would appreciate your responses.

    Hi Tiest,
    Thanks for responding.
    You are right, our client is upgrading from 4.6 to ECC.
    As per the client's requirements, we are maintaining all the configuration for services in CRM.
    The services data that was in 4.6 is being pulled out into flat files, which need to be loaded into CRM, so the middleware would not be able to do this.
    What I am looking for is some standard upload program.
    LSMW recording does not work.
    With the IDoc "CRMXIF_COND_REC_SLIM_SAVE_M" I am able to load a single record, but I cannot find out how to make it work for multiple entries.
    Normally, when loading master data through IDocs, we map the values to the standard fields available in that IDoc.
    But in this particular IDoc there is a generic field for which I need to define a field name and a field value.
    So far I am only able to define just one field name and one field value.
    I want this to work for multiple entries.
    Hope you get my point.
    Thanks

  • Master Data Load for New Attribute

    Hi Users,
    We had to implement a separate load flow for a new field coming from R/3. This field was to be added to an existing master data object.
    I added a new display attribute to the existing 0GL_ACCOUNT master data object.
    This new attribute, along with some other existing fields, gets its data from another master data object, with an InfoSource in between because two transformations cannot be created for the same source and target.
    When I load the data I don't see this new field being populated. I ran the attribute change run, checked the keys, etc.
    The source object has data, but after executing the DTP no data arrives in this attribute. There are no routines or anything.
    Please suggest
    Regards
    Zabi

    Hi,
    The situation is:
    Field X from the source maps to
    1. field Y (the existing field), and also to
    2. field Z, the new attribute.
    Field Y has to be updated for company code 10, for example, and field Z for company code 30.
    Now if I use the same flow and map field X to both Y and Z, overwriting happens: if code 10 has no value for X but code 20 does, that is not good.
    So if I use a separate flow with an InfoSource, I map only X to Z. After the loads this means that if no value went to Y for code 10 in the first DTP, and code 30 has a value for X, then only Z is updated and Y remains empty.
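
    For what it is worth, the restriction of the separate flow to company code 30 can also be enforced in that transformation's start routine, so only the relevant records reach field Z at all. A tiny sketch (the generated method signature is omitted and the field name COMP_CODE is an assumption):

      " Core of a start routine in the separate transformation that
      " feeds only the new attribute Z: drop every record that does
      " not belong to company code 30. COMP_CODE is an assumed field
      " of SOURCE_PACKAGE; adjust the literal to how the code is stored.
      DELETE source_package WHERE comp_code <> '0030'.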

  • PROBLEM TRANSFERRING MULTIPLE DATA ENTRIES FOR ONE KEY-FIELD.

    DEAR EXPERTS,
    I HAVE TRANSFERRED DATA FROM THE FINAL INTERNAL TABLE OF MY ABAP REPORT (NOT ALV) TO A CUSTOM Z-TABLE CREATED IN SE11.
    BUT MY PROBLEM IS: I COULD NOT TRANSFER MULTIPLE DATA ENTRIES FOR A PARTICULAR FIELD.
    FOR EXAMPLE: IN TABLE EKKO THERE ARE FOUR ENTRIES FOR EBELN 4900006375, AND FOR THAT DIFFERENT EBELPs ARE PRESENT. I COULD TRANSFER ONLY THE FIRST ENTRY, THAT IS EBELN 4900006375 WITH EBELP 0010; AFTER THAT THE Z-TABLE IS NOT GETTING UPDATED WITH EBELP 0020 AND SO ON FOR EBELN 4900006375.
    I HAVE TRIED ALL THE 'MODIFY, INSERT, UPDATE' STATEMENTS. I HAVE USED AT USER-COMMAND, HIDE AND CHECKBOXES.
    PLEASE SUGGEST A SAMPLE CODE FOR THIS.
    Moderator message: please post again, but not in all upper case.
    [Rules of engagement|http://wiki.sdn.sap.com/wiki/display/HOME/RulesofEngagement]
    Edited by: Thomas Zloch on Jun 19, 2011 10:05 PM
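
    As an illustration of the sample code requested above: the usual reason only the first item arrives is that the Z-table's key contains only EBELN, so every further EBELP collides with the first row. A hedged sketch, assuming a custom table ZPO_ITEMS whose primary key includes both EBELN and EBELP (note that the items live in EKPO, while EKKO holds only the header):

      " ZPO_ITEMS is an assumed custom table created in SE11 whose
      " primary key contains MANDT, EBELN and EBELP, so several items
      " per purchasing document can coexist.
      DATA: lt_items TYPE STANDARD TABLE OF zpo_items.

      SELECT ebeln ebelp
        FROM ekpo
        INTO CORRESPONDING FIELDS OF TABLE lt_items
        WHERE ebeln = '4900006375'.

      " ACCEPTING DUPLICATE KEYS skips rows whose full key already
      " exists instead of terminating with a runtime error.
      INSERT zpo_items FROM TABLE lt_items ACCEPTING DUPLICATE KEYS.
      COMMIT WORK.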

    There are actually 5 queries in this report now. From what I understand about a union query, I don't think it will work here because the data returned by each of the queries is so different. I basically need to know how to have all the criteria for each individual displayed before proceeding to the next data set, which will include the same data as the first but for the next employee, and so on. I guess I need to create a repeating frame with each individual's respective data, but every time I do, it tells me that it is referencing an invalid group.

  • Master data loads for Attributes and texts failing

    Hello
    Master data loads for attributes and texts are failing. The errors are:
    1. Lock not set for loading master data attributes
    2. Table /BI0/YAccount does not exist (this error is for the 0Account master data attribute load)
    3. Error 1 in the update.
    We faced this error a few days ago and rebooting the server resolved it, but it has recurred after 4 days.
    RS12 and SM12 do not show any locks. Activating the InfoObject has also not resolved the error.
    Any insight is appreciated.
    Thanks
    Inder

  • Need to skip the data load for one of the sub-chains

    Hi All,
    In our project we have one meta chain with many sub-chains inside it. Today we do not want to run one of the sub-chains. As we are on BW 7.0, skip options are not available. Can you please help with how to stop the data load for that particular sub-chain? Please suggest.
    Thanks.

    Hi Jalina,
    If this is a frequent request, you can create a custom ABAP program and use the simple logic below to skip or run the chain. You will also have to change the link between the sub-chains to "Always" instead of "Successful", so that the next chain proceeds whether the dependent chain is successful or not. If you are comfortable with events, you can also use an event to achieve this.
    Program logic: create a new table in which you maintain meaningful values (an indicator and a description). The program reads the data from that table and, based on the values, either succeeds or fails.
    Thanks
    Abhishek Shanbhogue
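
    A minimal sketch of such a program; the control table name ZPC_CONTROL and its fields CHAIN_ID and SKIP_FLAG are assumptions, so adapt them to your own naming:

      REPORT zpc_skip_check.

      " ZPC_CONTROL is an assumed control table (key CHAIN_ID, flag
      " SKIP_FLAG) that is maintained before the meta chain runs.
      DATA: lv_skip TYPE char1.

      SELECT SINGLE skip_flag FROM zpc_control
        INTO lv_skip
        WHERE chain_id = 'SUBCHAIN_01'.   " hypothetical sub-chain ID

      IF lv_skip = 'X'.
        " A type E message cancels the background step, so the ABAP
        " process in the chain ends red; with the link set to "Always"
        " the rest of the meta chain continues anyway.
        MESSAGE e001(00) WITH 'Sub-chain skipped via ZPC_CONTROL'.
      ENDIF.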

  • Master data load for Substance Creation: Using BAPI in LSMW?

    Hi ,
    Has anyone done a master data load for creating a substance in SAP RM? If so, please let me know the best possible option. I have been trying to figure out how to load substances and their IDENTIFIER / MATERIAL ASSIGNMENT from my legacy system, but I have no clue how to do it with LSMW (direct input or batch input).
    I do not see any BAPI for uploading it through LSMW except BUS1077 (method SAVEREPMUL), and I guess this is used along with an IDoc.
    Appreciate your help and suggestion.
    David

    Thanks John.
    I will try the option you have suggested. I was also told by the business team to look at CG33, but they mentioned some format issue; they probably meant the same thing you have said, but they were not aware of a workaround for it. Your information is interesting.
    The other option given to me was to create a recording for each and every tab, like substance header / identifier / material assignment, etc., separately. Technically that does not feel like a good approach to me. Do you think it is a valid alternative?
    With Respect & Regards
    David

  • Master data load for 0COSTCENTER (Cost Center) failing

    Hi Experts
    I have a master data load for 0COSTCENTER (Cost Center). For the last couple of days the load has been failing at the DTP:
    (R) Filter Out New Records with the Same Key
          (G) Starting Processing...
          (R) Dump: ABAP/4 processor: DBIF_RSQL_SQL_ERROR
    I am unable to understand the reason for this failure. I tried loading the data one cost center at a time and it still fails, so I doubt it is an internal table storage issue as the dump suggests.
    Could you please help me with this one?
    Regards
    Akshay Chonkar

    Thanks All
    The issue is resolved now.
    The error stack for the DTP had accumulated so many entries that it could not process any further.
    So I deleted the entries in the error stack table using SE14 and then executed my DTP again.
    Everything is fine now; thanks for your help.
    Regards
    Akshay Chonkar

  • Data load for VK11

    Hi All,
    Can anyone advise me on the data load for VK11? I mean, which method would be easier?
    I would appreciate it if you could send me any custom code you have developed.
    thanks.

    Hi Kiran,
    You can also use BAPI BAPI_PRICES_CONDITIONS.
    Please check this link for sample code:
    Re: Sample code for BAPI_PRICES_CONDITIONS
    Hope this will help.
    Regards,
    Ferry Lianto

  • How to run the EBS adapter data load for 1 year using DAC

    Hi,
    I am trying to run the EBS adapter data load for 1 year for Procurement and Spend. Please let me know which parameter to set in DAC.
    The last extract date is set as Custom Format (@DAC_SOURCE_PRUNED_REFRESH_TIMESTAMP).
    Thanks

    You need to set $$INITIAL_EXTRACT_DATE to a year ago. The LAST EXTRACT DATE is something used for incremental loads and you do not manually set that.
    If this helps, please mark it as correct or helpful.

  • Data load for target after target enhancement

    Dear all,
    We are using BI 7.00, and in one of our designs the data is first loaded to the ODS and then to the cube. Now we want to get the value of one more field. This field is already available in the data source and the data is flowing up to the PSA. I have added that field to the ODS. My problem starts here: I want the data to flow for all the previous requests, i.e. the requests loaded into the ODS and the cube prior to the enhancement, without disturbing the data load.
    Kindly provide step-by-step instructions. I do not want the earlier data to be deleted; the new field's value should be updated in the previous loads in the ODS and the cube.
    Regards,
    M.M

    Hi,
    The overwrite option is available in the update mode of key figures.
    It is the key figure that determines whether the DSO is in overwrite or addition mode.
    Go to the mapping of each InfoObject in the transformation from the data source to the DSO, check the mapping type of each key figure, and see whether it is overwrite or not.
    To avoid reloading the cube, one option is to add the fields in both the cube and the DSO and then activate the transformations, mappings, DTPs, etc.
    After that, load the data to the DSO and then schedule the delta from the DSO to the cube. This will bring all the changes in the DSO into the cube.
    But this delta may be a huge one and may fail, depending on the volume of data in the DSO.
    You can try this, and if it does not work you may have to delete the cube contents and reload them.
    Thanks
    Ajeet

  • How to get the historical data for a newly added field in the cube?

    Hi Experts,
    I have a small doubt about remodeling the InfoCube.
    After adding a characteristic or key figure to a cube using the remodeling concept, how can I get the historical data for that particular field?
    I have also searched SDN but did not find proper information.
    Please excuse me if this is a repeated question.
    A helpful answer will be awarded points.
    Thanks & regards,
    Venkat.

    Hi,
    Depending on your customer's needs you could use the remodeling functionality, but sometimes there is no way to retrieve what you want that way, so another option you should consider is the following.
    Advantages:
    less effort and a guaranteed result.
    Drawbacks:
    data is redundant for a while;
    extra space is needed (depending on the volume of historical data).
    So here are the steps:
    Step 1: Adjust your extraction process according to the fields you need to add to populate the cube.
    Step 2: Create a DSO (or even a cube) next to it and feed it with a full load using the enhanced extractor you adjusted with the new fields in step 1. This needs to be done only once; it is a one-shot load.
    Step 3: Copy the query to the previously built MultiProvider on top of the new historical data from the DSO and the running live delta cube. Adjust the queries if necessary.
    Optional: if you then want to get rid of the DSO or the new cube holding the historical data, you can empty the original one, push the data across from the new data provider, and that is all.
    Bye,
    Boujema
