Loading to DSO for Lookup

Hi Guys,
I have created a DSO with just MATNR and one of its bespoke attributes (Catalogue Number), extracted from 0MAT_SALES_ATTR (table MVKE).
When I load 0MAT_SALES_ATTR in DEV and QA, the data loads fine. In Regression, however, the load only adds 1 record per data package.
There is no code in the Transformation - just mapping. I will be using this DSO as a lookup reference in ABAP, once the data has loaded.
I have dropped the data all the way back to R/3 and reloaded. The PSA holds only the one request. I am expecting 1.5m records, which I do get in the PSA and the master data table, but the load to the DSO only adds 79. Has anyone experienced this before? If so, what was the issue/resolution?
Thanks in advance,
Scott

Hi Scott,
this surely depends a) on the data you load to BW or b) on the key of your ODS/DSO.
You need at least MATNR in the key of the ODS, but since MVKE is keyed on material and sales organization, you normally get multiple records per material, and they will be aggregated down to one. So in your case it might make sense to add the sales org to the key as well. In short: define your ODS with the same key fields as MVKE.
regards
Siggi
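
Scott mentions he will use the DSO as a lookup reference in ABAP once the data is loaded. A minimal sketch of such a lookup in an end routine, assuming a DSO with technical name ZMATLKP (active table /BIC/AZMATLKP00) and an attribute ZCATNO — all of these names are placeholders, not taken from the thread:

```abap
* Hedged sketch: buffer the lookup DSO's active table once per data
* package, then read it per record. ZMATLKP, /BIC/ZCATNO and the
* target field names are assumed placeholders.
TYPES: BEGIN OF ty_lkp,
         material TYPE /bi0/oimaterial,
         catno    TYPE char18,          " bespoke catalogue number
       END OF ty_lkp.
DATA: lt_lkp TYPE STANDARD TABLE OF ty_lkp,
      ls_lkp TYPE ty_lkp.

IF result_package IS NOT INITIAL.
  SELECT material /bic/zcatno
    FROM /bic/azmatlkp00
    INTO TABLE lt_lkp
    FOR ALL ENTRIES IN result_package
    WHERE material = result_package-material.
  SORT lt_lkp BY material.
ENDIF.

LOOP AT result_package ASSIGNING <result_fields>.
  READ TABLE lt_lkp INTO ls_lkp
       WITH KEY material = <result_fields>-material
       BINARY SEARCH.
  IF sy-subrc = 0.
    <result_fields>-/bic/zcatno = ls_lkp-catno.
  ENDIF.
ENDLOOP.
```

The select-once-then-binary-search pattern avoids one database read per record; the exact field and table names must be adapted to the actual DSO.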

Similar Messages

  • No data in Active table of DSO for fields populated by End Routine

    Hi,
    I have a standard DSO where we populate a few fields using an end routine.
    Last week we added 5 more fields to the DSO and wrote logic in the end routine to populate them. These new fields don't have any mapping; they are populated by the end routine only.
    When I load the data from the DataSource to the DSO, the data arrives correctly in the new-data table of the DSO for all fields, old and new, as per the logic.
    However, when I activate the DSO, the new fields I added last week have no data. The remaining fields get data as per the logic; only these five fields are empty.
    Has anyone had a similar issue? I was under the impression that all the data in the new table goes to the active table when the DSO is activated.
    Your inputs are highly appreciated.
    Thanks
    Krishna

    What version of BW are you using? When editing your end routine, a pop-up should appear asking which fields you want populated/transferred by the end routine. This pop-up does not appear on lower BW 7.x versions. To get around this, make sure that your newly added fields have a transformation rule type of 'Constant'. This ensures the fields are carried over from the new table to the active table.
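
    To illustrate the reply above: a minimal end routine populating an unmapped field might look like this sketch (ZNEWFLD is a placeholder InfoObject, not from the thread). On lower BW 7.x releases the field additionally needs the 'Constant' rule type so the value survives the new-to-active transfer:

    ```abap
    * Hedged sketch of an end routine filling an unmapped target field.
    * /BIC/ZNEWFLD is a placeholder; the literal 'X' stands in for
    * whatever derivation the real routine performs.
    LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
      <RESULT_FIELDS>-/BIC/ZNEWFLD = 'X'.
    ENDLOOP.
    ```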

  • Delta records are not loading from DSO to info cube

    My query is about delta loading from DSO to info cube. (Filter used in selection)
    Delta records are not loading from DSO to Info cube. I have tried all options available in DTP but no luck.
    I selected "Change Log" and "Get One Request Only" and ran the DTP, but 0 records were updated in the InfoCube.
    I selected "Change Log" and "Get All New Data Request by Request", but again 0 records were updated.
    I selected "Change Log" and "Only Get the Delta Once"; in that case all delta records were loaded to the InfoCube as in the DSO, but it gave the error message "Lock Table Overflow".
    When I run full load using same filter, data is loading from DSO to info cube.
    Can anyone please help me on this to get delta records from DSO to info cube?
    Thanks,
    Shamma

    Data is loading in case of full load with the same filter, so I don't think filter is an issue.
    When I follow the sequence below, I get the lock table overflow error:
    1. Full load from the active table, with or without archive.
    2. Then, with the same settings, if I run an init, the final status remains yellow; when I change the status to green manually, it gives the lock table overflow error.
    When I change the DTP settings for the init run:
    1. Selecting "Change Log" and "Get One Request Only" and running the init completes successfully with green status.
    2. But when I run the same DTP for delta records, it does not load any data.
    Please help me to resolve this issue.

  • Data load from DSO to cube

    Hi gurus
    We have a typical problem: we have to combine 2 records into one when they reach the DSO.
    One source is a flat file and the other is R/3, so I get a few fields from the flat file and a few from R/3. They create one record when loaded to the DSO, so I get one record in the active data table. But when I load the data from that DSO to the cube (the data goes from the change log to the cube), I get 2 separate records in the cube (one loaded from the flat file and one loaded from R/3), which I don't want. I want only one record in the cube, just like in the active data table of the DSO.
    I can't read from the active data table because I need delta loads.
    Would you please advise what can I do to get that one record in Cube?
    Pl help

    Ravi
    I am sending the data through a DTP only, but is there any solution to get one record? In another scenario I get data from 2 different ERP sources and end up with one record in the DSO and the cube as well.
    But that is not happening in this second scenario, where I get data from a flat file and ERP and try to create one record.

  • Data load from DSO to cube fails

    Hi Gurus,
    The data loads failed last night, and when I dig into the process chains I find that the cube which should be loaded from the DSO is not getting loaded.
    The DSO has been loaded without errors from ECC.
    The error message says "The unit/currency 'source currency 0CURRENCY' with the value 'space' is assigned to the key figure 'source key figure ZARAMT'".
    I looked in the PSA, which has about 50,000 records,
    and all data packages have a green light and all amounts have 0CURRENCY assigned.
    I went into the DTP and looked at the error stack; it had nothing in it. Then I changed the error handling option from 'No update, no reporting' to 'Valid records update, no reporting (request red)' and executed; the error stack then showed 101 records.
    The ZARAMT field has 0CURRENCY blank for all these records.
    I tried to assign USD to them; the changes were saved, and when I tried to execute again, a message said the request ID should be repaired or deleted before execution. I tried to repair it (it says it cannot be repaired), so I deleted it and executed. It fails, and the error stack still shows 101 records. When I look at the records, the changes I made no longer exist.
    If I delete the request ID before making the changes and then try to save them, they don't get saved.
    What should I do to resolve the issue?
    thanks
    Prasad

    Hi Prasad....
    The error stack is request-specific. Once you delete the request from the target, the data in the error stack also gets deleted.
    What you are supposed to do in this case is:
    1) Change the error handling option to 'Valid records update, no reporting (request red)' (as you have already done) and execute the DTP; all the erroneous records will accumulate in the error stack.
    2) Then correct the erroneous records in the error stack.
    3) Then in the DTP, on the "Update" tab, you will find the option "Error DTP". If it has not been created yet, you will see "Create Error DTP"; click there and execute the error DTP. The error DTP will fetch the records from the error stack and create a new request in the target.
    4) Then manually change the status of the original request to green.
    But did you check why the value of this field is blank? If these records come again as part of a delta or full load, your load will fail again. Check in the source system and fix it there for a permanent solution.
    Regards,
    Debjani......

  • Data load from DSO to Cube in BI7?

    Hi All,
    We just migrated a dataflow from 3.5 to 7 in development and moved it to production. Until now in production, the data loads happened using InfoPackages:
    a. Infopackage1 from datasource to ODS and
    b. Infopackage2 from ODS to the CUBE.
    Now after we transported the migrated dataflow to production, to load the same infoproviders I use
    1. Infopackage to load PSA.
    2. DTP1 to load from PSA to DSO.
    3. DTP2 to load from DSO to CUBE.
    Steps 1 and 2 work fine, but when I run DTP2 it gets terminated. When I then try step b (above), it loads the cube fine using the InfoPackage. So I am unable to understand why the DTP failed while the InfoPackage load is successful. In order to use the DTP, do we need to do any cleanup when using it for the first time? Please let me know if you have any suggestions.
    Please note that the DSO already has data loaded using an InfoPackage. (Is this causing the problem?)
    Thanks,
    Sirish.

    Hi Naveen,
    Thanks for the Reply. The creation of DTP is not possible without a transformation.
    The transformation has been moved to production successfully.

  • Date fields are showing as  /  /   in DSO for blank dates or no dates

    We are loading flat-file data to a DSO, and the date fields show as  /  /   in the DSO for blank or missing dates in the flat-file source system. We don't want to see this /  / ; instead we want the field in the DSO to be blank. Please help on how to achieve this. Is there a way to do this in the transformation? If yes, can you please provide sample code?
    Advance Thanks,
    Christy

    I have added the code and the data load is successful, but the DSO activation is failing. The error message is:
    Value '#' of characteristic 0DATE is not a number with 000008 spaces.
    It seems we need to change the code. Can you provide me the corrected code please? Thanks.
    Christy
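
    A common fix here is a field routine on the date field that passes only well-formed dates and returns an initial value otherwise. A hedged sketch (SOURCE_FIELDS-caldate is a placeholder for the actual source field name):

    ```abap
    * Hedged sketch of a transformation field routine for the date
    * field: keep only valid 8-digit dates, otherwise return an
    * initial date so the DSO shows blank instead of '/  /' and
    * activation does not fail on values like '#'.
    IF strlen( SOURCE_FIELDS-caldate ) = 8
       AND SOURCE_FIELDS-caldate CO '0123456789'
       AND SOURCE_FIELDS-caldate <> '00000000'.
      RESULT = SOURCE_FIELDS-caldate.
    ELSE.
      CLEAR RESULT.   " initial date (00000000) displays as blank
    ENDIF.
    ```

    Stricter validation (e.g. checking month and day ranges) can be added, but even this filter is usually enough to get the activation through.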

  • Amount of records loaded to dso is not same as in psa

    I performed a load from PSA to DSO. I have 2 DataSources under this DSO, and the number of records loaded from the PSA for these 2 DataSources to the DSO is not consistent. The PSA for the 1st DataSource has 3k records and the 2nd DataSource has 5k records, but when I load both DataSources to the DSO, there are fewer records. Does anyone know why this is so?

    hi,
    The DSO has the overwrite option, hence you have fewer records.
    Check whether you have enough key fields in the DSO, so that you can reduce the number of records getting overwritten.
    Ramesh

  • Remote key for lookup tables

    Hi,
    I need some advice on remote keys for lookup tables.
    We have loaded lookup data from several client systems into the MDM repository. Each client system can have differences in the lookup values. What we need to do is enable the key mappings so that the Syndicator knows which value belongs to which system.
    The tricky part: we haven't managed to send out the values based on the remote keys. We do not want to send the lookup tables themselves, but the actual main-table records. All lookup data should be checked at the point of syndication, and only the used lookup values that originally came from a given system should be sent to that particular system. Otherwise the tag should be blank.
    Is this the right approach to handle this requirement, or is there a different way to take care of it? What would be the right settings in the Syndicator?
    Help will be rewarded.
    Thank you very much
    best regards
    Nicolas

    Hi Andreas,
    that is correct. Let's take two examples:
    1) regions
    2) Sales Area data (qualified lookup data)
    Both tables are filled and loaded directly from the R/3 systems, so you would already know which value belongs to which system.
    The problem I have is that we will not map the remote key from the main table, because it will be blank for newly created master data (centralization scenario). Therefore we cannot map the remote key from the attached lookup tables, can we?
    The remote key will only work for lookup tables if the remote key of the actual master data is mapped. Since we don't have the remote key (the local customer ID from R/3) in MDM, and since we do not create it at the point of syndication, what would the SAP standard scenario look like for this?
    This is nothing extraordinary; it's just a standard centralization scenario.
    Please advice.
    Thanks alot
    best regards
    Nicolas

  • Dump error ASSIGN_TYPE_CONFLICT while loading from DSO to InfoCube

    HI ALL,
    I had a requirement to modify the transformation rules, and I have done some routine coding for a particular InfoObject.
    I deleted the data from the DSO and the cube, then started loading to the DSO. That load succeeded, and I now have the modified data for that InfoObject in the DSO.
    But I am facing a dump error while loading data from the DSO to the cube through the DTP.
    The error I see in ST22 is "ASSIGN_TYPE_CONFLICT".
    Plz,
    Thanks in Advance,
    sravan

    HI,
    When I started the load for the first time I got the same error, so I activated the DTP and loaded again,
    but then I faced the same problem.
    The modified data is already in the DSO, and I have validated the data; it is OK.
    Do I need to delete the DTP, or create another DTP, to load the data from the DSO to the cube?
    I have also checked all the transformation rules; they are all fine, and the DSO structure and InfoCube structure are OK.
    Please  suggest,
    Thanks LAX,

  • Adding leading zeros before data loaded into DSO

    Hi
    In the PROD_ID below, some IDs are missing leading zeros when data is loaded into BI from SRM. The data type is character. If leading zeros are missing, the DSO activation fails, and we have to add them manually in the PSA table. I want to add leading zeros, if they are missing, before the data is loaded into the DSO. The total character length is 40, so e.g. if the value is 1502 there should be 36 zeros before it, and if it is 265721 there should be 34 zeros. Only values of length 4 or 6 are coming, so 34 or 36 leading zeros are always needed if they are missing.
    Can we use the function module CONVERSION_EXIT_ALPHA_INPUT? As this is a character field, I'm not sure how to use it in this case. Do I need to convert it to an integer first?
    Can someone please give me sample code? We're using the BW 3.5 data flow to load data into the DSO. Please give sample code and say where to write it, in the rule type or in the start routine.

    Hi,
    Can you check at InfoObject level what conversion routine it uses?
    Use transaction RSD1, enter your InfoObject and display it.
    At the DataSource level you can also see the external/internal format that is maintained.
    If your InfoObject uses the ALPHA conversion routine, it will get leading zeros automatically.
    Check in RSA3 how the data comes from the source.
    If you are getting this issue only for some records, you need to check those records.
    Thanks
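
    Since the question asks for sample code: CONVERSION_EXIT_ALPHA_INPUT does exactly this left-padding for numeric character values, with no integer conversion needed; it pads with zeros up to the length of the output variable. A hedged sketch for a BW 3.5 transfer rule routine (the field names and the TRAN_STRUCTURE source are assumptions to adapt):

    ```abap
    * Hedged sketch: left-pad PROD_ID with zeros to its full
    * length of 40. CONVERSION_EXIT_ALPHA_INPUT pads purely numeric
    * values with leading zeros up to the length of the OUTPUT
    * variable; non-numeric values are left-aligned unchanged.
    DATA lv_prod TYPE c LENGTH 40.

    lv_prod = TRAN_STRUCTURE-prod_id.   " e.g. '1502' or '265721'
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
      EXPORTING
        input  = lv_prod
      IMPORTING
        output = lv_prod.
    RESULT = lv_prod.
    ```

    Alternatively, assigning the ALPHA conversion routine to the InfoObject itself (visible in RSD1, as noted above) achieves the padding without custom code.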

  • Reporting on DSO for Broker's Statement (Line Item Reporting)?

    Dear experts,
    I need to create a BW report based on data from FS-CD and ICM for a client in the insurance industry. The report is broker's statement which has in total 3 sections/pages (or in 3 levels). The requirements are as follows:
    1. Overview of the total commissions (grouped by different categories) earned by a broker for a specific month
    2. When the broker clicks on the "commission group", it will jump to another page showing each commission items of that group
    3. Details for each commission item (line item in the DSO) can be drilled down in another separate page
    Currently, I have a consolidated DSO for the commission line items. The data will be loaded to an InfoCube followed by a MultiProvider for reporting.
    I am new to BI and I assume that I should follow the following approach:
    1. Use RRI for the jumps between different details levels
    2. Use DSO for reporting the line item on third page
    3. Use Report Designer for the layout because the reports are formatted like Balance Sheet
    Is it possible that I report on the InfoCube via MultiProvider for the 1st and 2nd levels whereas DSO for the 3rd level?
    Could anyone please give me some suggestions on this?
    Thanks in advance.
    Regards,
    Joon

    Hi,
    Any updates? I have read that the RRI capability of the query is not supported in BEx reports. Is that true?
    If so, is there any workaround to enable this?
    Regards,
    Joon

  • 2LIS_02_SCL loaded to DSO

    Hi,
    I have just started up a purchasing project.
    According to the SAP standard solution, 2LIS_02_SCL is loaded into DSOs with a key containing: document, item, schedule line.
    However, as far as I can see, a lot of information is lost in that concept.
    For example, take the following field in the extractor:
    BWGEOO - Purchase Value in Order Currency
    The value contained in this field depends on the information stored in the field BWVORG (BW Transaction Key).
    That field determines whether the value is derived from a purchase order, a goods receipt, or an invoice.
    Off the top of my head, it therefore seems profitable to use a DSO with a key that ALSO contains the BW Transaction Key.
    That way BWGEOO holds the order value, GR value or IR value, depending on whether BWVORG is 1, 2 or 3 (3 records).
    In the SAP standard setup, BWGEOO just holds the value for the last event (one record).
    However, are there any pitfalls in that approach?
    Thanks in advance.
    Lars

    Hi Lars,
    As Elangovan Subbiah suggested, a key figure model is better to reduce the number of records:
    3 key figures, for order value, GR value and IR value.
    One more thing: the DataSource 2LIS_02_SCL is ABR delta-capable and should be loaded to a cube or ODS in addition mode only. If you are loading in overwrite mode, you need to be very careful.
    Please check: [Purchasing Data (Schedule Line Level)|http://help.sap.com/saphelp_nw04/helpdata/en/8d/bc383fe58d5900e10000000a114084/frameset.htm]
    Hope it Helps
    Srini
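
    The key figure model suggested above can be sketched as one field routine per value key figure. ZORDVAL is a placeholder InfoObject, and the BWVORG process-key values ('001' for order, '002' for goods receipt, '003' for invoice) are assumptions to verify against your transaction key setup:

    ```abap
    * Hedged sketch: rule for the order-value key figure only.
    * Analogous routines fill the GR-value and IR-value key figures
    * for BWVORG = '002' and '003'.
    IF SOURCE_FIELDS-bwvorg = '001'.   " assumed key: purchase order
      RESULT = SOURCE_FIELDS-bwgeoo.
    ELSE.
      CLEAR RESULT.                    " contribute 0 for GR/IR rows
    ENDIF.
    ```

    With the three values in separate key figures, addition mode stays safe and the document/item/schedule-line key can be kept as in the standard setup.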

  • Why no DSO for SD transaction data

    Hi Friends,
    As you know, we don't have a standard DSO for loading data into 0SD_C03 (Sales: Overview) and 0SD_C05 (Offers/Orders) from the R/3 system. As the update to these cubes is additive, how can we manage transaction data in them without a DSO?
    I mean: when we load transaction data we generally use a DSO, in case the records can change in the future.
    1. Why do we load sales data directly? Is there any functionality that manages changes to the sales records?
    2. Why hasn't SAP delivered a standard DSO for the above cubes?
    3. Do we have to create a DSO for the above cubes?
    Thanks & Rgards,
    Revathi

    Hi Revathi
    It is definitely possible to manage the changes in the transaction data even if there is no DSO for the Cubes that you have mentioned.
    Even though the cubes update the key figures in 'additive' mode, they have been providing correct results for all sales documents (including sales returns / sales credits / order credits) and for all key figures, haven't they?
    However, to answer your queries :
    1. Why do we load sales data directly? Is there any functionality that manages changes to the sales records?
    The functionality that manages changes for sales records sits in the extractors for the above cubes, which send the related data to the cube. If the data sent were incorrect, or if the functionality did not manage the changes correctly, it is obvious that we would all have got wrong results. But that's not the case. You can check this yourself for some test orders and invoices and see how the data gets transferred into these cubes. You can add, decrease or remove quantities or even line items and see the correct results.
    2. Why hasn't SAP delivered a standard DSO for the above cubes?
    It is not necessary to have a DSO for every cube. Most sales reports are analytical, and the need for a DSO depends on the implementation. If we need detailed analysis, then a DSO is a good idea. Does providing a DSO help or reduce our workload in any way? No! Imagine the time that would be spent transferring data into the DSO, activating the DSO, and then transferring data into the cube. There are 6 DataSources in 0SD_C03. Is it worthwhile to route them through a DSO (if there isn't any need) and spend time on overheads that could be avoided?
    3. Do we have to create a DSO for the above cubes?
    That depends purely on your requirement. Yes, you can create a DSO between the DataSource and the cube as a first layer of data storage. But be aware that you need to be very careful: your DSO updates depend heavily on the update mode of each extractor, and it is not guaranteed that you can use 'overwrite' mode for the DSOs. (I am still not sure why you say that 'additive' mode doesn't help in managing the changes.)
    Cheers
    Umesh
    Edited by: UPEDNEKAR on Nov 19, 2010 9:23 AM

  • No match defined for lookup dimension CATEGORY

    Hi,
    I am trying to load data into a cube and I get the below error during import:
    "No match defined for lookup dimension CATEGORY"
    This is the first time I have set up this cube and loaded data. I verified that the CATEGORY dimension has members.
    You help is appreciated.
    Thanks
    Ramesh

    If you are using an input template in Excel, or loading data via the Data Manager and have allowed default logic to run, the issue is most likely tied to the execution of default logic. I would verify that the system constants file in the AdminApp/[APP] directory names the correct dimensions for the cube to which you are sending data. If you are using a "C"-type dimension named Scenario or something else, you may need to verify that the constants file has the category dimension set correctly.
    Otherwise, you have some logic that may be using a business rule that depends on a category member that is not present in the application.
    Hope this helps.
