Filter on attribute of 0MATERIAL filters charac. values without master data

Hi SAP-Gurus,
I have a problem while executing a query (the query is based on a MultiProvider).
I get the message "Filter on attribute of 0MATERIAL filters charac. values without master data" and I don't know why.
When I don't use the filter on the navigation attribute, the query executes without any errors, but as soon as I set the filter, the warning message appears again.
Thank you all,
Clemens

Dear all SAP Gurus,
we've solved this problem and this is the solution:
The warning is triggered by the method PROCESS_SFC_WITH_ATR of class CL_RSDRV_ODS_QUERY when the BEx flag of the InfoProvider is not set. For DSOs for direct update there is no way to change the BEx flag. As a result, queries based on a DSO for direct update always raise the above-mentioned warning when the filter characteristics are navigation attributes.
Regards, Clemens
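
For anyone who wants to verify that an affected InfoProvider really is a DSO for direct update, a minimal ABAP report along the following lines can help. This is only an illustrative sketch and not part of the solution above: the metadata table RSDODSO and the field names ODSOBJECT, OBJVERS and ODSOTYPE (with value 'T' assumed to mean direct update), as well as the DSO name ZMYDSO, are assumptions based on the usual BW naming and should be verified in SE11.

REPORT zcheck_dso_type.

" Sketch: check whether a given DSO is defined for direct update.
" Table and field names are assumptions - verify them in SE11.
PARAMETERS p_odso(30) TYPE c DEFAULT 'ZMYDSO'.

DATA lv_type(1) TYPE c.

SELECT SINGLE odsotype FROM rsdodso
  INTO lv_type
  WHERE odsobject = p_odso
    AND objvers   = 'A'.

IF sy-subrc = 0 AND lv_type = 'T'.  "value 'T' assumed to mean direct update
  WRITE: / p_odso, ': DSO for direct update - warning DBMAN 345 is expected',
         / 'whenever the query filters on navigation attributes.'.
ELSE.
  WRITE: / p_odso, ': not found or not a DSO for direct update.'.
ENDIF.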

Similar Messages

  • Error Message in Query Filter on attribute of 0MATERIAL filters charac.

    Hi All,
    I have a query with a few variables, and I used navigational attributes of 0MATERIAL.
    I am getting the warning message below and need to suppress it.
    Please let me know the solution.
    Error Message: Filter on attribute of 0MATERIAL filters charac. values without master data
    Diagnosis
    Currently, it cannot be guaranteed that SIDs and master data exist for all characteristic attributes for the DataStore object to be read.
    There is a restriction on a navigation attribute of the listed characteristic in the query. This filters all characteristic values of the master-data bearing characteristic for which there is not yet master data out of the result.
    For performance reasons, this filtering is unavoidable.
    System Response
    Procedure
    In case of doubt, set other restrictions directly on the characteristic values of the characteristics contained in the DataStore object.
    Procedure for System Administration
      Notification Number DBMAN 345 
    Thanks All

    Dear all SAP Gurus,
    we've solved this problem and this is the solution:
    The warning is triggered by the method PROCESS_SFC_WITH_ATR of class CL_RSDRV_ODS_QUERY when the BEx flag of the InfoProvider is not set. For DSOs for direct update there is no way to change the BEx flag. As a result, queries based on a DSO for direct update always raise the above-mentioned warning when the filter characteristics are navigation attributes.
    Regards, Clemens

  • Issues In Reading Attribute Values From Master Data

    Hi All,
    I have a requirement where I need to read an attribute value from master data. I have a characteristic YCSTATMCG (AT Cost Group Code), which is the master data from which I have to read the attribute 0PROFIT_CTR (Profit Center). The attribute thus read has to be populated into another characteristic, YPROFIT_C. But YCSTATMCG refers to another characteristic, YCSTCG. Here is the FOX code I wrote, with YPROFIT_C as the characteristic to be changed and 0AMOUNT as the key figure.
    DATA V_ATCP TYPE YCSTATMCG.
    DATA V_PROFIT TYPE YPROFIT_C.
    DATA V_PROFITC TYPE YPROFIT_C.
    DATA V_AMOUNT TYPE F.
    * current value of the cost group code being processed
    V_ATCP = OBJV().
    MESSAGE I020(YCO_CFT) WITH V_ATCP.
    * buffer the amount posted on the unassigned profit center
    V_AMOUNT = {0AMOUNT,  # }.
    * read attribute 0PROFIT_CTR from the master data of YCSTATMCG
    V_PROFIT = ATRV('0PROFIT_CTR' , V_ATCP).
    MESSAGE I020(YCO_CFT) WITH V_PROFIT.
    * write the amount to the profit center that was read
    {0AMOUNT, V_PROFIT} = V_AMOUNT.
    But this is not working. The ATRV() function is not reading the attribute values at all. Any solutions and suggestions are highly valued.
    Thanks in advance
    Swaroop

    Hi,
    Even I have the same situation.
    I just want the attribute value of a characteristic to be populated into another characteristic in the planning query.
    My question is whether I should populate the key figure field as well in the FOX code.
    If so, should I populate both the 0AMOUNT and 0QUANTITY fields, since I have two key figure fields?
    Thanks for your help
    Nishanth
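    In case it helps, here is a minimal FOX sketch of how both key figures could be moved to the derived characteristic value. It mirrors the constructs in Swaroop's code (OBJV, ATRV, the {key figure, characteristic} operands) and reuses his object names plus 0QUANTITY; it assumes that the key figure name and YPROFIT_C are the fields to be changed and that the attribute read actually returns a value, so treat it as an illustration rather than a verified solution.
    DATA V_ATCP TYPE YCSTATMCG.
    DATA V_PROFIT TYPE YPROFIT_C.
    DATA V_AMOUNT TYPE F.
    DATA V_QTY TYPE F.
    * read the profit center attribute for the current cost group code
    V_ATCP = OBJV().
    V_PROFIT = ATRV('0PROFIT_CTR', V_ATCP).
    * buffer both key figures posted on the unassigned value
    V_AMOUNT = {0AMOUNT, #}.
    V_QTY = {0QUANTITY, #}.
    * repost them on the derived profit center and clear the unassigned rows
    {0AMOUNT, V_PROFIT} = {0AMOUNT, V_PROFIT} + V_AMOUNT.
    {0QUANTITY, V_PROFIT} = {0QUANTITY, V_PROFIT} + V_QTY.
    {0AMOUNT, #} = 0.
    {0QUANTITY, #} = 0.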

  • Count values in master data

    Hello,
    If I want to implement a report that gives me the customers that had no sales for a period, I have to read the values from master data.
    But if I have to implement counters on that, how can it be done? I don't want to change the data structure; can it be done only in the report?
    I know how to count the occurrences of a characteristic relative to one or more other characteristics, but that doesn't work here because the values must exist in the InfoProvider.
    Regards,
    Jorge Diogo

    Hello,
    Maybe you can try one of the options below to see if it works:
    1) Create a MultiProvider on the customer master data object and the transactional data provider. Then restrict the query so that CALDAY (or whichever date carries the sales date in the transactional provider) is EQ #.
    2) Include the customer object in the rows of the query and, in its properties, choose to display data from the master data object. Then create a condition where the sales amount is EQ 0 and keep it active.
    Regards,
    Shashank

  • Incorrect Value In master DATA

    Hi Everyone,
    I am getting incorrect data for the attribute "Standard Price" of the 0MAT_PLANT master data. The issue exists only for a single record of the 0MAT_PLANT value. In the PSA the data is correct, as it matches R/3.
    Every day there is a full load into 0MAT_PLANT, as per the business requirement.
    E.g.
    Provider level:
    Plant: ABCd
    Standard Price: 40
    PSA level:
    Plant: ABCd
    Standard Price: 0.75
    Note: the Standard Price in the PSA is the correct figure.
    Observations:
    1. When I run the DTP only for this record, I get the correct values.
    2. In debug mode of the DTP I also get the correct value.
    Regards
    Nadeem

    Hi Nadeem,
    From your reply, I understand that when you load a single value, that record is updated correctly.
    But when you load all the entries, there is a problem. I recently resolved the same kind of issue.
    Details are given below:
    Example:
    When it is a single record:
    Material  -  Price
    MAT001  -  10
    And the same is entered into the target correctly.
    When it is a full load, the PSA had the following entries:
    Material  -  Price
    MAT001  -  10
    MAT001-1  -  20
    MAT001-2  -  30
    After the transformation, due to some issue in the transfer routine (some logic), these values are converted as follows:
    MAT001  -  10
    MAT001  -  20
    MAT001  -  30
    Since the keys are now identical, the duplicate key check keeps only the last record:
    MAT001  -  30
    So we see the wrong price for MAT001.
    The suggestion is therefore: in the InfoPackage data selection, do not give the exact value for a single-record test. Give a filter like
    MAT001*
    so that the InfoPackage pulls all three entries and you can check the DTP for any duplicate-key message. Then we can figure it out.
    Assign points if this reply helps you.
    Thanks,
    Jay.
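    To make the "last record wins" behaviour Jay describes concrete, here is a small, self-contained ABAP sketch. It is purely illustrative: the report name, the flat structure and the assumption that the latest duplicate overwrites earlier ones are invented for the example, not taken from the DTP itself.
    REPORT zdup_key_demo.
    TYPES: BEGIN OF ty_rec,
             material TYPE c LENGTH 18,
             price    TYPE p LENGTH 8 DECIMALS 2,
           END OF ty_rec.
    DATA: lt_pack   TYPE STANDARD TABLE OF ty_rec,
          lt_target TYPE HASHED TABLE OF ty_rec WITH UNIQUE KEY material,
          ls_rec    TYPE ty_rec.
    " records as they look after the faulty routine: same key, different prices
    ls_rec-material = 'MAT001'. ls_rec-price = '10.00'. APPEND ls_rec TO lt_pack.
    ls_rec-material = 'MAT001'. ls_rec-price = '20.00'. APPEND ls_rec TO lt_pack.
    ls_rec-material = 'MAT001'. ls_rec-price = '30.00'. APPEND ls_rec TO lt_pack.
    " "last record wins": each record overwrites an earlier one with the same key
    LOOP AT lt_pack INTO ls_rec.
      READ TABLE lt_target TRANSPORTING NO FIELDS
           WITH TABLE KEY material = ls_rec-material.
      IF sy-subrc = 0.
        MODIFY TABLE lt_target FROM ls_rec.
      ELSE.
        INSERT ls_rec INTO TABLE lt_target.
      ENDIF.
    ENDLOOP.
    " prints MAT001 30.00 - the wrong price described above
    LOOP AT lt_target INTO ls_rec.
      WRITE: / ls_rec-material, ls_rec-price.
    ENDLOOP.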

  • How does the attribute change run work for Aggregates and Master data?

    Hi
    Can anybody explain how the attribute change run works for master data?
    For example:
    There is 0SPELLING and it has master data.
    On day 1 there are 10 records.
    On day 2 it has 12 records.
    So with the attribute change run these 2 new records will get added.
    The values for these 12 records will be added separately in the data load.
    Is this how it works?
    And how about aggregates that are built on master data?

    for e.g.
    You have 0SPELLING, which has attributes x, y and z on day 1 with 10 records,
    so your aggregates on day 1 are built with the same values.
    Now on day 2 you have new values for attributes y, z, s and d plus new hierarchies, so you add new records.
    With the data load you load the data in version M (modified), which is not available for reporting.
    If you run the attribute change run, this modified version is activated to version A, i.e. the active version.
    The change run also performs the alignment of the aggregates for the new attribute values and new hierarchy values.
    Now, for this data to be available for reporting, you need to roll up the aggregates.
    If you roll up the aggregate before the attribute change run, the new data is not available for reporting.
    If you roll up the aggregate after the attribute change run, the data is available for reporting.
    If you do not roll up the aggregate, the new data is not available for reporting even though it is already in the data provider.
    This is how it works.

  • Showing all Characteristic values from Master Data in WebI report

    I want to show master data when there is no data in the cube. I have created a constant variable with the value 1 in BEx and I am using it in the Crystal report, but it is still not showing the master data. Can anybody help me with this?
    Thanks,
    Ravi

    Hi Ingo,
    Thanks for your quick reply. I have created a formula with the constant value 1. Do I need to put that formula in the rows or columns, or as a free characteristic?
    Thanks,
    Ravi

  • Geographical attributes are stored in SAP equipment/functional master data?

    Hi gurus,
    I need to understand how the typical geographical/spatial attributes are stored in the SAP equipment/functional location master. Is it through enhancement with customer fields or through classification?
    Please throw some light on this.
    regards
    Krish

    You can create new tabs using user exit ITOB0001, or use the address (Ctrl+F5) search term fields 1/2 to store the geographical/spatial attributes, or use classification as PeteA mentioned.
    Thanks
    -S.N

  • List of Values use Master Data or Transaction Data?

    Hello,
    Does anyone know where the data in the list of values is populated from: transaction data, master data, or a combination of both?
    I know a BEx query can be built on a DSO, InfoCube, InfoSet, MultiProvider or InfoObject. Does it depend on the type of InfoProvider the query is built on? Can we really control the source of the LOVs?
    Thanks
    Ram

    It is based on the data in the InfoProvider.
    Ingo

  • Navigational attribute is not possible to delete from master data?

    When I tried to delete the navigational attribute I got an error like this:
    Characteristic Z5xyz: Master data has to be activated before conversion
    Message no. R7067
    Diagnosis
    The master data for characteristic Z5xyz has to be converted because an attribute has been deleted or the time-dependency of an attribute has been changed. This process is only possible if the master data for characteristic Z5xyz is active. The master data table /BIC/PZ5xyz contains new or modified records that have not yet been activated.
    System Response
    Characteristic Z5xyz cannot be activated.
    Procedure
    Activate the master data for characteristic Z5xyz before you activate the characteristic itself.
    Can anyone help me to resolve this?
    Points will definitely be assigned.
    Thanks,
    Vasu

    Hi Vasu,
    You are trying to delete a table field while the table contains data. For this you would need to adjust the database table.
    But instead of doing all this you can follow a simpler method.
    Before deleting that characteristic, delete the master data first. Then delete the characteristic from the attribute list and activate the master data object. Now reload the master data.
    Regards,
    Pratap Sone

  • Description is showing for blank key values in master data

    Hello experts,
    I have loaded master data texts. In general the first record of master data will be blank, but in my case it is showing a description (short, medium) for the blank key value.
    Could any one help me in this matter?
    Thanks
    Prathap

    Hello,
    Did you check the DataSource for this in the source system, i.e. check in RSA3 or in the underlying tables whether anything like that is maintained? If it is a full load, the same record may get loaded again even if you have changed it in BW.
    Maybe you can try to write a condition in the update rules so that records with blank keys are not loaded (a rough sketch follows below).
    If there is no such record in the source system, then you can change the master data in BW and make the text blank.
    Thanks
    Ajeet
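    A minimal sketch of the condition mentioned above, assuming a 3.x update rules start routine where the incoming records sit in the internal table DATA_PACKAGE; the field name /BIC/ZMYKEY is purely a placeholder and must be replaced with the actual key field of your text DataSource.
    " start routine sketch: drop text records whose characteristic key is blank
    " DATA_PACKAGE is the standard start routine table; /BIC/ZMYKEY is a placeholder
    DELETE DATA_PACKAGE WHERE /bic/zmykey IS INITIAL.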

  • Blank Values in Master Data

    Hi,
    We have observed both '#' and '-' as part of NetWeaver 7.0, i.e. BI 7.0, but in BW 3.5 we could only see '#'.
    What is the difference between '#' and '-'? Can somebody illustrate this?
    Thanks
    Amit

    I have not seen '-' for blank values so far in 7.0. Just in case, have you checked in the master data tables how it is stored?

  • Values for attributes in 0MATERIAL missing

    Dear friend,
    I have InfoObjects 0FAMILY and 0SEGMENT as attributes of 0MATERIAL, and I have loaded master data in dev BW and QBW for all the InfoObjects including 0MATERIAL. Now in dev BW I see values for 0FAMILY and 0SEGMENT in the master data for 0MATERIAL, but I don't see these values in QBW for 0MATERIAL. What could be the reason? Please suggest.
    Thanks
    raj

    Bhanu,
    There is master data in 0FAMILY, 0SEGMENT and 0MATERIAL, but in QBW the values for 0FAMILY and 0SEGMENT are missing. I have checked the transfer structure for the 0MATERIAL attributes and found that there is a routine for populating 0FAMILY and 0SEGMENT. The code is below. I am still pondering what is missing. Thanks for all the help.
    Raj
    DATA: l_s_errorlog TYPE rssm_s_errorlog_int.
    * derive the result from the first two characters of the product
    * hierarchy (PRDHA) of the transfer structure record
      RESULT = TRAN_STRUCTURE-PRDHA(2).
    * returncode <> 0 means skip this record
      RETURNCODE = 0.
    * abort <> 0 means skip whole data package !!!
      ABORT = 0.
    *$*$ begin of inverse routine - insert your code only below this line *-*
      DATA:
        L_S_SELECTION LIKE LINE OF C_T_SELECTION.
    * An empty selection means all values
      CLEAR C_T_SELECTION.
      L_S_SELECTION-FIELDNM = 'PRDHA'.
    * Selection of all values may be not exact
      E_EXACT = RS_C_FALSE.
    *$*$ end of inverse routine - insert your code only before this line *-*
    ENDFORM.

  • Read & Filter Multiple .wav files, Export Filtered SPL

    Hi
    I'm searching for tips on how to use LabVIEW to read multiple .wav files, then filter them, then export just the calculated SPL values to a "database friendly format".
    Basically I want a solution that allows me to point to several files, then press a "go button" that triggers LabVIEW to produce filtered SPL values as exported data.
    I'll skim over the cal approach since I'm confident most readers know how to handle it. Basically, though, for each 30 s .wav file of broadband noise, an associated cal tone file is used to determine the dBFS value (dB below Full Scale) to be associated with 94 dB SPL, or 114 dB SPL, as the case may be. I'll leave it at that.
    Detailed requirements (excluding cal steps, though):
    1) Operates on up to a dozen 30 s time records (44.1 kSa/s, mono .wav files);
    2) Passes the signal from each desired .wav file through 18 of the 1/3 octave filters (IEC 1260 compliant, as in the S/V toolset) not necessarily in real time!;
    3) Calculates the SPL for each 1/3 octave filtered time record; This makes (12 files) x (18 filters) = 216 filtered SPL values if end-user points to a dozen broadband noise files before pressing the "Go" button;
    4) Exports SPL values in a suitable database format; an existing database does the rest of the mathematics and report generation;
    Client FileMaker database functions include:
    - import and store 1/3 octave filtered Leq(30) SPL values;
    - average filtered SPL across the declared qty of mic positions of the present test case;
    NOTES:
    1) In the field, we do NOT use simultaneous / MULTI-mics. Instead, it is absolutely an "open and shut case" that the optimum process is to move one (1) single mic quickly from spot to spot. While holding the airborne noise at a constant level we position the mic at each spot for 30 s to feed the .wav recorder, then move on. We emerge from a typical day of field activity with, say, 60 to 100 .wav files to post-process.
    2) (1), above, explains why we don't use any N.I. DAQ h/w.
    3) LabVIEW, it seems, exports best to a spreadsheet, so if needed I'm prepared to handle manually any ugly conversion from LV's format to a proper database format (1st Normal Form, 2nd Normal Form, etc.).
    4) Given a choice to calculate the averages in my database, versus in LV (say, to cut down on exported data qty), I'd likely opt to export the 216 SPL's and then do averaging in the database, (where the ASTM E336 reports are done).
    TIPS Gathered To Date:
    1) S/V toolset has IEC 1260 compliant filters, BUT...
    2) If you use the S/V toolset VI's, then you CANNOT ACCESS the filtered time record NOR CAN YOU WRITE the filtered data back to a .wav file! All you can do is measure the filters' outputs. Fortunately for us here, the available measures include SPL.
    Aside: Although not a show-stopper in this project, this constraint is surprising, nonetheless. To access & record the filtered sound, you'd have to either (a) ask N.I. to unprotect VI's, or (b) develop your own IEC compliant filters from N.I.'s published tap co-eff's etc.;
    3) LabVIEW does not support drag'n'drop operations on files. Instead, the recommended technique is to use multiple "Browse to/ Navigation/ File Open" panels to allow LV to open the user's desired qty of .wav files simultaneously;
    Thanks for any tips!

    The filter VI is password protected in the Sound & Vibration Toolkit (SVT) for two reasons. First, since the filter is compliant to a standard, by locking the code we guarantee that no one has the chance to accidentally change it so that they are no longer compliant. Second, the method of filtering the data is considered intellectual property that we are protecting.
    That said, from reading your post it does not appear that you need to have access to the filtered data since you are making an SPL measurement which is provided in the Sound & Vibration functionality. Is that correct?
    Also, I wanted to point you to another subVI which comes with SVT but is not in the palette that might be of use. There is an example for SVT called SVXMPL_Wav Power Spectrum which opens up a wav file and computes its power spectrum. Inside that VI there is a subVI which will open a wav file and convert it into a waveform which you can use with your SVT functions.
    Hope that helps,
    Jack

  • How to convert unit of measure in BEx on master data attribute values

    Hi All,
    I need to convert the unit of measure in master data attribute values, namely 0GROSS_WT, which is a key figure attribute of 0MATERIAL. The 0GROSS_WT values have to be converted to kilograms; at present the 0GROSS_WT values are stored in KG and in grams.
    I can change these values in the update rules by writing a routine, but I need to convert them to KG at the BEx Query Designer level.
    I need to calculate like this:
    quantity sold * Gross Weight
    where Gross Weight is a formula variable that is replaced with the values of the gross weight attribute.
    I tried creating conversion types in transaction RSUOM, but that works on key figures of the InfoCube, not on attribute values of master data.
    Is there a way in the formula variable itself to convert the unit of measure before the values are replaced,
    or is there any other solution to this problem?
    Thanks in advance..
    regards
    ravi.p

    Hi
    Have you tried to create a variable for this key figure with a customer exit? I think it is possible here.
    Assign points if useful
    Regards
    N Ganesh
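    For what it is worth, the customer exit Ganesh alludes to is usually implemented in include ZXRSRU01 (enhancement RSR00001, function module EXIT_SAPLRRS0_001). The skeleton below only shows where such logic would go; the variable name ZGRWT_FAC and the factor 1000 are invented for illustration, and keep in mind that an exit variable returns one value per query execution, not one value per material row.
    " include ZXRSRU01 - skeleton of a customer exit variable (illustration only)
    DATA: l_s_range TYPE rsr_s_rangesid.
    CASE i_vnam.
      WHEN 'ZGRWT_FAC'.            "hypothetical formula variable
        IF i_step = 2.             "processed after the variable popup
          l_s_range-sign = 'I'.
          l_s_range-opt  = 'EQ'.
          l_s_range-low  = '1000'. "e.g. grams per kilogram
          APPEND l_s_range TO e_t_range.
        ENDIF.
    ENDCASE.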
