"Master Data Read Class Parameters" field is disabled
Hi.
While creating an InfoObject I switched Master Data Access from Default to Own Implementation.
After this the screen is redisplayed and only the Name of Master Data Read Class field becomes enabled.
I need the Master Data Read Class Parameters field, but it is disabled, and the button next to it has the same status.
Why? How can I make them active?
Hi,
I think it is not possible to add your own master data read class if the reference is 0DATE or if the type is DATS.
So maybe you can create your InfoObject as type CHAR, length 8.
Maybe you can add a conversion routine to display the date in the required format.
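If you go the CHAR 8 route, the display-side half of such a conversion routine could be sketched like this (a sketch only; ZDAT8 is a hypothetical conversion routine name, implemented as the standard CONVERSION_EXIT_*_INPUT/OUTPUT function module pair):

```abap
* Sketch: output half of a hypothetical conversion routine ZDAT8.
* Formats an internal CHAR 8 value YYYYMMDD as DD.MM.YYYY for display.
FUNCTION conversion_exit_zdat8_output.
*"  IMPORTING
*"     VALUE(input) TYPE c
*"  EXPORTING
*"     VALUE(output) TYPE c
  CONCATENATE input+6(2) input+4(2) input(4)
    INTO output SEPARATED BY '.'.
ENDFUNCTION.
```

A matching CONVERSION_EXIT_ZDAT8_INPUT would reverse the formatting on entry.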
Similar Messages
-
Using a single master data infoobject for multiple fields
Hello experts. Today I ran into the following problem: I am modelling an InfoCube and will be receiving a flat file with several fields, of which 3 give information about sales representatives.
For each transaction there are 1-3 sales representatives involved, and this is reflected in the file I am going to be loading. If only one representative is involved, he was in charge of the complete negotiation, and his code will be found in all three fields. If two representatives are involved, one was in charge of one part of the negotiation and the other of the rest, and according to what they were in charge of, their codes will appear in one or two of the fields. If three representatives are involved, each one's code will appear in the field corresponding to the part of the negotiation they handled.
I already have the sales representatives' codes as master data, but I was wondering how I can map those three fields to the same InfoObject, if that is possible at all. Maybe in the transformation I map them to the same object? Also, what happens when you create the DataSource? As far as I know, each field in the DataSource must be unique, which makes it impossible to use my ZSALESREP InfoObject for the three fields, at least at DataSource level. What can I do in this case?
Hi Pedro Olvera,
You need 3 fields in the DataSource to load 3 different fields.
If you want to load to a single InfoObject, you can map accordingly in the transformation, but the last field overwrites the previous one.
It is better to maintain 3 fields in the cube. Create 2 more sales rep InfoObjects (ZSALESREP1 & ZSALESREP2) with reference to ZSALESREP; then you do not need to maintain master data for ZSALESREP1 & ZSALESREP2.
Use these 3 fields for modeling.
Hope it Helps
Srini
-
Navigation attribute behaviour during time dependent master data read
Hi All,
My report is based on an InfoCube. There is one time-dependent master data characteristic named 0EMPLOYEE in the cube.
My report is run for a specific time period, for example at monthly and quarterly level.
There are many InfoObjects forming part of dimensions in the cube, and a few are part of the navigation attributes folder.
Now in the transformation there are some fields which need to be read from master data depending on a time characteristic, for example 0CALMONTH or 0CALDAY. This rule is mapped correctly, and the data is also stored correctly per the specific time period in the cube.
Now there are some navigation attributes which read the data from the 0EMPLOYEE master data itself.
My doubt is: will the navigation attribute read the latest record for the employee, or is it intelligent enough to read the right records depending on the specific time characteristic?
With navigation attributes we don't have the option to specify any rule, as we do for normal objects in the transformation.
What will the navigation attribute read: the latest record, or the specific records for the time period?
Thanks & Regards,
Anup
Hi Anup,
Let me give you one small example of how a time-dependent attribute works.
Let us say we have 0COSTCENTER as a time-dependent attribute of 0CUSTOMER. Now in your transaction data you have loaded values of 0CUSTOMER, and in the query you have used the 0CUSTOMER_COSTCENTER attribute.
Transaction data:

  Tran. No. | Customer Number | Amount
  1         | 123             | 500
  2         | 125             | 450
  3         | 126             | 900

Master data:

  Customer | Cost Center | Valid From | Valid To
  123      | COST1       | 1st Jan    | 15th Jan
  123      | COST2       | 16th Jan   | 30th March
  123      | COST3       | 31st March | 30th June
In the above example, the Valid From and Valid To values came from the source system, and for this data you will have direct mapping done in the transformation. The data will reside as above in your system.
When you use a key date of 20th Jan, the cost center for customer 123 in the query will be shown as COST2. So this assignment of time-dependent data is done at runtime only and has no impact on the underlying data.
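The runtime resolution described above can be sketched in ABAP (a sketch only: the example attribute is hypothetical, and the Q-table name /BI0/QCUSTOMER, the COSTCENTER field, and the DATEFROM/DATETO interval fields follow standard BW naming conventions, so verify them in your system):

```abap
* Sketch: resolve a time-dependent attribute for a given key date.
* Assumes the standard BW time-dependent attribute table (Q table)
* with interval fields DATEFROM/DATETO and active version OBJVERS = 'A'.
DATA: lv_costcenter TYPE /bi0/oicostcenter,
      lv_keydate    TYPE sy-datum VALUE '20110120'.

SELECT SINGLE costcenter
  FROM /bi0/qcustomer
  INTO lv_costcenter
  WHERE customer = '123'
    AND objvers  = 'A'
    AND datefrom <= lv_keydate
    AND dateto   >= lv_keydate.

IF sy-subrc = 0.
* For the example data, a key date of 20th Jan would select COST2.
ENDIF.
```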
Regards,
Durgesh. -
Vendor Master Data, (FK02), Bank Details Fields Need to be Shown
Dear Gurus,
Now, in the payment transactions tab in FK02 (Change Vendor), some bank fields need to be shown, which have possibly been suppressed.
How do I activate or surface these fields?
Some fields would be the IBAN number, Bank Code, Bank Country, etc.
Points awardable.
Go to IMG >> Financial Accounting >> AR & AP >> Vendor Accounts >> Master Data >> Preparations for Creating Vendor Master Data >> Define Screen Layout per Activity (Vendors).
Click: Change Vendor (Accounting).
Go to General data >> Payment transactions.
Change Bank details from suppressed to optional entry.
Hope this helps
Please assign points as a way to say thanks -
Master data reading does not work
Hello,
With an InfoSource as the source for a transformation, master data reading (i.e., enhancing master data with attributes) produces fewer data records than a simple transformation without reading master data attributes.
Question: is the reason for this behavior that an InfoSource is not an InfoObject-based data provider, as written in the RKT documentation?
Thank you + regards,
Ingo
Hi Ingo,
please open a customer problem message, since this seems to be a bug. It might be that it happens only in certain scenarios. Nevertheless, it should be investigated by development.
Cheers
SAP NetWeaver 2004s Ramp-Up BI Back Office Team -
Time dependent employee master data reading !in sap-bw
Hi,
Here I am using a routine to read employee master data.
As employee is a time-dependent characteristic, we want employee monthly details from the period of joining until now, with all his statuses (these status characteristics are attributes of the 0EMPLOYEE master data table).
How should I code this requirement?
regards,
swami.
Hi,
Why do you want to delete?
There is no need to delete any master data.
If you really want to delete, you can do one thing:
right-click on the master data InfoObject 0EMPLOYEE
and select Delete master data.
Then reload the employee master data with a full load;
if delta, initialize the delta with data transfer.
Regards
Hari -
Time dependent employee master data reading !
Hi,
Here I am using a routine to read employee master data.
As employee is a time-dependent characteristic, we want employee monthly details from the period of joining until now, with all his statuses (these status characteristics are attributes of the 0EMPLOYEE master data table).
How should I code this requirement?
regards,
swami.
Found it...
http://help.sap.com/SAPHELP_ERP2004/helpdata/EN/49/7e960481916448b20134d471d36a6b/content.htm
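The monthly read the question asks for could be sketched as a loop over months against the time-dependent attribute table (a sketch only; /BI0/QEMPLOYEE is the standard Q-table name for 0EMPLOYEE, the EMPLSTATUS attribute field and the employee number are assumptions for illustration):

```abap
* Sketch: read time-dependent 0EMPLOYEE attributes per month, from a
* date of joining until today. Table and field names should be verified.
DATA: lv_date   TYPE sy-datum,
      lv_status TYPE c LENGTH 1.

lv_date = '20100101'.                " hypothetical date of joining

WHILE lv_date <= sy-datum.
  SELECT SINGLE emplstatus
    FROM /bi0/qemployee
    INTO lv_status
    WHERE employee = '00001234'      " hypothetical employee number
      AND objvers  = 'A'
      AND datefrom <= lv_date
      AND dateto   >= lv_date.
  IF sy-subrc = 0.
*   append lv_date / lv_status to the monthly result table here
  ENDIF.
* jump to the first day of the next month
  lv_date+6(2) = '01'.
  lv_date = lv_date + 32.
  lv_date+6(2) = '01'.
ENDWHILE.
```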
Regards
Juan -
Adding/Enhancing master data with a standard field?
Dear Friends,
I have to enhance master data 0VENDOR to add the field LFM1-LFABC (vendor ABC indicator), which is a standard SAP field in LFM1. How do I do that? And how do you see which table(s) the DataSource 0VENDOR_ATTR pulls its data from in R/3?
Thanks for your time and help
Raj
Hi,
The procedure is as below.
Take the DataSource which you want to enhance.
1. If you are enhancing an LO DataSource, take the appropriate communication structure, append a structure to it, and add the fields you want in that append.
If you are enhancing other than LO, take the extract structure, append a structure, and add the fields there.
2. Go to the LBWE transaction, take the option called "Maintain Extract Structures", and move the enhanced fields from the pool into the extract structure.
If you are enhancing other than LO, this step is not needed.
3. Go to RSA6, take the DataSource, and go to change mode. The enhanced fields have the Hide option checked; remove that tick and save.
4. Go to transaction CMOD. If a project is already created for this enhancement, go to display mode. If not, create a project, assign the enhancement "RSAP0001", and save.
In the component you can find 4 function modules:
EXIT_SAPLRSAP_001 -> transaction data
EXIT_SAPLRSAP_002 -> master data attributes
EXIT_SAPLRSAP_003 -> master data texts
EXIT_SAPLRSAP_004 -> master data hierarchies
Take the appropriate FM and double-click on it.
There you can find an include name. Double-click on it and you reach the ABAP editor.
There we have to write the code.
Example code will be like below.
TABLES: z0001.

DATA: l_s_icctrcst LIKE icctrcst,
      l_s_icctract LIKE icctract,
      l_s_icctrsta LIKE icctrsta,
      l_tabix      LIKE sy-tabix.

CASE i_datasource.                 " in BW 1.2B: CASE i_isource.
  WHEN '0CO_OM_CCA_1'.
    LOOP AT c_t_data INTO l_s_icctrcst.
      l_tabix = sy-tabix.
      SELECT SINGLE * FROM z0001
        WHERE kokrs = l_s_icctrcst-kokrs
          AND kostl = l_s_icctrcst-kostl.
      IF sy-subrc = 0.
        l_s_icctrcst-zfield1 = z0001-zfield1.
        l_s_icctrcst-zfield2 = z0001-zfield2.
        MODIFY c_t_data FROM l_s_icctrcst INDEX l_tabix.
      ENDIF.
    ENDLOOP.
  WHEN '0CO_OM_CCA_2'.
  WHEN '0CO_OM_CCA_3'.
  WHEN OTHERS.
    EXIT.
ENDCASE.
This is example code.
Thanks
DST -
How to make a page READ ONLY / all fields are DISABLED ?
Hi all,
I am using ONE page to ADD/EDIT/DISPLAY data. When users are in DISPLAY mode, I want all fields on the page to be read-only / disabled.
What is the best approach to do that
(other than going to every field and setting the ReadOnly/Disabled property)?
Thank you for your help,
xtanto
Hi Xtanto,
My business logic is somewhat different - I don't do this for new/edited rows, but based upon other security. You could certainly set something in your backing bean based upon a user clicking "new" or "edit".
John -
Info-object Maintenance -- Master Data/Texts Tab options
Hello BI gurus,
In BI 7.0:
Path:
InfoObject Maintenance --> Master Data/Texts tab --> Master Data InfoSource / Data Target / InfoProvider / Master Data Read Access.
Scenario 1:
1. Master Data Access - Own Implementation
Name of Master Data Read Class, for example:
- CL_RSMD_RS_SPEC - class for accessing generic BW-special InfoObjects
- CL_RSR_XLS_TABF4 - table master data read class
- CL_RSROA_LOCAL_MASTERDATA - master data for virtual characteristics
- CL_RSMD_RS_BW_SPEC - base class for accessing BW-special InfoObjects
- CL_RSMD_RS_SPEC_TXT - class for accessing generic BW-specific InfoObjects
Under what scenario do we go for this option?
Has anybody used this option?
Scenario 2:
1. Master Data Access - Direct Access
Name of Master Data Access - CL_RSR_REMOTE_MASTERDATA
Under what scenario do we go for this option?
Has anybody used this option?
Hi,
With Master Data:
If you set this indicator, the characteristic may have attributes. In this case the system generates a P table for this characteristic. This table contains the key of the characteristic and any attributes that might exist. It is used as a check table for the SID table. When you load transaction data, there is a check whether the characteristic value exists in the P table, if referential integrity is used.
With Maintain Master Data you can go from the main menu to the maintenance dialog for processing attributes.
The master data table can have a time-dependent and a time-independent part.
In attribute maintenance, determine whether an attribute is time-dependent or time independent.
With Texts:
Here, you determine whether the characteristic has texts.
If you want to use texts with a characteristic, you have to select at least one text. The short text (20 characters) option is set by default but you can also choose medium-length texts (40 characters) or long texts (60 characters).
Helpful link:
http://help.sap.com/saphelp_nw2004s/helpdata/en/71/f470375fbf307ee10000009b38f8cf/frameset.htm
Regards,
Suman -
Performance: reading huge amount of master data in end routine
In our 7.0 system, each day a full load runs from DSO X to DSO Y, in which master data from six characteristics of DSO X is read into about 15 fields in DSO Y. DSO X contains about 2 million records, which are all transferred each day. The master data tables each contain between 2 and 4 million records. Before this load starts, DSO Y is emptied. DSO Y is write-optimized.
At first we designed this with the standard "master data reads", but this resulted in load times of 4 hours, because all master data is read with single lookups. We redesigned it and now fill all master data attributes in the end routine, after filling internal tables with the master data values corresponding to the data package:
* Read 0UCPREMISE into temp table
SELECT ucpremise ucpremisty ucdele_ind
FROM /BI0/PUCPREMISE
INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise
FOR ALL ENTRIES IN RESULT_PACKAGE
WHERE ucpremise EQ RESULT_PACKAGE-ucpremise.
And when we loop over the data package, we write something like:
LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
READ TABLE lt_0ucpremise INTO ls_0ucpremise
WITH KEY ucpremise = <fs_rp>-ucpremise
BINARY SEARCH.
IF sy-subrc EQ 0.
<fs_rp>-ucpremisty = ls_0ucpremise-ucpremisty.
<fs_rp>-ucdele_ind = ls_0ucpremise-ucdele_ind.
ENDIF.
*all other MD reads
ENDLOOP.
So the above statement is repeated for all master data we need to read from. This method is quite a bit faster (1.5 hrs), but we want to make it faster still. We noticed that reading the master data into the internal tables still takes a long time, and this has to be repeated for each data package. We want to change this. We have now tried a similar method, but load all master data into the internal tables without filtering on the data package, and we do this only once:
* Read 0UCPREMISE into temp table
SELECT ucpremise ucpremisty ucdele_ind
FROM /BI0/PUCPREMISE
INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise.
So when the first data package starts, it fills all master data values, 95% of which we would need anyway. So that the following data packages can use the same tables and don't need to fill them again, we placed the definitions of the internal tables in the global part of the end routine. In the global part we also write:
DATA: lv_data_loaded TYPE C LENGTH 1.
And in the method we write:
IF lv_data_loaded IS INITIAL.
  lv_data_loaded = 'X'.            " mark: loading in progress
* load all internal tables
  lv_data_loaded = 'Y'.            " mark: loading done
ENDIF.

WHILE lv_data_loaded NE 'Y'.
  CALL FUNCTION 'ENQUEUE_SLEEP'
    EXPORTING
      seconds = 1.
ENDWHILE.

LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
* assign all data
ENDLOOP.
This makes sure that another data package that has already started "sleeps" until the first data package is done filling the internal tables.
Well, this all seems to work: it now takes 10 minutes to load everything to DSO Y. But I'm wondering if I'm missing anything. The system seems to work fine loading all these records into internal tables, but any improvements or critical remarks are very welcome.
This is a great question, and you've clearly done a good job of investigating this, but there are some additional things you should look at and perhaps a few things you have missed.
Zephania Wilder wrote:
At first, we designed this with the standard "master data reads", but this resulted in load times of 4 hours, because all master data is read with single lookups.
This is not accurate. After SP14, BW does a prefetch and buffers the master data values used in the lookup. Note [1092539|https://service.sap.com/sap/support/notes/1092539] discusses this in detail. The important thing, and most likely the reason you are seeing individual master data lookups on the DB, is that you must manually maintain the MD_LOOKUP_MAX_BUFFER_SIZE parameter to be larger than the number of lines of master data (from all characteristics used in lookups) that will be read. If you are seeing one select statement per line, then something is going wrong.
You might want to go back and test with master data lookups using this setting and see how fast it goes. If memory serves, the BW master data lookup uses an approach very similar to your second example (1,5 hrs), though I think that it first loops through the source package and extracts the lists of required master data keys, which is probably faster than your statement "FOR ALL ENTRIES IN RESULT_PACKAGE" if RESULT_PACKAGE contains very many duplicate keys.
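The key-extraction idea mentioned here, collecting the lookup keys once and deduplicated before the select, could be sketched as follows (a sketch, reusing the 0UCPREMISE names from the question):

```abap
* Sketch: build a deduplicated driver table before FOR ALL ENTRIES,
* so duplicate keys in RESULT_PACKAGE don't inflate the select.
TYPES: BEGIN OF ty_premise,
         ucpremise  TYPE /bi0/oiucpremise,
         ucpremisty TYPE /bi0/oiucpremisty,
         ucdele_ind TYPE /bi0/oiucdele_ind,
       END OF ty_premise.

DATA: lt_keys       TYPE SORTED TABLE OF /bi0/oiucpremise
                    WITH UNIQUE KEY table_line,
      lt_0ucpremise TYPE STANDARD TABLE OF ty_premise.

FIELD-SYMBOLS: <fs_rp> LIKE LINE OF RESULT_PACKAGE.

LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
  INSERT <fs_rp>-ucpremise INTO TABLE lt_keys.   " duplicates are skipped
ENDLOOP.

IF lt_keys IS NOT INITIAL.   " guard: FOR ALL ENTRIES on an empty table selects all rows
  SELECT ucpremise ucpremisty ucdele_ind
    FROM /bi0/pucpremise
    INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise
    FOR ALL ENTRIES IN lt_keys
    WHERE ucpremise = lt_keys-table_line.
ENDIF.
```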
I'm guessing you'll get down to at least the 1,5 hrs that you saw in your second example, but it is possible that it will get down quite a bit further.
Zephania Wilder wrote:
This makes sure that another data package that already started, "sleeps" until the first data package is done with filling the internal tables.
This sleeping approach is not necessary, as only one data package will be running at a time in any given process. I believe that the "global" internal table is not shared between parallel processes, so if your DTP is running with three parallel processes, then this table will just get filled three times. Within a process, all data packages are processed serially, so all you need to do is check whether or not it has already been filled. Or are you doing something additional to export the filled lookup table into a shared memory location?
Actually, you have your global data defined with the statement "DATA: lv_data_loaded TYPE C LENGTH 1.". I'm not completely sure, but I don't think that this data will persist from one data package to the next. Data defined in the global section using "DATA" is global to the package start, end, and field routines, but I believe it is discarded between packages. I think you need to use "CLASS-DATA: lv_data_loaded TYPE C LENGTH 1." to get the variables to persist between packages. Have you checked in the debugger that you are really only filling the table once per request and not once per package in your current setup? << This is incorrect - see next posting for correction.
Otherwise the third approach is fine as long as you are comfortable managing your process memory allocations and you know the maximum size that your master data tables can have. On the other hand, if your master data tables grow regularly, then you are eventually going to run out of memory and start seeing dumps.
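The fill-once-per-process guard discussed above can be sketched like this (a sketch; whether CLASS-DATA is actually required for persistence between packages depends on your release, as the posting itself notes):

```abap
* Global part of the end routine: CLASS-DATA persists across data
* packages within one process (plain DATA may not, release-dependent).
TYPES: BEGIN OF ty_premise,
         ucpremise  TYPE /bi0/oiucpremise,
         ucpremisty TYPE /bi0/oiucpremisty,
         ucdele_ind TYPE /bi0/oiucdele_ind,
       END OF ty_premise.

CLASS-DATA: gv_md_loaded TYPE c LENGTH 1,
            gt_premise   TYPE SORTED TABLE OF ty_premise
                         WITH UNIQUE KEY ucpremise.

* In the end routine method: fill the buffer only once per process,
* no sleeping needed since packages run serially within a process.
IF gv_md_loaded IS INITIAL.
  SELECT ucpremise ucpremisty ucdele_ind
    FROM /bi0/pucpremise
    INTO CORRESPONDING FIELDS OF TABLE gt_premise.
  gv_md_loaded = 'X'.
ENDIF.
```

A sorted table with a unique key also removes the need for the explicit BINARY SEARCH on the later READ TABLE.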
Hopefully that helps out a little bit. This was a great question. If I'm off-base with my assumptions above and you can provide more information, I would be really interested in looking at it further.
Edited by: Ethan Jewett on Feb 13, 2011 1:47 PM -
Error - InfoObject cannot be used to read master data
Hi All,
We are working on mapping objects in a transformation; on some of the transformations I am using the master data attribute to fill the InfoObject. In some cases the mapping happens without incident, but in others we get the following error:
InfoObject cannot be used to read master data
Inside the long text of the message it says:
InfoObject does not have the correct attributes to be used to read master data for the target field
The object is one of the master data attributes, so we are not sure why we are receiving this error.
We have received it with 0COMP_CODE trying to load Fiscal Year Variant, and with a non-standard enhanced object ZMATERIAL, which is a copy of 0MATERIAL with added attributes for Product Class, Product Group and Product Line, which come from the product hierarchy and are populated on the load of the material from the product hierarchy.
Has anyone else had an issue with this error, and can you tell us how you fixed it?
Thank you!
Caroline
You received this error for loading 0FISCVARNT (Fiscal Year Variant) because that InfoObject isn't an attribute of 0COMP_CODE (Company Code), and it cannot therefore determine what data to populate based on your input. Instead of using 0COMP_CODE, you may have to enter a constant value for the Fiscal Year Variant if you don't have it in your source InfoProvider.
When you created ZMATERIAL, did you use 0MATERIAL as a reference so that it would inherit the attributes and texts? If so, Product Class, Product Group and Product Line would have to be attributes of 0MATERIAL. If you didn't reference 0MATERIAL and are loading attributes directly to the custom InfoObject attributes, then Product Class, Product Group and Product Line would have to be attributes of ZMATERIAL in order to use the Use Master Data function for transformations. You may have to derive the values for these differently, such as by adding ABAP lookup routines. -
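A lookup routine of the kind suggested in the thread above could be sketched as a field routine (a sketch only; the P table /BIC/PZMATERIAL follows standard naming for a custom InfoObject, while the attribute field /BIC/ZPRODCLS is a hypothetical name for illustration):

```abap
* Sketch: derive Product Class from ZMATERIAL master data in a field
* routine. The attribute field name is hypothetical; verify in RSD1.
SELECT SINGLE /bic/zprodcls
  FROM /bic/pzmaterial
  INTO RESULT
  WHERE /bic/zmaterial = SOURCE_FIELDS-/bic/zmaterial
    AND objvers = 'A'.
IF sy-subrc <> 0.
  CLEAR RESULT.
ENDIF.
```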
Start Routine to Populate Account Group Field from Master data of 0Customer
Hello friends. Please help me edit this ABAP code to make it work. I am putting this code in a start routine between two DSOs, where I am using it to
populate the Account Group field from the master data of 0CUSTOMER. I do not want to use the read-from-master-data functionality, since the field 0CUSTOMER is not in the DSO, but the similar field 0DEBITOR is. So I want to use this code
during the load from the source DSO to the target DSO.
Error: Explicit length specifications are necessary with types C, P, X, N
DATA: l_s_dp_line TYPE _ty_s_sc_1.   " line type of SOURCE_PACKAGE

TYPES: BEGIN OF comp,
         customer  TYPE /bi0/oicustomer,
         accnt_grp TYPE /bi0/oiaccnt_grp,
       END OF comp.

DATA: l_s_comp  TYPE comp.
DATA: l_th_comp TYPE HASHED TABLE OF comp
                WITH UNIQUE KEY customer INITIAL SIZE 0.

IF l_th_comp[] IS INITIAL.
  SELECT customer accnt_grp FROM /bi0/pcustomer
    APPENDING CORRESPONDING FIELDS OF TABLE l_th_comp.
ENDIF.

LOOP AT SOURCE_PACKAGE INTO l_s_dp_line.
  READ TABLE l_th_comp INTO l_s_comp
    WITH TABLE KEY customer = l_s_dp_line-debitor.
  IF sy-subrc = 0.
    l_s_dp_line-accnt_grp = l_s_comp-accnt_grp.
    MODIFY SOURCE_PACKAGE FROM l_s_dp_line.
  ENDIF.
ENDLOOP.
soniya kapoor
Message was edited by:
soniya kapoor
Hello Wond, thanks for the good answer and good option, but the client does not like this option and does not want to turn on any navigation attribute. In general we also have a requirement to read a third table while uploading from one DSO table to a second DSO table,
so please help me edit this ABAP code to make it work. I am putting this code in a start routine between two DSOs, where I am using it to
populate the Account Group field from the master data of 0CUSTOMER.
There is no syntax error, but during the load it updates the source table and not the target table. How do I address the target table?
*** Source DSO table
TYPES: BEGIN OF typ_tgl1.
        INCLUDE TYPE /bic/azdafiar000.
TYPES: END OF typ_tgl1.

TYPES: BEGIN OF comp,
         customer  TYPE /bi0/oicustomer,
         accnt_grp TYPE /bi0/oiaccnt_grp,
       END OF comp.

DATA: l_th_comp TYPE HASHED TABLE OF comp
                WITH UNIQUE KEY customer INITIAL SIZE 0.
DATA: wa_itab TYPE comp.
DATA: wa_zdtg TYPE typ_tgl1.

IF l_th_comp[] IS INITIAL.
*** Master data table
  SELECT customer accnt_grp FROM /bi0/pcustomer
    APPENDING CORRESPONDING FIELDS OF TABLE l_th_comp.
  SORT l_th_comp BY customer.
ENDIF.

LOOP AT l_th_comp INTO wa_itab.
  SELECT * FROM /bic/azdafiar000 INTO wa_zdtg
    WHERE debitor EQ wa_itab-customer.      "*** source DSO table
    IF sy-subrc = 0.
      wa_zdtg-accnt_grp = wa_itab-accnt_grp.
      MODIFY /bic/azdafiar000 FROM wa_zdtg. "*** modifies the SOURCE DSO table
    ENDIF.
  ENDSELECT.
ENDLOOP.
soniya kapoor -
Depreciation area set in Asset Class does not appear in asset master data
The depreciation area does not appear in the asset master data, although it is set in the asset class. If another asset is created in the same asset class, that depreciation area is automatically set in the asset master (which proves it is not a problem with the asset class).
I tried to set that depreciation area directly in the asset master data, but all the fields are blocked for changing (is it possible to change that?).
What should I do, as I need to post to that area?
Thank you,
Jean
OK Zaid, thanks!
But what treatment should I give to that asset? Because now it is not being evaluated in that area, but we need it to be.
The asset was created in the prior fiscal year.
Regards,
Jean -
How to trace master data source (table-field)?
Hello All,
Please help, it's urgent!
I have the following 4 characteristic requirements in BW from the PM/PS module.
Master data source (format: table-field)
AFVV-FSAVD - early start date
AFVV-FSEDD - early finish date
AFVV-SSAVD - late start date
AFVV-SSEDD - late finish date
Can anyone tell me the procedure to find them in R/3, and whether there is any standard Business Content for the same?
Thanks in advance,
JP
Go to the RSA5 transaction in R/3. Under the SAP R/3 application component you can find the PO/PS components, and under those the PO-IO/PS-IO components; here you can check all the available master data DataSources.
KJ!!!