Master Data/Transactional Data Loading Sequence
I am having trouble understanding the need to load master data prior to transactional data. If you load transactional data and there is no supporting master data, are the SIDs established when you subsequently load the master data, or will they not sync up?
I feel that in order to do a complete reload of new master data, I need to delete the data from the cubes, reload the master data, then reload the transactional data. However, I can't explain why I think this.
Thanks, Keith
A different approach is required for different data target scenarios. Below are just two scenarios out of many possibilities.
Scenario A:
The data target is a DataStore Object with the indicator 'SIDs Generation upon Activation' set in the DSO maintenance.
Data is loaded using a DTP.
The following applies depending on the indicator 'No Update without Master Data' in DTP:
- If the indicator is set, the system terminates activation if master data is missing and produces an error message.
- If the indicator is not set, the system generates any missing SID values during activation.
Scenario B:
The data target has a characteristic that is determined via transformation rules/update rules by reading master data attributes.
If the attribute is not available during the data load to the data target, the system writes an initial value to the characteristic.
When you reload the master data with attributes later, you need to delete the previous transactional data load and reload it, so that the transformation can re-determine the attribute values written to the characteristics in the data target.
Hope this helps you understand.
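The behavior in both scenarios can be illustrated with a minimal sketch in plain Python (not SAP code; the table and function names here are hypothetical):

```python
# Plain-Python sketch of the behavior described above. A SID is
# created the first time a characteristic value is seen, whether it
# arrives with transactional or master data, so SIDs stay in sync;
# but an attribute derived in the transformation stays initial
# until the transaction data is reloaded.

sid_table = {}        # characteristic value -> SID
attr_table = {}       # characteristic value -> master data attribute
next_sid = [1]

def get_or_create_sid(value):
    if value not in sid_table:
        sid_table[value] = next_sid[0]
        next_sid[0] += 1
    return sid_table[value]

def load_transaction(value):
    # The transformation reads the attribute at load time; if the
    # master data is missing, the initial value ('') is written.
    return {"sid": get_or_create_sid(value),
            "derived_attr": attr_table.get(value, "")}

rec = load_transaction("MAT1")   # transactional data first: attr is ''
attr_table["MAT1"] = "GROUP_A"   # master data arrives later
reloaded = load_transaction("MAT1")
# reloaded reuses the same SID, and only now picks up 'GROUP_A'
```

This is why SIDs do sync up on their own, while Scenario B still requires deleting and reloading the transactional data.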
Similar Messages
-
We are preparing to load our master data, then our cubes.
My question is this: is there a way to identify the master data objects that are dependent on the InfoObject to be loaded?
Example: oprojects should be loaded before owbs_elemt, etc.?
We want to make sure all higher-level master data objects are loaded first.
Is there a way to see these dependencies?
I think the order might be important in case you are loading from a source system other than R/3, where the checking is not as good as in R/3.
If it's R/3, I think it should be fine.
Robert, correct me if I am wrong. What kind of dependency and order do you think is important? Kindly let us know.
thank you
Gokul -
How to rectify the errors in master data loads & transactional data loads?
Hi,
Can anyone please tell me how to rectify the errors in master data loads & transactional data loads?
Thank you,
Ravi
Hi,
Please post specific questions in the forum.
Please explain the error you are getting.
-Vikram -
Dear Experts,
If somebody can help me with the following case, please give me a solution. I'm working on a BI 7.0 project where I needed to delete master data for an InfoObject material. The way I did this was through tcode "S14". After that, I tried to load the master data again, but the process broke and only half the data was loaded.
This it is the error:
Second attempt to write record 'YY99993' to /BIC/PYYYY00006 failed
Message no. RSDMD218
Diagnosis
During the master data update, the master data tables are read to determine which records of the data package that was passed have to be inserted, updated, or modified. Some records are inserted in the master data table by a concurrently running request between reading the tables at the start of package processing and the actual record insertion at the end of package processing.
The master data update tries to overwrite the records inserted by the concurrently running process, but the database record modification returns an unexpected error.
Procedure
- Check whether the values of the master data record with the key specified in this message are updated correctly.
- Run the RSRV master data test "Time Overlaps of Load Requests" and enter the current request to analyze which requests are running concurrently and may have affected the master data update process.
- Re-schedule the master data load process to avoid such situations in the future.
- Read SAP Note 668466 for more information about master data update scheduling.
On the other hand, the SID table of the master data is empty.
Thanks for your help!
Luis
Dear Daya,
Thanks for your help; I applied your suggestion. I sent a message to OSS with the following details:
We are on BI 7.0 (system ID DXX)
While loading master data for InfoObject XXXX00001 (the main characteristic in our system, like material) we are facing the following error:
Yellow warning: "Second attempt to write record '2.347.263' to /BIC/XXXX00001 was successful"
We are loading the Master data from data source ZD_BW_XXXXXXX (from APO system) through the DTP ZD_BW_XXXXX / XXX130 -> XXXX00001
The Master Data tables (S, P, X) are not updated properly.
The following repairing actions have been taken so far:
1. Delete all related transactional and master data, checking all relations (tcode SLG1 -> RSDMD, MD_DEL)
2. Follow instructions from OSS 632931 (tcode RSRV)
3. Run report RSDMD_CHECKPRG_ALL from tcode SE38 (using both check and repair options).
After deleting all data, the previous tests were OK, but once we load new master data, the same problem appears again, and the report RSDMD_CHECKPRG_ALL gives the following error:
"Characteristic XXXX00001: error found during this test."
The RSRV check "Compare sizes of P and X and/or Q and Y tables for characteristic XXXX00001" is shown below:
Characteristic XXXX00001: tables /BIC/PXXXX00001 and /BIC/XXXXX00001 are not consistent (351.196 deviations).
It seems that our problem is described in OSS 1143433 (SP13), even if we already are in SP16.
Could somebody please help us, and let us know how to solve the problem?
Thanks for all,
Luis -
Error in Setting lock for Master Data Load
Hi Team,
I encountered the following error while uploading master data:
"Lock NOT set for: Loading master data attributes
Attributes of characteristic 0BPARTNER are locked by terminated change run 465451241"
I performed the following steps:
1) Checked whether any active jobs (change run jobs) were running in the system. No jobs were running.
2) RSATTR -> InfoObject list -> no InfoObject found.
3) Tried again after some time, but the load continues to fail.
Could anyone help solve the problem?
thanks
Bala
Hi Bala,
Check in SM12 whether any locks exist with that change run number.
First check whether any related master data loads are running.
Check in the attribute change run screen whether any change run steps are running.
If, after checking the above, no locks exist and no master data load is running, then follow the steps below:
1. Run RSA1 transaction
2. Tools -> Apply Hierarchy/Attributes Change... menu option (or direct transaction RSATTR)
3. Click at "Monitor and Start terminated Change Runs" button (at the bottom of the screen)
4. Click the "Reset Status" button at Run ID 465451241.
NOTE: this is not recommended; rather than forcing the locks open, it is better to wait until they are released.
If the error message still persists, then in serious cases we can use a function module in SE37, which is not recommended by SAP:
run RSDDS_AGGR_MOD_CLOSE in SE37 and execute; there, enter your failed change run number (465451241) next to I_CNSID and execute.
It releases all the locks held by that change run, and you can continue with your further work/jobs.
Regards
KP -
R/3 Master Data Load Error
Hi Experts
Our master data load from R/3 to BW failed today.
This is the error message on the Status tab of the failed request:
Diagnosis
No IDocs could be sent to the SAP BW using RFC.
System response
There are IDocs in the source system ALE outbox that did not arrive in the ALE inbox of the SAP BW.
Further analysis:
Check the TRFC log.
You can get to this log using the wizard or the menu path "Environment -> Transact. RFC -> In source system".
Removing errors:
If the TRFC is incorrect, check whether the source system is completely connected to the SAP BW. Check especially the authorizations of the background user in the source system.
I navigated through Environment -> Transact. RFC -> In source system.
In R/3 I got the following error:
timeout during allocate / CPIC-CALL: 'ThS
#Timeout during connect
How shall I proceed now?
Thanks
Hi,
Check the RFC connection between R3 and BW. Use transaction SM59 for the same or take help from the BASIS team.
If that is correct, check whether the EDI partner profiles are active and functioning properly. You can take BASIS help here.
Cheers,
Kedar -
Master Data Loaded but not able to see.
I loaded the master data successfully and can see it in the P and M tables, but not when going through Manage -> Contents at the InfoProvider level.
Many requests are available, showing the number of records transferred and updated.
What could be the possible reason?
Hi Sarabjit,
Welcome to SDN!!
You need to load the transaction data into the InfoProvider to see the contents, not just the master data.
If you have done the transaction load, then you need to apply the hierarchy/attribute change for the master data loaded. Go to RSA1 -> Tools (menu) -> Apply Hierarchy/Attribute Change -> select your InfoObject and execute.
Bye
Dinesh -
0distr_chan initial master data load, text missing
Hey: I am trying to do an initial master data load of texts for 0distr_chan. The load is not successful. In the monitor's detailed error messages there are red alerts: a 'transfer rules missing' message and an 'update missing' message.
The 0distr_chan attribute load is OK.
I am also having problems when loading texts for 0salesorg; that load fails as well.
Can anyone give me a clue?
Thanks!
Hi, thank you for your answer. I am new to this system. Please give me more guidance.
To check the DataSource on R/3, I use transaction RSA6; underneath the SD-IO master data, I found 0DISTR_CHAN_TEXT with a green text icon on the right. Is this a valid way to verify that the DataSource for 0DISTR_CHAN_TEXT has been activated OK on R/3? I am not sure whether there is any way for me to check that the transfer rules for the DataSource have been activated OK on R/3.
thanks! -
Hi All,
I would like to know the better master data strategy.
When using the 3.x master data load strategy I can load directly into the InfoObject without going through the additional DTP process. Are there any specific advantages to the DTP approach, besides it being the 7.x way?
I have been using NetWeaver 2004s since 2005, but in most implementations we used the 3.x methodology for master data loads and DTPs for transaction loads. I would like to know whether SAP recommends the new methodology using DTPs for master data loads. If I load my master data using 3.x I can avoid one extra step, but will this option be discontinued in the future, forcing the use of DTPs even for this?
Please advise if you know the best way forward, strategically and technically.
Thanks,
Alex.
Alex,
Please read my answer:
the new data flow designed by SAP uses the DTP, even for master data. Right now you can use the 3.x style, which is maintained for backward compatibility, but down the road it will eventually be dropped; so looking ahead, technically and strategically, the right way to go is using DTPs.
You can go here http://help.sap.com/saphelp_nw70/helpdata/en/e3/e60138fede083de10000009b38f8cf/frameset.htm and check under Data Warehousing, Data Distribution, Data Transfer Process...
You could also open an OSS message to SAP and ask them directly.
Thanks,
Luis -
BI Admin process chains: content master data load has failed
Hi Experts,
I worked on the BI Admin Cockpit set-up and everything was going fine, but the content master data load (attribute) failed, and I got errors like:
1) Data records for package 1 selected in PSA - 2 errors
2) Error 4 in the update
Can anyone suggest how I can solve the issue?
regards
Mrudula
Hi,
Yes, I got 2 errors in the PSA.
When I click the 1st error I get:
Data record 10023 & with the key
'EBTCLNT100ACGRAZ_BWRPRT_USR_FBW_Ptest0001 &' is invalid in value
'EBTCLNT100ACGRAZ_BWRPRT_USR_FBW_attribute/characteristic 0TCTBWOBJECT &.
And it gave the procedure as:
if this message appears during a data load, maintain the attribute in the PSA maintenance screens; if this message appears in the master data maintenance screens, leave the transaction and call it again. This allows you to maintain your master data.
When I click the 2nd error, it shows data record 10024 and the remaining explanation is the same as the 1st one.
I do not understand how to proceed with the solution.
Please help me, anyone.
Thanks and regards
Mrudula
Edited by: mrudularajesh on Jul 1, 2009 7:39 AM -
Master Data Loading for Prices and Conditions in CRM - "/SAPCND/GCM"
Hi,
Could anyone give me some inputs on master data loading for prices and conditions in CRM?
The t-code is /SAPCND/GCM.
I need to load data from a file (extracted from 4.6) for service contracts.
I tried LSMW: recording does not work for this transaction.
I am also trying to load through IDocs (LSMW), but that too is not really working.
Do we require some custom development for this, or is some standard SAP functionality available?
Can anyone provide some valuable inputs on this?
I would appreciate your responses.
Hi Tiest,
Thanks for responding.
You are right, our client is upgrading from 4.6 to ECC.
So as per the client's requirements, we are maintaining all the configuration for services in CRM.
The services data that was in 4.6 is being pulled out into flat files, which need to be loaded into CRM, so middleware would not be able to do this.
What I am looking for is some standard upload program.
LSMW recording does not work.
With the IDoc "CRMXIF_COND_REC_SLIM_SAVE_M" I am able to load a single record, but I am not able to find out how to make this work for multiple entries.
In the standard approach for loading master data through IDocs, we map the values to the standard fields available in that IDoc.
But in this particular IDoc there is a common field for which I need to define a field name and a field value.
Till now, I am only able to define just one field name and one field value.
I want this to work for multiple entries.
Hope you get my point.
Thanks
Thanx -
Regarding Master Data Loading by using Process Chain
Hi,
Can anyone tell me the step-by-step process, and which processes and process types are used, for loading master data and transaction data into an ODS and into an InfoCube?
I will assign maximum points.
Bye,
Rizwan
Hi Mohammad,
1. Master Data loading:
http://help.sap.com/saphelp_nw04/helpdata/en/3d/320e3d89195c59e10000000a114084/content.htm
2. Transactional data to ODS:
http://help.sap.com/saphelp_nw04/helpdata/en/ce/dc87c1d711f846b34e0e42ede5ebb7/content.htm
3. Transactional data to CUBE:
http://help.sap.com/saphelp_nw04/helpdata/en/d5/e80d3dbd82f72ce10000000a114084/content.htm
Hope it Helps
Srini -
How do we improve master data load performance
Hi Experts,
Could you please tell me how to identify master data load performance problems, and what can be done to improve master data load performance?
Thanks in Advance.
Nitya
Hi,
- The ALPHA conversion is defined at the InfoObject level for objects with data type CHAR.
A characteristic in SAP NetWeaver BI can use a conversion routine such as the one called ALPHA. A conversion routine converts data that a user enters (in so-called external format) into an internal format before it is stored in the database.
The most important conversion routine, due to its common use, is the ALPHA routine, which converts purely numeric user input like '4711' into '004711' (assuming that the characteristic value is 6 characters long). If a value is not purely numeric, like '4711A', it is left unchanged.
We have found that in customers' systems there are quite often characteristics using a conversion routine like ALPHA that have values in the database which are not in internal format; e.g. one might find '4711' instead of '004711' in the database. There might even also be a value '04711', or ' 4711' (leading space).
This can result in data inconsistencies, also for query selection; i.e. if you select '4711', this is converted into '004711', so '04711' won't be selected.
- The check for referential integrity occurs for transaction data and for master data if they are flexibly updated. You determine the valid InfoObject values.
- SID generation is a must when loading transaction data with respect to master data, in order to call up master data at the BEx level.
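As an illustration only (plain Python, not the actual SAP routine), the ALPHA input conversion described above could be sketched as:

```python
def alpha_input(value: str, length: int = 6) -> str:
    """Sketch of the ALPHA input conversion: purely numeric input is
    right-aligned and padded with leading zeros to the output length;
    non-numeric input is left unchanged."""
    stripped = value.strip()
    if stripped.isdigit():
        return stripped.zfill(length)
    return value

alpha_input("4711")   # -> '004711'
alpha_input("4711A")  # -> '4711A' (left unchanged)
```

A value like '04711' passed through this conversion becomes '004711', which is why mixed internal/external values in the database lead to the selection mismatches described above.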
Regards,
rvc -
Master data loading failed: error "Update mode R is not supported by th
Hello Experts,
I load master data for 0Customer_Attr through a daily process chain, and it was running successfully.
For the last 2 days the master data load for 0Customer_Attr has failed with the following error message:
"Update mode R is not supported by the extraction API"
Can anyone tell me what that error means and how to resolve this issue?
Regards,
Nirav
Hi,
The update mode R error occurs in the following case:
you are running a delta (for master data) which fails due to some error. To resolve that error, you set the load to red and try to repeat the load.
This time the load fails with update mode R,
as repeat delta is not supported.
So now the only thing you can do is re-init the delta (as described in the posts above) and then proceed. The earlier problem has nothing to do with update mode R.
For example, your first delta failed with a replication issue;
replicating and repeating alone will not solve the update mode R error.
You have to do both: replicate the DataSource and re-init to clear the update mode R error.
One more thing I would like to add:
if the delta that failed the first time (not with update mode R) had picked up records, you have to do an init with data transfer;
if it failed without picking any records,
then do an init without data transfer.
Hope this helps
Regards
Shilpa
Edited by: Shilpa Vinayak on Oct 14, 2008 12:48 PM -
Hi All,
Could anybody clearly explain the following options in the InfoPackage of a master data InfoObject with full update:
Update data:
1. PSA and then into InfoObjects
2. Only PSA (update subsequently in data targets)
DataSource transfers no duplicate records
Ignore double data records
While doing the master data load with options 1 and 3 I get the same number of records updated in the PSA, but the load fails with the 1st option (Caller 01 02 .. 20 messages), and the request with the 3rd option is successful.
Thanks in Advance...
Regards,
Nagamani
Hi,
Your master data might have duplicate records. Select the PSA and select the option 'Ignore duplicate data records'.
If you have duplicates, e.g.:
CNO
1010
1020
1010
the second 1010 record is not allowed through.
Regards -
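As a plain-Python illustration (not BW code) of what the 'Ignore duplicate data records' option effectively does within a data package: duplicates are skipped instead of failing the load (which occurrence actually wins in BW may differ).

```python
def drop_duplicates(records, key="CNO"):
    """Keep only the first record per key value; later duplicates
    are silently ignored rather than raising an error."""
    seen = set()
    kept = []
    for rec in records:
        if rec[key] in seen:
            continue      # duplicate key: ignored
        seen.add(rec[key])
        kept.append(rec)
    return kept

package = [{"CNO": "1010"}, {"CNO": "1020"}, {"CNO": "1010"}]
deduped = drop_duplicates(package)   # keeps one 1010 and one 1020
```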
Master Data loading in BPC 7.5 with prefix
Hi Gurus,
I need help with master data loading in BPC 7.5.
Currently we load GL accounts with a prefix of A and the controlling area (COA1):
BI: 0000600000
BPC: ACOA10000600000
I have maintained the following logic in the transformation file; Z001 represents the hierarchy node.
ID=IF(ID(1:4)=STR(Z001) then STR(A)+ID;STR(A)0CO_AREAID)
Now the client has come up with a new requirement, as below:
BI -> BPC
GL Acc: 0000600000 -> A_600000
GL Node: Z00113 -> Z001_13
GL Node: Z00213 -> Z002_13
For the above requirement I maintained the logic in the transformation file as below; when running the master data load package I got the error:
Line 755: Command failed; end position is out of record index
ID=IF(ID(1:4)=STR(Z001) then STR(Z001_)+ID(5:12);STR(A_)+ID(5:12))
I need your help in modifying the logic in the transformation file to handle the above requirement,
and also to handle the two hierarchies Z001 and Z002.
Thanks
Mahesh
Hi,
Your command should look like:
ID=*IF(ID(1:4)=*STR(Z001) then *STR(Z001_)+ID(5:6);*STR(A_)+ID(5:10))
Hope this helps.
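BPC transformation files use their own *IF/*STR syntax, but the intended mapping (hierarchy nodes like Z00113 get an underscore after the 4-character prefix, base members get an A_ prefix on the trailing digits) can be sketched in plain Python for clarity; `map_id` is a hypothetical helper, not BPC code:

```python
def map_id(bi_id: str) -> str:
    """Sketch of the requested BI -> BPC ID mapping."""
    prefix = bi_id[:4]
    if prefix in ("Z001", "Z002"):          # hierarchy nodes
        return prefix + "_" + bi_id[4:]     # Z00113 -> Z001_13
    return "A_" + bi_id[4:]                 # 0000600000 -> A_600000

map_id("Z00113")       # -> 'Z001_13'
map_id("Z00213")       # -> 'Z002_13'
map_id("0000600000")   # -> 'A_600000'
```

This suggests that in the transformation file itself the Z002 case needs its own branch (e.g. a nested *IF) before the A_ fallback, since the suggested command above only tests for Z001.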