ASCP Test for Load Legacy Data
We are testing a legacy data load into ASCP through Web ADI.
We planned to check more than one plan, but the results are not correct because msc_operation_components is not populated.
Question 1: Should the msc_operation_components table also be populated through the Web ADI upload of BOM_Component and Routing_Operation data, or is it produced by the Pre-Process Monitor program or some other concurrent program? Please tell us how it is produced.
Question 2: If we create the data collection flat-file data ourselves via self service, and there is no OA template to upload for msc_operation_components, what should we do?
I think you'll find the white paper published on MetaLink, our main avenue for publishing documentation and providing support information to customers.
<BLOCKQUOTE><font size="1" face="Verdana, Arial">quote:</font><HR>Originally posted by JohnWorthington:
Hello,
The 11.0.2 implementation doc mentions a technical "White Paper about the Data Pump" for loading legacy data into HRMS.
I cannot find this White Paper anywhere.
Does it exist?
What is the easiest way to load our legacy data into 11.0.3 HRMS?
Thanks,
John Worthington
[email protected]<HR></BLOCKQUOTE>
Similar Messages
-
Creating a report of all the errors that occurred while loading legacy data
Hi guys,
I am using a BAPI to load legacy data.
How can I list all the errors that occur during the transfer?
I want to see all the errors that occurred and create a report from them.
Thanks.

Hi, look at this code... you will get an idea:
* Declarations (BAPIRET2 is the standard BAPI return structure;
* the constants mirror the values used in the original post)
CONSTANTS: c_2  TYPE c VALUE '2',            " partner category
           c_rp TYPE c LENGTH 4 VALUE 'RP',  " partner group
           c_e  TYPE c VALUE 'E'.            " message type: error
DATA: it_return      TYPE STANDARD TABLE OF bapiret2,
      wa_return      TYPE bapiret2,
      wa_centraldata TYPE bapibus1006_central,
      w_partner      TYPE bapibus1006_head-bpartner.

CALL FUNCTION 'BAPI_BUPA_FS_CREATE_FROM_DATA2'
  EXPORTING
*   businesspartnerextern =
    partnercategory = c_2
    partnergroup    = c_rp
    centraldata     = wa_centraldata
  IMPORTING
    businesspartner = w_partner
  TABLES
    return          = it_return.

* Check for errors: read the first message of type 'E'
CLEAR wa_return.
READ TABLE it_return INTO wa_return WITH KEY type = c_e.
IF sy-subrc EQ 0.
  " Resolve the message variables into a readable text
  CALL FUNCTION 'FORMAT_MESSAGE'
    EXPORTING
      id        = wa_return-id
      lang      = sy-langu
      no        = wa_return-number
      v1        = wa_return-message_v1
      v2        = wa_return-message_v2
      v3        = wa_return-message_v3
      v4        = wa_return-message_v4
    IMPORTING
      msg       = wa_return-message
    EXCEPTIONS
      not_found = 1
      OTHERS    = 2.
ENDIF. " IF sy-subrc EQ 0 -
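Note that the READ above fetches only the first error. To build the report the question asks for, you can loop over the whole return table and format every error/abort message. A minimal sketch; ty_error and it_errors are illustrative names, not part of the original post:

```abap
* Collect every 'E' (error) and 'A' (abort) message from the BAPI
* return table into a simple list report.
TYPES: BEGIN OF ty_error,
         msgid   TYPE symsgid,
         msgno   TYPE symsgno,
         message TYPE bapi_msg,
       END OF ty_error.
DATA: it_errors TYPE STANDARD TABLE OF ty_error,
      wa_error  TYPE ty_error.

LOOP AT it_return INTO wa_return WHERE type = 'E' OR type = 'A'.
  " Resolve message variables into readable text
  CALL FUNCTION 'FORMAT_MESSAGE'
    EXPORTING
      id        = wa_return-id
      lang      = sy-langu
      no        = wa_return-number
      v1        = wa_return-message_v1
      v2        = wa_return-message_v2
      v3        = wa_return-message_v3
      v4        = wa_return-message_v4
    IMPORTING
      msg       = wa_return-message
    EXCEPTIONS
      not_found = 1
      OTHERS    = 2.
  wa_error-msgid   = wa_return-id.
  wa_error-msgno   = wa_return-number.
  wa_error-message = wa_return-message.
  APPEND wa_error TO it_errors.
ENDLOOP.

* Simple list output; an ALV grid would be an alternative
LOOP AT it_errors INTO wa_error.
  WRITE: / wa_error-msgid, wa_error-msgno, wa_error-message.
ENDLOOP.
```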
What are the steps for loading master data
Hello
What are the steps for loading master data? I want to learn about loading all master data and how to choose the best way to load the data.
If anyone has documents, please send them to me; I will be really grateful.
[email protected] thanks everyone
Evion

Hi Heng,
Download the data into a CSV file.
Write a program using GUI_UPLOAD to upload the CSV file and insert the records. Check the link below for an example:
http://www.sap-img.com/abap/vendor-master-upload-program.htm
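A minimal sketch of that approach, assuming a two-column CSV; the file path, structure, and field names here are illustrative, not taken from the linked program:

```abap
* Upload a CSV from the presentation server and split each line
* into fields of a typed internal table.
TYPES: BEGIN OF ty_vendor,
         lifnr TYPE c LENGTH 10,   " vendor number (illustrative)
         name1 TYPE c LENGTH 35,   " vendor name   (illustrative)
       END OF ty_vendor.
DATA: it_raw    TYPE STANDARD TABLE OF string,
      wa_raw    TYPE string,
      it_vendor TYPE STANDARD TABLE OF ty_vendor,
      wa_vendor TYPE ty_vendor.

CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename = 'C:\data\vendors.csv'   " illustrative path
    filetype = 'ASC'
  TABLES
    data_tab = it_raw
  EXCEPTIONS
    file_open_error = 1
    OTHERS          = 2.
IF sy-subrc <> 0.
  MESSAGE 'File upload failed' TYPE 'E'.
ENDIF.

* Split each comma-separated line into the target structure
LOOP AT it_raw INTO wa_raw.
  SPLIT wa_raw AT ',' INTO wa_vendor-lifnr wa_vendor-name1.
  APPEND wa_vendor TO it_vendor.
ENDLOOP.
```

From here the records in it_vendor can be posted with a BAPI or a batch input session, as described in the other replies.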
Reward Points for the useful solutions.
Regards,
Harini.S -
Please send detail steps for uploading legacy data
Hi friends,
please send detail steps for uploading legacy data
Thanking u in advance,
Diwa.

Hi, you can use LSMW to upload legacy data.
LSMW is used for migrating data from a legacy system to SAP system, or from one SAP system to another.
Apart from standard batch/direct input and recordings, BAPI and IDocs are available as additional import methods for processing the legacy data.
The LSMW comprises the following main steps:
Read data (legacy data in spreadsheet tables and/or sequential files).
Convert data (from the source into the target format).
Import data (to the database used by the R/3 application).
But, before these steps, you need to perform following steps :
Define source structure: the structure of the data in the source file.
Define target structure: the SAP structure that receives the data.
Field mapping: mapping between the source and target structures, with conversions if any.
Specify file: the location of the source file.
Of all the methods used for data migration like BDC, LSMW , Call Transaction which one is used most of the time?
How is the decision made which method should be followed? What is the procedure followed for this analysis?
All three methods are used to migrate data. The choice depends on the scenario and the amount of data to transfer. LSMW is a ready-made tool provided by SAP; you follow some 17 steps to migrate master data. With BDCs, the session method is the better choice because of some advantages over call transaction, but call transaction is also very useful for immediate updates of small amounts of data (with call transaction the developer has to handle errors himself).
So the bottom line is: choose among these methods based on the real-time requirements.
These methods are chosen completely based on the situation you are in. The direct input method is not available for every scenario; where it is available, it is the simplest one. In the batch input method, you need to record the transaction concerned. Similarly, IDoc and BAPI methods are available, and their use needs to be decided based on the requirement.
Go through some material on these four methods and implement them. You will then have a fair idea of when to use which.
LSMW Steps For Data Migration
How to develop a lsmw for data migration for va01 or xk01 transaction?
You can create lsmw for data migration as follows (using session method):
Example for xk01 (create vendor)
Initially there will be 20 steps, but after processing the first step it is reduced to 14 for the session method.
1. TCode : LSMW.
2. Enter Project name, sub project name and object name.
Execute.
3. Maintain object attributes.
Execute
select Batch Input recording
goto->Recording overview
create
recording name.
enter transaction code.
start recording
do the recording as per your choice.
save + back.
enter recording name in lsmw screen.
save + back
Now there will be 14 steps.
2. MAINTAIN SOURCE STRUCTURES.
Here you have to enter the name of internal table.
display change
create
save + back
3. MAINTAIN SOURCE FIELDS.
display change
select structure
source_fields->copy fields.
a dialog window will appear.
select -> from data file
apply source fields
enter No. of fields
length of fields
attach file
save + back
4. MAINTAIN STRUCTURE RELATIONS
display change
save + back
5. MAINTAIN FIELD MAPPING & CONVERSION RULES
display change
click on the source field, select the exact field from the structure, and press Enter
repeat these steps for all fields.
save+back
6. MAINTAIN FIXED VALUES, TRANSACTION, USER DEFINED
execute
save + back
7. SPECIFY FILES.
display change
click on legacy data
attach flat file
give description
select tabulator
enter
save + back
8. ASSIGN FILE
execute
display change
save + back
9. IMPORT DATA.
execute
display change
save + back
10. DISPLAY IMPORTED DATA
press Enter (OK); it will show the records only.
back
11. CONVERT DATA
execute
display change
save + back
12. DISPLAY CONVERTED DATA
execute
display change
save + back
13. CREATE BATCH INPUT SESSION
tick 'Keep batch input folder'
F8
back
14. RUN BATCH INPUT SESSION.
SM35 will open.
The object name will be shown there.
Select the object & process. -
DTP error: Lock NOT set for: Loading master data attributes
Hi,
I have a custom datasource from ECC which loads into 0CUST_SALES. I'm using a DTP & transformation which have worked for loading data into this infoobject in the past. The infopackage loads with a green status, but when I try to load the data, the DTP fails with the error message "Updating attributes for InfoObject 0CUST_SALES Processing Terminated", and the job log says: "Lock NOT set for: Loading master data attributes". I've tried reactivating everything but it didn't help. Does anyone know what this error means? We're on SP7. Thanks in advance!

Hello Catherine
I have had this problem in the past (3.0B). The reason was that our system was too slow and could not crunch the data fast enough, so the packets were locking each other.
The fix: load the data into the PSA only, and then send it in the background from the PSA to the infoobject. By doing this, only a background process runs, so locks cannot happen.
Fix #2: buy a faster server (by faster, I mean more CPU power).
Now, maybe you have another issue with NW2004s, this was only my 2 cents quick thought
Good luck!
Ioan -
Transaction type for the legacy data for Auc
Dear Experts,
When I use AS91 to load the legacy data for an AuC, the system issues the error 'Trans. type 900 can only be used for legacy assets from prev. years'.
The asset value date is 2011/04/05, the transaction type is 900, and the asset transfer date is 2011/04/25.
Which transaction type should I use for the AuC legacy data input? Can I choose another transaction type like 100?
Br
Sophie

Hi
100 is used for uploading non-AuC assets.
900 is used for AuC. As per the SAP recommendation, 01/01/20xx is to be used as the asset value date when uploading an AuC; I misplaced the SAP note which says so.
Normally, changing the asset value date impacts the depreciation calculation. Since an AuC has the 0000 depreciation key, it does not impact the depreciation calculation.
br, Ajay M -
Best Practices for Loading Master Data via a Process Chain
Currently, we load attributes, text, and hierarchies before loading the transactional data. We have one meta chain. To load the master data it is taking more than 2 hours. Most of the master data is full loads. We've noticed that a lot of the master data, especially text, has not changed or changed very little since we implemented 18 months ago. Is there a precedence or best practice to follow such as do we remove these processes from the chain? If so, how often should it be run? We would really like to reduce the amount of the master data loading time. Is there any documentation that I can refer to? What are other organizations doing to reduce the amount of time to load master data?
Thanks!
Debby

Hi Debby,
I assume you're loading master data from a BI system? The forums here are related to SAP NetWeaver MDM, so maybe you should ask this question in a BI forum.
Nevertheless, if your data doesn't change that much, maybe you could use a delta mechanism for extraction? This would send only the changed records instead of all the unchanged ones every time. But this depends on your master data and of course on your extractors.
Cheers
Michael -
Steps to reducing time for loading of data
Hi
Could anyone tell me how to reduce the time for loading records into a particular cube or ODS? For example, I am loading some 1 lakh (100,000) records and it takes about 5 hours. I want to reduce that to 3 or 4 hours. What are the very first steps to consider to make it faster?
Regards
Ajay

Hi Ajay,
Check the following.
1> Any routine you have in the transfer rules and update rules should not fire a database SELECT more than once in the same code.
2> Load master data before transaction data.
3> Reduce the data package size in the infopackage.
4> Delete old PSA data, because you may run into space issues while loading.
5> If you are loading into an ODS, remove the BEx flag in the ODS maintenance screen if you are not reporting on that ODS.
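Tip 1 can be illustrated with a sketch: instead of a SELECT inside the per-record loop of a transfer routine, buffer the lookup table once into a sorted internal table. Everything here is illustrative (zmaterial_text, the field names, and data_package / ty_record, which would come from the surrounding routine context):

```abap
* One database round trip per data package instead of one per record.
TYPES: BEGIN OF ty_text,
         matnr TYPE c LENGTH 18,
         descr TYPE c LENGTH 40,
       END OF ty_text.
DATA: it_text TYPE SORTED TABLE OF ty_text WITH UNIQUE KEY matnr,
      wa_text TYPE ty_text.
FIELD-SYMBOLS: <fs_record> TYPE ty_record. " row of the data package

* Buffer the lookup table once, before the per-record loop
SELECT matnr descr FROM zmaterial_text INTO TABLE it_text.

LOOP AT data_package ASSIGNING <fs_record>.
  " Binary-search read against the sorted buffer; no SELECT here
  READ TABLE it_text INTO wa_text
       WITH TABLE KEY matnr = <fs_record>-matnr.
  IF sy-subrc = 0.
    <fs_record>-descr = wa_text-descr.
  ENDIF.
ENDLOOP.
```

The sorted table with a unique key makes each READ a binary search, so the cost per record is logarithmic rather than a full database round trip.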
hope this will help you.
Suneel -
Help: Algorithm for Load File Data Block Hash?
Hi guys
I would like to understand the algorithm for a field of the INSTALL [for load] command: the "Load File Data Block Hash". I'm working with it in the JCOP tools + Eclipse.
(All I want to do is design a new INSTALL [for load] command, imitating the JCOP tools.)
Can I use the DES algorithm for this field?
Thanks in advance.

The Load File Data Block Hash is described in GlobalPlatform Card Specification v2.1.1, sections 6.7.6.1, 7.7.7 and C.2, all titled "Load File Data Block Hash".
The hash is a SHA-1 hash of the load file data block, so DES (a cipher, not a hash) is not what this field uses.
Cheers,
Shane -
Status for loading end date when GR
Hi! We are using shipments for imports. I have set the indicator 'Cumulate finish loading status from inbound deliv. sts gds rcpt completed' for the shipment type. But on receipting the inbound delivery, the status is still not set in the shipment created (the loading end date field stays empty).
Does anyone know what else needs to be done for the date to be populated?
Regards
SF

No answer
-
Lock NOT set for: Loading master data attributes error
Hi experts,
We were encountering this error when trying to load master data. When we checked the system we could not find any locks at the time, and activation or kicking off the attribute change run failed again. We finally solved the problem by running the function module RSDDS_AGGR_MOD_CLOSE, which sets the close flag to 'X' in table RSDDAGGRMODSTATE. I have read that this lock error can happen when two change runs execute at the same time.
My questions are:
1. Is it possible to find out which process exactly "caused" the lock? The table RSDDAGGRMODSTATE does not have a reference to any object or job. I am curious, as we are trying to find ways to avoid this in the future.
2. In our case, when we could not find any locks, is running this FM the only workaround? Is this a best practice?
mark
Message was edited by:
Mark Siongco

Hello Catherine
I have had this problem in the past (3.0B). The reason was that our system was too slow and could not crunch the data fast enough, so the packets were locking each other.
The fix: load the data into the PSA only, and then send it in the background from the PSA to the infoobject. By doing this, only a background process runs, so locks cannot happen.
Fix #2: buy a faster server (by faster, I mean more CPU power).
Now, maybe you have another issue with NW2004s; this was only my 2 cents quick thought.
Good luck!
Ioan -
Process chain for loading master data attributes
Dear Experts,
Can any one explain the steps needed to load master data into info object through process chain.
Thanks in advance
Santhosh

Please search the forum.
Edited by: Pravender on Jun 8, 2010 3:06 PM

In BI 7 this would be:
start
infopackage
dtp
attribute change run
M. -
SAP R3 and BI system requirement for loading Inventory data (0IC_C03)
Hi,
I have installed the Business Content for 0IC_C03, activated the required datasources in R/3, and replicated them in SAP BI.
However, while filling the setup tables the load fails with the error "No internal table space....".
This is related to memory issues in SAP R/3. Please let me know the minimum system requirements to do the above activity,
for example RAM, parameters to be set from a Basis point of view, etc. The backend is MSSQL 2003 with 20 GB of RAM. There are around one crore (10 million) records.
Please let me know as soon as possible.
Regards
Deepak.

Thank you Murali.
But can you tell me approximately what the memory requirement is for SAP R/3 and SAP BI for this type of activity?
Regards
Deepak. -
Error when scheduling the infopackage for loading Master data attributes
Hi,
I am getting the following error message when scheduling this master data attributes infopackage ZIP_0PLANT_ATTR_FULL (flexible update of master data infoobjects).
In the data load monitor I got the following error message:
Error message when processing in the Business Warehouse
Diagnosis
An error occurred in the SAP BW when processing the data. The error is documented in an error message.
System response
A caller 01, 02 or equal to or greater than 20 contains an error message.
Further analysis:
The error message(s) was (were) sent by:
Update rules
Thanks

Hi,
"A caller 01, 02 or equal to or greater than 20 contains an error message" is an IDoc error. Please check the IDocs:
1) SM37 job log (in the source system if the load is from R/3, or in BW if it's a datamart load): give the request name and it should give you the details about the request. If it's active, make sure that the job log is getting updated at frequent intervals.
Also see if there is any 'sysfail' for any datapacket in SM37.
2) SM66: get the job details (server name, PID etc. from SM37) and see in SM66 if the job is running or not (in the source system if the load is from R/3, or in BW if it's a datamart load). See if it's accessing/updating some tables or not doing anything at all.
3) RSMO: see what is available in the details tab. It may be in the update rules.
4) ST22: check if any short dump has occurred (in the source system if the load is from R/3, or in BW if it's a datamart load).
5) Check in SM58 and BD87 for pending tRFCs and IDOCS.
Once you identify you can rectify the error.
If all the records are in the PSA, you can pull them from the PSA to the target. Otherwise you may have to pull them again from the source infoprovider.
If it's running and you can see it active in SM66, you can wait for some time to let it finish. You can also try SM50 / SM51 to see what is happening at the system level, like reading from or inserting into tables.
If you feel it's active and running, you can verify by checking whether the number of records in the data tables has increased.
SM21 - System log can also be helpful.
Thanks,
Shambhu -
Dispatch Event for loaded Netconnection datas
Hi All!
I made a manager component to retrieve data from a database through amfphp.
When the data is loaded, at the end of the "onResult" function I dispatch a "contactLoader" event.
Another parent MXML component has an instance of my manager component with a call to a function contactLoader="getData()", and this function in turn calls the getContacts() function of the instantiated manager component to retrieve the data. Thus I am sure that the data is correctly loaded by my manager before retrieving the Array.
But I would like to refresh this array in case any change has been made in the database. The problem is that the listener is created automatically by the instance of my manager component, and "contactLoader" already exists once the data has been loaded once. I need a way to reload the data in the manager, create a new event once the data is downloaded, and then let my parent component call the getContacts function one more time.
Is this the right way to do it? I'm not skilled enough to solve this problem. Thanks for your help.

"badoumba" <[email protected]> wrote in
message
news:gk57nf$oak$[email protected]..
I'm not sure what the issue is... you can fire a new contactLoader event whenever you need to.