Transfer Master Data Rate Routing
Hello Gurus,
We need to transfer master data (rate routings) with ALE. We have already configured the ALE connection. We need to know which process code to use in the inbound table.
Can anyone help us, please?
Thanks in advance.
Best Regards.
Check this link
http://help.sap.com/saphelp_47x200/helpdata/en/92/58d455417011d189ec0000e81ddfac/frameset.htm
For BOMs and materials there are transaction codes... but I could not find any for work centers and routings...
Similar Messages
-
Please specify Transaction code to load SCM ROUTE MASTER DATA (EWM Routes)
Regards
Dhirendra
Moderator message - Moved to correct forum
Edited by: Rob Burbank on Apr 21, 2009 8:58 AM
Route definition is done in 0VTC. You have to define the route with its description, transit time, factory calendar, carrier info, etc. It is stored in the TVRO table.
Then you maintain the route determination in 0VRF.
Route Determination:
Departure zone of the shipping point + destination zone (transportation zone) of the ship-to party + transportation group + shipping condition.
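As a rough illustration, the determination above behaves like a keyed lookup (plain Python sketch, not SAP code; the zone and route values below are invented):

```python
# Illustrative sketch of SD route determination: the route is found from
# the four-part key described above. All values here are made-up examples;
# in SAP the entries live in table TROLZ.
ROUTE_TABLE = {
    # (departure zone, destination zone, transportation group, shipping cond.)
    ("DE000010000", "US000010000", "0001", "01"): "R00001",
    ("DE000010000", "FR000020000", "0001", "02"): "R00002",
}

def determine_route(dep_zone, dest_zone, trans_group, ship_cond):
    """All four key fields must match, otherwise no route is determined."""
    return ROUTE_TABLE.get((dep_zone, dest_zone, trans_group, ship_cond))
```

If no entry matches the full key, no route is proposed on the document, which mirrors how a missing TROLZ entry surfaces in SD.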
Route determination details are stored in TROLZ table. -
Transfer master data from a project from one client to another client
Hi all,
I have a training client and a production client. The project master data has been created (Tcode CJ01) in training client with multiple levels and WBS. Is there a way to transfer the master data from the training client to another client so there will be no need to create the same project in another client?
Thanks.
Refer to SAP Note 37899 for the batch input option.
Though this pertains to older versions, with a few changes to the code it might be suitable for the version you are working on.
Regards
Sreenivas -
Transfer master data such as contacts and products from SRM to MDM catalog
Hi! We have a client who is not using PI as their middleware. They are using SRM 7 with SRM-MDM catalogue. My question is: do we definitely need PI in order to distribute product master and contract data to the catalogue? When programs such as BBP_CCM_TRANSFER_CATALOG and MECCM are run, is the data passed on to PI automatically? Or do they just generate an XML file in a designated location which we then use to upload to SRM-MDM Catalog? In this case, I suspect we can use other middleware to handle the upload into the catalog?
Appreciate clarification on the above.
Thanks!
SF
Hi,
PI is required. Mapping is executed in PI.
If you do not want to have PI, you can develop custom extractor and mapping.
Regards,
Masa -
Cannot transfer master data to EWM server
Hi:
I am posting this in this forum because SNC will have similar configuration where SNC can run on a different server and other SCM applications can run on the server on which live cache is there. So, if this is the case, please share your experience.
The configuration we have in the development box is to run GATP, SPP, and liveCache on one server and EWM on the other (both are SCM servers). For the purpose of this discussion, let us call the server meant for SPP/GATP/liveCache the SCM server and the one meant for EWM the EWM server (although both are SCM servers). The various deployment options are described in SAP Note 1606493, and the configuration we have chosen is considered the standard deployment option.
We created our first integration model and tried to CIF plants to EWM from the ECC server and got the error "Version 000 not found". The same error was reported in the thread linked below, where one person suggested running the report /SAPAPO/VERSION_CREATE_NO_APO on the EWM server. That seems to be nothing more than a backdoor way of creating the version, rather than removing the need for it.
Version 000 not found error message - /SAPAPO/LOM009
I have the following questions:
1. Do we need to install live cache instance on the EWM server also?
2. Suppose we install the liveCache instance on the EWM server. One then has to think about the information flow in the three-system landscape: the EWM server does the goods receipt, etc., and this information will go to the ECC server and from there to the SCM server. Any thoughts you may have on this will be appreciated.
Thanks,
Satish
You'll need to push your data from your LabVIEW application to a server:
- HTTP Client VIs
- TCP/IP
- WebSockets
Server could have a LV application on it, or you could use a web server with a scripting language (e.g. PHP/ASP/Node.js).
You'll need to write something on the server to listen to your LabVIEW application and hold the data:
- Database
- File
- Memory
You'll need to write some interface/API for pushing/retrieving the data - e.g. JSON/POST/XML - take a look at 'RESTful APIs'.
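As a minimal sketch of the server-side piece (Python here purely for illustration; the thread equally suggests PHP/ASP/Node.js, and the endpoint semantics below are assumptions, not from the post):

```python
# Minimal sketch of the server-side store-and-retrieve logic behind a
# RESTful endpoint. Transport (HTTP client VIs, TCP/IP, WebSockets) is
# omitted; the payload fields are invented examples.
import json

STORE = []  # in-memory store; a real server would use a database or file

def handle_post(body):
    """POST handler: accept a JSON measurement from the LabVIEW client."""
    STORE.append(json.loads(body))
    return 201  # Created

def handle_get():
    """GET handler: return everything collected so far as JSON."""
    return 200, json.dumps(STORE)
```

A real deployment would wire these handlers to HTTP routes (e.g. a hypothetical POST /data and GET /data) and swap the in-memory list for one of the storage options listed above.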
The choice between these depends on how 'real-time' you need to be: how often you want to update the data, the acceptable latency, etc.
If you don't want to do all of this yourself, there are 3rd party providers that can store this sort of data for you - there's a free (but limited) one run by SparkFun - https://data.sparkfun.com/ but I'm sure there are other services.
Certified LabVIEW Architect, Certified TestStand Developer
NI Days (and A&DF): 2010, 2011, 2013, 2014
NI Week: 2012, 2014
Knowledgeable in all things Giant Tetris and WebSockets -
Master data transfer using CIF
Hi all,
I am just clueless. I am trying to transfer master data from ERP to EWM but it ain't happening. It is throwing out this error. System: P6QCLNT800 User: RFC_USER_1 12/11/2010
Function/Q/SAPAPO/CIF_LOC_INBOUND
Text: Internal error in IGS2, US. RFC: Connection Error.
Here P6QCLNT800 is the target system and RFC destination and also the logical system for SCM EWM. User is the RFC user.
All my settings and configs are right, at least as per the SAP documentation for integrating EWM as an add-on. I tried again after disabling geocoders for the US using the GEOCODERS report, but I still get the error. I have checked my RFC destination and tested the connection; it works fine in SM59. What should I do to proceed further?
I posted this in the SCM LE forum, but I guess this post is more suitable here.
This document tells me to create an RFC user with user type System, while I have created one with type Communications.
This shouldn't make much difference. When I am debugging a new implementation, I make the user ID a Dialog user with SAP_ALL until all connectivity problems are resolved, then change it to System (I have never used Communications, although I do not doubt that it works) and take away the excess authorization. From the sound of it, the type of user ID may not be your problem.
A common problem is the case sensitivity of passwords. You usually get some kind of logon error message, though (with standalone SCM, anyway). When your user ID is a Dialog user, you can execute SM59 and perform a remote logon; that will tell you whether logon issues are still a problem. I always set the passwords in my destinations to all uppercase, since the CIF seems to capitalize passwords before sending them across.
does your document fit my case?
Not exactly. Unfortunately, I have never set up EWM as an addon, so I can't speak from experience. The instructions in your doc seem to mostly parallel the standard SCM connectivity docs though, so most of the steps are probably the same.
Best Regards,
DB49 -
Creation of Vendor Master Data
Hi,
In my company they are currently using this process to create Vendor Master Data: The Vendor Master Data is created in the Production Server and refreshed to Development Server and to Quality Server, using ALE.
Is this the right process? If not can you let me know the steps to create Vendor Master Data.
regards,
raj
Hi Raj,
ALE and IDocs are the best way to transfer master data between SAP systems. They have excellent error handling and reprocessing capabilities (recommended by SAP).
In order to automate the vendor master data transfer, you need to do the following steps.
1. Create Logical System (if not available) for PROD, QA and DEV system.
2. Create RFC Destination (SM59) from PROD to QA and DEV system.
3. Create Distribution Model from PROD to QA/DEV (BD64) for message type CREMAS (Vendor Master).
4. Create Partner Profile (WE20) for message type CREMAS in PROD, QA and DEV system.
5. Execute transaction code BD14 (Send Vendor) from PROD to QA and DEV system.
6. To monitor the IDoc, you can use transaction code WE02.
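For orientation, the control record of each CREMAS IDoc produced by step 5 carries the routing information set up in steps 1-4. A rough sketch of its shape (plain Python, not ABAP; the logical system names and basic type are examples, not from the post):

```python
# Sketch of an IDoc control record (EDIDC structure) for a CREMAS vendor
# master message. Field names follow EDIDC; the values are illustrative.
def cremas_control_record(sender_ls, receiver_ls, basic_type="CREMAS05"):
    return {
        "MESTYP": "CREMAS",                     # message type: vendor master
        "IDOCTP": basic_type,                   # IDoc basic type
        "SNDPRT": "LS", "SNDPRN": sender_ls,    # sender: logical system
        "RCVPRT": "LS", "RCVPRN": receiver_ls,  # receiver: logical system
        "DIRECT": "1",                          # 1 = outbound
    }

record = cremas_control_record("PRDCLNT100", "QASCLNT200")
```

In WE02 (step 6), these are the fields to check first when an IDoc does not arrive: partner numbers, message type, and direction.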
If you are not familiar with the ALE/IDoc setup, please work with your technical folks.
Hope this will help and give an idea.
Regards,
Ferry Lianto -
TDMS HCM data transfer - no data transferred to receiver and no errors
Hi,
I am experiencing problems with a "Package for HCM Personnel Dev. PD & PA Receiver" for my client, who has approximately 800 employees.
I have no errors in any of the phases, and I can see the TDMS SEND and RECEIVER jobs executing during the data transfer. However, the data transfer phase executes for approximately 1-2 minutes, and when I check the RECEIVER, the tables (such as HRP1001) are empty.
There are 42 tables selected, and CNVHCM_TR_TAB contains 1533 entries.
Below is a sample from the 'Transfer Selection Criteria' phase:
Transfer Target - System ECR / Client 210
The Transfer Program ran in PA Delete Mode
The Transfer Program ran in PD Delete Mode
Object Type: C // Total number: 5
Object Type: O // Total number: 5
Object Type: P // Total number: 28
Object Type: S // Total number: 31
Table HRITABNR - Time(ms) 10 - Entries 3
Table HRP1000 - Time(ms) 57 - Entries 65
Table HRP1001 - Time(ms) 39 - Entries 158
Table HRP1007 - Time(ms) 38 - Entries 5
Table HRP1018 - Time(ms) 39 - Entries 3
Table HRT1018 - Time(ms) 17 - Entries 3
Table PA0000 - Time(ms) 53 - Entries 47
Table PA0001 - Time(ms) 83 - Entries 76
So I would reasonably assume that if there are no errors anywhere in the Sender, Receiver, or Central system, table HRP1001 would contain an extra 158 entries and PA0001 would contain 76 entries. This is not the case: PA0001 is empty, and HRP1001 contains the same number of rows as before the data transfer executed.
Has anyone come across this situation before, or do you have suggestions for troubleshooting why no data is transferred when all the phases are green?
Any assistance is appreciated. Regards, Sheryl.
Hi Poonam,
The reason there are no extra PA* tables above is that this was just a short snapshot of the tables; those tables are included.
I have selected the following for the Data Transfer:
Plan Version = 01
Object Type = O
Object ID = 500383 (Contracts)
Objects Status = All Existing
Evaluation Path = BSSORG
Status Vector = 1
PD Selection tab:
Use Current PD Selections - ticked
Transfer PD Infotypes - 0000 to 9402 - ticked
All other PD values unticked
PD Delete and Target options:
Delete target area - ticked
Target plan version = 01, and transfer 1:1 without change chosen
Root options:
Without new root - chosen
PA Selections
Use current PA selections - ticked
Transfer Master Data - Infotypes 0000 to 9402 - ticked
Transfer Cluster Data - ticked
Transfer Central Person - ticked
Partial cut-off date - 01.07.2011
PA Delete and Target Options
Delete Target Area - ticked
Target PERNR Options - Target range - 700000 to 99999999, maximum number range= 99999, transfer 1:1 without change (chosen)
When I choose CONFIRM ONLINE, the only warning messages I receive in the log file are the following (there are no error messages):
No CP Object Type records will be transferred from the sender system
Table PA3xxxx / PERNI is not registered as being released
Table HRPxxxx / PERNI is not registered as being released
Table PA09xx / PERNI is not registered as being released
Hi Toribio,
When I run the test with PD and PA authorisation with granularity 04, I get no messages on the next screen. All users have SAP_ALL, SAP_NEW, SAP_TDMS_MASTER, and SAP_TDMS_HCM_MASTER, plus extra specific HR authorisations for PA and PY.
Thank you both for your assistance; hopefully we can get to the bottom of this problem.
Regards,
Sheryl. -
Master Data from Dev Server to Qty Server
Hi Friends
How do we send master data from the Dev server to the Qty server?
I created a Business Partner in the Dev server and now I want to move it to the Qty server, so how can I send that Business Partner?
Please give me the solution with an example.
Regards
Mahesh Kumar
My mail id is [email protected]
Hello Mahesh,
you CANNOT transfer master data from DEV to QTY. You have to create the Business Partner again.
Regards
Gregor -
Hi Gurus,
Is it possible to transfer master data from SRM to MM? If it is possible, can you please explain how we can do it?
Regards,
Eswar
Hi,
You can transfer the master data.
Vendor data: first transfer the payment terms and quality systems using
Report: UPLOAD_PAYMENT_TERMS
Report: BBP_UPLOAD_QM_SYSTEMS (for quality systems)
and then use transaction BBPGETVD to fetch all the vendor data from the back end.
Material master data is transferred via CRM Middleware. The material master data transfer process is simple and effective, but you have to set the adapter settings on the EBP side and the Plug-In settings on the R/3 side.
If you still have any questions, let me know. -
Transferring Product Master Data Changes From APO to ECC/R/3
Is it possible to transfer product master data changes in APO to ECC via the CIF? If so, does it require custom coding?
Hi James,
In all the different clients where APO is used with ECC, the ECC system is always the main system (system of record): it contains all the master data, and the data in ECC is supposed to be the reference for the APO system for planning purposes.
Generally it is not recommended to transfer master data from APO to ECC but as mentioned earlier by some experts you may have ways to do that.
I would be interested in the business scenario where one needs to transfer master data from APO to ECC.
Thanks,
Anupam
Edited by: Anupam Sengar on Aug 11, 2011 12:31 AM -
Assign master data attribute value to Time Infoobject in Transfer Rule Routine
Hi Experts,
I want to assign a value to the time InfoObject ZTIME from a master data InfoObject's attribute. The value for ZTIME exists as an attribute of InfoObject ZMMM. I want to do this at the transfer structure level so that the assigned value can be used for another calculation in the update rule.
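The lookup being described, sketched in plain Python for clarity (an actual transfer routine would be ABAP reading the ZMMM attribute table; all names and values below are placeholders):

```python
# Placeholder master data: maps a ZMMM value to its ZTIME attribute,
# standing in for the master-data attribute table of InfoObject ZMMM.
ZMMM_ATTR = {
    "MAT001": "200601",
    "MAT002": "200612",
}

def derive_ztime(zmmm_value, initial=""):
    """Transfer-rule logic: read ZTIME from the ZMMM attribute, falling
    back to an initial value when the master record is missing."""
    return ZMMM_ATTR.get(zmmm_value, initial)
```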
Any help on the routine is highly appreciated.
Thanks,
DV
Hi Edwin,
After making the changes as you suggested, I tried to activate the transfer rule, but I am getting the following error message:
Error generating program
Message no. RSAR245
Diagnosis
An error occurred in the program generation:
Program / Template: RSTMPL80
Error code / Action: 6
Row: 6,083
Message: Statement is not accessible.
Procedure
If the problem occurred during the data load or transport, activate the transfer rule.
If the problem persists, search in the SAP note system under 'RSAR 245'.
How can I resolve this? Why is this happening?
DV
Any help?
OK, so I have resolved an older problem, only to have dropped frames all the time during playback. How do I find out the maximum data transfer rate my processor can handle? Any suggestions on how to get around an issue like that? If I want clean, crisp graphics, how can I avoid the DV codec (which I understand is awful for graphics) yet still be able to play my sequence back without dropped frames?
Hi Kristin,
If I read correctly, you moved everything off your internal drive to an external. This external drive - is it a SATA drive, or FireWire? You mentioned SATA before, so I'm inclined to think that's what it is. If it IS an external Serial ATA drive, when you got it, did you zero all data on the drive before using it?
Also, when you started using this new external, did you change your capture/render drive settings in your prefs in FCP? If not, they may still be set to your system drive, the default, instead of your new external hard drive. If this is the case, change it in FCP, save, and quit. TRASH the render files and move all the capture files from the system disk. re-open FCP, reconnect the moved capture files (if any), and of course, you'll have to re-render most of your timeline. when you do this, it'll render everything to the new hard drive (SATA, yes?), and you SHOULD be good to go.
Now, as for interfacing with a monitor or an external device: are you going through FireWire to a camera or deck and then to your video monitor? If so, no matter what, you won't be able to view anything but a single frame at a time with uncompressed. The only other thing I can think of is that your camera/deck/external video device is powered on while you're trying to do this. In order to monitor uncompressed externally, you need a pretty spiffy video capture card like the Kona. But internally, you should have no issues. So, make sure any external device except a hard drive is turned OFF and disconnected.
If none of this works, what you may want to do is change your sequence settings back to NTSC/DV for compression, and when you're totally done editing your project, you change them BACK to uncompressed, render, and export an uncompressed file to an external hard drive, take that drive to a video production company, have them slap it on a DigitalBeta tape as a master, and you're good to go.
Hope this helps...
-Kris -
Periodic Transfer of Master Data
Hello,
I need assistance understanding the underlying config related to master data updates in R/3 and the resulting entries in GTS.
Our setup:
GTS (7.2) is a plug-in to CRM
CRM & R/3 in same logical system group (GTS)
EWM in unique logical system group (FEEDSYS) - based on SAP config guide and direction, we split feeder logical systems from GTS logical systems. all logical systems had resided in a single logical system group (GTS)
EWM > system for warehouse functions
R/3 > feeder system for material master
R/3 > GTS Plug-In activated (per SAP direction)
R/3 > Configure Control Settings for Document Transfer > MMOA mapped to our purchase order doc type at logical system and logical system group level
R/3 > Configure Control Settings for Document Transfer > MMOB mapped to our inbound delivery doc type (EL) at logical system and logical system group level for DTAVI
R/3 > Configure Control Settings for Document Transfer > MMOC mapped to our movement types 101 and 601, in addition to others
I mention the PO and Inbound delivery setup because we do see changes to material qty in R/3 after creating PO's and the subsequent IB deliveries.
We are trying to implement GTS as a bonded warehouse - storing material as duty-unpaid only. We are not concerned with duty-paid status for this implementation.
There has been an initial transfer of material to GTS from R/3 via /SAPSLL/MENU_LEGALR3. We have the GTS master data marked as BondedWH. Although transaction MMBE shows quantities for the material, they do not match the quantities displayed in the GTS stock overview.
The SAP GTS configuration guide states, on page 40, to create a job for RBDMIDOC with a variant and have it scheduled. We still need to do this part, and it is my understanding that it is there to sync data. With this, I am also trying to understand the use of change pointers in R/3, in addition to enhancement project SLLLEG04 for the user exits for material masters. I don't know whether we need to worry about the SLLLEG04 enhancement or not.
I know there is more information I can provide on our current setup, but what major things are there to consider to get the GTS stock overview quantities to actually reflect what's in MMBE, AND to have the stock displayed as duty-unpaid rather than duty-paid via the initial transfer function?
Any advice or experience getting R/3 to communicate correctly with GTS, especially in the context of a bonded warehouse, is greatly appreciated.
Regards,
Brian
Hello Uwe,
Our intent for GTS is the use of bonded warehouse functionality.
As part of preparation for go-live activities, we will stock the warehouse with material; this is a distribution center for service parts (replacement parts). The stocking activity will be through STOs. As a result, we expect all the material showing up in GTS to be duty-unpaid. We will continue to receive material into the warehouse via STOs, as no base receipts are required at this time.
When we process orders through the warehouse system (EWM), we expect the quantities in GTS to reflect what has been shipped (exports). As such, we will be able to reconcile any stock discrepancies with the authorities.
My question about the initial transfer reflects my attempt to get some inventory showing in GTS, so that after I process the inbound deliveries based on POs (STOs) I see the inventory building in GTS.
The use of initial stock transfer may not be correct as I described my activities above, but with very little documentation specifying the actual process of establishing a bonded warehouse, especially in our landscape, I am working through this on a trial-and-error basis.
I hope this provides clarity on our goals. I will continue to research the change pointers as I am still not sure of the relevance.
Regards,
Brian -
Master Data transfer from SAP R/3 to SAP GTS
HI,
Master data like the customer master is transferred from R/3 to GTS. Any customer master in R/3 has many fields, like company code, sales area, shipping conditions, etc.
Shipping conditions and other fields are not defined in GTS. So why does the customer master copied into SAP GTS not produce any errors, given that certain fields like shipping conditions do not exist in GTS? I mean, there are certain validations in R/3 for a customer, so how are these validations handled in SAP GTS?
regards
The customer/vendor fields passed to GTS from ECC can be found in /SAPSLL/MENU_LEGAL - SAP Compliance Management (SPL Screening) - Logistics (Simulation of SPL - Check general address). There you can see all the details captured for a business partner in GTS. You can also see the entire field range in transaction BP (/nbp).
When you transfer a customer master record to GTS, an internal BP number is created in GTS to map the stated fields, based on number range logic etc. as per requirements. The BPs are created in GTS according to the partner role and can also be retrieved using the partner role. At document level, the validations happen with respect to the partner functions mapped and created as per the feeder system.
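The number-range-based mapping described above can be pictured like this (illustrative Python only, not GTS code; the starting number and customer key format are invented):

```python
import itertools

# Sketch of GTS assigning internal BP numbers to transferred customers:
# each new feeder-system customer draws the next number from a range,
# and repeated transfers reuse the already-assigned BP.
class BpMapper:
    def __init__(self, range_start=1000000):
        self._range = itertools.count(range_start)  # stand-in for a number range
        self._assigned = {}

    def bp_for_customer(self, customer_no):
        if customer_no not in self._assigned:
            self._assigned[customer_no] = next(self._range)
        return self._assigned[customer_no]
```

This is also why re-sending the same customer master does not create a duplicate BP: the mapping is keyed on the feeder-system customer number.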