Initial load of PO - Migrate PO from R3 to SRM
Hi all,
We are implementing SRM 5.0 (Extended Classic Scenario) and have to perform an initial migration of purchase orders from R/3 to SRM.
Is there standard functionality (IDoc, BAPI, or transaction) we can use for this purpose?
Regards
enzo
Hi,
BBP_PD_PO_CREATE is called the very first time, when PO creation starts.
BBP_PD_PO_SAVE saves the PO to the database.
BBP_PD_PO_UPDATE updates the PO after BBP_PD_PO_CREATE. It is not necessarily called after SAVE; it can be called before.
You can check function group BBP_PD_PO for more function modules.
Regards
Saravanan.
Similar Messages
-
Initial load of inventory level from csv - double datarows in query
Hello everybody,
a query result shown in a web browser seems strange to me, and I would be very glad if anyone could give me some advice on how to solve the problem. As I do not think it is related to the query itself, I posted it in this forum.
The query refers to an InfoCube for inventory management with a single non-cumulative key figure and two other, cumulative key figures for increases and decreases of inventory. The time reference characteristic is 0CALDAY. The initial load was processed from a flat file (CSV); the structure looks like this:
Product group XXX
Day 20040101
Quantity 1000
Increase 0
Decrease 0
Unit ST
The initial load runs fine, and the system fills all the records into the InfoCube. Unfortunately, I do not know how to look at the records written into the cube, because only the cumulative key figures are shown in InfoCube -> Manage -> Contents.
When executing the query, a really simple one, the result is just strange: there are now two rows for each product group with different dates, one for 1 January 2004 and the other for 31 December 2003, each containing 1000 units. The sum is 2000.
It became more confusing when I loaded the data for increase and decrease: now the quantities and sums are correct, but the date of the initial load is a few days later than before, and the data table in the query does not contain 1 January.
Does anybody know, what I did wrong or where there is information about how to perform an initial load of inventory from csv in a better way?
Kind regards
Peter

Peter,
Inventory is not that straightforward to evaluate, as it is non-cumulative. Basically this means that one key figure is derived from one or two other key figures. You cannot see non-cumulative key figures in InfoCube Manage.
Have you uploaded the opening balances separately? If so, that explains your data for 31 December.
In non-cumulative cubes, there need not be a posting on a particular day for a record to exist. For example, if you have a stock of 10 units on the 1st, no postings on the 2nd and 3rd, and then an increase of 10 units on the 4th, the non-cumulative key figure will still report 10 units for the 2nd and 3rd (the stock from the 1st rolled forward).
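The roll-forward behaviour in the example above can be sketched numerically. This is only an illustration of how a non-cumulative key figure is derived from movements (the opening balance and the day/movement values are the hypothetical numbers from the paragraph, not data from the cube in question):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class StockRollForward {
    // Derive the daily stock (the non-cumulative key figure) from an opening
    // balance and a sparse map of day -> net movement. Days without postings
    // simply carry the previous day's stock forward.
    static int[] rollForward(int opening, Map<Integer, Integer> movements, int days) {
        int[] stock = new int[days];
        int current = opening;
        for (int day = 1; day <= days; day++) {
            current += movements.getOrDefault(day, 0);
            stock[day - 1] = current;
        }
        return stock;
    }

    public static void main(String[] args) {
        Map<Integer, Integer> movements = new LinkedHashMap<>();
        movements.put(1, 10);  // stock becomes 10 on the 1st
        movements.put(4, 10);  // increase of 10 units on the 4th
        // The 2nd and 3rd have no postings but still report 10 units.
        int[] stock = rollForward(0, movements, 4);
        System.out.println(java.util.Arrays.toString(stock)); // [10, 10, 10, 20]
    }
}
```

This is also why the query can show a value for a day on which Manage -> Contents shows no record: the reported figure is computed, not stored per day.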
There is a "How to... Inventory Management" document on the Service Marketplace that explains this quite nicely.
Cheers
Aneesh -
Golden Gate Initial load from 3 tb schema
Hi
My source database is a 9i RDBMS on Solaris 5.10. I would like to build an 11gR2 database on Oracle Enterprise Linux.
How can I do the initial load of a 3 TB schema from my source to the target (which is a cross-platform, different-version RDBMS)?
Thanks

Couple of options.
1) Use the old export/import to do the initial load. While that is taking place, turn on change capture on the source so any transactions that take place during the exp/imp timeframe are captured in the trails. Once the initial load is done, start the Replicat with the trails that have accumulated since the export started. Once source and target are fully synchronized, do your cutover to the target system.
2) Do an in-place upgrade of your 9i source to at least 10g. Reason: this lets you use transportable tablespaces (or you can go with expdp/impdp). If you go the TTS route, you will also have to take into account the endian/byte ordering of the datafiles (Solaris = big-endian, Linux = little-endian), which involves time to run RMAN CONVERT. You can test this out ahead of time both ways. Plus, you can get to 10g on your source via TTS, since you are on the same platform. When you do all of this for real, you will also be starting change capture so trails can be applied to the target (not so much the case with TTS, but certainly with Data Pump).
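The capture-while-loading approach in the first option can be sketched with minimal GoldenGate parameter files. This is only an illustrative fragment: the process names, trail path, credentials, and schema name (scott) are assumptions, not details from this thread.

```
-- Extract on the 9i source: capture ongoing changes to a trail
-- while the export/import of the 3 TB schema runs.
EXTRACT ext1
USERID ggadmin, PASSWORD ggadmin        -- hypothetical credentials
EXTTRAIL ./dirdat/aa
TABLE scott.*;

-- Replicat on the 11gR2 target: started only after the initial load completes.
REPLICAT rep1
USERID ggadmin, PASSWORD ggadmin
ASSUMETARGETDEFS
HANDLECOLLISIONS                        -- tolerate rows already present from the import
MAP scott.*, TARGET scott.*;
```

HANDLECOLLISIONS lets the Replicat skip over changes that the export/import already applied; it is typically switched off once source and target are confirmed in sync, just before cutover.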
Initial Load for Master Data from R/3 to CRM
Hi Middleware Experts,
I have an ambiguity regarding the initial load of enhanced master data from R/3 to CRM. The current situation is:
1) I have already enhanced the master data on the R/3 side with some additional fields. Shall I now go ahead with the standard initial load of this master data to CRM, without creating middleware interfaces to map those enhanced fields on the CRM side?
2) Also, after doing the initial load from R/3 to CRM: if I develop middleware interfaces for mapping these new fields and then do the initial load again so that the new fields also get mapped, will the initial load work properly even though I have already done it once? Will it properly overwrite the previous initial load data on the CRM side?
Please suggest whether this can be done and which option is better.
Regards,
Rahul

Hi Rahul,
Since you have not done the mapping yet, the middleware will not bring those fields to CRM; it will work just like a standard initial load, without considering the enhanced master data fields.
Once you develop the middleware interface and also enhance the CRM data structure, you will see the exchange of those fields through the middleware. You can start the initial load again at that point; it is not the case that you can do the initial load only once. However, the better option is to avoid running the initial load several times, because it is a time-consuming activity. Do all the enhancements first and then perform the initial load.
Best regards,
Vikash. -
Initial Load ISU / CRM Issues
Hi all
I have a few questions.
1. When you start an initial load for BUPA_MAIN in R3AS (from IS-U to CRM), it seems only one dialog work process is used (SM50). Is this normal?
2. In R3AM1 you can monitor the objects. Everything seems to be loaded, yet the status is still 'Running'. Is this because of BDoc validation errors?
Kind regards
Lucas

Hello Lucas,
There is a parameter (CRM_MAX_QUEUE_NUMBER_INITIAL) you can maintain in table CRMPAROLTP of the source system (maintain it in the ERP system if you are doing an initial load from ERP to CRM).
Reference Notes:
350176
765236
The PARVAL1 for parameter CRM_MAX_QUEUE_NUMBER_INITIAL should be an integer (say, 5) instead of the character 'X'. In that case, five queues are formed for the initial load.
Here's an example :
Parameter name CRM_MAX_QUEUE_NUMBER_INITIAL
Param. Name 2 <Object name> " for example CUSTOMER_MAIN
Param. Name 3
User <User> " for example CRM, if CRM is connected
Param. Value <no. of queues> " for example 5
Param. Value 2
So using the example above 5 R3AI_CUSTOMER_MAIN queues would be formed on the ERP and sent to CRM for processing.
The number of dialog work processes available to the qRFC scheduler in the CRM system (see SM51) should be able to handle the parallel processing of the queues, so consult your Basis administrators on this front.
Regards,
Rohit -
Best Practice for Initial Load Data
Dear Experts,
I would like to know the best practices or factors to be concerned when performing initial load
For example,
1) requirement from business stakeholders for data analysis
2) age of data needed to meet tactical reporting
3) data dependency crossing sap modules
4) Is there any best practice for loading master data?

Hi,
check these links:
Master Data loading
http://searchsap.techtarget.com/guide/allInOne/category/0,296296,sid21_tax305408,00.html
http://datasolutions.searchdatamanagement.com/document;102048/datamgmt-abstract.htm
Regards,
Shikha -
Privileges getting removed during initial load
Hi All,
I was performing an initial load. There is no error as such in the initial load; however, privileges are removed from the backend but are still available in IDM. When any change is performed on an identity, all roles are provisioned to the backend system.
When I googled it, I found a few good links (http://scn.sap.com/thread/3400455) which say the issue is fixed in IDM 7.2 SP8; however, I am already on SP8 and still facing the same issue.
I have attached the current version of the provisioning framework that is being used. Please let me know whether it is the right version.
Secondly, I have also attached the script and the IDM version. Please advise.
Regards,
Ali

Hi Ali,
I have seen and experienced this before. If you are doing an initial load when the users are already in the system, you may well experience this, since the load job overwrites what your users already have.
Take a look at this blog post I wrote: Setting Write Permissions on ABAP Initial Loads.
I think it might help point you in the right direction.
Thanks,
Matt -
Hi
We want to use the BOMBOS functionality. The standard BOMBOS interface transfer for new and changed BOMs works perfectly. However, we need to carry out an initial load of BOMs.
We used BOMBOS several years ago and carried out the initial load using a bespoke program from TechniData (ISI-STEP package); however, it is no longer working.
The selection screen was as follows:
The BOMs would then be available for adding to the worklist (CG37).
We have since upgraded our SAP version to ECC 6.0, so this may be the issue. But before we look into it any further, I wondered how other companies carry out this initial load. Is there now a standard report I can use?
Thanks and regards
Karen

Dear Karen,
I found, by accident only: consolut - R_EHPRC_BOMBOS_START_PV - Bill of Material Transfer
consolut - /TDAG/CPR_BOMBOS_START - Bill of Material Transfer
The last one is related, I believe, to the "SAP Product and REACH Compliance" suite and/or the former TechniData solution "Compliance for Products". I am not sure about the first one.
The potential solution may depend on your system architecture: do you want to generate REAL_SUB from the BOM, or do you have existing REAL_SUB with linked materials in the system?
Check maybe:
BOM BOS Interface
Bill of Substance - Bill of Material (BOSBOM)
If you have somebody in-house who can analyse the bespoke report, he or she might be able to detect why it no longer works.
Sorry: as we do not use BOM BOS, I cannot provide additional input.
C.B.
PS: maybe check this thread on top: run BOMBOS en masse
Java SP's - Initial Load Time!
FYI, we are running Oracle 8.1.7 on HP-UX 11.0 using Java stored procedures.
The Java stored procedures work fine (sub-second performance) after being loaded into memory, but the initial load time can be anywhere from 2 to 3 seconds. With 30+ Java stored procedures, the initial startup causes table locking and contention that will cause our app to crash.
Rewriting these Java SPs in PL/SQL shows that PL/SQL suffers this initial load cost to a much lesser degree. Besides rewriting my application to call all of the Java SPs once before app startup, is there anything I can do on the Oracle server side to improve (or eliminate) load times for Java stored procedures?
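One common mitigation, sketched here under assumptions (the JDBC URL is supplied by the caller and the procedure names are placeholders, not from this thread), is exactly the warm-up the poster mentions: invoke each Java SP once from a startup task so end users never pay the first-call class-loading cost.

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class SpWarmup {
    // Build the standard JDBC escape syntax for calling a no-argument procedure.
    static String buildCall(String procName) {
        return "{ call " + procName + "() }";
    }

    // Call each procedure once so its Java classes are loaded into the
    // session before real traffic arrives. Failures are logged, not fatal.
    static void warmUp(Connection conn, String[] procNames) {
        for (String proc : procNames) {
            try (CallableStatement cs = conn.prepareCall(buildCall(proc))) {
                cs.execute();
            } catch (SQLException e) {
                System.err.println("Warm-up of " + proc + " failed: " + e.getMessage());
            }
        }
    }

    public static void main(String[] args) throws SQLException {
        if (args.length < 1) {
            System.out.println("usage: SpWarmup <jdbc-url> [proc ...]");
            return;
        }
        try (Connection conn = DriverManager.getConnection(args[0])) {
            warmUp(conn, java.util.Arrays.copyOfRange(args, 1, args.length));
        }
    }
}
```

Note that the server-side Java state is per database session, so this only helps if the application reuses the warmed sessions, e.g. by running the warm-up on each connection as a pool initializes it.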
Thanks!

Hi Sonja,
maybe you can help me:
I want to map a portal role (pcd:portal_content/every_user/general/eu_role) to a user, so I use the SAP Provisioning Framework with the job "SetJavaRole&GroupForUser". But I get the following exception:
putNextEntry failed storing SPML.SAPUSER.3010
Exception from Add operation: com.sap.idm.ic.ToPassException: No such objectclass defined
Exception from Modify operation: com.sap.idm.ic.ToPassException: SPML exception: Could not update user. The given id "$FUNCTION.sap_getSPMLRoles(2371!!NEW_UME!!pcd:portal_content/every_user/general/eu_role)" is not a unique id of a role!
In the "destination" tab of the job I entered the following values:
SPMLID SPML.SAPUSER.%MSKEYVALUE%
assignedroles $FUNCTION.sap_getSPMLRoles(%MSKEY%!!%$rep.$NAME%!!pcd:portal_content/every_user/general/eu_role)
I have no idea what's wrong here...
Maybe you or someone else can help me?
Initial load of sales orders from R3 to CRM without statuses
1) Some sales orders were uploaded into CRM without statuses in the headers or line items.
2) Some sales orders were uploaded without status, ship-to, sold-to, payer, etc. If I delete them and use R3AR2/R3AR4 to upload each one individually, there is no problem.
Any ideas or suggestions?
Thanks.

Hi,
A request load of adapter objects uses different extractor modules for extracting data from the external system into CRM, while your initial load of sales documents uses extraction logic based on the filter conditions specified in transaction R3AC1.
There may be a problem in the extraction of data from the source system (I don't know if you are using R/3). Can you please de-register the R/3 outbound queue using transaction SMQS, and then debug the extraction (R/3 outbound) before the data is sent to CRM, using FM CRS_SEND_TO_SERVER?
If this goes well, you may try debugging the mapper in CRM inbound and the validation module in CRM as a last resort. Also, please check transaction SMW01 to see whether the BDocs are fully processed.
Hope this helps.
Regards,
Sudipta. -
Initial Load of Cubes from BW 3.1 to BW 3.5 system
Hello,
I am looking for some help with the following challenge. We have an existing BW 3.1 system (about 3 TB in size) with a cube that holds about half of the total amount of data (approx. 1.5 TB). We need to do an initial load of the cube data into a new BW 3.5 system (which is in a different data center) and then establish regular deltas between the two systems. My question: what is the best way to move that initial amount of data from BW 3.1 to the new BW 3.5 system? Do we have to send every record over the wire, or can we export the cube to tape (which will take a long time and is not easily repeatable), import it into BW 3.5, and re-establish the connection between the two systems to allow deltas to flow? Any advice or explanation is highly appreciated.
Thanks
Hans deVries

Hi Hans,
isn't it possible for you to do full loads for some time slices from your source cube? That way you could load the history up to a specific date via full loads and, once you are up to date, initialize your delta.
Siggi -
Initial Load of contract from ISU to CRM
Hi All,
We are working on the replication of contracts from IS-U to CRM.
We have done all the necessary settings, like assigning a default product, running the ECRM_GENERATE_EVERH report, etc.
When we run the initial load on SI_CONTRACT, only a single BDoc is generated, which is error-free but still contains no data.
Since it is without error, we are not able to figure out what the problem is.
Regards
Nikhil

Hello Nikhil,
Could you resolve the problem? I have a similar error: the BDoc is empty. The table EVERH is filled, but the fields CONTRACTPOS and CONTRACTHEAD have the value '0000000000000000'; I think that is the problem. And the report ECRM_CHECK_EVERH says that contracts are missing.
Could you help me, please?
Thanks!
Hi.
The initial load from ABAP described in the following PDF document failed.
Identity Management for SAP System Landscapes: Configuration Guide
http://service.sap.com/~sapidb/011000358700001449662008E
System log Error
java.lang.ExceptionInInitializerError: JCO.classInitialize(): Could not load middleware layer 'com.sap.mw.jco.rfc.MiddlewareRFC'
JCO.nativeInit(): Could not initialize dynamic link library sapjcorfc [C:\WINDOWS\system32\sapjcorfc.dll: Can't find dependent libraries]. java.library.path [C:\Program Files (x86)\Java\j2re1.4.2_16\bin;.;C:\WINDOWS\system32;C:\WINDOWS;......
Environment is
IdM 7.1
NW AS JAVA 7.0 SP 17
Windows 2003 Server 64bit
SQL Server
Any help will be most appreciated!

Hi You Nakamura,
We are facing the same problem during the AS ABAP initial load, with the same error message.
I downloaded the msvcp71.dll file and placed it in the System32 folder, but the problem remains the same (even after a server restart).
The system log shows the same error message, and there are no entries in the job log.
Please let me know if you followed a different way to install the DLL file.
Regards,
Vijay.K -
Does re-running a MDM Initial Load restart from the first record?
Is there a way to restart an initial load? I was running a new initial materials load in my ECC system for MDM (MDM_CLNT_EXTR), which had been running a very long time (5 days). The background job hit a bad piece of data and crashed.
My question is: when I re-run my initial load job, will it start over from the first record, or will it pick up from the last material/IDoc and resume processing? I would prefer not to re-process all 1.7M materials, since the job was very near the end when it crashed.
Thanks,
Denis

Hello,
There are two modes available in the client extractor: Initial and Delta.
If you select Initial, it will start from the first record that you specify in the selection criteria.
In Delta mode, any newly created records are sent to MDM.
For your issue, in Initial mode select a material number range (from/to) covering the materials that have not yet been updated in MDM.
This will ensure that only the records that have not been sent are processed.
Regards,
Abhishek -
Error in Initial Load of Products from ECC to CRM
Hi ,
I'm getting the errors below in the initial load of MATERIAL from ECC to CRM.
Data cannot be maintained for set type COMM_PR_MAT
Data cannot be maintained for set type COMM_PR_UNIT
Data cannot be maintained for set type COMM_PR_SHTEXT
My analysis:
I found that these set types are already assigned to another base category, and around 100+ materials were created using this base category.
After the initial load of DNL_CUST_PROD1, I found that these set types are not assigned to any of the categories of hierarchy R3PRODSTYP. Since R3PRODSTYP does not have the above set types, my initial load is failing.
Required solution:
Is there any other way I can download materials from ECC to CRM without deleting the existing materials and the Z hierarchy created earlier in CRM?
Regards,
Prasad
Message was edited by:
SAP CRM

Hi there,
Try to attach the set types COMM_PR_MAT, COMM_PR_UNIT, COMM_PR_SHTEXT, COMM_PR_BATCH, COMM_PR_LGTEXT, COMM_PR_LGTEXT1, and COMM_PR_LGTEXT2 to the category MAT_ and then run the initial load again.
It should work now unless you have some other issues.
Best regards,
Pankaj S.