Initial Load of DD* tables
Hi everyone,
Just a hopefully quick and simple question. We set up an SLT Replication Server recently and created our first SLT configuration into HANA. We have no need for real time, so we chose a scheduled time outside office hours. The configuration was successful; however, the DD* tables appeared in a scheduled state. I thought these tables should populate (load) initially regardless of the schedule, but what appeared to happen was that they waited for the first schedule to run. Is this expected? Without these populating initially we could not choose any real ERP tables to replicate.
We also tried with a non-SAP source (Oracle) and the DD* tables for that configuration were populated instantly even though that config was scheduled to run "off-peak" as well.
Thanks,
Marcus.
Hi Marcus,
As far as I understand your question, please find my comments below:
(sap source system scenario)
If the configuration is created with the "schedule by time" option, which I think is the case here, then the replication server replicates database changes to the target system at the time you set. The table metadata is copied from the source system tables [DD02L and DD02T].
Yes, you are correct that ideally we should start ERP table replication only after the DD* tables have replicated successfully. This is generally fast, but depends on the system.
(non sap source system scenario)
Here the DD* tables are only loaded initially and are not replicated automatically.
So you would find a difference in how the replication takes place in both scenarios.
Hope this answers your query to some extent.
Regards,
Saritha K
Similar Messages
-
Initial load of custom table from r3
Hello all,
I am fairly new to CRM; with the documentation I had, I was able to create a site for my R/3 system in CRM.
Initially I created a Z table in my R/3 system and in the CRM system, and loaded some data into the R/3 Z table. I just want to replicate that data into my CRM system.
Could someone tell me whether I need to create a custom function module to create the adapter object in the CRM system, and also the necessary steps from here on to perform the required task?
Any doucment regardig could be sent to: [email protected]
Thanks,
Krishnam Raju.
Hi,
You need to perform a number of steps to achieve the initial load of a Z R/3 table to a Z CRM table. Here are some hints, although I do not have a detailed document for this. Maybe you should try looking at the CR 550 SAP course, 'Enhancing CRM Middleware':
1. Create a BDoc structure and model an mBDoc using the BDoc Modeler in CRM.
2. Create an adapter object in trx. R3AC1, giving all details of the BDoc type and the tables used in the source and destination systems.
3. Create an extract function module for extracting the data from the Z R/3 table. This needs to be part of the CRMSUBTAB MW table.
4. Create a mapping function module in CRM for mapping the data received from the R/3 plug-in to the mBDoc structure. Insert this in the appropriate tab of trx. R3AC1.
5. Create a validation module which validates the data before persisting it into the CRM Z table. This should be part of the MW table SMW3BDOCIF.
6. Once everything is in place, you can start the load using trx. R3AS. Monitor using SMW01.
7. Use trx. SMQR to deregister the inbound queues and debug the CRM inbound. You can debug the R/3 outbound using SMQS.
Hope this helps. Reward if helpful.
Regards,
Sudipta. -
Golden Gate - Initial Load using parallel process group
Dear all,
I am new to GG and I was wondering whether GG can support an initial load with parallel process groups. I have managed to do an initial load using "Direct Bulk Load" and "File to Replicat", but I have several big tables and Replicat is not catching up. I am aware that GG is not ideal for initial loads, but it is complicated to explain why I am using it.
Is it possible to use the @RANGE function while performing an initial load, regardless of which method is used (file to Replicat, direct bulk load, ...)?
Thanks in advance.
You may use Data Pump for the initial load of large tables.
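If you do stay with GoldenGate, @RANGE can be used in a Replicat's MAP clause to split one table's rows across several parallel Replicats by a hash of the key columns. A sketch only; the process, user, schema, and table names are placeholders, and each Replicat gets a different range number:

```
-- Replicat #1 of 3 (the other two use @RANGE (2, 3) and @RANGE (3, 3))
REPLICAT rload1
USERID ogguser, PASSWORD *****
MAP src.big_table, TARGET tgt.big_table, FILTER (@RANGE (1, 3));
```

Since the filter runs on the Replicat side, the same idea should apply to the file-to-Replicat method as well.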
-
Initial load failing between identical tables. DEFGEN skewed and fixable?
Error seen:
2013-01-28 15:23:46 WARNING OGG-00869 [SQL error 0 (0x0)][HP][ODBC/MX Driver] DATETIME FIELD OVERFLOW. Incorrect Format or Data. Row: 1 Column: 11.
Then compared the discard record against a select * on the key column.
Mapping problem with insert record (target format)...
**** Comparing Discard contents to Select * display
ABCHID = 3431100001357760616974974003012 = 3431100001357760616974974003012
*!!! ABCHSTEPCD = 909129785 <> 9 ???*
ABCHCREATEDDATE = 2013-01-09 13:43:36 = 2013-01-09 13:43:36
ABCHMODIFIEDDATE = 2013-01-09 13:43:36 =2013-01-09 13:43:36
ABCHNRTPUSHED = 0 = 0
ABCHPRISMRESULTISEVALUATED = 0 = 0
SABCHPSEUDOTERM = 005340 = 005340
ABCHTERMID = TERM05 = TERM05
ABCHTXNSEQNUM = 300911112224 = 300911112224
ABCHTIMERQSTRECVFROMACQR = 1357799914310 = 1357799914310
*!!! ABCTHDATE = 1357-61-24 00:43:34 <> 2013-01-09 13:43:34*
ABCHABCDATETIME = 2013-01-09 13:43:34.310000 = 2013-01-09 13:43:34.310000
ABCHACCOUNTABCBER =123ABC = 123ABC
ABCHMESSAGETYPECODE = 1210 = 1210
ABCHPROCCDETRANTYPE = 00 = 00
ABCHPROCCDEFROMACCT = 00 = 00
ABCHPROCCDETOACCT = 00 = 00
ABCHRESPONSECODE = 00 = 00
…. <snipped>
DEFGEN output comes out the same when run against either table.
I have also copied over and tried both outputs from DEFGEN.
* Defgen version 2.0, Encoding ISO-8859-1
* Definitions created/modified 2013-01-28 15:00
* Field descriptions for each column entry:
* 1 Name
* 2 Data Type
* 3 External Length
* 4 Fetch Offset
* 5 Scale
* 6 Level
* 7 Null
* 8 Bump if Odd
* 9 Internal Length
* 10 Binary Length
* 11 Table Length
* 12 Most Significant DT
* 13 Least Significant DT
* 14 High Precision
* 15 Low Precision
* 16 Elementary Item
* 17 Occurs
* 18 Key Column
* 19 Sub Data Type
Database type: SQLMX
Character set ID: ISO-8859-1
National character set ID: UTF-16
Locale: en_EN_US
Case sensitivity: 14 14 14 14 14 14 14 14 14 14 14 14 11 14 14 14
Definition for table RT.ABC
Record length: 1311
Syskey: 0
Columns: 106
ABCHID 64 34 0 0 0 0 0 34 34 34 0 0 32 32 1 0 1 3
ABCHSTEPCD 132 4 39 0 0 0 0 4 4 4 0 0 0 0 1 0 0 0
ABCHCREATEDDATE 192 19 46 0 0 0 0 19 19 19 0 5 0 0 1 0 0 0
ABCHMODIFIEDDATE 192 19 68 0 0 0 0 19 19 19 0 5 0 0 1 0 0 0
ABCHNRTPUSHED 130 2 90 0 0 0 0 2 2 2 0 0 0 0 1 0 0 0
ABCHPRISMRESULTISEVALUATED 130 2 95 0 0 0 0 2 2 2 0 0 0 0 1 0 0 0
ABCHPSEUDOTERM 0 8 100 0 0 0 0 8 8 8 0 0 0 0 1 0 0 0
ABCTERMID 0 16 111 0 0 0 0 16 16 16 0 0 0 0 1 0 0 0
ABCHTXNSEQNUM 0 12 130 0 0 0 0 12 12 12 0 0 0 0 1 0 0 0
ABCHTIMERQSTRECVFROMACQR 64 24 145 0 0 0 0 24 24 24 0 0 22 22 1 0 0 3
ABCTHDATE 192 19 174 0 0 0 0 19 19 19 0 5 0 0 1 0 0 0
ABCHABCDATETIME 192 26 196 0 0 1 0 26 26 26 0 6 0 0 1 0 0 0
ABCHACCOUNTABCER 0 19 225 0 0 1 0 19 19 19 0 0 0 0 1 0 0 0
ABCHMESSAGETYPECODE 0 4 247 0 0 1 0 4 4 4 0 0 0 0 1 0 0 0
ABCHPROCCDETRANTYPE 0 2 254 0 0 1 0 2 2 2 0 0 0 0 1 0 0 0
ABCHPROCCDEFROMACCT 0 2 259 0 0 1 0 2 2 2 0 0 0 0 1 0 0 0
ABCHPROCCDETOACCT 0 2 264 0 0 1 0 2 2 2 0 0 0 0 1 0 0 0
ABCHRESPONSECODE 0 5 269 0 0 1 0 5 5 5 0 0 0 0 1 0 0 0
… <snipped>
The physical table shows a PACKED REC 1078
And table invoke is:
-- Definition of table ABC3.RT.ABC
-- Definition current Mon Jan 28 18:20:02 2013
ABCHID NUMERIC(32, 0) NO DEFAULT HEADING '' NOT
NULL NOT DROPPABLE
, ABCHSTEPCD INT NO DEFAULT HEADING '' NOT NULL NOT
DROPPABLE
, ABCHCREATEDDATE TIMESTAMP(0) NO DEFAULT HEADING '' NOT
NULL NOT DROPPABLE
, ABCHMODIFIEDDATE TIMESTAMP(0) NO DEFAULT HEADING '' NOT
NULL NOT DROPPABLE
, ABCHNRTPUSHED SMALLINT DEFAULT 0 HEADING '' NOT NULL NOT
DROPPABLE
, ABCHPRISMRESULTISEVALUATED SMALLINT DEFAULT 0 HEADING '' NOT NULL NOT
DROPPABLE
, ABCHPSEUDOTERM CHAR(8) CHARACTER SET ISO88591 COLLATE
DEFAULT NO DEFAULT HEADING '' NOT NULL NOT DROPPABLE
, ABCHTERMID CHAR(16) CHARACTER SET ISO88591 COLLATE
DEFAULT NO DEFAULT HEADING '' NOT NULL NOT DROPPABLE
, ABCHTXNSEQNUM CHAR(12) CHARACTER SET ISO88591 COLLATE
DEFAULT NO DEFAULT HEADING '' NOT NULL NOT DROPPABLE
, ABCHTIMERQSTRECVFROMACQR NUMERIC(22, 0) NO DEFAULT HEADING '' NOT
NULL NOT DROPPABLE
, ABCTHDATE TIMESTAMP(0) NO DEFAULT HEADING '' NOT
NULL NOT DROPPABLE
, ABCHABCDATETIME TIMESTAMP(6) DEFAULT NULL HEADING ''
, ABCHACCOUNTNABCBER CHAR(19) CHARACTER SET ISO88591 COLLATE
DEFAULT DEFAULT NULL HEADING ''
, ABCHMESSAGETYPECODE CHAR(4) CHARACTER SET ISO88591 COLLATE
DEFAULT DEFAULT NULL HEADING ''
, ABCHPROCCDETRANTYPE CHAR(2) CHARACTER SET ISO88591 COLLATE
DEFAULT DEFAULT NULL HEADING ''
, ABCHPROCCDEFROMACCT CHAR(2) CHARACTER SET ISO88591 COLLATE
DEFAULT DEFAULT NULL HEADING ''
, ABCHPROCCDETOACCT CHAR(2) CHARACTER SET ISO88591 COLLATE
DEFAULT DEFAULT NULL HEADING ''
, ABCHRESPONSECODE CHAR(5) CHARACTER SET ISO88591 COLLATE
DEFAULT DEFAULT NULL HEADING ''
…. Snipped
I suspect that the fields having subtype 3 just before the garbled columns is a clue, but not sure what to replace with or adjust.
Any and all help mightily appreciated.
Worthwhile suggestion, just having difficulty applying it.
I will tinker with it more. But still open to more suggestions.
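The suspicion about the subtype-3 (NUMERIC) columns fits a classic fixed-offset skew: if one column's external length in the definitions file disagrees with the actual record layout, every later field is read from the wrong byte offset, which is exactly the pattern garbled ABCHSTEPCD and ABCTHDATE show. A toy illustration only; the field names, offsets, and lengths here are made up and are not the real DEFGEN semantics:

```python
# Parse a fixed-width record with a field table of (name, offset, length).
def parse(rec, fields):
    return {name: rec[off:off + length] for name, off, length in fields}

record = b"000000009" + b"2013-01-09 13:43:36"  # 9-byte code, then a timestamp

correct = [("STEPCD", 0, 9), ("CREATEDDATE", 9, 19)]
skewed  = [("STEPCD", 0, 4), ("CREATEDDATE", 4, 19)]  # STEPCD length wrong

print(parse(record, correct)["CREATEDDATE"])  # b'2013-01-09 13:43:36'
print(parse(record, skewed)["CREATEDDATE"])   # b'000092013-01-09 13:'
```

So comparing the DEFGEN external lengths and fetch offsets column by column against the table invoke, especially around the NUMERIC columns that precede the first bad field, seems a reasonable next step.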
=-=-=-=-
Oracle GoldenGate Delivery for SQL/MX
Version 11.2.1.0.1 14305084
NonStop H06 on Jul 11 2012 14:11:30
Copyright (C) 1995, 2012, Oracle and/or its affiliates. All rights reserved.
Starting at 2013-01-31 15:19:35
Operating System Version:
NONSTOP_KERNEL
Version 12, Release J06
Node: abc3
Machine: NSE-AB
Process id: 67895711
Description:
** Running with the following parameters **
2013-01-31 15:19:40 INFO OGG-03035 Operating system character set identified as ISO-8859-1. Locale: en_US_POSIX, LC_ALL:.
Comment
Comment
REPLICAT lodrepx
ASSUMETARGETDEFS
Source Context :
SourceModule : [er.init]
SourceID : [home/ecloud/sqlmx_mlr14305084/src/app/er/init.cpp]
SourceFunction : [get_infile_params]
SourceLine : [2418]
2013-01-31 15:19:40 ERROR OGG-00184 ASSUMETARGETDEFS is not supported for SQL/MX ODBC replicat.
2013-01-31 15:19:45 ERROR OGG-01668 PROCESS ABENDING. -
Initial load of mview on a prebuilt table
We are using 9i Advanced Replication, materialized views. The situation is, we have a number of tables utilizing FAST or FORCE on PREBUILT TABLE. The master site database is already loaded and the mview logs have been created.
The initial creation of these materialized views does not result in the population of the prebuilt tables as expected. Only an update of the master site table will trigger a replication.
Is there a way to trigger a replication event that would provide the initial load on a prebuilt table?
You will need to set your mview refresh to COMPLETE, refresh your group, then set them back to FAST.
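In SQL terms, the COMPLETE-then-FAST switch might be sketched like this (the mview name is a placeholder; if the mviews belong to a refresh group, refreshing the group via DBMS_REFRESH is the equivalent):

```sql
ALTER MATERIALIZED VIEW my_mview REFRESH COMPLETE;

BEGIN
  DBMS_MVIEW.REFRESH('MY_MVIEW', method => 'C');  -- 'C' = complete refresh
END;
/

ALTER MATERIALIZED VIEW my_mview REFRESH FAST;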
Depending on your data volumes, this might use a lot of rollback, since the group is refreshed in a single txn. Also make sure the refresh interval isn't so small that it tries another FULL refresh straight away, before you change them back to FAST. -
Initial load of small target tables via flashback query?
A simple question.
Scenario: I’m currently building a near-real-time warehouse, streaming some basic fact and dimension tables between two databases. I’m considering building a simple package to "reset" or reinitialize the dimensions as an all-round fix for a variety of problem scenarios (they are really small, about 15,000 rows each). The first time I loaded the target tables I used Data Pump with good success; however, since Streams transforms the data on the way, a complete reload is somewhat more complex.
Considered solution: I'll just write a nice flashback query via db-link, fetching data as of a specific (recent) SCN, and then reinstantiate the table at that SCN in Streams.
Is this a good idea? Or is there something obvious, like a green and yellow elephant in the gift shop, that I overlooked? The reason I'm worried at all is that the manuals do not mention this solution among the supported ways to do the initial load of a target table, and I'm thinking there may be a reason for that.
I have a series of streams with some transformations feeding rather small dimension tables. I want to make this solution easy to manage even when operations encounter difficult replication issues, so I'm developing a PL/SQL package that will:
1) Stop all streams
2) Clear all errors
3) Truncate target tables
4) Reload them including transformation (using a SELECT AS OF "ANY RECENT SCN" from target to source over dblink)
5) Using this random recent SCN I will re-instantiate the tables
6) Start all streams
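Steps 4 and 5 above might be sketched like this (the table, dblink, and SCN values are placeholders, not from this thread):

```sql
-- Step 4: reload via a flashback query over the dblink, transforming as
-- needed, and note the SCN used.
INSERT INTO dim_customer
  SELECT /* transformations here */ *
  FROM   dim_customer@src_db AS OF SCN 1234567;

-- Step 5: re-instantiate the table at the very same SCN.
BEGIN
  DBMS_APPLY_ADM.SET_TABLE_INSTANTIATION_SCN(
    source_object_name   => 'SRC.DIM_CUSTOMER',
    source_database_name => 'SRCDB.EXAMPLE.COM',
    instantiation_scn    => 1234567);
END;
/
```

Using one SCN for both the copy and the instantiation is what keeps the apply process from re-applying (or skipping) changes around the reload point.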
As you see, Data Pump, even if it works, is rather difficult to utilize when you transform data from A to B. Using AS OF, I not only get a consistent snapshot from the source, I also get the exact SCN for it.
What do you think? Can I safely use SELECT ... AS OF SCN instead of Data Pump with an SCN and still get a consistent solution?
For the bigger fact tables I'm thinking about using the same SELECT ... AS OF SCN, but with only particular recent partitions as targets, and thus not having to reload the whole table.
Anyway, this package would assure operations that they can recover from any kind of disaster or incomplete recovery on both source and target databases, and just reinstantiate the warehouse within minutes. -
No initial load of Customers, Material and delta load of Sales Orders.
Hi Experts,
I am facing a very troublesome issue. I am not able to set up the Middleware portion for initial and delta loads. I read a lot of documents and corrected a lot of things. Finally, the connectivity is done between R/3 and CRM. The initial load of all objects is successful (as per the Best Practices guide). The customizing load is successful.
But after now I have these open issues for which I am unable to find any answers (am really exhausted!!):
- CUSTOMER_MAIN load: it was successful, but no BPs from R/3 are available.
- Material: it failed in SMW01, SMQ2; the errors are:
Mat. for Initial Download: Function table not supported
EAN xxxxxxxxxxxxxxxxxx does not correspond to the GTIN format and cannot be transferred
EAN yyyyyyyyyyyyyyyyyy does not correspond to the GTIN format and cannot be transferred
Plant xx is not assigned to a business partner
- Sales order: it shows a green BDoc, but the error segment says "No upload to R/3" and the order does not flow to R/3.
We had our system setup 3 years back for data transfer and Middleware. But few things changed and connectivity stopped. I did all that again now, but am not yet successful. Any inputs will be greatly appreciated.
Thanks,
-Pat
Hi Ashvin,
The error messages in SMW01 for the Material initial load are:
Mat. for Initial Download: Function table not supported
EAN 123456789000562 does not correspond to the GTIN format and cannot be transferred
EAN 900033056531434 does not correspond to the GTIN format and cannot be transferred
Plant 21 is not assigned to a business partner
I have done the DNL_PLANT load successfully. Why then the plant error?
Some of the messages for BP:
Messages for business partner 1331:
No classification is assigned to business partner 1331
For another,
Partner 00001872(469206A60E5F61C6E10000009F70045E): the following errors occurred
City Atlanta does not exist in country US
Time zone EST_NA does not exist
You are not allowed to enter a tax jurisdiction code for country US
Validation error occurred: Module CRM_BUPA_MAIN_VAL, BDoc type BUPA_MAIN.
Now, the time zone EST is assigned by default in R/3. Where do I change that? I do not want to change time zones as this may have other impacts. Maybe I can change this in CRM, but not for sure in R/3. The city check has been deactivated in R/3 and CRM; still the error.
Until these 2 are solved, I cannot go on to the sales order loads.
Any ideas will be greatly appreciated.
Thanks,
-Pat -
Replicating data once again to CRM after initial load fails for few records
My question (to put it simply):
We performed an initial load for customers and some records error out in CRM due to invalid data in R/3. How do we get the data into CRM after fixing the errors in R/3?
Detailed information:
This is a follow up question to the one posted here.
Can we turn off email validation during BP replication ?
We are doing an initial load of customers from R/3 to CRM, and those customers with invalid email address in R/3 error out and show up in SMW01 as having an invalid email address.
If we decide to fix the email address errors on R/3, these customers should then be replicated to CRM automatically, right? (since the deltas for customers are already active) The delta replication takes place, but, then we get this error message "Business Partner with GUID 'XXXX...' does not exist".
We ran the program ZREPAIR_CRMKUNNR provided by SAP to clear out any inconsistent data in the intermediate tables CRMKUNNR and CRM_BUT_CUSTNO, and then tried the delta load again. It still didn't seem to go through.
Any ideas how to resolve this issue?
Thanks in advance.
Max
Subramaniyan/Frederic,
We already performed an initial load of customers from R/3 to CRM. We had 30,330 records in R/3 and 30,300 of them have come over to CRM in the initial load. The remaining 30 show BDOC errors due to invalid email address.
I checked the delta load (R3AC4) and it is active for customers. Any changes I make for customers already in CRM come through successfully. When I make changes to customers with an invalid email address, the delta gets triggered and data come through to CRM, and I get the BDOC error "BP with GUID XXX... does not exist"
When I do a request load for that specific customer, it stays in "Wait" state forever in "Monitor Requests"
No, the DIMA did not help Frederic. I did follow the same steps you had mentioned in the other thread, but it just doesn't seem to run. I am going to open an OSS message with SAP for it. I'll update the other thread.
Thanks,
Max -
Initial Load of contract from ISU to CRM
Hi All,
We are working on the replication of contract from ISU to CRM.
We have done all the necessary settings, like assigning the default product, running the ECRM_GENERATE_EVERH report, etc.
When we run the initial load on SI_CONTRACT, only a single BDoc is generated, which is error-free but still contains no data.
Since there is no error, we are not able to figure out what the problem is.
Regards
Nikhil
Hello Nikhil,
Could you resolve the problem? I have a similar error: the BDoc is empty. The table EVERH is filled, but the fields CONTRACTPOS and CONTRACTHEAD have the value '0000000000000000'; I think that is the problem. And the report ECRM_CHECK_EVERH says that contracts are missing.
Could you help me, please?
Thanks!
Initial load of data to UCM for Customer Hub PIP
Hi,
What is the recommended approach to have XREF tables populated during an initial load to UCM (via EIM), when the Accounts already exist in Siebel CRM?
Our approach as of now, using EIM, is as follows:
1) Set required customer information in EIM_UCM_ORG
2) Look up the customer's existing Row_Id in Siebel, and populate EIM_UCM_ORG.EXT_SYSTEM_NUM and EIM_UCM_ORG.UCM_EXT_ID accordingly
3) Run the EIM job and UCM Batch Process to import the customer into UCM
The account then appears in UCM with the correct reference to the Siebel Row_Id under the "External Account IDs" tab. HOWEVER, it also contains a reference to a newly created duplicate record for that account in Siebel. Looking at the XREF tables, there is no reference to the existing Siebel Row_Id specified in the EIM batch load, and our hypothesis is that this is the reason the account cannot be found (and a duplicate is created). What we want to achieve here is to tell UCM that the accounts we are inserting do in fact already exist in Siebel CRM and can be identified by the Row_Id that we pass along.
The relevant system versions are Customer Hub PIP 11g with AIA 3. Siebel CRM and Siebel UCM are on patch 8.1.1.7 (and pertinent ACRs have been incorporated in the two Siebel instances).
Any hints or suggestions on how to approach this would be appreciated
-M
Edited by: 968713 on Nov 1, 2012 5:05 AM
Do you really need to populate the XREF table/transaction history table for the initial load?
-
I am running an initial load from LDAP using the template job.
The users have been successfully loaded into the Id store table but the group read pass does not do anything.
What should the source and destination tabs look like for the Read groups pass.
Thanks
S.
Hi
In my case the InitialLoad jobs for ADS/LDAP had some information missing in the "ReadGroupOfUniqueNamesFromLdap" pass.
In the Source tab, the LDAP URL should look like this:
LDAP://%$rep.LDAP_HOST%:%$rep.LDAP_PORT%/%$rep.LDAP_STARTING_POINT_GROUPS%?*?SUB?%$rep.LDAP_FILTER_GROUPS%
For that you should create additional repository-constants "LDAP_FILTER_GROUPS" and "LDAP_STARTING_POINT_GROUPS" which look like this in my case:
LDAP_FILTER_GROUPS=(objectclass=group)
LDAP_STARTING_POINT_GROUPS=ou=groups,ou=idm,dc=example,dc=com
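To make the URL format above concrete, here is an illustrative sketch of how the %$rep.NAME% placeholders expand once the repository constants are defined; the host and port values are examples, not from this thread:

```python
# Expand %$rep.NAME% placeholders in the LDAP URL template, the way the
# repository constants end up composed at runtime (illustration only).
template = ("LDAP://%$rep.LDAP_HOST%:%$rep.LDAP_PORT%/"
            "%$rep.LDAP_STARTING_POINT_GROUPS%?*?SUB?%$rep.LDAP_FILTER_GROUPS%")
constants = {
    "LDAP_HOST": "ldap.example.com",          # example value
    "LDAP_PORT": "389",                       # example value
    "LDAP_STARTING_POINT_GROUPS": "ou=groups,ou=idm,dc=example,dc=com",
    "LDAP_FILTER_GROUPS": "(objectclass=group)",
}
url = template
for name, value in constants.items():
    url = url.replace(f"%$rep.{name}%", value)
print(url)
# LDAP://ldap.example.com:389/ou=groups,ou=idm,dc=example,dc=com?*?SUB?(objectclass=group)
```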
I didn't change anything at the Destination-tab.
Hope this helps... -
Initial load of inventory level from csv - double datarows in query
Hello everybody,
a query result shown in a web browser seems strange to me, and I would be very glad if anyone could give me some advice on how to solve the problem. As I do not think it is related to the query itself, I posted it in this forum.
The query refers to an InfoCube for inventory management with a single non-cumulative key figure and two other cumulative key figures for increase and decrease of inventory. The time reference characteristic is 0CALDAY. The initial load has been processed reading from a flat file (CSV), the structure looks like this:
Product group XXX
Day 20040101
Quantity 1000
Increase 0
Decrease 0
Unit ST
The initial load runs fine; the system writes all the records into the InfoCube. Unfortunately I do not know how to look at the records written into the cube, because only the cumulative key figures are shown in InfoCube -> Manage -> Contents.
Well, when executing the query (a really simple one), the result is just strange: somehow there are now two rows for each product group with different dates, one for the 1st of January 2004 and the other for the 31st of December 2003, each containing 1000 units. The sum is 2000.
It became more confusing when I loaded the data for increase and decrease: now the quantities and sums are correct, but the date of the initial load is a few days later than before; the data table in the query does not contain the 1st of January.
Does anybody know what I did wrong, or where there is information on how to perform an initial load of inventory from CSV in a better way?
Kind regards
Peter
Peter,
Inventory is not that straightforward to evaluate, as it is non-cumulative. Basically it means that one KF is derived from one or two other KFs. You cannot see non-cumulative KFs in InfoCube manage.
Have you uploaded the opening balances separately? If so, your data for the 31st of December is explained.
In non-cumulative cubes, there need not be a posting on a particular day for a record to exist. For example, if you have a stock of 10 units on the 1st, then no postings on the 2nd and 3rd, and then an increase of 10 units on the 4th, the non-cumulative KF will still report 10 units for the 2nd and 3rd (the stock from the 1st rolled forward).
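The roll-forward can be sketched in a few lines; this is an illustration of the arithmetic only, not BW code:

```python
# Derive a non-cumulative stock figure by rolling the last known stock
# forward over days that have no posting.
def roll_forward(opening, movements, days):
    """movements: {day: net change}; returns {day: stock}."""
    stock, result = opening, {}
    for day in days:
        stock += movements.get(day, 0)  # no posting -> stock carried forward
        result[day] = stock
    return result

# Stock +10 on day 1, nothing on days 2-3, +10 on day 4:
print(roll_forward(0, {1: 10, 4: 10}, [1, 2, 3, 4]))
# {1: 10, 2: 10, 3: 10, 4: 20}
```

Days 2 and 3 report 10 units even though nothing was posted, which is why the query can show rows for dates that have no records in the cube.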
There is a "How to... Inventory Management" document on the Service Marketplace that explains this quite nicely.
Cheers
Aneesh -
Initial Load of BUPA_MAIN to GWA_01 does not work
Hi all
I have a question concerning the serverside groupware Integration and the initial Load of BUPA_MAIN.
According to the groupware integration guide, I have to "Transfer Business Partners to the Public Contact Directory".
This is done with transaction R3AS (object BUPA_MAIN, sender = CRM, receiver = GWA_01 (the mBDoc site)).
When I want to choose the above options in R3AS, I cannot select GWA_01 as the receiver site (only CRM or OLTP is offered when I press F4). That means when I choose BUPA_MAIN as the object I cannot choose GWA_01 as the receiver site (but for other objects, e.g. GWA_*, I can choose GWA_01 as the receiver site).
I created all the subscriptions needed (I had to recreate the publication "All Business Partners (MESG)" because it was missing in my system).
Does anyone have an idea what could be wrong?
Thanks in advance and Regards
Marco
Hi Marco,
Thank you for the reply.
Below are the steps I followed to customize the Groupware Adapter. Please correct me if I went wrong:
1) Generated Functions and Services for BDoc Types
2) Generated Replication Services
3) Generated R/3 Adapter Services
4) Created Sites for Groupware Adapter
Name
Groupware adapter 01 (mBDoc)
Groupware Adapter 02 (sBDoc)
5) Created Subscriptions for the Groupware Adapter
6) Enabled the Transfer of Business Partners to the Private Contact Folders.
7) Enabling the Transfer of Attachments from CRM to Groupware
TBD
8) Prepared for Customizing Download
9) Loading the Customizing Objects to Groupware
Note : TBD
10) Loading Business Partners to Groupware
Note : TBD
11) Customizing Activities for Groupware Integration in IMG
Defined RFC Destination for the MapBox
Defined Internal SyncPoint Settings
Delete Timed Out Sessions from Status Table
Maintained Application Scenarios
Maintained vCard Properties
Maintained vCard Sub-Properties
Maintained iCalendar Priorities
Maintained Groupware Adapter Properties
Maintained General Groupware Settings
Maintained Groupware Object Selection Fields
Maintained Groupware Object Selection and Retrieval
Maintained Groupware Object Master Category List
12) Application-Specific IMG Views for Groupware Integration
Maintained Business Partner Properties
Maintained Business Partner Sub-Properties
Mapped CRM Business Partner Properties to vCard Properties
Mapped iCalendar Priorities and SAP Application Priorities
Filter Condition for Business Partners Exchange with Groupware
13) Activity Management -Specific IMG Views for Groupware Integration
Mapped Activity Status to Groupware
Mapped Task Status to Groupware
Mapped Activity Text Types and Business Partners to Groupware
For all the above steps from 11 to 13, the default entries were re-entered starting with Z; is this OK?
Please let me know whether the above steps on customizing the Groupware Adapter which I followed are enough, or whether I need to do any additional customization.
As we are using two sites, let me know how I can define the RFC destination for the MapBox.
How can I check whether the ABAP MapBox has the RFC connection and is activated?
Thanks & Regards
Pranav -
Initial load of articles through ARTMAS05
Hi Retail experts, I need to build IDocs to perform the initial load of articles through ARTMAS05.
Although we tried to use an IDoc from a demo system as a template, we couldn't get a successful IDoc so far. The function module we are using is BAPI_IDOC_INPUT1, with IDoc type ARTMAS05.
Does anybody have a guideline on how to set this up?
Thanks in advance.
I would welcome Bjorn's input on this, but generally I accomplish this using LSMWs. I use SWO1 to explore the business object, but use LSMW (Legacy System Migration Workbench) to mass load. In the case of LSMW, you simply call the transaction LSMW.
- From here, define a project, subproject and object. (Eg: Project = INITLOAD {initial data load}, Subproject = LOMD {logistics master data}, object = ARTMAS {article master}).
- Maintain the object attributes. Here, you can choose from four options: standard/batch input, batch input recording, Business Object Method (BAPI), or IDoc. Choose the Business Object method; use object BUS1001001 and method CLONE.
- Define your source structure. Here, you will lay out what the input file's STRUCTURE is going to look like (not the fields). Since it's ARTMAS, it's not realistic to put all data into a single row in a text file, so you will likely use a structured input file - especially for variants, site-specific and sales-org-specific data.
- Define your source fields. Here you will define the fields that are in your input file and assign them to your input structure. A lot of work goes into this step. Note: I would try to use names very close to the SAP names, since there is an automapping tool. Also, you can copy SAP table structures into your field structures, which is very helpful if you plan to use, say, 75-80 percent of the fields of a particular structure.
- Maintain structure relations. You will assign your input structures to the corresponding ARTMAS structures in this step.
- Map the fields and maintain conversion rules. Here you assign your input fields to the ARTMAS fields. Also, you can code ABAP in this step for conversion/translation purposes. It depends on your chosen ETL or ETCL or ECTL or CETL methodology (E = Extract, C = Cleanse, T = Transform, L = Load) on whether you will write ABAP conversion rules in this step.
- Specify input files. This is where the data resides in its text-file input format. Typically, you will use a small data set that sits on your PC to test it; then, for a mass load, create a server-side directory on the SAP server, place the input file there, and use it. This speeds up processing for large files considerably.
- Assign files. Here you assign the previously specified input file to an input structure
- Read data. This actually reads the data so that you can see how it's going to come in.
- Convert data. This creates a pseudo-IDoc. It is not an IDoc yet, but is in IDoc format.
- Start IDoc generation. This converts the converted file into a true IDoc.
- Start IDoc processing. Here, your IDoc moves from 64 status to (hopefully) 53 status.
Well, I hope this helps, and I would be interested in Bjorn's input. Also, Bjorn, what did you mean by the WRKKEY comment? I've never heard of or seen a reference to this.
Initial load of DNL_CUST_CNDALL in running status
Hi ALL,
I am doing initial load of DNL_CUST_CNDALL, it is in running status in R3AM1.
Prior to this I have downloaded several other customizing objects, and those were downloaded successfully. My system has the highest support package implemented, according to the SAP Note I found.
Also there is no inbound queue in CRM and outbound queue in R/3 for this object.
All the queues are registered also in both systems(smqr and smqs).
After aborting the object in R3AM1, I have tried downloading the object again several times, still no entries are coming in any queues.
PLEASE help me on this.
Hi Mahaadhevan,
please check the table CRMRFCPAR as well, where you store the entry for the logical system. Check also notes 588701 and 76501, and the SAP Best Practice for Connectivity and Replication.
Please reward with point if it helps.
Regards,
AndreA