Initial load failed to transfer the CLOB data
Hi Experts
I am trying to move my huge database from Oracle 10g on Windows to 11g on Linux through a GoldenGate initial load. It has CLOB, LONG, and BLOB datatypes. When I try to move it with the parameters below, it gives an error.
Error:
The trail file cannot be used with the SPECIALRUN parameter, and when I create a normal Replicat process to replicate the data, it displays an error for the LOG_CSN, LOG_XID and LOG_CMPLT_CSN columns of the GGS checkpoint table (unable to populate these columns).
--Loading data from file to Replicat (Transfer Method)
Source Database Server:
1. EDIT PARAMS load1
2. Add the parameters below to the parameter file named load1:
SOURCEISTABLE
USERID gguser@orcl, PASSWORD test
RMTHOST 10.8.18.189, MGRPORT 7810
RMTFILE /ora01/initialload/pt, MAXFILES 10000, MEGABYTES 10
TABLE test.*;
3. EDIT PARAMS load2
4. Add the parameters below to the parameter file named load2:
SPECIALRUN
USERID gguser@orcl1, PASSWORD test
EXTTRAIL /ora01/initialload/pt
ASSUMETARGETDEFS
MAP test.*, TARGET test.*;
END RUNTIME
5. Start the extract process on the source database server:
cmd> ogg_directory > extract paramfile dirprm\load1.prm reportfile c:\load1.rpt
6. Start the replicat process on the target Database Server:
$ ogg_directory> replicat paramfile dirprm/load2.prm reportfile /ora01/load2.rpt
A checkpoint table is not needed for an initial-load Replicat. You could do the following:
load2.prm
REPLICAT LOAD2
USERID gguser@orcl1, PASSWORD test
-- EXTTRAIL /ora01/initialload/pt
ASSUMETARGETDEFS
MAP test.*, TARGET test.*;
-- END RUNTIME
ggsci> add rep load2, exttrail /ora01/initialload/pt, nodbcheckpoint
ggsci> start rep load2
Thanks,
Rajesh
Similar Messages
-
Initial load failing between identical tables. DEFGEN skewed and fixable?
Error seen:
2013-01-28 15:23:46 WARNING OGG-00869 [SQL error 0 (0x0)][HP][ODBC/MX Driver] DATETIME FIELD OVERFLOW. Incorrect Format or Data. Row: 1 Column: 11.
Then compared the discard record against a select * on the key column.
Mapping problem with insert record (target format)...
**** Comparing Discard contents to Select * display
ABCHID = 3431100001357760616974974003012 = 3431100001357760616974974003012
*!!! ABCHSTEPCD = 909129785 <> 9 ???*
ABCHCREATEDDATE = 2013-01-09 13:43:36 = 2013-01-09 13:43:36
ABCHMODIFIEDDATE = 2013-01-09 13:43:36 =2013-01-09 13:43:36
ABCHNRTPUSHED = 0 = 0
ABCHPRISMRESULTISEVALUATED = 0 = 0
SABCHPSEUDOTERM = 005340 = 005340
ABCHTERMID = TERM05 = TERM05
ABCHTXNSEQNUM = 300911112224 = 300911112224
ABCHTIMERQSTRECVFROMACQR = 1357799914310 = 1357799914310
*!!! ABCTHDATE = 1357-61-24 00:43:34 <> 2013-01-09 13:43:34*
ABCHABCDATETIME = 2013-01-09 13:43:34.310000 = 2013-01-09 13:43:34.310000
ABCHACCOUNTABCBER =123ABC = 123ABC
ABCHMESSAGETYPECODE = 1210 = 1210
ABCHPROCCDETRANTYPE = 00 = 00
ABCHPROCCDEFROMACCT = 00 = 00
ABCHPROCCDETOACCT = 00 = 00
ABCHRESPONSECODE = 00 = 00
…. <snipped>
Defgen comes out same when run against either table.
Also have copied over and tried both outputs from DEFGEN.
* Defgen version 2.0, Encoding ISO-8859-1
* Definitions created/modified 2013-01-28 15:00
* Field descriptions for each column entry:
* 1 Name
* 2 Data Type
* 3 External Length
* 4 Fetch Offset
* 5 Scale
* 6 Level
* 7 Null
* 8 Bump if Odd
* 9 Internal Length
* 10 Binary Length
* 11 Table Length
* 12 Most Significant DT
* 13 Least Significant DT
* 14 High Precision
* 15 Low Precision
* 16 Elementary Item
* 17 Occurs
* 18 Key Column
* 19 Sub Data Type
Database type: SQLMX
Character set ID: ISO-8859-1
National character set ID: UTF-16
Locale: en_EN_US
Case sensitivity: 14 14 14 14 14 14 14 14 14 14 14 14 11 14 14 14
Definition for table RT.ABC
Record length: 1311
Syskey: 0
Columns: 106
ABCHID 64 34 0 0 0 0 0 34 34 34 0 0 32 32 1 0 1 3
ABCHSTEPCD 132 4 39 0 0 0 0 4 4 4 0 0 0 0 1 0 0 0
ABCHCREATEDDATE 192 19 46 0 0 0 0 19 19 19 0 5 0 0 1 0 0 0
ABCHMODIFIEDDATE 192 19 68 0 0 0 0 19 19 19 0 5 0 0 1 0 0 0
ABCHNRTPUSHED 130 2 90 0 0 0 0 2 2 2 0 0 0 0 1 0 0 0
ABCHPRISMRESULTISEVALUATED 130 2 95 0 0 0 0 2 2 2 0 0 0 0 1 0 0 0
ABCHPSEUDOTERM 0 8 100 0 0 0 0 8 8 8 0 0 0 0 1 0 0 0
ABCTERMID 0 16 111 0 0 0 0 16 16 16 0 0 0 0 1 0 0 0
ABCHTXNSEQNUM 0 12 130 0 0 0 0 12 12 12 0 0 0 0 1 0 0 0
ABCHTIMERQSTRECVFROMACQR 64 24 145 0 0 0 0 24 24 24 0 0 22 22 1 0 0 3
ABCTHDATE 192 19 174 0 0 0 0 19 19 19 0 5 0 0 1 0 0 0
ABCHABCDATETIME 192 26 196 0 0 1 0 26 26 26 0 6 0 0 1 0 0 0
ABCHACCOUNTABCER 0 19 225 0 0 1 0 19 19 19 0 0 0 0 1 0 0 0
ABCHMESSAGETYPECODE 0 4 247 0 0 1 0 4 4 4 0 0 0 0 1 0 0 0
ABCHPROCCDETRANTYPE 0 2 254 0 0 1 0 2 2 2 0 0 0 0 1 0 0 0
ABCHPROCCDEFROMACCT 0 2 259 0 0 1 0 2 2 2 0 0 0 0 1 0 0 0
ABCHPROCCDETOACCT 0 2 264 0 0 1 0 2 2 2 0 0 0 0 1 0 0 0
ABCHRESPONSECODE 0 5 269 0 0 1 0 5 5 5 0 0 0 0 1 0 0 0
… <snipped>
The physical table shows a PACKED REC 1078
And table invoke is:
-- Definition of table ABC3.RT.ABC
-- Definition current Mon Jan 28 18:20:02 2013
ABCHID NUMERIC(32, 0) NO DEFAULT HEADING '' NOT
NULL NOT DROPPABLE
, ABCHSTEPCD INT NO DEFAULT HEADING '' NOT NULL NOT
DROPPABLE
, ABCHCREATEDDATE TIMESTAMP(0) NO DEFAULT HEADING '' NOT
NULL NOT DROPPABLE
, ABCHMODIFIEDDATE TIMESTAMP(0) NO DEFAULT HEADING '' NOT
NULL NOT DROPPABLE
, ABCHNRTPUSHED SMALLINT DEFAULT 0 HEADING '' NOT NULL NOT
DROPPABLE
, ABCHPRISMRESULTISEVALUATED SMALLINT DEFAULT 0 HEADING '' NOT NULL NOT
DROPPABLE
, ABCHPSEUDOTERM CHAR(8) CHARACTER SET ISO88591 COLLATE
DEFAULT NO DEFAULT HEADING '' NOT NULL NOT DROPPABLE
, ABCHTERMID CHAR(16) CHARACTER SET ISO88591 COLLATE
DEFAULT NO DEFAULT HEADING '' NOT NULL NOT DROPPABLE
, ABCHTXNSEQNUM CHAR(12) CHARACTER SET ISO88591 COLLATE
DEFAULT NO DEFAULT HEADING '' NOT NULL NOT DROPPABLE
, ABCHTIMERQSTRECVFROMACQR NUMERIC(22, 0) NO DEFAULT HEADING '' NOT
NULL NOT DROPPABLE
, ABCTHDATE TIMESTAMP(0) NO DEFAULT HEADING '' NOT
NULL NOT DROPPABLE
, ABCHABCDATETIME TIMESTAMP(6) DEFAULT NULL HEADING ''
, ABCHACCOUNTNABCBER CHAR(19) CHARACTER SET ISO88591 COLLATE
DEFAULT DEFAULT NULL HEADING ''
, ABCHMESSAGETYPECODE CHAR(4) CHARACTER SET ISO88591 COLLATE
DEFAULT DEFAULT NULL HEADING ''
, ABCHPROCCDETRANTYPE CHAR(2) CHARACTER SET ISO88591 COLLATE
DEFAULT DEFAULT NULL HEADING ''
, ABCHPROCCDEFROMACCT CHAR(2) CHARACTER SET ISO88591 COLLATE
DEFAULT DEFAULT NULL HEADING ''
, ABCHPROCCDETOACCT CHAR(2) CHARACTER SET ISO88591 COLLATE
DEFAULT DEFAULT NULL HEADING ''
, ABCHRESPONSECODE CHAR(5) CHARACTER SET ISO88591 COLLATE
DEFAULT DEFAULT NULL HEADING ''
…. Snipped
I suspect that the fields having subtype 3 just before the garbled columns is a clue, but not sure what to replace with or adjust.
Any and all help mightily appreciated.
Worthwhile suggestion, just having difficulty applying.
I will tinker with it more. But still open to more suggestions.
=-=-=-=-
Oracle GoldenGate Delivery for SQL/MX
Version 11.2.1.0.1 14305084
NonStop H06 on Jul 11 2012 14:11:30
Copyright (C) 1995, 2012, Oracle and/or its affiliates. All rights reserved.
Starting at 2013-01-31 15:19:35
Operating System Version:
NONSTOP_KERNEL
Version 12, Release J06
Node: abc3
Machine: NSE-AB
Process id: 67895711
Description:
** Running with the following parameters **
2013-01-31 15:19:40 INFO OGG-03035 Operating system character set identified as ISO-8859-1. Locale: en_US_POSIX, LC_ALL:.
Comment
Comment
REPLICAT lodrepx
ASSUMETARGETDEFS
Source Context :
SourceModule : [er.init]
SourceID : [home/ecloud/sqlmx_mlr14305084/src/app/er/init.cpp]
SourceFunction : [get_infile_params]
SourceLine : [2418]
2013-01-31 15:19:40 ERROR OGG-00184 ASSUMETARGETDEFS is not supported for SQL/MX ODBC replicat.
2013-01-31 15:19:45 ERROR OGG-01668 PROCESS ABENDING. -
I'm moving an internal hard drive from an old computer to a new computer. (My new computer runs Windows 8.1.) How do I transfer the catalog data to my new computer so the Organizer of Photoshop Elements 6 will find the data on the hard drive that has been moved to the new computer? Is this possible, or do I have to copy the catalog along with the photos to a backup hard drive and then restore them on the new computer?
The 'normal' procedure is to do a PSE organizer backup (not a copy or a backup from external tools) to an external drive, then a restore to the new location.
The advantage is that you have a backup of both catalog and media files and you don't have all disconnected files, which would happen with external backup tools. Also, if you restore with a more recent elements version, the catalog will be automatically updated to the new format.
The drawback in your situation is that you'll have either to overwrite all your files or restore to a custom location, creating a duplicate photo library (which supposes you have enough free space).
It would be possible to simply use the organizer on the old PC to move the catalog to a new 'custom' folder just under your C: root drive. The catalog would be accessible once the drive is installed in the new PC. However, you would then have all files 'disconnected', because all your media files are now on a drive with a different letter. Reconnecting a whole library is a hard job with PSE6, less so with PSE11/PSE12.
If the idea of fiddling with the sqlite database with external sqlite manager tools is ok for you, I can describe the process more in detail. You only copy your catalog folder (as suggested above) to another location. Instead of trying the 'reconnection' way, you install the sqlite utility on the new computer. When the old drive is installed in the new computer, you simply edit a given record in the catalog database, catalog.psedb, and start the organizer with your copied catalog by simply double clicking the 'catalog.psedb' file.
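The catalog-editing step described above can be sketched with Python's built-in sqlite3 module. This is an illustrative sketch only: the real PSE catalog schema is not documented in this thread, so the table and column names below (volume_table, drive_path_if_builtin) are hypothetical stand-ins for whichever record holds the old drive letter, and an in-memory database stands in for the copied catalog.psedb.

```python
import sqlite3

# Hypothetical schema: the actual catalog.psedb table/column names holding the
# drive letter are NOT confirmed here; inspect the file with a sqlite browser first.
conn = sqlite3.connect(":memory:")  # in practice: sqlite3.connect("catalog.psedb")
conn.execute(
    "CREATE TABLE volume_table (id INTEGER PRIMARY KEY, drive_path_if_builtin TEXT)"
)
conn.execute("INSERT INTO volume_table VALUES (1, 'D:\\')")

# Repoint the volume record at the letter the old drive received in the new PC.
conn.execute(
    "UPDATE volume_table SET drive_path_if_builtin = ? "
    "WHERE drive_path_if_builtin = ?",
    ("E:\\", "D:\\"),
)
conn.commit()
print(conn.execute("SELECT drive_path_if_builtin FROM volume_table").fetchone()[0])
```

Keep a backup copy of catalog.psedb before editing; after the update, double-clicking the edited file (as described above) should open the organizer against the corrected paths.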
Even if the last solution is much, much quicker, I would still create a new PSE backup: you can never have too much safety. -
Problem in spooling the clob data to Excel sheet
Hi devs,
Currently I am using the below version:
Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
PL/SQL Release 11.2.0.1.0 - Production
CORE 11.2.0.1.0 Production
TNS for 32-bit Windows: Version 11.2.0.1.0 - Production
NLSRTL Version 11.2.0.1.0 - Production
Here are my DDL statements:
CREATE TABLE tt1 (n1 number, c1 varchar2(1), b1 clob);
I inserted one record into the tt1 table:
INSERT INTO tt1 VALUES (1, 'a', 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa');
The length of the CLOB data in the b1 column is:
SELECT DBMS_LOB.GETLENGTH(b1) FROM tt1 WHERE n1 =1;
1432
1 row selected.
What my problem is:
I am correctly spooling the data into the .xls file, but the CLOB data is placed into separate lines (cells). I need the CLOB data (b1 column) in a single row (cell).
I am able to get the n1 and c1 data into separate cells.
Below is the Excel sheet data (the spaces between rows are separate lines):
number_value character_value clob_value
1 a gaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
Please help me on this issue.
Thanks.
You may need to put double quotes around your CSV string data, so that Excel can recognise it as single values rather than trying to split it across cells/rows.
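The double-quoting advice can be seen in a small Python sketch (illustrative only; the thread itself spools from SQL*Plus): csv.QUOTE_NONNUMERIC wraps every string field in double quotes, so an embedded newline or comma inside the CLOB text stays within one Excel cell.

```python
import csv
import io

# One row whose third field simulates a multi-line CLOB value.
row = [1, "a", "first line of the clob\nsecond line of the clob"]

buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_NONNUMERIC)  # quote all non-numeric fields
writer.writerow(row)

# The newline is preserved inside the quoted field, so Excel keeps it in one cell.
print(buf.getvalue())
```

In SQL*Plus a similar effect can be approximated by concatenating '"' around the column in the SELECT, with SET LONG and SET LONGCHUNKSIZE large enough to avoid truncating the CLOB.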
For example a nicely formed CSV may look like:
empno,ename,job,mgr,hiredate,sal,comm,deptno
7369,"SMITH","CLERK",7902,17/12/1980 00:00:00,800,,20
7499,"ALLEN","SALESMAN",7698,20/02/1981 00:00:00,1600,300,30
7521,"WARD","SALESMAN",7698,22/02/1981 00:00:00,1250,500,30
7566,"JONES","MANAGER",7839,02/04/1981 00:00:00,2975,,20
7654,"MARTIN","SALESMAN",7698,28/09/1981 00:00:00,1250,1400,30
7698,"BLAKE","MANAGER",7839,01/05/1981 00:00:00,2850,,30
7782,"CLARK","MANAGER",7839,09/06/1981 00:00:00,2450,,10
7788,"SCOTT","ANALYST",7566,19/04/1987 00:00:00,3000,,20
7839,"KING","PRESIDENT",,17/11/1981 00:00:00,5000,,10
7844,"TURNER","SALESMAN",7698,08/09/1981 00:00:00,1500,0,30
7876,"ADAMS","CLERK",7788,23/05/1987 00:00:00,1100,,20
7900,"JAMES","CLERK",7698,03/12/1981 00:00:00,950,,30
7902,"FORD","ANALYST",7566,03/12/1981 00:00:00,3000,,20
7934,"MILLER","CLERK",7782,23/01/1982 00:00:00,1300,,10
where the string data has double quotes around it. This also means that comma values within the quotes will not be treated as a new value, but rather as part of the text itself. -
Hi
I'm seeing this critical error on my primary.
SMS_DISCOVERY_DATA_MANAGER Message ID 2636 and 620.
Discovery data manager failed to process the discovery data record (DDR) "D:\Prog.....\inboxes\auth\ddm.box\userddrsonly\adu650dh.DDR", because it cannot update the data source.
Where these ddr's are actually under ddm.box\userddrsonly\BAD_DDRS folder
I see a ton of DDR files in that folder. Not sure if I can delete them, so I moved them to a temp folder. AD User discovery keeps generating them.
Any help ?
Thanks
UK
Check the ddm.log file for more information.
My Blog: http://www.petervanderwoude.nl/
Follow me on twitter: pvanderwoude -
Pl/sql Procedure is Not Creating With the CLOB data Type
Hi,
I am using Oracle 10g Express Edition Release 2. My doubt is: while creating a table with the CLOB data type, the table is created successfully, but while creating a procedure with the CLOB data type I get an error message:
2667/5 PL/SQL: Statement ignored
2667/24 PLS-00382: expression is of wrong type
Then I tried with VARCHAR2(30000) and the procedure is created successfully. Note I have not changed anything in my code except the data type.
I am just confused... why is the procedure not created with the CLOB data type?
Please advice ...
Thank U
SHAN
Hi,
Thanks for the reply... Another example:
CREATE TABLE USER_MAS (USER_ID VARCHAR2 (20 Byte),MAIL_ID VARCHAR2 (255 Byte));
set serveroutput on
declare
atable varchar2(64) := 'USER_MAS';
acolumn varchar2(64) := 'MAIL_ID';
avalue varchar2(64) := 'NEWYORK' ;
dyn_sql clob;
begin
dyn_sql := 'update '||atable||' set '||acolumn||' = '''||avalue|| '''' ;
dbms_output.put_line(dyn_sql);
execute immediate dyn_sql;
end;
commit ;
Error at line 2
ORA-06550: line 9, column 23:
PLS-00382: expression is of wrong type
ORA-06550: line 9, column 5:
PL/SQL: Statement ignored
When i Changed the Data type to varchar2(64)
update USER_MAS set MAIL_ID = 'NEWYORK'
PL/SQL procedure successfully completed.
Commit complete.
I would like to know the reason why the procedure is not created in the Oracle 10g XE DB.
Note: the same script used in an 11g DB creates the procedure successfully.
Why do you need to use CLOB or VARCHAR2 in your temp_num variable when you are sending the parameters as numbers?
In the procedure we create some run-time queries while executing it. There are around 10 run-time queries created.
The size of each query is more than 4000 characters. We then combine all the queries with UNION ALL into the CLOB variable, as a normal VARCHAR2 will not support it.
Please Advice
Thank U
SHAN -
How to transfer the transactional data using ALE
Hi to all ABAP gurus,
I heard that we can transfer transactional data (like SO data, PO data) using the message control technique with ALE technology. Can you please give all the steps of the message control technique with one example? I searched the forum but did not get an answer. Points will be rewarded for good answers. If you want to give links, please give exact links.
Hi,
here is the configuration.
MESSAGE CONTROL (USING EDI / ALE)
For Purchase order (Message control using EDI and message type ORDERS) STEPS
Settings to be done in the Sending system
From NACE
Choose Application EF ( Purchase order), press on condition record
Output type NEU Purchase order / double click; choose Purchasing output determination: Doc type/purch org/vendor. Key in the data (Purchasing doc type NB, Purch organisation 0001, Vendor vendor11) and execute. Key in the data (vendor vendor11, name name1, partner function VN, Partner Vendor11, Medium 6 (EDI), time 4 (send immediately), language EN).
Press on output types , choose NEU Purchase order, double click, Access sequence 0001(doctype/purcorg/vendor) tick mark the access to condition, go to default values (dispatch time send immediately, Transmission medium EDI, Partner function VN)
Create RFC destination from SM59 (ZNARA_ALE_EDI).
Create a port from ZNARA_ALE.
Create partner profile vendor11 under LI and not LS. PARTNER NUMBER SHOULD BE SAME AS THAT OF VENDOR SET IN THE CONDITION TYPE.
In the outbound parameters of the partner profile Vendor11, key in the data (Receiver port ZNARA_ALE,
Basic type ORDERS05, Message type ORDERS). Go to the message control and key in the data (Application EF, Message type NEU, Process code ME10, Transfer immediately).
Create a purchase order from ME21 with the data (Purchasing doc type NB, Purch organisation 0001, Vendor vendor11). The system will generate an outbound IDoc automatically, which can be seen in WE02.
For Purchase order (Message control using ALE and Message type ORDERS) - STEPS
To transfer the idoc from Client 555 to 500
Settings to be done in 555.
From NACE
Choose Application EF Purchase order, press on condition record
Output type NEU Purchase order / double click; choose Purchasing output determination: Doc type/purch org/vendor. Key in the data (Purchasing doc type NB, Purch organisation 0001, Vendor vendor11) and execute. Key in the data (vendor vendor11, name name1, partner function VN, Partner Vendor11, Medium A (ALE), time 4 (send immediately), language EN).
Press on output types , choose NEU Purchase order, double click, Access sequence 0001(doctype/purcorg/vendor) tick mark the access to condition, go to default values (dispatch time send immediately, Transmission medium ALE, Partner function VN)
From SALE create two logical systems and assign them to the respective clients (ZALERECV_N 500, ZALESEND_N 555).
From SM59 create RFC destination with the same name as that of the receiving logical system (ZALERECV_N).
From BD 64(distribution model), create a model view with Technical name ZNARA_PO (message type ORDERS, Sender ZALESEND_N, Receiver ZALERECV_N). Generate partner profiles. Distribute.
Note that partner profile will be generated under LS.
Settings to be done in 500.
From SALE create two logical systems and assign them to the respective clients (ZALERECV_N 500, ZALESEND_N 555).
From BD64 (distribution model), keep the cursor on Technical name ZNARA_PO. Generate partner profiles. Note that the partner profile will be generated under LS. Change the process code to ORDE in the inbound parameters.
Now in client 555, create a purchase order from ME21 with the data (Purchasing doc type NB, Purch organisation 0001, Vendor vendor11). The system will generate an outbound IDoc automatically, which can be seen in WE02. The purchase order details will be updated in 500 through the inbound IDoc.
IMP data for message control in Sales order creation.
Partner profile to be created under KU.
Outbound parameters in the partner profile (Message type ORDRSP, Basic type ORDERS02, partner function SP). In message control (Application V1, Message type BA00, Process code SD10).
Notes:
Check the entries in the NAST table to see whether the outbound IDoc has been created successfully or not.
If you have chosen to collect IDocs in the partner profile, use transaction code BD87 to process them.
Table EDP13 - Partner Profile: Outbound (technical parameters)
Table EDP21 - Partner Profile: Inbound (technical parameters)
Please reward if useful. -
My computer that my iPhone 4 is backed up on crashed... I bought a new computer as well as the iPhone 5. How can I transfer my data/pics/notes/songs etc. to my new phone?
Use the backup you have of the old computer and restore that data to the new computer. Also, any backups that you have made of your iTunes library (it has a method to back itself up as well) can be copied back over to the new computer. If you attempt to sync the phone to the new computer without the data, it will delete that data from the phone. The iPhone is not designed as a storage/backup device and as such will not sync music from itself to a new computer.
If by chance you had not been backing up the computer and/or iTunes, you can google for some third party software, such as Touchcopy that will do some of what you are asking. -
Replicating data once again to CRM after initial load fails for few records
My question (to put it simply):
We performed an initial load for customers and some records error out in CRM due to invalid data in R/3. How do we get the data into CRM after fixing the errors in R/3?
Detailed information:
This is a follow up question to the one posted here.
Can we turn off email validation during BP replication ?
We are doing an initial load of customers from R/3 to CRM, and those customers with invalid email address in R/3 error out and show up in SMW01 as having an invalid email address.
If we decide to fix the email address errors in R/3, these customers should then be replicated to CRM automatically, right (since the deltas for customers are already active)? The delta replication takes place, but then we get this error message: "Business Partner with GUID 'XXXX...' does not exist".
We ran the program ZREPAIR_CRMKUNNR provided by SAP to clear out any inconsistent data in the intermediate tables CRMKUNNR and CRM_BUT_CUSTNO, and then tried the delta load again. It still didn't seem to go through.
Any ideas how to resolve this issue?
Thanks in advance.
Max
Subramaniyan/Frederic,
We already performed an initial load of customers from R/3 to CRM. We had 30,330 records in R/3 and 30,300 of them have come over to CRM in the initial load. The remaining 30 show BDOC errors due to invalid email address.
I checked the delta load (R3AC4) and it is active for customers. Any changes I make to customers already in CRM come through successfully. When I make changes to customers with an invalid email address, the delta gets triggered, the data comes through to CRM, and I get the BDoc error "BP with GUID XXX... does not exist".
When I do a request load for that specific customer, it stays in "Wait" state forever in "Monitor Requests"
No, the DIMA did not help Frederic. I did follow the same steps you had mentioned in the other thread, but it just doesn't seem to run. I am going to open an OSS message with SAP for it. I'll update the other thread.
Thanks,
Max -
How can I transfer the meta data to the Oracle OLAP Object?
When I finished building a simple data warehouse demo in Warehouse Builder, I got metadata. But it seems that it is not present in the "<Database>" - "OLAP" - "Dimension" node where the sample "SH" resides. I just want to export that metadata to a SQL script, but I am failing to import the script into the database. How can I transfer my demo's metadata to the OLAP objects? It is the last step to finish the whole process and I can't solve the problem.
I'll appreciate anyone who can help me.
Hello,
To enter/move the metadata for dimensions and facts to the OLAP catalog, you need to perform the following actions:
1) Create a business area in the OWB business tree holding the dimensions and facts you want in OLAP
2) Perform a metadata export, using the bridge (not to file!)
3) This will launch a wizard that will guide you through this process, follow the instructions carefully
4) Upon completion you will see metadata added into the OLAP catalog
We will be releasing a new version for this, which will be running on 9.2.0.2 database that allows you to use all the new 9i Release 2 OLAP features. This will create Analytic Workspaces from Warehouse Builder and should be released late this year (around end of November).
Hope this helps,
Jean-Pierre -
Cannot transfer the partner data from ROS to EBP
Hi, All
We are using SRM 5.0, and set up the ROS and SUS in one client, and EBP in another client. All the configuration was done; we maintained the table BBP_MARKETP_INFO via SM30 with the following record:
COMPANY_ROOT VENDOR_ROOT DUMMY_EKORG USER_ROOT DEF_CURRENCY
O 50000001 50000025 O 50000001 O 50000001 USD
When we create the business partner in BBPMAININT in the EBP client via "Business Partner from List", there is the error message: "Partner data not transferred from catalog. Inform system admin." Has anyone encountered this case? Please share your experience. Thanks a lot.
GY
Hi,
Please check OSS Note 1089734 - Error when transferring supplier from ROS to EBP.
1) Apply the note if it is relevant for your release and support pack.
2) Then also go through all the parameters that are mentioned in the note, whether you have maintained them in your system or not.
The note mentions the following symptom:
A user tries to transfer supplier data from ROS (Registration of Supplier system) to EBP (Enterprise Buyer Professional system). In the process of transferring, an error message appears saying "Currency of purchasing organization could not be determined", even when a default currency attribute is maintained in PPOMA_BBP for the user or for the purchasing organisation of the user.
In case the default currency attribute is not maintained in PPOMA_BBP for the user or for the purchasing organisation, then at the time of transfer of supplier data from ROS to EBP, an error message pops up saying "Partner data not transferred from catalog. Inform system admin".
Edited by: khan voyalpad usman on Sep 13, 2008 10:30 AM -
Load multiple files using the same data load location
Has anybody tried loading multiple files using the same load location? I need to do this as the data in these multiple files will need to be exported from FDM as a single export file. The problem I am facing is more user-related: since these files will be received at different points of time, users will need a way to tell them what has been loaded and what is yet to be loaded.
Is it possible to throw a window on the web browser with OK and Cancel buttons from an event script?
Any pointers to possible solutions will be helpful.
Was able to resolve this. The implementation method is as follows:
Take a backup of previously imported data in the BefCleardata event script. Then, in the BefFileImport event, append the data to the import file. There are many other intricacies, but this is the broad implementation logic. It allowed my users to load multiple files without worrying about append-or-replace import type choices.
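The backup-then-append logic described above can be sketched as follows. This is an illustrative Python sketch only (real FDM event scripts are VBScript, and the paths and function name here are hypothetical): back up the cumulative import file before a clear, then append each newly received file to it so several deliveries load as one data set.

```python
import shutil
from pathlib import Path

def stage_incoming_file(incoming: Path, import_file: Path, backup_dir: Path) -> None:
    """Hypothetical helper mirroring the event-script logic described above."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    if import_file.exists():
        # BefCleardata step: keep a copy of the previously imported data.
        shutil.copy2(import_file, backup_dir / import_file.name)
    # BefFileImport step: append the new delivery to the cumulative import file.
    with import_file.open("a", encoding="utf-8") as out:
        out.write(incoming.read_text(encoding="utf-8"))
```

Called once per delivery, import_file accumulates everything received so far, which is what lets FDM export the set as a single file.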
How to load this value into the master data display attribute
Hi ,
Please share with me the knowledge of how to load this kind of data into master data display attributes:
Raj + Ravi Ltd (PCG: 13592)
While loading this data, I got an error message stating that '+' should not be part of the attribute and '()' should not be part of the attribute, but I need all the information as it is available in the example data.
Do I need to maintain RSKC settings, or are some other things required?
Please guide me ..
Regards,
Raj
Hi,
Maintain these symbols in RSKC and try to reload the data.... -
How To Transfer the master data from R/3 to APO System
Hi, this is Mohamed. I am new to APO.
Can anyone guide me through the configuration for transferring the master data to the APO system? I am using SCM 4.0.
Thanks in advance
Regards
Mohamed.
Welcome to SDN.
refer this link
creating integration model
Give your mail ID and I will send the document which I have.
With regards.
Maya. -
Initial load of G/L account master data
Does anyone have any familiarity with built-in tools SAP provides for bulk loading of master data? I have always been aware of LSMW, but if there is any documentation out there around LSMW or other tools SAP may provide, it would be much appreciated.
Moderator: Please search before posting.
Hi,
Many clients also use BDC reports to upload master data. You can refer to the below link for the step-by-step process:
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/309cb157-738d-2910-7691-b74c4ddba3c7?quicklink=index&overridelayout=true
Regards,
Kiron Kumar T.