Data Pump Process Abended
Hi, I'm trying to replicate data from a source (OLTP) to a target (OLAP). The initial load succeeded, but there is an error after I configure the Extract, Data Pump, and Replicat processes. Please guide me on how to fix it.
#Configure Extract Process
GGSCI (dbserver1) 3> EDIT PARAMS EOLTP01
EXTRACT EOLTP01
USERID gg_admin, PASSWORD demo123
EXTTRAIL ./dirdat/sa
TRANLOGOPTIONS ASMUSER SYS@ASM, ASMPASSWORD demo123
TABLE SRC.DEPT;
TABLE SRC.EMP;
GGSCI (dbserver1) 1> add extract EOLTP01, tranlog, begin now, threads 2
GGSCI (dbserver1) 2> add exttrail ./dirdat/sa, extract EOLTP01,megabytes 50
#Configure Data Pump Process
GGSCI (dbserver1) 4> EDIT PARAMS EPMP01
EXTRACT EPMP01
PASSTHRU
RMTHOST dbserver2, MGRPORT 7809
RMTTRAIL ./dirdat/ta
TABLE SRC.DEPT;
TABLE SRC.EMP;
GGSCI (dbserver1) 3> add extract EPMP01, exttrailsource ./dirdat/sa, begin now, threads 2
GGSCI (dbserver1) 3> ADD RMTTRAIL ./dirdat/ta, EXTRACT EPMP01, MEGABYTES 50
#Start Extract Process & Data Pump Process
GGSCI (dbserver1) 4> start extract EOLTP01
Sending START request to MANAGER ...
EXTRACT EOLTP01 starting
GGSCI (dbserver1) 5> start EXTRACT EPMP01
Sending START request to MANAGER ...
EXTRACT EPMP01 starting
#Configure Replicat Process on Target
GGSCI (dbserver2) 1> EDIT PARAMS ROLAP01
REPLICAT ROLAP01
SETENV (ORACLE_SID=OLAP)
USERID ggs_admin, PASSWORD ggs_admin
DISCARDFILE ./dirrpt/rolap01.dsc, PURGE
MAP SRC.DEPT, TARGET TGT.DEPT;
MAP SRC.EMP, TARGET TGT.EMP;
GGSCI (dbserver2) 1> add replicat ROLAP01, exttrail ./dirdat/ta
GGSCI (dbserver2) 2> start replicat ROLAP01
Sending START request to MANAGER ...
REPLICAT ROLAP01 starting
After that, I inserted some rows into the source table.
But on the source:
GGSCI (rx2600-2) 8> info all
Program Status Group Lag at Chkpt Time Since Chkpt
MANAGER RUNNING
EXTRACT RUNNING EOLTP01 00:00:00 00:00:07
EXTRACT ABENDED EPMP01 00:35:31 00:02:16
When I view the report for EPMP01:
Source Context :
SourceModule : [er.api]
SourceID : [scratch/aime1/adestore/views/aime1_staoi06/oggcore/OpenSys/src/app/er/api.c]
SourceFunction : [XT_read]
SourceLine : [4054]
2012-09-05 15:39:12 ERROR OGG-01091 Unable to open file "M-qM-o#DM-WM-!@065536" (error 2, No such file or directory)
Thanks a lot.
ThoNguyen wrote:
When I view the report for EPMP01:
2012-09-05 15:39:12 ERROR OGG-01091 Unable to open file "M-qM-o#DM-WM-!@065536" (error 2, No such file or directory)
That's an interesting trail file name. It could be that there are special characters somehow in your .prm file. Try:
$ cat -vEt dirprm/EPMP01.prm
...and look for special characters in the trail file name.
What is the source OS where the pump is running? (Perhaps you should display the entire pump report file -- it has all this info in it, including original parameters.) What is the target OS where the remote trails are created? Are the character sets different?
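To illustrate the `cat -v` check suggested above, here is a small sketch. The file path and the injected bytes are hypothetical stand-ins for your dirprm/EPMP01.prm; the point is that non-printing bytes show up in M-x notation, just like the garbled trail name in the OGG-01091 error.

```shell
# Hypothetical .prm file with a stray multi-byte character in the trail
# path, reproducing the kind of corruption that yields OGG-01091.
printf 'EXTRACT EPMP01\nRMTTRAIL ./dirdat/\xc3\xb1ta\n' > /tmp/EPMP01.prm

# cat -v renders non-printing bytes visibly (e.g. M-CM-1), so a
# corrupted trail name stands out immediately.
cat -v /tmp/EPMP01.prm

# grep in the C locale flags any line containing a byte outside
# printable ASCII; a match means the file has hidden characters.
LC_ALL=C grep -n '[^ -~]' /tmp/EPMP01.prm
```

If the grep prints a line number, edit the parameter file in a plain-ASCII editor and retype the trail path by hand.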
Similar Messages
-
To retrieve data after process abended
Hi everyone,
Is there any method of replicating the DDL/DML statements fired after a process such as the Replicat has abended?
Currently I am using: alter replicat rep1 begin now
It seems to make the process run again, but I lose the data that should have been replicated after the process abended.
Please help.
Thanks,
Saurav
Hi,
1.
My suggestion is NOT to use the BEGIN NOW option with Replicat. This will make your target tables go out of sync and miss some of the DML/DDL, because you are altering the Replicat to process from the current timestamp instead of from its recovery point.
GGSCI>alter replicat rep1 begin now
Instead, you should always start at seqno 0 and RBA 0 if you are starting fresh or starting the Replicat for the first time:
GGSCI>alter replicat <replicat_name>, Extseqno 0, Extrba 0
GGSCI>start replicat <replicat_name>
OR
If the Replicat abended earlier and you would like to restart it, simply restart it: it has all the checkpoint information and will automatically resume processing data from the recovery checkpoint (trail file).
GGSCI>start replicat <replicat_name>
OR
If you intend to start processing from a specific trail file, then use the command below.
Example:
$cd trail
$ls -ltr
lt000123
lt000124
lt000125
If you are sure that the Replicat processed trail file 123 and you would like to start from trail file seqno 124, then try:
GGSCI>alter replicat <replicat_name>, Extseqno 124, Extrba 0
GGSCI>start replicat <replicat_name>
OR
If you want to start processing from a specific record or the start of a transaction within a trail file, you can use the LOGDUMP utility to find the RBA and use that to alter the Replicat.
GGSCI>alter replicat <replicat_name>, Extseqno 124, Extrba 8999934
GGSCI>start replicat <replicat_name>
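As a small illustration of the seqno arithmetic above (the trailing digits of a trail file name are its sequence number), here is a sketch that finds the highest seqno in a trail directory. The /tmp path and file names are fabricated to match the `ls -ltr` listing in the example.

```shell
# Fabricate a trail directory like the listing above.
mkdir -p /tmp/trail
touch /tmp/trail/lt000123 /tmp/trail/lt000124 /tmp/trail/lt000125

# Strip the two-character trail prefix and the leading zeros, then take
# the highest sequence number -- useful when deciding which Extseqno to
# pass to ALTER REPLICAT.
ls /tmp/trail | sed 's/^lt0*//' | sort -n | tail -1
```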
2.
As you have already lost some of the data and your target DB is already out of sync, I would suggest the method below to resync it:
Full database export using expdp (Oracle source to Oracle target)
1) Enable Minimal Supplemental Logging in Oracle on source
SQLPLUS > alter database add supplemental log data ;
2) Enable Supplemental Logging at Table Level on source
GGSCI> dblogin userid xxxxx password xxxxxx
GGSCI> add trandata <schema>.<tablename>
3) Add Extract, Add Exttrail, Add Pump, Add Rmttrail on source
4) Start Extract, Start Pump on source
5) Create a database directory:
SQLPLUS> create directory dumpdir as '<some directory>' ;
6) Get the current SCN on the source database:
SQLPLUS> select current_scn from v$database ;
28318029
7) Run the export using the flashback SCN you obtained in the previous step. The following example runs the expdp utility at a degree of parallelism (DOP) of 4. If you have sufficient system resources (CPU, memory, and I/O), running at a higher DOP will decrease the time the export takes (up to 4x for a DOP of 4). Note that expdp uses Oracle Database parallel execution settings (e.g. parallel_max_servers), which have to be set appropriately in order to take advantage of parallelism; other processes running in parallel may be competing for those resources. See the Oracle documentation for more details.
expdp directory=dumpdir full=y parallel=4 dumpfile=ora102_%u.dmp flashback_scn=28318029
Username: sys as sysdba
Password:
Note: The export log needs to be checked for errors.
8) Start an import using impdp to the target database when step 7 is complete.
9) Add and Start Replicat:
GGSCI> add replicat <rep_name>, exttrail ./dirdat/<xx>
GGSCI> start replicat <rep_name>, aftercsn <value returned from step 6>
Hope this information helps.
Thanks & Regards
SK -
Hello,
Before the Replicat process abended, I received warnings, and rows were inserted into the exceptions table:
"Error - OCI Error ORA-14400: inserted partition key does not map to any partition". One of the tables did not have the partitions on the target database. 51 rows were inserted into the exceptions table.
Replicat process abended after the database restart.
Steps followed to start the Replicat process:
a) We have added the missing partitions to that table.
b) Get the RBA info from: info replicat repa, detail
c) alter replicat repa, extrba 80711767
The Replicat is running successfully, but the problem is that the source database row count is not in sync with the target database.
The source record count is 2000, but the target count is 500; nearly 1500 records were not replicated.
Is there any way to replicate the missing records to the target machine through GoldenGate?
Should we use the Data Pump utility to load the data instead?
Please help me!!!
Thanks,
951419
Do the logs (or report) show anything?
-
ERROR OGG-00685 begin time : extract process abending
Hi Gurus,
I have installed Oracle GoldenGate in the environment below, but I am unable to start the Extract process. I am new to OGG and its production setup.
Primary site (source): this is the production DB, and it is up and running.
RAC 2 nodes
Oracle RAC 11.2.0.1.0
ASM
Oracle GoldenGate 11g Release 1 (11.1.1.0.0)
Enterprise Linux Server release 5.5 (Carthage)
Processor Type x64
OS 64 bit
target Site (destination):
Standalone single server - Non RAC
Oracle version 11.2.0.1.0
ASM
Oracle GoldenGate 11g Release 1 (11.1.1.0.0)
Enterprise Linux Server release 5.5 (Carthage)
Processor Type x64
OS 64 bit
Error is as below:
2012-08-12 22:57:09 ERROR OGG-00685 begin time May 1, 2011 11:06:53 AM prior to oldest log in log
2012-08-12 22:57:09 ERROR OGG-01668 PROCESS ABENDING.
Please suggest; I am stuck.
Thanks
Edited by: user13403707 on 16 Aug, 2012 9:07 AM
ggsci> alter urban begin now
Or explicitly specify a date time:
ggsci> alter urban begin 2012-08-13 12:00:00
I also noticed that this line:
table urbanlive.*
is missing the closing semicolon. It should be:
table urbanlive.*;
Make sure you read up on open transactions. This is important when you begin capturing change data and performing your initial synchronization. If you don't understand this, you will not initialize properly and you'll always be out of sync.
Good luck,
-joe -
Hello All,
Following is the scenario
Source - HP-UX Oracle 10.2
add extract ext1
edit params ext1
extract ext1
setenv (NLS_LANG=AMERICAN_AMERICA.AL32UTF8)
userid ggs_owner, password ggs_owner
rmthost 115.124.104.167, mgrport 7809
rmttask replicat, group ext2
table sapsr3.*;
On Target - Red Hat 5.8, Oracle 10.2
add replicat ext2
edit params ext2
replicat ext2
setenv (NLS_LANG=AMERICAN_AMERICA.AL32UTF8)
userid ggs_owner, password ggs_owner
ASSUMETARGETDEFS
map sapsr3.*, target sapsr3.*;
The result of "view ggsevt" on target gives the following error at the last -
ERROR OGG-00446 Oracle GoldenGate Delivery for Oracle, ext2.prm: Invalid data source -1 in checkpoint file /opt/oracleGG/dirchk/EXT2.cpr.
ERROR OGG-01668 Oracle GoldenGate Delivery for Oracle, ext2.prm: PROCESS ABENDING.
Please guide me; I am new to GoldenGate. I have tried hard to find information on the net but failed.
thanks in advanced.
Krishna.
Hi,
You are following a one-time initial load process, so there is no need for a checkpoint file.
For performing a one-time initial load, please follow the link below; it gives clear steps:
http://gavinsoorma.com/2010/02/oracle-goldengate-tutorial-4-performing-initial-data-load/
Thanks
paddu kandimalla -
Is there any documentation for filter routine in Data Transfer Process?
I am trying to create a filter routine in the Data Transfer Process to select different billing types depending on the date on which the Data Transfer Process runs.
I have searched through SDN and found some examples, but some formal documentation would help.
Is there any documention on filtering in a Data Transfer Process using a routine?
I am in 7.0

DATA: l_dow     TYPE i,
      l_s_range TYPE rssdlrange,
      l_t_types TYPE TABLE OF rssdlrange-low,
      l_type    TYPE rssdlrange-low.

* Billing types that are always selected
APPEND 'F2'   TO l_t_types.
APPEND 'G2'   TO l_t_types.
APPEND 'L2'   TO l_t_types.
APPEND 'ZCDD' TO l_t_types.
APPEND 'ZCDI' TO l_t_types.
APPEND 'ZCR1' TO l_t_types.
APPEND 'ZCR2' TO l_t_types.
APPEND 'ZDR1' TO l_t_types.
APPEND 'ZEDI' TO l_t_types.
APPEND 'ZMD'  TO l_t_types.
APPEND 'ZRE'  TO l_t_types.
APPEND 'ZRE1' TO l_t_types.
APPEND 'ZRED' TO l_t_types.
APPEND 'ZSMP' TO l_t_types.
APPEND 'ZUSD' TO l_t_types.
APPEND 'ZUSI' TO l_t_types.

* On day 5 of the week, also select S1 and S2
CALL FUNCTION 'DATE_COMPUTE_DAY'
  EXPORTING
    date = sy-datum
  IMPORTING
    day  = l_dow.
IF l_dow EQ 5.
  APPEND 'S1' TO l_t_types.
  APPEND 'S2' TO l_t_types.
ENDIF.

* Build one selection range row per billing type
LOOP AT l_t_types INTO l_type.
  l_s_range-iobjnm    = '0BILL_TYPE'.
  l_s_range-fieldname = 'BILL_TYPE'.
  l_s_range-sign      = 'I'.
  l_s_range-option    = 'EQ'.
  l_s_range-low       = l_type.
  APPEND l_s_range TO l_t_range.
ENDLOOP. -
Error in Data Transfer Process (DTP) Urgent!
Hi,
I've encountered an error when uploading data from R/3 using a DTP into a specific cube. It says "Exceptions in Subset: Load and Generation" and "Dump: ABAP/4 processor: MESSAGE_TYPE_X". How can I fix this problem? Is there a problem with the upgrade process of our system, or with the kernel? And what is the kernel? Please help.
Thanks,
nips
Please do not post the same question twice.
Error in Data Transfer Process (DTP) Urgent! -
Hi,
I am having problems with a data transfer process; I am getting the message:
Extraction datasource Z_WMS_VRM
Prepare for extraction
Exceptions in Substep: Extraction Completed
Processing terminated
When I click the button behind "Exceptions in Substep" it jumps to a line with
CALL METHOD cl_rsbm_log_step=>raise_step_failed_callstack
in method IF_RSBK_CMD_X~GET_DATAPACKAGE.
When I look in the PSA (13 records) it all looks OK, no weird values. The InfoPackage runs OK and fills the PSA. Screenshot of the PSA contents: http://i44.tinypic.com/sesqw2.jpg
The datasource is an external DS filled by powerexchange 4.
The system's current support pack is 18.
Edited by: Thijs de Jong on Jun 23, 2009 12:00 PM
Edited by: Thijs de Jong on Jun 23, 2009 12:11 PM
Hello Thijs,
The same issue happened to me this week. It can happen for several reasons:
- Some part of the loading flow is not active, i.e. structures were changed and need to be activated again.
- There is no more tablespace in the DB.
- There is a deadlock when reading a table (maybe the table is too big).
Hope this helps.
Cheers,
Stephan -
Issues in Data Transfer Process
Hello All,
After creating a transformation from an InfoSource to an InfoCube, I am now trying to create a data transfer process.
Under DTP Type it displays "DTP for direct process", but I need the DTP type Standard (Can Be Scheduled).
It gives me the error "Source does not support direct access".
Could any one help me to solve this error.
Thanks in advance
Regards,
Nithin
Hi,
You can only use the type "DTP for Direct Access" when the source of the data transfer process is a VirtualProvider.
Check the below links
http://help.sap.com/saphelp_nw04s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
http://help.sap.com/saphelp_nw04s/helpdata/en/42/fb8ed8481e1a61e10000000a422035/frameset.htm
Regards
KP
Edited by: prashanthk on Jan 19, 2011 5:15 PM -
Errors in data transfer process
Hi,
I'm getting an "Errors in data transfer process" message when I try to run conntrans. When I view the detailed error log in troubleshooting (in the client console), I get the following message in the details (see the bold letters). Can someone help me? I have checked all the connections between the mobile client and the CRM server. Will reward points!
2324 96c ! Entering 'PullMessages' for queue 'CRM_SITE_000000000000274' / limit 20 messages
2324 96c Fri Aug 24 20:28:22 2007 E
NewTransferService - _TransferPullMessages : <b>DotNet Stub.TransferPullMessages call failed with fffffffa:</b>(null)
2324 96c E
_TransferPullMessages failed with fffffffa
2324 96c Fri Aug 24 20:28:23 2007 E
regards
Kamalesh K.V
Hi,
We have installed MSA 4.0 (SP12). We have also checked the QMT config in the Mobile Sales bin folder. The test connection is successful (see the message below). We still have problems with data transfer when we run conntrans. Please help.
<b>" NewQmtCnfg Version 2.0 for Windows 2000 - revision 4003
Assume running on Client
1. Try to access Communication Station 'MTV01sdCR02', please wait...
... OK
Try QmtServer component initialize check, please wait...
...OK: QmtServer initialize call returns successfully
2. Try to access CRM Server (Destination '<DEFAULT>'), please wait...
...OK: CRM Server call returns successfully</b>
regards
Kamalesh KV -
LIKP-KODAT from transaction VL02N in header data in processing tab non edi
Hi gurus, I want to make the field LIKP-KODAT from transaction VL02N available in the header data, on the Processing tab.
Is there any way to do this via the SPRO transaction?
Thanks.
Hi,
What do you want to do with the mentioned field? You did not say.
SB -
Data Load process for 0FI_AR_4 failed
Hi!
I am about to implement the SAP Best Practices scenario "Accounts Receivable Analysis".
When I schedule the data load process (in dialog, immediately) for transaction data 0FI_AR_4 and check it in the Monitor, the status is yellow:
On the top I can see the following information:
12:33:35 (194 from 0 records)
Request still running
Diagnosis
No errors found. The current process has probably not finished yet.
System Response
The ALE inbox of BI is identical to the ALE outbox of the source system
or
the maximum wait time for this request has not yet been exceeded
or
the background job has not yet finished in the source system.
Current status
No Idocs arrived from the source system.
Question:
Which actions can I take to run the loading process successfully?
Hi,
The job is still in progress it seems.
You could monitor the job that was created in R/3 (by copying the technical name from the monitor, appending "BI" to it as a prefix, and searching for this in SM37 in R/3).
Keep an eye on ST22 as well if this job is taking too long, as you may already have gotten a short dump for it, and this may not have been reported to the monitor yet.
Regards,
De Villiers -
Data Transfer Process and Delete Overlapping Requests
Hi All,
We are on BW 7.0 (NetWeaver 2004s). We are using the new data transfer process and transformation. We want to use the ability to delete overlapping requests from a cube in a process chain. So let's say we have a full load from an R/3 system with fiscal year 2007 in the selection, using an InfoPackage. It gets loaded to the PSA. From there we execute the data transfer process and load it to the cube. We then execute the delete-overlapping-requests functionality. My question is: will the DTP know that the InfoPackage selection was 2007, so that it will only delete requests with selections of 2007 and not 2006 from the cube? Basically, is the DTP aware of the selections that were made in the InfoPackage?
Thanks,
Scott
Hi everyone,
Figured it out: on a data transfer process you can filter the selection criteria. Go to the Extraction tab of the DTP and click on the filter icon. Enter your selection conditions to pull from the PSA; these selection conditions will be used to delete the overlapping requests from the cube.
Thanks -
How to design data load process chain?
Hello,
I am designing data load process chains for the first time and would like to get some general information on best practices in that area.
My situation is as follows:
I have 3 source systems (R3 and two for which I use flat files).
What do you suggest: should I define one big chain for my entire loading process (I have about 20 InfoSources), or define a few shorter ones, e.g.
1. Master data R3
2. Master data flat file system 1
3. Master data flat file system 2
4. Transaction data R3
5. Transaction data file sys 1
... and execute them one after another upon successful completion?
Could you also suggest me any links or manuals on that topic?
Thank you
Andrzej
Andrzej,
My advice is to make separate chains for master and transaction data (always load in this order!) and afterwards make a 'master chain' where you insert these two chains one after the other (so: Start process -> Master data chain -> Transaction data chain).
Regarding the separate chains: parallelize as much as possible (if functionally allowed). Normally, the number of parallel ('vertical') chains equals the number of CPUs available (check with your Basis person).
Hope this provides you with enough info to start off with!
Regards,
Marco -
Hi:
I have active-active bidirectional replication in Oracle GoldenGate.
When I make a change in database A, it replicates to database B; for example, if I insert the values (1,'A') in database A, they replicate to database B.
But when I try to insert (2,'B') in database B, the process abends and displays a unique constraint error. In the discard file it displays an error for the (1,'A') record.
Discard File:
Current time: 2011-05-11 09:56:32
Discarded record from action ABEND on error 1
OCI Error ORA-00001: unique constraint (PRS.SYS_C0011963) violated (status = 1), SQL <INSERT INTO "PRS"."TEST_DEMO" ("SNO","ENAME") VALUES (:a0,:a1)>
Aborting transaction on D:\REMOTE_TRAIL_PD1EPCS_EPDBMAINB2_EPDBMAINB1\dirdat\pt beginning at seqno 2 rba 4409
error at seqno 2 rba 4409
Problem replicating PRT.TEST_DEMO to PRS.TEST_DEMO
Mapping problem with insert record (target format)...
SNO = 8
ENAME = H
Process Abending : 2011-05-11 09:56:32
Please help
Regards,
Abhishek
Well, it looks like your databases are not, or were not, synchronized. The record already exists in A but did not exist in B, so the insert in B worked, and the Replicat on A abended. If you expect more records like that, you can add exception handling on A.
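Exception handling of that kind is usually done with REPERROR and an EXCEPTIONSONLY map in the Replicat parameter file. The sketch below is illustrative only: it reuses the table names from the discard file, but the exceptions table PRS.TEST_DEMO_EXC, the Replicat name, the credentials, and the extra columns in the COLMAP are all assumptions you would replace with your own.

```
REPLICAT RPRS01
USERID gg_admin, PASSWORD demo123
-- On any mapped-table error, run the exceptions MAP instead of abending
REPERROR (DEFAULT, EXCEPTION)
MAP PRT.TEST_DEMO, TARGET PRS.TEST_DEMO;
-- Failed operations land in the exceptions table with error details
MAP PRT.TEST_DEMO, TARGET PRS.TEST_DEMO_EXC, EXCEPTIONSONLY,
INSERTALLRECORDS,
COLMAP (USEDEFAULTS,
  errnum = @GETENV ('LASTERR', 'DBERRNUM'),
  optype = @GETENV ('LASTERR', 'OPTYPE'));
```

The exceptions table needs the source columns plus the extra error columns; rows that violate the unique constraint then get recorded there for later reconciliation instead of stopping the Replicat.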