Tolerance limit exceeded
Dear Friends,
In the cockpit, the document shows the error 'tolerance limit exceeded'. The PO value is Rs. 5000 and the invoice is also Rs. 5000. How do I analyze this error?
If a document is blocked, how can I find out the reason for the block?
Which transaction codes are associated with this?
Please explain in detail.
Regards
Sridhar
Dear,
Can you check what tolerance limits are maintained:
IMG -> MM -> Logistics Invoice Verification -> Invoice Block -> Set Tolerance Limits (transaction OMR6).
Regards,
Syed Hussain.
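For intuition, the check behind a tolerance key can be pictured as comparing the invoice/PO variance against an absolute and a percentage upper limit. The following Python sketch is purely illustrative; the limit values are invented, and real OMR6 keys also carry separate lower limits and are maintained per company code:

```python
def check_tolerance(po_value, invoice_value, abs_limit, pct_limit):
    """Upper-limit tolerance check, illustrative only.

    Real LIV tolerance keys in OMR6 have separate lower/upper limits,
    each as an absolute amount and a percentage; this just shows the idea.
    """
    variance = invoice_value - po_value
    pct_variance = 100.0 * variance / po_value if po_value else 0.0
    if variance > abs_limit or pct_variance > pct_limit:
        return "blocked"
    return "ok"

# PO Rs. 5000 vs invoice Rs. 5000 gives zero variance, so a price-variance
# key would not fire; with invented limits of Rs. 100 / 5%:
print(check_tolerance(5000, 5000, abs_limit=100, pct_limit=5))  # ok
print(check_tolerance(5000, 5600, abs_limit=100, pct_limit=5))  # blocked
```

Since PO and invoice values are equal in Sridhar's case, a pure price check would not block; the block would then have to come from another tolerance key (quantity, date, etc.), which is worth checking in the blocked invoice's details.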
Similar Messages
-
Tolerance Key LD: Blanket purchase order time limit exceeded
Hi All,
Can anybody tell me how the invoice block tolerance key LD (Blanket purchase order time limit exceeded) works?
As per my understanding, we maintain a number of days in tolerance key LD, so that the system checks the invoice date against the contract validity plus the tolerance days and blocks the invoice if it exceeds that.
But the LD tolerance limit setting is asking for a value! I would expect a number of days. Kindly let me know.
Also, please explain the settings and steps needed to run this scenario.
Thanks!
Nanda
Steve,
Can you answer Nanda's question? I haven't really used this LD tolerance before. I tried it in my sandbox system and can see the cause of his confusion, although I am unable to explain it.
Thanks,
H Narayan -
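If LD indeed works as Nanda describes (comparing the invoice date against the blanket PO's validity end plus a number of days), the logic would be roughly the following. This is a sketch of that understanding only, not confirmed SAP behavior, and the dates and function name are invented:

```python
from datetime import date, timedelta

def ld_blocks_invoice(invoice_date, validity_end, tolerance_days):
    """Block if the invoice arrives more than tolerance_days after the
    blanket PO's validity end (one illustrative reading of key LD)."""
    return invoice_date > validity_end + timedelta(days=tolerance_days)

# Validity ends Aug 31, tolerance 5 days: invoices after Sep 5 would block.
print(ld_blocks_invoice(date(2009, 9, 10), date(2009, 8, 31), 5))  # True
print(ld_blocks_invoice(date(2009, 9, 3), date(2009, 8, 31), 5))   # False
```

This reading would be consistent with the field expecting a number of days even though the customizing screen labels it as a value.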
Quantity Tolerance limit for goods receipt
Hello Friends,
We are working on the stock transport order scenario, and the steps are as below
1. Stock Transport Order
2. Outbound Delivery
3. Post Goods Issue
4. Inbound Delivery
5. Goods Receipt
In our scenario, we do not maintain any price in the stock transport order, and the Invoice Receipt indicator at item level of the STO is unchecked. We set the tolerance limit at stock transport order level, but during goods receipt, if we post excess stock, the system does not allow the posting and shows the error below:
Message no. M7022: PU Withdrawn quantity exceeded by ...
Please suggest. Thanks in advance for your support.
Best Regards,
Goutham
Dear Friends,
Thanks a lot for your time. The problem is resolved: we maintained the tolerance limits at purchase order level, and these are reflected at goods receipt from the inbound delivery. We cannot use either the purchase value key or the GR/IR tolerance, since our stock transport orders don't have a price and are not relevant for invoice receipt. Hence, the tolerance limit in the STO is the feasible option for us.
Best Regards,
Goutham -
Hi all,
I have configured both the lower and upper limits, percentage and absolute value, as 0% / 0 for tolerance key PP.
However, when I post an invoice of $90 for a PO of $100, the invoice is not blocked for payment, but I see the warning message "Price too low (below tolerance limit of 0.00)".
On the other hand, when I post an invoice higher than the PO value, the invoice is blocked for payment.
Does anyone know the possible reasons for this?
Thanks.
Hi,
Yes, but many people set a tolerance for these invoices (undercharges) so that the message appears only when the undercharge is large, because that may indicate a mistake in the input of the invoice (vendors very rarely undercharge). The message can be set as an error instead of a warning when the tolerance is exceeded.
So I would set the lower tolerance to a reasonable value and inform your entry users that if they get a warning about the price being too low, they should check their entry for mistakes and continue if everything looks right, or inform the purchasing department if something looks very wrong.
Steve B -
Invoice blocking due to tolerance limit at header Invoice Value.
Hi,
Our client needs a tolerance limit set on the total invoice value. The invoice contains multiple line items, and the tolerance may be a percentage or a dollar value. The standard setting I know of applies to each invoice line item, not to the cumulative value. The issue: if each line item exceeds the PO but stays within its own tolerance, the total invoice value can still exceed considerably.
Please let me know how to deal with this issue.
Hi Mandar,
Set Tolerance Limits:
In this step, you specify the tolerance limits for each tolerance key for each company code.
When processing an invoice, the R/3 System checks each item for variances between the invoice and the purchase order or goods receipt. The different types of variances are defined in tolerance keys.
As per your requirement, it is not possible to compare the PO net header value with the GR total and the total invoice value, because the PO, GR and invoice may contain different line items.
Invoice verification is always done at item level, not at header level, so SAP provides all the keys at item level only.
Reward points if helpful.
Regards
Sanjay L -
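The gap Mandar describes shows up as soon as you run per-item checks with an absolute limit: every line can pass individually while the document total drifts far past the limit. A small Python illustration (the $10 limit and the function are invented for this sketch):

```python
def item_vs_header(po_lines, inv_lines, abs_limit):
    """Compare the standard per-item check with a hypothetical
    header-total check. Illustrative only, not SAP's logic."""
    # Standard LIV behavior: each line is checked against the limit alone.
    item_blocked = any(inv - po > abs_limit
                       for po, inv in zip(po_lines, inv_lines))
    # What the client wants: the cumulative variance checked once.
    header_exceeded = sum(inv_lines) - sum(po_lines) > abs_limit
    return item_blocked, header_exceeded

# Ten items, each invoiced $9 over against a $10 per-item limit:
# no single item blocks, yet the invoice total is $90 over.
print(item_vs_header([100] * 10, [109] * 10, 10))  # (False, True)
```

A header-level check like the second return value is what the client is asking for; as Sanjay notes, standard LIV does not provide it, so it would have to be a custom enhancement if it is needed at all.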
Tolerance limit action "1" not working in availability control
In my project I am unable to see the effect of the warning on the budget, even though activation type 1 with tolerance limit action "1" is set at 50% of the budget.
The actual costs are posted via MB1A against reservations to the activity, and material reservations are created.
Even when the costs exceed 50%, the system status for the said WBS only shows BUDG ISBD AVAC.
The system does give the error on reaching the 100% cost level, i.e. tolerance limit action "3" runs successfully.
Replies/suggestions shall be appreciated.
Thanks
Rajeev
Hi,
We are also getting the same error message when posting from an external system. Could you please let me know if you were able to resolve it? -
hi,
We specified a limit in a tolerance group in transaction OBA3 for controlling customer incoming payment differences, but the system is accepting differences which exceed the tolerance limit.
Awaiting your reply,
Thank you
Hi,
OBA0 is the tolerance for G/L accounts, not for employees. My requirement is to control payment differences on customer accounts, i.e. while receiving incoming payments from customers we have to control the payment differences through a tolerance limit.
Thank you, -
Tolerance limit for physical inventory
Hi
In physical inventory there is an option to set tolerance limits at:
A. document level
B. item level
1. Is this tolerance limit setting quantity-based or value-based?
2. In my case the count exceeds the tolerance limits set at document level and item level. How do we post the document?
Is there any release step for posting the difference when it exceeds the tolerance limit?
Regards
Christo
Hi
My question is: if the item value exceeds the tolerance limit, the system will not allow posting the inventory difference document.
So how do we post this document?
Regards
Christo -
Hi Experts,
Need some urgent input. I am using L_TO_CREATE_DN to create a TO for a delivery. Certain deliveries have an over-tolerance limit such as 10%, 20%, etc., which means picking should be allowed even if the pick quantity is greater than the delivery quantity but within the over-tolerance limit.
Example
If a delivery has delivery qty (in VL03N) = 1000 and tolerance limit = 10%, the maximum possible pick is 1000 + (1000 * 10/100) = 1100.
So if I have already picked 900 from the delivery and the next HU I am picking contains 150, the total pick would be 1050.
1050 > 1000 (Delivery Qty) but 1050 < 1100 (Qty within Tolerance) so it should go through.
But L_TO_CREATE_DN is giving error "Total for assigned quantities exceeds quantity to be removed".
Any thoughts on whether there is a particular setting in this FM to let it go through? Or is there a different FM available to perform this task?
Please HELP.
Thanks,
Rohit.
Hi!
I had the same problem yesterday. How did you solve it?
I got the message "Total for assigned quantities exceeds quantity to be removed".
Regards,
LB -
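Rohit's arithmetic can be written out directly. Assuming the over-delivery tolerance applies to the cumulative picked quantity, the check would be as below (a sketch only; the function is invented, not the FM's actual logic):

```python
def pick_allowed(delivery_qty, already_picked, next_pick, over_tol_pct):
    """Allow a pick while the cumulative picked quantity stays within the
    delivery quantity plus the over-delivery tolerance (illustrative)."""
    max_qty = delivery_qty * (1 + over_tol_pct / 100.0)
    return already_picked + next_pick <= max_qty

print(pick_allowed(1000, 900, 150, 10))  # True: 1050 <= 1100
print(pick_allowed(1000, 900, 250, 10))  # False: 1150 > 1100
```

The error from L_TO_CREATE_DN suggests the FM validates against the open quantity (1000 - 900 = 100) rather than this tolerance-adjusted maximum, which would explain why a 150-piece HU is rejected even though 1050 is within tolerance.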
Dear Gurus,
I want to restrict a G/L account to a certain amount, so that any amount above it cannot be entered or gives an error message. For example, say my G/L account is 40004000 and I want amounts above Rs. 100 to be rejected with an error during data entry. Should I create a validation for this, or do some configuration (tolerance limit in the control tab of the G/L master)? Please guide.
Dear Murali,
Thanks for the reply. I created a tolerance group, say TOLT, in OBA0. I put 100.00 in both the debit and credit amounts, with the % field blank. Then in the G/L master I assigned this tolerance group to one single G/L account, so that the account would give an error when the amount exceeds Rs. 100.00. But in FB50, that account still allows amounts over 100. Please guide me where I am wrong and how I should restrict a single G/L account from exceeding Rs. 100. -
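Whatever mechanism ends up enforcing it (a validation via OB28/GGB0 is the usual route, since OBA0 tolerance groups as I understand them govern differences when clearing rather than the size of individual postings, which would explain why FB50 still accepts amounts over 100), the rule itself is a one-line check. A sketch with the figures from the question; the function and dictionary are invented for illustration:

```python
def gl_posting_ok(gl_account, amount, limit_per_account):
    """Reject a line item whose amount exceeds the configured limit for
    its G/L account; mirrors what an FI validation rule would do."""
    limit = limit_per_account.get(gl_account)
    # Accounts without a configured limit are unrestricted.
    return limit is None or amount <= limit

limits = {"40004000": 100.0}
print(gl_posting_ok("40004000", 90.0, limits))   # True: within Rs. 100
print(gl_posting_ok("40004000", 150.0, limits))  # False: would raise the error
print(gl_posting_ok("11112222", 500.0, limits))  # True: account not restricted
```

In validation terms, the prerequisite would match the account number and the check would compare the line amount against 100, issuing an error message on failure.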
PI 7.11 - SXI_CACHE Issue - SQL 0904 - Resource Limit Exceeded
Hi PI & IBM I Gurus,
We are having an SXI_CACHE issue in our PI 7.11 SPS04 system. When we try a delta cache refresh through SXI_CACHE, it returns the error SQL0904 - Resource limit exceeded. When we try a full cache refresh, we get 'Application issue during request processing'.
We have cleaned up the SQL packages with the DLTR3PKG command, which did not resolve the issue. We recently performed a system copy to build the QA instance; I noticed that the development system's adapter engine cache was present in the QA instance and removed that cache from there.
I am not seeing the adapter engine connection data cache in our PI system. The adapter engine cache itself is working fine.
All the caches are working fine from the PI administration page. The cache connectivity test fails with the same error as mentioned for SXI_CACHE.
Please let me know if you have encountered an issue like this on the IBM i 6.1 platform.
Your help is highly appreciated.
Thanks
Kalyan
Hi Kalyan,
SQL0904 has different reason codes; which one are you seeing?
Is the SQL package really at its 1 GB boundary?
Otherwise it is perhaps a totally different issue, and then DLTR3PKG cannot help at all.
If you do see such a big SQL package, use PRTSQLINF to check whether it contains more or less the same SQL over and over, just with different host variables.
If that is the case, I would open a message under BC-DB-DB4 so that they can check how to help here, or talk to the application people about behaving a bit differently.
Regards
Volker Gueldenpfennig, consolut international ag
http://www.consolut.com http://www.4soi.de http://www.easymarketplace.de -
Hi Everyone
My Connection Pool parameters JCO api.
client=300
user=SISGERAL_RFC
passwd=******
ashost=14.29.3.120
sysnr=00
size=10
I have these parameters on my connection pool, and sometimes these errors appear in my application:
1.
2006-01-07 13:20:37,414 ERROR com.tel.webapp.framework.SAPDataSource - ##### Time limit exceeded. LOCALIZED MESSAGE = Time limit exceeded. KEY = RFC_ERROR_SYSTEM_FAILURE GROUP = 104 TOSTRING = com.sap.mw.jco.JCO$Exception: (104) RFC_ERROR_SYSTEM_FAILURE: Time limit exceeded.
2.
2006-01-07 14:01:31,007 ERROR com.tel.webapp.framework.SapPoolConnectionManager - Timeout
I'd like to know why this is happening.
Is there something wrong with my connection pool?
What can be happening?
Thanks
Raghu,
Thanks for your response.
Yes, the pool connections are in place according to the SAP note mentioned above.
Regards,
Faisal -
Memory Leak, Receiver Got Null Message & Consumer limit exceeded on destina
When running a program that adds an ObjectMessage to a JMS queue and then receives it, I get the following:
1) Intermittent NULL messages received.
2) jms.JMSException: [C4073]: Consumer limit exceeded on destination interactionQueueDest, even though only one receiver can be receiving via the supplied program.
3) After many messages (1000s) are added to the queue, the Message Queue gets an Out Of Memory exception. It should swap to disk!
STEPS TO FOLLOW TO REPRODUCE THE PROBLEM :
RUN this program via a JSP call in the application server.
JSP
<%@ page language="java" import="jms.*"%>
<html>
<head>
<title>Leak Memory</title>
</head>
<body>
<hr/>
<h1>Leak Memory</h1>
<%
LeakMemory leakMemory = new LeakMemory();
leakMemory.runTest(10000,1000);
// NOTE: will break, but more slowly, with the setting leakMemory.runTest(10000,100);
%>
The JMS resources
jms/queueConnectionFactory
jms/interactionQueue
must be created first.
Class:
package jms;

import javax.naming.*;
import javax.jms.*;

public class LeakMemory implements Runnable {

    private QueueConnectionFactory queueConnectionFactory = null;
    private Queue interactionQueue = null;
    private boolean receiverRun = true;
    private QueueConnection queueConnection;
    private int totalMessageInQueue = 0;

    public LeakMemory() {
        init();
    }

    /** Initialize queues */
    public void init() {
        try {
            InitialContext context = new InitialContext();
            this.queueConnectionFactory = (QueueConnectionFactory) context.lookup("jms/queueConnectionFactory");
            this.interactionQueue = (Queue) context.lookup("jms/interactionQueue");
        } catch (NamingException ex) {
            printerError(ex);
        }
    }

    public void runTest(int messageCount, int messageSize) {
        this.receiverRun = true;
        Thread receiverThread = new Thread(this);
        receiverThread.start();

        for (int i = 0; i < messageCount; i++) {
            StringBuffer messageToSend = new StringBuffer();
            for (int ii = 0; ii < messageSize; ii++) {
                messageToSend.append("0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\n");
            }
            QueueConnection queueConnectionAdder = null;
            QueueSession queueInteractionSession = null;
            QueueSender interactionQueueSender = null;
            try {
                // Get a queue connection
                queueConnectionAdder = this.getQueueConnection();
                queueInteractionSession = queueConnectionAdder.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);
                interactionQueueSender = queueInteractionSession.createSender(this.interactionQueue);
                ObjectMessage objectMessage = queueInteractionSession.createObjectMessage(messageToSend);
                objectMessage.setStringProperty("PROPERTY", "" + System.currentTimeMillis());
                // Send object
                interactionQueueSender.send(objectMessage, DeliveryMode.PERSISTENT, 5, 0);
                totalMessageInQueue++;
                // Close resources
                interactionQueueSender.close();
                queueInteractionSession.close();
                interactionQueueSender = null;
                queueInteractionSession = null;
            } catch (JMSException ex) {
                printerError(ex);
            }
        }
    }

    /** Receiver loop */
    public void run() {
        while (this.receiverRun) {
            try {
                QueueSession interactionQueueSession = this.getQueueConnection().createQueueSession(false, Session.CLIENT_ACKNOWLEDGE);
                QueueReceiver queueReceiver = interactionQueueSession.createReceiver(this.interactionQueue);
                ObjectMessage message = (ObjectMessage) queueReceiver.receive(100);
                if (message != null) {
                    StringBuffer messageReceived = (StringBuffer) message.getObject();
                    // Simulate doing something
                    synchronized (this) {
                        try {
                            Thread.sleep(400);
                        } catch (InterruptedException ex1) {
                            // Can safely be ignored
                        }
                    }
                    message.acknowledge();
                    totalMessageInQueue--;
                } else {
                    printerError(new Exception("Receiver Got Null Message"));
                }
                queueReceiver.close();
                interactionQueueSession.close();
                queueReceiver = null;
                interactionQueueSession = null;
            } catch (JMSException ex) {
                printerError(ex);
            }
        }
    }

    /**
     * Gets the queue connection and starts it.
     * @return QueueConnection the queueConnection
     */
    public synchronized QueueConnection getQueueConnection() {
        if (this.queueConnection == null) {
            try {
                this.queueConnection = this.queueConnectionFactory.createQueueConnection();
                this.queueConnection.start();
            } catch (JMSException ex) {
                printerError(ex);
            }
        }
        return this.queueConnection;
    }

    private void printerError(Exception ex) {
        System.err.print("ERROR Exception totalMessageInQueue = " + this.totalMessageInQueue + "\n");
        ex.printStackTrace();
    }
}

Is there something wrong with the way I'm working with JMS, or is it just this unreliable in Sun App Server 7 Update 3 on Windows?
1) Intermittent NULL messages received.
Thanks, that explains the behavior. It was weird getting null messages when I knew there were messages in the queue.
2) jms.JMSException: [C4073]: Consumer limit exceeded on destination interactionQueueDest, even though only one receiver can be receiving via the supplied program.
No other instances, only this program. Try it yourself! It happens every time on Sun Application Server 7 update 2 & 3.
Here's the broker dump at that error point:
[14/Apr/2004:12:51:47 BST] [B1065]: Accepting: [email protected]:3211->admin:3205. Count=1
[14/Apr/2004:12:51:47 BST] [B1066]: Closing: [email protected]:3211->admin:3205. Count=0
[14/Apr/2004:12:52:20 BST] [B1065]: Accepting: [email protected]:3231->jms:3204. Count=1
[14/Apr/2004:12:53:31 BST] WARNING [B2009]: Creation of consumer from connection [email protected]:3231 on destination interactionQueueDest failed:
B4006: com.sun.messaging.jmq.jmsserver.util.BrokerException: [B4006]: Unable to attach to queue queue:single:interactionQueueDest: a primary queue is already active
3) After many messages (1000s) are added to the queue, the Message Queue gets an Out Of Memory exception. It should swap to disk!
The broker runs out of memory. Version in use:
Sun ONE Message Queue Copyright 2002
Version: 3.0.1 SP2 (Build 4-a) Sun Microsystems, Inc.
Compile: Fri 07/11/2003 All Rights Reserved
Out of memory snippet:
[14/Apr/2004:13:08:28 BST] [B1089]: In low memory condition, Broker is attempting to free up resources
[14/Apr/2004:13:08:28 BST] [B1088]: Entering Memory State [B0022]: YELLOW from previous state [B0021]: GREEN - current memory is 118657K, 60% of total memory
[14/Apr/2004:13:08:38 BST] WARNING [B2075]: Broker ran out of memory before the passed in VM maximum (-Xmx) 201326592 b, lowering max to currently allocated memory (200431976 b) and trying to recover
[14/Apr/2004:13:08:38 BST] [B1089]: In low memory condition, Broker is attempting to free up resources
[14/Apr/2004:13:08:38 BST] [B1088]: Entering Memory State [B0024]: RED from previous state [B0022]: YELLOW - current memory is 128796K, 99% of total memory
[14/Apr/2004:13:08:38 BST] ERROR [B3008]: Message 2073-192.168.0.50(80:d:b6:c4:d6:73)-3319-1081944517772 exists in the store already
[14/Apr/2004:13:08:38 BST] WARNING [B2011]: Storing of JMS message from IMQConn[AUTHENTICATED,[email protected]:3319,jms:3282] failed:
com.sun.messaging.jmq.jmsserver.util.BrokerException: Message 2073-192.168.0.50(80:d:b6:c4:d6:73)-3319-1081944517772 exists in the store already
[14/Apr/2004:13:08:38 BST] WARNING [B2076]: Broker is rejecting new producers, because it is extremely low on memory
[... the same B2076 warning repeated 44 more times, 13:08:38 through 13:08:52 ...]
[14/Apr/2004:13:08:53 BST] ERROR [B3107]: Attempt to free memory failed, taking more drastic measures : java.lang.OutOfMemoryError
[14/Apr/2004:13:08:53 BST] ERROR unable to deal w/ error: 1
[14/Apr/2004:13:08:53 BST] ERROR TRYING TO CLOSE
[14/Apr/2004:13:08:53 BST] ERROR DONE CLOSING
[14/Apr/2004:13:08:53 BST] [B1066]: Closing: [email protected]:3319->jms:3282. Count=0 -
Short dump "Time limit exceeded" when searching for Business Transactions
Hello Experts,
We migrated from SAP CRM 5.2 to SAP CRM 7.0. After migration, our business transaction search (quotation, sales order, service order, contract etc) ends with the short dump "Time limit exceeded" in class CL_CRM_REPORT_ACC_DYNAMIC, method DATABASE_ACCESS. The select query is triggered from line 5 of this method.
Number of Records:
CRMD_ORDERADM_H: 5,115,675
CRMD_ORDER_INDEX: 74,615,914
We have done these so far, but the performance is still either poor or times out.
1. DB team checked the ORACLE parameters and confirmed they are fine. They also checked the health of indices in table CRMD_ORDER_INDEX and indices are healthy
2. Created additional indices on CRMD_ORDERADM_H and CRMD_ORDER_INDEX. After the creation of indices, some of the searches(without any criteria) work. But it takes more than a minute to fetch 1 or 2 records
3. An ST05 trace confirmed that the selection on CRMD_ORDER_INDEX takes the most time. It takes about 103 seconds to fetch 2 records (max hits + 1)
4. If we specify search parameters, say for example a date or status, then again we get a short dump with the message "Time limit exceeded".
5. Observed that only if a matching index is available for the WHERE clause, the results are returned (albeit slowly). In the absence of an index, we get the dump.
6. Searched for notes and there are no notes that could help us.
Any idea what is causing this issue and what we can do to resolve this?
Regards,
Bala
Hi Michael,
Thanks. Yes we considered the note 1527039. None of the three scenarios mentioned in the note helped us. But we ran CRM_INDEX_REBUILD to check if the table CRMD_ORDER_INDEX had a problem. That did not help us either.
The business users told us that they mostly search using the date fields or Object ID. We did not have any problem with search by Object ID. So we created additional indices to support search using the date fields.
Regards,
Bala