PO Limit exceeded on SES & Invoice
Hi experts,
We have set up a service PO (item cat. = D) with limits only.
A Service Entry Sheet is entered, and the total activity exceeds the PO limit. Message SE364 appears, warning users that the limit is exceeded; the message is set as a warning in our system. Once the warning is bypassed here, there is no further warning on SES acceptance, and the invoice is not blocked.
In our case, we are interfacing information from the vendor, creating SES documents, and waiting for users to release/accept the SES information. Message SE364 appears when the SES is created, but this step is automated and no one sees it. Is there a way to either:
1. Generate a similar warning at the step where the SES is released? Or
2. Block the invoice because it exceeds the PO limit? Or
3. Any other suggestions?
I'm sure other users are doing some or all of this. Any advice is extremely welcome. Thanks,
Jen
Hi,
I understand the logic behind the process; you have to capture the SES this way due to the sheer volume and the number of users involved.
The problem is that, in a non-batch process, the limit message would be an error, so the ONLY SESs put forward for approval would be the ones within limits. That is why there is no standard check again at this stage.
Could you try something radical?
Could you change the SES entry batch program so that it checks the limits each time and ONLY processes those that are within limits? That way the approvers would not have to worry about limits, because the only ones they would see are within the limit. If the program finds any SESs that are over the limit, it lists these in a report and someone has to manage them manually (and even though this is manual, it really can't be anything else, because someone has to decide whether to proceed or not).
Perhaps they could process the list on-line, with a tick box next to each SES that is over the limit and the value of the overspend shown. They then tick the ones they want to go through, and the batch program (next time it runs) processes these because the tick allows them to be processed.
Steve B
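As an editorial aside, the filtering pass Steve describes (post only the within-limit SESs, report the rest for manual ticking) can be sketched roughly as below. This is not SAP/ABAP code; the record fields ("po", "value") and the limits/accepted dictionaries are hypothetical names chosen for illustration.

```python
# Hedged sketch (not SAP code) of the batch split Steve suggests: SESs that fit
# within the remaining PO limit are processed; the rest are listed with their
# overspend so an approver can tick them through manually on the next run.

def split_by_limit(ses_batch, po_limits, accepted_so_far):
    """Split one batch run into SESs to post and SESs needing manual review."""
    within, over = [], []
    for ses in ses_batch:
        po = ses["po"]
        projected = accepted_so_far.get(po, 0) + ses["value"]
        if projected <= po_limits[po]:
            within.append(ses)
            accepted_so_far[po] = projected  # consume the limit
        else:
            # report the overspend for the manual tick-box list
            over.append({**ses, "overspend": projected - po_limits[po]})
    return within, over
```

On the next run, any SES an approver has ticked would simply bypass this check, matching Steve's "tick that allows them to be processed" idea.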
Similar Messages
-
Credit Limit Exceeded (F-22 Customer Invoice)
When I try to post a customer invoice in F-22, the system gives only a warning message: credit limit exceeded. I checked OBA5 with the message number and application area (F5 139); it is set to Error only, but the system still throws a warning message and posts the document.
OB45 Completed
FD32 Completed
OBA5 Completed
Please suggest how we can make this an error instead of a warning message. Is there any Note?
This relates to SD - Credit Management. Please remember to post in the correct forum for prompt answers.
However, in this case please check transaction OVAK. The credit check seems to be configured with a Warning message for the relevant sales document type; change it to an Error message.
Regards
Ajay -
Tolerance Key LD: Blanket purchase order time limit exceeded
Hi All,
Can anybody tell me how the invoice block tolerance key <b>LD: Blanket purchase order time limit exceeded</b> works?
As per my understanding, we need to maintain a number of days in tolerance key LD, so that the system checks the contract validity against the invoice date, allows the tolerance-limit days, and blocks the invoice if they are exceeded.
But it is asking for a value in the LD tolerance limit setting! I would expect a number of days! Kindly let me know.
Also, please let me know the settings and the steps to run this scenario.
Thanks!
Nanda
Steve,
Can you answer Nanda's question? I haven't really used this LD tolerance before. I tried it on my sandbox system and can see the cause of his confusion, although I am unable to explain it.
Thanks,
H Narayan -
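For readers puzzling over the same key: one plausible reading of LD (hedged - the field names and the exact comparison here are assumptions for illustration, not confirmed SAP behaviour) is a pure date check between the invoice date and the end of the blanket PO's validity period, with the LD value being the number of days of slack allowed.

```python
from datetime import date

# Hedged sketch of tolerance key LD as discussed above: block the invoice when
# it is posted more than `tolerance_days` after the blanket PO's validity end.
# The comparison and parameter names are assumptions, not SAP documentation.

def ld_blocks_invoice(invoice_date: date, po_validity_end: date, tolerance_days: int) -> bool:
    days_past_validity = (invoice_date - po_validity_end).days
    return days_past_validity > tolerance_days
```

Under this reading, a value of 5 in the LD limit would let an invoice through up to five days past the validity end and block it afterwards.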
Approval Procedure not triggered when Credit Limit Exceeded
Hi,
I've defined an approval procedure which I would like to launch when a client's credit limit has been exceeded. I've checked the required fields on Administration > General Settings > BP tab (Credit Limit, Commitment Limit, AR Invoice, Delivery, SO). I've also checked Activate Approval Procedures on the same tab.
I know the approval procedure has been set up correctly, since it is launched when I change the terms to, for instance, 'Total document'.
There is a system message when I create a Delivery Note for a customer indicating that the credit limit for this customer has been exceeded. But still, the system allows the user to add the document without triggering the approval procedure.
What am I doing wrong?
By the way: the credit limit exceeded is in negative numbers, but I've tried both the condition 'Deviation from Credit Limit greater than 1' and 'Deviation from Credit Limit less than 1'.
Could you clarify what you mean by "The credit limit exceeded is in negative numbers"?
Negative number: the system message I get when I create a delivery note for a customer, exceeding the credit limit is the following:
Customer has exceeded credit limit - 95.479 EUR
Customer has exceeded commitment limit - 95.479 EUR
Continue (Y/N)
Make sure the Credit Limits are set in the BP Master. --> OK, credit limit is set to 100 EUR, Commitment limit to 100 EUR
Also use Greater Than in Ratio and try a small number like 10 in the value --> I am using 1 EUR
Also, check in System Initialization > General Settings..BP Tab Activate approval procedure is checked. --> it is checked.
Using the range option does not solve the problem; in fact, even if I use 'Deviation from Credit Limit does not equal 1 EUR' the approval is not triggered. When I use the same approval template but alter the terms to 'Total document greater than 1' or 'Discount % greater than 1', the approval is triggered.
Edited by: Rui Pereira on Apr 29, 2009 1:19 PM -
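An editorial note on the sign question in this thread: if the system stores the deviation as a negative number, a 'greater than' term can never fire. A tiny hedged illustration (the values are hypothetical, loosely mirroring the 95.479 EUR message quoted above):

```python
# Hypothetical illustration of why the sign of the deviation matters for the
# approval term. Values loosely mirror the system message quoted in the thread.

credit_limit = 100.0
exposure = 195.479
deviation = credit_limit - exposure  # -95.479, i.e. "exceeded credit limit - 95.479 EUR"

# 'Deviation from Credit Limit greater than 1' can never trigger on a negative value:
assert not (deviation > 1)
# ... whereas a 'less than' style condition would:
assert deviation < 1
```

This does not explain why the 'less than 1' variant also failed for Rui, but it suggests checking which sign (and which field) the approval term actually evaluates.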
Credit Limit Exceeded in Billing
Dear all,
While saving an invoice through VF01, I get an error that the Customer Credit Limit has been Exceeded - message ZSD000. I have not defined any credit control settings. Has anyone faced such an error? Please guide me.
Can anyone please throw some light on this issue.
Thanks & Regards,
AK
Edited by: anand k on Nov 23, 2011 11:27 AM
I remember we used transaction VKM3 to clear/add a checkbox related to the credit limit and saved the document. Then it allows posting. Try it out.
Regards,
Ganga -
Hi
I'm building a gauge portlet, but when I finish I have a problem with the data limit being exceeded.
My Discoverer report has about 2000 rows, and I'm showing a count of these 2000 rows in the portlet.
I can't reduce the number of rows in my report because I'm showing a count of the invoices received on time.
Tomorrow I have a presentation and I don't know what to do...
Any ideas? (Sorry for my English.)
Or how can I change the limit for the data?
Thanks a lot.
Tobias1234 wrote:
What does a tag quality 'Limit exceeded : High Limited' / 'Limit exceeded : Low Limited' shown in the Tag Monitor mean? Which limit has been exceeded?
It is shown on tags which are inputs from a TwinCAT OPC server. I don't use the DSC Alarms, and the values are still updated, but this quality is shown on all my analog inputs. What does it mean?
I think it just means that the tag's value is above or below its 'Full Scale' or 'Zero Scale' settings (as configured in the TCE on the "Scaling" tab for the tag).
Can be meaningful or meaningless depending on the way you're using those settings.
Message Edited by Donald on 04-28-2006 01:58 PM
=====================================================
Fading out. " ... J. Arthur Rank on gong." -
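Donald's interpretation in the thread above can be restated as a small sketch. This is a hypothetical illustration, not the actual DSC/TwinCAT engine logic: a tag whose value lies outside the configured Zero Scale / Full Scale range gets the corresponding limit quality.

```python
# Hedged sketch of the interpretation above: a value above Full Scale is flagged
# "High Limited", below Zero Scale "Low Limited", otherwise quality is good.
# Illustration only - not the real DSC engine implementation.

def tag_quality(value: float, zero_scale: float, full_scale: float) -> str:
    if value > full_scale:
        return "Limit exceeded : High Limited"
    if value < zero_scale:
        return "Limit exceeded : Low Limited"
    return "Good"
```

As Donald notes, whether this flag matters depends on whether the scaling settings are meaningful for your tags.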
Dear Friends,
In the cockpit, the document has the following error: 'tolerance limit exceeded'.
The PO is Rs. 5000 and the invoice is Rs. 5000. How do I analyze the error?
If there is a document block, how can I find out the reason?
Which transaction codes are associated with it?
Please explain in detail.
Regards
Sridhar
Edited by: Sridhar M on Sep 14, 2009 5:29 PM
Edited by: Sridhar M on Sep 14, 2009 7:05 PM
Dear,
Can you check what tolerance limits are maintained:
IMG -> MM -> Logistics Invoice Verification -> Invoice Block -> Set Tolerance Limits (Tcode OMR6).
Regards,
Syed Hussain. -
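As an editorial aside on the OMR6 pointer above: an invoice-block tolerance check can be pictured roughly as below. This is a deliberate simplification with hypothetical parameter names - real LIV tolerance keys have separate upper/lower absolute and percentage limits - but it shows why equal PO and invoice values should not trip a pure price tolerance.

```python
# Hedged simplification of an invoice-block price tolerance (OMR6-style):
# block when the invoice exceeds the PO value by more than the absolute upper
# limit or the percentage upper limit. Parameter names are hypothetical.

def price_block(po_value: float, invoice_value: float,
                abs_upper: float = 0.0, pct_upper: float = 0.0) -> bool:
    diff = invoice_value - po_value
    return diff > abs_upper or diff > po_value * pct_upper / 100.0
```

With PO 5000 and invoice 5000 the difference is zero, so a pure price tolerance would not block; the block Sridhar sees likely comes from a different tolerance key, which is why reviewing the OMR6 settings is the right first step.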
PI 7.11 - SXI_CACHE Issue - SQL 0904 - Resource Limit Exceeded
Hi PI & IBM I Gurus,
We are having an SXI_CACHE issue in our PI 7.11 SPS04 system. When we try to do a delta cache refresh through SXI_CACHE, it returns the error SQL 0904 - Resource limit exceeded. When we try to do a full cache refresh, it fails with 'Application issue during request processing'.
We have cleaned up the SQL packages with the DLTR3PKG command, which did not resolve the issue. We recently performed a system copy to build the QA instance; I noticed that the development system's adapter engine cache was present in the QA instance and removed that cache from there.
I am not seeing the adapter engine connection data cache in our PI system. The adapter engine cache is working fine.
All the caches are working fine from the PI Administration page. The cache connectivity test is failing with the same error as I mentioned for the SXI_CACHE.
Please let me know if you have encountered any issue like this on IBM I 6.1 Platform.
Your help is highly appreciated.
Thanks
Kalyan
Hi Kalyan,
SQL0904 has different reason codes ... which one are you seeing?
Is the SQL package really at its boundary of 1 GB?
Otherwise it is perhaps a totally different issue ... then DLTR3PKG cannot help at all.
If you do see this big SQL package, you should use PRTSQLINF to check whether it contains more or less the same SQL over and over, just with different host variables.
If that last point is the case, I would open a message with BC-DB-DB4 so that they can check how to help here, or talk to the application people about behaving a bit differently.
Regards
Volker Gueldenpfennig, consolut international ag
http://www.consolut.com http://www.4soi.de http://www.easymarketplace.de -
Hi Everyone
My Connection Pool parameters JCO api.
client=300
user=SISGERAL_RFC
passwd=******
ashost=14.29.3.120
sysnr=00
size=10
I have these parameters in my connection pool, and sometimes these errors appear in my application:
1.
2006-01-07 13:20:37,414 ERROR com.tel.webapp.framework.SAPDataSource - ##### Time limit exceeded. LOCALIZED MESSAGE = Time limit exceeded. KEY = RFC_ERROR_SYSTEM_FAILURE GROUP = 104 TOSTRING = com.sap.mw.jco.JCO$Exception: (104) RFC_ERROR_SYSTEM_FAILURE: Time limit exceeded.
2.
2006-01-07 14:01:31,007 ERROR com.tel.webapp.framework.SapPoolConnectionManager - Timeout
I'd like to know why this is happening.
Is there something wrong with my connection pool?
What could be happening?
Thanks
Raghu,
Thanks for your response.
Yes, the pool connections are in place according to the SAP note mentioned above.
Regards,
Faisal -
Memory Leak, Receiver Got Null Message & Consumer limit exceeded on destination
When running a program that adds an Object message to a JMS queue and then receives it, I get the following:
1) Intermittent NULL messages received.
2) jms.JMSException: [C4073]: Consumer limit exceeded on destination interactionQueueDest - even though only one receiver can be receiving via the supplied program.
3) After many messages (thousands) are added to the queue, the Message Queue gets an Out Of Memory exception. It should swap to disk!
STEPS TO FOLLOW TO REPRODUCE THE PROBLEM:
Run this program via a JSP call in the application server.
JSP:
<%@ page language="java" import="jms.*"%>
<html>
<head>
<title>Leak Memory</title>
</head>
<body>
<hr/>
<h1>Leak Memory</h1>
<%
LeakMemory leakMemory = new LeakMemory();
leakMemory.runTest(10000, 1000);
// NOTE: will break, but more slowly, with the setting leakMemory.runTest(10000, 100);
%>
</body>
</html>
The JMS resources
jms/queueConnectionFactory
jms/interactionQueue
must be created first.
Class:
package jms;

import javax.naming.*;
import javax.jms.*;

public class LeakMemory implements Runnable {

    private QueueConnectionFactory queueConnectionFactory = null;
    private Queue interactionQueue = null;
    private boolean receiverRun = true;
    private QueueConnection queueConnection;
    private int totalMessageInQueue = 0;

    public LeakMemory() {
        init();
    }

    // initialize queues
    public void init() {
        try {
            InitialContext context = new InitialContext();
            this.queueConnectionFactory = (QueueConnectionFactory) context.lookup("jms/queueConnectionFactory");
            this.interactionQueue = (Queue) context.lookup("jms/interactionQueue");
        } catch (NamingException ex) {
            printerError(ex);
        }
    }

    public void runTest(int messageCount, int messageSize) {
        this.receiverRun = true;
        Thread receiverThread = new Thread(this);
        receiverThread.start();
        for (int i = 0; i < messageCount; i++) {
            StringBuffer messageToSend = new StringBuffer();
            for (int ii = 0; ii < messageSize; ii++) {
                messageToSend.append("0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\n");
            }
            QueueConnection queueConnectionAdder = null;
            QueueSession queueInteractionSession = null;
            QueueSender interactionQueueSender = null;
            try {
                // Get a queue connection
                queueConnectionAdder = this.getQueueConnection();
                queueInteractionSession = queueConnectionAdder.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);
                interactionQueueSender = queueInteractionSession.createSender(this.interactionQueue);
                ObjectMessage objectMessage = queueInteractionSession.createObjectMessage(messageToSend);
                objectMessage.setStringProperty("PROPERTY", "" + System.currentTimeMillis());
                // Send object
                interactionQueueSender.send(objectMessage, DeliveryMode.PERSISTENT, 5, 0);
                totalMessageInQueue++;
                // Close resources
                interactionQueueSender.close();
                queueInteractionSession.close();
                interactionQueueSender = null;
                queueInteractionSession = null;
            } catch (JMSException ex) {
                printerError(ex);
            }
        }
    }

    // receiver loop
    public void run() {
        while (this.receiverRun) {
            try {
                QueueSession interactionQueueSession = this.getQueueConnection().createQueueSession(false, Session.CLIENT_ACKNOWLEDGE);
                QueueReceiver queueReceiver = interactionQueueSession.createReceiver(this.interactionQueue);
                ObjectMessage message = (ObjectMessage) queueReceiver.receive(100);
                if (message != null) {
                    StringBuffer messageReceived = (StringBuffer) message.getObject();
                    // Simulate doing something
                    synchronized (this) {
                        try {
                            Thread.sleep(400);
                        } catch (InterruptedException ex1) {
                            // Can safely be ignored
                        }
                    }
                    message.acknowledge();
                    totalMessageInQueue--;
                } else {
                    printerError(new Exception("Receiver Got Null Message"));
                }
                queueReceiver.close();
                interactionQueueSession.close();
                queueReceiver = null;
                interactionQueueSession = null;
            } catch (JMSException ex) {
                printerError(ex);
            }
        }
    }

    // Gets the queue connection and starts it
    public synchronized QueueConnection getQueueConnection() {
        if (this.queueConnection == null) {
            try {
                this.queueConnection = this.queueConnectionFactory.createQueueConnection();
                this.queueConnection.start();
            } catch (JMSException ex) {
                printerError(ex);
            }
        }
        return this.queueConnection;
    }

    private void printerError(Exception ex) {
        System.err.print("ERROR Exception totalMessageInQueue = " + this.totalMessageInQueue + "\n");
        ex.printStackTrace();
    }
}

Is there something wrong with the way I'm working with JMS, or is it just this unreliable in Sun App Server 7 Update 3 on Windows?

1) Intermittent NULL messages received.
Thanks, that explains the behavior. It was weird getting null messages when I knew there were messages in the queue.
2) jms.JMSException: [C4073]: Consumer limit exceeded on destination interactionQueueDest. Even though only one receiver can be receiving via the supplied program.
No other instances - only this program. Try it yourself! It works every time on Sun Application Server 7 update 2 & 3.
Here's the broker dump at that error point:
[14/Apr/2004:12:51:47 BST] [B1065]: Accepting: [email protected]:3211->admin:3205. Count=1
[14/Apr/2004:12:51:47 BST] [B1066]: Closing: [email protected]:3211->admin:3205. Count=0
[14/Apr/2004:12:52:20 BST] [B1065]: Accepting: [email protected]:3231->jms:3204. Count=1 [14/Apr/2004:12:53:31 BST] WARNING [B2009]: Creation of consumer from connection [email protected]:3231 on destination interactionQueueDest failed:
B4006: com.sun.messaging.jmq.jmsserver.util.BrokerException: [B4006]: Unable to attach to queue queue:single:interactionQueueDest: a primary queue is already active
3) After many messages (thousands) are added to the queue, the Message Queue gets an Out Of Memory exception. It should swap to disk!
The broker runs out of memory. Version in use:
Sun ONE Message Queue Copyright 2002
Version: 3.0.1 SP2 (Build 4-a) Sun Microsystems, Inc.
Compile: Fri 07/11/2003 All Rights Reserved
Out of memory snippet:
[14/Apr/2004:13:08:28 BST] [B1089]: In low memory condition, Broker is attempting to free up resources
[14/Apr/2004:13:08:28 BST] [B1088]: Entering Memory State [B0022]: YELLOW from previous state [B0021]: GREEN - current memory is 118657K, 60% of total memory
[14/Apr/2004:13:08:38 BST] WARNING [B2075]: Broker ran out of memory before the passed in VM maximum (-Xmx) 201326592 b, lowering max to currently allocated memory (200431976 b ) and trying to recover [14/Apr/2004:13:08:38 BST] [B1089]: In low memory condition, Broker is attempting to free up resources
[14/Apr/2004:13:08:38 BST] [B1088]: Entering Memory State [B0024]: RED from previous state [B0022]: YELLOW - current memory is 128796K, 99% of total memory [14/Apr/2004:13:08:38 BST] ERROR [B3008]: Message 2073-192.168.0.50(80:d:b6:c4:d6:73)-3319-1081944517772 exists in the store already [14/Apr/2004:13:08:38 BST] WARNING [B2011]: Storing of JMS message from IMQConn[AUTHENTICATED,[email protected]:3319,jms:3282] failed:
com.sun.messaging.jmq.jmsserver.util.BrokerException: Message 2073-192.168.0.50(80:d:b6:c4:d6:73)-3319-1081944517772 exists in the store already
[14/Apr/2004:13:08:38 BST] WARNING [B2076]: Broker is rejecting new producers, because it is extremely low on memory
[... the B2076 warning "Broker is rejecting new producers, because it is extremely low on memory" repeats many times, several per second, until 13:08:52 ...]
[14/Apr/2004:13:08:53 BST] ERROR [B3107]: Attempt to free memory failed, taking more drastic measures : java.lang.OutOfMemoryError
[14/Apr/2004:13:08:53 BST] ERROR unable to deal w/ error: 1
[14/Apr/2004:13:08:53 BST] ERROR TRYING TO CLOSE [14/Apr/2004:13:08:53 BST] ERROR DONE CLOSING
[14/Apr/2004:13:08:53 BST] [B1066]: Closing: [email protected]:3319->jms:3282. Count=0 -
Short dump "Time limit exceeded" when searching for Business Transactions
Hello Experts,
We migrated from SAP CRM 5.2 to SAP CRM 7.0. After migration, our business transaction search (quotation, sales order, service order, contract etc) ends with the short dump "Time limit exceeded" in class CL_CRM_REPORT_ACC_DYNAMIC, method DATABASE_ACCESS. The select query is triggered from line 5 of this method.
Number of Records:
CRMD_ORDERADM_H: 5,115,675
CRMD_ORDER_INDEX: 74,615,914
We have tried the following so far, but performance is still either poor or times out:
1. The DB team checked the Oracle parameters and confirmed they are fine. They also checked the health of the indices on table CRMD_ORDER_INDEX, and the indices are healthy.
2. Created additional indices on CRMD_ORDERADM_H and CRMD_ORDER_INDEX. After creating the indices, some of the searches (without any criteria) work, but it takes more than a minute to fetch 1 or 2 records.
3. An ST05 trace confirmed that the selection on CRMD_ORDER_INDEX takes the most time: about 103 seconds to fetch 2 records (max hits + 1).
4. If we specify search parameters, for example a date or status, then again we get a short dump with the message "Time limit exceeded".
5. Observed that results are returned (albeit slowly) only if a matching index is available for the WHERE clause. In the absence of an index, we get the dump.
6. Searched for notes and there are no notes that could help us.
Any idea what is causing this issue and what we can do to resolve this?
Regards,
Bala
Hi Michael,
Thanks. Yes, we considered note 1527039. None of the three scenarios mentioned in the note helped us. But we ran CRM_INDEX_REBUILD to check whether the table CRMD_ORDER_INDEX had a problem. That did not help us either.
The business users told us that they mostly search using the date fields or Object ID. We did not have any problem with search by Object ID. So we created additional indices to support search using the date fields.
Regards,
Bala -
Static credit check: credit limit exceeded for consignment issue delivery
Hi,
We are encountering a credit limit exceeded error in delivery creation for a consignment issue.
In the SPRO configuration "Credit limit check for order types", no credit limit check is assigned to the order type; however, there is one assigned to the delivery type. Also, for the CCAr/Risk cat./CG combination in OVA8 (Automatic Credit Block Delivery), the static credit check has been activated with an error-message reaction, and open orders and open deliveries were also activated. The customer exceeded the credit exposure, but the user cannot release the order in any VKMx transaction because no credit limit check is assigned to the order type.
Given that this is the current setup, and the following conditions:
1. Credit limit for the customer cannot be increased
2. Oldest open item cannot be cleared just yet
Is there any way that we can proceed with delivery creation for this customer? The other order types have a credit limit check, so the credit management team was able to release them. However, we are unsure how to proceed with this consignment order. Kindly advise, and thank you in advance for your help.
We are encountering a credit limit exceeded error in delivery creation for a consignment issue.
but user cannot release the order in any VKMx transactions as no credit limit check is assigned to the order type.
These look contradictory, and you may need to run report RVKRED77 to reorganize the open credit values. Try the report in a test system first and then run it in production. After running the report, the user should be able to release in the VKM* transactions.
Regards, -
Error while running query "time limit exceeding"
While running a query I get the error "time limit exceeding". Please help.
hi devi,
use the following links
queries taking long time to run
Query taking too long
with hopes
Raja Singh -
TIME LIMIT EXCEEDED ERROR WHILE EXECUTING DTP
Hi gurus,
I have got an error while executing a DTP.
The errors are as follows:
1.Time limit exceeded. No return of the split processes
2.Background process BCTL_DK9MC0C2QM5GWRM68I1I99HZL terminated due to missing confirmation
3.Resource error. No batch process available. Process terminated
Note: I am not executing the DTP as a background job.
As this is of high priority, answers ASAP are appreciated.
Regards
Amar.
Hi,
how is it possible to execute a DTP in a dialog process? In my mind it is only possible for debugging...
In "Display Data Transfer Process" -> "Goto" -> "Settings for Batch Manager" you can edit settings like Number of Processes or Job Class.
Additional take a look at table RSBATCHPARALLEL and
http://help.sap.com/saphelp_nw04s/helpdata/en/42/f29aa933321a61e10000000a422035/frameset.htm
Regards
Andreas