Performance issue in customized program
Hi,
We have a performance issue in a customized program.
In this program the developer used DO...ENDDO twice, i.e. nested DO...ENDDO loops.
How can we improve the performance?
Also, when we ran the Code Inspector it reported 9 errors: 'Char. strings w/o text elements will not be translated'.
Does this error cause a performance issue?
Regards,
Chandu.
> 'Char. strings w/o text elements will not be translated'.
> Is there any performance issue for this error?
No, that error has no performance impact; it is just not clean programming, since literals without text elements cannot be translated into other languages. As for the nested loops, please see 'Read before Posting in the Performance and Tuning Forum' and learn how to find out where the showstoppers really are.
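To find the showstopper in nested DO...ENDDO loops, the GET RUN TIME tips mentioned elsewhere in this thread can be applied directly. A minimal, hedged sketch (the loop counts and bodies are placeholders, not the original program's code):

```abap
DATA: lv_t0 TYPE i,
      lv_t1 TYPE i.

GET RUN TIME FIELD lv_t0.

DO 1000 TIMES.
  DO 1000 TIMES.
    " body of the nested loop under test goes here
  ENDDO.
ENDDO.

GET RUN TIME FIELD lv_t1.
" GET RUN TIME returns microseconds elapsed since the first call
WRITE: / 'Runtime in microseconds:', lv_t1 - lv_t0.
```

Measuring the inner loop and the outer loop separately usually shows quickly which level does the real work.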
Thomas
Similar Messages
-
Performance Issue with Concurrent Program
Hi Gurus,
I have a loader program which updates information on OM sales orders using oe_order_pub.process_order, and I do not see any performance issue in the package itself.
Using this program I have updated a huge number of orders and it completed in a few minutes, but at times the program runs for more than 6-7 hours.
Pls could anyone advise what could be the issue with this?
Thanks & Regards,
Genoo
Do you have the statistics collected up to date?
Can you find any errors in the database log file?
Any locks in the database?
Any invalid objects?
Please enable trace and generate the TKPROF file once the program is completed.
Thanks,
Hussein -
Performance issue in abap program
Hi,
How can we improve the performance of an ABAP program?
Hi,
Read the following links.
ABAP provides a few tools to analyse the performance of the objects we develop.
Runtime Analysis: transaction SE30
This transaction gives a full analysis of an ABAP program with respect to database and non-database processing.
SQL Trace: transaction ST05
With this tool we can analyse performance issues related to database calls.
Performance techniques for improving the performance of an object:
1) ABAP/4 programs can take a very long time to execute, and can make other processes have to wait before executing. Here are some tips to speed up your programs and reduce the load your programs put on the system:
2) Use the GET RUN TIME command to help evaluate performance. It's hard to know whether that optimization technique REALLY helps unless you test it out.
3) Using this tool can help you know what is effective, under what kinds of conditions. The GET RUN TIME has problems under multiple CPUs, so you should use it to test small pieces of your program, rather than the whole program.
4) Generally, try to reduce I/O first, then memory, then CPU activity. I/O operations that read/write to hard disk are always the most expensive operations. Memory, if not controlled, may have to be written to swap space on the hard disk, which therefore increases your I/O read/writes to disk. CPU activity can be reduced by careful program design, and by using commands such as SUM (SQL) and COLLECT (ABAP/4).
5) Avoid 'SELECT *', especially in tables that have a lot of fields. Use SELECT A B C INTO instead, so that fields are only read if they are used. This can make a very big difference.
6) Field-groups can be useful for multi-level sorting and displaying. However, they write their data to the system's paging space, rather than to memory (internal tables use memory). For this reason, field-groups are only appropriate for processing large lists (e.g. over 50,000 records). If you have large lists, you should work with the systems administrator to decide the maximum amount of RAM your program should use, and from that, calculate how much space your lists will use. Then you can decide whether to write the data to memory or swap space.
Use as many table keys as possible in the WHERE part of your select statements.
7) Whenever possible, design the program to access a relatively constant number of records (for instance, if you only access the transactions for one month, then there probably will be a reasonable range, like 1200-1800, for the number of transactions entered within that month). Then use a SELECT A B C INTO TABLE ITAB statement.
8) Get a good idea of how many records you will be accessing. Log into your productive system, and use SE80 -> Dictionary Objects (press Edit), enter the table name you want to see, and press Display. Go To Utilities -> Table Contents to query the table contents and see the number of records. This is extremely useful in optimizing a program's memory allocation.
9) Try to make the user interface such that the program gradually unfolds more information to the user, rather than giving a huge list of information all at once to the user.
10) Declare your internal tables using OCCURS NUM_RECS, where NUM_RECS is the number of records you expect to be accessing. If the number of records exceeds NUM_RECS, the data will be kept in swap space (not memory).
11) Use SELECT A B C INTO TABLE ITAB whenever possible. This will read all of the records into the itab in one operation, rather than repeated operations that result from a SELECT A B C INTO ITAB... ENDSELECT statement. Make sure that ITAB is declared with OCCURS NUM_RECS, where NUM_RECS is the number of records you expect to access.
12) If the number of records you are reading is constantly growing, you may be able to break it into chunks of relatively constant size. For instance, if you have to read all records from 1991 to present, you can break it into quarters, and read all records one quarter at a time. This will reduce I/O operations. Test extensively with GET RUN TIME when using this method.
13) Know how to use the 'collect' command. It can be very efficient.
14) Use the SELECT SINGLE command whenever possible.
15) Many tables contain totals fields (such as monthly expense totals). Use these to avoid wasting resources by recalculating a total that has already been calculated and stored.
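A short sketch tying tips 5), 11), 13) and 14) together; SFLIGHT and SCARR are the standard flight-model demo tables, used here purely for illustration:

```abap
DATA: BEGIN OF ls_flight,
        carrid TYPE sflight-carrid, " character-like key field
        price  TYPE sflight-price,  " numeric field, summed by COLLECT
      END OF ls_flight.
DATA: lt_flights LIKE STANDARD TABLE OF ls_flight,
      lt_totals  LIKE STANDARD TABLE OF ls_flight,
      ls_carr    TYPE scarr.

" 5) and 11): a field list instead of SELECT *, fetched as one array operation
SELECT carrid price
  FROM sflight
  INTO TABLE lt_flights.

" 13): COLLECT sums the numeric fields of rows whose key fields match
LOOP AT lt_flights INTO ls_flight.
  COLLECT ls_flight INTO lt_totals.
ENDLOOP.

" 14): SELECT SINGLE when at most one row is needed
SELECT SINGLE * FROM scarr INTO ls_carr WHERE carrid = 'LH'.
```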
Some tips:
1) Use joins where possible as redundant data is not fetched.
2) Use select single where ever possible.
3) Calling methods of a global class is faster than calling function modules.
4) Use constants instead of literals
5) Use WHILE instead of a DO-EXIT-ENDDO.
6) Avoid unnecessary MOVEs by using explicit work area operations.
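Tip 5) in the list above as a minimal sketch - the termination condition moves into the loop head instead of a DO with an EXIT:

```abap
DATA lv_index TYPE i VALUE 0.

" instead of:
"   DO.
"     lv_index = lv_index + 1.
"     IF lv_index >= 10. EXIT. ENDIF.
"   ENDDO.
WHILE lv_index < 10.
  lv_index = lv_index + 1.
  " loop body
ENDWHILE.
```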
See the following links for a brief insight into performance tuning:
http://www.thespot4sap.com/Articles/SAPABAPPerformanceTuning_Introduction.asp
http://help.sap.com/saphelp_nw2004s/helpdata/en/d1/801f7c454211d189710000e8322d00/frameset.htm
regards
Rohan -
Performance issue with custom IDOC download into SAP MII
Hi,
We have a custom IDoc which has ten fields. We take the data from these ten fields and insert it into a database. The issue is that
around 4,500 IDocs flow into SAP MII and then from SAP MII to a SQL database, and processing these IDocs takes around 30 minutes.
Is there a way to improve the time taken?
Additional information is
MaxReaderThreadCount has been set to 5 in Configuration Management -> Infrastructure -> Application Resources
sample idoc structure
<IDOCNAME>
<FIELD1>val1</FIELD1>
<FIELD10>val2</FIELD10>
</IDOCNAME>
Please let us know if more information is needed.
Regards,
Manish Jain
Hi Manish,
My thinking about the queuing is that for each transaction, you are calling your database and passing in the IDoc content. Depending upon your system, database, transactions, etc., you may be having to establish a connection between MII and the database each time you process an IDoc. Each "handshake" takes a finite amount of time. So queuing up the IDocs, establishing a single DB connection and processing large numbers of IDocs in a batch might reduce your processing time substantially. Something worth investigating.
I don't think that adding additional IDoc Listeners for different IDoc types will work if the source ECC system is the same for each type. It will simply use the first one it finds and process all the types there.
Additional threads may improve performance and are certainly a low-risk, easily implemented test. Just add them in small increments: too many threads can lock up your NW system (check with the NW administrator).
And you may want to check with your DBA to see if there are limits on the connections which MII can establish for uploading the IDoc content (or for that matter, any other improvements to be made on the DB side).
Regards,
Mike
Edited by: Michael Appleby on Feb 10, 2012 2:37 PM -
Performance issue in report programming..
Hi,
I am using a customized function module within a loop over an internal table containing fields of the PROJ table, for about 200 records. The function module's source code contains a set of SELECT queries on different tables like COSS, COSP, AUFK, PRPS, BPJA, PRHI, AFPO, AFKO etc., so the performance of the report is very poor. How can I improve it?
Is there any other way to restructure the code?
regards
Chetan
Hi John,
I am using SAP ECC 6.0 .
The report is used to update a Z-table which was created for Project System plan data.
I call a function module which returns an internal table, append its rows to another internal table, and refresh it; I do this for each project within a loop over the PROJ internal table. Finally, using the resulting itab, I modify the Z-table fields.
Code is as below..
SELECT pspid FROM proj CLIENT SPECIFIED
  INTO CORRESPONDING FIELDS OF TABLE t_itab1
  WHERE mandt = sy-mandt
    AND pspnr IN s_pspnr
    AND vbukr = p_vbukr
    AND prctr IN s_prctr.

LOOP AT t_itab1.
  l_pspid = t_itab1-pspid.
  CALL FUNCTION 'ZPS_FUN_BUDGETS'
    EXPORTING
      l_pspid = l_pspid
      l_vbukr = p_vbukr
    TABLES
      t_data  = t_itab2.
  APPEND LINES OF t_itab2 TO t_itab. " replaces the row-by-row inner loop
  CLEAR t_itab2.
  REFRESH t_itab2.
ENDLOOP.

LOOP AT t_itab.
  " MODIFY ztable ...
ENDLOOP.
Regards
Chetan -
Performance issue with a program in HR-ABAP
Hi All,
I have a program that is taking a very long time to execute; please suggest a solution. I am pasting the main code below.
<large code part removed by moderator>
Moderator message: Please Read before Posting in the Performance and Tuning Forum
Moderator message: please post only relevant code parts, otherwise formatting is lost.
Edited by: Thomas Zloch on Mar 1, 2011 11:11 AM>
Aaron Shover wrote:
> OK, Aditya, I added another type pool to my program and I can now use the same type that SAP is using.
>
> Thanks for pointing out the [obvious] solution that I was overlooking!
>
> Aaron
Aaron, I was not aware of the type pools; as mentioned, I don't have access to the system at the moment. But a type pool contains type declarations as well, so what I meant was that pulling the 2 declarations you need into your Z program could have solved the problem :) - which happens now anyway, since you've included the type pool.
Anyway, good to know that your problem is solved.
Cheers,
Advait -
Performance Issue in Oracle EBS
Hi Group,
I am working on a performance issue at a customer site; let me explain the behaviour.
There is one node for the database and another for the application.
The application server is running all the services.
The EBS version is 12.1.3 and the database version is 11.1.0.7; both servers run AIX.
The customer has added memory to both servers (database and application): initially they had 32 GB, now they have 128 GB.
Today I increased the memory parameters for the database and also increased the number of JVM processes from 1 to 2 for Forms and OACore; both JVMs are 1024M.
The behaviour is that when users navigate inside a form and press the down key quickly, the form hangs (reloading and taking 1 or 2 minutes to respond). It is not particular to one specific form; it happens in several forms.
The gather-statistics job is scheduled every weekend. I am not sure what the problem could be; I have collected a trace of the form and uploaded it to Oracle Support with no success or advice.
I have also run a ping command, and the response time between the servers is below 5 ms.
I have several activities in mind like:
- OATM conversion.
- ASM implementation.
- Upgrade to 11.2.0.4.
Has anybody seen this behaviour? Any advice about this problem will be really appreciated.
Thanks in advance.
Kind regards,
Francisco Mtz.
Hi Bashar, thank you very much for your quick response.
> If both servers are on the same network then the ping should not exceed 2 ms.
If I remember correctly, I did a ping last Wednesday, and there were some peaks over 5 ms.
> Have you checked the network performance between the clients and the application server?
I did a ping from the PC to the application and database servers, and they responded in less than 1 ms.
> What is the status of the CPU usage on both servers?
There is no CPU overhead; I tested it (the scrolling freeze) with no users in the application.
> Did this happen after you performed the hardware upgrade?
Yes, it happened after changing some memory parameters in the JVM and the database.
Oracle has suggested applying the latest Forms patches according to Note Doc ID 437878.1.
Thanks in advance.
Kind regards,
Francisco Mtz. -
Performance issues involving tables S031 and S032
Hello gurus,
I am having some performance issues. The program accesses data from S031 and S032; I have pasted the SELECT statements below. I have read through past forum postings regarding performance, but I wanted to know if anything stands out as the culprit of the very poor performance, and how it can be corrected. I am fairly new to SAP, so I apologize if I have missed an obvious error. From debugging the program, it seems the second SELECT statement takes a very long time to process.
GT_S032: approx. 40,000 entries
S031: approx. 90,000 entries
MSEG: approx. 115,000 entries
MKPF: approx. 100,000 entries
MARA: approx. 90,000 entries
SELECT
vrsio "Version
werks "Plant
lgort "Storage Location
matnr "Material
ssour "Statistic(s) origin
FROM s032
INTO TABLE gt_s032
WHERE ssour = space AND vrsio = c_000 AND werks = gw_werks.
IF sy-subrc = 0.
SELECT
vrsio "Version
werks "Plant
spmon "Period to analyze - month
matnr "Material
lgort "Storage Location
wzubb "Valuated stock receipts value
wagbb "Value of valuated stock being issued
FROM s031
INTO TABLE gt_s031
FOR ALL ENTRIES IN gt_s032
WHERE ssour = gt_s032-ssour
AND vrsio = gt_s032-vrsio
AND spmon IN r_spmon
AND sptag = '00000000'
AND spwoc = '000000'
AND spbup = '000000'
AND werks = gt_s032-werks
AND matnr = gt_s032-matnr
AND lgort = gt_s032-lgort
AND ( wzubb <> 0 OR wagbb <> 0 ).
ELSE.
WRITE: 'No data selected'(m01).
EXIT.
ENDIF.
SORT gt_s032 BY vrsio werks lgort matnr.
SORT gt_s031 BY vrsio werks spmon matnr lgort.
SELECT
  p~werks "Plant
  p~matnr "Material
  p~mblnr "Document Number
  p~mjahr "Document Year
  p~bwart "Movement type
  p~dmbtr "Amount in local currency
  t~shkzg "Debit/Credit indicator
  INTO TABLE gt_scrap
  FROM mkpf AS h
  INNER JOIN mseg AS p
    ON h~mblnr = p~mblnr
   AND h~mjahr = p~mjahr
  INNER JOIN mara AS m
    ON p~matnr = m~matnr
  INNER JOIN t156 AS t
    ON p~bwart = t~bwart
  WHERE h~budat >= gw_duepr-begda
    AND h~budat <= gw_duepr-endda
    AND p~werks = gw_werks.
Thanks so much for your help,
Jayesh
Issue with table S031 and with FOR ALL ENTRIES
Hi,
I have the following code, in which the SELECT statement on S031 takes a long time and then terminates with a dump. What should I do to avoid exceeding the execution time limit of an ABAP program?
TYPES:
BEGIN OF TY_MTL, " Material Master
MATNR TYPE MATNR, " Material Code
MTART TYPE MTART, " Material Type
MATKL TYPE MATKL, " Material Group
MEINS TYPE MEINS, " Base unit of Measure
WERKS TYPE WERKS_D, " Plant
MAKTX TYPE MAKTX, " Material description (Short Text)
LIFNR TYPE LIFNR, " vendor code
NAME1 TYPE NAME1_GP, " vendor name
CITY TYPE ORT01_GP, " City of Vendor
Y_RPT TYPE P DECIMALS 3, "Yearly receipt
Y_ISS TYPE P DECIMALS 3, "Yearly Consumption
M_OPG TYPE P DECIMALS 3, "Month opg
M_OPG1 TYPE P DECIMALS 3,
M_RPT TYPE P DECIMALS 3, "Month receipt
M_ISS TYPE P DECIMALS 3, "Month issue
M_CLG TYPE P DECIMALS 3, "Month Closing
D_BLK TYPE P DECIMALS 3, "Block Stock,
D_RPT TYPE P DECIMALS 3, "Today receipt
D_ISS TYPE P DECIMALS 3, "Day issues
TL_FL(2) TYPE C,
STATUS(4) TYPE C,
END OF TY_MTL,
BEGIN OF TY_OPG , " Opening File
SPMON TYPE SPMON, " Period to analyze - month
WERKS TYPE WERKS_D, " Plant
MATNR TYPE MATNR, " Material No
BASME TYPE MEINS,
MZUBB TYPE MZUBB, " Receipt Quantity
WZUBB TYPE WZUBB,
MAGBB TYPE MAGBB, " Issues Quantity
WAGBB TYPE WAGBB,
END OF TY_OPG.
DATA:
T_M TYPE STANDARD TABLE OF TY_MTL INITIAL SIZE 0,
WA_M TYPE TY_MTL,
T_O TYPE STANDARD TABLE OF TY_OPG INITIAL SIZE 0,
WA_O TYPE TY_OPG.
DATA: smonth1 TYPE spmon.
SELECT
a~matnr
a~mtart
a~matkl
a~meins
b~werks
INTO TABLE t_m FROM mara AS a
INNER JOIN marc AS b
ON a~matnr = b~matnr
* WHERE a~mtart EQ s_mtart
WHERE a~matkl IN s_matkl
AND b~werks IN s_werks
AND b~matnr IN s_matnr .
IF t_m IS NOT INITIAL. " guard: FOR ALL ENTRIES with an empty driver table would select every row
  SELECT spmon
         werks
         matnr
         basme
         mzubb
         wzubb
         magbb
         wagbb
    FROM s031 INTO TABLE t_o
    FOR ALL ENTRIES IN t_m
    WHERE matnr = t_m-matnr
      AND werks IN s_werks
      AND spmon LE smonth1
      AND basme = t_m-meins.
ENDIF. -
Avoiding performance issue due to loop within loop on internal tables
Hi Experts,
I have a requirement where I want to check whether each of the programs stored in one internal table is called from any of the programs stored in another internal table. I am currently looping over two internal tables (a loop within a loop), which is causing a major performance issue: the program runs very, very slowly.
Can anyone advise how to resolve this performance issue so that the program runs faster?
Thanks in advance.
Regards,
Chetan.
Forget the parallel cursor stuff; it is much too complicated for general usage and helps next to nothing. I will publish a blog in the next few days where this is shown in detail.
A loop within a loop is no problem if the inner table is a hashed or sorted table.
If it must be a standard table, then you must make a bit more effort and facilitate a binary search (READ ... BINARY SEARCH, then loop from the index and exit).
See here the exact coding - Measurements on internal tables: Reads and Loops:
/people/siegfried.boes/blog/2007/09/12/runtimes-of-reads-and-loops-on-internal-tables
And don't forget: the outer table need not be sorted, since the loop reaches every line anyway. The parallel cursor requires both tables to be sorted, and the additional sort consumes nearly the whole advantage of the parallel cursor compared to the simple but good loop-in-loop solution.
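A hedged sketch of the sorted-inner-table approach described above; the table and field names (lt_programs, lt_calls) are illustrative, not taken from the original posting:

```abap
TYPES: BEGIN OF ty_call,
         caller TYPE progname,
         callee TYPE progname,
       END OF ty_call.

DATA: lt_programs TYPE STANDARD TABLE OF progname,
      lt_calls    TYPE SORTED TABLE OF ty_call
                       WITH NON-UNIQUE KEY callee.

FIELD-SYMBOLS: <lv_prog> TYPE progname,
               <ls_call> TYPE ty_call.

LOOP AT lt_programs ASSIGNING <lv_prog>.
  " on a sorted table this WHERE uses the table key, so the inner
  " loop no longer scans every row of lt_calls
  LOOP AT lt_calls ASSIGNING <ls_call> WHERE callee = <lv_prog>.
    " <lv_prog> is called from <ls_call>-caller
  ENDLOOP.
ENDLOOP.
```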
Siegfried -
We are experiencing a performance issue with custom code using the GP API.
We get an array of IGPWorkItem objects using IGPRuntimeManager.getWorkItems.
We use IGPRuntimeManager.getProcessInstanceInformation to retrieve an IGPProcessInstanceInfo object for each item in the array.
There are around 220 items in the array.
Each getProcessInstanceInformation() call normally completes in 50ms.
Intermittently the time increases to 500ms.
Sometimes this is for all items in the array.
Sometimes it starts at 500ms and reduces to 50ms.
After some period, the speed returns to normal (50ms).
This happens across multiple instances running on multiple servers.
We can see no issues in cpu, memory, threads, or connections.
Would appreciate any suggestions on how to troubleshoot this issue.
Here is sample code with trace to capture times before and after method call.
IGPRuntimeManager rtManager = GPProcessFactory.getRuntimeManager();
IUser user = WDClientUser.getCurrentUser().getSAPUser();
IGPContextManager contextManager = GPContextFactory.getContextManager();
IGPUserContext userContext = contextManager.createUserContext(user, new Locale("en_US"));
IGPWorkItem[] workItems = rtManager.getWorkItems(GPWorkItemStatus.WORKITEM_STATUS_OPEN, userContext);
int len = workItems.length;
for (int i = 0; i < len; i++) {
    try {
        loc.infoT("start loop " + i);
        IGPWorkItem workItem = workItems[i];
        processID = workItem.getProcessID();
        loc.infoT("start info object call");
        IGPProcessInstanceInfo info = rtManager.getProcessInstanceInformation(processID, user);
        loc.infoT("stop info object method");
        // do other stuff ...
    } catch (Exception e) {
        loc.errorT(e.toString());
    }
}
Edited by: Ray Erdelyan on Dec 11, 2009 10:49 PM
Dear BRamchan,
From what you have described, I think you are well ahead in the use of MDM APIs. The information you have provided is not enough to understand where the issue is. If you have no objection, could you send a copy of your code (deleting any confidential or intellectual-property-related material)? I can try to simulate the same code and see what kind of issues arise. If you think it is possible, please send me the code at [email protected]
Thanks.
Siva K. -
Not Updating Customized Table when System having Performance Issue
Hi,
This is actually the same topic as "Not Updating Customized Table when System having Performance Issue", posted last December by Leonard Tan, regarding the user exit EXIT_SAPLMBMB_001.
Recently we changed the function module z_mm_save_hide_qty to run in update task. However, this caused even more data not to be updated. Hence we put back the old version (without the update task), but now it is not working as it used to (e.g. version 1: 10 records not updated; version 2, with update task: 20 records not updated; back to version 1: 20 records not updated).
I tried debugging the program; however, whenever I debug, nothing is wrong and the data is updated correctly.
Please advise if anyone has any idea why is this happening. Many thanks.
Regards,
Janet
Hi Janet,
You are right: it is a basic rule not to make any COMMIT or RFC calls in a user exit.
Have a look at SAP Note 92550. There they say that exit EXIT_SAPLMBMB_001 is called in the update routine MB_POST_DOCUMENT, and that this routine is itself already called IN UPDATE TASK from function 'MB_UPDATE_TASKS'.
SAP also tells us not to do any updates on SAP system tables like MBEW, MARD, MSEG.
Before the exit is called, the BAdI 'MB_DOCUMENT_BADI' is now called, with methods MB_DOCUMENT_BEFORE_UPDATE and MB_DOCUMENT_UPDATE. Possibly you will have more success implementing the BAdI.
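A rough skeleton of such a BAdI implementation - the class name is illustrative (an implementation would normally be created via SE19, and the exact method signatures should be taken from the BAdI definition in SE18):

```abap
CLASS zcl_im_mm_hide_qty DEFINITION.
  PUBLIC SECTION.
    INTERFACES if_ex_mb_document_badi.
ENDCLASS.

CLASS zcl_im_mm_hide_qty IMPLEMENTATION.
  METHOD if_ex_mb_document_badi~mb_document_before_update.
    " runs before the material document update is posted;
    " put the logic from EXIT_SAPLMBMB_001 here - no COMMIT WORK, no RFC calls
  ENDMETHOD.
  METHOD if_ex_mb_document_badi~mb_document_update.
    " runs in the update task itself
  ENDMETHOD.
ENDCLASS.
```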
I don't know your situation and goal so this is all I can tell you now.
Good luck!
Regards,
Clemens -
Is there a recommended limit on the number of custom sections and the cells per table so that there are no performance issues with the UI?
Thanks Kelly,
The answers are the following:
1200 cells per custom section (new count), and up to 30 custom sections per spec.
We assume all will be populated, and this would apply to all final material specs in the system, which could be ~25% of all material specs.
The cells will be numeric, free text, drop-downs, and some calculated numerics.
Are we reaching the limits of UI performance?
Thanks -
Performance issue in Webi rep when using custom object from SAP BW univ
Hi All,
I had to design a report that runs for the previous day, so we created a custom object which ranks the dates, plus a pre-defined filter which picks the date with the highest rank.
the definition for the rank variable(in universe) is as follows:
<expression>Rank([0CALDAY].Currentmember, Order([0CALDAY].Currentmember.Level.Members ,Rank([0CALDAY].Currentmember,[0CALDAY].Currentmember.Level.Members), BDESC))</expression>
Now to the issue I am currently facing,
The report works fine when we run it in a test environment, i.e. with a small amount of data.
Our production environment has millions of rows of data, and when I run the report with the filter it just hangs. I think this is because it tries to rank all the dates (to find the max date), resulting in a huge performance issue.
Can someone suggest how this performance issue can be overcome?
I work on BO XI3.1 with SAP BW.
Thanks and Regards,
Smitha.
Hi,
Using a variable on the BW side is not feasible, since we want to use the same BW query for a few other reports as well.
Could you please explain what you mean by 'use the LAG function'? How can it be used in this scenario?
Thanks and Regards,
Smitha Mohan. -
Performance issue in a custom table
Hi All,
I have a Z-table used in a program where I suspect a performance issue in the selection. It looks like this:
SELECT ship_no invoice_no
INTO TABLE it_ship_no_hist
FROM zco_cust_hist
FOR ALL ENTRIES IN it_freight
WHERE ship_no = it_freight-tknum.
There are 7 key fields in this table, out of which one (tknum) is used in the WHERE condition. The table has no secondary index.
For performance purposes, should I create an index with just the field tknum? Can I do that, or should an index be created only with non-key fields?
Hi,
A table always has - besides a few exceptions - one index: the primary key. Its fields are the key fields, in the same order as in the table.
The primary key is always there and is therefore not displayed under the button 'Indexes'.
Is tknum a key field? What are the key fields, in the correct order? If tknum is in the key, and especially if it is the first key field, then it does not make sense to create an index.
Siegfried