B1IF Memory consumption and SQL Release
Hi All,
The problem I have at the moment is that SQL Server does not seem to release any memory used by B1IF transactions and SQL queries.
Our client's server has 40 GB of RAM; 28 GB of that is used by SQL Server alone, of which about 16 GB relates to SAP B1 and the rest to B1IF transactions and queries.
It seems that SQL Server never releases this memory. Several posts I have come across cover only memory management for B1IF itself (Tomcat 6), but I can't find anything about B1IF's SQL-side memory usage or ways to limit or recycle the unused memory.
This is causing our customers' systems to lag and slow down in general.
Any help with this problem would be greatly appreciated.
Thanks,
Gideon
Hi Gideon
AFAIK you don't manage SQL Server's memory usage on a per-application basis. In other words, SQL Server knows best how much memory to use and how much each application should consume. To help your OS, you should set a maximum memory limit for SQL Server:
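For reference, capping SQL Server's memory is done with `sp_configure`; a minimal sketch (the 32768 MB value is only an example - size it to your workload and leave headroom for the OS, B1IF and Tomcat):

```sql
-- Cap the memory SQL Server may take for its buffer pool.
-- It will still use what it is given aggressively, but will not grow past this.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 32768;
RECONFIGURE;
```

The change takes effect immediately, without a service restart.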
SQL Server not releasing memory after query executes - Stack Overflow
Kind regards,
Radek
Similar Messages
-
Query on memory consumption during SQL
Hi SAP Gurus,
Could I kindly request your inputs concerning the following scenario?
To put it quite simply, we have a program where we're required to retrieve all the fields from a lengthy custom table, i.e. the select statement uses an asterisk. Unfortunately, there isn't really a way to avoid this short of a total overhaul of the code, so we had to settle with this (for now).
The program retrieves from the database table using a where clause filtering only to a single value company code. Kindly note that company code is not the only key in the table. In order to help with the memory consumption, the original developer had employed retrieval by packages (also note that the total length of each record is 1803...).
The problem encountered is as follows:
- Using company code A, retrieving for 700k entries in packages of 277, the program ran without any issues.
- However, using company code B, retrieving 1.8 million entries in packages of 277, the program encountered a TSV_TNEW_PAGE_ALLOC_FAILED short dump. This error is encountered the very first time the program goes through the select statement, ergo it has not even been able to reach any additional internal table processing yet.
The biggest difference between the two company codes is the number of corresponding records they have in the table. I've checked whether company code B has longer values in its columns than company code A, but they're just the same.
What I do not quite understand is why memory consumption changed just by changing the company code in the selection. I thought that the memory consumed by both company codes should be the same... at least, in the beginning, considering that we're retrieving by packages, so we're not trying to get all of the records all at once. However, the fact that it failed at the very beginning has shown me that I'm gravely mistaken.
Could someone please enlighten me on how memory is consumed during database retrieval?
Thanks!
Hi,
with FAE (FOR ALL ENTRIES), the whole query is executed even for a single record in the itab, and all results for the company code are transferred from the database to the DBI, since duplicates are removed by the DBI, not by the database.
If you use package size the resultset is buffered in a system table in the DBI (which allocates memory from your user quota). And from there on the package sizes are built and handed over to your application (into table lt_temp).
see recent ABAP documentation:
Since duplicate rows are only removed on the application server, all rows specified using the WHERE condition are sometimes transferred to an internal system table and aggregated here. This system table has the same maximum size as the normal internal tables. The system table is always required if addition PACKAGE SIZE or UP TO n ROWS is used at the same time. These do not affect the amount of rows transferred from the database server to the application server; instead, they are used to transfer the rows from the system table to the actual target area.
What you should do:
calculate the size needed for your big company code B: the number of rows multiplied by the line length. That is the minimum amount you need for your user memory quota (quotas can be checked with ABAP report RSMEMORY). If the amount of memory is sufficient, then try without PACKAGE SIZE:
SELECT * FROM <custom table>
       INTO TABLE lt_temp
       FOR ALL ENTRIES IN lt_bukrs
       WHERE bukrs = lt_bukrs-bukrs
       ORDER BY PRIMARY KEY.
This might actually use less memory than the PACKAGE SIZE option with FOR ALL ENTRIES. Since with FAE the result is buffered in the DBI anyway (and subtracted from your quota), you can read it in one go and avoid storing portions twice (once in the DBI buffer and again as a package in lt_temp).
If the amount of memory is still too big, you have to either increase the quotas or select
less data (additional where conditions) or avoid using FAE in this case in order to not read all
the data in one go.
Hope this helps,
Hermann -
Check Process memory consumption and Kill it
Hello
I have just installed Orchestrator and have a problem that I think is perfect for Orchestrator to handle.
I have a process that sometimes hangs, and the only way to spot it is that its memory consumption has stopped increasing.
The process is started every 15 minutes and scans a folder; if it finds a file, it reads the file into a system. You can see that it is working by the increasing memory consumption. If the read fails, the memory consumption stops growing. The process is still running and responding, but it is hung.
I'm thinking about building a runbook that checks the memory consumption every 5 minutes and compares it with the previous value. If the last three values are the same, I will kill the process and start it again.
My problem is that I have not found a way to check the memory consumption of a process.
I have set up a small test, just to verify that I get the correct process, with the activities Monitor Process -> Get Process Status -> Append Line (process name).
But how do I get the process's memory consumption?
/Anders
Now that I think about it a bit more, I don't think there will be an easy way to set up a monitor for your situation in SCOM. Not that it couldn't be done, just not easily. Getting back to SCORCH: what you are trying to do isn't an everyday kind of scenario, and I don't think there is a built-in activity for it.
The hardest thing to overcome, whether you use SCORCH or SCOM, is likely going to be determining the error condition of three consecutive samples with the same memory usage. You'll need a way to track the samples, and I can't think of a good way to do this without utilizing scripting. -
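The scripted sampling approach described above could look roughly like this PowerShell sketch (the process name "MyImporter" and the restart path are assumptions - substitute your own; in a runbook you would more likely persist one sample per run instead of sleeping):

```powershell
# Sample the process's working set three times, 5 minutes apart.
# If all three samples are identical, assume the process is hung and restart it.
$samples = @()
for ($i = 0; $i -lt 3; $i++) {
    $p = Get-Process -Name "MyImporter" -ErrorAction Stop
    $samples += $p.WorkingSet64          # memory consumption in bytes
    if ($i -lt 2) { Start-Sleep -Seconds 300 }
}
if (($samples | Select-Object -Unique).Count -eq 1) {
    Stop-Process -Name "MyImporter" -Force
    Start-Process "C:\Apps\MyImporter.exe"   # assumed install path
}
```

In Orchestrator this would typically be wrapped in a Run .Net Script activity rather than run as a standalone loop.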
OVI Suite 2.0: memory consumption and disk activit...
Hi all,
the new Ovi suite is consuming ridiculous amounts of memory:
- nokiamserver.exe - 150 MB
- Nokiaovisuite - 240 MB
Come on!
In addition to this, nokiamserver.exe keeps the hard drive extremely busy for more than 20 minutes on every boot!
I now have the latest Ovi Suite installed on my BUSINESS laptop, and I just don't need or want media servers running on it. I just want the internet connectivity and occasional photo/message sync.
Is there a way to disable it and only start it if I ever happen to need it?
JH
Running on XP SP2
Absolutely the same here.
It seems so resource-hungry because it is in "Watchfiles" mode (absolutely consuming if you, like me, run Ovi on a virtual machine whose folders are located on a network share, as they almost always are on a VM).
I didn't try it on a normal machine, but IMO if Ovi has to handle large picture/music folders, its background watchfiles process does a hard job monitoring changes in real time.
So I launched regedit and removed (or you can keep it and just put "REM" before the path) the registry value
REM C:\Program Files\Common Files\Nokia\MPlatform\NokiaMServer /watchfiles startup
that you can find here:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Run
Stefania -
Memory Consumption: Start A Petition!
I am using SQL Developer 4.0.0.13 Build MAIN 13.80. I was praying that SQL Developer 4.0 would no longer use so much memory and, when doing so, slow to a crawl. But that is not the case.
Is there a way to start a "petition" to have the SQL Developer team focus on the product's memory usage? This problem has been there for years now, with many posts and no real answer.
If there isn't a place to start a "petition", let's do something here that Oracle will respond to.
Thank you
Yes, at this point (after restarting) SQL Developer is functioning fine. Windows reports 1+ GB of free memory. I have 3 worksheets open, connected to two different DB connections. Each worksheet has 1 to 3 pinned query results. My problem is that after working in SQL Developer for a day or so, with perhaps 10 worksheets open across 3 database connections, and having queried large data sets and performed large exports, it becomes unresponsive even after closing worksheets. It appears to me that it does not clean up after itself.
I will use Java VisualVM to compare memory consumption and see if it reports that SQL Developer is releasing memory but in the end I don't care about that. I just need a responsive SQL Developer and if I need to close some worksheets at times I can understand doing so but at this time that does not help. -
Question regarding Memory Consumption in Collections
I Read this on some website -
Memory for collections is stored in the program global area (PGA), not the system global area (SGA). SGA memory is shared by all sessions connected to Oracle Database, but PGA memory is allocated for each session. Thus, if a program requires 5MB of memory to populate a collection and there are 100 simultaneous connections, that program causes the consumption of 500MB of PGA memory, in addition to the memory allocated to the SGA.
My Question is -
If I use Collections in my Procedures and they consume 5 MB of space, is the memory still allocated to the session even after the Procedure completes its execution?
I mean, is memory released for other sessions after procedure completion or session termination?
user9276238 wrote:
If I use Collections in my Procedures and they consume 5 MB space, is the memory still allocated to the session even if the Procedure completes its execution?
Yes. How does the session know that your very next instruction is not to run that very same procedure again (with different parameters)? It would again need 5 MB of memory, and if released, that memory would need to be reallocated. If all processes worked like this, immediately releasing memory and then grabbing it again, that could cause contention and other issues.
Memory management is more complex than this. For example, let's say your session initially required a few KB of space, so the process's memory looks something like this:
xxxx
Next your session needs that 5MB for the (enormous) collection:
xxxxyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy
And during the processing, it needs a few KB more:
xxxxyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyzz
Okay, so now it can release the 5MB of memory (indicated by the y's) back to the kernel? It cannot shrink, as there is trailing memory allocated and in use (indicated by the z's). It would need to re-org that memory in order to release that 5MB worth of y's.
So it is not as straight forward as one would like to think. The basic rule for PL/SQL is to tread softly using memory. Why? Because this is server code. Memory used in PL is allocated as private memory to that process. Unlike shared memory, no-one else can share in its use. Thus the memory must be considered expensive.
There's also the issue of scalability. If your PL code needs 5MB of private memory from the server.. what happens when a 100 client sessions all run your code? How well does your code scale in a multi-process and multi-server environment?
So what memory should you ideally then use in PL? Shared memory. The SGA. This means it is better for scalability and server resources to rather store large data structures for PL as GTT's (global temporary tables - where a temp table is created per session).
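As a sketch of that suggestion (the table and column names here are made up), a global temporary table keeps the large working set in temp segments managed by the database rather than in per-session PGA collections:

```sql
-- One shared definition; each session sees only its own rows, and those
-- rows live in temporary segments instead of private PGA memory.
CREATE GLOBAL TEMPORARY TABLE big_work_set (
  id      NUMBER,
  payload VARCHAR2(4000)
) ON COMMIT PRESERVE ROWS;  -- rows survive commits, until the session ends
```

PL/SQL code then populates and reads this table with ordinary SQL instead of building a 5 MB in-memory collection per session.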
I mean, is memory released for other sessions after procedure completion or session termination?
Session termination is a "sure thing" in terms of memory being released back to the kernel as free memory, assuming this is a dedicated server session: the actual server process terminates and its PGA is released. Shared server processes service multiple sessions during their lifetime, and only when the shared server process terminates is its PGA released (though PGAs can be shrunk as and when their memory managers decide it is feasible). -
ALV SapGui Memory consumptions
hi ,
we have experienced that when using ALV controls, the memory usage of SAPGui keeps increasing; even when programs free the ALV controls, the memory consumption of SAPGui stays nearly the same as before (SAPGui for Windows).
Any ideas?
kind regards
Oliver
What version of the SAPGui are you running? What patch level? Memory consumption and leaks are quite frequently listed among the fixes for a particular patch level. We very recently did testing with SAPGui 640 Patch 3 and did not see noticeably increased memory consumption with ALV (at least not to the point that it was noticeable in Windows Task Manager).
-
XML,CLOB AND MEMORY : CONSUMED BUT NOT RELEASED !!!
Hi,
I'm working with the XMLGEN package and the XSLT processor to produce XML documents on an Oracle 8.1.7.3 server; I use the 9i XML and Java packages.
I'm facing the following BIG MEMORY problem:
Environment: I must generate an XML document, transformed with an XSL stylesheet, of 80 MB (nearly 22,000 rows). As XMLGEN uses the DOM parsing method, I extract my XML 500 rows at a time and loop until the 22,000 rows are done. I use DBMS_JOB with jobs that read and execute the export each minute.
The algorithm is:
keeprow = 22000
while keeprow > 0
  create 3 CLOBs (DBMS_LOB package): one for the XSL sheet, one for the XML, one for the result
  GetXML (XMLGEN package)
  transform with XSL (XSL processor)
  write to disk (UTL_FILE package)
  free the 3 CLOBs (DBMS_LOB package)
  keeprow = keeprow - 500
loop
The problem: the process starts at 250 MB of total memory, ends at 1000 MB of used memory, and NEVER RELEASES 1 KB of it.
So I often get a JavaOutOfMemoryError (I've allocated 1 GB of RAM to my Java process).
Any help will be very, very appreciated!
My derived problem: 22,000 rows is not the worst case; I have some exports of 200,000 rows to do (and I cannot allocate 10 GB of RAM!!!)
My PL/SQL code follows.
Regards
Fred
PROCEDURE DO_EXPORT_XML(
P_JOB_ID JOB_PARAMETRE.JOB_ID%TYPE,
P_JOB_ID_ORDRE JOB_PARAMETRE.JOB_ID_ORDRE%TYPE,
P_CLE_UP UPLOADREQ_TEMP.ID%TYPE,
P_LOAD_OR_DELOAD VARCHAR2)
IS
L_FILE_NAME JOB_PARAMETRE.JOB_FILE_NAME_DEST%TYPE;
L_REP_NAME JOB_PARAMETRE.JOB_REP_NAME_DEST%TYPE;
L_FILE_STYLESHEET JOB_PARAMETRE.JOB_STYLESHEET%TYPE;
L_VERSION_PDM JOB_PARAMETRE.JOB_VPDM%TYPE;
P_SELECT varchar2(4000):='';
P_CURSOR varchar2(4000):='';
l_filehandler_out UTL_FILE.FILE_TYPE;
--Variables for processing in batches of 500
L_NBROW_TODO_ATONCE number := 500;
L_NBROW_MIN number := 1;
L_NBROW_MAX number := -1;
L_NBROWKEEPTODO number := -1;
xslString CLOB := null;
res number default -1;
xmlString CLOB := null;
li_ret number := 0;
li_faitle number := 0;
amount integer:= 255;
li_loop integer := 0;
charString varchar2(255);
ls_deload varchar2(255) default '';
ls_SQL varchar2(4000) default '';
ls_temp_file varchar2(255) default '';
text_file_dir varchar2(32) := 'e:\temporarydir';
l_par xmlparser.parser;
l_xml xmldom.domdocument;
l_pro xslprocessor.processor;
l_xsl xslprocessor.stylesheet;
docfragnode xmldom.DOMNode;
docfrag xmldom.DOMDocumentFragment;
l_parsedclob clob := null;
l_amount binary_integer := 32767;
l_ligne varchar2(32767);
l_offset number default 1;
l_pos number default null;
l_pos2 number default null;
l_lobsize number default null;
l_memsize number default 1073741824; --1024 Mo
l_mempipo number default 0;
type rc is ref cursor;
l_cursor rc;
cursor TEMPLATE is select UNSPSC,1 as NB from UPLOADREQ_TEMP where 1=2;
c1rec TEMPLATE%rowtype;
BEGIN
l_mempipo:=setmaxmemorysize(l_memsize);
dbms_lob.createtemporary(l_parsedclob, true, dbms_lob.session);
dbms_lob.createtemporary(xmlstring, true, dbms_lob.session);
--returns the appropriate SELECT statement
GET_SELECT_TO_EXPORT_XML(P_JOB_ID , P_JOB_ID_ORDRE , P_CLE_UP , P_SELECT,P_CURSOR, P_LOAD_OR_DELOAD);
SELECT JOB_FILE_NAME_DEST,JOB_REP_NAME_DEST,JOB_STYLESHEET
INTO L_FILE_NAME,L_REP_NAME,L_FILE_STYLESHEET
FROM JOB_PARAMETRE
WHERE JOB_ID =P_JOB_ID AND JOB_ID_ORDRE=P_JOB_ID_ORDRE;
l_filehandler_out := UTL_FILE.FOPEN(text_file_dir, L_FILE_NAME, 'w',l_amount);
--Return XSL Sheet in a clob : cause of memory consumed but not released
xslString := METACAT.load_a_file( 1,L_FILE_STYLESHEET);
open l_cursor for P_CURSOR;
LOOP
fetch l_cursor into c1rec;
exit when l_cursor%notfound;
L_NBROW_MIN := 1;
L_NBROW_MAX := 0;
L_NBROWKEEPTODO:=c1rec.NB;
LOOP
begin
if(L_NBROWKEEPTODO > L_NBROW_TODO_ATONCE) THEN
begin
L_NBROW_MAX:= L_NBROW_TODO_ATONCE + L_NBROW_MAX;
L_NBROWKEEPTODO:= L_NBROWKEEPTODO - L_NBROW_TODO_ATONCE;
end;
else
begin
L_NBROW_MAX:= L_NBROW_MAX + L_NBROWKEEPTODO;
L_NBROWKEEPTODO:=0;
end;
end if;
--open the results file
ls_SQL:= P_SELECT || ' AND ( ROWNUM BETWEEN ' || L_NBROW_MIN || ' AND ' || L_NBROW_MAX || ' ) and UNSPSC=''' || c1rec.UNSPSC || '''';
ls_temp_file := c1rec.UNSPSC || '_' || L_FILE_NAME;
L_NBROW_MIN:=L_NBROW_TODO_ATONCE + L_NBROW_MIN;
--CAT_AUTOLOAD.JOB_ADD_TRACE (P_JOB_ID,'UPLOAD REQUISITE : Export donnies REQUETE ' || to_char(li_loop), ls_SQL,'',0,0);
xmlgen.resetOptions;
xmlgen.setErrorTag('ERROR_RESULT');
xmlgen.setRowIdAttrName('NAH');
xmlgen.setRowIdColumn('NAH');
xmlgen.setEncodingTag('ISO-8859-1');
xmlgen.useNullAttributeIndicator(false);
if(xmlString is not null) then
dbms_lob.open(xmlString,dbms_lob.lob_readwrite);
l_lobsize:= dbms_lob.Getlength(xmlString);
if(l_lobsize>0) then
dbms_lob.erase(xmlString,l_lobsize,1);
end if;
dbms_lob.close(xmlString);
dbms_lob.freetemporary(xmlString);
dbms_lob.createtemporary(xmlstring, true, dbms_lob.session);
end if;
--Return XML in a clob : cause of memory consumed but not released
xmlString := xmlgen.getXML(ls_SQL,0);
l_par := xmlparser.newparser;
xmlparser.parseclob(l_par, xslString);
l_xsl := xslprocessor.newstylesheet(xmlparser.getdocument(l_par),null);
xmlparser.parseclob(l_par, xmlString);
l_xml := xmlparser.getdocument(l_par);
l_pro := xslprocessor.newprocessor;
xslprocessor.showWarnings(l_pro, true);
xslprocessor.setErrorLog(l_pro, text_file_dir || substr(ls_temp_file,0,length(ls_temp_file)-4) || '_logerreur.XML');
if(l_parsedclob is not null) then
dbms_lob.open(l_parsedclob,dbms_lob.lob_readwrite);
l_lobsize:= dbms_lob.Getlength(l_parsedclob);
if(l_lobsize>0) then
dbms_lob.erase(l_parsedclob,l_lobsize,1);
end if;
dbms_lob.close(l_parsedclob);
dbms_lob.freetemporary(l_parsedclob);
dbms_lob.createtemporary(l_parsedclob, true, dbms_lob.session);
end if;
--Return XML Processed with XSL in a clob : cause of memory consumed but not released
xslprocessor.processxsl(l_pro,l_xsl,l_xml,l_parsedclob);
--release NOTHING
xmlparser.freeparser(l_par);
xslprocessor.freeprocessor(l_pro);
l_ligne:='';
l_offset :=1;
l_pos := null;
l_pos2 := null;
if(li_loop=0) then
begin
--open the file and save the header + the data into it
l_pos:=dbms_lob.instr(l_parsedclob,'</DATA>');
if ( nvl(l_pos,0) > 0 ) then
loop
if(l_pos-1>l_amount + l_offset ) then
l_ligne:=dbms_lob.SUBSTR(l_parsedclob,l_amount,l_offset);
UTL_FILE.PUT(l_filehandler_out,l_ligne);
UTL_FILE.fflush(l_filehandler_out);
l_offset:=l_offset+l_amount;
else
l_ligne:=dbms_lob.SUBSTR(l_parsedclob,l_pos-1 -l_offset ,l_offset);
UTL_FILE.PUT(l_filehandler_out,l_ligne);
UTL_FILE.fflush(l_filehandler_out);
exit;
end if;
end loop;
else
EXIT;
end if;
end;
else
--write the data only, so the beginning (header) is not repeated
begin
l_pos:=dbms_lob.instr(l_parsedclob,'<ITEM');
if ( nvl(l_pos,0) > 0 ) then
l_pos2:=dbms_lob.instr(l_parsedclob,'</DATA>');
if ( nvl(l_pos2,0) > 0 ) then
loop
if(l_pos + l_amount <= l_pos2 -1 ) then
l_ligne:=dbms_lob.SUBSTR(l_parsedclob,l_amount,l_pos);
UTL_FILE.PUT(l_filehandler_out,l_ligne);
UTL_FILE.fflush(l_filehandler_out);
l_pos:=l_pos +l_amount;
else
l_ligne:=dbms_lob.SUBSTR(l_parsedclob,l_pos2 -1 -l_pos,l_pos);
UTL_FILE.PUT(l_filehandler_out,l_ligne);
UTL_FILE.fflush(l_filehandler_out);
exit;
end if;
end loop;
else
exit;
end if;
end if;
end;
end if;
li_loop:=li_loop + 1 ;
--UTL_FILE.FCLOSE(l_filehandler_in);
JAVA_GC();
EXIT WHEN L_NBROWKEEPTODO=0;
Exception
when others then
begin
-- IF(utl_file.is_open(l_filehandler_in)) THEN
-- utl_file.fclose( l_filehandler_in);
-- END IF;
IF(utl_file.is_open(l_filehandler_out)) THEN
utl_file.fclose( l_filehandler_out);
END IF;
RAISE_APPLICATION_ERROR(-20001,'File with errors');
end;
END;
END LOOP;
END LOOP;
CLOSE l_cursor;
if ( xmlString is not null ) then
dbms_lob.open(xmlString,dbms_lob.lob_readwrite);
l_lobsize:= dbms_lob.Getlength(xmlString);
if(l_lobsize>0) then
dbms_lob.erase(xmlString,l_lobsize,1);
end if;
dbms_lob.close(xmlString);
dbms_lob.freeTemporary( xmlString);
end if;
if(l_parsedclob is not null) then
dbms_lob.open(l_parsedclob,dbms_lob.lob_readwrite);
l_lobsize:= dbms_lob.Getlength(l_parsedclob);
if(l_lobsize>0) then
dbms_lob.erase(l_parsedclob,l_lobsize,1);
end if;
dbms_lob.close(l_parsedclob);
dbms_lob.freetemporary(l_parsedclob);
end if;
UTL_FILE.NEW_LINE(l_filehandler_out);
l_ligne:='</DATA></CATALOG>';
UTL_FILE.PUT(l_filehandler_out,l_ligne);
UTL_FILE.FCLOSE(l_filehandler_out);
EXCEPTION
when others then
begin
IF(utl_file.is_open(l_filehandler_out)) THEN
utl_file.fclose( l_filehandler_out);
END IF;
end;
END;
******************************
Thank you for the info - I had no idea I was putting myself in danger by cutting it so close. Since your post I have moved my iPhoto library to an external drive and now have 165 GB of space on my HD. Following this I have 2 questions.
1. Since my available HD space was reduced by the size of the photo download, it seems logical that the download is still somewhere on my HD. Is there a place where these photos might be hiding on my HD even though they are not visible in the iPhoto library?
2. I was able to recover the .jpg files which are fine. I also recovered the .mov files but they have been compromised. I am hoping I can find the originals still on the HD somewhere. If not, do you have any suggestions for recovery methods or programs? I have not used the SD card since the incident so I should be able to attempt another recovery to salvage the .mov files if there is an alternative method/program available.
Thanks again! -
SQL Developer High Memory Consumption 3.2.20.09.87
Hello,
I have been using SQL Developer for quite some time and have had problems in the past with high memory consumption when using this tool.
I have received and applied advice from this forum, and while it helped a little, the majority of the high memory consumption remained an issue.
I finally got more time to dig around and try to isolate specifically where the problem seems to be coming from, and here is what I found.
*1)* I have removed the Check for Updates feature
*2)* I have turned off many of the extensions except for DBA Navigator, Real Time SQL Monitoring, SearchBar and Snippet.
*3)* When I start a fresh SQL Developer session and initiate an Oracle connection, the application consumes roughly 148 MB of RAM.
*4)* When I open Windows Task Manager and watch the memory allocated to SQL Developer, I notice it goes up by roughly 5 KB a second when I move my mouse over the tool or run through menus, and the memory is never released back to the system.
*5)* When I run a large SQL statement into the grid, the memory jumps by about 100 MB or so, and continues to do so every time I repeat the SQL until SQL Developer consumes roughly 748 MB of RAM.
*6)* 748 MB of RAM seems to be the point at which SQL Developer (with one Oracle connection) stops consuming more; the memory is still not returned to the system.
Is there a way to have SQL Developer automatically clean up its active memory usage without closing it down and restarting it?
Why does SQL Developer continue to consume more and more memory just from moving the mouse over it and/or navigating menus?
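For what it's worth, one commonly suggested mitigation is to cap the Java heap so the process simply cannot grow past a chosen ceiling (the JVM still manages memory internally, but Windows never sees it balloon). A sketch for `sqldeveloper\bin\sqldeveloper.conf`, with an example value only:

```
# Cap the maximum Java heap; 512m is illustrative - pick a size that
# fits your worksheets and exports without starving the rest of the OS.
AddVMOption -Xmx512m
```

With the heap capped, the JVM garbage-collects within that limit instead of requesting ever more memory from the system.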
Here is my About detail:
Oracle SQL Developer 3.2.20.09
Version 3.2.20.09
Build MAIN-09.87
Copyright © 2005, 2012 Oracle. All Rights Reserved.
IDE Version: 11.1.1.4.37.59.48
Product ID: oracle.sqldeveloper
Product Version: 11.2.0.09.87
Version
Component Version
========= =======
Java(TM) Platform 1.6.0_35
Oracle IDE 3.2.20.09.87
Properties
Name Value
==== =====
awt.toolkit sun.awt.windows.WToolkit
class.load.environment oracle.ide.boot.IdeClassLoadEnvironment
class.load.log.level CONFIG
class.transfer delegate
file.encoding Cp1252
file.encoding.pkg sun.io
file.separator \
ice.browser.forcegc false
ice.pilots.html4.ignoreNonGenericFonts true
ice.pilots.html4.tileOptThreshold 0
ide.AssertTracingDisabled true
ide.bootstrap.start 109707460930968
ide.build MAIN-09.87
ide.conf C:\app\twilliams\product\11.2.0\client_3\sqldeveloper\sqldeveloper\bin\sqldeveloper.conf
ide.config_pathname C:\app\twilliams\product\11.2.0\client_3\sqldeveloper\sqldeveloper\bin\sqldeveloper.conf
ide.debugbuild false
ide.devbuild false
ide.extension.search.path sqldeveloper/extensions:jdev/extensions:ide/extensions
ide.firstrun true
ide.java.minversion 1.6.0_04
ide.launcherProcessId 3276
ide.main.class oracle.ide.boot.IdeLauncher
ide.patches.dir ide/lib/patches
ide.pref.dir C:\Users\twilliams\AppData\Roaming\SQL Developer
ide.pref.dir.base C:\Users\twilliams\AppData\Roaming
ide.product oracle.sqldeveloper
ide.shell.enableFileTypeAssociation C:\app\twilliams\product\11.2.0\client_3\sqldeveloper\sqldeveloper\bin\sqldeveloperW.exe
ide.splash.screen splash.gif
ide.startingArg0 C:\app\twilliams\product\11.2.0\client_3\sqldeveloper\sqldeveloper\bin\sqldeveloperW.exe
ide.startingcwd C:\app\twilliams\product\11.2.0\client_3\SQLDEVELOPER\SQLDEVELOPER\BIN
ide.user.dir C:\Users\twilliams\AppData\Roaming\SQL Developer
ide.user.dir.var IDE_USER_DIR
ide.work.dir C:\Users\twilliams\Documents\SQL Developer
ide.work.dir.base C:\Users\twilliams\Documents
ilog.propagatesPropertyEditors false
java.awt.graphicsenv sun.awt.Win32GraphicsEnvironment
java.awt.printerjob sun.awt.windows.WPrinterJob
java.class.path ..\..\ide\lib\ide-boot.jar
java.class.version 50.0
java.endorsed.dirs C:\app\twilliams\product\11.2.0\client_3\sqldeveloper\jdk\jre\lib\endorsed
java.ext.dirs C:\app\twilliams\product\11.2.0\client_3\sqldeveloper\jdk\jre\lib\ext;C:\Windows\Sun\Java\lib\ext
java.home C:\app\twilliams\product\11.2.0\client_3\sqldeveloper\jdk\jre
java.io.tmpdir c:\Temp\
java.library.path C:\app\twilliams\product\11.2.0\client_3\sqldeveloper\sqldeveloper\bin;C:\Windows\Sun\Java\bin;C:\Windows\system32;C:\Windows;C:\app\twilliams\product\11.2.0\client_3\bin;C:\app\twilliams\product\11.2.0\client_3;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Program Files\Lenovo\Access Connections\;C:\Program Files\WinMerge;C:\Program Files\ThinkPad\Bluetooth Software\;.
java.naming.factory.initial oracle.javatools.jndi.LocalInitialContextFactory
java.protocol.handler.pkgs oracle.jdevimpl.handler
java.runtime.name Java(TM) SE Runtime Environment
java.runtime.version 1.6.0_35-b10
java.specification.name Java Platform API Specification
java.specification.vendor Sun Microsystems Inc.
java.specification.version 1.6
java.util.logging.config.file logging.conf
java.vendor Sun Microsystems Inc.
java.vendor.url http://java.sun.com/
java.vendor.url.bug http://java.sun.com/cgi-bin/bugreport.cgi
java.version 1.6.0_35
java.vm.info mixed mode
java.vm.name Java HotSpot(TM) Client VM
java.vm.specification.name Java Virtual Machine Specification
java.vm.specification.vendor Sun Microsystems Inc.
java.vm.specification.version 1.0
java.vm.vendor Sun Microsystems Inc.
java.vm.version 20.10-b01
jdbc.driver.home /C:/app/twilliams/product/11.2.0/client_3/
jdbc.library /C:/app/twilliams/product/11.2.0/client_3/jdbc/lib/ojdbc6.jar
line.separator \r\n
oracle.home C:\app\twilliams\product\11.2.0\client_3\sqldeveloper
oracle.ide.util.AddinPolicyUtils.OVERRIDE_FLAG true
oracle.jdbc.mapDateToTimestamp false
oracle.translated.locales de,es,fr,it,ja,ko,pt_BR,zh_CN,zh_TW
oracle.xdkjava.compatibility.version 9.0.4
orai18n.library /C:/app/twilliams/product/11.2.0/client_3/jlib/orai18n.jar
os.arch x86
os.name Windows 7
os.version 6.1
path.separator ;
reserved_filenames con,aux,prn,lpt1,lpt2,lpt3,lpt4,lpt5,lpt6,lpt7,lpt8,lpt9,com1,com2,com3,com4,com5,com6,com7,com8,com9,conin$,conout,conout$
sqldev.debug false
sun.arch.data.model 32
sun.boot.class.path C:\app\twilliams\product\11.2.0\client_3\sqldeveloper\jdk\jre\lib\resources.jar;C:\app\twilliams\product\11.2.0\client_3\sqldeveloper\jdk\jre\lib\rt.jar;C:\app\twilliams\product\11.2.0\client_3\sqldeveloper\jdk\jre\lib\sunrsasign.jar;C:\app\twilliams\product\11.2.0\client_3\sqldeveloper\jdk\jre\lib\jsse.jar;C:\app\twilliams\product\11.2.0\client_3\sqldeveloper\jdk\jre\lib\jce.jar;C:\app\twilliams\product\11.2.0\client_3\sqldeveloper\jdk\jre\lib\charsets.jar;C:\app\twilliams\product\11.2.0\client_3\sqldeveloper\jdk\jre\lib\modules\jdk.boot.jar;C:\app\twilliams\product\11.2.0\client_3\sqldeveloper\jdk\jre\classes
sun.boot.library.path C:\app\twilliams\product\11.2.0\client_3\sqldeveloper\jdk\jre\bin
sun.cpu.endian little
sun.cpu.isalist pentium_pro+mmx pentium_pro pentium+mmx pentium i486 i386 i86
sun.desktop windows
sun.io.unicode.encoding UnicodeLittle
sun.java2d.ddoffscreen false
sun.jnu.encoding Cp1252
sun.management.compiler HotSpot Client Compiler
sun.os.patch.level Service Pack 1
user.country US
user.dir C:\app\twilliams\product\11.2.0\client_3\SQLDEVELOPER\SQLDEVELOPER\BIN
user.home C:\Users\twilliams
user.language en
user.name twilliams
user.timezone America/Los_Angeles
user.variant
windows.shell.font.languages
Extensions
Name Identifier Version Status
==== ========== ======= ======
Check For Updates oracle.ide.webupdate 11.1.1.4.37.59.48 Loaded
Code Editor oracle.ide.ceditor 11.1.1.4.37.59.48 Loaded
Component Palette oracle.ide.palette1 11.1.1.4.37.59.48 Loaded
Data Miner oracle.dmt.dataminer 11.2.1.1.09.87 Disabled by user
Database Connection Support oracle.jdeveloper.db.connection 11.1.1.4.37.59.48 Loaded
Database Object Explorers oracle.ide.db.explorer 11.1.1.4.37.59.48 Loaded
Database UI oracle.ide.db 11.1.1.4.37.59.48 Loaded
Diagram Framework oracle.diagram 11.1.1.4.37.59.48 Loaded
Diagram Javadoc Extension oracle.diagram.javadoc 11.1.1.4.37.59.48 Loaded
Diagram Thumbnail oracle.diagram.thumbnail 11.1.1.4.37.59.48 Loaded
Diff/Merge oracle.ide.diffmerge 11.1.1.4.37.59.48 Loaded
Extended IDE Platform oracle.javacore 11.1.1.4.37.59.48 Loaded
External Tools oracle.ide.externaltools 11.1.1.4.37.59.48 Loaded
File Support oracle.ide.files 11.1.1.4.37.59.48 Loaded
Help System oracle.ide.help 11.1.1.4.37.59.48 Loaded
History Support oracle.jdeveloper.history 11.1.1.4.37.59.48 Loaded
Import/Export Support oracle.ide.importexport 11.1.1.4.37.59.48 Loaded
Index Migrator support oracle.ideimpl.indexing-migrator 11.1.1.4.37.59.48 Loaded
JDeveloper Runner oracle.jdeveloper.runner 11.1.1.4.37.59.48 Loaded
JViews Registration Addin oracle.diagram.registration 11.1.1.4.37.59.48 Loaded
Log Window oracle.ide.log 11.1.1.4.37.59.48 Loaded
Mac OS X Adapter oracle.ideimpl.apple 11.1.1.4.37.59.48 Loaded
Navigator oracle.ide.navigator 11.1.1.4.37.59.48 Loaded
Object Gallery oracle.ide.gallery 11.1.1.4.37.59.48 Loaded
Oracle IDE oracle.ide 11.1.1.4.37.59.48 Loaded
Oracle SQL Developer oracle.sqldeveloper 11.2.0.09.87 Loaded
Oracle SQL Developer - 3rd Party Database Browsers oracle.sqldeveloper.thirdparty.browsers 11.2.0.09.87 Loaded
Oracle SQL Developer - APEX Listener Administration oracle.sqldeveloper.listener 11.2.0.09.87 Loaded
Oracle SQL Developer - Change Mangement oracle.sqldeveloper.em_cm 11.2.0.09.87 Loaded
Oracle SQL Developer - DBA Navigator oracle.sqldeveloper.dbanavigator 11.2.0.09.87 Loaded
Oracle SQL Developer - Database Cart oracle.sqldeveloper.dbcart 11.2.0.09.87 Loaded
Oracle SQL Developer - Extras oracle.sqldeveloper.extras 11.2.0.09.87 Loaded
Oracle SQL Developer - File Navigator oracle.sqldeveloper.filenavigator 11.2.0.09.87 Loaded
Oracle SQL Developer - Migrations Antlr3 Translator oracle.sqldeveloper.migration.translation.core_antlr3 11.2.0.09.87 Missing dependencies: oracle.sqldeveloper.migration
Oracle SQL Developer - Migrations Application Migration oracle.sqldeveloper.migration.application 11.2.0.09.87 Disabled by user
Oracle SQL Developer - Migrations Core oracle.sqldeveloper.migration 11.2.0.09.87 Disabled by user
Oracle SQL Developer - Migrations DB2 oracle.sqldeveloper.migration.db2 11.2.0.09.87 Disabled by user
Oracle SQL Developer - Migrations DB2 Translator oracle.sqldeveloper.migration.translation.db2 11.2.0.09.87 Missing dependencies: oracle.sqldeveloper.migration, oracle.sqldeveloper.migration.translation.core_antlr3
Oracle SQL Developer - Migrations Microsoft Access oracle.sqldeveloper.migration.msaccess 11.2.0.09.87 Disabled by user
Oracle SQL Developer - Migrations Microsoft SQL Server oracle.sqldeveloper.migration.sqlserver 11.2.0.09.87 Disabled by user
Oracle SQL Developer - Migrations MySQL oracle.sqldeveloper.migration.mysql 11.2.0.09.87 Disabled by user
Oracle SQL Developer - Migrations Sybase Adaptive Server oracle.sqldeveloper.migration.sybase 11.2.0.09.87 Disabled by user
Oracle SQL Developer - Migrations T-SQL Translator oracle.sqldeveloper.migration.translation.core 11.2.0.09.87 Missing dependencies: oracle.sqldeveloper.migration
Oracle SQL Developer - Migrations Teradata oracle.sqldeveloper.migration.teradata 11.2.0.09.87 Disabled by user
Oracle SQL Developer - Migrations Teradata SQL Translator oracle.sqldeveloper.migration.translation.teradata_translator 11.2.0.09.87 Missing dependencies: oracle.sqldeveloper.migration, oracle.sqldeveloper.migration.translation.core
Oracle SQL Developer - Migrations Translation UI oracle.sqldeveloper.migration.translation.gui 11.2.0.09.87 Disabled by user
Oracle SQL Developer - Object Viewer oracle.sqldeveloper.oviewer 11.2.0.09.87 Loaded
Oracle SQL Developer - Real Time SQL Monitoring oracle.sqldeveloper.sqlmonitor 11.2.0.09.87 Loaded
Oracle SQL Developer - Reports oracle.sqldeveloper.report 11.2.0.09.87 Loaded
Oracle SQL Developer - Scheduler oracle.sqldeveloper.scheduler 11.2.0.09.87 Disabled by user
Oracle SQL Developer - Schema Browser oracle.sqldeveloper.schemabrowser 11.2.0.09.87 Loaded
Oracle SQL Developer - SearchBar oracle.sqldeveloper.searchbar 11.2.0.09.87 Loaded
Oracle SQL Developer - Security oracle.sqldeveloper.security 11.2.0.09.87 Disabled by user
Oracle SQL Developer - Snippet oracle.sqldeveloper.snippet 11.2.0.09.87 Loaded
Oracle SQL Developer - Spatial oracle.sqldeveloper.spatial 11.2.0.09.87 Disabled by user
Oracle SQL Developer - TimesTen oracle.sqldeveloper.timesten 11.2.0.09.87 Disabled by user
Oracle SQL Developer - Tuning oracle.sqldeveloper.tuning 11.2.0.09.87 Loaded
Oracle SQL Developer - Unit Test oracle.sqldeveloper.unit_test 11.2.0.09.87 Disabled by user
Oracle SQL Developer - User Extensions Support oracle.sqldeveloper.userextensions 11.2.0.09.87 Loaded
Oracle SQL Developer - Worksheet v2 oracle.sqldeveloper.worksheet 11.2.0.09.87 Loaded
Oracle SQL Developer - XML Schema oracle.sqldeveloper.xmlschema 11.2.0.09.87 Loaded
Oracle SQL Developer Data Modeler oracle.datamodeler 3.1.4.710 Disabled by user
Oracle SQL Developer Data Modeler - Reports oracle.sqldeveloper.datamodeler_reports 11.2.0.09.87 Disabled by user
PROBE Debugger oracle.jdeveloper.db.debug.probe 11.1.1.4.37.59.48 Loaded
Peek oracle.ide.peek 11.1.1.4.37.59.48 Loaded
Persistent Storage oracle.ide.persistence 11.1.1.4.37.59.48 Loaded
Property Inspector oracle.ide.inspector 11.1.1.4.37.59.48 Loaded
QuickDiff oracle.ide.quickdiff 11.1.1.4.37.59.48 Loaded
Replace With oracle.ide.replace 11.1.1.4.37.59.48 Loaded
Runner oracle.ide.runner 11.1.1.4.37.59.48 Loaded
VHV oracle.ide.vhv 11.1.1.4.37.59.48 Loaded
Versioning Support oracle.jdeveloper.vcs 11.1.1.4.37.59.48 Disabled by user
Versioning Support for Subversion oracle.jdeveloper.subversion 11.1.1.4.37.59.48 Missing dependencies: oracle.jdeveloper.vcs
Virtual File System oracle.ide.vfs 11.1.1.4.37.59.48 Loaded
Web Browser and Proxy oracle.ide.webbrowser 11.1.1.4.37.59.48 Loaded
XML Editing Framework IDE Extension oracle.ide.xmlef 11.1.1.4.37.59.48 Loaded
audit oracle.ide.audit 11.1.1.4.37.59.48 Loaded
classpath: protocol handler extension oracle.jdeveloper.classpath 11.1.1.0.0 Loaded
jdukshare oracle.bm.jdukshare 11.1.1.4.37.59.48 Loaded
mof-xmi oracle.mof.xmi 11.1.1.4.37.59.48 Loaded
oracle.ide.dependency oracle.ide.dependency 11.1.1.4.37.59.48 Loaded
oracle.ide.indexing oracle.ide.indexing 11.1.1.4.37.59.48 Loaded
palette2 oracle.ide.palette2 11.1.1.4.37.59.48 Loaded
status oracle.ide.status 11.1.1.4.37.59.48 Loaded
Thanks in advance...
Tom
Edited by: ERPDude on Feb 28, 2013 2:46 PM
Aces!!! You nailed it, Gary...
Thank you.
I applied the fix noted in Re: Reduce SQLDeveloper memory footprint with JDK 1.7
For others, here is a summary of my changes.
product\11.2.0\client_3\sqldeveloper\sqldeveloper\bin\sqldeveloper.conf
AddVMOption -XX:+UnlockExperimentalVMOptions
AddVMOption -XX:+UseG1GC
AddVMOption -XX:MaxGCPauseMillis=50
AddVMOption -XX:GCPauseIntervalMillis=200
AddVMOption -XX:MaxPermSize=128M
AddVMOption -Xms50M
AddVMOption -Xmx384M
AddVMOption -XX:MinHeapFreeRatio=10
AddVMOption -XX:MaxHeapFreeRatio=10
product\11.2.0\client_3\sqldeveloper\ide\bin\ide.conf
Comment out the following two lines, as shown below:
#AddVMOption -Xmx640M
#AddVMOption -Xms128M
Now SQL Developer runs at roughly 500 MB.
The only question I have left for others reviewing this: is there a way to push these memory values down further, without much adverse impact on SQL Developer, using newer JVM switches/features?
The posts that drove these changes are old from a technological perspective :) 2010.
Tom -
Very high memory consumption of B1i and cockpit widgets
Hi all,
finally I have managed to install B1i successfully, but I think something is wrong though.
Memory consumption in my test environment (Win2003, 1024 MB RAM), while no other applications and no SAP addons are started:
tomcat5.exe 305 MB
SAP B1 client 315 MB
SAP B1DIProxy.exe 115 MB
sqlservr.exe 40 MB
SAPB1iEventSender.exe 15 MB
others less than 6 MB and almost only system based processes...
For each widget I open (3 default widgets, one on each standard cockpit), the tomcat grows bigger and leaves less for the sql server, which has to fetch all the data (several seconds on 100% of CPU usage).
Is this heavy memory consumption normal? What happens if several users are logged into SAP B1 using widgets?
Thanks in advance
Regards
Sebastian
Hi Gordon,
so this is normal? Then I guess the dashboards are not suitable for many customers, especially those working on a terminal server infrastructure. Even if the Tomcat server has this memory consumption only on the SAP server, when each client needs about 300 MB (plus a few hundred more for the add-ons they need!), I could not activate the widgets. And SAP B1 is generally not the only application running at the customer's site. Suggesting they buy more memory for a few Xcelsius dashboards won't convince the customer.
I hope that this feature will be improved in the future, otherwise the cockpit is just an extension of the old user menu (except for the brilliant quickfinder on top of the screen).
Regards
Sebastian -
Hi,
I have some doubts about memory issues: allocation, release, EEPROM, and RAM.
Question 1:
private void method1() {
byte[] a = new byte[10]; // persistent array: allocated in EEPROM when 'new' executes
byte[] b = JCSystem.makeTransientByteArray(...); // transient array: contents live in RAM
byte c; // local variable: lives on the stack (RAM) for the duration of the call
}
When will some memory be allocated to variables a, b and c, and when will that memory be released?
Question 2:
The JCRE (until 2.2 at least) doesn't have a garbage collector, but if the card itself has that mechanism, will the applet use it automatically?
Thanks in advance!
It's not a question of how many EEPROM writes are done each day/hour/minute/second. It's a question of whether the data must be saved across sessions. RAM is mainly used for intermediate computations and session data. EEPROM is used to store persistent info (user info, credit, phonebook, etc...).
RAM is also a good way to optimize processing time. If you have to manipulate a lot of persistent data during an APDU, it's a good idea to copy everything in a "cache" (transient buffer) and/or local variables, do all of your processing on the cached values, and then perform the persistent write at the end of the command.
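A hedged Java Card sketch of the cache pattern just described (illustrative only: the class name, buffer size, and clear-event are assumptions, and this needs a Java Card toolkit to build, so it is not runnable on a desktop JVM):

```java
import javacard.framework.JCSystem;
import javacard.framework.Util;

public class CachedRecord {
    private final byte[] persistent; // EEPROM: allocated once, at install time
    private final byte[] cache;      // RAM: transient, cleared on deselect

    CachedRecord() {
        persistent = new byte[16];
        cache = JCSystem.makeTransientByteArray(
                (short) 16, JCSystem.CLEAR_ON_DESELECT);
    }

    void process() {
        // 1. Copy persistent data into the RAM cache.
        Util.arrayCopyNonAtomic(persistent, (short) 0, cache, (short) 0, (short) 16);
        // 2. Do all intermediate work on the cache (fast, no EEPROM wear)...
        // 3. Commit the result back to EEPROM once, at the end of the command.
        Util.arrayCopy(cache, (short) 0, persistent, (short) 0, (short) 16);
    }
}
```

The point of the pattern: intermediate writes hit RAM, and only the final commit touches EEPROM.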
As to your last question on how much RAM is acceptable, it depends on the context. If you know that your applet will be alone, feel free to use as much as the platform can give you. If not, try to be reasonable. Cryptographic intensive applets generally use a lot of RAM to store intermediate computation results.
From personal experience, I've written very simple applets that needed about 20 transient bytes, and complex ones that needed up to 1500 transient bytes. If you really need to set a limit, 200 bytes is already a considerable amount of transient space and should be more than enough for most applets. But then again, my guess is as good as any. -
OIM 11g using too much memory and not releasing it when shutting down
Hello,
we have a problem with OIM 11g using too much memory (about 5 GB) and not releasing it at shutdown (4 GB still used).
We are using a VM with RedHat Linux 5.6. Originally we had 4 GB RAM + a 2 GB swap file. We installed Admin Server, OAM, OIM and SOA on that machine but quickly realised we couldn't run all four programs at once in 6 GB. Admin Server could run with two other products, but it was a tight fit in memory.
So we increased the RAM to 8 GB (still a 2 GB swap file). But then our problem occurred: I start the Admin Server (2.7 GB total memory used), then OAM (4.6 GB total memory used), and then OIM. After it started, the server is now using 9.6 GB of memory (~300 MB free)! The problem gets even better: after I shut down everything (OIM, OAM, Admin Server), the "top" command shows that 4 GB of memory is still used, even though nothing else is running on the server! There is absolutely no other process (other than root stuff) running and no other users connected to the machine. After a reboot, it shows 400 MB of memory used. I tried restarting the programs and it did the same thing.
Our intuition is that there might be a memory leak or some bug in OIM that uses up almost all the free memory and does not release it upon shutdown. It might have been there before we increased the memory, but we had not noticed since memory was already tight.
Has anyone encountered the same problem? Any ideas? Any suggestions to narrow down the problem?
Thank you
You can adjust the memory settings for WLS by editing the setSOADomainEnv.sh file that can be found in your /middleware/user_projects/domains/<domain>/bin/ folder. There is an argument called PORT_MEM_ARGS which is used to set your Java memory arguments. This way you can decrease/increase the amount of memory used by each managed server.
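A minimal sketch of that edit, assuming the default file layout (the heap values below are illustrative, not recommendations -- tune them to the RAM actually available on the host):

```shell
# In /middleware/user_projects/domains/<domain>/bin/setSOADomainEnv.sh
# Illustrative values only: a 512 MB initial / 1 GB max heap per managed server.
PORT_MEM_ARGS="-Xms512m -Xmx1024m"
```

Restart the managed servers after changing the file so the new arguments take effect.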
I usually type "ps -ef | grep oracle" to see what processes are running by the oracle user. This way the output doesn't get cluttered with root processes.
Sunny Tsang -
I have 24GB of RAM in my 64 bit Windows 7 system running on RAID 5 with an i7 CPU.
A while ago I updated from Premiere CS5 to CC and then from Premiere CC to CC 2014. I updated all my then current projects to the new version as well.
Most of the projects contained 1080i 25fps (1080x1440 anamorphic) MPEG clips originally imported (captured from HDV tape) from a Sony HDV camera using Premiere CS5 or CC.
Memory consumption during re-indexing.
When updating projects I experienced frequent crashes going from CS5 to CC and later going from CC to CC 2014. Updating projects caused all clips in the project to be re-indexed. The crashes were due to the re-indexing process causing excessive RAM consumption and I had to re-open each project several times before the re-index would eventually complete successfully. This is despite using the setting to limit the RAM consumed by Premiere to much less than the 24GB RAM in my system.
I checked that clips played; there were no errors generated; no clips showed as Offline.
Some clips now "Offline: Importer" in CC 2014
Now, after some months editing one project I found some of the MPEG clips have been flagged as "Offline: Importer" and will not relink. The error reported is "An error occurred decompressing video or audio".
The same clips play perfectly well in, for example, Windows Media Player.
I still have the earlier Premiere CC and the project file and the clips that CC 2014 importer rejects are still OK in the Premiere CC version of the project.
It seems that the importer in CC 2014 has a bug that causes it to reject MPEG clips with which earlier versions of Premiere had no problem.
It's not the sort of problem expected with a premium product.
After this experience, I will not be updating premiere mid-project ever again.
How can I get these clips into CC 2014? I can't go back to the version of the project in Premiere CC without losing hours of work/edits in Premiere CC 2014.
Any help appreciated. Thanks.
To answer my own question: I could find no answer to this myself and, with there being no replies in this forum, I have resorted to re-capturing the affected HDV tapes from scratch.
Luckily, I still had my HDV camera and the source tapes and had not already used any of the clips that became Offline in Premiere Pro CC 2014.
It seems clear that the MPEG importer in Premiere Pro CC 2014 rejects clips that Premiere Pro CC once accepted. It's a pretty horrible bug that ought to be fixed. Whether Adobe have a workaround or at least know about this issue and are working on it is unknown.
It also seems clear that the clip re-indexing process that occurs when upgrading a project (from CS5 to CC, and also from CC to CC 2014) has a bug which causes memory consumption to grow continuously while it runs. I have 24 GB of RAM in my system, and regardless of the amount of RAM I allocated to Premiere Pro, it would eventually crash. Fortunately, on restarting Premiere Pro and re-loading the project, re-indexing would resume where it left off, and, depending on the size of the project (number of clips to be indexed), after many repeated crashes and restarts re-indexing would eventually complete and the project would be OK after that.
It also seems clear that Adobe support isn't the greatest at recognising and responding when there are technical issues, publishing "known issues" (I could find no Adobe reference to either of these issues) or publishing workarounds. I logged the re-index issue as a bug and had zero response. Surely I am not the only one who has experienced these particular issues?
This is very poor support for what is supposed to be a premium product.
Lesson learned: I won't be upgrading Premiere again mid project after these experiences. -
BW data model and impacts to HANA memory consumption
Hi All,
As I consider how to create BW models where HANA is the DB for a BW application, it makes sense to move the reporting target from cubes to DSOs. The next logical step is that the DSO should store the lowest granularity of data (document level). So a consolidated data model that reports on cross-functional data would combine sales, inventory, and purchasing data, all stored at document level. In this scenario:
Will a single report execution that requires data from all 3 DSOs use more memory vs. the 3 DSOs aggregated, say, at site/day/material? In other words: lower granularity data = higher memory consumption per report execution?
I'm thinking that more memory is required to aggregate the data in HANA before sending it to BW. Is aggregation still necessary to manage execution memory usage?
Regards,
Dae Jin
Let me rephrase.
I got an EarlyWatch that said my dimensions on one of cube were too big. I ran SAP_INFOCUBE_DESIGNS in SE38 in my development box and that confirmed it.
So, I redesigned the cube, reactivated it and reloaded it. I then ran SAP_INFOCUBE_DESIGNS again. The cube doesn't even show up on it. I suspect I have to trigger something in BW to make it populate for that cube. How do I make that happen manually?
Thanks.
Dave -
High memory usage in SQL Servers
Hi,
I have a few SQL servers with memory utilization above 95% when checked in Task Manager.
Servers OS - Windows server 2008 R2
SQL Sever 2008 and SQL Server 2012
The AWE option is not enabled; the min memory setting is 1 GB and the max memory setting is 80% of the total memory in the server.
The SQL process is using less than max memory when checked in Task Manager, so I was wondering where the remaining memory went.
I used the RAMMap tool and found memory in AWE, even though AWE is disabled in SQL Server. Please let me know how to avoid AWE.
Hi,
Task Manager does not always show you the correct value, so I would advise against relying on it. In this case, though, since your SQL Server service account does not have the Locked Pages in Memory (LPIM) privilege, Task Manager can be used as a reference.
Please use the query below to see SQL Server memory utilization:
select
(physical_memory_in_use_kb / 1024) as Memory_usedby_Sqlserver_MB,
(locked_page_allocations_kb / 1024) as Locked_pages_used_Sqlserver_MB,
(total_virtual_address_space_kb / 1024) as Total_VAS_in_MB,
process_physical_memory_low,
process_virtual_memory_low
from sys.dm_os_process_memory;
If your SQL Server is a 64-bit version, there is no need to enable the AWE option.
It is completely normal behavior for SQL Server to utilize memory. It can also utilize more memory than the max server memory setting allows, as that setting only controls buffer pool allocation.
You can reduce the max server memory value if you really want the SQL Server memory utilization shown in Task Manager to come down. But, again, this is normal.
What is your max server memory value, anyway?
How much have you left for the OS? At least 5-6 GB is required for the OS; the exact value may vary.
>>SQL process is using less than max memory, when checked in task manager. So i was thinking where is remaining memory
SQL Server caches memory to store as many data and index pages in memory as possible, to avoid physical I/O, which is costly. Once it has taken memory, it will not release it unless SQLOS asks it to. It will start trimming its memory consumption when SQLOS tells it to, because of memory pressure or other requirements. The process is fully dynamic and well managed, so you don't need to worry.
More about SQL Server memory and troubleshooting
http://social.technet.microsoft.com/wiki/contents/articles/22316.sql-server-memory-and-troubleshooting.aspx
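As a rough sanity check of the "leave 5-6 GB for the OS" guidance above, here is a small illustrative helper. The function name and the 6 GB reserve are assumptions for the example, not an official formula; the result is what you would put into max server memory (MB):

```java
public class MaxServerMemory {
    // Heuristic from the reply above: reserve some RAM for the OS and give
    // the remainder to SQL Server's 'max server memory (MB)' setting.
    static long maxServerMemoryMb(long totalRamGb, long osReserveGb) {
        long mb = (totalRamGb - osReserveGb) * 1024;
        return Math.max(mb, 1024); // never suggest less than 1 GB
    }

    public static void main(String[] args) {
        // The 40 GB box from the opening post, reserving 6 GB for the OS:
        System.out.println(maxServerMemoryMb(40, 6)); // prints 34816
    }
}
```

For the 40 GB client described at the top of this thread, that would cap SQL Server at roughly 34 GB instead of letting it grow unbounded.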
Maybe you are looking for
-
When trying to create a new server farm in SharePoint Foundation 2013, we get the following error: The local farm is not accessible. Cmdlets with FeatureDependencyId are not registered. PS C:\Users\Administrator> New-SPConfigurationDatabase cmdlet
-
I can't open my iPhoto after I upgraded from the app store. Any suggestions?
Good morning. I recently upgraded my iPhoto from the App Store. Since then I cannot open iPhoto; it comes up with error 1712. Can anyone give me any help? Thanks, Jeff
-
I am using Mountain Lion (OS X 10.8.2) on my MacBook (mid-2009); Mail version 6.1 (1498). My incoming mail server is pop3.live.com (Hotmail). When I send an email on the Hotmail website, it does not register in my Mac Mail sent box. How do I rectify
-
Hi Experts, I've got a report with a lot of master data for a customer, but not all of it is always maintained. The result is that instead of a blank there is a "#" or "Not assigned" value. How can I change this to blank? Thanks a lot for your help
-
Alerts for item master change and new item creation
Hi experts, my manager needs to get an alert whenever the item master is updated or a new item master record is created.