Large list memory problem

Hi,
I am using Director MX 2004 (10.1), and I am finding that when I am working with large lists there seems to be a memory leak: memory that should be returned when the list is finished with is never released. The projector is being run on Windows XP.
This only seems to happen in a projector; it does not happen in authoring.
Please see the attached code for a test sample I made.
My actual application creates a list of property lists of about 1500 elements x 10 properties, but the attached example has the same effect.
The sample basically repopulates an array with 30000 text elements every 5 seconds.
If you make this into a projector, you will see (unless the problem is limited to the 5 machines I am using) that every 5 seconds the memory in use increments and never goes down, despite the fact that the list is set to VOID at the end of each calculation.
Has anybody got an explanation for this, a workaround, or can anybody at least replicate this so I know I am not going mad?
I have tested creating the projector on several machines in our office, and on several flavours of Windows XP, all with the same result.
Thanks for any ideas.

Tested, and I can verify that you've found a bug. And it's probably worse than it seems.
First of all, it has nothing to do with using globals. You can replicate the leak by publishing a movie containing the following frame script:
on beginSprite me
  the debugPlaybackEnabled = true
  repeat while not the shiftDown
    nArray = []
    repeat with i = 1 to 30000
      nArray.add("text " & i)
    end repeat
    put the milliseconds
  end repeat
end
Second, it's not a list issue (the list failing to release its elements on cleanup). I tried appending lists instead of strings, nArray.add([]), and the issue remained.
Then I tried using Xtrema's strings, _a("myString" & i), and other non-Director-native values, and everything was OK: no leaks.
And finally, I tried using xLists containing Director strings, and the leak occurred again.
Based on the above, I'd say that the leak is caused by Director's failure to release the allocated memory of native values that require allocated buffers for storing their data.
And now to the 'probably worse' part.
When adding 'just' 20000 instead of 30000 strings, there was no leak. So I guessed that the problem occurred when a large, yet fixed, number of allocations was involved. But then I tried using a larger string ("textAAAAAAAAAA" & i), and there was the leak again.
So the leak depends not only on the number of unique allocations, but on the size of the allocated buffers as well.
The issue seems to be fixed in Director 11. However, this bug, along with the legacy memory allocation issue that was also fixed in 11 (a pre-v11 Director Windows projector allocates approximately 10% of the physical RAM!), is something that I strongly believe justifies a Director 10.x update. I bet it won't happen, but in my book, bug fixes = update; new features = upgrade/new version.
"TJW-dev" <[email protected]> wrote in
message
news:[email protected]...
> global nArray
>
> on prepareMovie
>   nArray = []
>   updateTimer = timeout("restartTimer").new(5000, #popArray)
> end prepareMovie
>
> on popArray
>   nArray = []
>   repeat with i = 1 to 30000
>     nArray.add("Text " & i)
>   end repeat
>   nArray = VOID
> end popArray
>

Similar Messages

  • Java library for large list sorts in small amount of memory

    Hi All,
    Wondering whether anybody would know of a ready-made Java library for sorting large lists of objects. We will write it ourselves if necessary, but I'm hoping this is the sort of problem that has been solved a million times and has hopefully been wrapped behind some nice interface.
    What we need to do is something along the lines of:
    Load a large list with objects (duh!)
    If that list fits into the allowed amount of memory, great. Sort it and end of story.
    If it won't, then have some mechanism for spilling over onto disk, and perform the best sort algorithm possible given that spill-over.
    These sorts need to be used in scenarios such as a random selection of 10% of the records in a 100G unsorted file, delivered sorted, with the jobs doing the sorts being limited to as little as 64MB of memory (these are extremes, but ones that happen quite regularly).
    Some of you might think "just use a database". Been there. Done that. And for reasons beyond the scope of this post it's a no-goer.
    Thanking you for all your help in advance.

    Of course this kind of sorting was common back in the days of yore, when we had mainframes with less than 1MB of RAM.
    The classic algorithm uses several files:
    1) Load as much as will fit
    2) Sort it
    3) Write it to a temporary file
    4) Repeat from 1, alternating output between two or three temp files.
    Your temporary files then each contain a series of sorted sequences (try saying that three times fast).
    You then merge these sequences into longer sequences, creating more temp files. Then merge the longer sequences into even longer ones, and so on until you're left with just the one sequence.
    Merging doesn't require any significant memory use, however long the sequences are: you only need to hold one record from each temp file being merged.
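    For illustration, here is a minimal sketch of the same idea in Java, with one deliberate change from the description above: instead of repeated merge passes between two or three temp files, it does a single k-way merge with a priority queue (the memory bound is the same, one record held per run). Class and method names are invented for the example, records are assumed to be one string per line, and you would size maxLinesPerChunk to whatever actually fits your memory budget.

        // Illustrative external merge sort for one-record-per-line text files.
        // Phase 1 spills sorted runs that fit in memory; phase 2 streams a
        // k-way merge holding only one line per run.
        import java.io.BufferedReader;
        import java.io.BufferedWriter;
        import java.io.IOException;
        import java.nio.charset.StandardCharsets;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;
        import java.util.ArrayList;
        import java.util.Collections;
        import java.util.Comparator;
        import java.util.List;
        import java.util.PriorityQueue;

        public class ExternalSort {

            // Read chunks of at most maxLinesPerChunk lines, sort each, spill to a temp file.
            static List<Path> spillSortedRuns(Path input, int maxLinesPerChunk) throws IOException {
                List<Path> runs = new ArrayList<>();
                try (BufferedReader in = Files.newBufferedReader(input, StandardCharsets.UTF_8)) {
                    List<String> chunk = new ArrayList<>();
                    String line;
                    while ((line = in.readLine()) != null) {
                        chunk.add(line);
                        if (chunk.size() == maxLinesPerChunk) {
                            runs.add(writeRun(chunk));
                            chunk.clear();
                        }
                    }
                    if (!chunk.isEmpty()) {
                        runs.add(writeRun(chunk));
                    }
                }
                return runs;
            }

            static Path writeRun(List<String> chunk) throws IOException {
                Collections.sort(chunk);
                Path run = Files.createTempFile("sortrun", ".txt");
                Files.write(run, chunk, StandardCharsets.UTF_8);
                return run;
            }

            // Merge all runs in one pass; the heap holds one {line, readerIndex} pair per run.
            static void mergeRuns(List<Path> runs, Path output) throws IOException {
                List<BufferedReader> readers = new ArrayList<>();
                Comparator<Object[]> byLine = Comparator.comparing(e -> (String) e[0]);
                PriorityQueue<Object[]> heap = new PriorityQueue<>(byLine);
                try {
                    for (Path run : runs) {
                        BufferedReader r = Files.newBufferedReader(run, StandardCharsets.UTF_8);
                        readers.add(r);
                        String first = r.readLine();
                        if (first != null) {
                            heap.add(new Object[] { first, readers.size() - 1 });
                        }
                    }
                    try (BufferedWriter out = Files.newBufferedWriter(output, StandardCharsets.UTF_8)) {
                        while (!heap.isEmpty()) {
                            Object[] smallest = heap.poll();
                            out.write((String) smallest[0]);
                            out.newLine();
                            String next = readers.get((Integer) smallest[1]).readLine();
                            if (next != null) {
                                heap.add(new Object[] { next, smallest[1] });
                            }
                        }
                    }
                } finally {
                    for (BufferedReader r : readers) {
                        r.close();
                    }
                }
            }

            public static void main(String[] args) throws IOException {
                List<Path> runs = spillSortedRuns(Paths.get(args[0]), 1000000);
                mergeRuns(runs, Paths.get(args[1]));
                for (Path run : runs) {
                    Files.deleteIfExists(run);
                }
            }
        }

    Everything outside the in-memory chunk sort is streaming, so the merge phase stays within a small fixed budget regardless of the input size.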

  • Memory problems with PreparedStatements

    Driver: 9.0.1 JDBC Thin
    I am having memory problems using PreparedStatement via JDBC.
    After profiling our application, we found that a large number of oracle.jdbc.ttc7.TTCItem objects were being created but not released, even though we were closing the ResultSets of the prepared statements.
    Tracing through the application, it appears that most of these TTCItem objects are created when the statement is executed (not when it is prepared), therefore I would have assumed that they would be released when the ResultSet is closed, but this does not seem to be the case.
    We tend to have a large number of PreparedStatement objects in use (over 100, most with closed ResultSets) and find that our application uses huge amounts of memory compared to the same code closing each PreparedStatement at the same time as closing its ResultSet.
    Has anyone else found similar problems? If so, does anyone have a workaround, or know if this is something that Oracle is looking at fixing?
    Thanks
    Bruce Crosgrove
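    For what it's worth, the pattern that avoided the memory growth (closing each PreparedStatement together with its ResultSet instead of keeping the statement cached) looks roughly like the sketch below. This is only an illustrative pattern against the plain JDBC API, using pre-Java-7 explicit finally blocks to match the era; the table and column names are hypothetical.

        // Illustrative sketch: scope the PreparedStatement to the query instead
        // of keeping 100+ statements cached with only their ResultSets closed.
        import java.sql.Connection;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import java.sql.SQLException;

        public class StatementScope {
            public static String lookupName(Connection conn, int id) throws SQLException {
                PreparedStatement ps = conn.prepareStatement(
                        "SELECT name FROM employees WHERE id = ?");
                try {
                    ps.setInt(1, id);
                    ResultSet rs = ps.executeQuery();
                    try {
                        return rs.next() ? rs.getString(1) : null;
                    } finally {
                        rs.close(); // frees the ResultSet...
                    }
                } finally {
                    ps.close(); // ...and the statement's own buffers with it
                }
            }
        }

    The trade-off is re-preparing the statement on every call; if that proves too slow, a small bounded statement cache is the usual middle ground.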

    From your mail, it is not very clear:
    a) whether your session is an HttpSession or an application-defined session;
    b) what is meant by saying the JSP/servlet process is growing.
    However, some pointers:
    a) Are there any timeouts associated with the session? (See the sketch after the quoted message below.)
    b) Try to profile your code to see what is causing the memory leak.
    c) Are there references to stale data in your application code?
    Marilla Bax wrote:
    hi,
    we have some memory problems with the WebLogic Application Server 4.5.1 on Sun Solaris.
    In our customer projects we are working with EJBs; for each customer transaction we create a session to the WebLogic application server.
    Now there are some urgent problems with the Java process on the server. For each session, 200-500 KB of memory is allocated, and within a day the Java process on our server grows with each new session and never releases the memory reserved for the old sessions. As a workaround we now restart the server every night.
    How can we solve this problem? Is it a problem with the operating system, the application server, or the EJBs? Have you seen problems like this before?
    greetings from germany,
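    To make pointer (a) concrete: the usual way to bound per-session memory in a servlet container is to give each session an idle timeout and to invalidate it explicitly once the customer transaction completes, so abandoned sessions do not pile up at 200-500 KB each. Below is a minimal sketch against the standard servlet API; the servlet itself is hypothetical and nothing here is WebLogic-specific.

        // Illustrative only: bound session lifetime so abandoned sessions are
        // reclaimed instead of accumulating until the nightly restart.
        import java.io.IOException;
        import javax.servlet.http.HttpServlet;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;
        import javax.servlet.http.HttpSession;

        public class TransactionServlet extends HttpServlet {
            protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                    throws IOException {
                HttpSession session = req.getSession(true);
                // Expire the session after 30 idle minutes instead of never.
                session.setMaxInactiveInterval(30 * 60);

                // ... perform the customer transaction here ...

                // When the transaction is complete, release the session eagerly
                // rather than waiting for the timeout to reclaim it.
                session.invalidate();
                resp.getWriter().println("done");
            }
        }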

  • An error occurred when processing the spool request.  (Possibly due to memory problems)

    Hi SAP,
    My background job is failing with the log shown below:
    Job started
    Step 001 started (program ZMRS0065, variant SUS
    Step 002 started (program ZDRS0090, variant SUS
    Step 003 started (program ZFRS0305, variant SUS
    Step 004 started (program ZFRS0300, variant SUS
    Access error
    Job cancelled after system exception ERROR_MESSAGE
    An error occurred when processing the spool request.  (Possibly due to memory problems)
    Kindly help out with this issue.
    Regards
    Mohammed

    Hello Mohammed,
    Have you seen any errors in your SM21 system log or in the developer trace of the spool work process during that time?
    The maximum size when creating a spool request is 2GB. Please also check whether you are creating a spool larger than that.
    Best Regards
    Jie Bai

  • Create a view in SharePoint UI to paginate through data on a large list.

    Some of my users are facing a usability issue with SharePoint.
    The problem seems to be that if a list exceeds the list view threshold, SharePoint gives them the error but doesn't allow them to fix it.
    So when the list threshold is exceeded, the user cannot go into the list at all to create a view or to delete old records.
    They must call in the system administrator, who can open bigger lists, but in large companies system administrators are not easily accessible.
    How can users fix the problem themselves once the threshold has been exceeded?
    The second issue is that I want to create a view on a large list that shows 50 items at a time. I don't want to filter (like year = 2013 or age < 10); I want to reduce the number of records fetched by means of pagination. Is this possible on a large list? (I found many blogs on the web with people using the XSLT web part and DVWP, but none of them work once the list has exceeded the threshold.) I want a very simple view on the list which takes only 50 items at a time and works with large lists.
    val it: unit=()

    In order to handle large lists (over 5000 items) you need to select the fields that you want to show in the view and set them as indexed columns; you can do this through List Settings -> Indexed Columns. It might be a problem on an already large list, but on a small one it's pretty quick.
    And to create a view and set the pagination there, use the create-view page of the list itself, not a Data View web part or another tool.
    I have tested it in a project of mine on a list of 15000 records and it works, although you cannot filter and sort on the view itself, only in the view definition.
    If the list is bigger than 20000 items, it is most desirable not to use SharePoint.
    Shlomy

  • InDesign CS4 memory problems

    I have had memory problems with InDesign CS4 (and also with Photoshop CS4) for about a year now. If my file has a lot of pictures (100+), at some stage I get the error message "Out of memory". In the beginning it just has problems redrawing pictures (half of the picture stays black), but export to PDF is also halted by the same message, and printing to file as well. There are no problems at the beginning of large files, or with files containing text only. At one point InD just has enough and starts to behave like a ... The only solution is to export the InD CS4 file as INX and continue to work in CS3 (but it takes a long time to match the text flow because the text engine works differently).
    It's not the monitor, because I have changed monitors during this year (from a LaCie 22 blue IV to an EIZO 27"). It seems it's not the video card (NVIDIA GeForce 9800 GTX, 512 MB) either, because everything functions normally with CS3. I know it's not a proper video card for my work, but one likes to play Bioshock sometimes...
    Something strange happens with Photoshop CS4 as well when I work for a long time without shutting down the program. After editing about 60+ pictures, Photoshop CS4 begins to slow down (especially when using the clone tool), up to the point where it is impossible to work: you just have to wait while the cursor drags itself to the right place and the picture redraws itself. It seems memory cannot empty itself and gets overloaded when working with CS4. There are no problems with Photoshop CS3, and one doesn't notice anything with small files. I can work with CS3, but there are features I would like to use in CS4.
    Have anyone experienced anything similar?
    Claudius
    Win XP SP3
    4GB RAM
    video NVIDIA GeForce 9800 GTX 512MB with latest drivers
    CS4 and CS3

    Having exactly the same thing here: 16GB of RAM in an 8-core, 64-bit Vista system, and absolutely no infections on my system. I am having great trouble getting my work output.
    I'm making an overview of 50 cards with pictures in them, with bevels on the edges and drop shadows. It might be heavy, but it is a normal request from my client.
    I don't care if transparency is difficult for Adobe to handle; they shouldn't make it public if it is not properly tested. I am now so far as to make the composition in Photoshop (if that works).
    What I thought was the problem is that I originally placed InDesign files in frames. There is a great script out there on www.automatication.com to convert those into editable objects again; it works great, so check it out. But this didn't solve my problem. The effects used in these items are the memory consumers, and Adobe is not freeing the memory properly.
    Advice to Adobe: check the drop shadow feature too, because it is not scaling up or down with the rest of the effect.
    I am writing a real-time 3D render engine here; if CS5 is not working fine, I might dig into this one as well. I'm getting a bit tired of waiting for properly working software.
    Jaap Clarenburg
    [email protected]

  • Sound Problem Plus Memory Problem

    I've also posted this same issue at the EVGA website with no help whatsoever yet. Also, if it's OK, I'm including the above-mentioned memory problem, which is probably not related, but if anyone has good input I would love to hear it. Hope this doesn't break any rules; I didn't see anything in the listed rules thread. So here it goes.
    Ok, I'm having a few minor but very perplexing and annoying problems. I have recently built a new rig that includes the following:
    Windows 7 64-bit Home Edition
    EVGA P55 FTW Intel P55 Socket LGA1156
    Intel Core i5-750 2.66GHz 8M LGA1156 CPU
    Corsair Dominator 4GB DDR3 PC12800 (2 x 2GB sticks)
    Ultra LSP750 750W Power Supply
    Seagate 1TB Serial ATA HD 7200/32MB
    Cooler Master HAF 932 Full Tower Black Case
    Galaxy GeForce GTS 250 512MB PCIe x2 (SLI setup)
    BFG NVIDIA GeForce 9800 GT 1GB GDDR3 PCI Express Graphics Card (PhysX setup)
    PROBLEM ONE:
    This one confused the crap out of me, so there was a lot less I was actually able to try. I started by testing my sound. I'm running an optical cable from the onboard sound port to the back of my Sony surround sound system. I get sound from my left and right channels but nothing from the center, rears or sub. So I figure no biggie, I just need to change the configuration from 2-speaker stereo to 5.1 surround sound. Guess what... the option isn't there? WTF? A sound card with an optical port and no 5.1 support? So I went online, found the drivers for the sound on the motherboard and tried reinstalling, thinking maybe something was wrong with the drivers on the CD. Well, it still didn't work.
    So I figured whatever, I got an external sound card that I know works with Windows 7, since it was on my other desktop, and hooked it up. It's a Creative Sound Blaster X-Fi Surround 5.1 External USB Sound Card. Well, it installs fine and I go to test it. Yep, you guessed it! Still only sound from the left channel and the right channel, with no option for surround sound. I was able to find updated drivers for that and ran 'em. Well, now I have an option for surround sound with side speakers only, no rears. Oh, but it doesn't really matter, because the left and right channels are still the only ones able to produce sound. And yes, the speakers work: I turned on my Xbox 360, hooked up to the same speakers and LCD TV, and played COD4 with full sound working fine. So something else has to be happening; I just have no idea what.
    PROBLEM TWO:
    The second problem was with the memory. Everything booted fine, but it initially only detected 2GB. By the way, I had the sticks installed in slots 1 and 3. So I shut down, popped 'em out, checked the copper connectors (everything looked fine) and reseated the chips.
    This time during boot-up I checked the BIOS. It saw all 4GB of memory. Continued into Windows. Windows now saw all 4GB, but only 1.99GB was available. Now, the only other time I've seen anything like this was with onboard video sharing the memory, and even then it wasn't half of the system's total memory. I then tried shutting down and putting the sticks in slots 2 and 4 just to see what would happen. Well, I got an EA post error on the LED and couldn't even boot. So I figured oh well, powered down completely and tried to run the memory in slots 2 and 3 to take it out of dual channel. I was able to boot but still had the same issue: 1.99GB available, and it still saw all 4GB. At this point it was just getting silly, so I figured OK, I'll download the new BIOS with my laptop and burn it to a CD. I booted the desktop and ran the BIOS flash. The update seemed to work fine; honestly the BIOS didn't look much different, and I didn't take note of the previous version before I did it, so I can only assume it worked. However, it did not fix my problem. So I'm still stuck at Windows seeing 4GB of memory with only 1.99GB available.
    Anyway... I'll keep trying to get this working and hunting around online, but I'm really hoping you all have some ideas. I'll be trying to call EVGA today for some help too, and possibly Corsair. Thanks in advance for the help!

    Hello,
    I cannot help with the sound problem.
    For the memory, I had almost the same thing, but with 4 sticks of 2GB and only 2 detected, on an ASUS P7P55D Deluxe.
    The BIOS was my issue, but I needed to do the following (with pairs of sticks instead of single sticks) after the BIOS update to finally get my 8GB under Windows:
    Clear the CMOS. Then boot the computer with only one stick of RAM in port 1.
    If everything is fine, turn off and add the other stick in the port of the same color for dual channel.
    If it does not boot with the 2 sticks in the ports of the right color, try switching ports again. Maybe one stick is dead?

  • Not Enough Memory Problem!

    Hi Guys,
    Often I get the following message when I open any AI file (you can see the screenshot). Once I click 'OK' and re-open the same file, it opens properly without any problem.
    For your extra information:
    Using Software: Adobe CS4
    Processor: 2.5 GHz PowerPC G5 (2x)
    Memory: 6GB DDR SDRAM
    If you guys know any solution or reason, please share it with me.
    Thanks in advance.
    Regards
    HARI

    Thanks for your reply, snunicycler.
    I have a 3.5MB file with small images (without PDF compatibility), and Photoshop is also running without any file open (just Photoshop itself). Suitcase is also running. Even then I am getting the "not enough memory" problem.
    FYI: Earlier, when I used CS3, I worked with much larger files without any problem. I am getting this error in CS4 only.
    So finally, I don't know what's happening in my system.
    HARI

  • A possible solution to FCP X memory problems - it is working for me so far

    I think I may have found a way to prevent FCP X from gobbling up all available memory (and bringing my Mac to its knees).
    The idea came to me from a post by Tony Reidsma  in this thread: https://discussions.apple.com/thread/3770230?start=0&tstart=0
    He pointed to this page regarding memory usage in FCP X and Motion: http://bradbell.tv/filmmaking/improving-ram-performance-in-final-cut-pro-x/
    The above page mentions a preference setting that exists in Motion but "not in FCP X": the setting for a "cache percentage".
    This is supposed to determine how much of the available memory Motion should retain for caching content (or so I gather).
    Sure enough, this preference is not there visibly in FCP X, but since FCP X and Motion share a common foundation, I thought it might be there anyway.
    So after looking into the plist files in XCode, I found the name of this preference setting and used the defaults command in Terminal.
    I have been throwing everything at it for a day now and it seems to be working well, so I thought I'd share.
    Here is the relevant defaults command (NOTE: use this in Terminal when FCP X is NOT running):
            defaults write com.apple.FinalCut OZPreferenceManager::CachePercentage -int 20
    (this seems to leave free about 20% of the RAM available when FCP X starts; I have experimented with different values, and apparently the higher the number, the more free memory is left untouched)
    I have had 15 applications running on my MacBook Pro (with 8GB RAM), for hours, with no memory problems (and no need to "purge"). FCP X and all the other applications have not shown any of the slowdowns typical of low-memory situations.
    NOTE: While I have not experienced any negative effects, I have no internal knowledge of the software and no guarantee that this is effective or harmless.
    One can revert the above setting by typing (or pasting) the following in Terminal (again, with FCP X NOT running):
        defaults delete com.apple.FinalCut OZPreferenceManager::CachePercentage
    If any of you try this and it does or does not work for you, I'd appreciate if you post your results back here.

    Hey Luis, and everyone else.
    I just tried your Terminal command, and it has completely cured the sluggishness in FCPX. Thank you, thank you, thank you!!
    I am working on a pretty large project, and now it is flowing like a champ. Even skimming is on point.
    The project is 60GB in size and I didn't create optimized media or use proxies, so it is working from the raw data, and boy, I must say it is flowing.
    Again, thank you, dude.
    PS: To you all out there: if you jump into a large project, I found that making compound clips out of compound clips helps the speed of the whole project. Example: I am breaking down scenes into compound clips, and portions of the scenes into more compound clips; that has created a fast workflow to go in and out of.
    Macbook Pro 13 2009,
    8GB ram / 750GB HDD
    FCPX 10.0.0

  • Mac Pro 6 core, D500, 256ssd 64gb loss of memory problems

    I can simply have the computer running with no apps other than Finder and Adobe CC cloud, and it burns up all RAM within an hour.
    Help?

    By far the things that cause the most issues are "just one little thing" that Users have added. These tend to cause trouble because they are badly-crafted or because they do things in violation of Developer Guidelines.
    Lots of seemingly innocent little things that users have added have been seen to cause serious problems in high-end Mac Pros. If you want trouble-free, remove all the non-Apple add-ons and just run straight, unmodified Mac OS X and Major Applications, no add-ons. Readers here could easily produce a list of more than 50 add-ons that have been seen in postings here to cause trouble.  When in doubt, throw it out.
    The anti-malware features built into Mac OS X work better than any so-called "anti-virus". Most of these commercial packages are simply worthless, but some also ruin performance, ruin memory utilization, or cause kernel panics without adding any additional value.
    Hardware problems do not cause memory leaks.
    The Mac Pro has Error Correcting Code memory, and its memory problems do not fester undetected. But you do need to check that all the memory you installed is still listed as present, because ANY problems detected at Startup can cause failing modules to be marked "absent".

  • I have Yosemite 16 gig mem and still have memory problems.

    I have the new Activity Monitor, which has that green line everyone is talking about. I used to have only 4GB and my computer was very slow; then I upgraded to 16GB, and at the beginning it ripped. I started using Firefox on a site that had a lot of large JPEGs. After some time things started to slow down again: when I would try to save a JPEG, it would take longer to see the "save image as" option appear, and after a while it got pretty slow, not as bad as before, but a lot slower than when I booted up.
    My memory checks out fine. HOWEVER, when I looked at Activity Monitor, the cache size just kept getting higher, at one point reaching 9GB. The cache size was slowly growing the longer I was online; the larger it grew, the slower things worked.
    I understand I am supposed to just look at the memory pressure, and yes, it stayed green the entire time. But the slowdown did happen at the same time the cache size grew.
    I also have a Windows 8.1 computer with only 6GB, and no matter how long I stay on the internet, even with 15 tabs open, I never experience a slowdown.
    I would think with 16GB and an i7 I would have some speed. I wish there was a way to limit the cache.
    I use the Mac most of the time and I wish they would fix the memory problem.
    What can I do?

    I use the Mac most of the time and i wish they would fix the memory problem.
    It's not a problem. It's working as designed. Memory management has been changed to take full advantage of increased RAM. It, by design, takes advantage of RAM reserves by holding things in RAM instead of constantly moving things in and out.
    What can i do ?
    Chill out.

  • Do modules cause memory problems?

    I have a big app that I decided to make modular. Everything seemed to be going along smoothly at first as I started adding modules. Now I have upwards of 40 modules, the app runs slow on the client side, and it takes forever for Flex Builder to build as well. I kept getting an out-of-memory error, so I upped my FlexBuilder.ini parameter to -Xmx1024M. I am no longer getting the memory error, and it does eventually build, but it takes a few minutes. I guess my question is: does the number of modules I have in an application have this big of an effect on building the application? I have 1.5GB of RAM and am now allocating upwards of 2/3 of it to Flex Builder if it needs it. Is there a better way, or do I just deal with it?

    If the modules are large it's possible that you're stuck with it, but that is a rare situation. More likely there are a couple of things you can do.
    First, do not load a module until you really need to. This I assume you already know. Inside each of the modules you will want to take the same approach: delay the creation of objects (UI or otherwise) until the application actually needs them.
    Second, you may have memory leaks. You might want to check out this article by Grant Skinner if you have not already:
    http://www.gskinner.com/blog/archives/2006/06/as3_resource_ma.html
    My guess is that you have loitering objects and runaway event listeners that all need to be stopped, closed down or de-referenced, or the garbage collector cannot collect them and eventually you run out of memory.
    If you have Flex 3 Beta 3, you can use the profiler to check for loitering objects and verify that when you close certain objects down, the memory returns to where it should be, or at least reasonably close.
    Either way, the articles from Grant Skinner are a good read.
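    To make the "runaway event listener" failure mode concrete, here is a toy illustration, written in Java rather than ActionScript since the mechanism (a long-lived dispatcher holding strong references to its listeners) is the same; all names here are invented for the example.

        // Illustrative "runaway listener" leak: the long-lived registry keeps
        // every Screen reachable, so the GC can never collect them unless
        // close() unregisters the listener.
        import java.util.ArrayList;
        import java.util.List;

        public class ListenerLeak {
            interface Listener { void onEvent(String event); }

            // Long-lived dispatcher, e.g. owned by the application shell.
            static final List<Listener> LISTENERS = new ArrayList<>();

            static class Screen implements Listener {
                private final byte[] heavyState = new byte[1000000]; // stand-in for UI state
                Screen() { LISTENERS.add(this); }        // registers itself...
                public void onEvent(String event) { /* react to the event */ }
                void close() { LISTENERS.remove(this); } // ...so close() MUST unregister
            }

            public static void main(String[] args) {
                for (int i = 0; i < 1000; i++) {
                    Screen s = new Screen();
                    s.close(); // comment this out to loiter ~1MB per screen
                }
                System.out.println("listeners still registered: " + LISTENERS.size());
            }
        }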

  • 875P NEO-FIS2R no boot with memory problems

    I just got a NEO FIS2R board, a P4 2.6C, and two 512MB sticks of OCZ PC3200 Premier DDR. I installed the board and connected only my hard drive, video card (GeForce4 Ti4200), both sticks of memory, and CPU (on a 300W PSU). At first, the board would eventually make it to the "booting OS" stage, after spending a little time on "initializing video" (normal) and "assigning ISA" or something like that (I don't know what that is, since the board has no ISA slot). Unfortunately, although my monitor's light changed from yellow to green, meaning it was getting a signal, the screen remained blank. I gave it a few minutes and finally turned off the computer. After reseating the video card and memory and disconnecting the hard drive, the same problem occurred.
    Finally, I took out one of the sticks. After this, the computer hung on that ISA thing for about ten seconds, then went back to "detecting memory" and gave a 2-1 error beep. I switched between the two OCZ sticks and also two sticks of slow generic PC2100 memory that I had in my old computer and know work fine. Eventually the computer stopped pausing at the ISA stage and went to the memory detection after only a second or so. Since the video was obviously acting strange, I disconnected the video card so the computer contained only CPU and memory. The boot-up sequence was the same, except that in place of the single beep during video initialization, it gave me a 2-8 error code. Finally, I took the board out of my case, laid it flat on a wooden table and tried again. Still no boot. I also cleared CMOS several times throughout this procedure, so that shouldn't be the problem.
    Can anyone figure out what's going on here? It's especially confusing to me that the computer booted all the way (sans video) at first and then started having memory problems, and that the system passes the initial memory test but then goes back to it after that ISA thing (this is all according to the D-Bracket).

    Quote
    On the compatibility list, it says
    "Kingston KVR400X64C25/512
    W942508BH-5 (Winbond)"
    Since the memory I am looking at is CL2, does that mean it might not be compatible after all?
    If you're wondering whether the Kingston HyperX RAM you're looking at is the same as the Kingston RAM you found on the compatibility list, the answer is NO. The RAM from the list that you're referring to has a model number that starts with "KVR", which stands for "Kingston Value RAM". HyperX RAM has model numbers that start with the letters "KHX".
    You might also note that there is no listing for any of the HyperX stuff on that compatibility chart. However, I have seen posts from several people on this forum who seem to be using HyperX RAM with no problem. I suspect that MSI hasn't updated that list in a while, because there are no listings for many of the newer PC3500, PC3700, and PC4000 RAM models.
    I personally went with 512MB (2x256) of OCZ PC3200EL DDR RAM; it is listed as 'OCZ (PN- OCZ400256EL ) PN- OCZ400256EL' on the compatibility chart. I had plenty of problems trying to run a set of Corsair TwinX 3200PCT ver. 1.2 on my 875P NEO board, and I settled them by switching over to the OCZ RAM only because it was listed. While it wasn't my first choice, I'm pretty happy with it because so far it's been running stable at CAS 2,2,3,6 with the voltage at 2.65.

  • Memory problem, please help immediately

    Hey guys, I've got an iPad 3 (16GB). I had a look at my memory: I have a few games and apps (GarageBand, Keynote, iMovie, Pages, Monopoly, Game of Life and a few others) and it said 12.2GB used and 1.2GB available. I think that's counted wrong; I know approx 1000MB is 1GB, so that seems impossible. What do I do to get it accurate? Even my brain holds more memory than that! Please help ASAP. Also, I've already tried restarting it.
    Here is a list of apps:
    GarageBand 696MB
    iMovie 531MB
    iBooks 366MB
    Keynote 362MB
    Numbers 336MB
    Pages 311MB
    And 52 apps are 150MB and under.
    I know that I had the same number of apps yesterday and the memory was 8.2GB. I want it to be accurate.
    Reply ASAP.

    Centipedee, there's no memory problem; you don't have enough room. It can take at least 2 times the required memory to load an app onto the iPad. Once it loads, it only requires 1.8GB of memory space, but to get it there you'll need at least (maybe more than) 3.6GB of space. I know it sounds screwy, but it's how it works. Hope this helps. Good luck.

  • KT4VL CPU and Memory problems

    I have a KT4VL that I updated to BIOS version 1.60 with the Live Update feature (I really like that, by the way). Anyway, I have two problems with the board. I have an AMD Athlon XP 1700+ 1.47GHz/266 CPU. The manual and your website say I should change the CPU FSB clock under Frequency/Voltage Control to 133 for that CPU. When I do, the system locks (it corrupted a file and forced a reinstall of XP, which was a pain). I've repeated that, and when I change it back to 100MHz, it doesn't lock. The website says to change the CPU ratio to 11 with the 133 setting, which I did, but the system still locks/reboots suddenly. I even tried changing the Spread Spectrum setting from the 25% default to none, but it still crashes.
    The second issue is that I have a PC3200 DDR 256MB DIMM (Samsung K4H560838D-TCC4) that is specifically listed in the manual as a recommended DIMM. However, if I put it in as the only DIMM in the system, it doesn't POST or boot, and it gives 3 long beeps. The D-Bracket indicates that it's a memory problem. I exchanged the DIMM for a new one to make sure I just didn't get a bad stick, but the new one behaves the same way. When I put it in together with a PC2700 DIMM I have, it doesn't boot or beep; it just sits there.
    What BIOS settings do I need to support my CPU and memory correctly?

    Bas,
    Ok, I'll try 1.7. Do you think that will fix the memory problem?
    The power supply is an AMD-approved 350W power supply, so I don't think that's it. I'll double-check the cooling setup (it's a Volcano, so it oughta be good), but cooling could definitely be it.
    The only thing I need you to verify for me is the BIOS settings. Will the AUTO setting work for the PC3200, or do I need to specify it? And for the CPU, the FSB should be 133 and the ratio 11? Any other changes? Should I mess with the spread spectrum settings?
    I guess I'd like to verify what my BIOS settings should be to support my CPU and memory.
    Lastly, will I be able to use both my PC3200 and my PC2700 DIMMs at the same time?
    Thanks so much for your time and help.
    -james
