Eclipse vs. Netbeans for large projects

I've been getting a bit fed up with NetBeans this last year, to tell the truth. The main bugbear is all the "project scanning" that goes on whenever you make the slightest change to libraries and the like. It also has some very nasty habits: I'll be typing a method name and the auto-completion suddenly substitutes a completely different method, presumably based on what I was typing a second ago, and I'm still having debugger problems on my largest program.
And the metadata cache all this project scanning produces is unreliable; about once a week I have to delete the whole thing and let it be rebuilt. Several releases have gone by without these fundamentals being fixed, while the developers have more fun adding support for the next language I'm never likely to use.
Anyway, enough ranting. The question in my mind is: should I finally give up and switch to Eclipse, or am I going to find it has its own problems that are just as annoying? So this question is addressed to people who have used both on large projects. Is the grass really greener on the other side?
I'm also irritated that there's no open source plug-in for Matisse. I have several forms created under NetBeans. If we're going to MyEclipse, does that mean I should consider the other commercial Java IDEs too?

Malcolmmc,
Through my career I've had to support users with issues just like the ones you describe, only with Product X. I'm not saying this is necessarily what's ailing you--I haven't seen anything about what you are doing or running--but here are some of my findings:
1 - (and this is the #1 cause I've run across for this type of thing) How much stuff are you running in memory at startup? Those little icons down in the tray represent programs running in memory, each taking up a finite and distinct part of your available memory, and some of them stream information or access the internet and other resources frequently. I once answered a ticket from a person who had a huge box for the time, and he had 30--yes, that is right, 30--programs loading at startup. His complaint was that the machine took a long time to start and then was sluggish. I unloaded everything and had him run his apps--amazingly, he was delighted with the speed and wanted to know what I had done. I then had to tell him the bad news: he couldn't monitor the stock exchange, the weather, up-to-the-minute NASCAR happenings, and so on. He didn't like it, but it fixed his problem. I later went back to find he had 20 of them reloaded--20 seemed to him a happy compromise between instant information and speed.
2 - Virtual memory--none or too small. I've found a few people that run without a swap file... it runs great until memory gets full, then bad things happen.
3 - Other programs competing for resources. I see this all the time: one program trying to do I/O and being blocked by another.
4 - Hardware issues. Often the hardware just wears out. For instance, a fan stops or slows down on a video card and the whole system slows down as the GPU overheats--not enough to shut everything off, but maybe just enough that your super fast system becomes a dog.
5 - Hardware conflicts--this is not nearly as frequent a problem as in the Standardization Era post '96, but it does still happen. Check your hardware manager and the drivers being used.
6 - Out of date driver issues... enough said there, we've all had them.
7 - HD fragmentation--this is not so common, but it still happens.
8 - Low memory (this should probably be moved up to #2). People still persist in undervaluing the worth of extra memory. No machine I've seen ever had too much RAM. RAM is cheap, RAM is good, and maxed-out RAM is best. (By the way, with 3 GB you shouldn't have problems running NetBeans; the quick heap check sketched just after this list shows how much of that the IDE's JVM is actually allowed to use.)
9 - Installation issues. Not everything installs properly, and when it doesn't, the failure isn't always reported.
10 - Configuration problems. Sometimes new features can be very annoying and can be turned off.
11 - Believe it or not, OSes are not perfect and they need to be tuned up from time to time. If you are running Windows, clean out your temporary files area--not just the Internet temporary files, but the system temporary files area. If you've been running heavily, I guarantee you have thousands of temporary files choking your performance as Windows tries to keep track of them. Delete them.
12 - Part of your program, or a resource your program uses, may have become corrupt. This happens more often than we like to admit.
There are tons of other things, but these are the big 12 I usually go looking for when I get a call saying computer problems are beating someone up.
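On the low-memory point, and since the NetBeans question is really about a Java program: the IDE can only use as much memory as its JVM heap ceiling allows, regardless of how much RAM is in the box. Below is a minimal sketch in plain Java (nothing NetBeans-specific; the class name is my own) that prints what the current JVM is allowed to use. Running it with the same heap option your IDE uses (typically an -Xmx value) shows whether the limit, rather than the machine, is the bottleneck.

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class HeapCheck {
    public static void main(String[] args) {
        // Heap figures as the JVM sees them, reported in megabytes.
        Runtime rt = Runtime.getRuntime();
        long mb = 1024L * 1024L;
        System.out.println("Max heap (-Xmx):       " + rt.maxMemory() / mb + " MB");
        System.out.println("Currently allocated:   " + rt.totalMemory() / mb + " MB");
        System.out.println("Free within allocated: " + rt.freeMemory() / mb + " MB");

        // The same information via the management API, plus non-heap usage
        // (class metadata and the like), which also grows in a long IDE session.
        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = mem.getHeapMemoryUsage();
        MemoryUsage nonHeap = mem.getNonHeapMemoryUsage();
        System.out.println("Heap used:     " + heap.getUsed() / mb + " MB of " + heap.getMax() / mb + " MB");
        System.out.println("Non-heap used: " + nonHeap.getUsed() / mb + " MB");
    }
}

If the maximum heap reported there is only a small fraction of your physical RAM, raising the IDE's heap setting is a much cheaper first experiment than new hardware.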

Similar Messages

  • Best practices code structure for large projects?

    Hi, I come from the Java world, where organizing your code is handled conveniently through packages. Is there an equivalent in Xcode/Objective-C? I'd rather not lump all my observers, entities, controllers, etc. in one place under "Classes"...or maybe it doesn't matter...
    If anyone could point me to a document outlining recommended guidelines I'd appreciate it.
    Thanks! Jon

    If you have a small project, you can set up Groups in Xcode to logically organize your files. Those Groups do not necessarily have to correspond to any directory structure. I have all my source files in one directory but organize them into Groups in Xcode.
    If you have a larger project, you can do the same thing, but with the code organized into actual directories. Groups can be defined to be relative to a particular directory.
    If you really do have a large project, you should organize things the same way as in Java. Your "packages" would just be libraries--either static or dynamic (a rough Java-side sketch of the analogy follows below).
    As far as official guidelines go, there really aren't any. It would be best to stick to the Cocoa Model-View-Controller architecture if that is the type of application you are working on. For other software, you can do it however you want, including following something like Sun's guidelines if you prefer.
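    As a rough Java-side sketch of that analogy (hypothetical package and class names, not taken from the original post): each package maps to a directory, much as an Xcode Group can be made relative to a folder, and a set of related packages is what you would ship as a library.
    // Hypothetical layout; the directories mirror the package names:
    //   src/com/example/player/model/Song.java
    //   src/com/example/player/controller/PlayerController.java
    //   src/com/example/player/view/PlayerView.java
    package com.example.player.model;

    public class Song {
        private final String title;

        public Song(String title) {
            this.title = title;
        }

        public String getTitle() {
            return title;
        }
    }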

  • I am trying to upload an iMovie to YouTube. It says the project is too large but I have permission for larger projects. Help and thanks.

    I am trying to upload an iMovie project to YouTube. It keeps saying "project too large" even though I have permission from YouTube for videos over 15 minutes. Please help. Thanks.

    Share the video as a QuickTime movie in whatever format you want to use; I usually create an mp4. Once that video is saved (I usually save to my desktop), go ahead and upload it from the YouTube site rather than going direct from iMovie. I suspect you will not have the same size issue.

  • Optimizing After Effects Performance/ Rendering Speed for large projects

    Problem:
    My After Effects is extremely slow and I want to use an external hard drive to increase the speed temporarily until I can afford to change the internal hard drive. I also have other video production software like Cinema 4D, Final Cut Pro, Premiere Pro, Cheetah 3D, etc.
    MacBook Pro with the following specs:
    Processor: 2.4 GHz Intel Core 2 Duo
    Memory: 8 GB 1067 MHz DDR3
    Graphics: NVIDIA GeForce 320M 256 MB (available space approx. 91 GB)
    Software: OS X 10.8.5 (12F45)
    Action:
    I have set my Disk Cache to use the 500 GB G-Tech FireWire external hard drive in After Effects Preferences.
    Issue: After Effects is still slow.
    Recommendation from expert: change the internal hard drive to a 512 GB SSD.
    Question: How can I optimize After Effects with the hard drive space I have right now?

    Thank you. As I stated earlier, I am new to After Effects and video editing in general, so this is a learning process for me. I have also realized that I have so much going on in the layer that I need to clean up and pre-render, as you said. Also, one of the biggest mistakes I am making is with the "composed individual lyric effect" I created. As I said earlier, I masked the lyrics to an audio spectrum and chose the mp3 song in order to make the effect active, but since I am working on each lyric composition individually, I did not trim the yellow work-area bar to the exact part of the song that lyric applies to. That leaves me with 100+ words, each having the full length of the song in its individual composition in the master layer, in addition to the spin FX effect that I have not pre-rendered. After Effects is trying to render all of these together, basically. The extremely slow rendering time arose when I combined all the individual lyric effects into the layer while also implementing typography tricks in the same layer.
    Basically, my question is: should I reduce the work area in each lyric composition to the exact section of the song that applies, and will this improve the overall rendering time in the master layer and solve my problem?
    Also, should I perform my typography work (position, scaling, rotation and so on) in the individual layer/composition of each lyric effect I made, in order to further save rendering time in the master layer?

  • What is the best way to share for large projection?

    I would like to embed my movie as part of a PowerPoint presentation to be shown on a Windows machine (my only option). What is the best way to share it so that it doesn't look grainy or pixelated?

    Foolproof? Does Microsoft make PowerPoint?
    LOL... but seriously
    You could export it out to a Windows Media file.
    http://www.Flip4Mac.com

  • How to use Source Code Control for Large Application?

    Hi, All!
    I would like to collect knowledge about "best practice" examples for using source code control and project organization for a relatively large application (let's say approx. 1000 SubVIs).
    Tools used:
    LabVIEW 8.0
    CVS Server
    PushOK CVS Proxy Client
    WinCVS
    With LabVIEW 8 we can organize a large project pretty well. This is described in the article Managing Large Applications with the LabVIEW Project.
    I have read this article too: Using Source Control Software with LabVIEW. In that article Source Safe is used, but with PushOK everything looks nearly the same and works (some tricks for the compare function are required).
    Example: two developers working together on the same project. Internally the project is modular, so one developer will work on module "Analysis" and another one on "Configuration" without interference. These modules are placed into subfolders as shown in the example above.
    Scenario 1:
    Developer A starts with a modification of module "Analysis". Some files are checked out. He would like to add some SubVIs, so he must also check out the project file (*.lvproj), otherwise he cannot add anything to the project structure.
    Developer B at the same time would like to add some new functions to module "Configuration". He also needs to check out the project file, but this file is already checked out by Developer A (and locked). So he must wait until the lvproj file is checked in. Another option is to mark *.lvproj files as text files in PushOK, but then one of the developers will get a conflict message on check-in and merging will be necessary. This situation will come up very often, because in most cases the *.lvproj file will be checked out all the time.
    Question: Which practice is better for such a situation? Are libraries better than folders for a large project?
    Scenario 2:
    Developer C joins the team. First, he must get the complete project code to get started (or maybe at least the code of the one library assigned to him).
    Question: How can this be done within the LabVIEW IDE? Or should WinCVS (or another SCC UI) be used for the initial checkout?
    Scenario 3:
    Developer D is responsible for the build. Developers A, B and C have added a lot of files to modules "Analysis", "Configuration" and "FileIO". For building he needs to get the complete code. If our project is split into folders, he should get the latest *.lvproj first, then the newly added SubVIs will appear in Project Explorer, then he should expand the tree, select all SubVIs and get the latest versions of all of them. If the project is organized in libraries, he must do the same for each library, mustn't he?
    Question: Is this the "normal way", or should WinCVS be used for this? In WinCVS it's possible with two mouse clicks, but I would prefer to get all the code from CVS within the LabVIEW IDE recursively...
    That was a long post... So, if you are already working with LabVIEW 8 with SCC on a large project, please post your knowledge here about project structure (folders or libraries) and best practices; it may be helpful and useful for all of us. Any examples/use cases/links etc. are appreciated.
    Thank you,
    Andrey

    Regarding your scenarios:
    1. Using your example, let's say both developers checked out version 3 of the project file. Assuming that there are only files under the directories in the example project, when Developer A checks in his version of the project, there will be new files in one section of the project, separate from where Developer B is working. Developer B notices that there is now a version 4 of the project. He needs to resolve the changes, so he will merge his changes into the latest version of the project file. Since the project file is a text file, that is easy to do. Where an issue arises is that after Developer B checks in his merged changes, there is a revision 5. When Developers A and B go to make another change, they get the latest version, which will have the merged changes to the project file but not the referenced files from both Developer A and B. So when A opens version 5, he sees that he is missing the files that B checked in, and vice versa. Here is where the developers will need to manually use the source control client and, external to LabVIEW, get those new files.
    Where libraries help with the above scenario is that a library is a separate file from the project, so changes made to it outside the project do not require the project to be modified. So this time the developers are using a single project again, which this time references two libraries. The developers check out the libraries, make changes to them, and then check those changes in. So when each developer opens the project file, since it references the libraries, the changes to the libraries will be reflected. There is still the issue of the new files not automatically coming down when the latest version of the library is obtained; again, the developers will need to manually use the source control client and, external to LabVIEW, get those new files. In general, you should take advantage of the modularity that libraries provide.
    2. As noted in the above scenario, there is no intrinsic mechanism to get all files referenced by a LabVIEW project. Files that are missing will be noted. The developer will then have to use the source control provider's IDE to get the initial contents of the project (or library).
    3. See above scenarios.
    George M
    National Instruments

  • SharePoint Library for Large Amounts of Engineering Data

    We are currently using traditional project directory folders for large projects, sometimes with tens of thousands of documents. We are planning to migrate the data to SharePoint, and the path forward is unclear.
    Initially it was recommended to use a library, not numerous folders, to contain the data so that searching of the data is improved. That sounded great. The 1st project used to pilot this for other projects is divided into 20 different modification packages. A library category was created for MODS with selectable options of the 20 mod package names and “No Defined” (the default value). Some data items are shared between more than one MOD, so this category can have more than one assignment.
    When we looked at the directory structure in place we found no consistency in folder names and no consistency in directory structure. Many folders have 5 or 6 (or more) levels of subdirectories. Ideally we want no more than 4 or 5 categories of metadata to define all data. Mapping from chaos into a comparatively small number of categories is daunting.
    When searching this forum I find that libraries should be limited to 2,000 items. There are tens of thousands of items in our pilot project. Surely someone somewhere has encountered this organizational problem. I could use some advice from someone who has been there before.

    John,
    The limit of 2,000 is not a hard limit; the actual number of items you can store in a list is 30,000,000. However, more items will have an impact on performance, on rendering and on locks on the SQL table.
    Also, the limit you have mentioned (2,000) is the list view threshold limit, and it is actually 5,000.
    One important distinction: boundaries are hard limits which you cannot exceed, while supported limits are based on testing and can be exceeded, though doing so may cause issues.
    That being said, I would suggest you check out this link on
    SharePoint Server 2010 capacity management: Software boundaries and limits
    http://technet.microsoft.com/en-us/library/cc262787(v=office.14).aspx
    and explore other ways of optimizing your list
    Here are some references that will help you optimize:
    http://office.microsoft.com/en-us/sharepoint-foundation-help/manage-lists-and-libraries-with-many-items-HA010377496.aspx
    http://technet.microsoft.com/en-us/library/cc262813(v=office.14).aspx
    http://office.microsoft.com/en-us/sharepoint-server-help/sharepoint-lists-v-techniques-for-managing-large-lists-RZ101874361.aspx
    Hope this helps!
    Ram - SharePoint Architect
    Blog - http://www.SharePointDeveloper.in
    Please vote or mark your question answered, if my reply helps you

  • Help in deciding the GUI for my project..

    Hi,
    Earlier in this forum some of you gave very useful replies to the same question. My doubt now is which IDE I should use for my project (a project scheduler in a client/server environment). I first decided on Eclipse, and then for my project's GUI I thought of using Swing, but after reading some threads in other forums I understood that if Eclipse is being used then you can't use Swing for the GUI--am I right? Please also tell me which is the fastest way to build a GUI. For my project I also want to add some Gantt charts, so my GUI should be compatible with that... any suggestions?

    kavi_g wrote:
    hey, thank you all for your replies, that was really helpful. I can now continue with my project without any confusion.
    "As for J2EE, if you're making an enterprise app, might it not make more sense to use servlets or jsp to make a web-based GUI, instead of making a desktop app? - Adam"
    Do you mean to tell me you can use Swing only to build desktop applications? Using servlets and JSP with embedded HTML for the GUI would be fine, but it won't have the rich GUI that my project needs. Correct me if I am wrong...
    I would say that you are incorrect on both counts. You certainly can use Swing for web-based applications, if you want to make an applet and embed it in the web page. I have done that before, when the situation called for it. However, when it's feasible to use JSP, I would say that JSP is the better option (see the small Swing sketch below).
    Also, the capability for rich, highly interactive GUIs in JavaScript has always been there. It's just taken a long time for people to realize its potential. AJAX is a good technology to look into, and there's tons of open-source material on the web for creating GUI widgets above and beyond those provided by HTML. It's really a question of your relative skill in Java / JavaScript, and how much time you have.
    - Adam
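    To make that concrete, here is a minimal, self-contained Swing sketch (the class name is my own, not from the thread). It compiles and runs the same whether you edit it in Eclipse, NetBeans, or a plain text editor, because Swing ships with the standard JDK and is not tied to any IDE:
    import java.awt.BorderLayout;
    import java.awt.event.ActionEvent;
    import java.awt.event.ActionListener;
    import javax.swing.JButton;
    import javax.swing.JFrame;
    import javax.swing.JLabel;
    import javax.swing.SwingUtilities;

    public class HelloSwing {
        public static void main(String[] args) {
            // Build and show the UI on the Event Dispatch Thread, as Swing requires.
            SwingUtilities.invokeLater(new Runnable() {
                public void run() {
                    JFrame frame = new JFrame("Hello from any IDE");
                    frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                    frame.setLayout(new BorderLayout());

                    final JLabel label = new JLabel("Swing works regardless of the IDE you use.");
                    JButton button = new JButton("Click me");
                    button.addActionListener(new ActionListener() {
                        public void actionPerformed(ActionEvent e) {
                            label.setText("Button clicked.");
                        }
                    });

                    frame.add(label, BorderLayout.CENTER);
                    frame.add(button, BorderLayout.SOUTH);
                    frame.pack();
                    frame.setLocationRelativeTo(null); // centre the window on screen
                    frame.setVisible(true);
                }
            });
        }
    }
    An applet for embedding in a web page works the same way; the choice of IDE only affects how comfortable the editing experience is, not which GUI toolkit you can target.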

  • Export format for clips to be used in larger project.

    Back in the PE2 days, I used to capture and trim a clip and "export movie", which resulted in a DV-AVI file. This is what I understood PE "liked" and the best format to use for import into a larger project (and it worked nicely every time). Now, in PE11, I don't see that option. This method is handy not only for trimming stuff you don't want, but also for naming the clips to help with content identification. What option should I use that preserves quality and cooperates with PE best?

    BobSomr wrote: http://www.lynda.com/Premiere-Elements-tutorials/Up-Running-Premiere-Elements-11/109763-2.html
    Hey Steve,
    Will there be a PrE11/PSE11 book?
    Bob
    http://www.amazon.com/Muvipix-Guide-Premiere-Elements-version/dp/1479311200/ref=sr_1_2?s=books&ie=UTF8&qid=1358603413&sr=1-2&keywords=muvipix
    http://www.amazon.com/Muvipix-com-Guide-Photoshop-Elements-Premiere/dp/1480209392/ref=sr_1_10?s=books&ie=UTF8&qid=1358603413&sr=1-10&keywords=muvipix
    For less than the cost of the book, watch Steve in a 3-hour course here: http://www.lynda.com/Premiere-Elements-tutorials/Up-Running-Premiere-Elements-11/109763-2.html

  • Best practices for large ADF projects?

    I've heard mention (for example, ADF Large Projects) of documentation about dealing with large ADF projects. Where exactly is this documentation? I'm interested in questions like whether Fusion web applications can have more than one ViewController project (with different names, of course), more than one Model project, the best way to break up applications for ease of maintenance, etc. Thanks.
    Mark

    I'd like to mention a few things:
    Better to have Unix machines for your development.
    Have at least 3 GB of RAM on Windows machines.
    Create all your commonly used LOVs & VOs first.
    If you use web services extensively, create them as a separate app.
    Make use of popups; they're very user friendly and fast too, and you don't need to deal with the browser back button.
    If you want to use a common page template, create it at the beginning. It's very difficult to apply one later, after you have developed the pages.
    Use declarative components for commonly used forms like address, etc.
    Search the forum; you will see a couple of good util classes.
    When you check in the code, watch out: some of the files, like connections.xml, don't show up in JDev.
    Make use of this forum; you will get answers immediately from great experts.
    http://www.oracle.com/technology/products/jdev/collateral/4gl/papers/Introduction_Best_Practices.pdf

  • Advice on best workflow for large Motion project

    I am a part-time video editor/designer/motion graphics creator/etc. Normally, I work on projects with pieces no longer than 5 minutes, even if the projects themselves might be 30-40 minutes of total material--mostly video support for conferences and awards shows.
    Right now I am embarking upon a much larger project--10 30-minute segments, each of which is 100% motion graphics. They all involve a speaker against a green screen for the entire segment, with the motion graphics keyed in front of and behind him.
    We recorded this directly to hard drive in a studio that had a VT4 (Video Toaster) system, so the best Mac-compatible codec they could provide me for clean green-screening was full-resolution component video. This is giving me great keys, but I also have about 500 GB of raw footage.
    In this project, I need to first edit all the takes from each episode into a clean 30-minute piece, and then add the motion graphics. And this is where my question comes in. It seems to me FCP is much better for editing the raw video, but that Motion is where I want to do just about everything else. I need to somehow bring the video into Motion, because I want to create "real" shadows against my background from my keyed footage.
    When working with a long project, and with a full-resolution codec, what is my smartest workflow? I am trying to spend the least time possible rendering back and forth, and also avoid generating huge in-between files each step of the way. It seems that any way to approach it has plusses and minuses, so I want to hear from people who have been there which path gets me to my goal with the least hassle.

    "I need to somehow bring the video into Motion, because I want to create 'real' shadows against my background from my keyed footage."
    Real shadows are only faked in Motion. You have many options, including a simple drop shadow, or a copy of your matte layer filled with a gradient and a gradient blur applied with a distortion filter so it appears to be projected onto the wall. Be sure to take the time to make this a template effect and to keyframe the shadow angle if the foreground subject moves.
    "When working with a long project, and with a full-resolution codec, what is my smartest workflow? I am trying to spend the least time possible rendering back and forth, and also avoid generating huge in-between files each step of the way. It seems that any way to approach it has plusses and minuses, so I want to hear from people who have been there which path gets me to my goal with the least hassle."
    Well, you've got two conflicting interests. One, you have to sync the Motion work with the video of the keyed speaker and, two, you have to edit. But it seems to me that your planning must include lots of design work up front: media you can re-use or modify slightly, text formatting that can be precomped, a large stock of effects you will apply over and over again. Do all of this stuff first.
    You also want to explore working at lower rez through your planning and roughing stages. For instance, there's no reason to pull a full-rez copy of your foreground into Motion if all you need to do is sync to his audio and get rough positioning. You can put him over black, export all of his clips using any medium- to low-rez codec at reduced frame rates, and just use the Screen blend mode to drop him roughly onto your Motion projects.
    You'll get lots of advice over the next few days. If you're posting to other Motion or motion graphics forums, please do us all a favor and return someday to all of your threads and tell us what you did and what you learned.
    bogiesan

  • Eclipse or Netbeans?

    First, I want to say: I don't want to start a debate, and what I want is only some advice.
    Then, I think I should tell you something about my situation. Last month I decided to hop from a large company to a comparatively small one which I thought could guarantee a better future, but now I think I may have made a mistake... they want me to handle the whole Java team without even the basic programming tools such as Sun Java Studio or JBuilder. They told me this year's budget did not allow buying that software, and that they would buy it next January.
    What can I do? I cannot leave 23 computers without a programming tool...
    So which should I choose? Eclipse or NetBeans or...?

    "and u are right, one cannot use one's favorite IDE is very bad"
    That wasn't quite my point. It is bad--think of network effects (people being able to help each other) and decreased "tool maintenance time" when everyone uses the same IDE. It is something that needs to be introduced sooner or later, but it just sounded like you had too much trouble at hand as it already is. Changing the environment and switching IDEs at the same time might decrease productivity too much.
    Introduce the processes, let people get accustomed to them, and then integrate them with an IDE--with useful plug-ins available for the standard-IDE-to-be, they might even voluntarily give up their own choice in favor of the standard if they see that it integrates better into the process.
    Like UltraEdit users giving it up in favor of direct groupware and repository access... I saw it happen.

  • Large project in Premiere CS3 causes Windows XP64 to reboot

    Ok... I have a rather sizeable project going in Premiere CS3. It contains a rather large number of animations created in Autodesk Maya (most between 10 seconds and 3 minutes long) and a large number of stills, as well as a scrolling background (a .mov file I created using Photoshop and Premiere) and a soundtrack created by myself (in Cakewalk/Sonar). The total project is about 18 1/2 minutes in length... around 8 video tracks and 3 audio tracks at the moment. The main problem is that when I start working on it--either importing files, and especially moving files around in the project, or even just hitting the playback button--the system just reboots. I have seen the infamous Windows "blue screen of death" flash briefly once or twice, but it doesn't hang... it goes right to the system BIOS screen. The ONLY thing that seems to help is basically saving the project about every 30 seconds. Import a file, save, conform the file to widescreen, save, bring the file into the project, save, move the file around a bit, save, add a crossfade, save... you get the idea.
    Now before I go too much further here, I suspect the problem isn't actually with Adobe/Premiere, as I had this happen a few times while I was creating the soundtrack in Sonar 4 as well. The soundtrack is basically a pseudo-orchestral piece and is equally around 18 1/2 minutes long... currently with around 42 tracks of audio and 5 MIDI tracks. The audio has been mixed down to a standard stereo wav file for the import into Premiere. And yes, as the audio portion of this project got bigger and bigger, it caused a few reboots as well (not to mention drop-outs, stutters and all the other fun stuff associated with audio production). With that, I was able to "compact" the Sonar file (basically a Sonar project defrag) and the reboot problem seems to have gone away on that end.
    The system itself SHOULD be able to handle this project... not top of the line, but pretty beefy: 2nd generation Intel i5 quad core, 16 gigs of Kingston DR-1600 RAM, a Tascam US-428 for the audio and a good 2+ terabytes across 5 hard drives (3 internal 3 Gb-per-second SATAs, 2 USB). It also has a 4 GB HIS video card with an ATI chipset (still can't afford a FirePro yet). While Premiere is installed on the (comparatively dinky) 160 GB C drive, I'm using the 500 GB internal E drive and a 500 GB USB drive as my work drives. Most of the Maya animations are on the internal E drive, but the rest--the stills and Premiere files and such--are on the USB drive. I.e., the system itself should have plenty of rendering power for this. Also, I have replaced the C drive with a nearly identical twin. After a recent crash, I finally broke down and got a backup C drive... in other words, I'm sure the drive(s) are a-ok... I even put a brand new 500-watt power supply in just in case.
    I have already re-installed XP64... a couple of times, in fact, as just before I started this project one of those god-forsaken MS "critical updates" crashed the crap out of my computer. Then on the first reinstall, something got hosed in the .NET framework during the "night of a thousand updates", so I had to scrap it and start the reinstall all over again. Likewise, I reinstalled Premiere CS3 (along with everything else, including Maya) once I finally had the computer running again.
    A quick note here: I did create my own custom 1080 Hi-Def widescreen settings for Premiere, since CS3 didn't appear to have any such thing. For short projects they seem to work just fine... although admittedly, the render time is a serious b*tch. Likewise, all the Maya animations have been rendered out at 1080 widescreen, production quality (yeah... the render time there was a killer too).
    Another quick note: While I'm trying to use Premiere exclusively while I'm working, on occasion I have had to open either Photoshop or Maya to deal with a couple of stills... and when I open something else, I get a "Windows virtual memory too low" warning (16 gigs of RAM and the virtual memory is too low??). Otherwise, except for Avira (virus scanner), I don't really have anything else running in memory... no screen savers, no desktop pictures, no Instant Messenger crap, etc. I never cared for bells and whistles and such, so the system is pretty clean in that regard (and I do keep up on my defrags as well).
    Finally, while I -am- grateful for any helpful suggestions regarding this, please do NOT simply tell me to upgrade something (Premiere, Windows, etc.), because point blank: it ain't gonna happen. Aside from the fact that Adobe seems to have made Premiere CS4 and up much more difficult to use (I have -many- niggles about the newer interfaces... I'm using CS5.5 at the college and it's REALLY annoying), and aside from my wife being an MS programmer/developer and having to listen to her complain about Windows 7 and 8, we're in the process of buying a new house, so any and all "disposable income" is nil... and will be for quite some time. No money... zero... nada... i.e. no upgrades. Further, while granted this is certainly the biggest project I've done to date, I've run this setup/configuration for at least the past 3 years now without a hitch... this all seems to have started with that last MS critical update (at the risk of sounding paranoid, I'm really starting to think they slid a nasty in that last one to FORCE people to upgrade, since they're discontinuing XP64 support soon).
    Alrighty... I know that's a lot, but I wanted to provide as many details as I could here in hopes that someone can help me get a handle on this reboot issue. It's getting to be REALLY annoying having to save every 30 seconds or so. I have searched the help files, but got tired of randomly sorting through all the blather and not finding anything helpful.
    I'm grateful for any help here...thanks!
    Jim

    Well, I'm about 98% sure it isn't a hardware problem. I had already pulled the RAM and tried some backup RAM I have sitting in a box (8 gigs of Corsair), went through the hard drives weeks ago with Chkdsk and everything was a-ok there (a couple of bad clusters on the old C drive, but that has since been replaced and is on the shelf as a backup), the motherboard seems to be fine, I re-seated all cards and cables, etc. Again, I even slapped a new 500-watt PS in just in case (the old one was a 400 watt, and with all the crap I have on this system, I thought it might have been a voltage drop-off issue... nada). I did have a bit of a heat problem a while back... one of the case cooling fans was going bad and the CPU was getting just hot enough to trip the system (about 5 degrees over). That said, that also shows up in the BIOS monitor and was easy enough to track down. Either way, the bad cooling fan has been replaced and I even added a "squirrel cage" style fan as well... the system is running nice and cool now.
    BTW... I used to be A+ certified on both PCs and Macs when I was a hardware tech (not to mention HP, IBM and Lexmark certified, Cat 5 installer, etc.), so I like to think I know my hardware fairly well. Not just a weekend warrior on that front... I used to do it for a living.
    I had also already tried manually resizing the paging file... tried different sizes, tried spreading it out across the drives, etc. It didn't really make a difference either way, and again the only time the "virtual memory low" warning comes up is if I try to open Photoshop or Maya while this big project is open in Premiere. That said, Photoshop and Maya both tend to be pigs when it comes to memory/RAM (Photoshop especially). It doesn't seem to happen with Illustrator or any other program... just PS and Maya if this project is open in Premiere.
    One other oddity I have noticed recently is actually with Maya. Before all of this started--in other words, before that last MS critical update and 3 days of reinstalling software--I used to be able to do a background render with Maya and still keep working on other stuff. I'd fire off a batch render and then I could move on to other things... editing images in Photoshop (CS5 there, if it matters), surfing the web and even doing some light stuff in Premiere. Since this problem has been occurring, however, I have noticed that the Maya batch renders are REALLY slowing down the system big time. It's difficult to even check my email while a batch is running. Likewise, I have noticed that since I upgraded Maya from 2011 to 2013, Maya has a nasty tendency to freeze during a batch render. If I have a long batch render, I'll set it up to run overnight... when I get up the next morning, the render itself will still be running (you can see Mayabatch.exe still going in the processes under Task Manager) but Maya is totally frozen up. Now, I did that update at the same time as the rest of this mess happened, so I don't know if these problems are due to the Maya upgrade or the (apparent) Windows problem... I'm inclined to believe the latter.
    BTW... after having a project file get corrupted yesterday during one of these shutdowns, I've been double saving the files (so as to have a backup), and that too seems to have made things a bit better. It hasn't actually shut down on me when I've been doing this. That said, it's also a serious pain in the butt having to save that way.
    With that, looking through the rest of Hunt's article there, I did kill the indexing service... at least on the work drives. It's about the only thing there that I haven't tried, so we'll see what happens this afternoon. I seriously doubt that would be the problem, though, as usually that would give an error or a lockup rather than an instant shutdown, but we'll see.

  • Multiple TOCs for One Project

    Is it possible to split the topics in one help project into two TOC (hhc) files? I have a master project that merges about 53 chm files. Some of the slave projects are rather large and over the years have had several types of information added to them. The information in the projects is related, but it needs to be organized separately in the master TOC. Splitting the projects themselves is not a good solution. I would like to have multiple hhc files for the projects so that the topics can easily be inserted in their logical place in the master TOC.

    Hi,
    I am TW working jointly with another TW on a huge multifirm
    project for a same client. He is well versed in RH, while I am a
    beginner (with using RH) not with TW. He and I each produce large
    parts of what will be merged into a single huge context sensitive
    .chm file. Having experimented a bit with possibilities such as the
    ones you are exploring, I thought we could use it for our project.
    But the other TW felt we would have problems down the road.
    Just to confirm what John says. But then, I also am not
    adding anything positive by way of solution. Hope somebody else
    reacts to these two successive negative answers, and comes up with
    help for you.
    Daniel Garneau, CTM, CL
    Technical Writer, Canada

  • Do we need to put the following code in web.xml for the project to run

    Hi^^^,
    Actually, I have created a project in Eclipse WTP and I am running it on a remote server. It's giving me a 404 error when I try to run it.
    I know a 404 error is generally due to some error in the deployment descriptor.
    I am going through this tutorial for creating a project in Eclipse WTP.
    It says that I need to include the following code in web.xml. Please look at the quotes below:
    "Web modules in J2EE has a deployment descriptor where you configure the web application and its components. This deployment descriptors is called the web.xml. According to the J2EE specification, it must be located in the WEB-INF folder. web.xml must have definitions for the Servlet and the Servlet URI mapping. Enter the following lines into web.xml:"
    "Listing 2. Deployment Descriptor web.xml"
    <servlet>
    <servlet-name>Snoop Servlet</servlet-name>
    <servlet-class>org.eclipse.wtp.tutorial.SnoopServlet</servlet-class>
    </servlet>
    <servlet-mapping>
    <servlet-name>Snoop Servlet</servlet-name>
    <url-pattern>/snoop/*</url-pattern>
    </servlet-mapping>
    My question is: is it necessary to include the above lines between <servlet> and </servlet-mapping> in web.xml?
    thanks and regards,
    Prashant

    pksingh79 wrote:
    "Actually, I have created a project in Eclipse WTP and I am running it on a remote server. It's giving me a 404 error when I try to run it. I know a 404 error is generally due to some error in the deployment descriptor."
    What URL are you using?
    <servlet>
    <servlet-name>Snoop Servlet</servlet-name>
    <servlet-class>org.eclipse.wtp.tutorial.SnoopServlet</servlet-class>
    </servlet>
    Every servlet has to have a <servlet></servlet> entry in web.xml. The <servlet-name> names the servlet, and the <servlet-class> gives the servlet's class; in your case the .class file has to be in the org.eclipse.wtp.tutorial package. If it isn't, the container won't know where the class file is.
    <servlet-mapping>
    <servlet-name>Snoop Servlet</servlet-name>
    <url-pattern>/snoop/*</url-pattern>
    </servlet-mapping>
    You type something like http://localhost:8080/webappname into your browser (Tomcat server), so for URL mapping, instead of typing the entire class name, you only have to type what you've put in the <url-pattern> tag, and that tag has to be inside <servlet-mapping>. I think the problem may be in the <url-pattern>; try changing it to /snoop.
    "My question is: is it necessary to include the above lines between <servlet> and </servlet-mapping> in web.xml?"
    So now you can decide for yourself whether you need something between <servlet> and </servlet-mapping> (a minimal servlet to go with that mapping is sketched below).
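    For completeness, here is a minimal sketch of what the mapped class could look like. It is a hypothetical stand-in for the tutorial's SnoopServlet (the real one may do more), using only the standard javax.servlet API; with the <servlet> and <servlet-mapping> entries above in web.xml, a request such as http://localhost:8080/webappname/snoop/anything is routed to its doGet method:
    package org.eclipse.wtp.tutorial;

    import java.io.IOException;
    import java.io.PrintWriter;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical minimal SnoopServlet: echoes a few details of the incoming request.
    public class SnoopServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            response.setContentType("text/plain");
            PrintWriter out = response.getWriter();
            out.println("Request URI:  " + request.getRequestURI());
            out.println("Context path: " + request.getContextPath());
            out.println("Path info:    " + request.getPathInfo());
        }
    }
    Without the <servlet> and <servlet-mapping> entries (or, in Servlet 3.0 and later containers, an equivalent @WebServlet annotation), the container has no way to route /snoop/* requests to this class, which is exactly why leaving them out tends to produce a 404.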


    Hi all, I have a rather copmplex SQL statement: BEGIN UPDATE ContentDataTable SET SYMBOLIC_PATH_PARENT = N'/Test', SYMBOLIC_NAME = N'HAWK01.GIF', VERSION_NUMBER = 1 + SELECT MAX(VERSION) FROM (SELECT MAX(VERSION_NUMBER) AS VERSION FROM ContentDataTab