Memory Question about my K8N!

Right now I am using 2 Kingston HyperX DDR400 memory modules. The sticks are 512MB double-sided... If I were to add another stick of the same brand and speed but 1GB (for a total of 2GB), will the memory run at 400MHz or drop to 333MHz? Better question: will this work at all?

What stepping is your processor (C0 or CG)? Supposedly C0s have serious problems with 3 double-sided sticks; CG may have better luck. Look at your processor code: if it ends in AR or AX, it is a CG; AP is a C0.

Similar Messages

  • MOVED: A question about the K8N Diamond mainboard

    This topic has been moved to AMD64 nVidia Based board.
    A question about the K8N Diamond mainboard

    You can try it. Worst that could happen is a non-boot.

  • Question about MSI K8N SLI Platinum

I have a question about this mobo.
My previous MSI boards had a feature where, if you overclocked too much and the system couldn't handle it, the BIOS would load its default settings. I really liked this feature and am wondering whether it will be present on this board.
Is it?

Sorry m8, after that post I double-checked: I thought you were talking about the Neo1 and didn't notice the SLI bit. I think a couple of others got caught by that one too.
So I did some investigating:
It uses the same Award Phoenix BIOS 6.00PG as the Neo1 and Neo2. Every board I have owned since the KT266A, from various manufacturers with that BIOS, has had that feature. Plus the Insert-key trick even works on a lot of 486 and early Pentium 1 machines. I am still 99% sure it will have it, put it that way.

  • A quick and simple question about MSI K8N Neo 2 Platinum

I want to overclock my 3500+, but when I go to around 2.4GHz the computer isn't stable and sometimes even corrupts some of my Windows files.
My HTT is all right, so my HyperTransport link stays around 800MHz.
My memory is at 166MHz, so it doesn't go too high.
My Vcore is 1.4V, but this overclock is so minimal it doesn't cause a problem.
Last thing... AGP and PCI locks.
AGP is no problem, it is locked to 66MHz, but what about PCI?
I downloaded ClockGen, and when I increased the HTT, AGP stayed at 66, but PCI increased with it.
Some told me that when you lock AGP to 66 it will also lock PCI; well, I have to say that ain't true, because I experienced it myself.
Is there no PCI clock lock on this motherboard, or is there? And do any other motherboards have it?

Never mind, I just noticed I had to set AGP to 67MHz to lock both. BUT, I think my computer just sucks at overclocking...
I set it to 2400MHz:
220x11, with the memory set to 183MHz in A64Tweaker, which puts my actual memory right on 200MHz.
My HyperTransport link would be at 880MHz (800MHz originally), so I set the HTT multiplier to 3x to bring it down to 660MHz, and it still doesn't work.
When I say it doesn't work, it is because my PC doesn't pass the Super PI test: it freezes completely after the first loop, but if I do not run Super PI, the PC won't crash. Is it a Super PI problem or what?
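
    For anyone checking the arithmetic in that post, here is a minimal C sketch of how these clocks relate. The DRAM divisor value is an assumption (A64Tweaker programs an integer CPU/DRAM divisor, and 12 is the value that lands "right on 200MHz" as described), so treat it as a back-of-the-envelope model, not a definitive formula:

        #include <stdio.h>

        int main(void) {
            /* Figures from the post above; the divisor is assumed. */
            double htt = 220.0;   /* reference clock ("HTT") in MHz */
            int cpu_mult = 11;    /* CPU multiplier */
            int ht_mult = 3;      /* HyperTransport link multiplier */
            int dram_div = 12;    /* assumed integer CPU/DRAM divisor */

            double cpu  = htt * cpu_mult;   /* 220 x 11 = 2420 MHz */
            double ht   = htt * ht_mult;    /* 220 x 3  = 660 MHz  */
            double dram = cpu / dram_div;   /* ~201.7 MHz (~DDR403) */

            printf("CPU %.0f MHz, HT link %.0f MHz, DRAM %.1f MHz\n",
                   cpu, ht, dram);
            return 0;
        }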

  • Some questions about MSI K8N Neo4 Platinum nForce4 Ultra

Hi to all!
First of all, sorry for my bad English :P
I have some questions regarding this motherboard, which I would like to buy.
I read some time ago that the nForce4 chipset had a bug with HyperTransport (or similar), so all the motherboards with this chipset were postponed. Now the German MSI site has posted a press release for the MSI Neo4 saying that mass-volume distribution will be in mid-December.
I would like to know whether some Neo4 boards were distributed before nVidia found the bug and, if so, whether there is a method to tell the "good" versions from the "bugged" ones.
The second question is about the heatsink. I would like to install a Zalman CNPS-7700 AlCu (with a 12x12cm silent fan), but I don't know if the heatsink's dimensions could create problems with some motherboard component.
Thanks!


  • A question about the K8N Diamond mainboard

I have a K8N Diamond mainboard, and I was wondering if I can use a matched dual-channel pair of RAM sticks plus one single stick at the same time? I am a beginner when it comes to mainboards, so I had better not do anything that may harm the PC.
    Thank you

    You can try it. Worst that could happen is a non-boot.

  • Update on my K8N and a question about voltage

    Well,
I've read all the threads in this and several other forums about the K8N. I have tried everything (including putting my RAID on SATA 3 & 4). I cannot go above 235-240 HTT, and that's with an 8x multiplier (I also tried 10, 9, 7, 6, 5, 4) and memory set to 166. This is using ClockGen (also tried through the BIOS).
I can run 220 HTT at an 11x multiplier (3400+) with 1:1 memory and it is 100% stable. I have the 1.2b5 BIOS. So I'm still pretty bummed about the lack of OCing.
Anyway, on to my question. When I bring up CPU-Z and look at the CPU tab, the voltage jumps around a bit. I have it set to 1.55 in the BIOS, and CPU-Z will display 1.552, 1.568, 1.552, 1.536, 1.552, etc., jumping back and forth every few seconds or so. Also, the core speed fluctuates by about 0.1MHz with each jump. So... is this normal, or is my power supply whacked? I've just never seen this before on my previous computers.
    Thanks!

    My voltages do the same thing. Since it's not a huge percentage of the total voltage, I wasn't too worried about it and attributed it to tolerance.
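
    To put a number on that "percentage" point, a quick sketch using the readings quoted above (back-of-the-envelope arithmetic, not a diagnosis):

        #include <stdio.h>

        int main(void) {
            /* Vcore readings quoted in the post above. */
            double set = 1.55;               /* BIOS setting, volts */
            double lo = 1.536, hi = 1.568;   /* observed CPU-Z extremes */

            double swing = hi - lo;          /* 0.032 V peak-to-peak */
            printf("swing %.3f V = +/- %.1f%% of the %.2f V setting\n",
                   swing, 100.0 * (swing / 2.0) / set, set);
            /* Prints roughly +/- 1.0%, well within normal regulator
             * ripple and sensor quantization. */
            return 0;
        }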

  • Memory question

    I have a couple of questions about memory.
1. I have Corsair TwinX XMS PC3200 2x1024MB. The settings at Corsair are 2.5-3-3-6. It shows up in SpeedFan as:
FSB:DRAM ................ CPU/14
CAS# Latency ............ 3.0 clocks
RAS# to CAS# Delay ...... 3 clocks
RAS# Precharge .......... 3 clocks
Cycle Time (tRAS) ....... 7 clocks
Bank Cycle Time (tRC) ... 10 clocks
DRAM Idle Timer ......... 16 clocks
      I just want to run it at a better speed to get the most out of it.
2. Also, my MB supports DDR266/333/400 modules. Does that mean I can't run DDR433/500/533/550?
When I started to overclock I had the DRAM set to 400, and with the OC it went to 424 with no problems. Then I had to drop it to 333 to keep OCing and got the DRAM up to 198 running stable. I had to drop the FSB, so eventually the DRAM went to 184. I ran Sandra with the 424MHz setting and the 184MHz setting, and the 184MHz setting was actually a lot quicker... so I stayed there.
Now, how can I get better speeds out of the memory? Would it be better to buy different memory to achieve my goals?

    MR,
You stated:
1. I have Corsair TwinX XMS PC3200 2x1024MB. The settings at Corsair are 2.5-3-3-6. It shows up in SpeedFan as:
FSB:DRAM ................ CPU/14
CAS# Latency ............ 3.0 clocks
RAS# to CAS# Delay ...... 3 clocks
RAS# Precharge .......... 3 clocks
Cycle Time (tRAS) ....... 8 clocks
Bank Cycle Time (tRC) ... 12 clocks
DRAM Idle Timer ......... 16 clocks
      I just want to run it at a better speed to get the most out of it.
The figures you posted above look more like figures you would get from the Memory tab in CPU-Z. Can you confirm this, and also post all the figures you get from the SPD tab in CPU-Z for your RAM? If your RAM settings are set to Auto in the BIOS, then the figures above are really your SPD readings as applied by the BIOS's Auto RAM setting feature; the CPU-Z SPD readout will confirm this. If that is the case, then you have the Corsair TwinX2048-3200 kit (3-3-3-8) rather than the TwinX2048-3200C2 (2-3-3-6*), with the asterisk annotated as [* These parts support 2-3-3-6 latency on Intel platforms, and 2.5-3-3-6 latency on AMD platforms.]. Double-check the product number on your RAM and pay particular attention to the letters and/or numbers after TwinX2048-3200, then check the result against the TwinX matched-pair memory listing here: http://www.corsairmemory.com/corsair/xms.html
Also, my MB supports DDR266/333/400 modules. Does that mean I can't run DDR433/500/533/550?
You could likely use memory up to DDR500 on your board if you wanted to OC and leave the RAM at a 1:1 ratio. Anything faster than that would be wasted money.
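
A quick sketch of the arithmetic behind that suggestion (the FSB targets below are illustrative, not from the post): at a 1:1 ratio the memory clock tracks the FSB clock, and DDR's effective rate is twice the clock, so DDR500-rated parts cover overclocks up to a 250MHz clock.

    #include <stdio.h>

    int main(void) {
        /* Illustrative FSB/HTT targets in MHz (assumed, not measured). */
        int fsb[] = { 200, 220, 235, 250 };
        int n = sizeof fsb / sizeof fsb[0];

        for (int i = 0; i < n; i++) {
            /* At 1:1 the memory clock equals the FSB clock, and the DDR
             * rating is twice the clock, so 250 MHz needs DDR500 parts. */
            printf("FSB %3d MHz -> memory %3d MHz -> needs DDR%d\n",
                   fsb[i], fsb[i], 2 * fsb[i]);
        }
        return 0;
    }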
    Good luck,

  • A Question about RAW and Previews

I have just recently started shooting in RAW (mostly for the post-production editing abilities; I am an avid amateur photographer bent on learning as much as I can). I set my camera to capture in RAW + L. I don't know why I feel like I want it to capture both the RAW and JPEG files, which leads me to my first question: is it necessary to have the camera capture both the RAW and the Large JPEG? I am assuming the answer is no, since after importing the RAW file into Aperture you could always export a JPEG if you wanted one. So there is no need to fill up your internal storage (if using managed masters) with the extra JPEG? Is this thinking correct?
Next, if you do import RAW-only files and then want to export certain images, do you have a choice to export the original RAW image? It seems that it only allows you to export a JPEG at the original size. To answer my own question, perhaps you have to export the Master in order to get the full RAW file out? And if you want to export a JPEG, you export not the Master but a version of the Master? Is this correct?
Lastly, I wanted to ask a question about Previews. I have my preferences set so that previews have the highest quality with no limits on size. What is the significance of setting it this way? I just assumed that if I wanted to share an image at the highest quality without exporting it, this was the way to go. Is there any validity to this? The reason I ask is that I don't want all of these high-quality previews taking up internal disk space if I really don't need them. Is there a way to change the preview size once previews are created? Meaning, if you have it set to generate low-quality previews, can you change it dynamically to high, and vice versa?
    I know this is a lot in one post. Thanks for tackling it.
    Mac

    You can change the quality of the Previews in the Preferences -> Previews tab.
    You can regenerate Previews with the Delete and Update Previews under the Images menu.
    Regards
    TD

  • A few questions about MacBooks and Parallels Desktop.

    I have a few questions about MacBooks and Parallels Desktop.
    1) I understand I need at least 1GB of RAM to run Parallels Desktop but what about the hard drive, is the stock 60GB drive big enough?
    2) Related to question 1, even if it was big enough to allow me to install and run Windows would the 60GB drive be enough if I wanted to install a Linux distribution as well?
    3) This has nothing to do with Parallels Desktop but thought I'd ask it here anyway, do Apple Stores carry just the stock MacBooks, or do they carry other configurations?
    Thanks
    Keith

1. Depends on how intensively you use that HD for saving data on both Mac OS and XP. For a standard installation of both OS X and XP, 60GB of space is enough.
2. Same answer as no. 1. You can install all three on that HD, but the extra space available for your data will be less and less. You can save your data on an external drive or back it up to CD/DVD and erase it from the HD to keep the free space.
Remember to leave at least 2 or 3GB for virtual memory usage.
3. Just call them; maybe they don't have it in store stock, but by appointment they might configure one for you before your pick-up date.
    Good Luck

  • 2 Questions about final cut pro 5

My first question is quite simple: I know that previously Final Cut Pro could only use up to 2.5GB of memory; now I'm just a bit curious whether it can use more, since Leopard is a 64-bit OS. Right now I'm using Final Cut Studio 1. I'm asking because I intend to get an additional 4GB of RAM (right now I only have 2GB); I'm going to get Mushkin's Mac Pro memory. I generally run my system down to about 200MB of free memory, and sometimes down to 10MB when I'm using Photoshop in conjunction with FCP. What does everyone think about getting an additional 4GB?
    Thanks for the answers.
Okay, to not make two posts, I'll give my final question/problem here. Ever since I upgraded to Leopard, I've noticed that the playback quality in the Canvas and Viewer is poor. There is bad gamma. What I mean by that is that I'm getting extremes in color: I have areas that look normal followed by areas of oversaturated color. It's more on the oversaturated side, though. Besides that, the playback quality is poor even though my sequence settings are at "Best" for video processing. I also seem to get jagged lines on things like railings, people's shoulders, and other edges. My picture quality is grainy, though there are no artifacts. Even on the FX transitions I get the jagged lines on edges of, say, page peels and whatnot.
When using DVD Player or watching my MPEG-2s in DVD Studio, they play poorly too. The image is clear, but I get horizontal lines when there is motion. When I put one of my DVDs in my standalone DVD player (not the computer), the problem goes away and everything looks fine. The horizontal lines are less of a problem, but I don't understand the poor playback quality.
    The video looks fine on my external NTSC video monitor.
    I have the GeForce 7300 graphics card, with the 24" samsung 245BW. I upgraded to 10.5 on a clean install.
    Thanks a lot everyone!!

Thanks for answering my memory question!
However, I'm still getting poor-looking footage. Even when I open up my source clips and play them, they look poor in quality. I'm getting horizontal lines on everything that moves, and corners are not sharp; they look distorted and jagged. I never had this problem before, and I'm not sure of the cause or the solution. Even the footage from Final Cut Studio's tutorial looks bad (grainy, strange contrast, jagged corners, and lots of horizontal lines), though the edges on my own footage look a little more jagged. I think I mentioned it already, but the FX transitions that are page peels, cube spins, etc. give me jagged edges too.
I removed Flip4Mac and all other non-Apple codecs too.
Ugh, I hate computers sometimes.
-Tom

  • Question about Video Cards on Laptops

Hi everyone, and sorry if I post this in the wrong area. I own a few computers; some are considered ancient and others are somewhat old but can run Windows XP just fine. I have a question about the computers in the stores today.
I am looking to invest in a new laptop, and possibly a desktop in the near future, depending on my budget. My question about today's video cards is this: I enjoy a good MMO game now and then; I don't expect to play heavily intensive games, just maybe some more advanced ones. I notice that one game I want to play requires the video card to have shadow rendering and vertex and pixel shaders. Do the computers today have those as standard in their video cards? Or do I have to invest in a somewhat more expensive laptop/desktop? I am willing to invest money in some more RAM if I need to improve performance. The reason I ask is that I have a 4-year-old laptop with a broken keyboard, and I was thinking about just replacing it with a newer one. It has some pretty good specs and runs everything fine, but it cannot play certain games because of those missing video requirements. (Current laptop, in case someone is curious: 1.5GHz Celeron M, 1GB memory, 64MB shared video.) Thanks to anyone who has the answer to this question. (P.S. Sorry if everything is scrunched together; for some reason the formatting is weird in the Opera web browser.)

Graphics cards all have those as standard today... or at least they should, unless you get a really cheap one. The kind of graphics card, processor, and other specs of the computer you want should depend on what you want to do with it. It also depends on the kind of games you want to play. If you're playing an FPS [first-person shooter], then you need a really good graphics card and processor. If you're playing something like Diablo 2, or something on the lower end of graphics, then you just need decent specs. For graphics cards, I suggest something with an Nvidia GeForce; as of today, they are in the 9 series for high-end cards, I believe. For processors, get something like an Intel Core 2 Duo T9500 or T9600, or the Core 2 Quads or Extremes. Extremes are intended for hardcore gaming and would be more expensive. For RAM you should have at least 2GB... you can always upgrade that later.
For laptops my suggestions are:
$3k+: get a customized Alienware
$2-3k: get a customized Dell XPS, or a MacBook Pro [although if you're gaming I would suggest you dual-boot with Windows XP or Vista]
$1-2k: get a cheaper customized Dell, or a Sony VAIO
Under $1k... well, I don't know what to say; you can't really get a nice gaming computer for under that price.
I personally DO NOT like HP computers at all, because they are so quirky, have so many problems, and break down easily.

  • Question about pictures file in home folder

I am trying to create more space on my hard drive and have a question about the Pictures folder.
In addition to the iPhoto Library, I have several folders named 1, 2, 3, etc., each containing a bunch of photos, plus several hundred individual photo files.
My question is whether these files are duplicates of what is in the iPhoto Library; is there an easy way to find that out?
And I guess I'd also like to find out how they got there in the first place. Could these be imports from disks, as opposed to my digital camera?
    Thanks for any help,
    Laura

Hey Laura,
I haven't used iPhoto for a while, but from memory I think it works similarly to iTunes: when files are processed by iPhoto, they are catalogued on the hard drive according to the iPhoto preferences. I would check out the iPhoto preferences and see if anything there tells you what is going on.

  • Question about size of ints in Xcode

Hello. I have a few quick questions about declaring and defining variables and about their size and so forth. The book I'm reading at the moment, "Learn C on the Mac", says the following about the difference between declaring and defining a variable:
A variable declaration is any statement that specifies a variable's name and type. The line *int myInt;* certainly does that. A variable definition is a declaration that causes memory to be allocated for the variable. Since the previous statement does cause memory to be allocated for myInt, it does qualify as a definition.
    I always thought a definition of a variable was a statement that assigned a value to a variable. If a basic declaration like "int myInt;" does allocate memory for the variable and therefore is a definition, can anyone give me an example of a declaration that does not allocate memory for the variable and therefore is not a definition?
The book goes on, a page or so later, to say this:
    Since myInt was declared to be of type int, and since Xcode is currently set to use 4-byte ints, 4 bytes of memory were reserved for myInt. Since we haven't placed a value in those 4 bytes yet, they could contain any value at all. Some compilers place a value of 0 in a newly allocated variable, but others do not. The key is not to depend on a variable being preset to some specific value. If you want a variable to contain a specific value, assign the value to the variable yourself.
    First, I know that an int can be different sizes (either 4 bytes or 8 bytes, I think), but what does this depend on? I thought it depended on the compiler, but the above quote makes it sound like it depends on the IDE, Xcode. Which is it?
    Second, it said that Xcode is currently set to use 4-byte ints. Does this mean that there is a setting that the user can change to make ints a different size (like 8 bytes), or does it mean that the creators of Xcode currently have it set to use 4-byte ints?
    Third, for the part about some compilers giving a newly allocated variable a value of 0, does this apply to Xcode or any of its compilers? I assume not, but I wanted to check.
    Thanks for all the help, and have a great weekend!

    Tron55555 wrote:
    I always thought a definition of a variable was a statement that assigned a value to a variable. If a basic declaration like "int myInt;" does allocate memory for the variable and therefore is a definition, can anyone give me an example of a declaration that does not allocate memory for the variable and therefore is not a definition?
I always like to think of a "declaration" as something that makes no change to the actual code but just provides visibility so that compilation and/or linking will succeed. The "definition" allocates space.
You can declare a function to establish it in the namespace for the compiler to find, but the linker needs an actual definition somewhere to link against. With a variable, you could also declare a variable as "extern int myvar;". The actual definition "int myvar;" would be somewhere else.
According to that book, both "extern int myvar;" and "int myvar;" are declarations, but only the latter is a definition. That is a valid way to look at it. Both statements declare something to the compiler, but only the second one defines some actual data.
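
A minimal two-file sketch of that distinction (the file names and the value 42 are made up for the example):

    /* file1.c -- the definition: this line allocates storage for myvar. */
    int myvar = 42;

    /* file2.c -- a declaration only: "extern" tells the compiler that
     * myvar exists with type int but allocates nothing; the linker
     * resolves it against the definition in file1.c.
     * Build with: cc file1.c file2.c && ./a.out */
    #include <stdio.h>
    extern int myvar;

    int main(void) {
        printf("%d\n", myvar);   /* prints 42 */
        return 0;
    }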
    First, I know that an int can be different sizes (either 4 bytes or 8 bytes, I think), but what does this depend on? I thought it depended on the compiler, but the above quote makes it sound like it depends on the IDE, Xcode. Which is it?
    An "int" is supposed to be a processor's "native" size and the most efficient data type to use. A compiler may or may not be able to change that, depending on the target and the compiler. If a compiler supports that option and Xcode supports that compiler and that option, then Xcode can control it, via the compiler.
    Second, it said that Xcode is currently set to use 4-byte ints. Does this mean that there is a setting that the user can change to make ints a different size (like 8 bytes), or does it mean that the creators of Xcode currently have it set to use 4-byte ints?
    I think that "setting" is just not specifying any option to explicitly set the size. You can use "-m32" or "-m64" to control this, but I wouldn't recommend it. Let Xcode handle those low-level details.
    Third, for the part about some compilers giving a newly allocated variable a value of 0, does this apply to Xcode or any of its compilers? I assume not, but I wanted to check.
I don't know for sure. Why would you ask? Are you thinking of including 45 lines of macro declarations 3 levels deep to initialize values based on whether or not a particular compiler/target supports automatic initialization? Xcode currently supports GCC 3.3, GCC 4.0, GCC 4.2, LLVM-GCC, Clang, and Intel's compiler for building PPC, i386, and x86_64 code, in both debug and release, with a large number of optimization options. It doesn't matter what compiler you use or what its behavior is: initialize your variables in C.
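
In other words, make the starting value explicit instead of relying on the compiler:

    #include <stdio.h>

    int main(void) {
        int uninitialized;    /* indeterminate: might be 0, might be junk */
        int initialized = 0;  /* always 0, on every compiler and target   */

        /* Reading "uninitialized" would be undefined behavior, so only
         * the explicitly initialized variable is used here. */
        printf("initialized = %d\n", initialized);
        return 0;
    }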

  • Question about navigation in session scope

    Hi.
I don't know how to resolve a problem, or how to approach this stuff.
    I'll try to explain myself.
Let's say I have a page (a.jsf) with several links; all these links navigate to the same page (b.jsf), which shows the results.
For each link in a.jsf I have attached a bean with some logic, so if I click link 1 I go to b.jsf showing data read from database.table1. If I click link 2 I go to b.jsf showing data read from database.table2, and so on...
The beans are in session scope (and must be).
The first time it works OK, because I initialize the bean in b.jsf, read the data, and show it using a selectManyListbox; but if I go back and select another link, it goes to b.jsf showing the old data, the data read the first time, because the init method is never called again.
Somebody has talked about using an additional bean to control this, but once the bean for b.jsf is created, I don't know how to call its init method again.
I have attached a very simple project to deploy in Eclipse 3.3 and Tomcat 6.0. In this example, instead of reading from a database, I read from a structure created in memory to simulate it.
Could somebody take a look and comment on it?
http://rapidshare.com/files/197755305/_session-forms.war
    Thanks

    Hi.
I understand it would be the same if done in the action method of a button or a command link; the project is just an example. My real app is a tree, so this is not a question about a button or a command, it is about the logic living in session scope. I don't know how to approach it.
    thanks
