The best OC solution?

I've been running my rig for some time with the settings in my signature.
I've tried to go higher but failed.
I've tried everything: a lower divider, ClockGen, raising the voltages, but nothing works.
I've realised that my CPU is holding me back (it won't go higher than 2450 MHz).
I thought a Venice would OC to 2700 MHz and higher easily.
So I tried different configs:
CPU multiplier: 8
FSB: 305 MHz
mem divider: 150
which resulted in a 2440 MHz OC
or:
CPU multiplier: 8.5
FSB: 288 MHz
mem divider: 166
which resulted in a 2448 MHz OC
The voltages stayed the same as in my sig.
So, in the end: what is the best combination?
Or can I push the CPU higher... somehow?

1.6 V is high. You're cooking your CPU with too much voltage for a low OC.
I get 2400 MHz on the stock 1.4 V.
1.425 V gets me 2550+; my PSU limits me.
Your CPU is 1800 MHz stock, so 2400+ is a 600 MHz OC, not bad at all.
Try the x2 HTT multiplier; that works well for me. Keep the CPU multiplier at x9 and raise the HTT to 220 via the BIOS, with the mem divider at 133, then use ClockGen to OC further in Windows (see the quick arithmetic sketch below).
If it won't cold boot, set a lower OC via the BIOS and raise it in Windows.
Use Prime95, Memtest86 and 3DMark to test that it's stable, and push until the point where it's not (lockups, restarts, etc.).
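For reference, the effective CPU clock in all of these configs is just the multiplier times the HTT/FSB reference clock. A minimal sketch of the arithmetic (Python; the multiplier/HTT pairs are the ones mentioned in this thread):

    # Effective CPU clock = multiplier x HTT reference clock, in MHz.
    # These pairs are the configs discussed above.
    configs = [
        (8.0, 305),  # 8   x 305 MHz -> 2440 MHz
        (8.5, 288),  # 8.5 x 288 MHz -> 2448 MHz
        (9.0, 220),  # 9   x 220 MHz -> 1980 MHz from the BIOS, then raise HTT in ClockGen
    ]
    for multiplier, htt_mhz in configs:
        print(f"{multiplier} x {htt_mhz} MHz = {multiplier * htt_mhz:.0f} MHz")

Raising the reference clock also raises the HyperTransport and memory speeds, which is why the memory divider gets dropped as the HTT goes up.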

Similar Messages

  • What's the best storage solution for a large iLife? RAID? NAS?

    I'm looking for an affordable RAID storage solution for my Time Machine, iTunes Library, iMovie videos, and iPhoto Library. To this point I've been doing a hodgepodge of external hard drives without the safety of redundancy, and I've finally been bitten by HD failures. So I'm trying to determine what would be the best recommendation for my scenario: a small home office for my wife's business (just her), and me with all our media. I currently have a mid-2010 Mac Mini (no Thunderbolt), she has an aging 2007 iMac and a 2006 MacBook Pro (funny that they're all about the same benchmark speed). We have an AppleTV (original), an iPad 2 and two iPhone 4S's.
    1st Question: Is it better to get a RAID and connect it to my Airport Extreme Base Station USB port as a shared disk? OR to connect it directly to my Mac Mini and share through Home Sharing? OR Should I go with a NAS RAID?
    2nd Question: Simple is Better. Should I go with a Mac Mini Server and connect drives to it? (convert my Mac Mini into a server) or Should I just get one of those nice all-in-one 4-bay RAID drive solutions that I can expand with?
    Requirements:
    1. Expandable and Upgradeable. I don't want something limited to 2TB drives, but as drives get bigger and cheaper I want to easily throw one in w/o concerns.
    2. Simple integration with Time Machine and my iLife apps: iTunes, iMovie, iPhoto. If iTunes' Home Sharing feature is currently the best way of using my media across multiple devices, then why mess with it? I see "DLNA certified" storage on some devices and wonder if that would just add another layer of complexity I don't need. One more piece to make compatible.
    3. Inexpensive. I totally believe in the "You Get What You Pay For" concept. But I also realize sometimes I'm buying marketing, not product. I imagine that to start, I'm going to want a diskless system (because of $$$) to throw all my drives into, and then upgrade bigger drives as my data and funds grow.
    4. Security. I don't know if it's practical, but I like the idea of being able to pop two drives out, put them in my safe, and then pop them back in once a week for the backup/mirroring. I like this idea because I'm concerned that onsite backup is not always the safest. Unfortunately those cloud-based services aren't designed for terabytes of raw family video, or an entire media library that isn't wholly from the iTunes Store. I can't be the only one facing this challenge. Surely there's an affordable way to keep a safe backup for the average Joe. But what is it?
    5. Not WD. I've had bad experiences with Western Digital drives, and I loathe their consumer packaged backup software that comes preloaded on their external drives. They are what I meant when I say you get what you pay for. Prettily packed garbage.
    6. Relatively fast. I have put all my media on an external drive before (back when it fit on one drive) and there's noticeable spool-up hang time. Thunderbolt's nice and all, but it's so new that it's not easily available across devices, nor is it cheap. eSATA is not really an option. I love FireWire but I'm getting the feeling that Apple has made it the red-headed stepchild of connections. USB 3.0 looks decent, but like eSATA, Apple doesn't recognize it exists. Where does that leave us? Considering this dilemma, I really liked Seagate's GoFlex external drives because it meant I could always buy a new base and still be compatible. But that only works with single drives. And as impressive as Seagate is, we can't expect them to consistently double drive sizes every two years like they have been, cool as that may be.
    So help me out without getting too technical. What's the best setup? Is it Drobo? Thecus? ReadyNAS? Seagate's BlackArmor? Or something else entirely?
    All comments are appreciated. Thanks in advance.

    I am currently using a WD 2TB Thunderbolt hard drive for my iTunes library, which I love and which works great. It is connected directly to my MacBook Pro. I am running low on space and thinking of buying a bigger hard drive. My question is: should I buy a 6TB Thunderbolt HD or a 6TB NAS drive to work solely for iTunes? I have Home Sharing enabled for my Apple TV.
    I also have my Time Capsule connected, just as backup only.

  • What's the best overall solution for managing books and eBooks?

    I have a lot of eBooks (in .mobi and PDF format). I'd like to find a solution (software/system/scheme) such that
    1) The actual content syncs to several devices (Mac, iPad, iPhone) and can be read even when no network is present.
    2) There is a searchable, convenient database of what books are there.
    3) The database can be added to by scanning a book's ISBN barcode.
    4) Some entries in the database are linked to full content. That is, I'd enter books that I have in electronic format, but also books from my library that exist only in paper form.
    I'm looking for a total solution to manage all my books - list them all, plus enable ready access to the content for the ones I have. I'd like to have it sync to several devices. What's the best system - anyone have it set up like this? What's the best reader, database app, and way to hang it all together (with iTunes?). Thanks in advance,
    Mike


  • What's the best RAID solution for my iMac / MacBook?

    I think it's time I invested in a larger backup hard drive, and for data security I'm looking at 2-drive RAID devices. Is the Lacie 2Big Network good for this purpose? http://www.lacie.com/uk/products/product.htm?pid=10953
    What's the reliability and speed really like on such devices?
    I have a 1Gbit switch that I could attach a network storage drive to, and the idea of having a public FTP area on it sounds quite cool and useful. Anyone know just how secure it is?
    What about replacing drives in these things - is it just a case of taking the existing hard drive out of a caddy and putting a new one in (same type, size and speed, presumably)?
    At work we have some dinosaur Dell servers that have big hardware RAID arrays in them. I've only just started using them myself but can see how versatile the concept is. Are there any drawbacks to having a small home hardware RAID setup?
    Thanks!

    If you are not limited to NAS, take a look at CalDigit for a great eSATA/FW800/FW400/USB RAID 1/0 external drive.
    http://www.caldigit.com/CalDigit_VR/ , we will get one soon.
    AFAIK, the CalDigit VR connects using eSATA, FireWire 800, FireWire 400, or USB 2.0. The RAID is configurable as performance (RAID 0), protected (RAID 1), or SPAN. CalDigit's products are used by serious individuals, photographers, movie studios, etc. and are as bulletproof as you can get.
    Everything is swappable (including the fan assembly) and it can be daisy-chained through the FireWire port when out of capacity. I know you are using an iMac, but if you have a Mac Pro, you can use their SATA kit to reach speeds up to 200 MB/s!
    RAID protection doesn't come cheap, but neither does true peace of mind. There are other solutions to be sure, but none offer the longevity and flexibility of the CalDigit products. And the best part is that the CalDigit VR 500GB starts at only $399.

  • Is DLT the best backup solution?

    Hi,
    I am curious to know what people use as a good backup for their FCP projects and footage. I know DLT is recommended for outputting DVD projects - but are there better options? Any feedback is greatly appreciated.
    Thanks in advance!

    DLT is the dinosaur of the industry. Rock solid, totally dependable. The reasons it's still used for DVD replication are not associated with any reasons why you'd want to use DLT for backing up your FCP projects. DVD replication is an entirely different industry.
    DLT would not be efficient for video offloading in the slightest. Linear and long. But, as said, utterly bombproof and relatively inexpensive. If you have a couple of DLT machines and lots of carts, you can certainly use them for offloading your FCP media. But "best"? Not at all. There is no single best backup solution, only what works for your budget and workflow. Hot-swap RAIDs, cheap FW drives, optical, DVD-ROM, even installing and removing ATA drives in your Mac; they're all viable.
    bogiesan

  • Please specify the best DR Solution

    Hi,
    We have Oracle Applications 11.5.10.2 running on an HP-UX PA-RISC server (11.11)
    and Database 9.2.0.7 running on HP-UX Itanium (11.23). On the same database server one more 9.2.0.7 database is running, used for banking applications.
    We have a DR site, which is in another location. Right now, there is no connection between the primary and DR sites.
    The DR servers are an exact replica of the primary servers (same hostname, IP address, same database name, same UNIX accounts). We manually move the PROD backups to the DR servers, restore them there, and start up the services, every week, like that.
    Now we want to implement a better DR setup, which should be online. We have a couple of questions below; can anybody please suggest the best solution?
    (1) Oracle tools: what effort is needed to use Oracle tools for this, and what is the impact on our daily backups (daily we have a cold backup of both the database and the applications at the primary site) and restoration plans?
    (2) Other software: is there any other software that will make this replication easier?

    Hi;
    (1) Oracle tools: what effort is needed to use Oracle tools for this, and what is the impact on our daily backups and restoration plans?
    If you are going online, which means the DR and primary sites can communicate, I believe you can look at:
    1. Data Guard
    2. GoldenGate
    3. Streams
    Check those options. If you Google them you can find many good articles, which is why I don't share them here. If you have doubts, update the thread.
    For us, we prefer to use Data Guard and GoldenGate.
    (2) Other software: is there any other software that will make this replication easier?
    There are some tools, such as riverblade, which use a different algorithm to send packets from one location to the other. We do not have it yet, but I know of it from a friend's company, and it makes them very happy.
    Regards,
    Helios

  • What is the Best Storage Solution?

    I have a laptop and an external drive E (where I store all my photos, 9000+). I would like some suggestions on the best (most reliable and long-term) storage solutions. I was burning them to CD as well, but now I have a Canon EOS Rebel GT (pictures are much larger files) and I don't get many on a CD. I was wondering if I should get a DVD burner. I thought they probably store a lot more pictures? Wondering how many more?
    Also, do I save all my tagged and edited pictures separately from my originals (have two sets of photos)? And if so, on the hard drive or on CD/DVD? Then I assume that if I back up my catalogue, when I get a new computer all I need to do is restore and everything is there?
    Finally, any good strategies for deleting pictures? I have 3 kids and have such a hard time deleting their photos. The obvious poor ones are no problem, but when I take rapid-shot pictures they all seem so good. At this rate I will need to buy a bigger house to store all the hard drives and backup disks :)
    Would love to hear what works well for you!
    Thanks,
    julie

    The original poster asked the question whether or not to have two sets of photos (originals and edited).
    What do most people do as far as this is concerned?
    Do you include your original files in your Organizer catalogue or not?
    The "workflow" that I have finished up with is this:
    At the end of each day, I transfer all my camera shots into a folder such as C:\Negs\2006\0605\2006_0521Flowers ("Negs" because I think of them as equivalent to "negatives")
    I do not use PSE4 downloader but use Fuji FinePixViewer simply because it came with my first camera and I have just continued with it. It makes it easy to rename the files to things like 0605210001.jpg (last 4 digits sequential for each date). I have tried descriptive file names but found them too difficult.
    The word tacked onto the sub-folder name is a reminder of the general nature of the subject matter. I do not worry if there are a few shots that do not really belong but if there are a significant number of shots of totally unrelated subjects, I can cheat and artificially split the sub-folder in two. (The descriptive part of the folder name is pretty much redundant because of PSE4's tagging ability, it is really a carry-over from the Fuji program.)
    I then go through and delete any completely useless shots.
    I then copy and paste the sub-folder into another folder named C:\Photos, which contains an identical sub-folder tree.
    C:\Photos is watched by PSE4 so whatever is in it is automatically included in my catalogue.
    I then go through and do my tagging.
    I do my editing in the C:\Photos folder and save processed files in their original sub-folder but append _E to the file name to indicate that it has been edited. (I have a few other suffixes like _W for web images, _Q for max quality for large prints)
    If I am not entirely satisfied with my editing and think that maybe in a few days' time I might be able to do a little better, I save my processed file as a PSD. Otherwise I flatten it and delete the un-edited version. (A copy of the original is still available in C:\Negs if required.)
    Am I unusual in keeping my original shots outside Organizer? - my reason for doing it this way was to keep my Catalogue down to a manageable size.
    As far as backing up is concerned, I have a comprehensive system that involves daily backups from C to D (a second internal drive) and monthly backups to external USB drives (used in rotation and kept off site).
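    In case it helps to see the copy step concretely, here is a minimal sketch of that part of the workflow (Python; the C:\Negs and C:\Photos folders and the _E suffix come from the description above, everything else is an illustrative assumption):

        import shutil
        from pathlib import Path

        # Mirror one day's "negatives" sub-folder into the watched C:\Photos tree,
        # keeping the same sub-folder structure so the Organizer picks it up.
        NEGS_ROOT = Path(r"C:\Negs")
        PHOTOS_ROOT = Path(r"C:\Photos")

        def copy_to_watched_folder(subfolder):
            src = NEGS_ROOT / subfolder
            dst = PHOTOS_ROOT / subfolder
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copytree(src, dst)   # originals stay untouched in C:\Negs
            return dst

        def edited_name(original):
            # 0605210001.jpg -> 0605210001_E.jpg for the edited copy
            return original.with_name(original.stem + "_E" + original.suffix)

        copy_to_watched_folder(r"2006\0605\2006_0521Flowers")

    Keeping the watched C:\Photos tree as a copy means the catalogue only ever sees the working files, which matches the goal of keeping it to a manageable size.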

  • I currently use a Time Capsule for my Wi-Fi. I would like to extend its range and have been looking at boosters. Is AirPort Express the best booster solution when using the Time Capsule, or is there a better solution?

    I currently use a Time Capsule for my Wi-Fi (2008 model). I would like to extend its range. While there are many boosters that can do the job, what is the best option? How is the AirPort Express as an option?

    The Time Capsule (TC) can have its wireless network extended by another TC, an 802.11n AirPort Extreme Base Station (AEBSn), or an 802.11n AirPort Express Base Station (AXn).
    If your TC is a simultaneous dual-band version, only another TC or AEBSn will be able to extend both radios. The AXn can only support one radio (2.4 or 5 GHz) at a time.

  • DVD encoding.. what is the best software solution?

    Hi,
    I've been trying to figure out the best/optimal way to create high-quality DVDs from my movies. The original source movies are HD-res (1280x720), encoded from After Effects using the H.264 codec. What software is recommended to create the best possible DVD? I've tried Toast, Visual Hub and iDVD so far. Is there anything better out there to give me better results? Appreciate the info.
    thanks;
    John

    The original source movies are HD-res (1280x720) encoded from After Effects using the H264 codec.
    First question: What is your source codec? You've listed the resolution of the source video, but not the codec.
    Second question: Why are you encoding to h.264 if the final delivery is for DVD-Video? Seems like you'd be compressing twice ... which is never good for quality.
    -DH

  • What's the best cloud solution for Aperture photos?

    I use Aperture and am looking for a cloud storage solution to free up space. I'm interested in Fanfair, SmugMug and Dropbox. I plan to keep using Aperture for recent photos, but I want to be able to upload older photos (and remove them from my hard drive) to free up space.
    Any suggestions on product and workflow?

    The problem with multiple libraries is that they are inconvenient. Only one can be open and searched at a time, so if you're looking for that photo from your trip to NY, well, was that in 95? 96? Maybe it was 94... And as an Aperture Library is pretty much unlimited in size, I don't see a compelling argument for a library-per-year approach.
    Having two libraries - one on the internal and one on the external - seems a lot more convenient, but I would do that slightly differently.
    On the external I would have one complete library. This makes backing up very easy: back that up and you have everything backed up. From that I would export as a library the material you want to take on the road with you. If you import, then ideally import to the one main library, or if not, then you can move material from the internal to the external library using the export as Library / import Library function. So, at all times the internal library is a subset of the main one.
    I'm wary of Dropbox and other cloud solutions for backing up an entire Library. Why? These are very large databases. I've seen people complain that restoring a library can take days simply because these guys shape their traffic and you're downloading massive amounts of data.
    Further, many cloud services store the files on inappropriate disk formats, and this can lead to damage to the Library.
    So, my suggestion would be as follows:
    One full size Library on an external
    That backed up to another external.
    Ideally, backed up to a second external - this lives off-site at your workplace, a relative's house, even in your car.
    On your internal a subset of your main library.
    As for backing up photos to the cloud, that's quite simple. Services like Flickr and SmugMug - and there are many others - are there, and you can upload all your photos; they remain accessible from any computer anywhere in the world.

  • What is the best MDM Solution for pushing individualized app packages?

    Hello. I work for an education company, and we have a program offering students individualized iPad-based courses using apps catered to the individual students' needs. This means that every device has a different and constantly changing app profile. We need the ability to centrally purchase and distribute apps to our deployed devices, but in such a way that each iPad receives its own unique update.
    So far I have looked into the free Meraki and the paid Casper Suite by JAMF Software. The recurring theme seems to be that instead of an interface with the device at the center, it is each individual app that the interfaces focus on, leaving the user to have to choose which devices the single app is sent to. It would be great to have a program where your device is displayed and you choose which apps to send to it...
    Also, it must work in conjunction with the VPP.
    Thanks!

    Airwatch is probably your best bet, and has a free trial.  You could set up device groups, with each group really being an individual student, and deploy apps specific to that group/student. 

  • Firefox shows blurry images (and none of the best-known solutions work)

    No matter
    * if I run Firefox under safe mode (plugins disabled),
    * if I update my graphics drivers,
    * or if I disable Hardware Rendering under Advanced Options,
    Firefox keeps showing blurry images.
    Here is a comparison screenshot between Firefox (left) and Opera (right) rendering the same image:
    http://imgur.com/7GksG7Z
    This has been happening for a long time, even through several Firefox updates. None of the solutions I've found on the web (listed above) will fix this problem.
    The image blur is tiring on the eyes, not being able to see images correctly is so annoying, and I hope someone can help me fix this issue.

    Changing Firefox's zoom behavior so you can enlarge just the text but not the images can lead to site layouts breaking. If you want to take a look:
    View menu > Zoom > ''check'' Zoom Text Only
    (If you do not normally display the classic menu bar, tap the Alt key to display it temporarily.)
    Firefox 22 was the first to tie your Firefox content zoom level to your Windows DPI setting. You can break the connection and set Firefox to use 100% resolution (the classic 96dpi resolution) if you like. Here's how:
    (1) In a new tab, type or paste '''about:config''' in the address bar and press Enter. Click the button promising to be careful.
    (2) In the filter box, type or paste '''pix''' and pause while the list is filtered
    (3) Double-click '''layout.css.devPixelsPerPx''' and change its value to '''1.0''' for Firefox 21-sized fonts.
    This will return the content to normal, but the toolbars may appear a bit smaller than your Windows standard for UI (100%/105%). There is an extension to enlarge fonts in that area: [https://addons.mozilla.org/en-us/firefox/addon/theme-font-size-changer/ Theme Font & Size Changer].
    Can you get it to work the way you want?

  • What is the best storage solution for the new Mac Pro

    Hi All,
    With limited funds when purchasing the new Mac Pro, I'm starting to look at storage for music/pictures/video etc. I'm thinking of storing this data externally and connecting via Thunderbolt, FireWire or USB 3 to access the data... Not sure what type of storage to use; upgrading to 1TB of PCIe-based flash seems excessive in cost, and I would like at least 2TB. Anyone have any suggestions?
    What are the alternatives for storage... (single disk, RAID)? I've recently had to replace my internal WD hard drive (lost everything) and the backup Time Capsule failed, again a knackered disk (looks like the WD format error with the Mavericks OS bug). I don't want to format them just in case...
    With the new Mac Pro coming with 256GB of PCIe-based flash storage, I'm reluctant to upgrade the storage because of costs, and I would like some redundancy when it comes to storage.
    Any suggestions?
    Many thanks
    Russ

    See the landing page at OWC for Thunderbolt products.
    I'm using a Helios enclosure (the new dual-slot Helios PCIe chassis) and moving various hard drives into Thunderbolt cases (in time), but mainly into my favorite USB3 hard drive enclosure.
    For those without eSATA enclosures, a simple eSATA to USB3 adapter might do the trick.
    USB3 is plenty fast to house two hard drives.
    If you can, I'd try to order with 500GB to get started and have more room for the system and default scratch, or even for Aperture/iPhoto or Lightroom.
    Always take the precaution of zeroing a drive before use. If you want a solid enterprise drive ideal for RAID, look no further than the Seagate Constellation series. A 128MB cache doesn't hurt, and it's fine for RAID 5, NAS or whatever you want.
    There should be retail PCIe-SSD products in 2014.
    Large storage, http://macperformanceguide.com/topic-thunderbolt.html

  • What is the best Cloud solution for all your photos?

    You can't put all your iPhoto library in iCloud, correct?  So how can I put all my photos in a cloud?  I'm a little hesitant to put all my photos in Dropbox or Google Drive, as then my other computers will fill up with the same photos, AND I don't trust non-Apple products for my photos.  Can anyone tell me how they are doing it please?  At the moment I feel OK with my external backups (I've got 3 of them) as well as backing up to Time Machine, but I just wanted to know how others do it.
    Thanks

    You can't put all your iPhoto library in iCloud, correct?
    Correct.
    So how can I put all my photos in a cloud?
    I use Flickr and have more than 50k photos there. You can decide what photos - if any - you want public and which you want to have available only to those whom you invite to see them.

  • What is the best way to switch between multiple image buffers? AND How to synchronize saves?

    Hello,
    I'm trying to flip-flop between two buffers and am wondering what the best possible solution for this is.  I'd like to acquire an image into one buffer, send that off to be processed, and then, while that's being processed, acquire a second image.  Right now I have a "Create IMAQ vi" in a for loop creating 10 image locations.  I'm using a non-NI framegrabber (shame, I know), which makes things a bit more difficult to replicate.  I have two while loops.  One while loop currently grabs images from the framegrabber and places them in the 10 different locations.  The other while loop holds a case structure that does the processing.  I have created a local variable that holds all of the image locations and reads those to be processed.  I don't know if this is actually making things faster or if it's better to just make one image location.
    I have two images attached.  One image is the Grab While loop.  Since I have an array of locations, I have to use a for loop and index each one out to my display. I then have a shift register to carry the image location info over to the next iteration of the while loop.
    The second attachment is the bulk of the main while loop.  It shows what happens to the image while it's being fully processed in the left case structure.  I know it does not look like much but one of the cases (which is called by Boolean Image FFT) is a subVI that does most of the processing.  I believe that is what really slows it down because of how that program is written. 
    The right case structure shows my saving mechanism.  I have two file paths. One to save the image and the other to save the processed image.  I have a sequence to make sure they save at the same time once it gets to that point. 
    The problem though is the following:
    In the grabwhileloop.png, you can see that I have timing to see how fast the images are being acquired.  This value is approximately 60 fps (which is the rate of the Basler camera).  There is a similar setup in the main loop case structure, which processes very slowly, at approximately 1.04 fps.  That means the image I turn into an array in the left case structure of the main while loop is more than likely different from the image I'm trying to save off in the right case structure, since the grab is occurring at the same time.  I'd like the processed image to be saved alongside the image I am processing.
    Sorry for the big bulk of text.  This code has come a long way as it is.  If you have any suggestions on making it faster or more efficient please feel free to chime in.
    Thanks,
    Rob
    Attachments:
    GrabWhileLoop.png ‏34 KB
    MainWhileLoop.png ‏68 KB

    Thanks for the comment.  I have looked into the producer/consumer architecture but, to be honest, I'm not quite sure how everything will work while doing that. I have seen the example code, and I have thought about implementing it (or at least attempting to), but I'm still unconvinced that it will run that much more efficiently.  There is other setup, outside of the images, that had to be done outside of either while loop. Also, I don't know where I would put both loops.
    Last time, I attempted to put the grab while loop inside the state machine.  Things got chopped because it took so long to go through the main while loop's "acquisition" state (which is really the processing state).  I needed both to run simultaneously.  The grab reset every time it went to the "grab" state, which is not what I wanted.  The only way I can think of to have combatted this was to combine the grab and acquisition in the same state.  If that were to happen, I'd take out the for loop and grab one image at a time.  However, that would probably make things even slower than they already are.
    In terms of doing something before an image is assigned in the for loop, I don't need that pixel sum value to refresh too quickly.  The main while loop is slow enough as it is, so I am more afraid that everything will run too slowly the more I do.  I know where the bottleneck is in my code, but I can't really see a way to "even out the flow".  Even if I moved to the other architecture, I feel it'd take the same amount of time that it does already.
    From my debugging, the Image local variable in the main while loop seems to refresh as quickly as the grab while loop spits it out.  Granted, once the main while loop finally completes, many images have gone by.  This is how it has to be, though, because it just takes up so much processing power to run through the main while loop state.
    As a side note, does LabVIEW have an issue with acquiring images in real time that you have heard of?  I ask because when I run the code, there is a solid white line that I'm supposed to see in my display.  Every time things time out or something, the line moves, which is not supposed to happen.  The line also moves every time I place my mouse cursor in the display or spin the mouse wheel to scroll.  If I don't do either of those things, it'll eventually move on its own.
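    Just to illustrate the producer/consumer idea discussed above outside LabVIEW, here is a minimal Python sketch of the pattern (the grab and processing steps are placeholder functions, not real IMAQ or frame-grabber calls):

        import queue
        import threading
        import time

        frame_queue = queue.Queue(maxsize=10)   # bounded buffer, like the 10 image locations
        STOP = object()                          # sentinel to shut the consumer down

        def grab_frame(i):
            time.sleep(1 / 60)                   # stand-in for ~60 fps acquisition
            return f"frame-{i}"

        def process(frame):
            time.sleep(1.0)                      # stand-in for the slow ~1 fps processing
            return f"processed({frame})"

        def producer(n_frames):
            for i in range(n_frames):
                frame_queue.put(grab_frame(i))   # blocks if processing falls 10 frames behind
            frame_queue.put(STOP)

        def consumer():
            while True:
                frame = frame_queue.get()
                if frame is STOP:
                    break
                result = process(frame)
                print(frame, "->", result)       # raw frame and its result stay paired

        t_prod = threading.Thread(target=producer, args=(5,))
        t_cons = threading.Thread(target=consumer)
        t_prod.start(); t_cons.start()
        t_prod.join(); t_cons.join()

    The key point is that the processing/save loop consumes the same element it dequeued, so the saved pair always matches, and the bounded queue makes the speed mismatch explicit (you either block the producer or deliberately drop frames) rather than silently overwriting a shared local variable.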
