Seed_pool and general storage question

I am trying to deploy the EBS 12.1.3 Prod and Apps VM templates on a VM Server using just internal disk, and I am running out of space.
So, do I still need the seed_pool files after I have successfully imported the templates as virtual machines?
I was thinking I could also create an NFS share from another server and mount the /OVS/seed_pool directory that way if needed.
Yes/no?
Edited by: user6445925 on May 13, 2011 9:33 PM

I don't know if there is a better way to manage this, but what we do is NFS-mount multiple file systems from a NetApp, move the files we want onto the file systems we want, and connect everything back to the file system that /OVS points to with symbolic links.
e.g.
10.53.252.2:/vol/cos1_tier03_rms_testdev_os_ovm_nfs
158G 87G 72G 55% /var/ovs/mount/1A57047A2ABE4210B37DEEF62C33CF1F
10.53.252.2:/vol/cos1_tier03_rms_testdev_asm_ovm_nfs
2.0T 356G 1.7T 18% /var/ovs/mount/0465D5417DAB4F9F9D4EFDB141940AE6
10.53.252.2:/vol/cos1_tier03_rms_testdev_app_ovm_nfs
450G 417G 34G 93% /var/ovs/mount/7B543EC95C0B40648377E762F2F04713
[root@ovs-tst-01 /]# ls -l /OVS
lrwxrwxrwx 1 root root 47 Feb 9 16:00 /OVS -> /var/ovs/mount/1A57047A2ABE4210B37DEEF62C33CF1F
So copy the file you want into the 2 TB file system, then link it back to the same location in the 158 GB file system where the /OVS link points. This is the way we have figured out to put different storage needs onto different volumes on the NetApp, specifically moving +ASM to a different storage pool on the NetApp.
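The move-and-relink step looks roughly like this as commands. This is a self-contained sketch using temporary directories in place of the real /var/ovs mounts; the running_pool/System.img path is just an example, not from the thread:

```shell
# Self-contained demo of the move-and-relink trick described above,
# using throwaway directories in place of the real /var/ovs mounts.
SMALL=$(mktemp -d)   # stands in for the 158 GB volume that /OVS points to
BIG=$(mktemp -d)     # stands in for the 2 TB volume

mkdir -p "$SMALL/running_pool/myvm" "$BIG/running_pool/myvm"
echo "disk image" > "$SMALL/running_pool/myvm/System.img"

# Move the big file to the roomy volume, then link it back so anything
# resolving through /OVS still finds it at the original path.
mv "$SMALL/running_pool/myvm/System.img" "$BIG/running_pool/myvm/System.img"
ln -s "$BIG/running_pool/myvm/System.img" "$SMALL/running_pool/myvm/System.img"

ls -l "$SMALL/running_pool/myvm/System.img"   # now a symlink into $BIG
```

The same pattern applies to whole directories (e.g. seed_pool) if you want to offload them wholesale.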

Similar Messages

  • Workflow and General Use Questions

    Hello,
    I'll apologize right off the bat for these novice questions, because I'm sure the information is probably somewhere in the forum; I just haven't been able to find it. I just purchased Aperture after completing the demo, as my library is getting too large to manage using standard file folders. I'm now trying to figure out the best practices for workflow and general use before I invest some serious time into importing and keywording all my pictures.
    1) Store files in their current location, or in the Aperture Library? It seems to me that once they are moved to the Aperture library, you can only access them from within Aperture. I'm thinking I would be better off leaving them in their current location. For one, if I want to quickly grab a picture as an attachment to an email or something, it seems easier to grab it from the standard folders. Second (and more important), I do not have room to keep all my pictures on my MacBook, thus most of them are stored on the Time Capsule.
    So... Keeping photos in their current location appears to be the best choice for me even though it adds an additional step every time I bring in new photos from my camera. Does this sound right?
    2) Is there a way to mark the photos that I have uploaded to my website (Smugmug)? Ideally, I would like to badge photos that have already been uploaded so I can quickly recognize them and ensure I'm not duplicating. I've considered using the rating, or keywords to indicate that a photo has been uploaded but both methods have disadvantages.
    3) Any suggestions for general workflow and organization resources (tutorials, books, websites, etc.)? I've looked at the videos on Apple's site but they obviously didn't get that detailed.
    Thanks for the help, sorry for the length.

    I recommend Managing by Reference, with Master image files stored on external hard drives (note that Aperture defaults to a Managed-Library configuration rather than a Referenced-Masters Library). This is especially important for iMacs and laptops with a single internal drive. The workflow described below, from an earlier post of mine, uses a Referenced-Masters Library.
    I feel pretty strongly that card-to-Aperture or camera-to-Aperture handling of original images puts originals at unnecessary risk. I suggest this workflow, first using the Finder (not Aperture) to copy images from CF card to computer hard drive:
    • Remove the memory card from the camera and insert it into a memory card reader. Faster readers and faster cards are preferable.
    • Finder-copy images from memory card to a labeled folder on the intended permanent Masters location hard drive.
    • Eject memory card.
    • Burn backup hard drive or DVD copies of the original images (an optional but strongly recommended backup step).
    • Eject backup hard drive(s) or DVDs.
    • From within Aperture, import images from the hard drive folder into Aperture selecting "Store files in their current location." This is called "referenced images." During import is the best time to also add keywords, but that is another discussion.
    • Review pix for completeness (e.g. a 500-pic shoot has 500 valid images showing in Aperture).
    • Reformat memory card in camera, and archive originals off site on hard drives and/or on DVDs.
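    For what it's worth, the Finder-copy and completeness-check steps above can also be done in Terminal. A self-contained sketch, using throwaway directories in place of the real card and Masters drive (all names here are hypothetical examples):

    ```shell
    # Self-contained sketch of the card-to-disk copy and completeness check,
    # using throwaway directories instead of a real memory card and drive.
    CARD=$(mktemp -d)        # stands in for /Volumes/<card>/DCIM
    DEST=$(mktemp -d)/shoot  # stands in for the labeled Masters folder

    # Fake a three-image card for the demo.
    for i in 1 2 3; do echo "raw$i" > "$CARD/IMG_000$i.CR2"; done

    mkdir -p "$DEST"
    cp -p "$CARD"/* "$DEST"/    # -p preserves file timestamps

    # Completeness check: counts must match before reformatting the card.
    CARD_N=$(ls "$CARD" | wc -l)
    DEST_N=$(ls "$DEST" | wc -l)
    echo "$CARD_N on card vs $DEST_N copied"
    ```

    The count comparison is the scripted equivalent of the "review pix for completeness" step.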
    Note that the "eject" steps above are important in order to avoid mistakenly working on removable media/backups.
    Also note with a Referenced-Masters Library that use of the "Vault" backup routine backs up the Library only, not the Masters. Masters should be separately backed up, IMO a good thing from a workflow and data security standpoint.
    Max out RAM in your MB and keep the internal drive less than 70% full.
    Good luck!
    -Allen Wicks

  • ELOM Remote Console Scriptability and Remote Storage Questions

    Hello,
    I have a few questions regarding scriptability of the Java Web Start eLOM Remote Console and the use of this application for redirecting remote storage.
    Setup:
    - x6250s in 6048 chassis
    - x6250 Bios version 1ADPI040 & SP version 4.0.52
    - Sun eLOM Remote Console version 2.53.05
    - CentOS 5.2
    Questions:
    - I'd like to write a wrapper script that would allow me to start a remote console on the Linux command line. Then, a command line like "myjavaconsole bladexyz" would give me a java remote KVM without me having to click through the web interface. Is something like this possible? Hints?
    - The [Sun Blade X6250 Server Module Embedded Lights Out Manager Administration Guide|http://docs.sun.com/source/820-1253-14/remote_con.html#0_66586] says that you can use the eLOM Remote Console GUI to redirect storage devices including CD/DVD drives, Flash, DVD-ROM or diskette disk drives, hard drives, or NFS. These seem like very interesting options, but I've only been able to successfully redirect an ISO image. Are these other options really possible?
    - Is it possible to script the mounting/unmounting of remote ISO images or other storage? I would love to be able to control blade boot processes by having this functionality.
    Thank you,
    -Matthew

    It seems the problem is related somehow to the setup of my Windows box. I did try a couple of other Windows boxes with the same result, but everything worked perfectly when using a Linux/ubuntu system to run the remote console. The blade saw the CD, booted from it, and is now happily running ESX Server.
    The weird thing is, the Ubuntu system was running inside VMware Workstation on the same Windows PC that has the problems, and was accessing the same physical CD drive. Sometimes you have to think out of the box, or in this case into a box inside the box :-)
    I guess if these things were all straightforward I'd be out of a job, so I shouldn't complain!
    Steve.

  • Macbook Pro and iCloud storage questions/help

    Hi all, I'm needing help here as my MacBook Pro has become extremely slow and sluggish. Is this because there is too much stored on it? Having iCloud should help free up extra room on the MacBook, should it not?
    Just general help and advice regarding these matters would be appreciated.

    iCloud is a syncing facility, not a separate storage one, and any data on iCloud is duplicated on your Mac. You would do better to buy an external hard disk and use it to store bulky data such as media files.

  • Book printing turnaround time(Canada) and general sharpening question

    Hi,
    I am nearly finished a book in Aperture and have a couple of questions, one specific to Canada and another, more general question.
    For those in Canada (or anybody else with experience), how long does it take for the book to be published and sent out to you? My book project is to be a Christmas gift and I would like to order one copy to proof, but if the turnaround is really long, then I would just order all the copies I need and hope everything turns out satisfactorily.
    The second concerns sharpening. Since the images are resized for the individual use on a page, how do you optimize the sharpening? Is it better to oversharpen, or resize and sharpen each image prior to placing in the book?
    Thanks for any and all help. This is a first book so any other comments or suggestions would be appreciated!
    Scott

    Perhaps I can help with the first question. I tried the same approach and was quite happy with the first book quality; however, when I ordered several of the same book a few weeks later, I was very unhappy with the color casts and how they differed from the original book I had printed. The moral of the story is that the print quality is highly variable between printings.
    Aves

  • Validation Script for Dates and General Event Questions

    I have just started using JavaScript, and am now using some objects and methods etc. that I did not even know about. It's progressing rather well; now I need to know some LiveCycle Designer basics that I can't seem to answer from my searches.
    Here's what I am trying to do in English:
    I want users to choose a date that they will miss at our Farmers Market. I have the date field on the form - works well.
    I want to validate the entry for:
    The date must be today or in the future
    AND
    The date must be before the closing date
    AND
    The date must be a Saturday
    Here's some script I've written and placed in the Validation Event (I have actually written more for testing out that the results are coming out properly):
    ----- form1.#subform[0].Missdate::validate - (JavaScript, client) ----------------------------------
    var entereddate = this.rawValue; // the date the vendor will not attend, as entered ("YYYY-MM-DD")
    var dentry = new Date(entereddate.slice(0,4), entereddate.slice(5,7) - 1, entereddate.slice(8,10)); // month starts at 0!
    var closingdate = new Date(2008, 9, 4); // closing date of the market: October 4, 2008
    var today = new Date(); // "new" is required -- Date() by itself returns a string, not a Date object
    today.setHours(0, 0, 0, 0); // compare whole dates, ignoring the time of day
    // The value of the last expression is the validation result, so use ==
    // (comparison), not = (assignment), and combine all three rules:
    (dentry >= today) && (dentry <= closingdate) && (dentry.getDay() == 6);
    But now -
    How do I actually validate this? My last statement seems to be ignored. How do I force a 'false' being returned? In FormCalc I simply put a comparison statement here, and if it resulted in 'False' validation failed and if it resulted in 'True' it passed. What's the JS equivalent? Or are the variables giving me trouble?
    Maybe I'm putting this in the wrong Event? If so which one should I place it in.
    I want to force the user to enter the correct data - how do I code this - and put in a custom message referring to this. I may even get fancy and ask the user if the next Saturday is what they meant if they enter an incorrect one (this will be a real challenge!)
    I think I'm lacking some basic knowledge here that other posts have assumed. Please refer me to any help pages as well - although I've done extensive searching on this and have not really found a good explanation of validation, only specific pages that are not basic or general enough for my understanding. Thanks!

    In the validation script you have to allow the field's length to be 0, or
    it will not be possible to clear it...

  • Sharing the /home partition and general partition questions

    Hello, I'm new to Arch, but have been using Linux for a few years (albeit still at a beginner level). I'm going to be reinstalling Arch on an old computer that has a 40GB main drive, to dual-boot an "operational" OS for day-to-day stuff that I want to make sure will be running well, and then another OS that I can test on or just use for trying new distros. I also have an 80GB drive that I'll use for data (but I don't think I want that to be my home drive).
    My question is:  If I have two different installations of Arch, (or a second distribution) should they share the same /home partition?  My thought is "no", but I didn't know.
    Also, I'm planning on splitting the 40GB drive the following partitions.  Do these make sense, or would there be a better way to do this? 
    5GB = / (OS #1)
    14.5GB = /home (OS #1)
    5GB = / (OS #2)
    14.5GB = /home (OS #2)
    1 GB = swap (both OSes)
    I have an ancient P4 w/ 512 of RAM.

    Sharing /home partitions would NOT be a good option in your case, simply because you are going to use the 2nd OS for tests/trials. Those other OSes may have different ways of storing config files etc., which may leave you with a lot of junk to parse through. And if you ever use any configs on the test OS that are somehow in conflict with Arch - in any way - you might end up having to re-configure settings for your favorite apps in Arch.
    I have a 30 GB HDD on a 10 yr old laptop which has Arch. This is the partition scheme I have
    ╔═[16:10]═[inxs @ arch]
    ╚═══===═══[~]>> df
    Filesystem Type Size Used Avail Use% Mounted on
    /dev/sda3 ext3 7.0G 1.7G 5.0G 25% /
    none tmpfs 125M 100K 125M 1% /dev
    none tmpfs 125M 0 125M 0% /dev/shm
    /dev/sda4 ext4 16G 850M 14G 6% /home
    /dev/sda6 reiserfs 5.1G 558M 4.5G 11% /var
    /dev/sda1 ext2 61M 12M 47M 20% /boot
    ╔═[21:16]═[inxs @ arch]
    ╚═══===═══[~]>> fdisk -l
    Disk /dev/sda: 30.0 GB, 30005821440 bytes
    255 heads, 63 sectors/track, 3648 cylinders
    Units = cylinders of 16065 * 512 = 8225280 bytes
    Disk identifier: 0x00000080
    Device Boot Start End Blocks Id System
    /dev/sda1 1 8 64228+ 83 Linux
    /dev/sda2 9 726 5767335 5 Extended
    /dev/sda3 727 1640 7341705 83 Linux
    /dev/sda4 1641 3648 16129260 83 Linux
    /dev/sda5 9 73 522081 82 Linux swap / Solaris
    /dev/sda6 74 726 5245191 83 Linux
    ╔═[21:18]═[inxs @ arch]
    ╚═══===═══[~]>>
    Since you have 10GB more than I do, you can adjust accordingly and make partitions for your test OSes as well.
    Last edited by Inxsible (2009-10-08 01:22:30)
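    If it helps to experiment first, the layout proposed in the question can be dry-run with sfdisk against a throwaway image file instead of the real 40GB drive. The tool choice and exact MiB sizes below are my own sketch, not from the thread:

    ```shell
    # Dry-run of the proposed 40GB split on a sparse image file, so no real
    # disk is touched. sfdisk accepts the same input against /dev/sdX later.
    truncate -s 40G disk.img

    sfdisk disk.img <<'EOF'
    label: dos
    ,5120MiB,L
    ,14848MiB,L
    ,5120MiB,L
    ,,E
    ,14848MiB,L
    ,,S
    EOF
    # Partitions 1-3: / and /home for OS #1, plus / for OS #2 (primary).
    # Partition 4 is an extended container holding /home for OS #2 and the
    # remaining ~1GB as swap.

    sfdisk -l disk.img
    ```

    Checking the result with `sfdisk -l` before touching the real disk is cheap insurance.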

  • IPhoto and Images Storage Question

    I've started the task of scanning in a bunch of old negative strips. The strips are not organized, so I never know what's really on them until I've scanned them. What I have been doing is scanning all the negatives to a scan folder on the desktop. When I've scanned them all in, I create other folders on the desktop that are named by the event. After I put all the pics into the appropriate folders, I drag each folder from my desktop into iPhoto. Normally this works fine, but when I try to drag a single pic or a set of pics into an existing event in iPhoto, it creates a new "unnamed event". How can I move these pics over to an existing folder? Also, can I put the folders that I created on the desktop into the trash once the pics have been imported into iPhoto, so I don't have two copies lying around?
    Thanks,
    Jim

    Jim
    You cannot import to a specific event. You can however move pics between events in the iPhoto Window to consolidate them - drag and drop will do it.
    Assuming that you’re using iPhoto in the default mode, then the pics are copied into the iPhoto Library, therefore yes, you can delete the versions on the desktop.
    Regards
    TD

  • Geany Trouble: Compiling Perl Scripts (and general Perl questions)

    I'm brand new to Perl because I hear it's one of the best and so far I've found that it is.  I used to program a little C++ here and there and a while ago taught myself python but I'd have to say Perl is better than both of them.  At any rate, that's neither here nor there.  I use Geany and love it, so I didn't see a reason to switch to something new for Perl.  Unfortunately I'm having some trouble with compiling.  Geany flat out refuses to.  I looked around and didn't find anything on here or in the wiki or on the Ubuntu forums other than one person who said to replace the compile command with the execute command. 
    So, do I even need to compile my Perl scripts or is there something else I'm missing?
    On a somewhat related note, the same user on the Ubuntu forums said that at the top of the code there should be:
    #!/usr/bin/perl -w
    I'm curious if I need the "-w" or what it even does since as of right now I have
    #!/usr/bin/perl
    and executing the program poses no problems.
    Any help is greatly appreciated,
    --Wes

    Well, for one, perl is not a compiled language. You can if you want to, but it gains you few benefits. Google for "perl compiler" if you really must.
    As for the second, do you know how the shebang line works in unix, and do you know what the -w option does for perl? The answer should reveal itself to you.
    Make sure you "use strict;", it will catch many errors.
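    To make both points concrete, here is a throwaway demo. Nothing in it is specific to Geany, though `perl -c` is what people commonly wire into Geany's compile command (e.g. `perl -c "%f"`):

    ```shell
    # "Compiling" a Perl script usually just means a syntax check:
    cat > hello.pl <<'EOF'
    #!/usr/bin/perl
    use strict;
    use warnings;   # modern equivalent of the -w shebang switch
    print "hello\n";
    EOF

    perl -c hello.pl    # syntax-check only; reports "hello.pl syntax OK"
    perl hello.pl       # actually run it

    # -w (or "use warnings") flags dubious code at runtime, e.g.:
    perl -we 'my $x; print $x + 1'
    # emits: Use of uninitialized value $x in addition (+) ...
    ```

    So the script runs fine either way; -w just surfaces warnings that would otherwise pass silently.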

  • WRT310N and general LAN questions

    I am trying to set up a music studio LAN in my home, and I just bought a WRT310N which I am hoping will help me but I'm not sure now.
    I have a Mac PowerBook G4 and a 2003 Dell Windows XP machine that doesn't have built-in wireless, so I've been using a Linksys USB adapter to access 802.11 wireless internet from my housemate's DSL connection.
    The problem is that I need Ethernet 802.3, 100Mbps connection to run the MIDIoverLAN software that I bought; gigabit would be preferred, and that's why I bought the WRT310N.
    Q: Is it possible for me to just use the WRT310N for my LAN, while accessing the internet on the other network? Linksys tech support just told me I should get an access point, but 802.11 wireless isn't what I'm looking for.
    Q: Is there a more detailed way of seeing the stats on the WindowsXP computer than opening "Local Area Network Connection". When I have a cable between the 2 computers it says it's connected at 100Mbps, but I'd like to know more about what that means. Also, is that the maximum speed that an ethernet cable can carry?
    Q: Do I really need a DSL modem to set up this router?
    Q: What kind of ethernet card do I need in my WindowsXP computer to access this router at 100Mbps or higher?
    Thanks.

    If you require speeds of more than 100 Mbps, you should install a gigabit network adapter. You should also have your own modem, so that you can have internet on your computers as well as a private network.

  • Twisted Framework and general mpkg question

    Hello... I recently got a Mac and I have OS X Leopard. I am a Python developer and I use the Twisted framework; I ran into a similar problem when I was working with Ubuntu Hardy...
    I downloaded the Twisted framework for OS X, which is listed as version 8.2, and I installed the mpkg, and the framework works well. However, my project relies on the PythonLoggingObserver call, which it says is missing, but on a production Linux server running the same version 8.2 the problem doesn't exist. So I printed twisted.__version__ and that indicated that apparently I have Twisted 2.5 installed. Has anyone else had that problem, and is it OS X specific? I'm not sure...
    Also, does anyone know a good tool to remove mpkg-installed libs/apps? I want to remove Twisted now, but I have no idea where it was installed... any info would be helpful.
    Thanks

    Post developer queries to the appropriate forum under OS X Technologies.
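    As a footnote on the version-mismatch part: checking which file the interpreter actually imports usually pinpoints a stale copy. A sketch (python3 is used here so it runs anywhere; the Twisted line is guarded since it may not be installed):

    ```shell
    # Print which file a module is imported from -- a stale path here would
    # explain seeing 2.5 instead of 8.2. "email" stands in as a module that
    # exists in every Python install, so the first line always works.
    python3 -c 'import email; print(email.__file__)'

    # The same check for Twisted itself (guarded: it may not be installed):
    python3 -c 'import twisted; print(twisted.__version__, twisted.__file__)' \
      2>/dev/null || echo "Twisted not importable with this interpreter"
    ```

    The printed path also tells you where an mpkg dropped the library, which answers the "where is it subinstalled" question.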

  • General Storage Question...

    We need to store raw footage that will be used by several machines on our network (G4's and G5's). Is an XServe RAID the way to go? Will several video editors be able to access the DV files on the RAID without a big slow down?

    If you need simultaneous access by multiple users, you may need Xsan.
    If it's DV only, there's a slight chance that you won't, but simultaneous video access by multiple users off the same volume is Xsan territory.
    www.apple.com/xsan
    One thing I'll note out front about Xsan -- I STRONGLY (I can't state it strongly enough, so I'll say it again) STRONGLY recommend you not cut corners. Two dedicated metadata controllers, a separate, isolated metadata network, and at least 2 Xserve RAIDs are the minimum reasonable configuration.

  • HT201269 When I try to set up my new iPad Air, I go through all the steps for the iCloud sign-in and choosing security questions and whatnot. But after I hit agree to the terms and conditions... it says the Apple ID could not be created because of a server error

    When I try to set up my new iPad Air, I go through all the steps for the iCloud sign-in and choosing security questions and whatnot. But after I hit agree to the terms and conditions... it says the Apple ID could not be created because of a server error. I have no clue what to do... I've restarted the iPad and get the same message. But my internet works just fine.

    1. Turn router off for 30 seconds and on again
    2. Settings>General>Reset>Reset Network Settings

  • Workflow without 'save as' and huge storage consumption.

    I use Aperture & Photoshop daily. Aperture no longer has 'Save As'... Photoshop hasn't yet followed suit, I am told. I am very concerned about workflow changes incurring more complexity, more steps, and huge storage waste.
    Current workflow: I open one of the apps, then an image, make a bunch of edits, and 'Save As' my final desired result. I now have 2 files: the UNTOUCHED original and the edited second image.
    I've made several calls to Apple. They are not sure how the missing 'Save As' fits into this workflow, do not know how many more steps will be needed, or how much more complex it will be. They did say, for the first time, that you now have to check out each program to see how its 'Save As' concept will work... I guess, and try to remember them all. WOW!
    Also, with the 'versions' concept and processing raw images, instead of 2 files residing in storage at the end of the multiple edits, there will be a version for each edit step, meaning there could be like 20, 50, or even more huge files... not wanted, not needed, but consuming huge amounts of storage.
    It was finally decided there had to be a way to purge all those unnecessary files to win back storage space. More work - and how?
    Their final assessment: you may not be able to proceed with Lion... with the future Mac's ongoing OS. I guess all photographers are at a loss then. Sad.
    Is this hopefully just a bad dream??

    The new Autosave/Versions paradigm is ideal for the situation you describe as “Current work flow”. It’s not a bit more complicated, it’s safer, and — if you can get past the fact that it is different from what you are used to — it is more logical. What you are doing is creating a new, modified copy of the original. So, open the original and Duplicate (File > Duplicate). That creates a new copy that you can edit, but it is treated like a new file, so when you close it, you get a Save As dialog. Or you can save it manually at any time just like any new file. No extra steps — just Duplicate at the beginning instead of Save As at the end.
    I said it is safer. Consider the following: In your current work flow: by accident or absent mindedness, you mistakenly Save rather than Save As. Now your original is clobbered. Of course, if you are smart, you have a backup. Otherwise it is gone. In the new workflow you might forget to duplicate at the beginning. But it’s not too late! Go ahead and duplicate later. When you duplicate an edited file you get a dialog asking what to do with the original. The choices are to leave it in its edited state or to revert to its original state. Even if you close the file without duplicating it, you can still re-open it and revert using Versions.
    As Pondini pointed out, you misunderstand how versions work. Versions are not files. It is more like a super sophisticated Undo. As you edit a file, only the latest version is saved in the file. Enough information to reconstruct intermediate versions is saved in a hidden versions database. So in your scenario, you still have only 2 files plus some information in the database. In general the versions information should be efficiently stored.
    This is how Aperture has always worked. As Terence Devlin pointed out, no version of Aperture has ever had a Save or Save As command. I wouldn’t be surprised if Aperture served as an inspiration or testing ground for Autosave/Versions. I am a serious amateur photographer, and I am not the least bit sad. I think this is a great advancement.

  • Adding a RAID card to help speed up export (and other drive question) in Premiere Pro CC

    First of all, I have read Tweakers Page exporting section because that is where my primary concern is. First my questions, then background and my current and proposed configurations:
    Question 1: Will adding a hardware RAID controller, such as an LSI MegaRAID remove enough burden from the CPU managing parity on my software RAID 5 that the CPU will jump for joy and export faster?
    Question 2: If true to above, then compare thoughts on adding more smaller SSDs for either a one volume RAID 0 or smaller two volume RAID 0 to complement existing HDD RAID 5. That is, I'm thinking of buying four Samsung 850 Pro 128 GB SSDs to put in a four disk volume to handle everything (media/projects, media cache, previews, exports), or split it up into two volumes of two disks each and split the duties, or keep the four disk volume idea and put the previews & exports on my HDD RAID 5 array.
    The 850s are rated at SEQ read/write of 550/470 MB/s, thus I could get around 2000/1500 MB/s read/write in a four-disk RAID 0, or half that if I split into two volumes to keep volumes from reading and writing at the same time - if that really matters with these SSDs?
    The Tweaker's page made a few comments. One is splitting duties among different disks, rather than a large efficient RAID may actually slow things down. Since the SSDs are much faster than a single HDD, I'm thinking that is no longer accurate, thus I'm leaning toward the Four disk configuration putting OS & Programs on C drive, Media & Projects on D (HDD RAID 5), Pagefile & Media Cache on SSD (2-disk RAID 0) and Previews &Exports on 2nd SSD RAID 0 (or combine the two RAID 0's and their duties).
    Just trying to get a perspective here, since I haven't purchased anything yet. Any experience/stories, I would appreciate.
    My current drive configuration:
    My D drive is software RAID 5 consisting of four 1 TB Western Digital RE4 (RED) 7200 RPM HDDs with a CrystalDiskMark SEQ Read/Write of 339/252 MB/s.
    The C drive is SSD 500 GB (Samsung 840 (not Pro) and does 531/330 MB/s. My OS, Program Files and Page File are on C, and data/media files/project, etc all are on the RAID drive.
    Problem:
    Current setup allows for smooth editing, only the exporting seems slow, often taking between two and two and a half times the video length to export. Thus a 10 minute video takes 20-30 minutes to export. 15 minute video can take 30-40 minutes to export. The first 10% of the two-pass export takes under a minute (seems fast), but it gets slower where the final 10 or 20% can hang for many minutes like my system is running out of steam. So where is the waste?
    I have enabled hardware acceleration (did the GPU hack since my GPU isn't listed) and it may spike at 25% usage a few times and eat up 600 MB of VRAM (I have 2 GB of VRAM), otherwise it is idle the whole export. The CPU may spike at 50% but it doesn't seem overly busy either.
    Our timeline is simple with two video streams and two audio streams (a little music and mostly voice) with simple transitions (jump cuts or cross dissolves). We sometimes fast color correct, so that might use the GPU? Also, since we film in 1080 60P and export 1080 29.97 frames/sec, I think that is scaling and uses the GPU. I know without the GPU, it does take a lot longer. I have ruled out buying a faster GPU since it doesn't appear to be breaking a sweat. I just need to know if my system is bottlenecked at the hard drive level because I'm using software RAID and my disks are slow and will hardware RAID significantly reduce the CPU load so it can export faster.
    Our files are not huge in nature. Most our clips are several MBs each. Total project files are between 5 GBs and 10 GBs for each video with Windows Media File export being 500 MB to 1.2 GB on average. We shoot using Panasonic camcorders so the original files are AVCHD, I believe (.MTS files?).
    Considerations:
    1. I'm thinking of buying (and future proofing) an LSI Logic MegaRAID 9361-8i that is 12Gb/s SAS and SATA (because some current SSDs can exceed the 6Gb/s standard).
    2. I'm not replacing my current RAID 5 HDDs because not in my budget to upgrade to 6 or more large SSDs. These drives are more important to me for temporary storage because I remove the files once backed up. I don't mind a few inexpensive smaller SSDs if they can make a significant difference for editing and exporting.
    I can only guess my HDD RAID is slow but the CPU is burdened with parity. I would imagine running RAID 10 would not help much.
    My setup:
    CPU - i7-3930K CPU @4.5 GHz
    RAM - G.SKILL Ripjaws Z Series 32GB (4 x 8GB) DDR3 2133 @2000
    Motherboard - ASUS P9X79 WS LGA 2011
    GPU - Gigabyte GeForce GTX 660 OC 2GB (performed the compatibility list hack to enable hardware acceleration).
    C drive - 500 GB Samsung 840 SSD (Windows 7 Pro 64 bit and programs).
    D drive - four 1 TB WD RE4 Enterprise HDDs 7200 RPMs in software RAID 5
    Case - Cooler Master HAF X
    CPU Fan - Cooler Master Hyper 212 EVO with 120 mm fan
    Power Supply - Corsair Pro Series AX 850 Watt 80 Plus Gold
    Optical Drive - Pioneer BDR-208DBK
    Thanks in advance,
    Eric

    Software RAID 5 off the motherboard? Not a good idea, from what I have read on this forum from experts like Harm Millard and others. They have mentioned a large CPU overhead with that configuration, causing sub-par performance. RAID 0 off the motherboard does not have this problem and would provide optimum speed, but with the risk of total data loss if any drive fails. You may wish to reconfigure your array as RAID 0, but you would then need to back up its entire volume diligently and frequently, perhaps onto a quality 4 TB drive.
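To illustrate where that RAID 5 overhead comes from: every stripe write requires computing a parity block, which is an XOR across the data blocks, and software RAID does this on the host CPU while a hardware controller offloads it. A minimal sketch of the parity math (illustrative helper names, not any actual RAID implementation):

```python
def parity_block(data_blocks):
    """XOR all data blocks together to produce the RAID 5 parity block."""
    parity = bytearray(len(data_blocks[0]))
    for block in data_blocks:
        for i, byte in enumerate(block):
            parity[i] ^= byte
    return bytes(parity)

def rebuild_block(surviving_blocks, parity):
    """Recover a lost data block by XORing the parity with the survivors."""
    return parity_block(list(surviving_blocks) + [parity])

# Three data drives plus one parity block per stripe (a 4-drive array
# like the poster's rotates parity, but the per-stripe math is the same).
data = [b"aaaa", b"bbbb", b"cccc"]
parity = parity_block(data)
assert rebuild_block([data[0], data[2]], parity) == data[1]  # drive 2 recovered
```

That per-byte XOR loop runs on every write the array services, which is exactly the work a hardware RAID card's processor takes off the CPU.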
    A lot depends on the current and future codecs you plan to edit. You may not want to sink a lot of money into an older setup that may struggle with more demanding future codecs; for now, in the 1080p realm, your rig should be OK. The read/write performance of your current RAID 5 setup is not great, though, and a definite drag on performance. The rest of your components appear fine. The Samsung 840 SSD, though not ideal, is OK; its write speed is way lower than the Pro model's, but the drive is used mainly for read operations. Since you have Windows 7 Pro and not Windows 8, you can put the entire Windows page file onto the RAID 0 you might create, taking that frequent read/write load off the SSD. Read the "Tweakers Page" to see how best to tune your machine.
    To use your current setup most efficiently without investing much money, you would:
    a. Create the RAID 0 off the motherboard and put all media and project files on it.
    b. Install a quality 7200 RPM 4 TB HDD to serve as a backup of the RAID array.
    c. Install a Crucial M550 256 GB or larger SSD (close in performance to the Samsung 850 Pro, much cheaper) to hold all previews, cache, and media cache files, and to use as the global performance cache for After Effects if you use that program.
    d. Export to another Crucial M550 for best speed, or to either the first Crucial or the 4 TB drive.
    Your current GPU will accelerate exports on any video involving scaling or GPU-accelerated effects. The CPU is still important for serving data to and from the GPU and for decoding and encoding non-GPU-handled video, and your high CPU clock speed helps performance there. You may also want to look at overclocking your video card with MSI Afterburner or a similar free program; increasing the memory clock speed can raise performance and cut export times on effect-heavy timelines or scaling operations.
On my laptop, I export 25% faster after doing this. My new i7-4700HQ laptop exports in the range of your current machine, about 2 to 3 times the length of the original video. Properly set up, your desktop should blow this away!
        Visit the PPBM7 website and test your current setup to identify bottlenecks or performance issues, then re-test after making improvements to see how it does. Be aware that new codecs are coming (H.265/HEVC, etc.) which may demand more computer horsepower to edit, as they are even more compressed and engineered for streaming high quality at lower bandwidth on the internet. The new Haswell-E, with its quad-channel memory, 8-core option, and large number of PCIe gen 3 lanes, goes further in being prepared for 4K and beyond. Testing by Eric Bowen has shown that newer Premiere Pro versions process 4K much better than older versions.
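Before spending on a hardware controller, it is also worth putting a rough number on the current array's sequential write speed. PPBM7 or a dedicated disk benchmark will measure this far more thoroughly; the snippet below is only a quick sanity-check sketch, and the target directory is an assumption (point it at the RAID volume, e.g. D:\, rather than the temp directory used here):

```python
import os
import tempfile
import time

def sequential_write_mb_s(path, size_mb=256, block_kb=1024):
    """Write size_mb of zeros in block_kb chunks and return MB/s."""
    block = b"\0" * (block_kb * 1024)
    blocks = (size_mb * 1024) // block_kb
    fname = os.path.join(path, "bench.tmp")
    start = time.perf_counter()
    with open(fname, "wb") as f:
        for _ in range(blocks):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())  # force data to disk, not just the OS cache
    elapsed = time.perf_counter() - start
    os.remove(fname)
    return size_mb / elapsed

if __name__ == "__main__":
    # Replace with the RAID volume to test it instead of the temp drive.
    print(f"{sequential_write_mb_s(tempfile.gettempdir()):.0f} MB/s")
```

If the result is well below what four 7200 RPM drives should deliver in parallel, the array (or the parity overhead) is a likely bottleneck; if it is healthy, the export slowdown lies elsewhere.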
