Hardware config.vi

Hi,
Which VI in DAQmx can substitute for "Hardware Config.vi" from Traditional DAQ? I can't find it in DAQmx.
A snapshot of my VI is below, and the VI itself is attached.
Thanks in advance.
Attachments:
STIM9.28P_1.vi ‏286 KB

You can get a description of what it does by turning on context help. It looks like it performs about the same function as the DAQmx Create Channel when it's inside a for loop and passed an array of channels.
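For readers moving the same pattern to a text-based API: a DAQmx channel-creation call can accept a comma-separated physical-channel string, so the per-channel loop often collapses into a single call. A minimal sketch in Python; the helper function and the device name "Dev1" are illustrative assumptions, not NI library calls:

```python
# Hypothetical helper: map Traditional-DAQ channel numbers onto
# DAQmx-style physical channel names. "Dev1" and this function are
# assumptions for illustration, not part of any NI API.
def to_physical_channels(device, channels):
    """Build the comma-separated physical-channel string that a
    single DAQmx channel-creation call can accept."""
    return ",".join(f"{device}/ai{ch}" for ch in channels)

print(to_physical_channels("Dev1", [0, 1, 3]))
# Dev1/ai0,Dev1/ai1,Dev1/ai3
```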

Similar Messages

  • Error -10401 occurred at AI Hardware Config when using AI Acquire Waveform vi

    I can successfully run applications that use the AI Sample Channel vi; however, when I use the AI Acquire Waveform vi the following errors are generated:
    Error -10401 occurred at AI Hardware Config, followed by
    Error -10401 occurred at AI SingleScan.
    All applications have been developed using LabVIEW 6.0.2 on an NT machine. The applications are then run (or at least attempted to be run) using the LabVIEW Run-Time Engine on a different NT machine.
    The driver software I am using is the NI-DAQ Native Data Acquisition Driver Software Version 6.1.

    Hi,
    I've found a KnowledgeBase article on the NI website describing some situations where this problem occurs:
    http://digital.ni.com/public.nsf/websearch/46F78EDD19BA0CC386256CEE007201FC?OpenDocument
    That error code is generally seen when something has changed in the DAQ card's configuration, or the drivers are not installed properly. It's strange that this is showing up only on certain functions for your application.
    Also try having a look through the DAQ troubleshooting pages on the website:
    http://www.ni.com/support/daq/
    Regards,
    Mark Lee
    National Instruments

  • I obtain error -10003 from AI hardware config when I use AC coupling

    I am using a function generator to produce a sine wave and I want to acquire the data. However, error -10003 occurs at AI Hardware Config when I use AC coupling, and I am not sure why.
    Attachments:
    Motor_Current_Measurement2.vi ‏133 KB

    What model DAQ board are you using? Only a couple of NI boards support AC coupling.

  • How can AI Hardware Config.vi be added to Acquire&Proc N Scans - Trig.vi to amplify gain?

    I am using the Acquire&Proc N Scans - Trig.vi example program. I tried adding AI Hardware Config.vi to the block diagram to increase the ECG waveform gain, but it seems to have no effect. How can I amplify the gain of the waveform? Thanks for the help.
    Attachments:
    Scan_Trigger.vi ‏130 KB

    If you aren't using virtual channels...
    Using the original Acquire&Proc example program, you can set the gain by entering values into the Input Limits. LabVIEW will automatically scale the results. So if your signal is 0-10V, put 0 as the minimum limit and 10 as the maximum limit. Typically, you'll use the range of whatever signal you're measuring, but if a higher gain is needed, use a smaller range (like 0-1V).
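As a rough illustration of what the driver does with those Input Limits: it picks the highest gain whose input range still covers the limits you entered. A sketch, assuming a ±10 V board and an illustrative gain list (the gains are not taken from any specific board's spec):

```python
# Illustrative gain list; real boards publish their own set.
GAINS = [0.5, 1, 2, 5, 10, 20, 50, 100]

def pick_gain(lo, hi, full_scale=10.0):
    """Return the largest gain whose range (+/- full_scale/gain)
    still covers the requested input limits [lo, hi]."""
    usable = [g for g in GAINS if full_scale / g >= max(abs(lo), abs(hi))]
    return max(usable)

print(pick_gain(0.0, 10.0))  # 1  -> +/-10 V range
print(pick_gain(0.0, 1.0))   # 10 -> +/-1 V range, 10x finer resolution
```

A smaller requested range lets a higher gain be selected, which is why shrinking the Input Limits effectively amplifies the signal.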
    See also:
    http://sine.ni.com/apps/we/niepd_web_display.DISPLAY_EPD4?p_guid=B45EACE3E8A156A4E034080020E74861&p_node=DZ52300&p_submitted=N&p_rank=&p_answer=&p_source=External
    2006 Ultimate LabVIEW G-eek.

  • Recommended Hardware Config for huge OLAP Cube build

    Hi David ,
    Can you please provide the recommended hardware config for a cube whose fact table holds billions of rows? We ran a cube with 0.1 billion rows and it took around 7 hours to process. What are the key areas to target for performance gains, and what CPU and RAM (server and Oracle DB) would give the biggest benefit in such configurations?
    Also, we have a 32-bit Windows 2003 server. Can we expect better results if we switch to 64-bit?
    Thanks in advance,
    DxP.

    Hi!
    Well, I would definitely recommend engaging a consultant, because it sounds like you are short on methodology and experience while also under time constraints on this project.
    Regarding hardware, unfortunately I can't give you any precise figures because I have no supporting information. What you should bear in mind is that your system must be balanced. You (better, with a consultant) need to find the right balance between all the factors you consider important while building a pile of hardware:
    - cost
    - architecture
    - speed
    - availability
    - load scenarios
    etc...
    Regarding the architecture point, bear in mind that today, finding the right balance between processing power and storage throughput is a challenge. Pay attention to this.
    Regards,
    Kirill

  • Cheap RAC Hardware Config

    I have read the paper on building a cheap RAC on Linux using FireWire technology. I want to build one, strictly for learning purposes. Can someone post the complete, detailed hardware specs I would need to buy to make this possible? I would appreciate it if someone who has already built one could post their detailed hardware config.

    There are some links on this page that show the actual hardware used in our setup at Oracle World.
    http://otn.oracle.com/tech/linux/install/ow2003sf/OTN_firewire_demo_req.html
    The particular model of HD used has been discontinued, but as long as you have a FireWire HD with an Oxford 911 chipset you should be OK.

  • Regarding Hardware Config. of  XI

    Hi all,
    Can someone give me a brief idea of the hardware configuration for XI, if there are four interfaces and the message size is <= 1 MB?

    Hi all,
      Thanks for reply..
    Actually I need a brief idea about the hardware config (RAM, hard disk, memory, additional installation for XI, etc.).
    There will be four interfaces (Legacy, RDBMS, Web App, and SAP, with XI in between) and the message size will not be greater than 1 MB. The communication will be synchronous in the case of online interfaces.
    I have seen the installation guides but need a practical scenario.
    It would be of great help if someone could give estimates in figures based on their own project experience.
    Regards,
    Shikha

  • Hardware config of BDB

    Hi Guys,
    We are planning an archive project. The situation is:
    1. Around 100 TB of archive data to be stored in BDB;
    2. Around 50 concurrent users querying the database.
    I am not sure about the hardware config: CPU count, memory size, etc.
    Do you have any suggestions or references about it?
    Thanks a lot.


  • I am having problems setting the gain programmatically from LabVIEW 6, using the Hardware Config vi, for a PCI-6110E DAQ board.

    NB: This has also been posted to High Speed Digitizers forum.
    Changing the "gain" (part of the "alternate input limits" cluster) appears to affect the group settings, but when I pass a signal outside the supposed new voltage range, the signal doesn't clip as I would expect. This doesn't happen when I use MAX (where clipping is clearly visible), but unfortunately that is not an acceptable long-term solution. Any thoughts? I am only using allowable gain settings for the 6110E.
    Additionally, I am saving my data in hsdl format, and to get the actual true voltage from the binary numbers (after uncompressing and converting back to standard binary format) I need to multiply the final number by the gain. Should this be correct? I thought the "scale multiplier" (part of the "group settings" cluster) compensated for gain, but it doesn't seem to do so. Is this part of the same problem?

    Thanks Spencer. I do call Hardware Config.vi in several different places, so maybe it's the multiple calls that are stuffing things up. I will investigate further.
    As for the multiplier, which bit sounds correct? The fact that I have to multiply by the gain, or that it should take this into account automatically and therefore I should not have to multiply by the gain (which is what the default hsdl code does)? What do you do to convert from binary to real?
    Thanks for your help,
    Jo
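A sketch of the binary-to-volts arithmetic discussed in this thread, assuming a 12-bit converter and a bipolar ±10 V input range (both assumptions; check the board's spec). In this model the volts-per-code step is divided by the gain, which is what a "scale multiplier" that already compensates for gain looks like; if the stored codes were scaled this way, multiplying by the gain again would double-count it:

```python
# Sketch only: the 12-bit depth and +/-10 V bipolar range are
# assumptions for illustration, not confirmed 6110E specs.
def code_to_volts(code, gain, bits=12, full_scale=10.0):
    """Convert a raw ADC code to volts. The per-code step (LSB)
    shrinks as gain rises, so gain is compensated inside this
    conversion rather than applied afterwards."""
    lsb = (2 * full_scale) / (2 ** bits)  # volts per code at gain 1
    return code * lsb / gain

print(code_to_volts(2048, gain=1))   # 10.0
print(code_to_volts(2048, gain=10))  # 1.0
```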

  • LR + PS Optimizing Hardware Config on my Win7 PC

    Dear All,
    I'm looking to speed a few things up in terms of my hardware/software performance for photo editing (I'm a pro photographer).
    Here is my current hardware profile (assembled by an individual contractor - not a brand):
    Win 7 Ultimate 64bit
    12 GB RAM (Standard issue kingston DDR3. Not that fancy Gaming Ram).
    Intel Core i7 3770k CPU 3.50ghz (3rd Gen)
    AMD 7750 1GB VGA (Display Memory: 2775 MB  Dedicated Memory: 999 MB Shared Memory: 1776 MB)
    USB 3.0 ports
    Page File: 3737MB used, 14518MB available (Details of Custom pagefile config is next to each drive that is set to have a pagefile)
    -Lightroom 4.1 (Multiple Catalogues - One for each shoot (+preview folder) located in the Main folder with the corresponding photos). 60% of all photos are processed and edited with this program.
    -Autowrite XMP turned off
    -Photoshop CS5 & CS6 30% of all photos are processed with CS5. 10% with CS6 (I can't get some filters (ColourEfex 2) to work with CS6 and haven't bothered sorting that out yet).
    -I defrag regularly with MyDefrag
    -Multitasking involves running LR, PS and Chrome/Gmail simultaneously.
    -Each folder per shoot can range from 4GB - 50GB of Raw/DNG files.
    Hard Disks:
    (A) 1TB Sata/300 Drive: Windows/OS, LR, PS CS5 & PS CS6 drive, partitioned 300GB for the OS and 630GB for the other partition (B), which holds some music (400GB free). (Overkill, I know, but this was the only drive I had at one point...) (Pagefile 2000mb)
    (C) 2TB Sata/600 Drive: Primary DATA drive, where Folders and their respective LR catalogues are stored and accessed. (Freespace ranges from 850GB-1.1TB) (Pagefile 2000mb)
    (D) 2TB Sata/600 Drive: Backup of (E). These should be mirrored but it didn't happen that way for some reason...! (Mostly as above)
    (F) 500gb Sata/300 Drive: This was an older drive that ended up as a PS scratch disk (The only one). It is mostly used for movies and LR catalogue backups. The LR Camera Raw Cache is also here (It's set to 100GB). (Freespace: 300GB) (Pagefile 2000mb)
    Here are the intensive processes I wouldn't mind speeding up:
    1) Importing and converting 1000's of 5D Mark 3 CR2 (RAW) files into DNG
    2) Rendering 1:1 previews
    3) Exporting apprx 500-800 full-res or Resized Jpegs from DNGs at a time
    4) Exporting apprx 100-200 ZIP compressed tifs at a time
    5) Saving Tifs with ZIP Compression in Photoshop CS5 & CS6 (If I'd like anything to be faster - it would be this. It can take anywhere from 20 seconds to 1 1/2 minutes to compress (ZIP) and save a flattened Tif with only minor changes, e.g. cross-process color changes. Resulting Tif files range from 60mb to 120-180mb). I don't have too many large (500GB+) layered Tif files on this computer so we can discount that. I also don't work with video or panoramas (for the foreseeable future anyway!)
    6) Batch processing SilverEFX Pro actions onto DNGs which are then saved as Tifs (as above in no 5)
    *OS and Application start times are quite fast and are not an issue*
    -The Usual process is to Export and Save images to the same harddrive as the Source file eg. a DNG on Drive (C) will be exported as a JPG to Drive (C)
    I find no difference in terms of speed when Exporting from (C) to (D). Especially when you factor in the time it takes to copy the exported files back to (C) so that they're located in the same folder.
    The Questions:
    Q1: Does this setup make sense esp with regards to the Win Pagefile? Or is there something that isn't quite recommended (bottlenecking)?
    Q2: If so how can I improve it? How might an SSD work best to improve performance esp with regards to point 5) - saving Tifs in PS? Or in LR perhaps as an Export or Save Disc? This may prove cumbersome as it requires the files to be moved back to disc (C) but if the performance gain is significant then whoopee!
    Unfortunately SSDs are fairly pricey in the country I live in. A 120GB drive will cost about $160 (affordable) and my current OS drive (A) is already at 150GB of usage (although this can be trimmed down). Mostly I'd rather not have to reinstall Windows and all my PS filters, fonts, minor programs etc., but if the performance gain is worth it I could consider it.
    The next level of SSD is at a currently unaffordable US$380.
    I'd be grateful for any thoughts, this machine has come a long way since it was first assembled...!
    Thank you in advance,
    KC

    Hello Epoch,
    I think there are not many things you can do to speed up your workflow. A few suggestions:
    1) Importing and converting 1000's of 5D Mark 3 CR2 (RAW) files into DNG
    -> Use an internal USB3 card reader (reads more than 90 MB/s on my SD card)
    2) Rendering 1:1 previews
    -> Perhaps skip the generation of these files on import and first select the images to delete. After deletion you can generate the previews.
    3) Exporting apprx 500-800 full-res or Resized Jpegs from DNGs at a time
    -> Exporting images is recorded inside the catalog; the catalog is updated for every image you export. Therefore put the catalog on your fastest drive!
    4) Exporting apprx 100-200 ZIP compressed tifs at a time
    -> From your setup, I saw that you are using a 3770k @ 3.5 GHz. Overclocking? My 2500k has been running @ 4.3 GHz for more than a year and a half.
    I tried to speed up LR myself by tweaking Windows 7; that did not really work for me.
    You can try LR 4.2. There are some reports that it is faster than 4.1.

  • Hardware Config for MI server!

    Hi All,
    I need information on the hardware configuration for an MI server that will run SAP NetWeaver Mobile 7.1 and support xMAU 3.0 SP5.
    Can anyone throw some light on this?
    Thanks in advance!
    Kanwar

    Hi Kanwar,
    Check this link in SMP for MAU specific information.
    https://websmp105.sap-ag.de/~form/sapnet?_SCENARIO=01100035870000000202&_SHORTKEY=01100035870000694050
    For general MI queries, please check the MI FAQ section https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/cc489997-0901-0010-b3a3-c27a7208e20a
    Regards
    Ajith

  • Care to share your server/hardware config for Spatial?

    Hi there - we're in the early stages of planning for a new spatial system (geocoding a large volume of existing address data), and we'd like to get a sense of what different organisations might be using by way of server architecture for a high-availability, large volume dataset GIS where your typical spatial query is a within-polygon or buffer-type search of geocoded address data, and you might need to support several thousand of these per hour... (for example).
    Would you share your platform/CPU/OS/memory/storage config comments as they relate to a dedicated Oracle Spatial 10g DBMS?
    Or perhaps just suggest what you had to do in addition to the baseline 10gDBMS requirements to give Spatial room to perform well...
    Any comments are accepted with humble thanks.
    Kind regards
    Dennis

    Dennis,
    Several thousand, say 3000 per hour, is less than one a second. I'm not sure how much data you have (define "large"), but in general I'd try to keep all, or most, of the data in RAM to get the best performance possible. A Linux box with, say, 512GB of RAM is entirely reasonable to use. If your search pattern is geographically time-sensitive (moves based on time of day), you could also effectively use partition pruning to reduce the amount of data to be searched.
    This will give you a single, fast DB. For HA you need at least two, depending on how you define HA.
    Bryan

  • What's a decent hardware config for photo/video editing ?

    I apologize if this has been asked a million times before, but I'm a newcomer. I'm seriously thinking of making the switch from the dark side and buying my first Mac - the iMac 20". My main priorities will be photo editing of both JPEG and RAW images from my Nikon D50 and (at last) editing all those video tapes I made of the kids and family vacations over the last 15 years. I'm not a professional photographer or videographer, but I don't want my next machine to be underpowered and cause me to get frustrated while waiting for it to do the work and therefore lose interest. I'm kind of a hardware geek, so my philosophy is slanted towards more is better - more disk space, more RAM, yada yada yada. From what I've read on forums and at the dpreview website, a lot of people recommend more system RAM, so I'm confident 2 GB of system RAM is a good place to start. However, I haven't seen much talk about 128 vs 256 MB of RAM on the video cards. That's a no-brainer for gaming - my son has a PC with 256 MB on a GeForce 7600 and he's happy with the performance, at least until the next generation of games comes out. However, I don't want to confuse the needs of gaming with those of video/photo editing.
    Is there a benefit to upgrading the graphics card to 256 MB ? In the near future, I would like to buy additional software, like Aperture, Nikon's Capture NX, or Final Cut.
    I realize that upgrading the video memory by the user is not easy on the iMac, so any hardware items that I would like to upgrade should be done at time of ordering from Apple through the Apple Store.
    Thanks much, and looking forwarding to a new Mac soon.
    Bill

    If you plan to use Final Cut I would strongly suggest the graphics card upgrade to 256 MB. 2 GB of RAM will be good. This varies from person to person, but the processor upgrade could be useful; it does cost a lot, though.

  • Changed my hardware config, need help changing my overclock.

    About a year ago I completed my new OC system using a considerable amount of input here.  It has run well overall.  However, my original intent was to use the PC solely for productivity apps and internet access.  Recently, my XBox 360 died (you're shocked, I know) and I picked up Rome Total War to pass the time.  It didn't take me very long to realize the video card I was using wasn't adequate.  Here's my original system config (100% stable):
    MSI K9N SLI Platinum nForce 570 BIOS 1.8
    AMD Athlon 64 X2 5200+ 3.12Ghz (240*13) @ 1.45v
    SuperTalent 1GB DDR2 800 445.8Mhz DDR2 (CPU/7) @ 1.90v, 5-5-5-15-20-2T timings
    SuperTalent 1GB DDR2 800 445.8Mhz DDR2 (CPU/7) @ 1.90v, 5-5-5-15-20-2T timings
    MSI NX8500GT-TD256E 576Mhz core, 512Mhz GDDR2
    XClio Stablepower 500W +3.3V@30A, +5V@30A, +12V1@18A, +12V2@18A
    I purchased an MSI NX8800GTS 320M OC to replace the NX8500GT.  The NX8800GTS uses a dedicated 6-pin PCIE power connector, while the NX8500GT does not.  In addition, the 8800GTS generates a lot of heat.  The 8800GTS was increasing ambient case temp, which in turn was increasing core temp to the point that the system was becoming unstable after long periods of playing RTW.  I decided to lower my overclock in an effort to reduce voltages and hopefully reduce system temps across the board (I can't add any more cooling; I already have 3 fans plus a Zalman CPU cooler).  Here's my current config:
    MSI K9N SLI Platinum nForce 570 BIOS 1.8
    AMD Athlon 64 X2 5200+ 2.99Ghz (230*13) @ 1.35v
    SuperTalent 1GB DDR2 800 427.1Mhz DDR2 (CPU/7) @ 1.80v, 5-5-5-15-20-2T timings
    SuperTalent 1GB DDR2 800 427.1Mhz DDR2 (CPU/7) @ 1.80v, 5-5-5-15-20-2T timings
    MSI NX8800GTS 320M OC 630Mhz core, 930Mhz GDDR2
    XClio Stablepower 500W +3.3V@30A, +5V@30A, +12V1@18A, +12V2@18A
    Here's where my question comes from:
    I ran Orthos small FFT test, priority 9 with the above settings for 12 hours -- no errors.
    I ran memtest86+ test #5, 24 passes -- no errors
    However, I cannot get the Orthos large FFT test or blended CPU + RAM test to run for more than a few hours without starting to error out.  I'm pretty sure the CPU is stable because of the small FFT test.  So I think the problem is my memory.
    At 1.90v, the large FFT test errors out after about 30 minutes.  The blended FFT test errors out after about 4 hours.
    At 1.95v, the large FFT test errors out after about 3 hours.  The blended FFT test errors out after about 6 hours.
    At 2.00v and above, the PC starts behaving erratically.  It will freeze up sometimes, requiring me to pull the power and restart.  And sometimes it will just power down.
    Should I just trust the first two tests and ignore these results?  I guess I'm not sure what to do at this point.  I accomplished my goal of reducing temps (almost 8C degrees) but I want to make sure the thing is stable.
    Also, I asked this question a while back and got conflicting answers - is it better to have a high memory frequency or tight timings?  Because my cheap RAM really can't do both.

    Here is what I meant by Fine Tuning or finding that Sweet Spot.
    There are 36 different CPU Speeds and I figured 152 possible memory speeds based on figures I have seen here.
    200 to 250MHZ FSB (10 MHZ steps I am guessing)
    10 to 15 Multiplier
    7-13 divider
    308MHZ to 1071MHZ memory speeds
    Many more are possible, not just the 4 that CPU-Z shows for your current settings.
    Here are a few you might try with the 4-4-4-10 timings.
    FSB   Multi   CPU MHz   Divider   DRAM Freq.
    220   13   2860   8   715
    230   14   3220   9   716
    240   15   3600   10   720
    240   12   2880   8   720
    250   13   3250   9   722
    220   15   3300   9   733
    210   14   2940   8   735
    240   14   3360   9   747
    230   13   2990   8   748
    250   15   3750   10   750
    250   12   3000   8   750
    200   15   3000   8   750
    230   15   3450   9   767
    220   14   3080   8   770
    250   14   3500   9   778
    240   13   3120   8   780
    210   15   3150   8   788
    240   15   3600   9   800
    200   14   2800   7   800
    230   14   3220   8   805
    250   13   3250   8   813
    220   13   2860   7   817
    240   12   2880   7   823
    220   15   3300   8   825
    250   15   3750   9   833
    240   14   3360   8   840
    210   14   2940   7   840
    230   13   2990   7   854
    Some have slower CPU clock speeds than you are at now but might benchmark better at 4-4-4-10. Some have higher CPU speeds and might fail, but I listed them in case you wanted to give them a shot.  After all, you might have hit the memory wall at 890MHZ and not the CPU wall at 3.12GHZ.
    Set memory volts to max allowed by manufacturer and have fun. You can lower them later in steps to fine tune.
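The rows in the table above follow from two formulas. Assuming the listed DRAM figures are effective (doubled) DDR data rates, which matches the numbers, a sketch that reproduces them:

```python
# Reproduce rows of the table above. Doubling the memory clock to
# get the effective DDR data rate is an assumption made to match
# the listed figures.
def row(fsb, multiplier, divider):
    cpu_mhz = fsb * multiplier           # CPU clock = FSB x multiplier
    dram = round(2 * cpu_mhz / divider)  # effective DDR rate = 2 x CPU / divider
    return cpu_mhz, dram

print(row(220, 13, 8))   # (2860, 715)
print(row(240, 15, 10))  # (3600, 720)
```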

  • Hardware config and componenets : raid, gpu and monitor choices

    Right,
    I've almost finalised my decisions about a new build I'm putting together to use the CS4 Design Premium programmes (mainly Photoshop, InDesign, Illustrator, Fireworks and Flash) and, in the near future, 3d Max and possibly a video editing programme.
    There are some things I'm stuck with for now; Motherboard and cpu, and some things I have a few questions about: raid 0,  gpu and monitor choices.  I appreciate this is a long post - but any feed-back would be appreciated.
    MOTHERBOARD AND CPU
    The motherboard and cpu were originally intended for a gaming pc:
    MSI DKA790GX Platinum Motherboard
    AMD phenom x4 9950 black edition
    This motherboard and cpu are far better than any others I've had at my disposal when it has come to working with Adobe programmes (CS2).  So although
    - I will only be able to go up to 8gb of DDR2 800MHz RAM (apparently the motherboard becomes unstable with 8gb of 1066 DDR2)
    - I  do not have a motherboard that takes 2  6+ core cpus
    From what I've seen of reviews on these components, at the moment,  I'm quite happy with what I've got for both cs4 and 3d max use.
    Besides, in the last few weeks I've learnt there are other things I can do to get maximum leverage out of a build comprising  these components.
    HDD CHOICE FOR RAID 0
    I've read Harms generic guideline for disk setup here
    http://forums.adobe.com/thread/662972?start=0&tstart=0
    I gather that this guide is principally for optimum running when video editing. But I also gather that the performance enhancement of raid 0 compared to no raid will be beneficial for both  photoshop and 3d Max.  4 discs with 2 in raid is what I was thinking of going for.
    Discs I have or am thinking of buying:
    1.  I have 1 160gb WD ide 8mb cache hdd WD1600AAJB
    2.  I have 1 500gb WD Caviar blue SATA 16 mb cache hdd (WD500AAKS)
    3.  For the raid I'm thinking of buying 2x 1tb Samsung SpinPoint F3 7200rpm 32mb cache drives for £45.74 each
    These are obviously highly rated as raid drives in these forums, and I can't find anything smaller than a 1tb version of these drives at the shop I plan to order from – or is 1tb all they come in?
    1st disc for OS and programmes
    When going to the trouble of setting up raid is it acceptable to use an ide disc for the OS and programmes? I ask this because the 160gb WD hdd mentioned above has hardly been used....
    If the ide drive is not approved of  I'm thinking of buying either
    1.  A 500gb Seagate Barracuda 7200.12 16mb cache sata 2 drive for £29.73 or
    2.  A 500gb Hitachi Deskstar 7K1000.C 7200rpm 16mb cache drive for £29.30 or
    3.  A 160gb Samsung SpinPoint 7200rpm 8mb cache drive for £24.96
    I know the Seagate and Hitachi  are huge drives for the principal purpose of storing the OS and programmes but they both have 16mb cache and are only a few pounds more than the 160gb Samsung drive with only 8mb cache
    Or
    considering I'm going for F3 spinpoint for the raid discs would it be better, in terms of compatibility between the drives, to go for a spinpoint 8mb cache for the OS and programme discs
    What should be my biggest concern here – compatibility between the discs or a higher cache on the disc  - or considering they are all the same speed (7200rpm) does the cache not matter that much?
    2nd disc for page file, media cache
    I was thinking of dedicating the 500gb WD I have for these purposes.  I've read in this forum that the WD drives are not that reliable when it comes to raid arrays – apart from that they seem to have a good reputation.
    3-4th discs for raid 0
    My mind is pretty much made up on the Samsung SpinPoint F3's
    Partitions
    Because there are a huge number of gb's in this setup, I'm thinking of partitioning the 1st and 2nd discs so that, in addition to using them for 'the operating system and programme files' and 'page file and media cache', their second partitions can be used to back up the data stored in the raid array.
    Is it good practice to partition dedicated discs in this way?
    THE GPU
    Due to its relative cheapness and widespread use and endorsement among gamers and workstation users alike, at first I was swayed towards the 480GTX and using quadro driver hacks.  But I couldn't get the number of 'risk factor' forum threads about making a GTX think it's a quadro – particularly when it comes to using 3d max – out of my head.  Even without the hacks there is a high percentage of grisly stories out there concerning the 480s and 3d max compared to workstation cards and 3d max.  So I've decided that because I have the budget to get a workstation card (an ATI V7800) for 'just' £134 more than a GTX480 (£424 compared to £293), then for the peace of mind of more stable drivers, when getting to grips with a complex new programme, I might as well go for it and pay that bit extra.
    I've had a good look at both the ATI V7800 and the PNY QUADRO 4000... the differences between Stream and CUDA and how these cards perform when rendering in 3d programmes... I started veering towards the V7800 when I realised that, when it came to rendering, it was being compared very favourably with the quadro FX5800, let alone the much cheaper quadro 4000... Plus, with the disc setup I am now thinking of, I can buy the V7800 plus all the discs talked about above for £10 less than I can find the quadro 4000 for alone.
    The only thing I'm concerned about is the V7800's compatibility with the cs4 design premium suite.
    This page indicates that although some of these cards were released after cs4 they are supported by it.
    http://kb2.adobe.com/cps/405/kb405445.html
    Would it be wrong of me to suppose that, because the new V7800 is a work station graphics card, it will  be supported by cs4?
    What difference will it make if it isn't supported?
    Having said that the quadro 4000 isn't listed as supported on this page either....
    What would be your choice between the quadro 4000 and ATI V7800?
    THE MONITOR
    The monitor I'm looking at is the 22” Dell 2209WA eIPS... Although older (2008 vs 2009), the slightly larger 24” Dell 2408WFP has also caught my eye due to its higher contrast ratio (1300:1 vs 1000:1), and this increased contrast ratio means it is capable of displaying 110 percent of the NTSC gamut as opposed to 86% with the 2209WA.
    I'm familiar with the monitor on the 2006 Dell XPS M1710 laptop and am very impressed with its colour fidelity.  The 2408WFP does require more calibration tweaking, by the sound of it...
    Anyone have any experience of either of these monitors?
    I'm off to the gym for an hour or so - I'll be back in a couple of hours

    My strong advice:  Don't skimp on price on hard drives.
    Get high MTBF RAID edition drives (e.g., Western Digital RE-series drives).
    http://www.wdc.com/en/products/products.aspx?id=30
    In my main workstation I currently have two 1 TB WD RE3 drives that have been spinning in a RAID 0 configuration 24/7 with ZERO problems for a year now.
    I figure one wants to minimize the chance of losing one's data down the road.  How much would it cost you in time, effort, and lost work to recover your data from a backup if you should lose your hard drive?
    Top-end drives can fail, sure, but the MTBF numbers say that the chances are much smaller than with "cheap" drives.  And the RAID edition drives have additional features, such as vibration reduction, that differentiate them from lesser drives.
    Ask yourself, why would one drive be cheaper than another?  Poorer quality materials?  Not as much quality control?
    Another bit of info:  Search this forum for "quadro".  It seems that these expensive video cards may be seeing more trouble with Photoshop than you might expect.  Some of the less expensive cards, that may have had more OpenGL testing (e.g., to support gaming) might be better choices for use with Photoshop.  Personally I prefer ATI.
    -Noel
