PCI expansion slots with LV and NI hardware

I was wondering if there were any hardware or software limitations
with the use of a PCI expansion unit with NI boards and LabVIEW.

pincpanter wrote:
Hello LabVIEWers,
I tried to install the old NI-DAQmx 7.5 to drive an NI 6601 PCI card under Microsoft's Virtual XP (the host is Win7 Pro).
The driver installation failed somewhere (the error message didn't specify exactly what went wrong).
Also, the card itself was not seen by the virtualized OS.
Documentation available is somewhat limited and confusing, so I prefer to ask to actual users.
Are you running multiple LabVIEW versions on different VMs? Which ones? Have you run into trouble?
Are the virtualized OSes really able to recognize the hardware? Is it possible to install different versions of the DAQ drivers on each?
Thank you in advance for your help
The PCI bus is not exposed to the MS XP virtual machine, so you can't get there on that VM. That said, while not officially supported, other XP VM emulators do exist; a quick search on the forums should yield results.
Jeff

Similar Messages

  • Sun FastEthernet PCI with MII and x86 Hardware

    Hello
    I have two Sun FastEthernet PCI cards, each with one RJ45 port and a port labeled "MII". Would I be able to use these cards in a PC (Dell Precision 7xx) or x86 hardware in general?
    Thanks
    Matt

    Find the barcode paper sticker on the NIC and read the numbers. Report back here with what you find. That will be the actual part number of the product, in a format of something like
501-XXXXyyyy
    We would be interested in only the first seven digits.
    I vaguely remember that MS Operating Systems (at least from Win9x through WinXP) should recognize it natively from its chipset manufacturer. You wouldn't damage anything by just installing it to a PCI slot and booting your system. Your mileage may vary.
    I'm guessing you may find it to be this *<--link*
    (Ignore the `notes` on that web page; the embedded URLs are broken as well. It's not a Sun Microsystems web site.)
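    If it helps, the extraction the reply describes is easy to script. A quick sketch (the sticker format is inferred from the 501-XXXXyyyy description above; real stickers may differ):

```python
import re

def sun_part_number(sticker_text):
    """Pull the seven-digit Sun part number (501-XXXX) out of a barcode
    sticker string. The trailing yyyy portion is a dash/revision level
    we ignore. Returns None if nothing matching is found."""
    m = re.search(r"\b(501-\d{4})", sticker_text)
    return m.group(1) if m else None

# e.g. sun_part_number("S/N 501-5524-07 rev 50") -> "501-5524"
```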

  • Can I 'rescan' the PCI expansion slots?

    Howdy...
    I have a PCIe ExpressCard reader in my Mac Pro to read cards from my Sony XDCAM. It works very well if I boot with the card in the slot, and I can also 'eject' the card when I'm done, but I can't insert a card while the computer is already running...
    Is there any way (script? Terminal command?) that I can get the Mac to rescan the PCIe bus and re-recognize my card? In Windows I can 'scan for new devices', which works, but I can't figure out how to do something similar in OS X...
    Any suggestions would be appreciated...
    -Ben

    Howdy...
    An ExpressCard slot is the same sort of slot that the MacBook Pro has; it reads ExpressCards. The Mac Pro does not have this slot, but I have a PCIe card installed in my Mac Pro that enables it to read ExpressCards.
    It works great, and as long as I have a card in the slot when I boot, everything is good. I can then eject the card and everything is still fine. It's just when I reinstall the card (without a reboot) that it is not recognized. Unlike the MBP, where I can insert and remove the card with no issues at all.
    I'd like to get that same functionality on the Mac Pro, but I think it requires rescanning the PCIe bus again...
    Any thoughts?
    -Ben

  • IMAQdx with JAI and a hardware trigger

    Hello,
    We are working with two 'JAI AD-080' cameras and IMAQdx, and have two problems regarding the triggering and frame grabbing:
    1)  We are unable to change the trigger source through IMAQdx property node or the Vision Acquisition Express; Vision Acquisition block.
    2)  When we manually edit the trigger source property using the NI Measurement and Automation Explorer (MAX) to its correct value, we can't get all four of the CCD's to run at a time without resulting in bad packets, e.g. horizontal black lines across the images.
    Our goal is to obtain the images from the 4 CCD's at a rate of 5 Hz using our hardware trigger.  We can already connect and obtain all four images at full speed, but the 5 Hz trigger is not being used by the cameras in that case.
    Details of the setup:
    NI 2011, Windows 7 
    Two (2) JAI AD-080 cameras (with 2 CCD's each), GigE cameras connected over Ethernet
    Hardware triggering at 5 Hz, on pin:  'Line 7 - TTL In 1'
    Details of the problem:
    (1)  Setting the trigger source not possible in Vision Express or IMAQdx property node
    In order to use our hardware trigger, we have to set the camera property 'CameraAttributes::AcquisitionControl::TriggerSource' to a specific pin (Line 7 - TTL In 1).  This property is available in MAX, but is not usable in the Vision Express block.  The property is present, but the values are invalid.  Here is what I think is happening: the list of properties is read from the camera, but LabVIEW does not know what the valid values are for that property, so it populates the value drop-down menu with whatever was loaded last.  This can be seen in figures 1 and 2, where the values in the drop-down menu change.
    Similarly, this 'Trigger Source' property cannot be changed programmatically using the IMAQdx property node shown here: http://digital.ni.com/public.nsf/allkb/E50864BB41B54D1E8625730100535E88
    I have tried all numeric values from 0 to 255, and most give me a value out of range error, but the ones that do work result in no change to the camera.
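    For what it's worth, the drop-down behavior described above matches how an enumeration-type camera attribute behaves when the valid entry list is never queried from the camera. A toy model in Python (the attribute name is from this post; the entry indices and names are made up for illustration, not the camera's actual GenICam tree):

```python
class EnumAttribute:
    """Illustrative model of an enum camera attribute: the driver only
    accepts index values that appear in the camera-reported entry list,
    rejecting everything else as 'value out of range'."""

    def __init__(self, name, entries):
        self.name = name
        self.entries = entries  # index -> symbolic entry name
        self.value = None

    def set_by_index(self, index):
        if index not in self.entries:
            raise ValueError("value out of range")  # what IMAQdx reports
        self.value = index

    def set_by_name(self, entry_name):
        for index, entry in self.entries.items():
            if entry == entry_name:
                self.value = index
                return
        raise ValueError("unknown entry: " + entry_name)

# Hypothetical entry list; the real camera defines its own indices.
trigger_source = EnumAttribute(
    "CameraAttributes::AcquisitionControl::TriggerSource",
    {0: "Line 4 - Opt In 1", 7: "Line 7 - TTL In 1"},
)
trigger_source.set_by_name("Line 7 - TTL In 1")  # sets value to index 7
```

    Under this model, writing raw numbers 0-255 mostly fails (they are not in the entry list), which is consistent with the out-of-range errors above.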
    (2)  Lost packets in image during triggering
    If I set the 'Trigger Source' property in MAX to the correct pin, save the configuration, and then use the Vision Acquisition Express block to connect to the camera, the triggering works properly (the hardware trigger is used and the images are received by LabVIEW at 5 Hz).  However, this only works for one CCD: if I use the same code for all four CCD's at the same time, I get black bars on the images, and at least one of the CCD's results in an error in 'IMAQdx Get Image.vi'  (codes -1074360308, -1074360316)
    I tested this by using the configuration attributes created by the Vision Express Block, (The string used in the 'IMAQdx Read Attributes From String.vi'),  in the code we have been developing as well as a very simplified version which I have attached.  Those configuration attributes are saved in the text files:  JAI_Config_TrigON.txt and JAI_Config_TrigOFF.txt for using triggering or not respectively.  
    So my final questions are:
    Is there a problem with the IMAQdx because it doesn't recognize the trigger source value?
    Do you have any suggestions for why there are bad packets and trouble connecting to the cameras when I load them with the trigger on attributes?
    Thank you for your time - 
    Attachments:
    Fig1_VisionAcq.png ‏387 KB
    Fig2_VisionAcq.png ‏442 KB
    Fig3_BadPackets.png ‏501 KB

    Hello,
    Thank you for your response; especially the speed in which you responded and the level of detail.  
    I have not fully solved the problem in LabVIEW yet, but I was able to remove the black lines and apparitions from the images using different camera parameters.  
    Since this was a significant victory I wanted to update:
    1)  Version of IMAQdx?
    I have IMAQdx 4.0, but the problem persists.
    2)  Setting configuration files
    Your suggestion to pay attention to the order in which the properties are set as well as using the MAX settings is very helpful.  I have not explored this feature fully, but I was able to successfully use MAX to set the default settings and then open the cameras programmatically without loading a new configuration file.  
    3)  Bandwidth limitations
    I modified the CCD's to only use 250 Mbits/second, but the lost packets (or missing lines/ apparitions) were still present.  
    4)  JAI AD-080GE Specifics
    I am using the JAI AD-080GE; and there are two settings for this camera that I want to mention:  
    JAI Acquisition Control>> Exposure Mode (JAI)>>Edge pre-select
    JAI Acquisition Control>> Exposure Mode (JAI)>>Delayed readout EPS trigger
    The "Edge pre-select" mode uses an external trigger to initiate the capture, and then the video signal is read out when the image is done being exposed.
    The "Delayed readout EPS trigger" can delay the transmission of a captured image in relation to the frame start.  It is recommended by JAI to prevent network congestion if there are several cameras triggered simultaneously on the same GigE interface.  The frame starts when the 'trigger 0' is pulsed, then stored on the camera, then is transmitted on 'trigger 1'.  
    The default selection is the "Delayed readout EPS trigger", however, I do not know how to set the 'trigger 1' properly yet and I only have one connection available on my embedded board that is handling the triggering right now (I don't know if 'trigger 1' needs to be on a separate line or not).  Incidentally, the system does not work on this setting and gives me the black lines (aka lost packets/ apparitions).
    I was able to remove the black lines and apparitions using the "Edge pre-select" option on all 4 images with a 5 Hz simultaneous trigger.  I confirmed this using the "JAI Control Tool" that ships with the cameras.  I am unable to make this happen in MAX, though, as the trigger mode is automatically switched to 'off' if I use the mode:  JAI Acquisition Control>> Exposure Mode (JAI)>>Edge pre-select
    i.e. when manually switching the trigger mode to 'on' in MAX, the "JAI Acquisition Control>> Exposure Mode (JAI)>>Delayed readout EPS trigger" option is forced by MAX.  The reverse is also forced, so that if EPS mode is chosen, "Trigger Mode Off" is forced.
    Additionally, there is a setting called:
    Image Format Control>>Sync Mode>>Sync     &     Image Format Control>>Sync Mode>>Async
    When the "Sync" option is chosen the NIR CCD uses the trigger of the VIS CCD.  In addition to using the "Edge pre-select" option, the "Sync" option improves the triggering results significantly.  
    5)  Future troubleshooting
    Since I cannot set the camera parameters manually in MAX (due to MAX forcing different combinations of parameters in 4), I am going to explore manually editing the configuration file and loading those parameters at startup.  This can be tricky since a bad combination will stall the camera, but I can verify the settings in JAI Control Tool first.  There is also an SDK that is shipped with the cameras, so I may be able to use those commands.  I haven't learned C/C++ yet, but I have teammates who have.
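    A toy timeline of the two exposure modes discussed above, as I understand JAI's description (my own simplification, not vendor code; the times and exposure length are arbitrary):

```python
def frame_events(mode, trigger0_t, trigger1_t=None, exposure_s=0.02):
    """Return (exposure_start, readout_start) for one frame.
    'eps': edge pre-select - readout follows exposure immediately.
    'delayed': delayed readout EPS - the frame is stored on the camera
    and only transmitted on trigger 1, which lets you stagger readout
    across cameras sharing one GigE link to avoid congestion."""
    if mode == "eps":
        return (trigger0_t, trigger0_t + exposure_s)
    if mode == "delayed":
        if trigger1_t is None:
            raise ValueError("delayed readout needs a trigger-1 time")
        return (trigger0_t, max(trigger1_t, trigger0_t + exposure_s))
    raise ValueError("unknown mode: " + mode)

# Two cameras triggered together at t=0, but read out at different
# times via separate trigger-1 pulses, so their packets don't collide:
cam_a = frame_events("delayed", 0.0, trigger1_t=0.05)
cam_b = frame_events("delayed", 0.0, trigger1_t=0.10)
```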

  • MOVED: MSI 785GT-E63 problems with temps and temp hardware

    This topic has been moved to Off-Topic Technical.
    https://forum-en.msi.com/index.php?topic=140927.0

    I also use CPUID HWMonitor to check temps and volts; the 12 V rail is at 12.05 during load. The GPU load temps on the Lightning never exceed 72°C at stock and 76°C with the OC. Should I get a new mobo? A compatibility issue? I was looking at an ASRock 990FX Extreme3, as this would allow me to upgrade my CPU later if I choose. I am on my current main PC because Bulldozer didn't grab me. Some benches are less than the X6's, my two 6970s perform better than a single 7970 for my needs, and the only increase I would see is if I got two 7970s; that's way too much coin and would be bottlenecked by my current CPU, I'm sure. Any ideas? I'm at a loss.

  • PCI expansion slots and P2 (PCMCIA) and SxS (ExpressCard) reading.

    I have a Mac Pro, about 1.5 years old, that I do some video editing on (Final Cut Pro).
    Excuse my ignorance, but what are the PCI expansion slots for?
    Do they make card readers for P2 and SxS cards that fit into
    these slots (as opposed to using USB/FireWire connections)?

    Hi indore,
    Yes, DV is a good starting point, and P2 is great. There are some small issues to get to know about all the formats out there these days; use what you're comfortable with, or use what your clients will pay for. That's a better rule to use.
    I posted my results to date at-
    http://www.dvxuser.com/V6/showthread.php?t=144502
    More testing to make sure all is good is still under way; it may be a few weeks yet. I'm getting some colleagues to test on their Mac Pros as well. So far all is good.
    Cheers
    Tom K

  • Mac pro and motu PCIe audio card with a 2408mkIII

    mac pro and motu PCIe
    I recently upgraded to the Mac Pro from a G5, and now my MOTU 2408 mkIII sound card does not work.
    I installed the MOTU PCIe-424 card from MOTU and the latest UB drivers, and followed the instructions to the letter, repeating numerous times in every PCIe slot with only temporary luck. It worked briefly, and then when the computer slept I lost the connection with the interface and have not been able to regain it. Yes, I re-installed to the letter again. The PCIe card is seen by the computer, but it cannot see the external rack-mounted 2408 via the AudioWire cable that connects it. Yes, I tried other cables. The unit worked fine on my G5.
    I have heard of others with the same problem after updating the firmware to SMC 1.7. My Mac Pro came with the latest firmware.
    Message was edited by: tara ntula

    A user on another forum wrote this on 8 August 2007 regarding their Mac Pro:
    To my surprise I had five software updates waiting for me. Among them Pro App update, two of them, and this SMC update. After I updated my MOTU PCI-424 audio interface no longer works. Tried installing MOTU drivers again, still nothing. It can't communicate with its audio interfaces. So right now, this amazing update from Apple stopped me from working, and I am on a pretty tight deadline. I hope I'll find a solution ASAP

  • Faulty PCI-E slot on MSI K8N Mainboard

    Hi
    My MSI K8N SLI-F board seems to now have a 'temperamental' primary PCI-E slot.  I have a 9800GT card that was working fine for months, and then I got no video on reboot after trying to get dual displays up and running.  After much troubleshooting and RMA'ing my 9800GT card, I think it must be the slot on the mainboard itself.  My 6800GS, which has a light fan on it compared to the massive Glaciator on my 9800GT, is working fine, but whenever I put the 9800GT (original or replacement) in, I get no video on boot, though I can hear the computer and HD making all their normal noises and know it's getting to the Windows login screen.  The 9800GT fan spins up then turns off, or sometimes stays spinning.  Both the original and replacement 9800GT cards worked fine in another machine.  After cleaning out the slot and screwing in the card nice and hard it went again and ran all day, but after shutting down and leaving it overnight, I got no video on boot again.  I have repeated the reseating steps and have it going again.  I can only presume the slot itself is now loose, or the solder on it is failing, or something.  When the card isn't in just right and I get no video, I also get both optical drives ticking / seeking like they're looking for a disc about 10 times; then the computer kind of internally restarts and it goes again, still with no video.  When I have the card screwed in just right, like now, it doesn't happen.
    I examined the PCI-e slot closely again and saw these 3 solder points which don't look right. Could this be what's causing the problem?

    Quote from: alzaeem on 17-June-07, 02:52:41
    Is it OK if I connect my single GPU (6800GT) to the second PCI-e x16 slot and leave the primary slot empty? If yes, then do I have to change the position of the switch between the two slots so as to enable the second slot?
    Why would you want to do that? It should work, but the recommended setting is to use the primary slot first (the one closest to the CPU).

  • X-Fi XtremeMusic - 100Mhz or 133Mhz PCI-X support with nForce Pro chips

    I have a Tyan k8we motherboard which makes use of a nForce4 chipset and I know there are issue with it and the X-Fi XtremeMusic cards.
    However, my question is this: will the X-Fi work in 100 & 133 MHz PCI-X slots? The reason I ask is that the Tyan K8WE has two x16 PCI-e slots with a standard 32-bit PCI slot in between them, which makes it impossible to install anything in that 32-bit slot if you run in SLI mode.
    I only have the 100 and 133 MHz PCI-X slots left that I can use on my motherboard. Neither of these seems to work properly with the X-Fi XtremeMusic card (when I plug in the X-Fi, my onboard network cards stop working).
    So I need to know what my options are if this X-Fi card will not work.
    - Is there something I missed during the installation?
    - Do I obtain a firmware or an update for the X-Fi card?
    - Swap the card for one that does work?
    - Return the card and get the Pro version (which I have heard works for some strange reason)?
    - Or return the card and get my money back?
    This card is only a day old so I would like to act on the store's return policy if CL Support cannot help me anytime soon.
    Thanks very much
    Message Edited by ghoulish on 0-26-2005 0:32 AM
    Message Edited by ghoulish on 0-26-2005 04:42 PM

    I think vanilla PCI is 33 MHz; 66 MHz is a PCI-X rate. Anyway, the first step is to talk to the motherboard manufacturer and make sure you are doing everything necessary to allow a 32-bit 33 MHz PCI device to work in one of the PCI-X slots. If that doesn't work, the next step is to talk to Creative. I doubt anyone here has any experience with this.
    Also, I KNOW that the PCI slot is stuck between the two PCIe x16 slots. All I'm saying is that something is not right (i.e. Tyan screwed up) if you can't use all 3 slots at once. I don't have any experience with SLI. Is there a rigid connector between the 2 cards that doesn't fit if there is a 3rd card in between? If so, I'm surprised that a dual-slot solution where the 2 slots aren't adjacent even conforms to the required spec. Is the problem that your heatsinks are too big? Exactly WHY does it become unusable, and how could Tyan possibly think that such a restriction is reasonable?
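    For reference, the peak bandwidths behind those clock rates work out as width/8 bytes per clock (theoretical maxima; real throughput is lower due to protocol overhead):

```python
def bus_bandwidth_mb_s(width_bits, clock_mhz):
    """Peak bandwidth of a parallel PCI-family bus in MB/s:
    one transfer per clock, width/8 bytes per transfer."""
    return width_bits / 8 * clock_mhz

pci_32_33 = bus_bandwidth_mb_s(32, 33)     # conventional PCI
pcix_64_133 = bus_bandwidth_mb_s(64, 133)  # PCI-X at 133 MHz
```

    So a 32-bit card at 33 MHz peaks at roughly 132 MB/s, while a 64-bit PCI-X slot at 133 MHz peaks at roughly 1064 MB/s; the slot runs the slower card at the card's own rate.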

  • Eclipse SLI and booting with VGA on 2nd x16 PCI-Ex slot

    Is it possible? It is a simple operation and (probably) I could find out myself, but I prefer to ask instead of toying with add-on cards just to test a theory.
    I have plans to buy Intel's SRCSASBB8I RAID controller [http://www.intel.com/Products/Server/RAID-controllers/SRCSASBB8I/SRCSASBB8I-overview.htm] or something similar from 3ware/LSI. I want to create some proper hardware RAID 5/50, because software RAID 5 from the X58 is total overkill and completely pointless. The built-in JMB chips are fine, but only for RAID 1 operations. Because the RAID card is PCI-Ex x8 (and the 3ware/LSI ones too), to reduce chaos inside I am planning to put it in the first PCI-Ex x16 slot. Because of that, my initial question stands: will the PC boot if the VGA card is not inserted into the first x16 PCI-Ex slot? Even better, maybe you have tested such a controller inside the company? 
    If yes (it boots), then great; if not, then why not? Do I need some specific BIOS, or is it a hardware (chipset) limitation?
    Thanks in advance.

    OK, I just turned off the PC for 30 (very valuable) minutes instead of bugging support about something so trivial.
    And in short... IT WORKS! Sort of.
    But not everything is so rosy. While the switch to the 2nd x16 PCI-Ex slot produced some surprisingly positive results, it also brought something bad.
    Since most people run with only 1 VGA card, 1 or 2 HDDs and 1 or 2 optical drives, first the good news:
    + The temperature drop on the North Bridge was tremendous: ~10°C during stress (a drop from 58°C to 49°C with an i7 920 clocked at 3.3 GHz and a Thermaltake Extreme Spirit 2 cooler on the NB). Probably because the VGA card had blocked some of the hot air, which is now freely blown away.
    The bad news:
    - The blue SATA ports are only half accessible. Ports 7 & 8 are OK, because they are at the edge of the VGA board, but ports 9 & 10 are a complete no-go. If you have something connected there it will work*, but to disconnect a SATA device from the motherboard you need to remove the VGA card. There is no other way around it. In two words: it sucks if you have a lot of HDDs.
    - The 2nd piece of bad news is that such a combination (VGA on the 2nd slot) works only for less powerful (or older) cards. My 8800 GTS 640 slots in quite well, but bigger cards (more than 26 cm / 10.1") are also a complete no-go. GTX 2xx/4xx/Radeon 5870/5970: forget it. A big card will cover ports 9 & 10, rendering them completely useless.
    * - assuming you are using a VGA card smaller than or equal to 26 cm/10.1"; does not apply to cards with 1-slot cooling
    I hope this helps, and someone else will benefit from it. Meanwhile I need to look for some external RAID 5 HDD enclosure without NAS. Why are all manufacturers so obsessed with it?
    BTW: Jack, my SB (included with the Eclipse) still does not record audio, even though I tested it on a few clean Windows installations (32/64). But in all honesty I completely forgot about it, because the testing was conducted a long time ago. PCI SB rules... just like the Asus Xonar on PCI-Ex. 
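    The clearance rule reported above reduces to a one-line check (the 26 cm threshold is this poster's measurement; treat it as approximate, and note the 1-slot-cooler exception from the footnote):

```python
PORT_9_10_CLEARANCE_CM = 26.0  # length at which a card starts covering SATA ports 9 & 10

def blocks_sata_9_10(card_length_cm, single_slot_cooler=False):
    """True if a graphics card in the 2nd x16 slot would cover SATA
    ports 9 & 10. Per the footnote above, the rule does not apply to
    cards with single-slot cooling."""
    if single_slot_cooler:
        return False
    return card_length_cm > PORT_9_10_CLEARANCE_CM

# An 8800 GTS 640 (~23 cm) fits; a ~27 cm GTX 2xx-class card does not.
```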

  • 5770 and 5870 in Mac Pro 2009 - noisy PCIe expansion bay fan

    I replaced the GeForce GT120 with the ATI Radeon 5770 in my Mac Pro Quad 2.66 Nehalem. I had severe trouble with the speed of the PCI-e expansion bay fan: 2000 RPM and really noisy. After 30-60 minutes it slowed down and became silent, but upon reboot the noise and the high speed were back.
    I have sent the 5770 back to Apple. I am considering trying the 5870, but I am afraid I will be in trouble again.
    With the GT 120, the fan speed is only 800 RPM and the Mac is silent. The ATI Radeon cards require one or two power cables connecting to the logic board.
    Could that be the cause of the problems, or was the 5770 a bad card? Or is it the logic board? I ran the Apple Hardware Test; everything was OK.
    I checked it with Techtool Pro and there were no problems found.
    Any help would be much appreciated. I have zapped the PRAM and reset the SMC without result.
    Kind regards, and thanks for reading.

    If it is any consolation, I have the same machine, installed the Radeon 5770, and have intermittent high expansion bay fan speeds (1400-2400 rpm). I just checked this board today because it seems noisier today than it has been in a while, and the ambient air is listed as 78 degrees F. The card has its own fan and exhaust slots in the back, and the air from them is warm but not what I would consider hot. I had two video cards and an eSATA card installed and never had a problem. Now, with just this card and the eSATA card, the problem exists. I'm betting if I call Apple about it they will tell me it's only listed on their site as compatible with the 2010 Mac Pros, so for now I will live with it.

  • X1900 and expansion slots fan

    Hang in there with this one, it gets kinda long:
    1) Bought X1900 for my dualcore 2.3, and the graphics kick butt!!! No way can I go back to the 6600.
    2) It did make some annoying fan noises on startup, and also when launching games it would spool right up and then go back down. It does come up again but usually only under hard use so you're busy shooting or whatever (CoD, Doom3, Warbirds) and can't hear it much.
    3) This is where it gets weird. I thought I'd try the Arctic Cooling Accelero X2 cooler, supposed to be excellent, and guys have used it on their Mac Pros with good results. Problem is, on the G5 the cooler will interfere with the clear plastic air deflector in the recessed area.
    So I made up a new deflector using Lexan that didn't have the recessed area (i.e., it's flat), installed it, and ran the computer like that for a couple of days. None of the temps changed (I suspect the recess on the stock part is there to force airflow through the 6600's cooling fins; the X1900, having its own fan/cooler, does not require this). This new deflector allows the Arctic X2 cooler to go in with no issues.
    Today I installed the Arctic X2 cooler; temps are the same and the noise is the same. That's not right (look online for all the reviews, including Mac Pros, and the temp drops and noise drops are huge). Then it hit me: the noisy fan is NOT just the 1900, it is the small fan at the front (Hardware Monitor calls it the "Expansion Slots Intake" fan). When you launch a game, it winds up to 3000 rpm and that is causing the noise. So the question is, is there a quieter fan to put in there?
    Also, I'm thinking this fan gets its speed from the slots' power consumption, as they seem to match. The 6600 used to run at 65 degrees C all the time, the stock X1900 ran at 55 all the time, and it is still at 55 with the new cooler, so there is no reason for that fan to spool up like that based on temps. I'm also thinking the graphics-processor temp is a local sensor, not read off the card.
    Thoughts on any of this? Sorry for the long post but it needed explaining. I'd like to quiet that fan down...

    Not familiar with the dual-cores, but on the earlier G5s that fan (known to HM as the "PCI Slot Fan" on this Ancient DP2 G5) is controlled by the slot values, not by a temp sensor.
    Some of the cooling kits for other cards, e.g. the Verax G03 on the 9800proSE, suggest that the replacement fan, presumably one with relatively modest demands, pick up its power from the Molex connector for the optical drive. I don't think the reason was stated, but it was possibly that the increased power demand caused the "PCI Slot Fan/Expansion Slots Intake" fan to rev up and cancel out the benefits of the new, quieter fan.
    This is not much help with a variable-speed fan controlled by a temp sensor on the graphics card, which has to be plugged in to the card.
    What speed/value did the "Expansion Slots Intake" fan run at with the 6600? With the 'standard' X1900?
    It's running at 37% with an X800 (variable-speed card fan - plugged into card) on this Venerable DP2.
    BTW The Service Manual is here
    http://akserver.dyndns.org/macosx/hw/asm/PowerMac/PowerMac-G3-G4-G5/PowerMac-G5/powermac_g5.pdf
    but doesn't really cover this sort of thing.
    Possibly you could contact Arctic for info on installing in a G5.
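    If the fan really does track slot power draw rather than a temperature sensor, its behavior can be sketched as a simple interpolation (an illustrative model only; the idle and maximum speeds are taken from figures mentioned in this thread, and the 75 W ceiling is an assumed slot power limit, not a measured value):

```python
def fan_rpm(slot_power_w, idle_rpm=1000.0, max_rpm=3000.0, max_power_w=75.0):
    """Map PCI-slot power draw to fan speed: linear between idle and
    maximum, clamped outside that range. All constants are guesses
    chosen to reproduce the ~3000 rpm seen under gaming load."""
    fraction = min(max(slot_power_w / max_power_w, 0.0), 1.0)
    return idle_rpm + fraction * (max_rpm - idle_rpm)
```

    Under such a scheme a hotter-running but lower-draw card would indeed leave the fan slow, which matches the observation that temps stayed at 55 °C while the fan still spooled up with the gaming load.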

  • Early model Mac Pro's Expansion Slot Utility not working with 10.5.6 update

    I'm reading that the 10.5.6 update may have crippled the Expansion Slot Utility for early Mac Pros.
    I have a 2.66 Mac Pro, and I just installed a Kona 3 card in slot 3. Already had an ATI 1900xt in slot 1. However, the Expansion Slot Utility program detects only the ATI Card.
    However, in System Preferences the PCI column clearly shows slots 1 & 3 are occupied, with drivers installed for the respective boards... The Kona card works for video preview (further indicating that it's installed correctly and that the Expansion Slot Utility is broken), but I understand that the Kona card is operating at 1x speed, thus crippling it...
    Apple please fix this... Any administrators or someone in the loop care to elaborate please...
    Thanks,
    Lonnie

    So, here's the deal. The Expansion Slot Utility still works for our model of computer; it just doesn't register correctly within the program. This is only a cosmetic flaw. I've confirmed this with AJA and in System Prefs under the PCI column.
    Also, AJA recommends 4-lane, slot 3. There it lies. Furthermore, slot 4 concerns me: too close to the drive bays for proper air circulation, with a propensity for overheating.
    The main issue of not sleeping has been answered by AJA. Their Kona 3 card does not support sleep.
    Guess I'll be turning my computer off instead of sleeping it after hours - no biggee...

  • How to install a PCI card in my HP 110-335t expansion slot?

    The HP 110-335t desktop comes with one expansion slot, per the product spec. Right? How do I open up the box and install a PCI card?

    Hi,
    Review this HP posted information.
    You can use the PCI-E mini-slot for a wireless LAN card.  Contact HP Support or Sales and get a recommendation. You will still need to configure an antenna.

  • PCI Express Expansion Slots

    I noticed that the Mac Pro offers 3 extra expansion slots for PCI Express graphics cards. How is it possible to SLI three graphics cards with no nForce MCP? Are the other slots even usable, and how can you combine four graphics cards with no CrossFire or SLI?

    SLI and Crossfire are not supported at this time.
    The possibility to run more monitors is there with the extra slots, but not to combine video cards for faster frame rates on one monitor.
    It's not Apple's fault; supposedly it's the limited capabilities of the Xeon chipset: not all slots can be 16x. A pair can be 8x though.
    I lost the link where Ars Tech explains it.
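    The chipset limitation mentioned above is essentially a lane-budget problem: the widths of all populated slots must fit within the total lanes the chipset provides. A minimal sketch (the 40-lane budget is a made-up example, not the actual Xeon chipset figure):

```python
def fits_lane_budget(slot_widths, lane_budget=40):
    """True if the requested slot widths (e.g. [16, 8, 8]) fit within
    the chipset's total PCIe lane budget."""
    return sum(slot_widths) <= lane_budget

# One x16 plus a pair of x8 slots fits; four full x16 slots would need
# 64 lanes, which is why not every slot can run at 16x.
```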
