NVidia card for SDI output?

What nVidia cards are "compatible" for SDI output?  I am curious, as I have a 4000...
Thanks,

For people who have already invested in a DeckLink Studio card, would it be worthwhile to wait a little while instead of additionally investing in a Quadro SDI? Or is support for other output cards way down on the roadmap?
Regards,
Andreas

Similar Messages

  • Second Graphics Card for HDTV Output + HDV Live Preview

    Hi guys,
    I have a few questions about hardware and Premiere Pro CS5. I am using Windows 7 x64 with a Dual DVI nVidia GeForce 7950GT which is outputting to two DVI LCD monitors. I want to install a second graphics card so I can look at the main preview window in Premiere on my HD LCD TV and would like to know what the best plan of attack is. Also, I want to be able to preview my HDV captures on my HDTV as relying on the 2.5" LCD on my Sony HVR is not eye-friendly!
    I am thinking about either selling the 7950GT and buying a GeForce 470GTX, which has two DVI and one mini-HDMI, or keeping the 7950 and buying a cheapish second graphics card to use just for the HDTV. What do you guys think is the best option?
    This is the graphics card I am thinking of getting ( http://www.asus.com/product.aspx?P_ID=hs87bcOLPd9irXRk&templete=2 ), an ASUS ENGTX470 / 2DI / 1280 MD5. My server can just handle its power requirements, but I am happy to upgrade the power supply if this card will output to 2 DVI monitors and also to an HD TV with a mini-HDMI to HDMI adapter. An Antec 1000w power supply is only about $225!
    Or, I can take the cheap option and get an NVIDIA Quadro FX 580 by Leadtek ( http://www.leadtek.com/eng/workstation_graphics/overview.asp?lineid=2&pronameid=491&check=f ), which is a cheapy with 2 HDMI and 1 DVI. I guess there are other PCI-X graphics cards out there to choose from which are probably fine, but it would be awesome to know what some of you guys are using.
    Surely there are others out there that already have a dual monitor setup with a third LCD TV for previewing?
    Any help would be much appreciated!

    @Harm: Thanks for that, I thought a card with three outputs would be too good to be true. I think I will buy a much cheaper second PCI-X graphics card and just use its HDMI output for the HDTV. We will be doing a complete overhaul of this machine sometime in the future, and I will buy an awesome graphics card with a power supply to boot then.
    @J-MS: Currently we are moving away from a Matrox Axio HD, as their support for Windows 7 and CS5 has been lackluster, and I don't reward companies that provide little or no support with future business. I would rather be free of a third party's proprietary codecs, plugins and decoders.
    @Thrill: Thanks for the link to that post. I read through it, and it looks like I need to get a second card of the same make as the existing card (i.e. nVidia), which is what Harm was saying. When I capture from the tape deck, the capture preview window in Premiere goes blue and says in 5 different languages that the preview is going to show on the input device, or words to that effect. I am therefore looking at a 2.5" LCD for capturing! Fail. I don't know why that is happening now, to be honest; maybe it was because I was in a Matrox HD sequence before and the input device wasn't set correctly? I will double-check it again and see what the deal is.
    I am very tempted to just get an el-cheapo nVidia 8400GS Turbo Cache 512M DDR3 DVI HDMI which is only $39!!!! It has HDMI and is small and has a very low power drain on the system. If it outputs to the HDTV then that's all I need, if it doesn't, I sell it on ebay for $25 and lose $14, big deal.

  • NVIDIA CARDS for MacPro

    I just switched from apple editing to adobe editing.
    What nVidia Cards are compatible with a 07-08 MacPro 3.0 quadcore?
    I just burned myself ordering a GTX570.
    1,1 profile / macpro 2x 3ghz dual core / running 10.6.8

    Lion has drivers for the 500 series, based on what is seen on Netkas and MacRumors. It should work with the 570 as well, though it may need a small modification to the driver plist. Get ATY_Init, put a 7300GT in slot 4 with the 570 in slot 2, and you may have success.

  • Video card for HDMI output?

    Hey guys,
    We've got an old PowerPC G5 in the office that we'd like to set up for viewing HD footage on an HDTV. I know that there are DVI to HDMI adapters out there, and that's ideally the route we'd like to go. Unfortunately, we also need dual monitor support on this machine. Are there any hardware options available on this older machine that will give us what we need? i.e., can a machine support two video cards?
    Specs
    Dual 2GHz PowerPC G5
    14GB DDR2 SDRAM
    OSX 10.5.8
    Currently installed card: GeForce 6600LE
    Thanks so much for your help guys!

    Hi Joe, and a warm welcome to the forums!
    All G5s support Dual Monitors on one card...
    http://www.everymac.com/systems/apple/powermac_g5/faq/powermac-g5-adc-ports-dvi-ports-resolutions-supported.html
    But if you want another card, See japamacs page here on the best AGP cards for G4s & G5s...
    http://www.jcsenterprises.com/Japamacs_Page/Blog/4B4B7BA2-7ABB-47F1-87AC-B03D37942BEE.html
    G5 PCIe options are listed here:
    http://www.jcsenterprises.com/Japamacs_Page/Blog/71BBF3EF-9713-4C53-8B80-26771F8A4087.html

  • Two nvidia cards for triplehead setup

    hello everyone^^
    I posted in the German Arch forum but nobody was able to help. I already have an AGP nVidia 7600GS installed. Additionally, I want to install a PCI nVidia FX5200. Both cards use different driver packages. Are there any known problems using both packages at the same time? Has anybody already dealt with that problem and is able to help? By the way, I already had both cards working together in Kubuntu 7.1 with triple monitor. I am quite new to Arch, therefore any help is really appreciated.
    At the moment I need a working PC, therefore I have to be very careful. The second card is not in xorg.conf and not in nvidia-settings. What should I do first?
    thank you
    christian
    ps: sorry for my english

    Hi rokziiron,
    As long as the card that you have is compatible with the motherboard, expansion slots, and processor of your new computer, you should have no issue with compatibility. I have included the 'HP ENVY Phoenix 810-170st Product Specifications and Configurable Options'.
    To make sure that you are using the correct power, please check that the power requirement of the other graphics card does not put you over the power output wattage. This information is also included in the specifications.
    If you do require further assistance with this, please provide me with the make of the video card that you want to install, and your old computer's model number. I have included the document 'Guide to finding your product number'.
    Thank you.
    I worked on behalf of HP

  • How to structure the DMA buffer for the PXIe 6341 DAQ card for analog output with different frequencies on each channel

    I'm using the MHDDK for analog out/in with the PXIe 6341 DAQ card.
    The examples, e.g. aoex5, show a single timer (outTimerHelper::loadUI method), but they load DMA data with the same vector size for every channel.
    There is a comment in the outTimerHelper::programUpdateCount call which implies that different buffer sizes per channel can be used.
       (the comment is: Switching between different buffer sizes will not be used)
    Does anyone know what the format of the DMA buffer should be for data for multiple channels with different frequencies ?
    For example, say we want ao0 with a 1 kHz sine wave and ao1 with a 1.5 kHz sine wave. What does the DMA buffer look like?
    With the same frequency for each channel, the data is interleaved, e.g. (ao0#0, ao1#0; ao0#1, ao1#1, ...), but when the frequency for each channel is different, what does the buffer look like?

    Hello Kenstern,
    The data is always interleaved because each card only has a single timing engine for each subsystem.
    For AO you must specify the number of samples that AO will output. You also specify the number of channels. Because there is only one timing engine for AO, each AO channel will get updated at the same tick of the update clock. The data will be arranged interleaved exactly as the example shows, because each AO channel needs data to output at each tick of the update clock. The data itself can change based on the frequency you want to output.
    kenstern wrote:
    For example, say we want ao0 with a 1 kHz sine wave and ao1 with a 1.5 kHz sine wave. What does the DMA buffer look like?
    With the same frequency for each channel, the data is interleaved, e.g. (ao0#0, ao1#0; ao0#1, ao1#1, ...), but when the frequency for each channel is different, what does the buffer look like?
    In your example, you need to come up with an update rate that works for both waveforms (1 kHz and 1.5 kHz sine waves). To get a good representation of a sine wave, you need to update more than 10x as fast as your fastest frequency...I would recommend 100x if possible.
    Update Frequency: 150 kHz
    Channels: 2
    Then you create buffers that include full cycles of each waveform you want to output based on the update frequency. These buffers must also be the same size.
    Buffer 1: Contains data for the 1 KHz sine wave, 300 points, 2 sine wave cycles
    Buffer 2: Contains data for the 1.5 KHz sine wave, 300 points, 3 sine wave cycles
    You then interleave them as before. When the data is run through the D/A converter, the two channels output different sine waves even though the AO channels are updating at the same rate.
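    A minimal sketch (in Python purely to illustrate the layout; the MHDDK examples themselves are C++) of how the two waveforms end up in one interleaved buffer, assuming the 150 kHz update rate and 300-sample buffers described above:

```python
import math

# Illustrative parameters from the example above (not tied to any driver API).
UPDATE_RATE_HZ = 150_000       # shared AO update clock for both channels
SAMPLES_PER_BUFFER = 300       # common buffer length for both channels

def sine_cycles(freq_hz, n_samples, update_rate_hz, amplitude=1.0):
    """Whole cycles of a sine wave sampled at the AO update rate."""
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / update_rate_hz)
            for i in range(n_samples)]

# 300 samples at 150 kHz = 2 ms: 2 full cycles at 1 kHz, 3 full cycles at 1.5 kHz.
ao0 = sine_cycles(1_000.0, SAMPLES_PER_BUFFER, UPDATE_RATE_HZ)
ao1 = sine_cycles(1_500.0, SAMPLES_PER_BUFFER, UPDATE_RATE_HZ)

# Interleave per update-clock tick: ao0#0, ao1#0, ao0#1, ao1#1, ...
dma_buffer = [sample for pair in zip(ao0, ao1) for sample in pair]

assert len(dma_buffer) == 2 * SAMPLES_PER_BUFFER
```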

  • PCI Card for Analog output in the range of 10mv

    dear ni,
    I want to know about a PCI card that can generate an analog output in the range of a maximum of 10 mV. I am going to use it for calibration of load cell and strain gauge devices.
    Could you tell me which PCI card will support this type of application?
    Regards,
    Balaji DP

    Try:  http://www.ni.com/dataacquisition/
    These have analog output voltages < 10V:   http://sine.ni.com/nifn/cds/view/main/p/sn/n12:7604,n3:7853/lang/en/nid/1036/ap/daq
    You need something with a high bit count to get good resolution at 10 mV, such as the PCI-6010 which has a 16 bit D/A.   
    Here are the minimum voltage range specs for the 6010:
    Minimum voltage range: -0.2 V to +0.2 V
    Range accuracy: 283 µV
    Range sensitivity: 6.4 µV
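    As a rough sanity check on why the bit depth matters, here is a sketch assuming the ±0.2 V range and 16-bit D/A quoted above (illustrative only, not a substitute for the datasheet):

```python
# Ideal step size (LSB) of a 16-bit D/A over the -0.2 V .. +0.2 V range.
full_scale_v = 0.2 - (-0.2)      # 0.4 V span
lsb_v = full_scale_v / 2**16     # ~6.1e-06 V, i.e. about 6 µV per step

# A 10 mV signal therefore spans roughly 10e-3 / lsb_v ≈ 1600 steps,
# which is enough resolution for small calibration outputs.
print(f"LSB ≈ {lsb_v * 1e6:.1f} µV; steps across 10 mV ≈ {0.010 / lsb_v:.0f}")
```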

  • Best I/O card for SDI or HDMI monitoring

    For quite some time now I have been using Blackmagic to provide I/O on our Macs, including Thunderbolt from our iMac for HDMI & SDI.
    It is commonly known that using I/O hardware is another process and hence will reduce performance.
    We could get by with just a monitoring solution, knowing that we hardly ever output to Digital Betacam anymore.
    Since it's been a while, I thought I would ask if anyone out there has heard of any better solutions for monitoring etc.
    We may replace/upgrade all of our systems, so any info would really be appreciated before we purchase.
    And please, I do not intend this post to become a Mac vs PC thread.
    Thanks in advance

    Adobe Transmit is integrated into CS6 software. There is nothing to turn on or off. It just is.
    AJA hasn't released its final drivers, so I am not at liberty to go into details.
    I will speculate that, based on AJA's fine track record with its hardware and software, they will release a final product that will satisfy most professionals.

  • CS6 doesn't recognize NVidia card for graphics processing?

    On starting up CS6 for the first time, the program said that it doesn't recognize the NVidia GeForce 7300 GT. Is this right? Is this going to be true for the release?

    That's a bummer. I had the same problem but can't remember the exact steps I took to fix it. At first it would not recognize my graphics cards because they were different (NVIDIA, Radeon), and would ONLY recognize the NVIDIA 7300. Then I updated all the drivers, swapped the order of my cards around, and changed the slot settings with the Expansion Slot Utility (if you are on an older MacPro). After some tweaking they all loaded and are working well, but they don't have all the extra features that a new graphics card would have.

  • How do I get PremierePro CS6 to talk with Blackmagic card for HDMI output?

    I recently upgraded my MacPro from CS5 to CS6. I primarily work with Premiere Pro but also with Photoshop and Soundbooth. Previously, my HDMI monitor worked with CS5, but the playback settings in CS6 only had the Adobe Default Player as an option and would not show the Blackmagic card as an option. I upgraded my Blackmagic software to the most recent version (9.7), but that didn't work either. After the Blackmagic upgrade, the HDMI signal in the CS5 Premiere Pro version (still installed and in use) ceased to operate. Subsequently, the Mac OS was also upgraded from 10.7.5 to 10.8.3, but with no change in outcome. I have checked all mechanical connections.
    My platform specs are:
    MacPro (Early 2009)
    OS 10.8.3
    Processor:  2x2.66 GHz Quad-core Intel Xeon
    Memory: 12 GB, 1066 MHz DDR3
    Dual screen Apple display
    Blackmagic output settings (9.7) :  All Video, 4 Analog Audio & 2AES/EBU  (NTSC)
    My question is how do I get PremierePro to talk with the Blackmagic card?
    Phil

    Hi Phil,
    What Blackmagic card do you have?
    There is a known bug with the Intensity cards and Blackmagic driver 9.7 where it does not output HD, only SD. This should be resolved soon.
    In the meantime, if you have an Intensity card you can go back to desktop version 9.6.8.
    Best,
    Peter Garaway
    Adobe
    Premiere Pro

  • Dual SDI output

    Hi,
    I am currently using SpeedGrade with an Nvidia 4000 SDI card for stereoscopic output to a projector with dual SDI inputs. I understand that people can now use 3rd party I/O cards (though I've only heard of people using Matrox).
    I searched Adobe's site for any info regarding 3rd party I/O cards with SpeedGrade, and there's something about Mercury something or other, but I can't find any specs or recommended cards. Does this really work now? Is it buggy? Can it do true stereoscopic (dual SDI)?
    I'm looking for more info regarding this so if there's a good info source for this I'd love to look it up.
    Thanks for any answers to my questions or just feedback regarding this.
    -Cablet
    P.S. Hoping to move from Windows to Mac if possible, yet keep the dual SDI output.

    I'm not sure about 3D, but with the 7.0.1 update SpeedGrade supports AJA, Matrox and Bluefish boards as Dual Display using "Mercury Transmit" (meaning those cards display the image), so that includes SDI on Mac if the card has that option. I don't know if the Q4000 with the SDI daughterboard works on Mac; officially it is not supported. Do note that Apple appears to be moving away from PCI slots, in favour of Thunderbolt.

  • FCPX crashes all the time using the Intensity Pro Card for monitor preview wassup?

    I have been using FCPX for a few months and I like it somewhat, and could really love it if it weren't for it crashing on all my projects. If I scrub the timeline or apply a filter it crashes. I'm using the Blackmagic Intensity Pro card for monitor output and the latest drivers that they put out, and I can't really get through a project because it crashes all the time. I thought that if I got a different display card it would possibly help; I have the GeForce 120 card, which only has 256MB of RAM. Can anybody shed some light on the issue?

    FCP X currently does not support Intensity Pro cards.
    Andy

  • ATI Primary and Nvidia Secondary for Hardware MPE Acceleration

    Hi everyone,
    I'm not sure if this has been discovered yet. I think it is very exciting, and very important for anyone with an AMD (ATI) GPU who wants hardware MPE acceleration.
    It is possible to use Hardware MPE acceleration while using an ATI video card as your primary adapter, and a lesser CUDA Nvidia GPU as a secondary adapter not connected to any monitor.
    My system:
    CPU: 1090T
    Mobo: 890GX
    RAM: 8 1333
    RAID: No
    GPU1: 5870
    GPU2: GTS 450
    As you can see, I have an Nvidia and an AMD GPU in the same system. The 5870 is obviously by far the more powerful of the two, and it is what I use to record rendered footage using FRAPS.
    Recently, I became aware of the powers of hardware MPE. I concluded that the best way to obtain HMPE and maintain my FRAPS recording was to purchase a GTX 480. However, this was out of my wallet's league, as I could not sell the 5870.
    I was already aware that PhysX (A CUDA physics calculation library) could only be run on Nvidia CUDA GPUs (Like HMPE). Many Nvidia card users used secondary CUDA cards to accelerate physics calculation in games. ATI card users could not use a secondary Nvidia card for physics calculation as the Nvidia driver locked down PhysX in the presence of an active ATI GPU. Luckily a clever fellow called GenL managed to hack the Nvidia drivers to force PhysX to work in the presence of an ATI GPU.
    I hypothesised that if I performed that hack, HMPE would gain access to CUDA in a similar fashion to PhysX, thus allowing me to buy a far cheaper GTS 450 and pair it as an HMPE renderer with my 5870. After buying a GTS 450, I failed at implementing the hack and was about to give up.
    HMPE worked when my monitor was connected to the GTS 450, but if I tried to start PPro with the 5870 connected to any monitor, HMPE was unavailable.
    I had two monitors connected to my GTS 450, and was playing around with adding stupid amounts of HMPE-accelerated effects to an AVCHD clip. Realising that it was impractical to constantly switch the DVI cable from the 5870 to the GTS 450, I decided to leave my primary monitor connected to the 5870 and give up on HMPE. So, I reached around behind my computer and did it, but crucially did not quit PPro before I did so.
    When the screen flickered back to life, the yellow HMPE preview bar was still yellow. The timeline still scrubbed perfectly smoothly. HMPE was still working with a 5870 as the primary monitor: The PPro window was on the 5870 monitor, and the 5870 was rendering the window!
    I found that provided I did not close PPro, I could switch between HMPE and SMPE at will, all while using the 5870 as the primary adapter.
    I tested this using a 10 second composition of 3 AVCHD 1920x1080 clips with CC, drop shadow, gaussian blur, edge feather, Basic 3D, transform, Ultra Key, and drop shadow applied, rotating amongst each other. I could still switch even if the 5870 was the only card connected to a monitor.
    Rendering this test clip via PPro direct export takes 30 seconds in HMPE mode with the 5870 and 1.43 minutes in SMPE mode with the 5870.
    However: Rendering performance in AME stays the same whether I selected HMPE or SMPE. I believe this is because AME is a separate application that 're-detects' the ATI card and disables HMPE before beginning the encode, in the same manner that restarting PPro while using the 5870 removes the HMPE option. Rendering the clip in SMPE and HMPE modes using the GTS 450 gave the same 30 second vs 1.43 minute result.
    Therefore, as long as you are happy to encode via direct PPro export you will still see the benefit of HMPE while using an AMD card as the primary adapter.
    I hope this is as terribly exciting to other users of ATI cards as it was for me. This has saved me several hundred dollars.
    Cheers,
    NS2HD

    Interesting results. I own a system manufactured by BOXX, a system developer out of Texas who really knows their stuff. I had asked them if it would be possible to purchase a CUDA enabled card and put it in my secondary slot and use it for MPE while maintaining my current (nvidia) card to run my monitors (also giving me the ability to run four screens). They said that no, according to the Adobe developers they were working with, Premiere could only use MPE off the CUDA card if the monitor previewing your work was plugged into that card. I guess they were wrong!
    Also, from my understanding, you don't see lesser results with AME because it's a separate program that starts separately, you see the lesser results because it has not yet been coded to take advantage of CUDA.

  • Create Membership Card for organization

    I have created a membership card template using Avery Business Cards for the output media. Now I need to insert each member's name in the card and print them. Can someone walk me through the process?
    Thanks
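    This kind of job is normally done with a data merge (for example, Word's Mail Merge or InDesign's Data Merge) fed by a list of members, with the Avery template as the layout. As a rough illustration of the merge step only, here is a sketch that fills a card template from a hypothetical members.csv; the file name and column names are assumptions:

```python
import csv
from string import Template

# Hypothetical card template; placeholders are filled in per member.
CARD_TEMPLATE = Template(
    "Membership Card\n"
    "Name: $name\n"
    "Member #: $member_id\n"
)

def render_cards(csv_path="members.csv"):
    """Yield one card's text per row of the member list (columns: name, member_id)."""
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            yield CARD_TEMPLATE.substitute(name=row["name"],
                                           member_id=row["member_id"])

if __name__ == "__main__":
    for card in render_cards():
        print(card)
        print("-" * 30)   # separator; real output would flow into the Avery layout
```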

    Hi...
    You can purchase iTunes gift cards that can be redeemed to purchase Lion.
    Choose your iTunes Gift Card - Apple Store
    To redeem a code, just click Redeem under Quick Links on the right side of the App Store window.
    Make sure you purchase a card (or cards) that can cover the price of Lion, $29.99, plus any applicable taxes or fees > iTUNES STORE - MAC APP STORE TERMS AND CONDITIONS

  • Is the nVidia GeForce GT640 a good graphics card for PrE 10 despite low memory bandwidth?

    Can anybody confirm that the nVidia GeForce GT640 is a reasonable graphics card for Premiere Elements 10 and Photoshop Elements 10?
    The person who assembled my Core i7 3770K desktop with 16 GB of RAM at 1600 MHz installed the nVidia GT640 card with 2 GB of DDR3 memory. He said that this was a good (fairly low cost) card for video editing because it has 384 CUDA cores - very helpful in video editing. I am pretty ignorant about graphics cards, but like the low power usage, 65 watts, and reputed cool operating temperatures. I have since read that DDR5 memory would have been much faster because of greater memory bandwidth - say 80-90 GB per second compared with 28.5 GB per second for the DDR3 memory on the GT640 card. I was after economical power use. DDR5 cards use 110 watts upwards and run much hotter than DDR3 cards, all other things being equal. The really fast cards require special power units and cooling.
    Does anybody know whether limited memory bandwidth is important in video editing? Is speed much more critical in gaming than in video editing? Are other attributes - such as 384 CUDA cores, NVENC support for dedicated encoding, 28nm Kepler architecture, a 2 GB memory frame buffer, and plenty of texture units - more important than memory bandwidth in video editing? Does bandwidth limited by DDR3 memory affect quality of image?
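    As a rough way to see where those bandwidth figures come from, here is a sketch of the usual calculation; the clock and bus-width numbers are assumptions for illustration, so check your card's actual specifications:

```python
def memory_bandwidth_gb_s(effective_clock_mhz, bus_width_bits):
    """Theoretical peak memory bandwidth: effective clock (MHz) x bus width (bytes)."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# Assumed, illustrative figures for a 128-bit GT640:
print(f"DDR3 variant:  {memory_bandwidth_gb_s(1782, 128):.1f} GB/s")   # ~28.5 GB/s
print(f"GDDR5 variant: {memory_bandwidth_gb_s(5000, 128):.1f} GB/s")   # ~80 GB/s
```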
    I read that the GT640 would be much faster (producing better image quality?) than the HD4000 integrated Intel graphics of the Core i7 Ivy Bridge processor. Is this so?
    Windows 7 Home Premium 64 bit and all programs are installed on a 120 GB OCZ Agility 3 solid state drive. My data drive is a 1 TB Seagate SATA 3 at 7200 RPM. I have a beautiful 21 inch ASUS VS228N LED monitor and an LG Blu-ray burner.
    I did lots of editing with PrE 3 on a Dell 3.06 GHz hyperthreading desktop with Win XP. The output and even the preview monitoring was clear and stable. I am still capturing standard definition mini DV tape by FireWire from a 3MOS Panasonic handycam, but plan to upgrade to HD 3MOS with flash memory. I make the preview monitor really small - about 7 cm wide - in PrE 10, because the quality of the preview picture is much poorer than the quality that I experienced with the Premiere Elements 3 program with Win XP. Is this just an indication of memory-saving in PrE 10 previews? I expect output to be much superior, although still MPEG2-DVD quality until I upgrade my camera. I have set rendering to maximum bitrate.
    Anyway, despite these reservations with preview quality, the GT640 seems to be performing fine. Picture quality in Photoshop, online and elsewhere on my computer is excellent.
    I updated the nVidia display driver only yesterday to version 306.23.
    Nearly all graphics cards forums are about gaming. I hope to see more forums about graphics in editing here.
    What do you think of the 2Gb nVidia GT640 for editing with PrE 10 and Photoshop Elements 10? What would you say about picture quality in the PrE 10 monitor versus quality of output? Was picture quality in the PrE 3 monitor sharper and more stable, as I imagine?
    Regards, Phil

    Sheltie,
    Thank you for the kind words. We all work very hard to help others with video-editing. Some of us also show up on other Adobe forums, depending on the products that we use most often.
    Besides helping out, I also find that I learn something new every day, even about programs that I have used for decades. Heck, I just learned something new about PrE vs PrPro (my main NLE program), when I went to try and help a user. I probably actually use PrE more to test my theories, or to replicate a user's problem, than I do to actually edit my videos. Still, when applicable, I do real work in the program.
    With about a dozen "regulars" here, if one of us is not around, several more usually are. Personally, I do not understand how Steve Grisetti and John T. can dedicate so very much time here. Steve is a noted author of books on PrE, PSE, Sony DVD Architect, and others, plus helps run a video/photography Web site, Muvipix.com, that is very active, and has so very much to offer. John T. is always under the watchful eye of The JobJarQueen, and gets dragged, kicking and screaming, out into the yard, or up on his roof, so can be gone for a bit.
    Neale usually beats us all, since he's in the UK, and normally answers all the questions, that come in too late for us to see. He is also a PrE power-user, so beats me hands down.
    I travel a great deal, but no one ever misses me. Was supposed to do a trip to Sydney last Dec., but had to cancel. Have not gotten details on the reschedule of that trip, but it would have been my first jaunt south of the Equator. Gotta' make that happen.
    Good luck, and happy editing,
    Hunt
