Two Gainward 6800 Ultras in SLi mode - PCIe VGA Link Width: 8x

I have a K8N Diamond SLI board with two Gainward 6800 Ultras. When I have just one card inserted I get "PCIe VGA Link Width: 16x" in the BIOS, but when I have both cards in I get only "PCIe VGA Link Width: 8x". Is this correct?
I also get a shaky/trembling display in everything Direct3D when I'm in SLI mode; as soon as I turn it off the image is rock solid. It's as if the complete image on screen shifts up by one pixel line and then back to normal again. This happens at regular intervals while playing games and it's very annoying :\ Any ideas?

Check your BIOS configuration for the SLI settings, PCI-E settings and so forth. Have you run any benchmarks with SLI and the load-balance bars enabled? You will find both in the nVidia settings! That should tell you if either card is not pulling its load!

Similar Messages

  • Neo4-Fi PCI-e VGA Link width x8 problem

    motherboard: K8N Neo4-FI
    processor: AMD Athlon64 3200+
    ram: 2 x 1GB
    power supply: Antec True Power Quattro 850
    vga: Sapphire HD2600 XT 256MB PCI-e
    sound: Creative Audigy 2 Platinum
    raid: Highpoint RocketRAID 133
    hdd: 9 pcs (5 ata, 4 sata)
    case: Chieftec BA-02B-B-B
    fdd: ?
    fans: 2 x 12cm, 5 x 8cm
    DVD-rom: ?
    DVD+-RW: Pioneer DVR109
    Here is my problem. I had a Sapphire X1300 VGA card; the link width was x16 and everything worked fine. Now I have bought a new VGA card, a Sapphire HD2600 XT 256MB, and now the PCI-E VGA link width is x8. The motherboard is not SLI and there is only one PCI-E x16 slot. The graphics card works, the card itself is x16, and the slot is x16, so the PCI-E VGA link width should say x16 as it did with my old card, but it says x8. I have updated the BIOS to the latest version. I searched the manual but didn't find anything about setting up the graphics card bus. The BIOS has a setting for PCIE Spread Spectrum, which can be set to "Down Spread" or "Disabled", and a PCIE Clock setting, which can range from 100-145MHz. I didn't change these settings because everything worked fine with the old card, and I don't know what they mean. Is either of these two settings related to my problem? I haven't overclocked any of my components.
    Can someone please help me or suggest what to do?
    Thanks in advance!
    Daniel

    Quote from: daniel66 on 19-March-08, 03:05:03
    I did what you said and it worked. I pulled the card out and inserted it again, pushed it a little bit harder into the slot, and now it says x16 like it used to.
    Thanks again!
    Daniel
    That's why it's good to have a hammer around, just in case.

  • Problem of PCI express link width and speed

    hello,
    I instantiated the PCI Express core v1.7 in a PCI Express endpoint, with the core configured as Gen1 x8 or Gen2 x4. Using the example design Xilinx ships in the IP core directory, I could read and write the device in PIO mode.
    However, when I check the device's link width and speed with lspci -vvvv in Linux, I find that no matter what configuration I set, the link always trains as Gen1 x1, which cuts the device's throughput by a factor of 8. Perhaps the biggest problem is that all the logic in the user layer above the transaction layer is written for 250MHz; if the link is limited to 2.5GT/s and x1, I would need to change the user logic, which is a huge amount of work.
    So my question is: how can I change the PCI Express link width and speed on the OS side, or do I need a new motherboard?
    (I suspect it is related to the motherboard; I checked that the PCI Express slot on the motherboard supports Gen2 x16. Another issue: when I insert a PCI Express Gen2 x8 device, that device is also trained as Gen1 x1.)
    lspci -vvvv
    01:00.0 RAM memory: Xilinx Corporation Device 6018
    Subsystem: Xilinx Corporation Device 0007
    Control: I/O+ Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR- FastB2B- DisINTx-
    Status: Cap+ 66MHz- UDF- FastB2B- ParErr- DEVSEL=fast >TAbort- <TAbort- <MAbort- >SERR- <PERR- INTx-
    Latency: 0, Cache Line Size: 64 bytes
    Interrupt: pin A routed to IRQ 16
    Region 0: Memory at dfcff800 (32-bit, non-prefetchable) [size=2K]
    Region 1: Memory at de000000 (32-bit, non-prefetchable) [size=16M]
    Capabilities: [40] Power Management version 3
    Flags: PMEClk- DSI- D1- D2- AuxCurrent=0mA PME(D0+,D1+,D2+,D3hot+,D3cold-)
    Status: D0 NoSoftRst+ PME-Enable- DSel=0 DScale=0 PME-
    Capabilities: [48] MSI: Enable- Count=1/1 Maskable- 64bit+
    Address: 0000000000000000 Data: 0000
    Capabilities: [60] Express (v2) Endpoint, MSI 01
    DevCap: MaxPayload 512 bytes, PhantFunc 0, Latency L0s unlimited, L1 unlimited
    ExtTag- AttnBtn- AttnInd- PwrInd- RBE+ FLReset-
    DevCtl: Report errors: Correctable- Non-Fatal+ Fatal+ Unsupported-
    RlxdOrd+ ExtTag- PhantFunc- AuxPwr- NoSnoop+
    MaxPayload 128 bytes, MaxReadReq 512 bytes
    DevSta: CorrErr- UncorrErr- FatalErr- UnsuppReq- AuxPwr- TransPend-
    LnkCap: Port #0, Speed 2.5GT/s, Width x8, ASPM L0s, Latency L0 unlimited, L1 unlimited
    ClockPM- Surprise- LLActRep- BwNot-
    LnkCtl: ASPM Disabled; RCB 64 bytes Disabled- Retrain- CommClk-
    ExtSynch- ClockPM- AutWidDis- BWInt- AutBWInt-
    LnkSta: Speed 2.5GT/s, Width x1, TrErr- Train- SlotClk- DLActive- BWMgmt- ABWMgmt-
    DevCap2: Completion Timeout: Range B, TimeoutDis-
    DevCtl2: Completion Timeout: 50us to 50ms, TimeoutDis-
    LnkCtl2: Target Link Speed: 2.5GT/s, EnterCompliance- SpeedDis-, Selectable De-emphasis: -6dB
    Transmit Margin: Normal Operating Range, EnterModifiedCompliance- ComplianceSOS-
    Compliance De-emphasis: -6dB
    LnkSta2: Current De-emphasis Level: -3.5dB, EqualizationComplete-, EqualizationPhase1-
    EqualizationPhase2-, EqualizationPhase3-, LinkEqualizationRequest-
    Capabilities: [100 v1] Device Serial Number 00-00-00-01-01-00-0a-35
    Kernel driver in use: card

    I also have this issue: user_link_up is high and everything looks good, but the LnkSta width is x1. Did you ever get any guidance about this?
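Not from the thread, but the down-training described here can be spotted mechanically by comparing the LnkCap (capability) and LnkSta (trained) lines in `lspci -vvvv` output. A minimal sketch in Python; the regexes assume the output format shown in the dump above, and the function name is mine:

```python
import re

def link_status(lspci_text):
    """Parse LnkCap (capable) and LnkSta (trained) speed/width pairs from lspci -vvvv output."""
    cap = re.search(r"LnkCap:.*?Speed ([\d.]+)GT/s, Width x(\d+)", lspci_text)
    sta = re.search(r"LnkSta:\s*Speed ([\d.]+)GT/s, Width x(\d+)", lspci_text)
    if not (cap and sta):
        return None
    capable = (float(cap.group(1)), int(cap.group(2)))
    current = (float(sta.group(1)), int(sta.group(2)))
    return capable, current

# The two relevant lines from the dump above:
sample = """\
LnkCap: Port #0, Speed 2.5GT/s, Width x8, ASPM L0s, Latency L0 unlimited, L1 unlimited
LnkSta: Speed 2.5GT/s, Width x1, TrErr- Train- SlotClk- DLActive- BWMgmt- ABWMgmt-
"""
capable, current = link_status(sample)
if current[1] < capable[1]:
    print(f"link down-trained: running x{current[1]} on an x{capable[1]}-capable device")
```

The same comparison works against the sysfs attributes `current_link_width` / `max_link_width` on Linux, without parsing text.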
     

  • Twin 6800 GTs in SLi

    Hi folks,
    I've got myself the ASUS A8N-SLI Deluxe board and have popped two of these in it:
    http://www.scan.co.uk/Products/ProductInfo.asp?WebProductID=154668
    (MSI 6800 GTs)
    After *a lot* of testing, playing, and mucking about, I still cannot get them to work satisfactorily in SLI mode.
    Picture Shows it best (Far Cry)
    Link
    There is distortion of one half of the screen (Show GPU balancing is enabled to show that it is the output of one card).
    This distortion is not seen when captured direct from DirectX (Fraps)
    Link (Warning: large Files, sorry)
    Board is the A8N-SLi Deluxe (BIOS REV 1002), Athlon Fx55, MSI Gf6800GT x 2, 550W PSU
    tried:
    swapping cards round,
    cards individually (fine)
    dicking around with memory (dual channel on and off etc)
    unplugging all optical drives / fans to free up power
    nvidia drivers 69 - 71.20 (no change)
    E-Z Plug is in place, E-Z Selector configured for dual cards, SLi connector in both orientations
    Other apps (HL2) show similar behaviour, although they seem to implement SLI in another way, as the interference appears in a different fashion on the screen (flickering on and off, scrolling up/down the screen).
    at lower resolutions, the effect is not quite so bad and appears as simply a distortion on the lower half of the screen when the cursor is moved over / there is movement on screen.
    In Single card mode everything is fine also
    If you run a game in SLi mode, BUT windowed, there is also no distortion effect.
    Some other people are having this problem:
    Anandtech Forums
    (Last Starfighter is me)
    and
    Link
    but Asus insist that the fault lies with the MSI cards:
    Quote
    Dear Alex,
    This is more likely to be related to the graphics cards themselves. Try the latest Nvidia drivers or check with MSI.
    UK Technical Support
    PLEASE DO NOT REMOVE PREVIOUS MAILS
    WHEN REPLYING AS WE CANNOT GUARANTEE YOU
    WILL ALWAYS GET THE SAME PERSON RESPONDING
    Any ideas chaps?

    [Whoops - lagging a bit. This is for the pre-PSU posts.]
    Alex - *Weird*. Looks like the final presentation blit is sorting everything out. I'm hazy on how some of SLi works with this kind of thing. It's also possible that in full screen mode the app is trying to do something which the driver can't handle in SLi - overlays or some kind of palette fiddling, maybe? Just a thought. I don't know if there's a way to fiddle the driver settings to disable the caps for something like that, which might help. (I'm guessing, I don't know my way around nVidia drivers.)
    Zoomee - well, it's probably not cut down, for a GTo card.   I suspect they just didn't know. Given the trouble I've had getting anything out of MSI UK (who *also* don't seem to know anything, I don't think that they're being deliberately unhelpful) I believe Scan's technical support when they say they're also unable to find anything out. I've not seen the GTo card specs anywhere on MSI's global site either, and I did try to look before buying it - MSI UK probably should have told me (while I was getting them to find out about the dual link issue) if they'd known.
    It sounds like MSI could do with publishing some more information on the things they're shipping, but at the moment I'm too well-disposed to them for actually producing a dual link card (unlike, e.g., Innovision, who *still* say their card is dual link in spite of my contacting them twice, over several months, and finding out that it's not; for some reason their Ultra seems to have lost some pipes, too) to complain.
    It would help if nVidia would actually mention any details about the 6800 variants on their web site, of course. There's nothing at all I've found about GTo or LE cards, and very little to distinguish the others.
    I thought Scan's site still says 16 pipes (they just quote the 6800GT specs); someone was kind enough to mention that it's 12. I actually posted a query on that before finding out (still being processed, along with the following one saying "ignore me, and you'll be getting your cards back"). Generally, I find Scan to be very good until something goes wrong, at which point they get very confused. It's a gamble. I suspect someone who knows what they're doing is in the background at Scan, but everyone else is running by puppetry, and occasionally the strings get tangled.
    At least I now know why 3DMark05 wasn't smooth the whole way through...
    I'm keeping my fingers crossed for Overclockers. To be fair, it's not January yet (and my PCI-E motherboard won't turn up until then anyway).  Again, it's unlikely to be their fault if things run late. The only reason I'm looking at GTs is that I've given up on Ultras appearing any time soon - but I need a dual link connector (which means the MSI GT/GTo or I'm into Quadro territory), and I could really do with both SLi and decent generic shading rate. Under Linux, with no SLi (I presume?) for now, it's even more important that the basic card is fast.
    At least I've not pulled my current Opteron apart yet; it's only my desk that's a rat's nest of cabling.
    I guess we should have realized that 6800 availability was going to be a problem when the die size was first announced. I was kind of hoping they'd have given it a die shrink by now, though (I was expecting the Quadro 4400 to be on a smaller process, but then I was expecting it to be released before now...) and fixed up things like the video decode. The problem with the flagship silicon is that, for marketing, it's wonderful to produce a small number of them and then completely ignore them while 6600GTs get shipped to the masses. Sigh.
    I've heard good benchmarks for the Radeons, but I really want PS3.0 and dual link (and SLi helps) so I've ignored them. Good luck if you go that route - you'll never reach SLi speed that way, though. Gigabyte's card with two 6600GTs in one slot
    might be worth a look, if it's available yet (you can't SLi it, though).
    Thanks for all the information, guys - I'd have had a rude shock some weeks too late if I'd not lurked here...
    Er. Alex, I really *hope* that PSU copes. I've only just upgraded to a 480W
    Tagan (although it's under-specced), and I'm expecting that to handle it.
    Andrew

  • Neo 2 Desert Combat stuttering (sliding, skipping etc.) problem with BFG 6800 Ultra

    hey guys, I just built a new PC. After I got everything installed and updated to Service Pack 2, I installed Desert Combat, only to have stuttering problems, especially when turning fast in a chopper or plane. It's kind of like lag or dropped frames, but according to Fraps it's not dropping frames, just really choppy gameplay. I know it's not the card, because I stuck it back in the Barton 3000 board and it worked fine. I put my old MSI 5950 Ultra in: same problem.
    Fixes tried with no success:
    reinstalled Windows
    turned off nv
    turned off fast writes
    tried my old set of 512 Corsair Platinums = 1 gig dual channel
    installed nForce driver from Nvidia
    tried several Nvidia graphics card drivers, even the beta 71.21
    turned off spread spectrum
    pulled hair out
    tried my old video card (5950 Ultra); both cards work fine with the old mobo
    Basically I'm out of ideas.
    Please help.
    System specs:
    90nm Winchester core 3500 with a Thermalright XP-90 heatsink
    Audigy 2 ZS
    Corsair 3200LL Pro with LEDs, 1 gig in dual channel mode
    two 74 gig Raptors in RAID 0
    OCZ 520 watt ModStream power supply
    BFG 6800 Ultra, overclocked from the factory, 425/1100
    XP Pro, fully updated to SP2
    Please, please, please help; I've been working on this for two days without any luck. Maybe I'll just RMA the board and get an Asus or Gigabyte board.

    First off have you been using Driver Cleaner to get rid of all the old bits of drivers? if not go get it read the instructions and use it.  Re-start the driver process again.  The latest ones on the Nvidia site are probably a safe bet at the moment.  So clean your PC with that prog and install new drivers.
    Then do as you say you have done and ensure 'Fast Writes' are turned off. This is most important. Get 'RivaTuner'; it will confirm you have them turned off. Also get the 'Coolbits' registry tweak. Run this and it will add overclocking capability to your display properties. You can adjust both 2D and 3D speeds. Now what you must ensure is that they are both set at the same speed! Sometimes the 2D speed comes set lower than the 3D; this can cause symptoms like the ones you are having. Set the 2D to the same speed as the 3D.
    I have an XFX6800 Ultra  in my K8N Neo2 Plat and it works superbly so fingers crossed doing the above will sort it as they are the main probs with the 6800Ultra cards.

  • Diamond Plus SLI mode?

    OK, now that I have installed all the chipset drivers, I'd like to enable SLI mode. But how can I do that? I have installed two GF 6800s with the SLI bridge and Windows finds the second card, but there is no option to enable SLI mode anywhere.
    What is the trick to do that? The manual only says to plug in the two cards with the bridge, nothing else.
    BTW, the lower heatsink is very hot, I think. You can hardly put a finger on it. Is that normal?

    Quote from: mrking.id on 07-May-06, 09:43:26
    I know with the vid drivers one should uninstall the old ones before intalling the new ones, but what about the nForce drivers? I've never updated them before so what is the protocol?
    If you're using nForce2/3 you don't need to update to the latest drivers; the best driver is v5.10. If you're using nForce4 then you have to update to the latest drivers from the Nvidia website. Of course you'll see a lot of listings on the Nvidia driver page, and you need to select Nvidia AMD. As for the VGA, you also need to update it to the latest driver. Good luck.

  • K8N+SLI Mode - Need help

    Hello,
    I was going to buy the second graphic card for my computer to work in SLI Mode. Nowadays I am using:
    MSI GeForce 6600 128MB DDR3/128 bit TV/DVI/VIVO PCI-Express (8979-06S)(NX6600-VTD128E Diamond)
    so I went to the shop in which I always buy computer parts, and was told that this card is no longer available 
    The only two models of MSI Graphic cards (based on 6600 chipset) that are available are:
    MSI GF 6600 128MB/128b TV/DV PCI-E MS-8981-Z02
    And
    MSI GeForce 6600GT 128MB DDR3/128bit TV/DVI PCI-Express (8983-010) (NX6600GT-TD128E)
    Will any of the mentioned card work with my MSI GeForce 6600 128MB DDR3/128 bit TV/DVI/VIVO PCI-Express (8979-06S)(NX6600-VTD128E Diamond)?
    Please HELP ME!
    Greetings

    Try to boot with 2 memory sticks only; when using 4 sticks, set the memory index to 333. (The Winchester core had a weak memory controller as well; it may not handle 4 sticks properly.)
    Clear CMOS Guide
    and re-test.
    The next thing is your PSU; it doesn't look like it has enough juice for your VGA. VGA min. req.: over 30A on +12V for a single rail, over 22A for dual-rail +12V (per ATI).
    Your PSU has +12V1 17A, +12V2 17A.
    good PSU recommendation:
    https://forum-en.msi.com/index.php?topic=103299.msg757739#msg757739
    or borrow from friends to test.
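The rail arithmetic in the reply above can be sketched as follows. The thresholds are the ones quoted in the thread ("over 30A single rail, over 22A dual rail"); the function name and reading the dual-rail figure as a per-rail minimum are my assumptions, not from the thread:

```python
def psu_ok(rail_amps, single_rail_min=30, per_rail_min=22):
    """True if the +12V rail amperages clear the quoted GPU minimums."""
    if len(rail_amps) == 1:
        return rail_amps[0] > single_rail_min
    # Dual/multi-rail: assume each rail feeding the card must clear the per-rail minimum.
    return all(a > per_rail_min for a in rail_amps)

print(psu_ok([17, 17]))  # the poster's PSU: +12V1 17A, +12V2 17A
```

Under that reading, two 17A rails fall short, which matches the reply's conclusion.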

  • Enable SLI mode on K8N SLI Platinum

    Hello, I got an older AMD64 3500+ CPU on a MSI K8N Platinum SLI motherboard, model 7100 (with a flip-board SLI switch), 2x512Mb DDR400 Corsair XMS512C2 memory, a Soundblaster Live! 4.1 soundcard on a PCI slot and two GeForce 7950 GT 512Mb DDR3 PCI-E cards. Power is supplied by a Corsair HX620 620W with separate power cables to each card. No signs of instability on the system whatsoever (no overclocking applied). Built-in sound card, Firewire, RAID and other integrated peripherals disabled in BIOS.
    I recently got the second GFX card from a friend, and they are likely not from the same manufacturer. I have installed both cards, flipped the SLI board to "SLI mode", installed the SLI bridge that came with the motherboard, yet I can't get SLI to fire up. Both cards have exactly the same BIOS revision and exactly the same clock speed (core and memory) when checked with the ESA tools from Nvidia.
    I have reinstalled the newest Nforce4 and Geforce drivers, but to no avail. Both cards are installed and enabled in Windows XP (32bit SP3), but I don't get any options for SLI in the control panel, and I don't see any performance difference in-game. I have looked in the BIOS for SLI-enable-settings but found none.
    Is there anything else I can do to enable SLI mode for the system? Or are the cards simply incompatible? Reflashing with some stock BIOS on both cards, can that make them compatible? Or is maybe the SLI bridge not seated properly? What exactly ARE the requirements for two cards to match, anyway?
    Further, does anyone know the real-world implications of disabled PCI prefetch, as this was disabled in the last update for the motherboard? I use a PCI sound card, and I have some performance issues in games that seem to be somewhat correlated with sounds being played.
    Best regards, Martin "xarragon" Persson

    Quote from: Svet on 12-August-09, 20:23:52
    which one? http://eu.msi.com/index.php?func=searchresult&keywords=NX7950GT
    Mine seems to be the "MSI NX7950GT-VT2D512EZ-HD", the latter with VIVO support. It has passive cooling using heatpipes, and a sticker on it says what I put in double quotes above. That would be this card:
    http://eu.msi.com/index.php?func=proddesc&maincat_no=130&prod_no=1045
    The sticker/model name for the ASUS card says:
    ASUS P455 REV: 1.00
    EN7950GT/HTDP/512M/A
    I suppose it ought to be this card? http://www.asus.com/product.aspx?P_ID=5XmjRmrwz3r9rWuv
    I have also used GPU-Z to pull the following info from the cards, when they were connected one by one to the primary PCI-E x16 slot with the SLI switch set to "non-SLI" mode.
    GPU-Z: MSI BX7950GT-VT2D512EZ-HD:
    GPU: G71 Revision: A2
    Device ID: 10DE-0295
    BIOS Version: 5.71.22.42.06
    Subvendor: MSI (1462)
    ROPs: 16
    Bus interface: PCI-E x16 @ x16
    Shaders: 24 Pixel / 8 vertex
    DirectX support: 9.0c / SM3.0
    Pixel fillrate: 8.8 GPixel/s
    Texture fillrate: 13.2 GTexel/s
    Memory type:  GDDR3
    Bus width: 256 Bit
    Bandwidth: 44.8 GB/s
    Memory size: 512 Mb
    GPU Clock: 550 MHz
    Memory: 700 MHz
    GPU-Z: For ASUS P455
    GPU: G71 Revision: A2
    Device ID: 10DE-0295
    BIOS Version: 5.71.22.42.06
    Subvendor: ASUS (1043)
    ROPs: 16
    Bus interface: PCI-E x16 @ x16
    Shaders: 24 Pixel / 8 vertex
    DirectX support: 9.0c / SM3.0
    Pixel fillrate: 8.8 GPixel/s
    Texture fillrate: 13.2 GTexel/s
    Memory type:  GDDR3
    Bus width: 256 Bit
    Bandwidth: 44.8 GB/s
    Memory size: 512 Mb
    GPU Clock: 550 MHz
    Memory: 700 MHz
    Both cards work fine, with the MSI card being slightly hotter due to the passive cooling. The ASUS one got an aftermarket Zalman VGA cooler mounted.
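As an aside on the "what exactly are the requirements for two cards to match" question: one way to frame it is as a field-by-field comparison of the GPU-Z dumps above. Which fields the SLI driver actually checks is not documented in this thread, so the field list below is illustrative only, and the function name is mine:

```python
# Illustrative only: the driver's real matching rules are not documented here.
# Field values are taken from the GPU-Z dumps above.
REQUIRED_MATCH = ["GPU", "Device ID", "Shaders", "Bus width"]

def mismatched_fields(card_a, card_b, fields=REQUIRED_MATCH):
    """Return the checked fields that differ; an empty list means the pair looks compatible."""
    return [f for f in fields if card_a.get(f) != card_b.get(f)]

msi = {"GPU": "G71", "Device ID": "10DE-0295", "Shaders": "24/8", "Bus width": 256, "Subvendor": "MSI (1462)"}
asus = {"GPU": "G71", "Device ID": "10DE-0295", "Shaders": "24/8", "Bus width": 256, "Subvendor": "ASUS (1043)"}
print(mismatched_fields(msi, asus))
```

On these two dumps everything except the subvendor matches, including the BIOS version, which is why the "incompatible cards" theory looks unlikely here.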

  • Neo4 Plat/Sli Dual PCI x16 or Dual x8

    I'm a little confused about my motherboard (K8N Neo4 Platinum/SLI)... the specs printed on the box say "Dual PCI-E x16", which I understand to mean that I have 2 PCI-E ports with x16 bandwidth on each port...
    The user's guide says "The mainboard provides two PCI Express x16 and three 32-bit PCI-BUS".
    Is this true...??
    2 x16 PCI-E = 32 PCI-E lanes for graphics, or 2 x8 PCI-E = 16 PCI-E lanes for graphics?

    There are only 20 PCI-E lanes in total on the nForce4.
    "Dual x16" in this context means that it has two physical x16 slots.
    On delivery the SLI card/switch is mounted so that the first x16 port has 16 lanes and the second x16 port has 2 lanes.
    When it's set to SLI mode it reorganizes them to 8 lanes on each of the two slots.
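The lane arithmetic described above can be sketched as (the function name is mine):

```python
# nForce4 lane budget as described above: 20 lanes total, with the SLI
# paddle card steering lanes between the two physical x16 slots.
TOTAL_LANES = 20

def slot_lanes(sli_mode):
    """Lanes routed to (slot 1, slot 2); the remainder serve x1 slots and peripherals."""
    return (8, 8) if sli_mode else (16, 2)

for mode in (False, True):
    s1, s2 = slot_lanes(mode)
    assert s1 + s2 <= TOTAL_LANES
    print(f"SLI={mode}: slot1=x{s1}, slot2=x{s2}, spare lanes={TOTAL_LANES - s1 - s2}")
```

Either way, the graphics slots never get 32 lanes; "Dual x16" refers to the physical slot size, not the electrical lane count.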

  • Is the geforce 6800 ultra ddr3 same as the ddr and will the ddr3 fit my dual powermac G5 june 2004?

    I want to get this video card, but there is a DDR version and a DDR3 version. What's the difference, and will it fit my Power Mac dual-CPU (June 2004)?

    The Apple OEM versions are GDDR3 memory for the GT DDL and the Ultra DDL.
    Both run 16 pipelines and both use the NV4x GPU clocked at 350MHz and 400MHz respectively..
    Total bandwidth runs in the favor of the Ultra DDL:
    Geforce 6800 GT 32GB/s
    Geforce 6800 Ultra DDL 35.2GB/s
    However, benchmarks show the performance of the two cards to be very close:
    http://www.barefeats.com/radx800.html
    The PC versions of the GT and Ultra use the same GPU, GDDR3 memory and same clock speeds, except in the case of OC variants.
    The Memory clocks of the GT and Ultras (OEM or PC) are 500 MHz (1000 MHz) and 550 MHz (1100 MHz).
    Performance of the PC  version is the same as the OEM equivalent, only port configuration and card width being major differences (PC cards are typically single slot, while OEM are dual slot. This is a negative if one has many PCI/PCI-X cards).
    Of course there are cooler variations, but all use either a single or a dual slot Cooler Master cooler.
    Both OEM GeForce 6800 GT and Ultra cards will fit and work in any AGP G5 (yours), with only OS X Panther 10.3.6 or later required (Tiger saw an improvement in GeForce drivers).
    Of the flashed GT and Ultra cards, the same is true as of the OEM: either can be used in any AGP G5.
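The bandwidth figures quoted above follow directly from the 256-bit bus and the effective (double data rate) memory clock; a quick sanity check, with a function name of my own choosing:

```python
def mem_bandwidth_gb_s(bus_bits, mem_clock_mhz, transfers_per_clock=2):
    """Peak memory bandwidth: bytes per transfer x effective transfer rate (DDR = 2 per clock)."""
    return bus_bits / 8 * mem_clock_mhz * transfers_per_clock / 1000

print(mem_bandwidth_gb_s(256, 500))  # 6800 GT: 32.0 GB/s
print(mem_bandwidth_gb_s(256, 550))  # 6800 Ultra DDL: 35.2 GB/s
```

The 500 MHz and 550 MHz inputs are the GT and Ultra memory clocks quoted above (1000/1100 MHz effective).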

  • Geforce 6800 Ultra - output only video to TV

    I just bought a new Geforce 6800 Ultra and I'm a little confused about the settings for TV output via the s-video cable. Is it possible to have only the video I'm watching zoomed full screen on the TV? Currently I'm only able to get the desktop shown on the TV but not the video only.
    I'm asking this because I used to have a Matrox Parhelia 128 which had this function, and I liked to watch all my DVDs and other video on the TV.
    Thank you for your help!

    I'm still confused about the TV-problem.
    When I switch to TV mode I can watch videos on the TV, but when I switch back I have to redo all the resolutions and screen adjustments in the settings. It's really frustrating to do it every time.
    Does anyone know a workaround for this?

  • I have a 6800 ultra in my powermac, what temp should it be?

    I have a 6800 Ultra in my PowerMac 2.7; what temp should it be? Right now the case is between 111-120°F and the chip is between 150-160°F. Is this the temperature it should be running at? Also, is the only difference between the Ultra and the GT the 50MHz core clock and the 100MHz memory clock? Thanks for the help.

    Dear The Geek,
    I have the Dual 2.5 G5 with a very similar cooling system, etc. I just recently installed a 6800 GT card, new from eBay, and it has been running about 111ºF on the graphics processor case and 144ºF on the graphics processor chip at idle. As far as I know, the differences in clock and memory speed are the only differences between the 6800 GT and Ultra. That said, the difference in processor operating temp could be due to the slightly higher clock speed, I suppose. Running Cinebench I did get the processor temp up to about 147ºF. I don't think the extra 10ºF should be a critical difference. I wonder how your fan speeds compare. For instance, my graphics processor fan runs static at 32% and the PCI slot fan runs static at 49%. How does your PCI fan compare? I notice mine runs considerably faster with the added current used by the 6800 compared to the original 9600 XT.
    Chad
    PowerMac G5 Dual 2.5GHz   Mac OS X (10.4.5)   23" ACD HD; PB G3 Lombard; 1G, 4G iPods

  • Upgraded 10.5.5 Dual G5 6800 Ultra lost main monitor

    I run Dual head (twin LCDs 1280x1024) on my Dual 2.5GHz G5 with 6800 Ultra
    I just upgraded to 10.5.5 and I lost my primary display, it's not kicking out any signal that my screen can work with. The secondary screen is fine.
    I went to display settings and changed the primary screen to loads of valid settings and nothing works.
    Reset PRAM, still nothing.
    I suspect from reading other posts that 10.5.5 is bollocked on G5s somehow.
    Does anybody have any insight or information about what I can do to fix things?
    I'm not overly happy about trying to drop back to 10.5.4 as I just lost one of my drives and I'm actually trying to restore that at the moment so the prospect of finding a spare 500G drive to clone onto is a bit scary etc.
    Anyway, let me have the suggestions

    Hi Mordatdansant,
    Welcome to Apple Discussions!
    I want to say make a bootable clone first and use Archive and Install as the option on the OS Disk to get back to an earlier OS, but you don't seem very up for that.
    There's another thing you could try to fix your issues, but only if you're willing to take an element of risk. I've done this two times to clean up weird problems on my Mac, but I had backups galore and could take the risk (I'm reckless as heck too). Exactly what problems trying this fix could cause, I don't know. Essentially, I would think it is just copying over files to replace identical ones. I just don't see this posted too often.
    So do you feel lucky..., well, do ya' "Mordat"?, huh?
    If so, download the Mac OS X 10.5.5 Combo Update, yeah, the gigantic one, here:
    http://www.apple.com/support/downloads/macosx1055comboupdate.html
    Once you get it all to your desktop, double click on it and apply the update, again. When I did it I don't recall the installer giving me the usual install options, but if it does do an Archive&Install.
    You'll be asked to restart, and when it all starts up again, ideally, your problems will be gone.
    Best of luck
    Steve
    Message was edited by: Samsara

  • Can My 6800 Ultra Push Video To My Trusty Dusty Samsung HDTV

    Hello
    I have a DP 1.8 with a 6800 Ultra card. I have an older Samsung HDTV that has a DVI connector. The Samsung website tells me that it is a DVI-D Single Link input. Can I run a standard DVI-D cable from my card to my TV?
    I am hoping to use this older Samsung of mine as a display to test my video edits on before exporting and what not.
    Anyone have any luck?
    Thanks.
    If anyone finds this who's curious: I have two 1.5TB Seagate HDs in my G5 and it runs like a champ.

    If one or both connections are DVI-D, you need a DVI-D cable.
    http://www.datapro.net/techinfo/dvi_info.html

  • GeForce 6600 & 6800 Ultra or 7800GT in G5 QUAD???

    Hello guys.
    What better for upgrade?
    I checked tests and saw that the 6600 is better than the Ultra...
    So... what card do I need if I want to change my 6600?
    Any ideas?
    Thanks for help...

    Hi Lexas;
    I currently have the Quad with the 7800 GT in it. It is great.
    In the title you sound like you might be thinking of putting a 6800 Ultra in the Quad. I don't think it will work. The 6800 Ultra is AGP and the Quad uses PCI Express cards. So I don't think it will fit into the Quad.
    Allan
