Guest Tessera Posted November 6, 2006

I had thought Oblivion was optimized for ATI, and Morrowind for nVidia, but it may be the other way around... all I know is that it was due to whatever hardware the relevant Xbox was using. I'm pretty sure you're correct about Morrowind being aimed primarily at ATI cards. Oblivion seems to have been developed with nVidia dual or quad SLI systems in mind. Couple that with the fact that nVidia ran ads for Oblivion on its own website for a while, and it leads me to believe they were working together. ATI, on the other hand, ran no such ads on their site. I guess I'm just putting two and two together. I could be wrong, of course; neither video card was officially recommended by Bethesda when Oblivion was released. It's just a feeling I have about it.
tonyt Posted November 11, 2006

.......I'm pretty sure you're correct about Morrowind being aimed primarily at ATI cards. ......

Yup. Oblivion was aimed to be used with ATI cards, but Nvidia works just as well, if not better (I use an Nvidia 7600). It all comes down to whether you want to run HDR and AA at the same time: ATI can, Nvidia can't. But then again, HDR looks bad, so I stuck with Nvidia.

Tony T.

P.S. I hear Nvidia just made a new graphics card... the 8800?
Guest Tessera Posted November 11, 2006

Tony T. P.S. I hear Nvidia just made a new graphics card.....8800?

Yes... a new water-cooled 8800 monster, optimized for use with the upcoming DirectX 10 and Windows Vista. The going price is about 750-800 American dollars right now. The 8800 has full Shader Model 4.0 support and also looks like it could explode and take out a city block...

http://www.tessmage.com/images/BFG_nVidia_8800_water-cooled.jpg

It should be pretty awesome. I guess this is nVidia's answer to the ATI X1950-XTX challenge. But now I'm wondering what's next from ATI... this video hardware war could get scary. :P
Switch Posted November 12, 2006

I'm guessing the "ATI XX2500XTXX Xtreme Xdixion," as MB might say, is just around the corner. After all, it's not only a competition of power; it's a competition of how many Xs one can cram into the title. :P It does look like one seriously nice piece of hardware from NVIDIA. Your comment about it taking out a city block with the heat it generates isn't far off, though, I expect... especially if they've had to resort to water cooling (I mean, water cooling on a graphics card? For pity's sake!). They also say it's 28 centimetres long. That's almost too much to fit in a tower case! Shuttle owners, return your case now...
Guest Tessera Posted November 12, 2006

See... I think I know where all of this is heading... After they stop hugging their knees and come out from under their desks, the engineers at ATI will begin top-secret work on their next suitcase bomb... err, I mean, next-generation video hardware. You see, the new nVidia 8800 card got me thinking. Mainly, I'm thinking that there simply isn't enough room in most modern PCs for these pixel-pounding monstrosities to grow any larger. So... what if they made an external module instead? You'd have a standard PCI-e card that simply works as a breakout box (external interface). The actual video hardware would be separate, in its own enclosure, sitting next to your PC and connected via the card I just mentioned. Then you could cool the whole thing off with liquid Freon, just like air conditioners use. The environmentalists will really love that, too (NOT). No biggie... ATI might get around their protests by claiming that the external module doubles as a space heater, saving homeowners hundreds of dollars per year in home heating bills and...

...and I think I need to go back to bed. Never mind. :P
Marxist ßastard Posted November 12, 2006

An external video card? What will it get its power from, a 9V adapter? Will it connect to the mainboard via USB? Those (major) grievances aside, the concept you're just completely failing to grasp is that the second slot isn't being taken up by functional parts of the video card at all; it's being taken up by a cooling system that would be made obsolete if the card weren't facing the wrong way.
Guest Tessera Posted November 12, 2006

An external video card? What will it get its power from, a 9V adapter? Will it connect to the mainboard via USB? Those (major) grievances aside, the concept you're just completely failing to grasp is that the second slot isn't being taken up by functional parts of the video card at all; it's being taken up by a cooling system that would be made obsolete if the card weren't facing the wrong way.

It was a joke. My dry sense of humor often goes right over some people's heads. It's a cross I bear...

PS - the 8800 also comes in a standard, fan-cooled version for those who feel adventurous.
Marxist ßastard Posted November 12, 2006

It's a cross I bear...

Please stop comparing yourself to Jesus.
Guest Tessera Posted November 13, 2006

It's a cross I bear... Please stop comparing yourself to Jesus.

I don't think he played computer games, so no comparisons would be valid. Anyway, what does any of this have to do with video cards..? :unsure:
Switch Posted November 13, 2006

Nothing. If you think your sense of humour is dry, try deciphering MB's sometime. :P Please stop stirring things, MB. ;)

It is an interesting idea, though. I wonder if video cards, and indeed PCs, will start to enlarge as the limits of how many transistors you can shove onto a small board space are reached?
Archived: This topic is now archived and is closed to further replies.