carlc1993 Posted September 28, 2011

Hello, I recently got a new laptop with a dedicated graphics card. I want to start playing Fallout New Vegas again, but when I load the game and go to the options menu it does not recognize my graphics card. I have a 1GB AMD Radeon card, yet the graphics adapter dropdown only shows "Mobile Intel HD Graphics". Why is it not displaying the AMD card?

Thanks, Carl
bben46 Posted September 28, 2011

It doesn't recognize the graphics card because the card is newer than the game. All that means is you will have to set the graphics resolution yourself. However, not every laptop graphics subsystem will work with these games; some are just not made to run games.

Using the computer you plan to play the game on, go here to see if it can run the game: http://www.systemrequirementslab.com/cyri/

You will find the various games we support listed as:
Oblivion as 'The Elder Scrolls IV: Oblivion'
Fallout 3 as 'Fallout 3 PC'
Dragon Age as 'Dragon Age: Origins'
Dragon Age 2 as 'Dragon Age II'
The Witcher as 'The Witcher'
The Witcher 2 as 'The Witcher 2: Assassins of Kings'
Fallout New Vegas as 'Fallout: New Vegas'
Morrowind is not listed; however, if you can run Oblivion at all, Morrowind should run well on your system.

Note: TES V - Skyrim is not listed yet, as no one knows what the final requirements will be.
carlc1993 Posted September 28, 2011 Author

When I go to the settings menu for the card there is a menu for gaming. I have tried that tool and the test failed. So what you're saying is the graphics card won't do anything for the game?
bben46 Posted September 28, 2011

If that site says your computer cannot play the game, it usually tells you why. If it failed on the graphics card, then you are probably out of luck, as you cannot change the graphics subsystem on a laptop. :confused:
carlc1993 Posted September 28, 2011 Author

OK, I went to the site and it said fail. And you are right: when I first started the game it set the settings to low. However, I changed the graphics to high and it played OK, with a little bit of lag.

My laptop is an HP Pavilion DV7. The spec is:
6GB RAM
Intel i7 processor
AMD Radeon HD 6490M graphics

Will this be OK to run the game?
GothikaGeist Posted September 28, 2011 (edited)

Just to clear things up: laptops automatically DISABLE the gaming video adapter to conserve battery power, and when they do that they fall back on the integrated graphics adapter (in this case the Mobile Intel HD Graphics). So if that is a gaming laptop with a secondary (gaming) video adapter, try going into your PC's power settings and turning them up (more battery use), or look for options pertaining to activating the gaming adapter. (There were a few technical support threads about this before; try searching this forum for the answers as well.)

EDIT: Just to further clear the air of doubt here: the age of the video card should not matter. You said you run an AMD Radeon HD 6490M? It should pop up as AMD Radeon 6400 Series. I'm personally running an AMD Radeon 6750, which is recognised correctly as an AMD Radeon 6700 Series.

EDIT: Oh, and with those listed specs you should definitely be able to run the game; you FAR exceed the minimum requirements.

Edited September 28, 2011 by GothikaGeist
carlc1993 Posted September 28, 2011 Author

On 9/28/2011 at 4:53 PM, GothikaGeist said: Just to clear things up: What's going on is that laptops automatically DISABLE the gaming video adapter to conserve battery power, when they do that they fall back on their internal graphics adapter (In this case the Mobile Intel HD Graphics). So if that is a gaming laptop with the secondary (Gaming) video adapter then try going into your PC's POWER settings and turn 'em up (More battery use) or look for options pertaining to activating the gaming adapter.

Thanks very much, I will try to configure the battery usage settings :)
carlc1993 Posted September 28, 2011 Author

I have messed around with the power settings and it still isn't showing my graphics card in the graphics adapter dropdown menu.
GothikaGeist Posted September 28, 2011 (edited)

On 9/28/2011 at 5:13 PM, carlc1993 said: i have messed around with the power settings and it still isnt showing my graphics card in the graphics adapter dropdown menu

Have you set it to the "Best Performance" power plan, or whatever it's called? I'll search some stuff up.

From what I've read, try opening your BIOS and look for a setting to select either the integrated (Intel) or dedicated (AMD) video card. Here's something good: http://forum.noteboo...phics-card.html (that guy is trying to switch to the integrated graphics card, the reverse of what you want).

Apparently if you have XP you need to go into your BIOS, go under the Display setting, and change your video adapter. In Vista (and I'm guessing Windows 7 as well) you change your power plan to its maximum setting (best performance or whatever):

"Yes, if you are using Vista you can get to the option by left clicking on the battery icon near the systray, then select switchable graphics, then 'Energy saving'. If you are running XP then you can't do it this way; instead you have to go into the BIOS to select this option. Press F1 to go into the BIOS, then go to the Display section, disable switchable graphics, and select integrated." [Remember: you're doing the reverse of this.]

Edited September 28, 2011 by GothikaGeist
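One more thing that might help while fiddling with those power/BIOS settings: you can check from the command line which adapters Windows actually sees, and which one DirectX (and therefore the game) is being handed. This is just a generic Windows sketch, not something from the HP manual; `wmic` and `dxdiag` ship with Vista/7:

```
REM List every video adapter Windows knows about.
REM If only the Intel chip appears here, the AMD card is disabled
REM at the driver/BIOS level and no in-game setting will find it.
wmic path Win32_VideoController get Name

REM Dump the full DirectX report to a text file; the "Card name"
REM line under "Display Devices" is the adapter games will use.
dxdiag /t dxdiag_report.txt
```

If the AMD card shows up in `wmic` but dxdiag still reports the Intel chip, the switchable-graphics setting (Catalyst/power plan) is what needs changing, not the game.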