AceGoober Posted February 25, 2016

Loaded up TechPowerUp GPU-Z and set it to record while playing Fallout 4, using the Nvidia-optimized defaults for 3840 x 2160 with ReShade and a bunch of mods. Played for roughly three hours, then pulled the data log into a spreadsheet. Check out the VRAM usage...

Date: 2/25/2016 3:14:34 AM
GPU Core Clock [MHz]: 1200
GPU Memory Clock [MHz]: 1752.8
GPU Temperature [°C]: 79
Fan Speed [%]: 64
Fan Speed [RPM]: 3102
Memory Used [MB]: 10188
GPU Load [%]: 45
Memory Controller Load [%]: 25
Video Engine Load [%]: 0
Bus Interface Load [%]: 0
Power Consumption [% TDP]: 72.6
PerfCap Reason: 4
VDDC [V]: 1.143

I pushed Fallout 3 and Fallout: New Vegas to 3.8 GB of VRAM usage; neither would go higher no matter what I did. Skyrim could be pushed to 5.1 GB, after which it would crash. It is clear that Fallout 4 won this round. :hehe:

I uploaded the three-hour slice into a Google Spreadsheet if anyone wants to peruse the data: Fallout 4 VRAM Data Filtered V1
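For anyone who wants to skip the spreadsheet step, here is a minimal sketch that pulls the peak reading straight out of the sensor log. It assumes GPU-Z's comma-separated log format with a "Memory Used [MB]" column; the file name is an assumption, so point it at your own log.

```python
import csv

# Minimal sketch: report the peak "Memory Used [MB]" sample from a
# GPU-Z sensor log. Assumes a comma-separated log with a header row;
# the file name below is an assumption -- use your own log's path.
LOG_FILE = "GPU-Z Sensor Log.txt"
COLUMN = "Memory Used [MB]"

with open(LOG_FILE, newline="") as f:
    reader = csv.DictReader(f, skipinitialspace=True)
    # GPU-Z pads its column headers with spaces, so normalize them.
    reader.fieldnames = [name.strip() for name in reader.fieldnames]
    peak = 0.0
    for row in reader:
        sample = (row.get(COLUMN) or "").strip()
        try:
            peak = max(peak, float(sample))
        except ValueError:
            continue  # skip blank or malformed samples

print(f"Peak {COLUMN}: {peak:.0f}")
```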
steve40 Posted February 26, 2016

Of course, Fallout 4 is a 64-bit game.
AceGoober Posted February 26, 2016

Yes, but have you ever viewed a game using 10 GB of graphics-card VRAM? I'd hazard a guess probably not.
Moraelin Posted February 26, 2016

Well, I think you'll find that when following a hypothesis to its logical end yields a blatantly absurd result, you've just disproved that hypothesis. X => Y is equivalent to !Y => !X. It's called a reductio ad absurdum. In this case it tells you that the assumption that what your tool shows as VRAM use is actual graphics-card memory must be false, since no gaming card has that much RAM. Not even the Titan X. I don't know what your particular tool thinks it does, but taking a wild guess, what you're seeing is that DirectX swaps textures in and out over the PCIe bus, from actual video-card RAM to mainboard RAM and vice versa. So I suspect you're just seeing how much texture data is currently loaded. And 10 GB, if you have the RAM, isn't even surprising after a quick peek at the size of the texture BA2 files. That, or the tool is simply showing wrong data.
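Spelling that argument out, since the contrapositive is doing the work here (a sketch; the proposition labels are mine, not the poster's):

```latex
% Contrapositive form of the argument; proposition labels are mine.
% X: the counter is reporting dedicated video-card memory.
% Y: the card physically holds at least 10 GB of dedicated VRAM.
\[
  (X \Rightarrow Y) \iff (\lnot Y \Rightarrow \lnot X)
\]
% Observing \(\lnot Y\) (the card has less than 10 GB) forces
% \(\lnot X\): the counter must be measuring something else, such as
% the texture working set DirectX keeps across VRAM and system RAM.
```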
jones177 Posted February 26, 2016

Hi. I play at 3840 x 2160 with a 980 Ti SC. I use Fallout Performance Monitor for testing, and my average usage is 3.5 GB of VRAM. My highest-ever reading was 3.8 GB. I only have 20 mods installed. In Skyrim I only reached 4.1 GB of usage without the ENB executable. Like Skyrim, VRAM usage stays the same at 1440 and 2160. I will try TechPowerUp GPU-Z tonight to see what I get. Later
BlackRoseOfThorns Posted February 26, 2016, quoting AceGoober: "Yes, but have you ever viewed a game using 10 GB of graphics-card VRAM? I'd hazard a guess probably not."

Skyrim in 4K with an ENB plus 4K-8K textures can easily push 9 GB+ of VRAM on an SLI GTX Titan setup when recording. If the game is crashing, it's most likely having problems loading new objects (the 32-bit engine is a huge limitation) or some script conflict.
BlackRoseOfThorns Posted February 26, 2016, quoting Moraelin: "...what you're seeing is that DirectX swaps textures in and out over the PCIe bus... So I suspect you're just seeing how much texture data is currently loaded."

Indeed. The amount seems too high for a vanilla setup compared with what more common monitoring tools report.
jones177 Posted February 27, 2016

I got about the same readings with GPU-Z as I did from Fallout Performance Monitor. So why is yours so high while mine is about what you would expect? Anyway, enjoy your Samsung 28" UHD LED monitor. It is perfect for these types of games. When I bought mine I was shopping for a 1440 monitor. My local Best Buy (a US electronics store) was running a demo on one that was surrounded by 1080 monitors. After seeing it, I knew that my days of playing at 1080 were over. Later
zanity Posted February 27, 2016

Fallout 4 uses a TEXTURE STREAMING technology based on John Carmack's MEGATEXTURE library (although the 'every pixel is unique' megatexture idea itself is NOT used). Now Megatexture is JUNK, and the idea that the streaming aspect of megatexture automatically 'scales' to the resources available causes ALL the problems seen in FO4 streaming, especially excessive VRAM usage and late-loading high-quality textures. Use the .ini hack to DISABLE texture streaming (for the most part) and force only the best version of any texture to load. This increases initial loading times, but otherwise fixes the texture problems peeps have (and allows me, on a ONE GB VRAM card, to have the very best textures displayed AND the best performance possible with my GPU).
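For reference, this is the general shape of the tweak being described. The setting names and values below are assumptions drawn from period tweak guides, not verified engine documentation, so treat it as a sketch and back up your .ini first; it goes under [Display] in Fallout4Custom.ini (or Fallout4.ini):

```ini
; Sketch of the commonly shared texture-streaming tweak. Setting names
; and values are assumptions from contemporary guides, not verified
; engine documentation -- back up your .ini before editing.
[Display]
; Pushing the upgrade/degrade distances out forces full-resolution
; textures to load up front instead of streaming in late:
iTextureUpgradeDistance0=1800
iTextureUpgradeDistance1=3600
iTextureDegradeDistance0=2000
iTextureDegradeDistance1=4000
```

As the post says, the trade-off is longer initial loads in exchange for textures arriving at full quality instead of popping in late.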