GetOutOfBox · Posted December 10, 2010

A few days ago, I made a post asking for help understanding strangely low performance in Oblivion, despite the fact that I have a very high-end video card: the AMD Radeon 6870 with 1GB of VRAM (AMD has stopped branding its video cards as ATI, but it's the same manufacturer). After messing around with the INI file, out of curiosity I decided to check out the "RenderInfo.txt" file located in the same directory. I was surprised to notice that Oblivion had auto-detected the appropriate shader package for my video card as package 7 out of 19 (higher numbers are newer shader revisions, ranging from Shader Model 2.0 to 3.0; the number on the "Shader Package:" line is the package Oblivion is currently configured to use). My card supports Shader Model 5, so it obviously supports Shader Model 3.

So I went into the "Shaders" folder inside Oblivion's "Data" directory and made copies of both the highest shader package (shaderpackage019.sdp) and the package Oblivion had configured itself to use with my video card (shaderpackage007.sdp). I moved the copy of the currently set package (007) to another folder to store as a backup, then renamed the copy of the highest package (019) to the name of the package Oblivion had configured itself to use (007). I also changed the "bSupport30Shaders" variable in Oblivion.ini from 0 to 1.

After starting Oblivion up, I was astounded by the huge improvement in performance, especially performance stability. Instead of rapidly jumping between 5 and 25 FPS, the frame rate stayed at a stable 30-40 instead of constantly rising and falling.
The only issues this may cause are that Oblivion may crash more often on some PCs (my gaming computer with the Radeon 6870 worked fine; there were no crashes beyond the very rare crash that is normal for Oblivion), and it could possibly cause strange graphical artifacts, especially when combined with certain INI tweaks. It's worth trying, though. If you run into serious problems, just revert the change by copying your backed-up shader package into the folder. If you forgot to back up the shaders, I recall that someone uploaded the default shaders to tesnexus; I don't remember the link though. Anyway, this helped me a lot, and I hope it can help other people experiencing similar issues. FYI, changing the value in the RenderInfo file does nothing, as it appears to just be a log file; either way, changes made to it are undone every time Oblivion is launched, since it attempts to auto-detect your video card on every launch.
demidekidasu · Posted December 10, 2010

Thanks for posting this info. I will give it a go later to see if it improves things on my 2x 470s.
GetOutOfBox · Posted December 10, 2010

No problem, I hope it works :) This would also explain why Fallout 3 seems to perform better on newer cards while Oblivion doesn't, despite appearing to make very few changes to the customized Gamebryo engine Bethesda used for both games. Perhaps they included extra hardware profiles for the game's auto-configuration. Unless they update it, though, very new cards like the Radeon 6000 series will probably experience similar issues with Fallout 3 and require this fix as well.
demidekidasu · Posted December 10, 2010

Very possible, but I reckon it's more down to the fact that Oblivion primarily uses only one CPU thread. You can tweak certain variables in the INI file to let it use more, but only for minor things such as trees, faces, etc. The really demanding area is AI processing, and sadly Oblivion runs every single AI entity on the same CPU thread as the rest of the game's processing (sound, texture loading, etc.). This is why towns and cities are a system-killer, lol. Fallout 3 properly supports and benefits from multi-core CPUs. I've not gotten around to trying Fallout on my new rig yet (i7-950 @ 4GHz, 12GB DDR3, X-Fi Titanium, 2x 470s), but I expect it to perform hugely better than Oblivion does, due to the vastly improved CPU utilisation. But I will give your method a go later on and see how Oblivion benefits! Oh, and btw... I can tell you that Oblivion (and most current games, tbh) runs better with HT disabled, as HT essentially splits each core into two "slower" cores... well, sort of, hehe :P
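For reference, these are the kinds of threading tweaks being referred to; they live in Oblivion.ini. I'm listing the names from memory, so double-check them against your own INI before changing anything, and don't add settings that aren't already there:

```ini
[General]
bUseThreadedBlood=1
bUseThreadedMorpher=1
bUseThreadedTempEffects=1
bUseThreadedParticleSystem=1

[BackgroundLoad]
bUseMultiThreadedTrees=1
bUseMultiThreadedFaceGen=1
```

As noted above, these only offload minor subsystems; AI still runs on the main thread, so don't expect miracles in towns.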
GetOutOfBox · Posted December 10, 2010

Yeah, the lack of well-planned multithreading in Oblivion is disappointing, but it's understandable considering that multi-core CPUs were "the latest thing" back in '04; few games added multithreading at all until very recently. As for HT: yeah, for games like Oblivion that won't benefit much from extra cores, HyperThreading will probably reduce performance, but in games with good multithreading support and scalability, HT could actually increase performance, or at least not hurt it, since it allows more instructions to be executed per clock cycle. I don't have an i7 anyway (I wish :(, I'd love the hardware AES encryption support and the sexy overclocking potential :D); I have a Phenom II X3, and my next CPU upgrade will probably be a Black Edition Phenom II X4, which I hope to OC to 4GHz.
By the way, I have a question, but I don't want to start yet another thread to ask it, so here goes: how well does Oblivion scale with modern Crossfire/SLI? I know it has at least some built-in support for SLI, but both SLI and Crossfire have changed a lot since 2004. In terms of performance versus a single card, how much of an improvement did you notice when you added a second 470, or if you bought both at the same time, what kind of performance do you get with them? The reason I don't just get the extra card regardless of Oblivion's performance is that virtually all of the other games I'm running work great with my Radeon 6870 (e.g. Team Fortress 2, Crysis (you know a game's "unoptimized" when Crysis performs better than it :rolleyes:)).

(Edited December 10, 2010 by GetOutOfBox)
demidekidasu · Posted December 10, 2010

To be honest, I got both cards at the same time. When I next fire up Oblivion (a bit too busy atm, sorry lol) I will turn one of them off and let you know how it runs. I can tell you how it runs for me with SLI enabled (I have been pretty much obsessed with fine-tuning it, so I remember the figures quite well! lol). The only problem is that this is with Streamline running (58fps min, 70fps max), so it's not a true representation. The game is almost unplayable at times without it, though... Furthermore, I have a TON of graphics mods installed, including the major ones (QTP3 full, OBGE v2, Screen Effects). I refuse to uninstall them! lol :P I can get between 60-70fps when walking around the countryside. In towns, it can drop to less than 30fps, but it shoots back up when Streamline kicks in. (1920x1080, HDR on, self shadowing off because it looks ugly, iMinGrassSize=50, and all other fanciness on or at highest settings.) I will do a proper "fair" test without Streamline running when I get a free moment or two, hehe.
GetOutOfBox · Posted December 11, 2010

Thanks :D Yeah, I used to use OBGE all the time (SSAO = :O), but the fact that it prevents AA from working made me disable it (at least the game's own AA control; does forcing AA in the video card's control panel work?).
demidekidasu · Posted December 11, 2010

No, unfortunately you cannot force AA through the drivers either. However, the Screen Effects mod has a nice blur filter that, if set to a really low value, does a nice job of smoothing the edges without making a massive mess of the image, lol. I use that in combination with the EdgeAA shader from OBGE and it looks great. Here are some of my gameplay screenshots so you can see (no editing on them at all, just pure gameplay shots):
http://www.tesnexus.com/imageshare/images/1518551-1290915022.jpg
http://www.tesnexus.com/imageshare/images/1518551-1290898469.jpg
http://www.tesnexus.com/imageshare/images/1518551-1290552952.jpg
GetOutOfBox · Posted December 12, 2010

Wow, those are some nice screenshots! You're right, the Edge Detect + fullscreen blur filters do a good job of reducing aliasing while still allowing SSAO to work :D
GetOutOfBox · Posted December 14, 2010

By the way, I don't know about you, but I seem to be able to use OBGE and force AA through my driver's control panel. They're definitely both working: I'm seeing no jaggies, both near and far, and SSAO is definitely on, since if I disable/enable it in OBGE's menu I can see the shading turning on and off.