
Ended Stutter in Oblivion


pleasenoname


I fixed stuttering in Oblivion. I have a Phenom II X4 and an HD 4890.

My last GPU was a 9800 GT, so it's not the CPU or GPU that causes stutter in Oblivion.

 

There are so many objects in such a large area in Oblivion that the hard drive is the only component that has trouble keeping up with the rest of the system. I installed an SSD and put only Oblivion on it (no OS, just Oblivion); so great had my obsession with ending the stutter become.

 

That greatly decreased new-area loading stutter, but some remained. I read that the game's graphics have to stay in sync with the physics engine; if the physics engine runs faster than the on-screen graphics, or vice versa, frames go missing.

 

Go to "Tweak Guides" and enable the multithreading options it lists for the .ini file. Not all of the multithreading options are safe; some, like "Multi ThreadingAudio", can cause crashes. EDIT NOTE: I set "iThreads=" to "8" because I assumed each core runs two threads (4 cores x 2 = 8). I should also mention that I increased "iPreloadSizeLimit=" in the .ini to 262144000, and set "uInterior Cell Buffer=16" and "uExterior Cell Buffer=102". These values are dependent on the amount and type of RAM you have, so set them according to the "Tweak Guides" instructions. The .ini file is in the "My Games" folder.
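Putting those edits together, the relevant lines end up looking roughly like this (in my copy of Oblivion.ini they all sit under the [General] section; the values are the ones from this post, so scale the preload and buffer sizes to your own RAM per the Tweak Guides instructions):

```ini
[General]
iThreads=8
iPreloadSizeLimit=262144000
uInterior Cell Buffer=16
uExterior Cell Buffer=102
```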

 

Enter a new line in the .ini under the [General] heading: "iNumHWThreads=4". This should tell the program there are 4 cores to work with, though I'm not sure; enter however many cores your computer has after the "=".

Make sure that VSync is enabled. VSync is there so that your graphics card puts out frames at the same frequency as the monitor you are using. I've heard people say it should be turned off to get maximum frames, but that is totally wrong: if you turn it off, your GPU will burn itself out putting 8000 frames per second onto a menu screen, and on top of that it causes ugly screen tearing. The stutter itself occurs because the difference between our highest and lowest frame counts is so drastic that there is no gradual slowdown in speed. Enabling triple buffering in the ATI Catalyst Control Center smooths out the frame differences even more (I'm not sure what the Nvidia equivalent is).
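For reference, the new line from this step looks like this in the .ini. As far as I know, the in-file switch for VSync is "iPresentInterval" under [Display] (1 = enabled), but double-check that against your own file before relying on it:

```ini
[General]
iNumHWThreads=4

[Display]
iPresentInterval=1
```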

 

And finally, go to "iFPSClamp=" in the .ini and set it to "60" for normal speed. If you want Oblivion the Sonic, go with "40" and everything will be really fast, yet smooth.

 

I am now completely and totally stutter free with maximum LOD on everything, with super speed.

Edited by pleasenoname
Link to comment
Share on other sites

All good except the part about vsync. There's discussion about it in most Oblivion tweak guides, with definite reasons why they recommend leaving it disabled unless you're having problems.

 

Tweak Guides is wrong about some things, such as what value to put in iNumHWThreads, and VSync.

 

VSync example:

Say the game is designed to run at 30 frames per second. What happens when we turn off VSync and get anywhere from 80 to 8000 frames per second? Screen tearing, because the monitor or TV switches to new frames right in the middle of other frames being drawn; it can't keep up, or "sync".

With 30 original frames programmed into the game, making the GPU render 80 to 8000 means the game either stutters from dropped frames (to keep the Havok engine in sync with the on-screen graphics) or the animation has to slow down to match the Havok engine.

 

Just set "iFPSClamp=" to "120" and you will see the game move in slow motion. Why, when the clamp is set to more frames per second? Because the clamp fixes how much game time each frame represents: at 120, each rendered frame advances the game by only 1/120 of a second, so if your hardware can't actually render 120 frames every second, game time falls behind real time and everything plays out in slow motion. My display has a 120Hz game mode if necessary.
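The arithmetic is easy to sketch. This assumes, per the behavior described in this thread, that iFPSClamp fixes the game-time step at 1/clamp seconds per rendered frame (the function name here is just for illustration, not anything from the game):

```python
def game_speed_factor(rendered_fps: float, ifps_clamp: int) -> float:
    """Game seconds advanced per real second, assuming each rendered
    frame advances game time by exactly 1/ifps_clamp seconds."""
    return rendered_fps / ifps_clamp

# On hardware that actually renders 60 fps:
print(game_speed_factor(60, 60))   # 1.0 -> normal speed
print(game_speed_factor(60, 40))   # 1.5 -> "Oblivion the Sonic"
print(game_speed_factor(60, 120))  # 0.5 -> slow motion
```

So a clamp of 120 runs at half speed unless your system really sustains 120 fps, which is why the clamp should match a frame rate your hardware can actually hold.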


All I know is, I have an old system (AGP graphics, single-core/HT P4) and run the game at an 85Hz refresh (DirectX also set to 85fps in dxdiag). With vsync enabled the game is choppy; with vsync off it's smooth as silk. I think the obvious/correct answer here is, "it depends on your hardware".

 

Another trick, if you have Nvidia graphics and use their older "classic" control panel: lower "max frames to render ahead" from its default (3) to 1. It makes a *huge* difference on my system. At 3 there is substantial mouse lag, 2 removes most of it, and at 1 there is none at all, with very little impact on frame rates.

Edited by TheMastersSon

TheMastersSon are you running a CRT monitor or an LCD? The reason I ask is that pretty well all of the LCD monitors I'm aware of are locked at 60Hz refresh rate ... could be the source of your problem when using vsync.

Striker, it's a CRT. Oblivion is gorgeous at 85Hz. :)

 

Also, the Coolbits 2 registry hack needs to be installed on top of NV's classic control panel to enable/adjust the 'max pre-rendered frames' option discussed above. Either that or just use Rivatuner or another tweaker.

 

Here's a nice writeup about vsync:

 

http://techreport.com/articles.x/22735/2

Edited by TheMastersSon


Thank you for the link. I had been thinking there should be something like adaptive VSync in the graphics drivers, switching VSync and the clamp on and off in response to real-time frame fluctuations, and here they are finally going to implement it.

 

No wonder you have no problem with screen tearing; you are using a CRT. I originally tried turning VSync off in Fallout 3 (and later Oblivion) because of those recommendations on "Tweak Guides", but it was the worst choppiness I have ever seen.

 

I'm going to blow the dust off my old tube television and see how well VSync off works on it.

It may well be that an analog signal is simply more adaptive than a digital one: an analog signal can change continuously in response to conditions, while digital is either on or off (synced or not synced).

Edited by pleasenoname


OK, I tested Oblivion on my old analog CRT TV (all CRTs are analog). Most likely your CRT, being a computer monitor, has a circuit board with logic that can adjust the intermediate signal stage when a VSync signal is received from your graphics card. My CRT TV has no such interface board, so it will not even start the game with VSync on; I have to turn VSync off for the game to run at all.

 

Guess what? With VSync off on an analog TV, there is no stutter and no slowdown (resolution 640x480). Analog is an adaptive, dynamic signal: it can morph smoothly and is always there, whereas digital is either on or off (which is what causes the screen tearing on our LCD displays).

 

The sound is also much better on an analog TV/monitor, because audio is naturally analog. We convert it into digital now, but doing so causes a slight loss of sound quality (because the sound begins as an analog waveform).

 

An analog display is not inferior. It is possible to make an analog display with as much clarity as a digital one; we would simply have to increase the number of scanlines. That will not happen, though, because digital displays make it easier for content to be controlled by the TV and graphics-card vendors. The HDCP hardware inside a graphics card sends a sync signal to our digital displays, and the display responds with a sync signal of its own; if the signal is not present, the display will not work. Is this necessary? No, it is a form of DRM (to prevent copying of content). Under what circumstances the sync signal is withheld, I don't know.

 

This research has led me to the conclusion that the HDCP sync signal may be interfering with the proper timing of frames on digital (LCD) displays.


Does your computer monitor (LCD) have a DVI input? If so you may have better luck on DVI ... it's 'old school' digital, no DRM stuff to interfere with the digital signal.

 

I've got a couple of 19" CRT computer monitors lying around unused. I'm not sure I could ever go back to a non-widescreen format, plus after years of being spoiled by my 24" 1920 x 1200 I think the 19" would look small. I can't imagine how much desk space a 24" CRT would take up ... the 19" took up all available space.

