
Obsession with framerate?


velve666


 

My point still stands. Not everybody has that much money to spend on a computer; statistically, you're in the minority in terms of computer hardware.

Amen to that!

 

I ran my old Pentium 4 (512 megs of RAM) with an Nvidia GeForce 5200 GPU (with 128 megs) up until about a year ago... Still have that 10-year-old relic. Can't bear to part with it. It served me so well for so long... One of these days I'm gonna set it up for older games...

 

But, back to the point. So many people with flame-spouting systems simply assume that they are in the majority and that what they preach is solid gold, which is simply not the case. Most folks who buy computers nowadays don't purchase them for gaming only.

 

I long for the days before laptops and consoles... when desktops held sway and games and apps were developed accordingly. :)




You've got a monster machine for Neverwinter Nights 1. Just sayin'.


@Vain - If you are happy with the performance of your system, that's fine. But maximizing FPS is not what most people want. They just want the game to run smoothly with the system they have and the mods they want to play with. They don't want to sacrifice mods for better fps, and they are willing to play at an fps lower than 60 - if they can keep it stable.

 

Contrary to your assertion, the human eye and brain cannot tell the difference between 30 and 60 fps IF both are smooth and no blur is induced. Here is a reference that documents this: Andrew B. Watson (1986), "Temporal Sensitivity", Handbook of Perception and Human Performance (Wiley).

 

But not everyone has a fantastic system that can always give at least 60 fps no matter what the game does. Some people are doing well to see 30 fps most of the time - those are the people who need to know how, and why, to cap the fps to make the game run smoothly instead of trying to force it to always show 60 fps when it can't.

 

The consoles are capped at 30. Movies are frame rate locked - not just capped - at 24 fps. TV is locked at 24, 25 or 30 fps, depending on the system. Yes, you do see advertising that says a particular TV can display 120 fps - however, nothing is broadcast at that rate, so it really doesn't matter (they use video tricks like slow panning and motion blur to make it look good to the human eye).

 

What you perceive as stutter is actually the fps dropping and then jumping back up - and it can drop even if you think your system can do 60 fps all the time.

By capping the fps, you limit the PC from going over that rate. That does not mean your video card cannot do a higher rate - in fact, you want it to be able to run higher than the cap, because that reserve is what keeps the rate from dropping even lower. If you are running your card maxed out with no cap and you hit a particularly intense scene, you need a bit more card to keep up. Without a cap the rate doesn't stay at any set value; it constantly goes up and down to fill whatever the card can deliver. But because the card is already maxed out, it has nothing more to give and has no choice but to drop the fps (or use some other method to reduce the load). You can see the slight hesitation when it drops and then catches back up.

If you cap that same card at a lower rate and hit that same rough spot, you still have some fps in reserve. Most of the time you will not even see a blip, because the card doesn't have to suddenly dump a lot of frames and render a bunch more all at once - it has some overhead at that point (overhead being the capacity that is available but not being used at that moment).
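To make that concrete, here is a minimal sketch of what a generic frame limiter does, written in Python purely for illustration - real games and tools do this inside the engine or the driver, and the 40 fps target and the render_frame() placeholder are made-up examples, not anything taken from Skyrim itself.

import time

TARGET_FPS = 40                     # a cap chosen below what the card can usually manage
FRAME_BUDGET = 1.0 / TARGET_FPS     # 25 ms per frame at 40 fps

def render_frame():
    """Placeholder for the game's actual rendering work."""
    pass

while True:                         # main loop, runs until the game exits
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start

    # If the card finished early, sleep off the spare time. That spare time
    # is the "overhead": in a heavy scene the frame simply eats into it
    # instead of missing its deadline, so the pacing stays even.
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)

The sleep is exactly the reserve described above: with the cap in place the card only has to be fast enough for the worst frame, not the average one.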

 

If you want to learn more about FPS and how it relates to games, read this wiki article - you will have to scroll down below the parts on movie and TV fps to get to the game part. :thumbsup: http://en.wikipedia.org/wiki/Frame_rate


I think most people can "see" a difference from 30 fps to 60 fps in current engines, but you shouldn't be able to feel it. Human reaction time is 1/10 of a second - in other words, 10 fps. If anything below 45 feels slow to you, that simply means something is going wrong and causing mouse lag, perhaps vsync or a choking game engine, because 30 fps itself will never feel noticeably different without something else interfering, although it does look different. Also, the human brain directly interprets blur as fluidity and sharpness as choppiness, which means two things.

1. The higher the resolution you play at, the higher your framerate must be for it to look smooth.

2. Motion blur literally makes a game look smoother to your brain; it's not some underhanded trick.

And ofc, every movie you've ever seen was recorded at 24 fps.

 

Finally, it's ridiculous to compare a video card dropping to 30 fps to one capped at 30 fps. The two are dramatically different. A game dropping to 30 fps will look awful to most gamers, and depending on the engine it can also turn laggy as your video card hits its peak and overflows. However, a game limited to 30 fps keeps all the responsiveness and stability of 60 fps, with a purely aesthetic difference.
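A rough way to see why is to look at frame times instead of the fps counter. The numbers below are made up purely for illustration: both traces average close to 30 fps, but one is a steady cap and the other is an uncapped card bouncing between fast and slow frames.

# Hypothetical frame times in milliseconds, for illustration only.
capped_at_30   = [33.3, 33.3, 33.3, 33.3, 33.3, 33.3]   # steady pacing
dropping_to_30 = [16.7, 50.0, 16.7, 55.0, 16.7, 48.0]   # uneven pacing

for name, times in [("capped", capped_at_30), ("dropping", dropping_to_30)]:
    avg_fps = 1000 / (sum(times) / len(times))
    print(f"{name}: ~{avg_fps:.0f} fps average, worst frame {max(times):.0f} ms")

Both traces report roughly the same average, but in the second one individual frames take about three times longer than their neighbours - that spread is what reads as lag and stutter, not the average number itself.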

 

And remember, if you have vsync on and your framerate falls to 29 fps, vsync will make it jump straight down to 15. Benchmarks won't show this, because technically your video card is still drawing 29 frames or 30 frames or whatever. It's just syncing in such a way that you only get 15 on the screen.
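For what it's worth, the mechanism behind those jumps is that classic double-buffered vsync only flips a finished frame onto the screen at a refresh boundary, so on a 60 Hz display the shown rate snaps to 60/1, 60/2, 60/3, 60/4 - i.e. 60, 30, 20, 15 fps. Whether a given dip lands on 20 or 15 depends on the timing, but here is the arithmetic as a quick sketch (assuming 60 Hz, plain double buffering, and made-up frame times):

import math

REFRESH_HZ = 60
REFRESH_MS = 1000 / REFRESH_HZ   # ~16.7 ms per refresh

def displayed_fps(render_ms):
    # With double-buffered vsync a frame waits for the next refresh,
    # so each frame occupies a whole number of refresh intervals.
    refreshes = math.ceil(render_ms / REFRESH_MS)
    return REFRESH_HZ / refreshes

for ms in (15, 17, 34, 51):
    print(f"{ms} ms per frame -> shown at {displayed_fps(ms):.0f} fps")
# 15 ms -> 60 fps, 17 ms -> 30 fps, 34 ms -> 20 fps, 51 ms -> 15 fps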

 

Ofc, a higher bandwidth card will also run smoother than a lower bandwidth one, due to the effect it has on frame latency. 60 fps on a 384-bit card will look smoother than 60 fps on a 192-bit card.

 

I don't have a problem if you want to keep 60 fps, but I put in the effort to get consistent framerate limits in my games, so I'd rather enjoy the extra stability, lower cost, and the ability to push my settings higher without worrying as much about performance.

Edited by Rennn


What do you mean the human eye/brain can't tell the difference? What we see isn't even measured in fps, so of course we can tell the difference. And yeah, movies and stuff are at like 23.97 fps, I know, but that's all playback and some scenes use motion blur, so you're watching and really don't notice. However, going from 30 or 40 fps to 60 fps, you or anyone would see and feel the difference. While you guys might be happy with limiting your fps so you can use this mod and that mod, my opinion is still 60 fps over anything else, especially when you have a 1080p monitor. Or maybe you guys should try that Seasons of Skyrim Project, Winter version, which makes the game look like a cold wasteland but at the same time basically removes all grass in game, which actually gives more fps and makes the game look quite nice, all in one mod. I used it on my old computer and had a good time with Frostfall too.

Edited by Vainlash

 

What do you mean the human eye/brain can't tell the difference? What we see isn't even measured in fps, so of course we can tell the difference.

 

 

No, we can't see the difference, provided there's no microstutter. And yes, we can measure it in fps. That's all been proven in more studies than I can shake a stick at. Most people mix up 30 and 60 fps if they're not told which is which.

 

The only reason anyone can possibly see the difference from 30 to 60 fps is that modern video cards and engines are incapable of maintaining perfectly consistent framerates. The human eye stops being able to tell at about 24-25 fps. Every other inconsistency is purely the result of uneven frame latency and hardware, and that can be mostly wiped out at either 60 or 30 fps.

 

Yes, 60 fps looks better. I don't think any gamer who specifically buys a PC for gaming would dispute that. However, it only looks about 15% better, imo, for exactly double the performance cost of 30 fps. The extra smoothness you see isn't from the increased framerate; it's from the decreased microstutter that comes with 60 fps, which is a far less efficient way to get it. It's not even close to worth it for me, or for many people. You essentially spend double the performance for a 15% increase in visual quality.

 

Besides, not every game has a good option for decreasing visual quality. Look at Assassin's Creed III. If you drop the shadow quality one notch below ultra, everything starts to flicker and strobe. In cases like that, where there is no good option below ultra, 30 fps on ultra is much more attractive than 60 fps on medium.

Same for the Witcher 2. Dropping the graphics to medium turns everything glowy and silly looking. Much better to aim for ultra at 30 in games like that.

 

Skyrim is honestly one of the exceptions where there's not a big difference from ultra to high, or even ultra to medium, in most cases. And that's just because ultra looks rather borked. The shadows look pretty much crap no matter what, so turning those from ultra to high doesn't hurt much; it just nets you about 15 fps. Same for view distance: the whole world turns into mud past 200 meters anyway, so why not get rid of the tiny clutter items at distant ranges and net yourself an extra 10 fps?

ENB, however, is a different beast. I don't know whether it's horribly optimized or if Skyrim's engine just can't handle post-processing effects, but either way the performance hit of ENB is about 5x what you would expect from a similar-looking game on a better engine.

 

Vanilla Mass Effect 3 looks arguably better than the best Skyrim ENBs (once you get around ME3's deferred lighting and find some SGSSAA that works), and ME3 runs at over 100 fps for me most of the time, as opposed to Skyrim with the best ENBs, which can drop to 20 fps. Making Skyrim look good is possible, but it's highly inefficient. And for people with middle-grade cards (I believe Steam ran a survey that indicated most Steam users are running a GTX 650 or equivalent), efficiency is the single most important part of image quality. 60 fps is not efficient... meaning 80% of gamers are better off without it right now.

Edited by Rennn

 

 


yawn...ok

