
4K graphics and human vision



Luckily we don't even have to move our heads for that. Just look up "saccades". And yes, one effect is to increase resolution significantly.

 

I'd also add that getting anything out of 4k is also a matter of size, not just distance, which is another reason I'm wary of absolute statements about which resolution is too much. For example, at the same resolution and distance, the pixels will be about 3x as big on a 40" monitor as on a 13.3" laptop. So a resolution that's too fine for the eyes to resolve on the latter may not be too fine on the former.
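To put rough numbers on that, here's a back-of-the-envelope sketch in Python (assuming square pixels; the 40" and 13.3" diagonals are just the examples above):

```python
import math

def pixel_pitch_mm(diagonal_inches, horizontal_px, vertical_px):
    """Physical size of one pixel in mm, assuming square pixels."""
    diagonal_px = math.hypot(horizontal_px, vertical_px)
    ppi = diagonal_px / diagonal_inches        # pixels per inch
    return 25.4 / ppi                          # 25.4 mm per inch

# Same 4k resolution, same distance, very different pixel sizes:
print(pixel_pitch_mm(40.0, 3840, 2160))   # ~0.23 mm on a 40" monitor
print(pixel_pitch_mm(13.3, 3840, 2160))   # ~0.08 mm on a 13.3" laptop
```

That's right around the 3x linear difference, so the same pixel grid subtends about 3x the visual angle on the big monitor.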


I would argue that the 60fps limit on human vision is nonsense on a practical level. We, as humans, don't act on pure visual processing; a lot of our perception is aided by experience through memory. This is why playing at 60 fps versus 240 fps 'feels' different even if science would tell you it shouldn't look different visually: in your brain, the world around you does not have an FPS limit.


1. Actually, I would argue that beyond a point, that has more to do with LATENCY than with FPS. And one thing that TFT monitors brought along is, depending on the monitor, several frames' worth of latency. Add triple buffering and whatnot, and you can be operating on information that is 0.1 to 0.2 seconds old. Add network latency if appropriate, mouse latency if you have a cheap old one, etc., and you'll feel it all right.
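As a rough illustration of how those stages stack up, here's a toy latency budget; every number in it is an assumption picked for the example, not a measurement of any particular setup:

```python
# Illustrative latency budget at 60 fps. All numbers are assumptions
# for the sake of the example, not measurements of real hardware.
FRAME_MS = 1000 / 60                            # ~16.7 ms per frame

stages_ms = {
    "mouse polling (cheap old mouse)": 8.0,     # ~125 Hz polling rate
    "triple buffering":       2 * FRAME_MS,     # up to two queued frames
    "monitor processing":     2 * FRAME_MS,     # scaler/overdrive buffers
    "network round trip":     50.0,             # if playing online
}

for stage, ms in stages_ms.items():
    print(f"{stage:34s}{ms:6.1f} ms")
print(f"{'total':34s}{sum(stages_ms.values()):6.1f} ms")  # ~125 ms
```

Even with fairly tame assumptions, the total lands in that 0.1 to 0.2 second range.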

 

There's nothing crazy about feeling a difference there. You do feel it. The effect of latency on pilots, drone jockeys and such was actually studied more than a decade ago, and it's real: people become increasingly inaccurate at putting a bomb crosshair on a target the more latency you introduce into the loop.

 

It doesn't actually contradict the science that you shouldn't be able to tell much of a smoothness difference between 60 fps and 100 fps unless you're a fly, because it's not about looking smooth to the eyes. For watching a movie, 60 fps might be perfectly all right. But when you play a game, extra delay introduces lag into the feedback loop, and that throws it off majorly. Nothing in our evolutionary history had to deal with that effect. The brain does compensate for its own "lag" in processing the input, but throw more than that into the mix, and there is nothing in there that evolved (or was created by God/aliens/whatever, for people who believe that) to deal with the extra latency.
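To see the feedback-loop point in the abstract, here's a purely hypothetical toy model (not from any actual study): a simple proportional tracker chasing a moving target it can only see a few frames late. The mean tracking error grows as you add delay, which is the "throwing off" in miniature:

```python
import math

def tracking_error(delay_frames, gain=0.2, frames=600):
    """Mean error of a proportional tracker that only sees the target
    `delay_frames` late. Toy model, not any published experiment."""
    history = [0.0] * delay_frames     # queue of stale observations
    cursor, err_sum = 0.0, 0.0
    for t in range(frames):
        target = math.sin(t / 30)      # smoothly moving target
        history.append(target)
        observed = history.pop(0)      # what the player actually sees
        cursor += gain * (observed - cursor)
        err_sum += abs(target - cursor)
    return err_sum / frames

for d in (0, 6, 12):                   # 0 ms, ~100 ms, ~200 ms at 60 fps
    print(f"{d:2d} frames of lag -> mean error {tracking_error(d):.3f}")
```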

 

 

2. Not all frames are rendered in equal amounts of time. In a lot of situations you can average close enough to 60 fps, or 30 fps, or whatever, but end up with something like one frame in every x taking a lot longer to come out. You CAN sense those flukes, rather than the average.
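A quick way to see why the average hides this (synthetic frame times, just for illustration):

```python
# 59 quick frames plus one 100 ms spike still averages ~60 fps.
frame_times_ms = [15.25] * 59 + [100.0]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
worst = max(frame_times_ms)

print(f"average:     {avg_fps:.1f} fps")  # looks perfectly fine on paper
print(f"worst frame: {worst:.0f} ms")     # but this hitch is a visible stutter
```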


I can only go with what I see.

 

I have been playing Fallout 4 at 4k resolution since it came out. The game only makes sense to me at 4k.

 

My main gaming computer uses a 3440 x 1440 monitor. I only play level 70-plus Fallout 4 saves on it, because at or around that level they are no longer smooth at 4k. The main cause is that the older saves use more GPU: at close to 100% GPU usage, the frame rate drops below 60fps. It takes me a day or two to get used to the lower resolution. Other games take me only a minute or two to adjust to.

 

I only play Skyrim/SE and Fallout 4 at 4k, because they seem to benefit the most from the resolution. The other games I play don't benefit from 4k as much and look fine at 1440. The widescreen format helps a lot.

 

I think it is a lot about what we are used to. I was totally happy with 1080p until I saw a 4k monitor running 4k content beside a bunch of 1080p monitors running the same content. There was no comparison.


Luckily - or sadly - I'm one of those people who don't care about 4k at all. If I had to choose between 4k and a stable 60+ FPS or decent gameplay (with... you know, quests that actually work, consistent and balanced player abilities and good storytelling), I'd ditch 4k without even thinking about it.

 

The sad part about it is that 4k seems to be a major selling point today, so studios and investors want to see more of it. Plus it certainly helps sell hardware.

 

4k in BGS games seems like such a waste, since the GFX engine isn't even capable of displaying the most basic 3D techniques, like normal mapping, in a way you'd expect from a modern 3D game. It's 2018 now and there is still no sign of the automatic occlusion culling techniques other GFX engines have been using for almost a decade.

 

Having crisp textures surely doesn't do much on the "realism" side if the PC and NPCs still have a moveset like stick figures, can't jump over obstacles in a human way, need to finish animations before being able to talk to each other, and are disembodied entities in FP mode. But hey - mods will fix it, right?


Well, nobody said you had to focus on 4k over everything else. Just that the notion that you absolutely can't see any difference is false, and quite trivially so. That you don't have to mind the tiny difference, that's also true, but it's a slightly different issue.
