dave1029 · Posted June 29, 2014

I think we are finally at the point where increasingly better hardware is less important. Games on current top-end hardware look absolutely gorgeous, to the point where I can't imagine us calling them an eyesore in 5 years. Today I can't go back and play Oblivion because of the graphics, yet in 5 years I wouldn't see a problem going back and playing Skyrim, because the graphics look great with minimal mod usage. Thoughts?
Thor. · Posted June 29, 2014 (edited)

You might change your mind on that when you look at games like Watch Dogs, which, despite being a terrible port, had beyond ridiculous system requirements. Don't blame the hardware, blame the games.
dave1029 · Posted June 30, 2014

Quoting Thor.: "You might change your mind on that when you look at games like Watch Dogs, which, despite being a terrible port, had beyond ridiculous system requirements. Don't blame the hardware, blame the games."

Oh, it's definitely the games, which is why we recently had this big bump in the need for better hardware because of the new consoles. But my point is that games are starting to get pretty enough that having cutting-edge tech isn't giving as much extra eye candy as it used to. Imagine Skyrim on low, *barf*. Watch Dogs on medium did not look terrible. It didn't look good, but the scenery wasn't bad enough to detract from the enjoyment of the game.
Thor. · Posted June 30, 2014 (edited)

Not true, we still need to break the realism barrier, and it's going to happen sooner rather than later. A game that can be compared to Avatar will arrive the day CGI meets cheaper compute; the thing is, the movie industry wouldn't need supercomputers to pull off the same effect. Too bad this game was cancelled before its time, it was going to set a new standard in the gaming industry.
Vagrant0 · Posted June 30, 2014

Although we've been making great strides in graphical fidelity, much of the stuff under the hood has only gotten sloppier, and sloppy programming has higher hardware demands. As more and more games are built on existing engines, and engines become more and more generalized, the functions that demand a more specialized system to support a mechanic (such as reactive AI) tend to be accomplished only by faking it.
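(A toy sketch of the distinction Vagrant0 is pointing at, purely hypothetical and not from any real engine: a "faked" reactive mechanic is a hard-coded trigger with a canned response, while a genuinely reactive system carries state and responds to several inputs over time. All names here are made up for illustration.)

```python
# Hypothetical illustration only -- not taken from any engine mentioned in the thread.
from dataclasses import dataclass

@dataclass
class WorldState:
    player_visible: bool
    noise_level: float            # 0.0 = silent .. 1.0 = very loud
    guard_alertness: float = 0.0  # persists between ticks in the "real" version

def faked_reaction(state: WorldState) -> str:
    """Scripted trigger: one hard-coded check plays a canned response."""
    return "guard_shouts" if state.player_visible else "guard_idles"

def simulated_reaction(state: WorldState) -> str:
    """Reactive version: alertness accumulates from several inputs over time,
    the kind of specialized system a generalized engine tends not to ship."""
    state.guard_alertness += (0.6 if state.player_visible else 0.0) + 0.4 * state.noise_level
    if state.guard_alertness > 1.0:
        return "guard_searches"
    if state.guard_alertness > 0.5:
        return "guard_investigates"
    return "guard_idles"

if __name__ == "__main__":
    s = WorldState(player_visible=False, noise_level=0.8)
    print(faked_reaction(s))      # guard_idles -- the canned check ignores the noise
    print(simulated_reaction(s))  # guard_idles on the first tick...
    print(simulated_reaction(s))  # ...then guard_investigates as alertness builds
```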
Rennn · Posted June 30, 2014 (edited)

Quoting dave1029: "I think we are finally at the point where increasingly better hardware is less important. [...] Thoughts?"

I actually *can* go back and play Oblivion with its graphics, and they look pretty good most of the time. Just not the faces. Oh gods, not the faces. I need mods for those to make it playable; otherwise the compression artifacts make everyone look like they're suffering from the bubonic plague.

I've also heard the newer generation saying that Wolfenstein: TNO and Far Cry 3 look like crap, which is clearly wrong. It just goes to show that our idea of what constitutes playable graphics depends on the games we grew up playing. That was the PS2 for me, at 480p and 25-30 fps.

Additionally, it's only in the last ~5 years that game developers started taking art direction and scene composition seriously. Color balancing and overall scene layout can matter just as much to realism as poly counts or texture resolutions. The new focus on scene integrity has advanced game realism far beyond what stronger hardware alone could.

But I agree with the fundamental point you're making. By all rights, if a game is properly optimized, a GTX 780 Ti or R9 290X should be capable of running graphics not far off from photorealism, and perhaps close enough for it not to matter. Of course, the jump up to 1440p is going to reset the whole thing. Even a GTX 480 would have handled photorealistic graphics at 720p, and in a similar manner the GTX 780 Ti is capable of hitting the same performance levels at 1080p, but I predict it'll take at least a GTX 980 Ti to hit that performance at 1440p. There's an argument to be made that 1080p is all that's needed to be realistic, and that's true at 24" or below, just like it's true for 720p at 19" or below. Once screens start to average 27" for gaming, 1440p will reset the performance clock and a GTX 780 Ti or R9 290X will no longer be enough for photorealistic graphics.

I'm running a GTX 660, and I'm not concerned about it. I'll be staying at 1080p/30 fps for a long while yet.
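(For what it's worth, Rennn's "reset the performance clock" point falls straight out of the raw pixel counts. A quick back-of-the-envelope sketch, plain Python and simple arithmetic only, no benchmark data, assuming shading work per frame scales roughly with pixel count:)

```python
# Back-of-the-envelope only: raw pixel counts, not measured benchmarks.
# Shading work per frame scales roughly with pixel count, all else being equal.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>5}: {pixels:>9,} pixels  ({pixels / base:.2f}x the work of 1080p)")
```

That works out to roughly 1.8x the shading work of 1080p for 1440p and 4x for 4K, which is why each step up in screen resolution tends to demand a new GPU tier just to hold the same frame rate.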
Thor. · Posted June 30, 2014 (edited)

If you want to compare legacy compatibility and optimization, Far Cry 3 could have used a much more effective engine, like the original Crysis engine, which on today's hardware could achieve the same level of detail and then some if they had programmed it that way. The thing is, it was a rushed game again, typical Ubisoft: it was originally DX9 and later patched to DirectX 11, though they say it doesn't fully use the actual API, more like half and half. It's another example of how an 8-year-old engine could in theory outperform even the new Far Cry 3 engine, which was built by part of the same dev team that went its own way. To this day it's nowhere near the quality of the original CryEngine. That's without mods, so take that and then some.
FMod · Posted June 30, 2014

Quoting Rennn: "There's an argument to be made that 1080p is all that's needed to be realistic, and that's true at 24" or below, just like it's true for 720p at 19" or below."

1080p isn't all that's needed to be realistic. 1080p is too much to be realistic on even the best contemporary hardware, with modern AAA budgets. Our ability to tell reality from simulation is contingent on how much detail we get to see. Look at 300x200 thumbnails and you won't be able to tell even rather crude CG from photographs. Open the same image in a browser at 1350x900 and half the time you can tell, half the time you can't. Zoom in to full size at 3200x2400 and there's never any doubt.

Quoting Rennn: "Once screens start to average 27" for gaming, 1440p will reset the performance clock and a GTX 780 Ti or R9 290X will no longer be enough for photorealistic graphics."

To have photorealistic graphics, you need an appropriate level of detail to be there in the first place. You could use a $10 million cluster to raytrace a picture for a week, and if the original model is a crude CAD piece, it might look a little nicer, but no more realistic than the same drawing rendered in 0.0001 seconds right in the CAD program on a laptop with integrated graphics.

Now, there is, of course, a trick to making high resolution compatible with the illusion of reality: you can blur a high-resolution image to hide the lack of fine detail you don't want the player to see. And that blur can be improved on; you can mix DOF, motion blur, and so on, and keep sharp the parts you want sharp. I strongly suspect blur will make a big comeback when the era of 4K sets in. Detail requires man-hours, not just hardware; when man-hours are limited and realism is called for, game makers will resort to the only choice they have.
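(As a rough illustration of the blur trick FMod describes, here's a minimal depth-of-field sketch. It's a CPU-side toy in Python, assuming numpy and scipy are available; the frame and depth buffer are made-up inputs, and a real engine would do this in a shader, but the idea is the same: blend toward a blurred copy wherever the scene is far from the focal plane, so the missing fine detail there is hidden rather than exposed.)

```python
# Minimal sketch of the idea, not any engine's actual pipeline.
import numpy as np
from scipy.ndimage import gaussian_filter

def depth_of_field(frame, depth, focal_depth=0.5, max_blur=4.0):
    """Blend a sharp and a blurred copy of the frame per pixel.

    frame: HxWx3 float array in [0, 1]  (the rendered image)
    depth: HxW float array in [0, 1]    (normalized scene depth)
    """
    # Circle of confusion grows with distance from the focal plane.
    coc = np.clip(np.abs(depth - focal_depth) / (1.0 - focal_depth), 0.0, 1.0)

    # One heavily blurred copy stands in for "hiding the missing detail".
    blurred = np.stack(
        [gaussian_filter(frame[..., c], sigma=max_blur) for c in range(3)], axis=-1
    )

    # Per-pixel blend: sharp where in focus, blurred where out of focus.
    w = coc[..., None]
    return (1.0 - w) * frame + w * blurred

if __name__ == "__main__":
    # Stand-in inputs: a noisy gray frame and a left-to-right depth ramp,
    # so the image stays sharp near the focal plane and softens toward the far edge.
    h, w = 90, 160
    frame = np.random.rand(h, w, 3) * 0.2 + 0.4
    depth = np.tile(np.linspace(0.0, 1.0, w), (h, 1))
    out = depth_of_field(frame, depth)
    print(out.shape, float(out.min()), float(out.max()))
```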
Thor. · Posted June 30, 2014 (edited)

DOF is evil and unrealistic, though graphic detail is another thing. Blur is also evil, trying to hide bad artifacts in the distance; a no-go in my opinion. If you want realism you have to simulate the world as it is. Funnily enough, Forza and Project CARS are heading in that direction, simulating particle-based weather instead of painting the sky blue. I have a little more respect for Forza since they dropped microtransactions. Oh, and GTA V for PC will stand out as the most realistic game, real gameplay footage :teehee:
dave1029 · Posted June 30, 2014

I think the 295X2 and Titan Z are the first single-PCB cards able to handle ultra-realistic graphics at resolutions higher than 1080p. For me, 4K looks way better than 1080p, but only slightly better than 1440p. And The Witcher 2, one of the best-looking games I've ever seen, runs above 60 fps maxed out (no ubersampling) on a single R9 290X at 1440p. So I think these newest cards can handle it, and it's only going to get better.