
PC Graphics are getting better and better.


dkamm65

Recommended Posts

I've noticed something interesting when comparing console and computer graphics.

When a computer can't process the visual information at the demanded speed, it runs choppy: below 30 fps it looks like it skips around, because 30 fps is roughly the speed at which the human eye perceives visual information. Anything less looks like a slide show; anything greater is unnoticeable.

However, I've never once seen a console chop like a PC can.

My friend Adam has Halo Wars for the 360, and during a huge battle I noticed a considerable slowdown in gameplay. He didn't notice anything, but the battle had explosions, particle effects, beams and projectiles going everywhere, the whole nine yards.

 

Are consoles capable of slowing down the game speed in real time to compensate for loss of framerate during intense moments?



This calls for more than a single answer, though the two points are related.

First, the quest for higher FPS is somewhat misunderstood. It's not only about the accuracy of the eye; in an interactive environment it also affects things like aiming and response times. Games like Oblivion and, to a lesser extent, Fallout 3 can be played at around 20+ FPS as long as the lag doesn't get excessive. In an FPS deathmatch, though, that could spell doom.

 

The second point is about the way the game synchronizes events. Consoles are normally tied to the video format and expect to run at its refresh rate, around 60 FPS for NTSC and 50 for PAL (remember that consoles are designed to use a TV as the monitor). When that rate can't be reached, the general slowdown you described is exactly what happens: the game has to keep a constant flow of frames so as not to break the TV sync. PC games aren't normally tied to the frame rate this way, so when FPS falls below a certain limit, quick-moving objects appear to jump from one position to another, and so does the cursor, which makes it hard to aim at small, distant objects. But note that this is a design philosophy, and a PC game can show the console behavior too: an emulated console game, for example, is rendered on PC hardware yet behaves exactly as it does on the console.
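The two timing philosophies above can be sketched in a few lines. This is a minimal illustration, not code from any actual engine; the function names and the 60 Hz target are assumptions for the example.

```python
# A toy sketch of frame-locked (console-style) vs. delta-time
# (PC-style) game loops. Hypothetical names, illustrative only.

def frame_locked_update(world):
    # Console-style: game time advances a fixed 1/60 s per rendered
    # frame. If rendering drops to 40 fps, everything in the game
    # slows down by the same ratio -- the "general slowdown" above.
    world["t"] += 1.0 / 60.0

def delta_time_update(world, dt):
    # PC-style: game time advances by the real elapsed time since the
    # last frame. Speed stays correct, but at low fps fast objects
    # cover a large distance between frames and appear to "jump".
    world["t"] += dt

world = {"t": 0.0}
for _ in range(40):                # 40 frames rendered in one real second
    frame_locked_update(world)
print(world["t"])                  # ~0.667 s of game time: game runs slow

world = {"t": 0.0}
for _ in range(40):
    delta_time_update(world, 1.0 / 40.0)   # real time elapsed per frame
print(world["t"])                  # ~1.0 s of game time: correct speed
```

The same one real second of wall-clock time yields two-thirds of a game second under frame locking, but a full game second under delta timing.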

 

Update: the lesson here is that it's not only about how many frames are shown; it's more about how things change in between them.


Two factors make the main difference between PCs and gaming consoles. Comparing a console and a PC of the same age, you'd expect the console to have the more advanced hardware. After all, that is the only advantage consoles have over PCs: since they are meant only to play games, they have to do that well.

 

Developing a game for a console is almost trivial: the hardware is well known and tightly specified. It should come as no surprise that console games have fewer bugs and better optimization. Developing games for the PC is a completely different animal; the developers can't count on a known environment beyond the minimal specifications guaranteed by DirectX or OpenGL, and that's only the video side...

 

But what makes for the console's advantages makes for its greatest weakness too. Being dedicated hardware, it badly lacks flexibility. In the long run the console's economic advantages turn out to be illusory, at least here in my country: the games themselves are expensive, often twice the price of the PC version when one exists.

 

Second, consoles tend to become hopelessly outdated very soon, with almost zero hope of upgrades, so their hardware advantages are lost, again too soon. Put together, they are useless for almost anything other than playing a rigid plethora of games, and those games are almost impossible to mod, which kills the charm for me.


If there's one thing that doesn't bother me while gaming, it's graphics. Sure, I like to see pretty things while playing, but it's not going to stop me from enjoying a fun game if the graphics aren't pretty. Games like Loco Roco and Patapon are incredibly fun without having 'uber-cool' and 'realistic' graphics.

 

On the other hand, some games rely on good graphics to make the game more fun. This is why I'm worried about Operation Flashpoint 2. Take Far Cry 2 (a game I thoroughly hated) for example: in multiplayer, players with low-end PCs had an advantage, because with their shadows turned off, other players couldn't hide in the shadows from them. Now, what if OF2 is just like that? What happens if the game is so realistic that it relies on graphics for realism? Draw distance, shadows and foliage all contribute to realism; if those are switched off in the graphics options, the game automatically becomes a lot easier. Just a little something to get you thinking...

 

Just a little something I wrote in one of my blogs recently. Kinda touches the subject.

 

But back on topic, at the speed graphics are improving, Crysis is going to be outdated within the next few months.


Graphics ain't everything, that's for sure. They're nice and all, but there's a certain point where they're "good enough" for the game; that point varies from game to game and can even be subjective from person to person. For example, I hate Diablo 2's graphics. I think they utterly ruin the atmosphere I expected from Diablo. In hindsight, I see that such a change only makes sense given Blizzard's whimsical nature. Diablo was not originally made by Blizzard: the company (whose name I forget) that made it was bought by Blizzard, and Blizzard then tacked on the buggy (albeit awesome at the time) multiplayer. Watching some of the Diablo 3 gameplay vids has made me tentatively hopeful that the dark, forbidding atmosphere of the classic has returned, at least partially. Time will tell, and you can bet I'll try it before buying, one way or another!

 

FPS above 30 do matter, it was once explained to me, because when one is moving and turning rapidly, the more frames that are rendered, the more likely you are to catch that enemy or obstacle or whatever. Think of it this way: You get to choose one lotto ticket out of 100, and there's one winner in there. Would you rather choose out of just 30, 60, or the full 100? ;)

 

Anyway if the Uncanny Valley is anything to shout about, there'll come a point where graphics may be better in a technical sense, but people will hate them anyway. (heck, I already dislike HDR lighting in many situations, and keep it turned off) :D Though, I'm sure any game whose graphics fall into the valley probably will have them changed before release. ;)


FPS above 30 do matter, it was once explained to me, because when one is moving and turning rapidly, the more frames that are rendered, the more likely you are to catch that enemy or obstacle or whatever. Think of it this way: You get to choose one lotto ticket out of 100, and there's one winner in there. Would you rather choose out of just 30, 60, or the full 100? ;)

 

:/ Dude, not a great example. I would rather have a one-in-30 chance of winning than a one-in-100; that's LESS of a chance. A better example would be holding more tickets: you have more of a chance of winning if you hold 60 versus 30.

 

Also, once you get past a certain FPS it can actually become worse, because the screen will start tearing. On my ancient monitor I'll take 75 fps, thanks. :P


Yeah, any fps higher than what your monitor can show is completely wasted. That's usually 60 or 75 on modern LCD screens (depending on the resolution; my new 22" only does 60 at its native res, but 75 at lower ones), or maybe 100 on older CRT screens.

 

This is why having V-sync on is useful if you have a powerful computer. When your PC can't render the game at more than 60 fps anyway, it's completely pointless.
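The idea of not rendering frames the monitor can never show can be sketched as a simple frame limiter. This is only an approximation, assuming a 60 Hz display: real V-sync blocks on the display's vertical blank signal rather than a timer, and the names here are made up for the example.

```python
import time

REFRESH_HZ = 60                     # assumed monitor refresh rate
FRAME_BUDGET = 1.0 / REFRESH_HZ     # ~16.7 ms per frame

def limited_frame(render, frame_start):
    # Render, then sleep out the rest of the frame budget. Frames the
    # GPU could have produced beyond the refresh rate are simply never
    # started, so no work is wasted on frames the screen can't show.
    render()
    elapsed = time.perf_counter() - frame_start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
    return time.perf_counter()      # start time of the next frame

# Usage: call in a loop, feeding each frame's start time forward.
t = time.perf_counter()
for _ in range(3):
    t = limited_frame(lambda: None, t)   # no-op "render" for the demo
```

A timer-based cap like this still finishes frames at arbitrary points in the scanout, so unlike true V-sync it doesn't eliminate tearing; it only stops the wasted rendering.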

