
Technological Advance Stalled In Gaming?



The amount of VRAM matters very little compared to almost every other aspect of the card. Clock speed and cores play a FAR greater role than VRAM. The 9300 is an extremely underpowered card, not designed for doing much more than powering the desktop.

 

The graphics card I linked to is FAR, FAR higher end and should, as I said, be able to run most games on medium/high with the PS3 using the equivalent of the setting below that.
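To put some rough numbers on that, here's a back-of-the-envelope sketch in C++. The shader counts and clocks below are illustrative placeholders rather than exact specs for any real card; the point is just that raw throughput scales with cores and clock, while the amount of VRAM never even enters the formula.

    #include <cstdio>

    // Back-of-the-envelope shader throughput: shaders x shader clock x 2 ops (multiply-add).
    // The figures below are rough placeholders, not exact specs for any real card.
    struct Gpu { const char* name; int shaders; double shaderClockGHz; int vramMB; };

    int main() {
        Gpu cards[] = {
            { "entry-level 9300-class part", 16, 1.2, 512 },   // hypothetical low-end figures
            { "mid-range discrete card",    112, 1.5, 512 },   // hypothetical mid-range figures
        };
        for (const Gpu& g : cards) {
            double gflops = g.shaders * g.shaderClockGHz * 2.0;   // shaders * GHz * 2 = GFLOPS
            std::printf("%-28s %4d MB VRAM  ~%5.0f GFLOPS\n", g.name, g.vramMB, gflops);
        }
        return 0;
    }

With these made-up numbers, both cards carry the same 512 MB of VRAM, yet the second one has roughly nine times the shader throughput, which is why the memory figure on the box tells you so little.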

 

 



Case in point: consoles offer a simple, and usually cheaper, solution for playing video games, with nice eye candy. PCs offer something much different: greater eye candy and performance, plus the ability to do work, surf the internet, etc. Honestly, they're usually more expensive, but that's the nature of the beast. Some people like to grow their apples from seed, because they enjoy it and they get a better apple; some like to get them at the store. Simple as that, each offers its own perks.

 

As for stalling, I think it can be as simple as the economy. We still got Crysis, though (I honestly can't remember when it came out, and I'm too lazy to check). Give it time, things will advance. They've advanced enough that I'm building a new rig now.


The 9300 is an extremely underpowered card, not designed for doing much more than powering the desktop.

I have the 9300M, and it is in a laptop, not a desktop. Prior to purchasing my Asus, I had a Gateway with an Intel integrated graphics card that was so underpowered, I couldn't even run KOTOR2, so to me, it feels kinda powerful. Then again, perhaps it would be more fair to compare the PS3 to a desktop instead of a laptop.

 

It's because of consoles

Possibly, although I highly doubt that consoles are the only factor here. Perhaps you could answer a couple of questions for me. Why do consumers such as myself pay such large amounts for these "outdated" consoles when we could easily assemble a PC for much cheaper? Secondly, if consoles are little more than overpriced computer hardware, what is keeping a rival company like Apple from making a superior console at a lower price?

 

Microsoft and Sony expect their consoles to last until 2015

What do you mean last? Do you mean neither will release a new console until 2015, or is that when the consoles will finally expire?

 

 

It's ridiculous that gamers would allow this to happen.

[Ron Paul]Yes, it is very ridiculous. Rather than the government spending their money on dumb crap like education, welfare, and medical care, they should spend their money developing a rival console to the PS3 and 360. After all, video games are more educational than school anyway. Look at all the useful job skills you can learn from Grand Theft Auto. The competition from a government-developed console would encourage privately owned corporations to develop their own consoles at a much faster rate. Vote for Ron Paul in 2012, and by 2015, America will once again be the king of console gaming. [/Ron Paul]

 

I'd wager that when the new Xbox or new PS3 is released in 2015, or sometime around that, its hardware will be VERY outdated compared to today's hardware.

Consoles have traditionally trailed behind PC gaming. For example, CD-ROMs with full-motion video existed years before the PlayStation came out (the PlayStation was not the first console to use CDs, but previous CD consoles were nowhere near as successful). However, even with "outdated" hardware, consoles are expensive to develop. The original PlayStation had been in development since 1988 and cost a great deal in research and development; it was not released in Japan until late 1994, six years later.


What do you mean last? Do you mean neither will release a new console until 2015, or is that when the consoles will finally expire?

 

I guess they mean that they don't think they'll be outdated 'til then, as in, they don't think Nintendo can think of anything better.

 

Consoles are fast catching up to PCs, though. I don't know how many factors you can attribute that to: ease of installation, game makers preferring consoles for some reason, lower cost, and not needing to spend a week trying to piece a machine together. Most of today's gamer market doesn't consist of geeks who build their own PCs for a living, and gaming is no longer limited to them. Most people just want to play the game.


Thought I'd address these since they kinda stuck out.

Possibly, although I highly doubt that consoles are the only factor here. Perhaps you could answer a couple of questions for me. Why do consumers such as myself pay such large amounts for these "outdated" consoles when we could easily assemble a PC for much cheaper?

For one, there aren't many people who know enough about building a PC to put together a decent one for under the cost of a console. Sure, you can do it with budget parts, an install of Linux, and a sturdy cardboard box, but you wouldn't be able to play anything made in the last 3 years very well, and it'd probably crap out in a month or so. For two, games for consoles are more abundant, cheaper, action-packed, and can be rented from many of the fine establishments that offer that sort of thing. People don't buy a console for the hardware (with the exception of the PS3, since it's the cheapest decent Blu-ray player out there); they buy a console because they want to play the games that console supports and because they like how the controller feels in their hand.

 

Secondly, if consoles are little more than overpriced computer hardware, what is keeping a rival company like Apple from making a superior console at a lower price?

One reason... they have enough sense not to get into that business. Launching a console is not merely a matter of making a few million units and counting on their flashy exteriors to sell. It requires a good platform with features consumers want, games already created and ready to ship with the console, licensing contracts with developers who will make games for it, and enough word of mouth to get people interested. Apple has none of this.

 

Although their platform for the iPhone and Mac is fairly good, it's too functional and open-ended for the average consumer. In order to match what is currently offered by other consoles, they would not only have to set up a whole web service around their console, but also make the console work with that service. For the iPhone, the framework for this was already present and could just be filled in; for a console, which is a stationary object without a touchscreen, they would have to start almost at square one.

 

As for games, unlike Microsoft, Apple doesn't have the facilities to create games in-house; every game released for an Apple platform has come from other developers who eventually got around to porting their code. Although the two platforms have become more similar in recent years, it still often takes separate formatting and coding to work right in the different environment. This means that Apple would either have to build its game development from the ground up, or release portions of its console software, along with the consoles themselves, to developers in advance so that games and hardware are both ready for a day-one launch. Needless to say, that is a very large risk, given the amount of money and resources that would be up in the air for the 9-10 months it would take to produce enough games for the console to launch. And in that time, any other console company could easily cobble together a new version of its own console with slightly better hardware, give it some "superior" suffix, and completely overshadow the new Apple console. So, in short, Apple knows well enough to stay out of that mess and instead focus on making better and more numerous portable devices.


The better graphics get, the less you will notice the fine details that developers went through extraordinary efforts to bring to you. There is a limit to how much detail the human eye can distinguish, and I don't believe we are too far from maxing that out, maybe in the next 10-15 years or less. I believe that is why people think graphics aren't getting that much better: we are comparing it to the jump from NES-style 8-bit graphics to games animated with 3D polygons less than 10 years later. You can't really expect that large a technology leap in graphics anymore.

Some of the Xbox 360, PS3, and latest PC games already look incredibly realistic, and exactly how much further do you really think they can take that before our eyes can't even notice the improvement? CGI movies like Avatar are about the pinnacle of what we can do with 3D graphics today, and I gotta say, if it gets any better than that, I probably couldn't tell the difference.
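For what it's worth, you can put a rough number on that "limit of the eye" idea. The sketch below assumes the common rule of thumb that 20/20 vision resolves roughly 60 pixels per degree, and uses a 52" 1080p TV viewed from about 8 feet as the example setup; all of these figures are assumptions for illustration, not measurements.

    #include <cmath>
    #include <cstdio>

    int main() {
        // Rule-of-thumb assumption: 20/20 vision resolves about 60 pixels per degree.
        const double eyePixelsPerDegree = 60.0;
        const double pi = 3.14159265358979;

        // Example viewing setup (assumed, not measured): 52" 1080p TV from ~8 feet.
        const double diagonalIn = 52.0;
        const double viewingDistanceIn = 96.0;
        const int horizontalPixels = 1920;

        // Width of a 16:9 panel from its diagonal.
        double widthIn = diagonalIn * 16.0 / std::sqrt(16.0 * 16.0 + 9.0 * 9.0);
        double fovDeg = 2.0 * std::atan((widthIn / 2.0) / viewingDistanceIn) * 180.0 / pi;
        double displayPixelsPerDegree = horizontalPixels / fovDeg;

        std::printf("Horizontal field of view: %.1f degrees\n", fovDeg);
        std::printf("Display delivers ~%.0f px/deg; the eye resolves ~%.0f px/deg\n",
                    displayPixelsPerDegree, eyePixelsPerDegree);
        return 0;
    }

Under those assumptions the screen works out to roughly 70+ pixels per degree, i.e. already slightly beyond what the eye can pick apart at that distance, which is exactly the diminishing-returns point being made above.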

The graphics suck in MW2 and Mass Effect?

 

Coulda fooled me. Watching my friend play MW2 in full 1080p on his 52" TV he uses as a monitor was like watching a frikkin movie. At least until MW2 brought his once ass-kicking system to its frikkin knees (its frikkin KNEES I tell ya, and it's a machine he spared no expense on, but that was a few years ago. I swear if MW2 could make his Intel Quad-powered machine choke I don't wanna know what it'd do to my poor Phenom). :teehee:

 

Look no further than Windows XP for why more games don't take advantage of DX10 or 11. XP still has a huge install base, and an equally huge fan base. Microsoft will not completely end support for it for a few more years yet, though it only gets security updates now. Vista and 7 just aren't good enough; there's really no killer app for them (DX10 and 11 don't count, sorry). XP, on the other hand, had several, much-improved USB support being one of the most important (you know, for the people wanting to plug and play), but there was also the move to the NT kernel.
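A rough illustration of what that means for a game that still has to run on XP: the Direct3D 10/11 runtimes simply don't exist there, so an engine typically probes for them at run time and falls back to its DX9 path. The DLL and function names below are real, but the surrounding structure is just a hypothetical sketch, not how any particular engine does it.

    #include <windows.h>
    #include <cstdio>

    // Hypothetical sketch: probe for the Direct3D 11 runtime at startup and fall back.
    int main() {
        HMODULE d3d11 = LoadLibraryA("d3d11.dll");
        bool haveD3D11 = d3d11 != nullptr &&
                         GetProcAddress(d3d11, "D3D11CreateDevice") != nullptr;

        if (haveD3D11) {
            std::printf("D3D11 runtime present - the DX10/11 render path is an option.\n");
        } else {
            // A stock Windows XP install always lands here: no D3D10/11 runtime at all.
            std::printf("No D3D11 runtime - falling back to the DX9 render path.\n");
        }

        if (d3d11 != nullptr) FreeLibrary(d3d11);
        return 0;
    }

As long as the XP install base stays huge, that DX9 fallback has to exist and be the lowest common denominator, which is why so few games bother building much on top of DX10/11.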

 

Just give it two or three more years. XP will take (and already HAS taken) more time to fade into insignificance than 9x/ME did, but fade it will as many people replace rather than fix their computer when things go wrong, and those new machines will be running 7. Although incidentally it'll probably take me personally less time to move to 7 than it took to move fully to XP.


What Microsoft should do is pretty much just build their next OS off of XP. You know, keep all the programming and just update the UI for more shinies, like they did in Vista, but without all the problems. Some new screensavers would be nice.

 

So is a problem with the advancement of graphics the fact that people like an older OS than the new ones?

 

Okay, I think I've seen the pattern now. The reason that technological advances have been stalled in gaming is that

The Consumer refuses to upgrade or doesn't know how.


What Microsoft should do is pretty much just build their next OS off of XP. You know, keep all the programming and just update the UI for more shinies, like they did in Vista, but without all the problems. Some new screensavers would be nice.

If I have this right, and I probably don't, the reason why Vista had to be built from the ground up, and subsequently led to so many problems, was more of a legal issue than anything. XP was still based on Win 95 code, Win 95 was based on Win 3.x, and Win 3.x was based on DOS. Now this might not be saying much if it weren't for the whole deal with Microsoft essentially reverse engineering DOS and Apple to create Windows. Vista was their way of finally moving to code that was entirely new and didn't have all the legal issues that previous versions of Windows had. Windows 7 is based on Vista, but incorporates those elements of XP that Microsoft determined it had free and clear use of.

 

As I've been using Windows 7 for over 6 months now, I frankly don't see what's so bad about it once you're running it on a suitably capable system and have turned off most of the bells and whistles. I'd be tempted to think that most of the reported problems with Windows 7 have nothing to do with the OS itself, but rather with users who lack the hardware to run it properly, or with issues surrounding 64-bit processors. The fact of the matter is that the 64-bit environment isn't compatible with a lot of older programs, and most newer programs don't quite know how to make use of it yet, so people are forced to replace programs they've been using for 6+ years with something that looks and operates totally differently and doesn't work quite as well... hence all the hate. The problem is that instead of just updating program code to work with newer systems, some turkey somewhere decided to get creative and force a new interface scheme on everyone, so not only does a software company have to figure out how to finally utilize newer processors, it also has to mess around with coding for an entirely new UI.
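One concrete example of the kind of 64-bit porting headache being described (my own assumption about which issues bite, not something from the thread): a lot of old code stored pointers in plain ints, which worked by accident when both were 32 bits wide but can silently break once pointers grow to 64 bits.

    #include <cstdio>
    #include <cstdint>

    // Illustrative only: a classic reason old code misbehaves when rebuilt for 64-bit.
    // Storing a pointer in a plain int worked by luck when both were 32 bits wide;
    // on a 64-bit build the upper half of the pointer can be thrown away.
    int main() {
        int value = 42;
        int* p = &value;

        unsigned int asInt = (unsigned int)(uintptr_t)p;   // the old 32-bit habit
        bool survived = (uintptr_t)asInt == (uintptr_t)p;

        std::printf("sizeof(void*) = %zu, sizeof(int) = %zu\n", sizeof(void*), sizeof(int));
        std::printf("Pointer round-trips through an int? %s\n",
                    survived ? "yes (got lucky)" : "no (truncated)");
        return 0;
    }

Fixing that sort of thing across a codebase that's been shipping since the XP days is exactly the kind of work developers put off, which feeds the slow upgrade cycle being complained about here.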

