
AMD unveils 8000 series GPUs



Eh. After re-reading, it appears that the 8670, at least, isn't a COMPLETE rehash:

 

1 GHz engine clock vs 0.8 GHz

2 GB GDDR5 vs 1 GB GDDR5

1150 MHz memory clock vs 1000 MHz memory clock

384 stream processors vs 480 stream processors

 

Well. Raised the clocks and gimped the stream processors. Hmm. There must be an architecture change kicking around in there somewhere, or the 8670 will perform WORSE than the 6670 - the quick math below shows why.
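
As a back-of-the-envelope check (a sketch, assuming the usual 2 FLOPs per stream processor per clock that AMD uses for peak figures - an assumption, since nothing official is out), the raw shader throughput of the two actually comes out identical:

```python
# Rough peak-throughput comparison from the quoted preliminary specs.
# The 2-FLOPs-per-SP-per-clock factor (one fused multiply-add) is an
# assumption based on how AMD usually quotes peak numbers.

def peak_gflops(stream_processors: int, clock_ghz: float) -> float:
    """Theoretical peak single-precision GFLOPS."""
    return stream_processors * 2 * clock_ghz

print(f"HD 8670: {peak_gflops(384, 1.0):.0f} GFLOPS")  # 768 GFLOPS
print(f"HD 6670: {peak_gflops(480, 0.8):.0f} GFLOPS")  # 768 GFLOPS
# Identical on paper - any real-world difference would have to come from
# the architecture or from the doubled, faster memory.
```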


before seeing if the entire series is a rebrand.

It's not; reliable preliminary specs for proper units have been out for a couple of months already.

 

But the low end is going to stay low. Simply put, the 32 nm, 28 nm, and smaller tech nodes (note: Intel's Tri-Gate 22 nm is roughly equivalent to other foundries' 28 nm, and similarly 14 nm Tri-Gate to 20-22 nm planar) aren't cheaper than the older 130 nm, 90 nm, etc. nodes even per transistor - they are more expensive. So low-end development avoids them.
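
To make the mechanism concrete (an illustration with invented round numbers, not real foundry pricing): cost per transistor is roughly wafer cost divided by dies per wafer times transistors per die, so if wafer cost rises faster than density, a shrink gets more expensive per transistor, not less.

```python
# Hypothetical illustration - wafer costs and transistor counts below are
# made-up round numbers chosen to show the mechanism, not real pricing.

def usd_per_billion_transistors(wafer_cost: float, dies_per_wafer: int,
                                billions_per_die: float) -> float:
    """Cost per billion transistors for a given wafer and die."""
    return wafer_cost / (dies_per_wafer * billions_per_die)

# Same design shrunk to a newer node: density doubles (2x transistors in
# the same die area), but assume the wafer costs 2.2x as much.
mature  = usd_per_billion_transistors(3000, 200, 0.5)   # $30.00
leading = usd_per_billion_transistors(6600, 200, 1.0)   # $33.00

print(f"mature node:  ${mature:.2f} per billion transistors")
print(f"leading node: ${leading:.2f} per billion transistors")
# Density doubled, yet cost per transistor went UP - which is why
# cost-sensitive low-end parts linger on older nodes.
```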


Although no one can say for sure, this trend has been going on for ages now; in fact, I have yet to see a card make as big a technological leap as the old 8800 GTX did. The capability of the current (and next) generation of GPUs is indirectly tied to the current line of gaming consoles (the "next" gen having roughly the power of a cheap future gaming laptop). I really don't see a phenomenal card appearing for at least 2 years, since the focus of GPU manufacturers today seems to be increasing efficiency, decreasing power consumption, and the mobile market.

Although no one can say for sure, this trend has been going on for ages now; in fact, I have yet to see a card make as big a technological leap as the old 8800 GTX did.

It was also a big price leap.

You might not remember - most people didn't care at the time; "latest and greatest" wasn't the slogan, and three generations of hardware coexisted on shelves. But I used to get the latest hardware right after it came out and distribution settled - "soft launches" were the order of the day.

 

Except when the 8800 GTX arrived... it was like a cold shower. It was impressive and I had anticipated it, but $700+ (it sold above MSRP)? That's enough to give anyone pause, since the normal price for high-end cards at the time was in the $390 range.

And the 8800 GTS actually had a worse price/performance ratio, so I couldn't buy the GTS because of that, and I couldn't believe the GTX would stay in that price range for almost a year - especially since I already had a 7900 GTX released only half a year earlier, and there was no use for DX10 yet (you'd have had to buy the Vista abomination, and there were no games anyway).

 

While I waited for the price to drop, the 2900 XT came out, with a nifty 512-bit bus, a full gigabyte of new GDDR4, a newer tech node, and features all around - all for $400ish. So that settled it.

 

Soon there will be another monster - the long-awaited GK110, which will be about 1.5 years overdue by then, and 2 years past when it was first shown. Real timely, yeah. The MSRP is slated at $900, so if you're into monster cards, that's the one for you.

Originally I bought Lightning 680s with the intention of unloading them while they were hot and getting a single GK110, but now I'm not even sure. They finally got SLI to work reasonably well, so what the heck? Yes, it will be the most powerful single-GPU card, faster than the HD 8970 that will follow. But this time it's a 1.5-year-old product with a 2-year-old feature set and worse price/performance than the 680 had at launch.

 

 

So unreasonably priced performance leaps aren't going anywhere. Architecturally, the 8000 series is more interesting, though it remains to be seen how much of that unification will be supported in software.


Although no one can say for sure, this trend has been going on for ages now; in fact, I have yet to see a card make as big a technological leap as the old 8800 GTX did. ...

 

Power consumption is something that does need addressing; they can't keep increasing power requirements while energy costs are going up at the rate they are. I don't think there is any need for a huge jump in performance at this point - very few games fully use what's available now unless they're being played at silly resolutions. Hell, if the rumored specs of the next-gen machines are correct, then it won't really be needed then either.


It was also a big price leap. You might not remember - most people didn't care at the time; "latest and greatest" wasn't the slogan, and three generations of hardware coexisted on shelves. ... So unreasonably priced performance leaps aren't going anywhere. ...

 

Although I was not taking bang-per-buck, efficiency, price, or any other detail except pure performance into account, one thing I don't understand about your post is the "latest and greatest" part. I can understand GPU performance not mattering much in 2004-05, since the PS2 was already ruling the market at its peak and most games, monitors, and gamers didn't demand much (I kept my 6600 GT until 2008). But right now I don't see the "latest and greatest" slogan anywhere either (except on the iPhone), nor being used in any context other than the one it's been used in since 50000 B.C.

 

I'm not sure about the GK110 and how good it will turn out to be; I think I'll just sit and have fun with my GTX 470 until the 800 series comes out.

 

Power consumption is something that does need addressing; they can't keep increasing power requirements while energy costs are going up at the rate they are. I don't think there is any need for a huge jump in performance at this point - very few games fully use what's available now unless they're being played at silly resolutions. Hell, if the rumored specs of the next-gen machines are correct, then it won't really be needed then either.

 

Dude, the new generation of Super HD was just announced, and the PS4 and Xbox 720 (Django/Durango or whatever) are about to be unveiled. They could act as a midwife, bringing Ultra HD into the spotlight just like how 1080p became common. That could mean those super silly resolutions become the norm probably 1.5 years from now (4K is 4x the pixel load of 1080p, and 8K is 16x). And don't be fooled by the rumored specs: the lowly (!) 360 has the equivalent of (probably) a multi-threaded Athlon CPU and a 7800 GT GPU, yet it still handles games like Crysis at a consistent frame rate.
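
For reference, here's the raw pixel math behind those load figures (a rough sketch - real GPU load doesn't scale purely with pixel count):

```python
# Pixel counts relative to 1080p. Fill-rate and shading load grow roughly
# with pixel count, though real workloads scale less cleanly than this.
resolutions = {
    "720p":   (1280, 720),
    "1080p":  (1920, 1080),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name:8s}{w * h:>12,} pixels  ({w * h / base:5.2f}x 1080p)")
# 4K is 4x the pixels of 1080p; 8K is 16x. A "1600%" load figure only
# holds if you mean 8K relative to 1080p.
```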


Although I was not taking bang-per-buck, efficiency, price, or any other detail except pure performance into account, one thing I don't understand about your post is the "latest and greatest" part. ...

Focus on the "latest" part.

 

Today it seems like even SB (Sandy Bridge) is being taken off the shelves - even low-end computers are getting IB (Ivy Bridge) processors. And SB is only two years old and performs about the same as IB.

This was not how things were in the 1990s and early 2000s. Buying a PC with a 5-year-old CPU model was considered quite normal in the 90s, less so in the 00s. The same applied, to a lesser extent, to GPUs - you could see cards two generations apart sharing a shelf.

Today, when something new is launched, it hits hundreds of stores and dozens of online retailers at once. Back then, it was more like they would slowly and lazily start selling it through select distributors.

 

 

I can understand GPU performance not mattering much in 2004-05, since the PS2 was already ruling the market at its peak and most games, monitors, and gamers didn't demand much

GPU performance did matter; it wasn't like today, when you don't have anywhere to spend it. So I'd change mine every generation - except for the 8800. The price was like a cold shower - just a big "NO, sorry, but NO WAY".

 

The price dropped much later - over a year later - but at first you'd have to shell out $700, twice what previous cards cost. And you'd have to install a crappy OS nobody liked to make use of the new features, though price was the main thing. Simply put, no matter how cool and innovative it was, it wasn't worth it at that price.

The same applies to any new card: performance gains that don't come with an improvement in the price/performance ratio are pointless, as the quick illustration below shows.
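
To put numbers on that (a hypothetical example - the frame rates and prices are invented for illustration, not benchmarks), value is roughly frames per second per dollar, and a halo card can be faster yet a worse buy:

```python
# Hypothetical example: a faster card can still be worse value.
# All fps and price figures below are invented for illustration.

def fps_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Crude value metric: average frames per second per dollar."""
    return avg_fps / price_usd

last_gen_flagship = fps_per_dollar(60, 500)   # 0.120 fps/$
new_halo_card     = fps_per_dollar(80, 900)   # 0.089 fps/$

print(f"last-gen flagship: {last_gen_flagship:.3f} fps/$")
print(f"new halo card:     {new_halo_card:.3f} fps/$")
# 33% faster, yet noticeably worse value - exactly the kind of
# "performance leap" that doesn't move price/performance forward.
```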

 

 

Dude, the new generation of Super HD was just announced, and the PS4 and Xbox 720 (Django/Durango or whatever) are about to be unveiled. They could act as a midwife, bringing Ultra HD into the spotlight just like how 1080p became common.

Good luck. Right now most console games run at 720p - so the milestone to conquer will be 1080p.

 

It's called "Ultra HD" or more properly 4K. Not quite announced; they just settled on the name. Projectors, TVs and monitors have been available for a few years now.

 

 

And don't be fooled by the rumored specs: the lowly (!) 360 has the equivalent of (probably) a multi-threaded Athlon CPU and a 7800 GT GPU, yet it still handles games like Crysis at a consistent frame rate.

Games on consoles are usually but a pale shadow of their PC versions. It may be called "Crysis", but it looks like every other console game.

 

These don't even look like the same game. At the very least, you'd think they were different locations. But don't be fooled - look at the curve in the road. These screenshots, aside from a slight difference in camera angle, are supposed to show the exact same location:

 

http://pikigeek.com/files/2011/09/FbQG1.jpg

http://pikigeek.com/files/2011/09/XYwSO.jpg


Focus on the "latest" part. ... Today, when something new is launched, it hits hundreds of stores and dozens of online retailers at once. ...

 

That wasn't the answer I was looking for, but thanks for the reply anyway.

 

Good luck. Right now most console games run at 720p - so the milestone to conquer will be 1080p.

 

It's called "Ultra HD" or more properly 4K. Not quite announced; they just settled on the name. Projectors, TVs and monitors have been available for a few years now.

 

Focus on the "could"s I typed. We can never speculate properly, as our expertise in hardware pretty much relies on personal experience and information articles. The 1080p milestone is already conquered, literally or figuratively. And the UHD TVs are not just announced: LG Electronics began selling (pretty much) the first flat-panel Ultra HD display in the United States, with a resolution of 3840 × 2160, in October 2012 at an insane price, and Samsung, Sharp, and Sony followed by announcing their own 4K TVs. So I'd say reaching 4K UHD (2160p) is a close possibility for next-gen consoles, and 8K UHD (4320p) a distant but not unreachable one.

 

Games on consoles are usually but a pale shadow of their PC versions. It may be called "Crysis", but it looks like every other console game.

 

These don't even look like the same game. At the very least, you'd think they were different locations. But don't be fooled - look at the curve in the road. These screenshots, aside from a slight difference in camera angle, are supposed to show the exact same location.

 

I'd say those two pictures look relatively identical, unless you're, like, going to stare at a still picture for 10 minutes. What you see when playing on the 360 is a motion picture (!); you can't quite tell the difference when things are actually moving. The change in color mood is probably down to the software and hardware differences between the PC and the 360 when the pictures were taken for comparison.

 

Oh, and dude, from what I can squeeze out of your posts: no, I didn't live under a rock for the last two decades and suddenly start talking hardware. I custom-built my own PC at age 10 (thanks to Dad for the trust :D) and have been tinkering with (or reading about) hardware ever since. English is not my native language either, so why don't you ease up on the history lessons a little? Cheers.

