
What do you guys think of this build?



Excuse me, but I wouldn't recommend favouring a Radeon R9 290X over an R9 280X at the moment. They are quite close together in performance, and you'll get a 280X for half the price.

He's not in the R9 290X price range to begin with. He's only just in the R9 280X price range if he sticks with the current CPU, and he'll have even less room if he picks anything more expensive.

 

The current AMD chip architecture is lacking floating-point units, which are very important for gaming, so they really can't compete with the Intel chips in terms of gaming performance.

But you see, the thing is that they do, e.g.: http://i2.wp.com/gearnuke.com/wp-content/uploads/2013/10/battlefield4-beta-7.jpg

Same situation in a number of other games, mostly new ones, cba to dig it all up.

 

"Lacking FPU" is a non-issue: there's actually the same number of FPU and the same total width in FX-8350 as in intel 4-cores. The issue is as always that 8350 still doesn't do well with old games that put everything important on a single thread.

 

Most such games are old enough that they'll run fine on anything. Skyrim, which uses essentially a 2011 revamp of a 2006 engine based on a 2002 development of a 1998 engine, is one of the few exceptions. Starcraft 2 is another. These are outliers.

 

As for power consumption, there is a difference. But when you're gaming, not benching just the CPU, and have your GPU running hot and the monitor on, it comes out to around 450W vs 400W total draw.

Cost-wise it's less than 1 cent per hour of gaming. Idling at one red light instead of stop-starting is roughly equivalent to a quarter hour of gaming, or two hours' worth of the 8350-vs-3570 difference.
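A quick sanity check on that "less than 1 cent per hour" figure, as a sketch; the 50 W delta is the 450 W vs 400 W wall draw above, while the $0.12/kWh electricity price is my own assumption and will vary by region:

```python
# Cost of the extra wall draw while gaming.
# The 50 W delta comes from the 450 W vs 400 W figures above;
# the electricity price is an assumption (adjust for your region).

delta_watts = 450 - 400
price_per_kwh = 0.12  # assumed USD per kWh

cost_per_hour = (delta_watts / 1000) * price_per_kwh
print(f"Extra cost per hour of gaming: ${cost_per_hour:.3f}")        # ~$0.006
print(f"Extra cost for a 2-hour session: ${2 * cost_per_hour:.3f}")  # ~$0.012
```

Even at double that electricity rate it's still only around a cent per hour.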


Meh, I have zero issues with my 8350; that's where overclocking and water-cooling heatsinks come into play. The default turbo of 4.3 GHz is fine by me.

Another reason why I recommend it: personal experience.

 

or you could get one of these :D

 

http://www.tigerdirect.ca/applications/SearchTools/item-details.asp?EdpNo=8324286&CatId=1946

Edited by Thor.

Well, technically I'd always take the chip that is faster and has the lower power consumption.

If you are fine with the AMD chips, then fine.

 

As said, I'd take the Intel i5 or i7 chip and a Radeon R9 280X because at the moment they are undoubtedly way better than their competitors. And allow me to add: floating-point operations will be important elements of future games as well.

 

(Two AMD cores share one floating-point unit; not every core has its own, so they can't actually work the way two full cores would. That's the reason the Bulldozer AMD cores are called logical cores rather than physical ones.)

 

The OP asked what we think of this setup, and these are my 2 cents.

Edited by tortured Tomato

Two AMD cores share one floating-point unit; not every core has its own, so they can't actually work the way two full cores would. That's the reason the Bulldozer AMD cores are called logical cores rather than physical ones.

Yup, but people easily forget that CPUs are not only used for games. AMD's FX 8350 can beat even Intel's Ivy Bridge "Extreme" i7s by a large margin in some tasks. In gaming, the FX 8350/8320 are roughly on par with the i5 3570K in single- and dual-threaded games, and sit between it and the i7 3770K in games that can utilize an octa-core.

 

Another thing to mention: the Hyper-Threading on Intel's i3 and i7 chips has a lower overall gain than AMD's module implementation. An i7 3770K gains some 20% in multi-threading performance over an i5 3570K, with near-identical per-core performance. An FX 8350 gains a 60% increase in multi-threading performance over a Phenom II X4 965, also with near-identical per-core performance. That's when using software that can take advantage of all threads.
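To make those scaling figures concrete, here's a tiny sketch that just reuses the 20% and 60% gains quoted above, normalised to each vendor's own quad-thread baseline; it's illustrative arithmetic, not new benchmark data:

```python
# Relative multi-threaded throughput from the gains quoted above.
# Both baselines are normalised to 1.0; the 20% / 60% figures come from this post.

i5_3570k = 1.00              # 4 cores / 4 threads
i7_3770k = i5_3570k * 1.20   # ~20% gain from Hyper-Threading (4 extra threads)

phenom_x4_965 = 1.00             # 4 full cores
fx_8350 = phenom_x4_965 * 1.60   # ~60% gain from 4 extra module "cores"

ht_gain_per_thread = (i7_3770k - i5_3570k) / 4
cmt_gain_per_thread = (fx_8350 - phenom_x4_965) / 4
print(f"Gain per extra thread: Intel HT ~{ht_gain_per_thread:.0%}, AMD CMT ~{cmt_gain_per_thread:.0%}")
# -> roughly 5% vs 15% of a baseline thread's throughput per added thread
```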

 

AMD's module implementation in the Piledriver FX series flops on software that isn't optimized for it, but gets some impressive results on software that is. With Steamroller around the corner (architectural improvements, roughly 25% more performance, backwards compatibility with AM3+) and more games being optimized for AMD's architectures (the new consoles run on octa-core AMD Jaguars), it's possible that Intel's large lead may become a thing of the past in the next generation of games.

 

By the way, when you compare CPU performance, you compare against a similarly priced equivalent; it's the price/performance that matters most, not performance alone, unless you have a crap-ton of money or do professional 3D work. The only things Intel has in the 8320's price range are the i3 4340 and the i5 3350P, both of which are overall on par with it (rock in per-core, flop in multi-threading), and both of which are locked.

 

Well, technically I'd always take the chip that is faster and has the lower power consumption.

If you are fine with the AMD chips, then fine.

Yes, the FX 8320 is rated at 125W while the i5 3570K is rated at 77W. Temperature difference? An overclocked FX 8320 runs cooler than an i5 3570K at stock with the same cooler. Performance difference? The i5 3570K wins by roughly 60% in per-core and 4% in multi-threaded applications overall. Overall performance difference? 14% in favor of the i5 3570K. Price difference? Around $70 in favor of the 8320. Price/performance ratio? 30% in favor of the FX 8320.
Is a 14% overall performance increase worth 30% more money spent on the CPU alone, the same money that could go into a better graphics card? Not in my opinion; FX chips will run most older and modern games at over 60 FPS at 1920x1080, and if not, there's always performance through overclocking.
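Here is a sketch of that price/performance calculation; the ~$150 and ~$220 street prices are my own assumptions, chosen to be roughly consistent with the ~$70 gap mentioned above, and the 14% overall figure is the one quoted in this post:

```python
# Perf-per-dollar check using the figures from this post.
# Prices are assumed street prices (roughly matching the ~$70 gap above).

fx_8320_price, i5_3570k_price = 150.0, 220.0  # assumed USD
fx_8320_perf, i5_3570k_perf = 1.00, 1.14      # i5-3570K ~14% faster overall (quoted above)

fx_value = fx_8320_perf / fx_8320_price
i5_value = i5_3570k_perf / i5_3570k_price

print(f"FX-8320 perf-per-dollar advantage: {fx_value / i5_value - 1:.0%}")  # ~29%, i.e. roughly 30%
```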

The FX 8350 is slower than a Core i3 3220 [edit: Core i5 3450] in terms of gaming performance, by the way; that has been benchmarked with actual games. It is faster at video-converting tasks, though, while still being slower than certain Core i7s.

 

Well yes, all of this is the fault of their architecture.

Edited by tortured Tomato

 

 


The FX 8320 is slower than a Core i3 3220 in terms of gaming performance, by the way; that has been benchmarked.

 

Can you find me said benchmark?


Can you find me said benchmark?

Sure, already linked above. It's widely known that the AMD Bulldozer architecture can't compete when it comes to gaming. It holds up well in video work, but at a much higher power consumption.

 

So, it depends on what you want to do. If you want to edit or process videos, the AMD chips beat the Intel ones. In gaming, they are way slower.

 

Besides gaming performance, the most important things for me would be power consumption and heat output.

 

Therefore, for a gaming rig, it would be nonsense to take an AMD chip.

Edited by tortured Tomato

It's good for everything and gaming, especially folding and stuff. That's where it really shines.

True, with Piledriver they made a lot of improvements compared to Bulldozer. But it still heavily depends on the task. In games, they are still middle ground.

 

Edit

E.g., the FX 8350 does quite well in benchmarks with Adobe CS6.

Edited by tortured Tomato
