kalikka Posted October 25, 2013 (edited)

Some statistics on how overclocking affects game performance:
- 160 different CPUs in WoT (@stock)
- 160 different CPUs in WoT (@OC)

So the FPS difference is roughly 10 FPS when talking about single-core performance. That's not much, maybe one more year of useful CPU life.

And as FMod said, single-core performance differences are minimal between Sandy Bridge and Haswell (and that's two CPU generations apart), since Intel focused on improving multi-thread/HT/iGPU performance instead. And with the crappy TIM in Ivy Bridge/Haswell, even 4.5GHz is hard to reach. I have two 3570Ks and neither of them can get to 4.5GHz (4.3GHz and 4.2GHz).

Edited October 25, 2013 by kalikka
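As a rough sketch of what those overclocks amount to (the 3.4GHz base clock is the i5 3570K's stock spec; the rest are kalikka's numbers from above):

```python
def oc_headroom_pct(stock_ghz, oc_ghz):
    """Relative clock-speed increase from an overclock, in percent."""
    return (oc_ghz - stock_ghz) / stock_ghz * 100

# i5 3570K base clock is 3.4 GHz; the two chips above topped out
# at 4.3 GHz and 4.2 GHz respectively.
print(round(oc_headroom_pct(3.4, 4.3), 1))  # 26.5
print(round(oc_headroom_pct(3.4, 4.2), 1))  # 23.5
```

So even a "disappointing" Ivy Bridge overclock is still a ~25% clock bump over stock.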
Werne Posted October 25, 2013 (edited)

@FMod I was talking about an Ivy Bridge i7; Haswell is Intel's mistake in my opinion. While Haswell brings a new architecture and better multi-threading, the performance difference between Haswell and Ivy Bridge is some 10-15%, and Ivy Bridge can reach higher clock speeds when overclocked, which makes them near-equal (I've even seen an overclocked Ivy Bridge outperform Haswell). And their hyper-threading still sucks. As much as people laugh at them, AMD's octa-core Piledrivers have better overall multi-threading capability (the FX 8350 is within ~20% of the i7 4770K overall) even though their cores are more than 50% slower, but that's also hit-and-miss because of poor optimization: they suck in some tasks and beat i7s in others. The FX works quite well for software compiling, though.

Once you disable HT, the CPU gains some 10-15% in single-core performance, and once overclocked, an Ivy Bridge i7 will beat a Haswell Xeon per-core, which matters a lot in games like the TES/Fallout series. Bottom line: replacing CPUs is playing catch-up with software demands. There's no strict "this one will work better" option; for now, an overclocked i5 3570K will perform better than a Haswell Xeon 1230v3 in most current-gen and older games. I was talking about the performance gain in the current generation of games and the older ones the OP is playing, where a newer architecture, new instructions and better multi-threading don't mean jack s***. All of that may be worth something in a year or two, but by then you'll have Broadwell and Steamroller/Excavator, so Haswell will be old as well.

By the way, Haswell i7s can't go over 4.3GHz? That's little to no improvement over turbo, so what's the point of unlocking them for overclocking when there's obviously so little headroom?

@kalikka Yes, the difference is some 10FPS, but in the i7 3770K's case that's a 22-23% difference, which is not a small increase.
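To see how a "mere" 10 FPS maps onto that 22-23% figure, a quick back-of-the-envelope calculation (the example FPS values are illustrative, not taken from the benchmark itself):

```python
def fps_gain_pct(stock_fps, oc_fps):
    """Relative FPS gain from overclocking, in percent."""
    return (oc_fps - stock_fps) / stock_fps * 100

# A ~10 FPS gain on a base of ~45 FPS lands in the quoted 22-23% range:
print(round(fps_gain_pct(45, 55), 1))  # 22.2
```

The point being that the same absolute 10 FPS looks much bigger in relative terms when the base frame rate is low, e.g. at 30 FPS it would be a 33% gain.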
And it still means you have to shell out for a new mobo+CPU a year sooner, and if you play at 30FPS like me, an overclocked i7 will last a fair bit longer than a year. Plus, I'm a bit biased since I like overclocking stuff; I more or less only look at Intel's K-series and AMD's Black Edition CPUs before considering anything else.

Also, that benchmark you posted is a bit odd. The i5 3570K is set to 4.5GHz in it, but I had an i5 3570K at 4.7GHz, and reached 5.0GHz once, though I didn't have money for a better cooler; it was heating up like crazy even with an aftermarket one, so I couldn't keep it that high. Also, the FX 8320 I have at the moment can reach 4.6GHz on stock voltage (1.320V) and 5.1GHz at 1.372V; it's at only 4.7GHz in that bench, so they barely overclocked it. And the 8350 should easily go over 5GHz - I've seen a few at 5.5GHz on 1.41V. So either those guys had bad luck with their chips, or they did a sloppy job. :confused:

And I know what you mean by crappy TIM: my old i5 at stock speed (no turbo) would reach 72°C on the stock cooler (Intel's stated max is 67°C). For comparison, an overclocked Core 2 Duo E4500 at 2.97GHz (a 65nm CPU, 2.20GHz stock) would reach 71°C on the stock cooler (Intel's max there is 72°C). I don't know what Intel did, but they messed up this time around; their CPUs always ran cooler than AMD's, but now even an overclocked 125W FX runs cooler than a stock 3570K (FX 8320 @4.4GHz = 52.3°C peak temperature, i5 = 57°C at stock, same cooler). Overclocking something like that is a nightmare - I spent most of my time looking for a way to keep it cool instead of actually overclocking.

By the way - personally, I think Intel should stop trying to improve the iGPU. AMD's APUs have better integrated graphics than the latest Intel CPUs, and a decent APU like the A8 5600K is cheaper than an Ivy Bridge i3 while being close in CPU performance.
Even though they lack CPU performance, the GPU matters more than the CPU in a lot of games, so APUs are "decent" for casual gaming, especially once overclocked and CrossFired with a low-end dedicated card. Intel could likely drop the price of their CPUs if they removed the iGPU and used motherboard chipset graphics instead (like they did with C2Ds and C2Qs on G31/G33).

Edited October 25, 2013 by Werne