
PC and next upgrades (advice wanted)



Haven't personally tried BF4, but Crysis 3 recommends a quad-core and Far Cry 3 recommends a dual-core. I mean, do you have a source for FC3 using more than 4 cores? It's one of my favorite games, and if it really does use 6, that might be worth going for an AMD octa-core. But my CPU hits 48-50% load and locks there every time my framerate drops in FC3, which seems to indicate it only uses 2 cores - and that would make an i5 more effective. :s
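The 48-50% reading can be sanity-checked with a bit of arithmetic: if a game saturates N of C cores, overall CPU load sits near N/C. A rough sketch (the helper name and the Task Manager reading are illustrative, not from any real tool):

```python
def estimated_cores_used(total_load_pct: float, core_count: int) -> int:
    """Estimate how many cores a game saturates from overall CPU load.

    If N of C cores are pegged near 100%, total load reads roughly N/C.
    Crude heuristic: real loads are rarely this clean, and threads can
    migrate between cores, so treat the result as a hint, not proof.
    """
    return round(total_load_pct / 100 * core_count)

# ~50% total load on a quad-core suggests two saturated cores
print(estimated_cores_used(50, 4))
```

On the same reasoning, a game using all 4 cores of a quad would push total load toward 100% when CPU-bound, which is why a hard lock at 50% is suspicious.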

I don't blame you for not playing BF4 - from flying cars to spawning in the middle of the sky to CTDs, it's just awful. Though sometimes the bugs are hilarious. :laugh:

 

And in Far Cry - I just took a look at it and... I was mistaken, FC3 uses 4 cores. My bad. One thing though (kinda irrelevant, but I wanna mention it): the game is surprisingly stressful on the graphics sub-system. The 8350 trails the i7-4770K by only 5-6 frames despite the Intel chip's roughly 50% higher IPC, because both are bottlenecked by the 7970 (the Nvidia equivalents would be the GTX 680 and GTX 770) - yet the only CPUs that drop under 45FPS average are dual-cores:

http://static.techspot.com/articles-info/615/bench/CPU_01.png

And the reason your CPU sits at only 50% load is that you're bottlenecked by your graphics card - FC3 on the ultra preset with 4xAA hovers around 30-35FPS on a GTX 680, while on the CPU side even a Phenom II X4 980 can nearly hit 60FPS. GPU bench:

http://www.abload.de/img/far-cry-3-test-gpus-13ir6i.png

Kinda like Crysis - except that, unlike Crysis, it can run at 45FPS on an old Phenom II X4 955.

 

And speaking of Crysis...

http://cdn.overclock.net/a/a3/a3006fcb_Crysis-3-Test-CPUs-VH-720p.png

It's a different kind of game entirely: the 8350 is beaten only by the 3960X, a 6C/12T processor, and ties with the 3770K, while the Phenom II X4 965 is in the 30s. Granted, there's no Haswell since the benchmark is old; if there were, the 4770K would be slightly ahead. My guess is that Crysis doesn't require that much FPU performance, which seems to be AMD's weak spot with Piledriver (2 cores share 1 FPU).

 

Most of my games are limited by my graphics card, but I can keep all of those above 30 fps anyway. The only games I really need the improvement for (the ones that sometimes drop below 30) are CPU limited, or strongly hint that they are. Like Oblivion, which uses one CPU core. Or Borderlands 2, where the framerate dives as soon as I switch PhysX to CPU processing.

Actually, Oblivion depends more on GPU power than CPU, at least in my case. My old Core 2 Duo E4500 ran it perfectly fine at 60FPS when I coupled my current 7770 with it (that was during my upgrade process, before I'd bought a new CPU), and it runs equally smoothly on an 8320. It was designed to run fine on a Pentium 4, and even your Phenom is light years ahead of a P4.

 

However, Oblivion is strange in that NPC heads pretty much bog down some graphics cards, newer models more so than older ones. My 7770 is a lot faster than a GeForce 9600GT, yet the 9600GT gets an equal framerate, and the 7770 only pulls ahead on a lower-res monitor like mine - and that 9600GT was paired with an old Athlon. I've also seen a GTX 770 + i5-3570K unable to hold 60FPS on a freshly installed Oblivion with no mods, clearly CPU-bound since the card was practically idling and Core0 was at 100%. On another occasion, I saw an i3-2120 + Radeon 6850 pulling a stable 60FPS. Unexplainable.

 

Methinks it's because Oblivion uses an old engine and API that weren't designed for modern hardware, so it runs like crap on certain configurations. I'm likely correct in that assumption, but there's no hard proof.

 

And regarding Borderlands, why run CPU PhysX when you can do GPU PhysX, which induces a much smaller framerate drop and less overall system strain? Even an i7 would flop face-first into the ground with that thing - CPU PhysX makes no sense unless you're running on a Radeon.

 

 

 

But I digress with all that talk, I do that often. The bottom line is, you can't make a mistake whichever CPU you choose (as long as it's at least a current or last-gen quad-core). I don't know if I've said it before, but either an FX 83xx or an i5 is good - there's no right or wrong choice either way you look at it. The i5 will rock in some games, the 83xx in others; more cores help with some workloads while higher IPC helps in others, and both will hold 30FPS. They're both good processors that do a good job at the tasks they're designed for, so the choice depends solely on what you do, what you want, what you play and what you need. The one that stands out from the bunch is the i7 - that thing is like an FX and an i5 combined, with both the IPC and the multi-threading, but that's also why it costs like both of them combined.

 

And in case you're wondering about multi-core utilization in future games, I can't say anything for certain. 4 cores vs 8 cores is in a questionable state right now, with 8-core consoles and some new titles utilizing octa-cores alongside other titles that don't. The gaming industry may go either way, so I can't suggest either as future-proof. I'm 100% sure games will go heavily multi-threaded eventually, but I'm unsure as to when - could be in 6 months or 6 years for all I know.

 

 

 

As for overclocking, if the hardware can take it, I don't see why anyone wouldn't want to overclock, unless they're uncomfortable about "damaging hardware" - which is BS. You can't damage it unless you pump 1.6V through a CPU/GPU, use a $40 mobo for overclocking, or let the CPU fry at 100°C. If you know the hardware's limits, all goes well and it can run for years, like it does for many others. If you don't know the hardware's limits, well, then you can damage it.

 

For example, the FX-series are in the clear up to 1.44V; 1.5V is for extremes with water cooling, but still acceptable for 24/7 operation as long as you can keep the heat under control. The highest 24/7 voltage is said to be 1.550V, but I wouldn't exceed 1.5V. The old Phenoms also overclock great as long as the mobo can take it, and they're still formidable gaming processors - Deneb can go to 1.45V; I wouldn't go to 1.5V, although it's possible. Your mobo can't take any of that, since it barely even supports a stock Phenom II X4 955, but you'd likely get some 20-30% more out of the chip (depending on the sample) with a decent 8+2 board, which can be bought used for little money.
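Those rules of thumb can be written down as a quick guard - to be clear, the voltage ceilings below are the numbers from this post, not official AMD specs, and the helper names are made up for illustration:

```python
# Rule-of-thumb 24/7 Vcore ceilings from the discussion above --
# forum guidance, NOT official AMD specifications.
VCORE_CEILING = {
    "fx_daily": 1.44,     # FX-83xx, everyday operation
    "fx_extreme": 1.50,   # FX-83xx, good water cooling only
    "deneb_daily": 1.45,  # Phenom II (Deneb)
}

def overclock_headroom_pct(stock_mhz: float, target_mhz: float) -> float:
    """Percentage gain of the target clock over stock."""
    return (target_mhz / stock_mhz - 1) * 100

def vcore_ok(vcore: float, profile: str) -> bool:
    """True if the chosen voltage stays under the rule-of-thumb ceiling."""
    return vcore <= VCORE_CEILING[profile]

# A Phenom II X4 955 (3.2GHz stock) pushed to 3.9GHz is ~22% --
# inside the 20-30% range mentioned above.
print(round(overclock_headroom_pct(3200, 3900), 1))
print(vcore_ok(1.40, "fx_daily"))
```

The point of encoding it this way is the post's own argument: know the limit first, then overclock toward it, never past it.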

 

To me, overclocking is simple math - buy all new, or take what you have to its limit. I personally prefer (and very much enjoy) overclocking over outright buying new stuff; it saves me some money in the long run and extends the platform's life (plus, it's fun for me). I had to buy new stuff in the end though - what I had before was a low-end dual-core and a low-end mobo with an HTPC card, and not even overclocking could save that from retirement. It did extend its life for another year and a half, with a 30% overclock on the CPU and graphics card.

Edited by Werne
Link to comment
Share on other sites

 


No offense, but I've been told by a lot of people over the last two years that my graphics card was more of a bottleneck than my CPU, so I upgraded my graphics card on their advice - and in certain games (Oblivion, Dark Souls, Mass Effect 1, Dragon Age, etc.) I saw absolutely no increase in performance. Those are all notoriously CPU-heavy games. And like clockwork, every time my CPU locks at 50% load, especially in games optimized for 2 cores, my framerate dives. This time I'm upgrading the CPU and keeping the GPU. Also, I'm waiting for the GTX 8xx series to release before I upgrade my card again.

 

As further testing, I used a program called CAR (core affinity something) in Dragon Age. It lets you disable CPU cores in-game to test performance. DA:O is optimized for 3 cores, according to the developers. I normally get 45 fps in the camp at Ostagar, which is the hardest place in the game to run. I disabled 1 of my 4 cores, and the framerate didn't change from 45 - I mean, it may have wiggled between 43 and 46, but that's negligible. I disabled another core, and the framerate dropped from 45 to 35. I disabled another, down to just one core, and the framerate was 15-20. Clearly, I was CPU-bound.
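That affinity experiment has a tidy interpretation: measure fps at each core count and find where performance stops scaling - below that knee the game is CPU-bound. A sketch using the numbers from the test above (the function name is hypothetical, and 17.5 stands in for the 15-20 single-core range):

```python
def cores_before_plateau(fps_by_cores: dict[int, float],
                         tolerance: float = 0.1) -> int:
    """Smallest core count whose fps is within `tolerance` of the best.

    If fps stops improving past N cores, the game scales up to N cores
    and something else (usually the GPU) is the limit beyond that.
    """
    best = max(fps_by_cores.values())
    return min(c for c, fps in fps_by_cores.items()
               if fps >= best * (1 - tolerance))

# Measurements from the Ostagar camp test above
measurements = {1: 17.5, 2: 35.0, 3: 45.0, 4: 45.0}
print(cores_before_plateau(measurements))  # 3 -> matches DA:O's 3-core claim
```

The plateau at 3 cores is exactly what the developers' "optimized for 3 cores" statement predicts, which is what makes this kind of test repeatable rather than anecdotal.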

 

I don't actually run PhysX through my CPU, I just tested it in Borderlands 2 for a few minutes.

 

Still, I just checked and I didn't know the Xbone and PS4 were using 8-core CPUs. I haven't checked since the rumors said they'd be quad. That might make it worth going for another AMD CPU, if it reflects on the overall optimization of the industry.

 

 

What about...

http://www.newegg.com/Product/Product.aspx?Item=9SIA2W019M6559

http://www.newegg.com/Product/Product.aspx?Item=N82E16813128627

http://www.newegg.com/Product/Product.aspx?Item=N82E16819113285

...as a new plan? ($230)

 

Compared against...

http://www.newegg.com/Product/Product.aspx?Item=N82E16822148910

http://www.newegg.com/Product/Product.aspx?Item=N82E16813157370

http://www.newegg.com/Product/Product.aspx?Item=N82E16819116898

...the old plan. ($420)

Edited by Rennn

 

The rule of dealing with Seagate is simple:

You ========10-foot pole======== Seagate hard drive

 

Other than that, it makes sense, but I would strongly recommend considering a second-hand Ivy or Sandy Bridge CPU plus an ASRock Z77 Pro3. The gain of Haswell over IB or SB is too small to bother with, and an i7-3770K can be had for less than an i5-4670.

 

 

Still, I just checked and I didn't know the Xbone and PS4 were using 8-core CPUs. I haven't checked since the rumors said they'd be quad. That might make it worth going for another AMD CPU, if it reflects on the overall optimization of the industry.

 

 

IDK. Right now, you're still better off with an Intel CPU. 2 years down the road, I'm quite sure FX-8320 will beat i5-4670K. It already does in a few 2012-2014 games.


 

 

Okay, I'm going with the Western Digital HDD then. I'll have to go 1TB again to stop the cost from doubling over the Seagate, but it should still be worth it. The number of Seagate HDDs that reviewers say failed within 6 months to a year was startling. I'm getting rid of my current HDD mostly for reliability... I won't risk getting another HDD that's just as unreliable.

 

Two modified plans.

 

Intel Plan: $390

http://www.newegg.com/Product/Product.aspx?Item=9SIA2W019M6559

http://www.newegg.com/Product/Product.aspx?Item=N82E16819115233

http://www.newegg.com/Product/Product.aspx?Item=N82E16813157297

 

AMD Plan: $320

http://www.newegg.com/Product/Product.aspx?Item=9SIA2W019M6559

http://www.newegg.com/Product/Product.aspx?Item=N82E16819113285

http://www.newegg.com/Product/Product.aspx?Item=N82E16813128627

 

Does anyone have an i5-3570 or an FX-8320, and want to talk about your experience with it? Are you ever disappointed with the speed... Does it need a new cooler... etc?

 

It might be worth noting that I'll be using this monitor. I don't expect to change this, as it seems perfect for me. Capping at 30 fps, I could scarcely care less about the 60Hz refresh rate or the 6ms response time (dealbreakers for other gamers at this price point, or so I'm told). I'm just interested in color accuracy, contrast ratio, viewing angles, etc. That means an IPS panel is a requirement this time. I'm also glad that it has an adjustable base, because my desk is a bit low without that.

http://www.newegg.com/Product/Product.aspx?Item=N82E16824236287

Edited by Rennn

Hitachi (= Toshiba) is the way to go these days. They're as reliable as WD and Samsung drives, as fast as WD Blacks (there's no low-end option), and they push prices down hard.

 

The Hitachi 7K4000 and Toshiba DT01ACA are the same drives, made by Hitachi - the Toshiba brand is just used for differentiated pricing.

 

http://www.newegg.com/Product/Product.aspx?Item=N82E16822149382

http://www.newegg.com/Product/Product.aspx?Item=N82E16822149407

http://www.newegg.com/Product/Product.aspx?Item=N82E16822149408

 

Does anyone have an i5-3570 or an FX-8320, and want to talk about your experience with it? Are you ever disappointed with the speed... Does it need a new cooler... etc?

 

It's a CPU. It's not the kind of part you can feel, like speakers or a display - it's a piece of silicon and fiberglass. First-hand experience doesn't matter.

 

Anyway, a good cooler is strongly desirable for the i5-3570 and mandatory for the FX-8320 if you overclock. If you don't, the stock ones are passable (AMD's stock cooler is much better, but the CPU runs hotter), nothing more.

 

I have an i7-4930K clocked to 4.9 on water and I'm very often disappointed with the speed.


 

I have a Hitachi HDD now, and I'm not sure I agree with you. It benchmarks pretty terribly on speed, and the return rate on Hitachi drives is about 3.4%, as opposed to Western Digital's 1.5%.

All 3 Hitachi drives in those links have 3/5 stars, with a disproportionately high number of people complaining that they were either DOA or died soon after arrival.

Combine that with the fact that my last Hitachi HDD died after less than 3 years... Western Digital looks like a good option, since the price is a mere $25 higher between the 1TB models.

 

On the contrary, firsthand experience does matter. No offense, but I didn't say anything about what people feel. I'm not asking whether anyone has had one blow up or anything similarly isolated, which could be random chance. Random chance becomes useful when it's repeated 200 times, like in Newegg reviews, but not in a random forum thread. However, if someone mentions that their 3570 bottlenecks them while emulating PS2 games or needed a new cooler because the default one is noisy, that's easily repeatable and testable. But it's only something that someone who owned one would know. Same for hard drives - you can bet "makes a lot of noise on write cycles" isn't on the spec sheet or in the benchmarks. When I asked whether people were disappointed, I perhaps wrongly assumed they would weigh the benchmarks against performance in current games, not rely on a gut (read: imagined) judgment.

 

Anyway, I won't be overclocking right away. If there's even a 1% chance of damaging something I'd rather not risk it at this point, as an OC won't be a matter of necessity yet, but rather of preference. Either of those CPUs should already be bottlenecked by my GPU.

Edited by Rennn

I have a Hitachi HDD now, and I'm not sure I agree with you. It benchmarks pretty terribly on speed, and the return rate on Hitachi drives is about 3.4%, as opposed to Western Digital's 1.5%.

Which model specifically?

What other HDDs did you run the same benchmarks on to come to the conclusion that the Hitachi drive is slower?

 

The 7K4000 specifically measures as a toss-up against the WD Black FAEX, with the latter sometimes slightly ahead, and every other 7200rpm-or-less drive considerably slower.

 

However, if someone mentions that their 3570 bottlenecks them while emulating PS2 games or needed a new cooler because the default one is noisy, that's easily repeatable and testable. But it's only something that someone who owned one would know.

Except even they wouldn't know.

 

If they think their 3570 bottlenecks them, do they know if the game would run any faster with... wait, what's faster than a 3570 in single-core? Nothing, really - even the 4960X is only 5% or so faster. Or whether it would run any slower on an i5-2400?

It's probably the emulator holding them back, and they don't know if the CPU is good or not unless they ran the same software on another CPU.

 

If they think the cooler is noisy, is that a reason to buy a 4770K? (Intel ships roughly the same cooler regardless of model.) Or if their aftermarket cooler is noisy, where do they draw the line between "quiet" and "noisy"?

 

Without another device for comparison, impressions are quite useless.

This especially applies to something like CPU where differences are within a narrow 20% margin of one another.


No offense, but I've been told by a lot of people over the last two years that my graphics card was more of a bottleneck than my CPU, so I upgraded my graphics card on their advice - and in certain games (Oblivion, Dark Souls, Mass Effect 1, Dragon Age, etc.) I saw absolutely no increase in performance. Those are all notoriously CPU-heavy games. And like clockwork, every time my CPU locks at 50% load, especially in games optimized for 2 cores, my framerate dives.

That's because you're CPU-bound. The Phenom II X4 955 (Deneb core) is 5 years old now; it was made to compete against Intel's Core 2 Quad, and was trailing behind. Only the 6-core Phenoms like the 1100T (Thuban core) could take the fight to the C2Q, but those 6-cores were actually made to compete with Nehalem, and there they flopped. On the graphics side, the GTX 660 goes head-to-head with current-gen mid-range cards, but the CPU is head-to-head with a unit 5 generations older (I may have messed up counting generations, but whatever).

 

Even though they're old as dirt, you can still find Core 2 Quads and Phenom II X4s in gaming computers, but none of them run at stock frequencies anymore. The Phenoms generally run at 3.8-4.4GHz, where they're 20-35% faster and go head-to-head with stock Sandy/Ivy i5s, while your 3.2GHz unit is comparable to a stock Core 2 Quad - and neither a stock PII nor a stock C2Q can handle new games anymore. Put simply, you can't expect a 5-year-old unit to compete with the current generation of processors without overclocking it.

 

As further testing, I used a program called CAR (core affinity something) in Dragon Age. It lets you disable CPU cores in-game to test performance. DA:O is optimized for 3 cores, according to the developers. I normally get 45 fps in the camp at Ostagar, which is the hardest place in the game to run. I disabled 1 of my 4 cores, and the framerate didn't change from 45 - I mean, it may have wiggled between 43 and 46, but that's negligible. I disabled another core, and the framerate dropped from 45 to 35. I disabled another, down to just one core, and the framerate was 15-20. Clearly, I was CPU-bound.

Here's DA:O on an 8320 with a GTX 680 - and that's from Anandtech, which generally makes fun of the FX-series: 130FPS on maxed-out settings (not sure how accurate that is; Anandtech isn't the most reliable, but they're the only ones I could find with a DA:O benchmark). Even Skyrim in their tests runs at some 200FPS on the 8320 and GTX 680, which is close to what Thor claims for his 780 Ti - I still doubt the accuracy of that. Dragon Age (the original, as in 1) is just... ugh, back in the day there wasn't a single CPU it didn't stress to 100%, not one. As for DA II, that one's GPU-bound - a 6990 can't push it past 74FPS on an i7 (a 660 is more or less half a 6990), yet it runs at 60FPS on a Core 2 Duo.

 

The Dragon Age games are weird - one murders graphics cards, one murders CPUs, one runs great on everything. Mind-boggling.

 

I don't actually run PhysX through my CPU, I just tested it in Borderlands 2 for a few minutes.

 

Still, I just checked and I didn't know the Xbone and PS4 were using 8-core CPUs. I haven't checked since the rumors said they'd be quad. That might make it worth going for another AMD CPU, if it reflects on the overall optimization of the industry.

Now the PhysX makes sense. And yeah, the Xbone and PS4 run on an 8-core x86-64 AMD Jaguar SoC, which may translate well to Vishera once the console ports hit the PC market. So far I know of COD: Ghosts being ported to PC, and it uses 4 cores - but it's so badly optimized it runs like crap on more or less anything, even consoles. :laugh:

 

I have a Hitachi HDD now, and I'm not sure I agree with you. It benchmarks pretty terribly on speed, and the return rate on Hitachi drives is about 3.4%, as opposed to Western Digital's 1.5%.

All 3 Hitachi drives in those links have 3/5 stars, with a disproportionately high number of people complaining that they were either DOA or died soon after arrival.

Combine that with the fact that my last Hitachi HDD died after less than 3 years... Western Digital looks like a good option, since the price is a mere $25 higher between the 1TB models.

To be honest, I have a Hitachi in my machine right now (a Hitachi Deskstar HDS72101), and it's a pretty good hard drive. Loud, though, so I stretched two O-rings across the 5.25" bays with zip-ties and suspended it - now it's barely audible. I also had a 320GB Hitachi in my old machine, bought in 2007; I sold it to a friend 3 months ago after assembling my current machine. It worked for over 45,000 hours according to SMART, and it still runs perfectly fine with no signs of failure whatsoever. Before those two I had an old 80GB IDE Hitachi that lived for 5 years - I took it apart and made a clock out of it last year, 'cause no one wanted to buy an IDE HDD.
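For context on that 45,000-hour figure, SMART's power-on-hours counter converts to calendar time easily (the helper below is just illustrative arithmetic):

```python
def power_on_years(hours: float) -> float:
    """Express a SMART power-on-hours reading as years of continuous spinning."""
    return hours / 24 / 365

# 45,000 hours is a bit over 5 years of nonstop operation --
# heavy use for a drive bought in 2007 and sold in 2013.
print(round(power_on_years(45_000), 1))
```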

 

I also had a 250GB Seagate in my last machine that died after two months, and an old IDE Seagate two machines back that died after 6 months. Mobile Seagates seem decent though - I have a Momentus in my laptop that just passed 8 years and 32,000 work hours.

 

Either way, it all comes down to luck - sometimes you get crap and sometimes you hit gold. It's like a lottery.

 

Anyway, I won't be overclocking right away. If there's even a 1% chance of damaging something I'd rather not risk it at this point, as an OC won't be a matter of necessity yet, but rather of preference. Either of those CPUs should already be bottlenecked by my GPU.

Limits, man - limits. As long as you know the hardware's limits, you can't damage anything at all. It's like tuning an engine: you need to know how much pressure the cylinders can take before fiddling with fuel injection and compression. Overdo it and it'll blow up or overheat, but do it right and it'll last for years.

 

And for risk-free, there's overclocking without overvolting. Voltage and heat kill processors, both independently and in combination. But if the processor is cooled well, the voltage is stock, and you only raise the frequency - you're in the clear. Both CPUs you listed come with decent mobos, and the FX is a good overclocker (the i5 is locked). The FX-series are overvolted out of the box and allow some degree of overclocking without risk of damage - on my unit it's 900MHz (4.4GHz on the stock 1.3125V, which is 25%); on some it's less, on some it's more. The bottom line is, you're perfectly fine as long as you don't touch the voltage - then you have a 0% chance of inflicting damage (except on your current mobo, which was bound to die anyway with that Phenom).

 

I have an i7-4930K clocked to 4.9 on water and I'm very often disappointed with the speed.

Next upgrade, Tianhe-2. :tongue:

Edited by Werne

Which model specifically?

What other HDDs did you run the same benchmarks on to come to the conclusion that the Hitachi drive is slower?

I have a 1TB HDS721010CLA. I didn't run most of the benchmarks myself - I looked at a chart a while ago from a site that benchmarked a bunch of HDDs. There are much newer and likely superior Hitachi models than mine, of course, but the HDS721010CLA specifically was among the slowest of all the tested drives.

 

 

 

No offense, but I've been told by a lot of people over the last two years that my graphics card was more of a bottleneck than my CPU, so I upgraded my graphics card on their advice - and in certain games (Oblivion, Dark Souls, Mass Effect 1, Dragon Age, etc.) I saw absolutely no increase in performance. Those are all notoriously CPU-heavy games. And like clockwork, every time my CPU locks at 50% load, especially in games optimized for 2 cores, my framerate dives.

That's because you're CPU-bound. The Phenom II X4 955 (Deneb core) is 5 years old now; it was made to compete against Intel's Core 2 Quad, and was trailing behind. Only the 6-core Phenoms like the 1100T (Thuban core) could take the fight to the C2Q, but those 6-cores were actually made to compete with Nehalem, and there they flopped. On the graphics side, the GTX 660 goes head-to-head with current-gen mid-range cards, but the CPU is head-to-head with a unit 5 generations older (I may have messed up counting generations, but whatever).

 

Even though they're old as dirt, you can still find Core 2 Quads and Phenom II X4s in gaming computers, but none of them run at stock frequencies anymore. The Phenoms generally run at 3.8-4.4GHz, where they're 20-35% faster and go head-to-head with stock Sandy/Ivy i5s, while your 3.2GHz unit is comparable to a stock Core 2 Quad - and neither a stock PII nor a stock C2Q can handle new games anymore. Put simply, you can't expect a 5-year-old unit to compete with the current generation of processors without overclocking it.

 

Yep, thank you for the confirmation. :smile: Even last generation I guessed my Phenom II was often the bottleneck, but there were always three or four people online ready to insist that the CPU didn't matter very much for gaming and that I should upgrade my video card again instead. Ofc that had underwhelming results in many games. That's why I'm definitely upgrading the CPU this time.

 

And I tried OC'ing my Phenom II once, but I suspect my current motherboard was just too weak, or something. :blink: I'm not sure, actually. All I know for certain is that my attempt to overclock to a paltry 3.5GHz without a voltage increase led to looping audio and total system crashes about once a week ever since. I reverted the settings to stock, ofc, and I never touched my RAM speeds. But the issues persist after several hardware re-seats, a reinstallation of Windows and drivers, and a reset and updated BIOS, so at this point I'm certain it's some sort of protracted hardware failure.

 

It didn't help that I probably warped my mobo a bit, when I was more of a noob and I didn't know what motherboard stand-offs were.

Edited by Rennn

Yep, thank you for the confirmation. :smile: Even last generation I guessed my Phenom II was often the bottleneck, but there were always three or four people online ready to insist that the CPU didn't matter very much for gaming and that I should upgrade my video card again instead. Ofc that had underwhelming results in many games. That's why I'm definitely upgrading the CPU this time.

To be honest, I hate the term bottleneck, but whatever. There's a point at which the system becomes "bottlenecked" by one of its hardware components, and where that point sits depends largely on what's being run. For example, your PC will run Skyrim just fine at 30FPS on both the GPU and CPU side, but it will fail to run BF4 with acceptable results because the CPU is the "bottleneck" (I believe BF4 also disables some in-game content on quad-cores, or was it dual-cores?). On the other hand, GPU-bound games will have problems with your card rather than your CPU, because they rely more on GPU muscle.
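One way to make the "bottleneck" idea concrete: each frame costs some CPU time and some GPU time, and whichever stage takes longer per frame caps the framerate. A toy model (all numbers illustrative, and real engines overlap the two stages, so this is a simplification):

```python
def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Framerate is capped by whichever stage takes longer per frame."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

def bottleneck(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> str:
    """Name the limiting component under this simple serial model."""
    return "CPU" if cpu_ms_per_frame > gpu_ms_per_frame else "GPU"

# A CPU-heavy game: 40ms of CPU work vs 16ms of GPU work per frame
print(fps(40.0, 16.0), bottleneck(40.0, 16.0))   # 25 fps, CPU-bound
# Here a faster GPU changes nothing; a faster CPU raises the framerate
print(fps(12.0, 16.0), bottleneck(12.0, 16.0))   # 62.5 fps, GPU-bound
```

This is also why upgrading the non-limiting part yields "absolutely no increase in performance", exactly as described earlier in the thread.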

 

People claim it's better to take a weaker CPU and a stronger GPU because the majority of games don't need that much CPU power and lean on the GPU instead. The problem is, that doesn't always work as expected - that advice assumes you'll overclock the CPU to catch up with the card, which you can't do because your mobo can't handle it. So you're past the point where the GPU determines performance: your card performs rather well, but the CPU is too slow to keep up in certain tasks, even though games that rely heavily on the card and less on the CPU still perform satisfactorily.

 

And I tried OC'ing my Phenom II once, but I suspect my current motherboard was just too weak, or something. :blink: I'm not sure, actually. All I know for certain is that my attempt to overclock to a paltry 3.5GHz without a voltage increase led to looping audio and total system crashes about once a week ever since. I reverted the settings to stock, ofc, and I never touched my RAM speeds. But the issues persist after several hardware re-seats, a reinstallation of Windows and drivers, and a reset and updated BIOS, so at this point I'm certain it's some sort of protracted hardware failure.

Looping sound and system crashes sounds more like your memory is shot rather than your mobo (though it could be the NB). I get the exact same problem because one of my sticks is dying - confirmed in Memtest and Prime95. I'm buying new RAM today, finally got the money for it. Sometimes even the graphics card can cause the same issue, but that's rarer.

 

Anyway, if RAM is the problem, you'll have the same issue on your next upgrade as well - that's why I'd recommend running a Memtest/Prime95 memory stress test overnight, or borrowing some spare RAM from someone to check.
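For the curious, the principle behind a memory tester is simple: write known bit patterns, read them back, and flag any mismatch. The sketch below is absolutely not a substitute for Memtest (it only touches a sliver of RAM the OS hands the process, with no control over physical addresses), it just illustrates the idea:

```python
import array

def quick_ram_pattern_check(megabytes: int = 64) -> bool:
    """Write alternating bit patterns into a buffer and verify them.

    Faulty cells return something other than what was written. Real
    testers like Memtest86 walk all of physical memory with many more
    patterns; this toy version only checks one small allocation.
    """
    n = megabytes * 1024 * 1024 // 4  # number of 32-bit words
    for pattern in (0x55555555, 0xAAAAAAAA, 0x00000000, 0xFFFFFFFF):
        buf = array.array("I", [pattern]) * n   # fill buffer with pattern
        if any(word != pattern for word in buf):  # verify read-back
            return False
    return True

print(quick_ram_pattern_check(4))
```

The 0x5555/0xAAAA pair alternates every bit between 0 and 1, which is why classic testers lean on it to catch stuck or coupled bits.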

 

It didn't help that I probably warped my mobo a bit, when I was more of a noob and I didn't know what motherboard stand-offs were.

That's all good - my Mugen 4 cooler bent the board about 17° around the socket once I tightened it, and I got no issues. Electronic boards are resistant little bastards. I had to take the backplate off and hammer it back into shape. :laugh:

 

And you're not the worst either - I know a guy who screwed the board directly to the case's mobo tray, no standoffs. There were a lot of fireworks the moment he flipped the switch. :laugh:

Edited by Werne
