Cheap Gaming PC Under $600



Additionally, you're kinda ignoring the fact that within the last 5 years there have been several milestone advancements which rendered almost everything before them obsolete: things such as 64-bit operating systems, larger memory address awareness, significantly faster and larger storage systems, and the use of multiple cores/video cards to boost performance.

Not exactly ignoring it. Do you think these advances were the last?

Just because there's no 128-bit computing around the corner (in fact, being 32-bit is the least of the problems with the Pentium 4 build above) doesn't mean progress is over. You're extrapolating the trend of the last 3 years over the next 10, expecting slightly faster cores and maybe 6 of them instead of 4... and you're right: that alone isn't going to change anything.

 

But there's an elephant in the room.

Intel calls it Xeon Phi. AMD calls it HSA or Fusion. Nvidia calls it something else; they're the third player in this market.

All three are different, but the concept behind them is the same: unified general-purpose and graphics processors. It's a disruptive technology that will do to general software what the introduction of the GPU did to 3D graphics.

And it's coming. The APUs in the PS4 and Xbone are already next-generation HSA parts, with a unified memory space, which is a big deal. That alone is already locking existing systems out of possible features, since no current GPU supports it.
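
To make "unified memory space" concrete, here's a minimal sketch of the programming model, using CUDA's managed-memory API (CUDA 6+) as a stand-in for what HSA promises; the kernel and sizes are illustrative, not from any shipping game. One allocation is visible to both sides, with no copy step to schedule around, and a card without support for this simply can't run software built on it:

```cuda
// Minimal sketch: one buffer shared by CPU and GPU, no explicit copies.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= 2.0f;                  // GPU touches the shared buffer
}

int main() {
    const int n = 1 << 20;
    float *data;
    cudaMallocManaged(&data, n * sizeof(float)); // one pointer, valid on both sides
    for (int i = 0; i < n; ++i) data[i] = (float)i;  // CPU writes it directly
    scale<<<(n + 255) / 256, 256>>>(data, n);        // GPU updates it in place
    cudaDeviceSynchronize();                         // wait before the CPU reads
    printf("data[42] = %f\n", data[42]);             // CPU reads the result, no copy
    cudaFree(data);
    return 0;
}
```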

By 2020, you're looking at the PS5, and you're looking at 3-4 HSA generations. Worst-case scenario, you're looking at compilers that freely mix x86 and CL instructions in one line. Best-case scenario, the x86 core is reduced to a stub for legacy purposes, only used to launch the program on the compute-core array.

 

And that's an array you just won't find in 1, 2 or 4 modern video cards; all you'd get is "Application has terminated." By the early 2020s, an awesome i7-3970X build with 4x Titans and 64GB of RAM is going to be reduced to what a 10-year-old 8-socket Opteron server with a 2MB S3 Trio video card is today: an impressively power-hungry way to check your mail.

 

Even if these predictions don't come true and everyone stalls HSA advances (which I doubt: they all know it's the way forward, and "4 big cores" are on the way out), at the very best you'll feel the way you'd feel today with a DirectX 8 (at best 9.0a) video card. The next DirectX, OpenGL or OpenCL will arrive sooner or later, certainly within this decade, and probably more than one of them. And there won't be a firmware patch that lets a 780 support what it can't support in principle.

 


 

Additionally, you're kinda ignoring the fact that within the last 5 years there have been several milestone advancements which rendered almost everything before them obsolete: things such as 64-bit operating systems, larger memory address awareness, significantly faster and larger storage systems, and the use of multiple cores/video cards to boost performance.

Not exactly ignoring it. Do you think these advances were the last?

Just because there's no 128-bit computing around the corner (in fact, being 32-bit is the least of the problems with the Pentium 4 build above) doesn't mean progress is over. You're extrapolating the trend of the last 3 years over the next 10, expecting slightly faster cores and maybe 6 of them instead of 4... and you're right: that alone isn't going to change anything.

 

But there's an elephant in the room.

Intel calls it Xeon Phi. AMD calls it HSA or Fusion. Nvidia calls it something else; they're the third player in this market.

The problem is not that some of these things are coming, but rather how quickly developers will embrace them as part of the mainstream, or deal with the related costs of doing so. Given how long the majority of developers have continued to make games that only utilize one core, or which are built around 32-bit systems, it'll probably be a very long time before any company starts to actually make use of such hardware. Given Microsoft's and EA's posturing, though, the push to cloud computing will probably have a much larger impact and potentially render most of the impact of any UGPU moot for anything except indie titles or graphics production.

 

I'll give you that it's a bit of a gamble, but it's a pretty safe bet considering the current trends.


The problem is not that some of these things are coming, but rather how quickly developers will embrace them as part of the mainstream, or deal with the related costs of doing so. Given how long the majority of developers have continued to make games that only utilize one core, or which are built around 32-bit systems, it'll probably be a very long time before any company starts to actually make use of such hardware.

 

They already do: Intel's Quick Sync, DXVA and CUDA are used in modern video players, in graphics software, in any serious computing software.

 

They aren't running entirely on the GPU, nor will software necessarily ever do so, but it takes precious little to make a piece of hardware obsolete. It only takes a new version of CUDA or OpenCL for software relying on it to stop running on older cards.
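
This is all it takes on the software side. An illustrative sketch; the 3.5 cutoff here is an arbitrary example, not any specific product's requirement:

```cuda
// Illustrative only: how software gates itself on GPU generation.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    cudaDeviceProp prop;
    if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
        fprintf(stderr, "no CUDA device found\n");
        return 1;
    }
    int cc = prop.major * 10 + prop.minor;  // e.g. compute 3.0 -> 30
    if (cc < 35) {
        // An older card fails this check forever; no driver update
        // can add the missing hardware features.
        fprintf(stderr, "%s is compute %d.%d, need 3.5+\n",
                prop.name, prop.major, prop.minor);
        return 1;
    }
    printf("%s supported (compute %d.%d)\n", prop.name, prop.major, prop.minor);
    return 0;
}
```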

 

This year already, the PS4 and XB1 with their next-generation APUs will strike the 7970 and GTX 780 from the list of feature-complete devices and put them on the road to obsolescence.

When will the GTX 770 be obsolete? For computing, it already is. When will it become impractical for playing at least some new titles? I can't tell exactly, but it's more likely before 2020 than after, and within the next 10 years for sure.

 

As for the past history of progress: Oblivion used two cores as early as 2006, well before the average person had two. Later progress was held up by the lack of tools and standards. Even so, we had games that worked best on 4 cores before the average PC had 2, and as of 2013 we finally have games that run best on 8 cores, when the average is perhaps 2.5 cores and 3.5 threads. As for 32-bit vs 64-bit: why bother? There was simply never a significant advantage to going 64-bit for the purposes of a game, until small advantages appeared recently.

It's not as slow as it seems. It's just that games have been designed around the PS3/360... well, now they'll be designed around a next-gen HSA AMD chip.

 

Increasing GPGPU usage isn't even nearly as big a change as going multi-threaded was. It has been limited to particle physics because of what a pain it is to pass data back and forth - it was easier to just compute on the CPU. Now that the pain is gone, we're going to see, for instance, real physics using it. Then collision detection. Then AI pathfinding. More and more compute bits. And at some point, a CPU fallback just isn't going to keep up.
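
For anyone who hasn't written GPGPU code, the pain is the copy-in / compute / copy-out round trip. Here's a hypothetical per-frame physics step (the Body struct, integrate kernel and sizes are made up for illustration); for short jobs the two PCIe transfers can cost more than the kernel itself, which is exactly why only self-contained work like particle effects moved over first:

```cuda
// Hypothetical per-frame physics step showing the classic discrete-GPU
// round trip. Everything here is illustrative, not from a real engine.
#include <cuda_runtime.h>

struct Body { float x, y, z, vx, vy, vz; };

__global__ void integrate(Body *b, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        b[i].x += b[i].vx * dt;
        b[i].y += b[i].vy * dt;
        b[i].z += b[i].vz * dt;
    }
}

void step_frame(Body *h_bodies, Body *d_bodies, int n, float dt) {
    size_t bytes = n * sizeof(Body);
    // 1. Ship the scene state across the PCIe bus...
    cudaMemcpy(d_bodies, h_bodies, bytes, cudaMemcpyHostToDevice);
    // 2. ...run a few microseconds of actual math...
    integrate<<<(n + 255) / 256, 256>>>(d_bodies, n, dt);
    // 3. ...and ship it all back so CPU-side game logic can read it.
    cudaMemcpy(h_bodies, d_bodies, bytes, cudaMemcpyDeviceToHost);
}

int main() {
    const int n = 4096;
    Body *h = new Body[n]();            // zero-initialized host copy
    Body *d;
    cudaMalloc(&d, n * sizeof(Body));   // separate device copy
    step_frame(h, d, n, 0.016f);        // one 60 Hz frame
    cudaFree(d);
    delete[] h;
    return 0;
}
```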

 

It's not all about Moore's Law, smaller transistors and higher clock rates. Modern CPUs are extremely inefficient: much less than 1% of a modern x86 CPU is actually processing, and less than 0.1% is doing the actual computing work. In efficiency terms it's not even a steam engine; we have a way to go before we get there. Optical and quantum computing are likely to remain useless for general-purpose work for decades (QC will mostly just help with code-breaking).

 

The only currently effective way forward is to put some of the remaining 99.9% to work; Intel knows it, AMD knows it, M$ knows it. And GPGPU architectures are what make productive use of a larger fraction of the chip. Because more of the chip is actually doing work, it has to be clocked lower, but even that is enough to leave x86 CPUs, except for Xeon Phi, in the dust.
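
To put rough numbers on that (ballpark peak figures, not sustained rates): an i7-3970X is 6 cores × 8-wide AVX × 2 FP ops per cycle × 3.5 GHz ≈ 0.34 single-precision TFLOPS, while a GTX 780 is 2304 shaders × 2 FLOPs per FMA × ~0.9 GHz ≈ 4 TFLOPS. Roughly twelve times the throughput at about a quarter of the clock speed.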

 

But GPGPU computing is severely limited by the GPU's graphics roots. That is the problem currently being solved, and it's being solved at a rapid pace. Someday, someone will want to design a PS4 game with PC-competitive visuals, and since they can't replace its hardware with even more steam engines, they'll have to make use of the plentiful and accessible potential it already offers.

 

I'm easily willing to bet that this day will come well before the PS5, and well before 2020 in particular. "AAA" game makers compete on it. And when it comes, a lot of $5,000 rigs will have their inevitable turn into pumpkins sped up.

Given Microsoft's and EA's posturing, though, the push to cloud computing will probably have a much larger impact and potentially render most of the impact of any UGPU moot for anything except indie titles or graphics production.

Cloud computing is out there in the clouds. In terms of gaming, all it amounts to is poor-quality streams with barely-playable input-to-video delay: a me-too option for the most undemanding part of the populace, and for the rest of us a moderate improvement over watching a video of the gameplay before buying.

 

To make remote gaming competitive, you would first need to retire the whole TCP/IP protocol stack. More realistically, if you can live with never winning a shooter match this way, image quality could be made passable with better video codecs - but those will require a GPU-based solution to run. Data networks cannot push uncompressed data at display resolution; they use a finite resource (frequencies). Significant improvements in compression will require a rise in computing performance well beyond what modern CPUs have available or what modern GPUs can deliver.
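
The arithmetic (rough figures) is brutal: 1920 × 1080 pixels × 24 bits × 60 fps ≈ 3.0 Gbit/s of raw video, while a typical "HD" streaming budget is on the order of 10-20 Mbit/s. The codec has to throw away more than 99% of the data, every frame, in real time.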

 

It may end up dominating "social gaming". Pink animated ponies compress beautifully and don't care about lag. But you're not looking at the next Crysis or Battlefield becoming a cloud-only title, not if they want to remain the benchmarks.

For all its money, EA is little more than a reseller. The actual developers are working on things like Unreal Engine 4, which will rely on next-gen console features - and not a word about clouds.


No, the APUs are mid-range CPUs with built-in integrated graphics. The 7950s were ahead of their time and are still considered ahead, since one can keep up with a GTX 680, and in CrossFire they trail a GTX 690 by only about 2%, which doesn't mean much.

 

You believe that the APUs are making the 7950s and 7970s obsolete, but the APUs have nothing on them. One reason they are so good in the PS4 and Xbox One is the firmware they are using.

 

If you think about it, the 7900 series cards were like the Titan when they first came out.


No, the APUs are mid-range CPUs with built-in integrated graphics. The 7950s were ahead of their time and are still considered ahead, since one can keep up with a GTX 680, and in CrossFire they trail a GTX 690 by only about 2%, which doesn't mean much.

Actually, if the 8-core claim is true, they're lower-clocked top-of-the-line CPUs with integrated graphics that performs about like a 7870 and has newer features.

 

But it's not about power. The next generation of HSA can do computing tasks the current generation won't run. It's not the DirectX/OpenGL side where today's build will have trouble; it's the CUDA/OpenCL side. It doesn't matter how powerful your processor is if it doesn't support the commands it's asked to perform.
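
Concretely (an illustrative sketch, not from any shipping title): a binary built only for a newer GPU architecture, say with nvcc -arch=sm_50, fails at launch on an older card, and no driver update changes that:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

__global__ void feature_kernel() {}  // stands in for any new-architecture code

int main() {
    feature_kernel<<<1, 1>>>();
    cudaError_t err = cudaGetLastError();
    if (err != cudaSuccess) {
        // On a card older than the -arch target this reports
        // "no kernel image is available for execution on the device".
        fprintf(stderr, "launch failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    cudaDeviceSynchronize();
    puts("kernel ran");
    return 0;
}
```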

 

Next-generation AMD chips - the HD 9000 series; the 8000 series is being skipped - will support unified memory. Next-generation Nvidia chips - Maxwell, or the 800 series, due in 2014 - will include an ARM CPU core and support unified memory. The next Intel tick: no promises yet, but they're adding the latest DX to their IGP, so probably yes too.

 

From that point on, it's just a matter of time before advanced features and middleware require it in order to work or to deliver their full potential. And it's not a matter of 10 years; in 10 years, the PS4 and Xbone will themselves have been retired in favor of new platforms.

 

 

If we had a real-life connection to ensure the payoff, I'd actually be willing to make a bet: if in 5 years NV Kepler still plays all new games without loss of visual quality, I'll give you two GTX 680 Lightnings, original unlocked-voltage run, with EK-FC680 blocks, and an 1155 SLI platform; if it doesn't, you buy me a single top-of-the-line video card of that time. Or just an unconditional deal to trade them in 5 years, since I believe they'll be mere curios by then and you believe they'll still be rocking.

