The last poster wins


TheCalliton


@Thor Absolutely not, the PS4 will not have 14GB of RAM. The guy is just saying that the large (8GB) VRAM pool on the PS4 is nothing innovative, since AMD's GCN workstation cards already come with 6GB of GDDR5.

 

He's comparing the PS4's VRAM size with workstation card VRAM size, that's all; the PS4 still has an 8GB shared GDDR5 pool.

Edited by Werne

I read into it, lol. You're right, but it all seems like a marketing scam from Sony if you ask me.

 

They're trying to hype something that's obviously inferior to everything else out at the time, especially when the 8 cores have been downclocked to 1.6GHz, down to a cellphone level. Not joking about that, it's comparable to a high-end smartphone.

They probably could have done better with an 8-core ARM processor at the same clock speed; it certainly would run a lot cooler :yes: . Also, the ARM Cortex CPUs have the same x86 architecture as Intel and AMD.

Though with 8GB of GDDR5 memory at least the latency would be good, it doesn't necessarily mean better FPS.

 

Also, the memory clock speed of 240GHz is better than the Xbone's, but it's still uncertain what the final clock speeds and bus speeds may be.

Noting that Killzone: Shadow Fall is struggling at 1080p 30fps, that could be a sign, especially with the development cycle. Though graphically it compares to Crysis 3, which is a great start for a first-gen next-gen console game, if you like to call it that, since all that hardware doesn't have the bottleneck of an OS like Windows installed.

Edited by Thor.

Well, I actually like the hardware in those things. The octa-core Jaguar in them is a downclocked FX 8320, meaning my overclocked 8320 should be able to tag along until the PS4/Xbone generation of consoles is replaced. The FX series CPUs can run pretty cool if you optimize them: I have my FX 8320 overclocked to 4.2GHz (FX 8350 Turbo speed) at 1.308V, at which it doesn't go over 50°C at all. At 3.5GHz I can undervolt to 1.176V, at which it runs at a peak temperature of 36.3°C. Downclocked to 1.6GHz with a low voltage, it should run pretty damn cool.
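
If anyone wants to check what their own chip is doing while tuning like that, here's a rough C sketch that reads core 0's current clock and a thermal-zone temperature on Linux. The sysfs paths are an assumption on my part and vary by kernel and driver, so treat it as a starting point, not gospel.

/* Rough sketch: read CPU clock and a temperature reading from sysfs.
 * Linux-only; paths are assumptions and differ between systems.
 * Build: cc -o cpustat cpustat.c */
#include <stdio.h>

static long read_long(const char *path)
{
    FILE *f = fopen(path, "r");
    long v = -1;
    if (f) {
        if (fscanf(f, "%ld", &v) != 1)
            v = -1;
        fclose(f);
    }
    return v;
}

int main(void)
{
    /* current clock of core 0 in kHz (cpufreq-enabled kernels only) */
    long khz = read_long("/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq");
    /* millidegrees Celsius; thermal_zone0 is not guaranteed to be the CPU sensor */
    long mdeg = read_long("/sys/class/thermal/thermal_zone0/temp");

    if (khz > 0)
        printf("core 0 clock: %.2f GHz\n", khz / 1000000.0);
    if (mdeg > 0)
        printf("thermal_zone0: %.1f C\n", mdeg / 1000.0);
    return 0;
}

Run it while you stress the CPU and you'll see the clock and temperature move around the kind of numbers I quoted above.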

 

But the PS4's use of a shared memory pool and GDDR5 as main system RAM is a bit strange. GDDR5 is memory optimized for graphical tasks, not general computing, and the memory is shared between the CPU and GPU, meaning you don't get the full 8GB as system RAM. Color me dubious on that one.

 

Anyhow, my favorite thing about the consoles - we'll see games optimized for octa-core CPUs with Bulldozer architecture, which will give great performance on FX 8320/8350 processors.
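
Just to illustrate what I mean by "optimized for octa-core": here's a toy C/pthreads sketch of splitting a job across eight worker threads, the way an engine targeting eight cores would farm out work. Purely illustrative, not from any console SDK or real engine.

/* Toy example: spread a workload over 8 worker threads, one per core.
 * Build: cc -pthread -o workers workers.c */
#include <pthread.h>
#include <stdio.h>

#define NUM_WORKERS 8            /* one worker per core on an 8-core CPU */
#define ITEMS_PER_WORKER 1000000L

static double partial[NUM_WORKERS];

static void *worker(void *arg)
{
    int id = *(int *)arg;
    double sum = 0.0;
    long i;
    /* each worker chews through its own slice of the workload */
    for (i = 0; i < ITEMS_PER_WORKER; i++)
        sum += (id * ITEMS_PER_WORKER + i) * 0.5;
    partial[id] = sum;
    return NULL;
}

int main(void)
{
    pthread_t threads[NUM_WORKERS];
    int ids[NUM_WORKERS];
    int i;
    double total = 0.0;

    for (i = 0; i < NUM_WORKERS; i++) {
        ids[i] = i;
        pthread_create(&threads[i], NULL, worker, &ids[i]);
    }
    for (i = 0; i < NUM_WORKERS; i++) {
        pthread_join(threads[i], NULL);
        total += partial[i];
    }
    printf("total from %d workers: %f\n", NUM_WORKERS, total);
    return 0;
}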

 

 

 

Also, hardware aside, the PS4 has an advantage over the Xbone - the OS. The PS4 runs CellOS, which is a descendant of FreeBSD, a stripped-down UNIX kernel with some elements of Linux. The Xbone runs on Windows 8, which sucks even on PCs. The difference? UNIX and Linux operating systems have a lower memory and processor footprint than Windows 8.

 

For example, Debian Wheezy with GNOME 3 Classic takes up 250MB of RAM on average, 387MB peak, while Windows 7 on the same hardware takes up 1.17GB on average, 1.53GB peak. One of the largest differences between the two is that Windows increases its RAM usage the more RAM you have; with 8GB it'll use around 2GB on average, while UNIX/Linux don't bloat in RAM and use the excess free memory only for caching. On the CPU side, the Windows 7 GUI (idle) uses about 3-5% CPU on average (distributed among 8 cores), while Debian's GUI (again, idle) uses less than 1%, and that's with the fancy Compiz 3D effects.
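
If you want to sanity-check those RAM numbers on your own Linux box, here's a quick C sketch using the Linux-specific sysinfo(2) call to print the same kind of totals; free(1) or /proc/meminfo would tell you the same thing.

/* Quick check of total/free/buffered RAM via sysinfo(2). Linux-specific
 * sketch, not how either OS accounts for memory internally.
 * Build: cc -o meminfo meminfo.c */
#include <stdio.h>
#include <sys/sysinfo.h>

int main(void)
{
    struct sysinfo si;
    double to_mib;

    if (sysinfo(&si) != 0) {
        perror("sysinfo");
        return 1;
    }

    /* mem_unit converts the raw counters to bytes on modern kernels */
    to_mib = si.mem_unit / (1024.0 * 1024.0);

    printf("total RAM : %.0f MiB\n", si.totalram  * to_mib);
    printf("free RAM  : %.0f MiB\n", si.freeram   * to_mib);
    printf("buffers   : %.0f MiB\n", si.bufferram * to_mib);
    return 0;
}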

 

So if the PC OS behavior carries over to consoles, and it should since they're basically using PC hardware, the difference in usable system memory and CPU power is significant. While on Win7 you have about 2.5-2.9GB usable out of 4GB of RAM, on Debian you have 3.6-3.8GB ready at all times. UNIX/Linux systems also don't have "system reserved memory", meaning all of the RAM is free if you need it; Linux can even run off the HDD directly on a PC that has no RAM modules installed. The difference in CPU usage is large as well: UNIX/Linux can, due to the way they're designed, shut down all but the necessary programs while a fullscreen 3D application is running, freeing up additional resources.

 

All in all, the consoles are pathetic, and I can see my PC being able to handle whatever console-port game developers throw at it until the PS4/Xbone get replaced. That is, as long as my graphics card can take it.

Edited by Werne
