DarkWarrior45 Posted April 7, 2010

For what most people do, my three-year-old Athlon 5200 X2 runs Windows 7 better than a lot of newer PCs, especially in 64-bit. So a dual core is more than enough for most people. But I don't use my computer just for media and internet surfing. I'm an engineering major, and earlier this week I made an Intel E8400 dual-core Wolfdale system with 4 GB of RAM completely crash and blue-screen by running a large simulation in PSpice (simulating a custom microprocessor). I also have the Unreal Development Kit loaded on my Athlon machine with 6 GB of RAM, and it's a tad laggy (but tolerable) with a large scene loaded. Again, multithreading helps (but so does more RAM). Applications such as Photoshop, Visual Studio, NI Multisim, MATLAB, Intel Parallel Studio, video encoding software, AutoCAD, and some virtual machines use, and need, as many physical cores as they can get. And that's only to name a few. And by the way, games are now taking advantage of multithreading; Sins of a Solar Empire is one example.

Now, 8-core computing has been out for a while; this is nothing new. Mac Pros, along with other workstations, have had it for at least two years now. How? Dual quad cores. Most Linux distros can take advantage of the extra cores. Mac is there for the Mac Pro line. Windows is getting there, but it will take a while for Microsoft to convert all of their code to parallel processing.

Parallel programming is more complicated: the risk of data corruption grows with the number of parallel processes you have. In fact, I wouldn't be surprised to see ECC memory make a comeback on the desktop line. But I fully expect dozen-core processors not too far in the future.
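The data-corruption risk mentioned above boils down to unsynchronized access to shared state. Here is a minimal sketch in Python (the counter and worker names are illustrative, not from any application discussed here): the increment is a read-modify-write, so without the lock two threads can read the same value and one update is silently lost.

```python
import threading

# Shared counter incremented by several threads. "counter += 1" is a
# read-modify-write sequence, so without a lock two threads can read
# the same value and one update is lost (a data race).
counter = 0
lock = threading.Lock()

def worker(n):
    global counter
    for _ in range(n):
        with lock:  # serialise the read-modify-write
            counter += 1

threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000 with the lock held; often less if it is removed
```

With the lock removed, the final count is nondeterministic, which is exactly why scaling code to more cores takes real engineering effort rather than a recompile.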
nosisab Posted April 7, 2010

You make a good point here: increasing the number of cores does not help if the software does not scale to use them. Still, there is not much left to be done in increasing clock speed by now; it is already approaching quantum-scale limitations.
Computers can be designed for many specialized functions, too. Servers have very specific needs and do not require the same flexibility as a generic gaming/professional/web-browsing desktop, so it's no surprise those 12-core chips are tuned for them.

That takes us back to my earlier post. Generic machines, and especially simulators, benefit from parallel computing, and video cards are far ahead here: some present over a thousand simple processors, and others hundreds of cores able to run C/OpenCL/CUDA code. The near future seems to point toward physics processors becoming as standard as math processors are today. I'm actually from the time when the math processor was a co-processor, a totally external chip with its own socket/slot and such; most of you are so accustomed to math-capable CPUs that you may think it was always there :smile:

Integration with the video hardware, and even the sound hardware (among other things), will allow that parallel processing to achieve incredible results in simulations (and games). That integration can take at least two forms: one-directional and bidirectional. The first does not return values to the CPU, while the second does, allowing the CPU to use that information to adjust the 'reaction'; for example, what happens if the player is hit by a random fragment from an explosion he himself caused? (Notice this already happens in some games, but in a very limited way; mostly only a blast radius is taken into account, and it is a precomputed condition rather than a result returned from external hardware.)

Parallel processing is here to stay, or at least some hybrid form of it.
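The "many simple processors" model those video cards use can be sketched in miniature. This is a hedged illustration of the data-parallel programming style, not real GPU code: a single small "kernel" is applied independently to every element, so the work can be spread across many workers (Python threads stand in for GPU cores here, showing the model rather than the speed; the kernel itself is a made-up per-element operation).

```python
from concurrent.futures import ThreadPoolExecutor

# GPU-style data parallelism in miniature: the same small "kernel"
# runs independently on each element, with no shared state between
# invocations, which is what lets hundreds of simple processors
# execute it at once.
def kernel(x):
    return x * x + 1  # hypothetical per-element operation

data = list(range(8))
with ThreadPoolExecutor(max_workers=4) as pool:
    result = list(pool.map(kernel, data))

print(result)  # [1, 2, 5, 10, 17, 26, 37, 50]
```

Because each invocation is independent, there is no data-race risk of the kind threaded shared-state code suffers from, which is why this style maps so well onto thousand-core hardware.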