Dan3345 Posted December 22, 2012

The thing is... AMD developed integer cores first. Intel perfected the technology and called it hyperthreading. AMD has had it for about 14 years but for some reason is only getting around to really using it in the last three or four.

Now the main difference between the two companies is that one of them (Intel) is honest about it. Intel does not claim to have a physical eight-core (though the 3930X is an eight-core with two cores laser-etched off the die, separating them from the rest of the CPU), while AMD does claim it. Doesn't bother me though, I know the difference between a real eight-core and what AMD makes. But if you ever wondered why the six-core Intel CPU was $1000, it's because it actually has six cores active on it, and the die does have eight; the extra two are turned off and, as I said, removed permanently by laser etching. I'm not sure why Intel did this, but my guess would be they were worried about the 8350 or the 8120, so they made the 3930X to keep in their back pocket just in case. Guess they decided they never needed it.
Thor. Posted December 22, 2012

Makes me wonder why they didn't take full advantage of hyperthreading, then, if they invented it; it would have given them the edge over the competition. If it works for Intel it should work for AMD, especially when AMD invented hyperthreading. If you can't join 'em, beat 'em with the same exact tech, but better. If you think about it, AMD did have the upper hand, but decided not to use it. Save it for a rainy day... like bankruptcy.

Edited December 22, 2012 by Thor.
Dan3345 Posted December 22, 2012

Not exactly. It's true AMD did invent the idea of hyperthreading, but they called it integer cores / split processing. Intel developed hyperthreading on their own. AMD has never been as large as Intel or commanded the market Intel does, so when Intel developed hyperthreading it was inevitable they would take it much farther, given their resources. AMD simply doesn't have the resources to keep up in R&D with Intel.

In my opinion AMD would be much better served to pull out of enthusiast CPU development entirely and focus on video cards, which, with the 7000 series, they have been doing a very good job with. Of course this would really hurt their stock for a bit, but eventually, once people saw the capabilities of their video cards with all their resources poured into them, I think AMD could be as valuable as Nvidia is.

The sad fact is that the way AMD is conducting business now (fighting a losing war with Intel), they are only going to keep devaluing the company. No company will want to buy them when it looks like they can't produce anything worthwhile. And the GPU market isn't what it once was, so even if AMD did focus on high-end GPUs and made more great ones, I don't know if that alone can save them.

However, AMD does still have a chance. Keep making great GPUs, keep up the great drivers (the old ones pre-12.11 were crap and everyone knows it), stop making enthusiast CPUs, and they will be fine. You see, you can't say on one hand that you are done competing with Intel and then release an 8-core "i7 killer." Contradictory.
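For what it's worth, part of the argument in this thread is just about what "core count" means to the OS. A minimal sketch in Python (the topologies below are illustrative assumptions, not measurements): an SMT/Hyper-Threading chip multiplies threads per core, while an FX-8350 reports 8 because each of its four modules carries two integer cores.

```python
import os

# Illustrative sketch of why "8 cores" is ambiguous. The OS counts
# *logical* processors: a Hyper-Threaded quad shows 8 because each core
# runs two hardware threads; an FX-8350 shows 8 because each of its four
# modules holds two integer cores. Example topologies are assumptions.

def logical_processors(cores, threads_per_core=1):
    """Logical CPUs the OS would report for a given topology."""
    return cores * threads_per_core

print(logical_processors(4, threads_per_core=2))  # i7 with HT: 8 logical
print(logical_processors(8))                      # FX-8350: 8 logical
print(os.cpu_count())                             # whatever this machine has
```

Both chips look identical to a naive `os.cpu_count()` check, which is exactly why the "real eight core" debate above never resolves on numbers alone.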
Thor. Posted December 22, 2012

But that would hurt the PC market in its entirety. Intel would stall their new CPUs and focus on selling their latest models; innovation would be threatened. I doubt they would pull out of the market any time soon, at least until some other chip maker like Nvidia or IBM starts selling CPUs for the PC market. I would buy into Nvidia, that's for sure, if they went the way of the PC; they have some of the fastest CPUs out there.

Edited December 22, 2012 by Thor.
FMod Posted December 22, 2012

AMD has long since pulled out of the enthusiast CPU segment. The Bulldozer architecture was designed for servers, where it's pretty successful; FX is just a "trickle-down," similar to Intel's SB-E. They didn't really spend extra engineering effort other than to cut off some cores in lower-spec versions.

As Intel stands now, it doesn't take as much as it seems to beat them. Intel has been selling almost the same CPU for the last... what was it...

2011 - 4 cores, 3.4 or 3.3 GHz, +4 Turbo, K-only OC, 8 or 6 MB cache, HT on or off (SB).
2012 - 4 cores, 3.5 or 3.4 GHz, +4 Turbo, K-only OC, 8 or 6 MB cache, HT on or off (IB).
2013 - 4 cores, 3.5 or 3.4 GHz, +4 Turbo, K-only OC, 8 or 6 MB cache, HT on or off (Haswell).

For how long will this go on? Broadwell specs 2 cores for ultrabooks and 2/4 cores for desktops. If I may hazard a guess:

2014 - 4 cores, 3.6 or 3.5 GHz, +4 Turbo, K-only OC, 8 or 6 MB cache, HT on or off (Broadwell).

We might never even see a 6-core mainstream desktop part; the extra die area goes to the iGPU. So as Intel throttles its expansion to more cores, the performance gap can keep shrinking.

However, Intel has been exploiting the market's now-fading attachment to x86 and the de facto total use of OOP, anti-parallel by its nature and thus with its performance governed by a single core. This gives it a niche, and Intel is holding on to it. You don't get 55% profit margins and a 40% share of BOM in ultrabooks by competing in ARM against a dozen other vendors.

Currently AMD is continuing, with A LOT of bumps, on its Fusion roadmap. The idea is to continue the integration of CPU and GPU in the form of an integrated GPGPU, and ultimately unification between CPU and GPU units. The end result would have a base processor serving as an x86 translator for compatibility, with any mass calculations done on the GPGPU, integrated or otherwise.
"Also the 8350 may read thermals strangely considering it's a quad core pretending it's an eight."

By the definition of a core, Bulldozer is an 8-core part. The x86 translator is not part of a processing core. Neither is the FPU, although it's bridged or split into two FPU units depending on the requirements. It's a slower core, that's all. You should see the confusion with GPU cores.

Thermal sensor readings are off all the time; it's not common to get a clear picture. But temperatures such as described aren't possible. The lowest delta-T for CPU waterblocks on a high-wattage CPU is 40 K (!) - see http://www.xtremesystems.org/forums/showthread.php?253470-Review-22-CPU-Waterblocks-tested-Roundup

That means that in a 15 C room, the lowest temperature the CPU itself could reach under burn load with a state-of-the-art water cooling system is 55 C - and that's only if the radiators were perfect, i.e. brought the water down to the same 15 C as the room. In practice it's at least 65-70 C. Idle temps can be as little as 5-10 K above ambient, but not lower than that.

Edited December 22, 2012 by FMod
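FMod's arithmetic is easy to sanity-check. A minimal sketch in Python: the 40 K block delta-T is the full-load figure cited from the linked roundup; the radiator "approach" values (how far the water stays above room temperature) are illustrative assumptions.

```python
# Minimal sketch of the water-cooling arithmetic above. CPU temperature is
# bounded below by: ambient + radiator approach (water rise over room
# temperature) + waterblock delta-T. The 40 K block figure comes from the
# linked roundup; the approach values are assumptions for illustration.

def min_cpu_temp(ambient_c, radiator_approach_k, block_delta_k=40):
    """Lowest CPU temperature a loop can reach under full load."""
    water_temp_c = ambient_c + radiator_approach_k
    return water_temp_c + block_delta_k

print(min_cpu_temp(15, 0))    # perfect radiator in a 15 C room: 55 C floor
print(min_cpu_temp(15, 10))   # more realistic ~10 K approach: 65 C
```

The point of the sketch is that a sub-ambient-looking load reading (like a 22 C "load" temperature in a 20 C room) can only be a sensor artifact, not a real temperature.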
Dan3345 Posted December 22, 2012

Thor, that's not entirely true. ARM has been looking for an entry into the desktop CPU market for years now, and if AMD were to leave, ARM would surely step in.

FMod, you got me there. The only thing Intel can claim every year is slight improvements to their architecture and some pseudo-gimmicky features. However, you have to admit it's kind of funny how little effort Intel has put into a chip for a while now, and how easily they still remain faster than AMD chips.
Thor. Posted December 22, 2012

Still, that's a sign innovation has already stalled, and the CPU market is certainly not doing as well as it should; the lack of true innovation from Intel is the sign. They should be competing against IBM or Nvidia for results. AMD is not really focusing on beating Intel in performance, but rather on quality and price. It was never really a good comparison to begin with, except maybe for their early CPUs.

Although maybe the 8350 is not a good example of not competing against Intel; clearly they tried to make a decent chip this time, and had good results. The reviews are quite good for the 8350 so far. It might be a sign of change, especially now that they've added hyperthreading to their chips, and I'm quite impressed with it so far. I had a 50 fps jump since installing the 8350.

Edited December 22, 2012 by Thor.
FMod Posted December 22, 2012

But who needs CPU performance these days? A small bunch of modded Fallout/Skyrim gamers... anyone else? And professional users, but Intel would rather have them buy server parts; TXT isn't locked on the K series for no reason.

The performance desktop CPU market is all but dead. The few top CPUs that do get out are like a bone thrown to a dog. People who need real performance are choosing between custom RISC and GPGPU solutions. Here's the current list: http://www.top500.org/lists/2012/11/

1 - Tesla + Opteron
2 - Power BQC
3 - SPARC64

Don't expect powerful GPGPU customer solutions from Nvidia, though. They've got Tesla for $5,000, and they don't want $500 cards to cannibalize its sales. General computing capability in the 600 series got nerfed, not improved. NV today is getting a lot like Intel: they are holding back on their mass solutions so as to segment the market and maximize the margins.

And all this is happening amidst the ultra-tablet-phone crowd chanting "who needs performance, om nom nom." So 3/4 of the customers don't want it, and vendors don't want to give it, not even to the 1/4 that do want it. Things are not looking good. It's not AMD, or Intel, or x86, or even CISC that is going down; it's performance computing. How many of you are taking a Concorde to your winter vacation destination this year?
Thor. Posted December 22, 2012

Don't get me started on innovation and consoles. Except when Steam brings out their new console: apparently it's going to be using the latest OpenGL 4.3, which competes directly with DirectX 11. Console ports are the culprit when it comes to stifling innovation.

Edited December 22, 2012 by Thor.
hoofhearted4 Posted December 23, 2012

I use CPUID to monitor my CPU temps. I'm on the stock heatsink atm, no OC or anything, and my average is right between 22-23 C (at idle). My room temp is probably somewhere around 20 C, give or take.