dncracknell Posted December 16, 2015

If anybody is interested in this, it's going to be very interesting. I've just found out (because I didn't understand what it was) that "PCI-e" - the link your GPU has to the rest of your system - is going to be relatively ignored (abandoned, at least for GPU applications) somewhere around Q3 2016.

PCI-e 3.0 (I'm on the "2.0" 8GB/sec bi-directional standard) is a 16GB/sec bi-directional data bus that your GPU (itself having an average of 224GB/sec bandwidth) sits in, and it is largely wasting its time sitting there, on a badly outdated technology that keeps your GPU from doing its thing 90% of the time. So game developers use game engines that operate within that outdated functionality so you don't notice how sh*- it is.

In 2016, both nVidia and AMD are moving to their own high-bandwidth connector standards, which I just found out an hour ago have bi-directional data speeds of 80GB/sec and 100GB/sec respectively. nVidia claims that their NVLink will hit 200GB/sec in some enterprise applications (I thought NVLink was a new GPU SLI thing, whatever - doh).

It's probably going to mean a fairly global PC upgrade (always the case with these things). But it also means that the days of "game engine complaints of it not being optimized" will be a thing of the past, even with 4k textures and 4k resolutions above 60fps. Skyrim's game engine is not really the problem (or any game engine for that matter - they'll go as fast as your hardware allows). It's PCI-e you should point your fingers at.

Overclock that CPU to 5GHz today; it'll barely do anything. Overclock your GPU to 1.5GHz; it's still sitting in a PCI-e slot with a diabolically narrow bandwidth. Did you know some enterprise SSDs can reach the speed of PCI-e 2.0? That's f* up when you think about it. And a 6Tflop GPU (12Tflops in 2016) might be sitting in that?

Disclaimer: I'm not technical. I just try my best to sound like it.
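As a sanity check on the figures being thrown around above, here is a minimal back-of-envelope sketch. The per-lane transfer rates and encoding overheads are the published PCIe specs for an x16 slot per direction; the 224GB/sec number matches a GTX 980-class card's GDDR5 memory bandwidth and is quoted only for comparison, since that traffic happens on the card, not over the slot.

```python
# Back-of-envelope PCIe link bandwidth, per direction, for an x16 slot.

PCIE_GENS = {
    # generation: (transfer rate in GT/s per lane, encoding efficiency)
    "PCIe 1.0": (2.5, 8 / 10),     # 8b/10b encoding
    "PCIe 2.0": (5.0, 8 / 10),     # 8b/10b encoding
    "PCIe 3.0": (8.0, 128 / 130),  # 128b/130b encoding
}

LANES = 16

def x16_bandwidth_gb_s(rate_gt_s: float, efficiency: float) -> float:
    """Usable bandwidth per direction for an x16 link, in GB/s."""
    per_lane = rate_gt_s * efficiency / 8  # GT/s -> GB/s after encoding overhead
    return per_lane * LANES

if __name__ == "__main__":
    for gen, (rate, eff) in PCIE_GENS.items():
        print(f"{gen} x16: ~{x16_bandwidth_gb_s(rate, eff):.1f} GB/s per direction")
    print("GTX 980-class GDDR5 memory bandwidth: ~224 GB/s (on-card, not over the slot)")
```

This prints roughly 4, 8 and 15.8 GB/s per direction for PCIe 1.0/2.0/3.0 x16, which is where the 8GB/sec and 16GB/sec figures in the post come from.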
mark5916 Posted December 16, 2015

Thanks for the info, then... :wink:
bben46 Posted December 16, 2015

I wouldn't hold your breath. It will still be several years before the game companies get around to using it. They still don't make their games 64-bit, and it's been mainstream for well over ten years.
obobski Posted December 17, 2015

I wouldn't hold your breath. It will still be several years before the game companies get around to using it. They still don't make their games 64-bit, and it's been mainstream for well over ten years.

This, and the PCI -> AGP and AGP -> PCIe transitions weren't instant, and competing proprietary standards *rarely* see widespread adoption instantaneously.

Furthermore, PCIe is not "a huge bottleneck" or "a huge problem," and some (most?) of the technical stuff in the first post is wrong - the GPU does not have "224GB/s of bandwidth on average"; that doesn't even make sense as a statement. GPUs today have significant memory bandwidth, and they carry onboard memory specifically as a solution to I/O bus limitations (IOW this problem was discussed, researched, and solved some twenty years ago). The game engine is 100% unaware of the I/O bus too (because it's running above the APIs, which themselves sit above the HAL), so no "custom optimization" will need to be done.

Tom's Hardware and other sites fairly regularly do benchmarks comparing PCIe bandwidth and performance, and pretty consistently I/O has been out ahead of real-world requirements for about ten years (since AGP 8x at least). If you want a really giant example of this, PCIe 3.0 is fast enough for XDMA CrossFire, which *does* show positive scaling when not CPU bound.

As far as "Skyrim isn't the problem, it will go as fast as your hardware allows" - this is also problematic (and it's problematic for many other game engines too), as Creation assumes a more or less set frame rate (and therefore inter-frame timing), and running faster than that will create problems (IOW the goal is not "more faster").

Overall this looks like a marketing stunt on nVidia's part, and yet another consumer lock-in feature to push people onto more proprietary nVidia stuff to sell more nVidia products (because now we'll go back to nVidia-approved motherboards to use nVidia GPU stuff - and that means higher prices), and AMD will either be forced to follow suit or get beat up for "not being up to date."

If you want to look at it another way, Intel and AMD have both developed (and released) products that have better-than-PCIe links between CPU and GPU (on APU parts), and have both released whitepapers demonstrating lower computational latency, which can have a minor performance benefit; however, the outboard PCIe parts (which are often more powerful) can still complete more complex tasks faster due to their greater computational performance (IOW the I/O is not "kneecapping" the system). This isn't to say I/O interfaces won't keep improving, but at least presently it isn't an issue for contemporary software.
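To put rough numbers on the "onboard memory solved this" point, here is an illustrative sketch comparing steady-state per-frame traffic over the slot with what a PCIe 3.0 x16 link could move in one frame. The 5MB per-frame upload figure is an assumption chosen for illustration, not a measurement from any particular engine.

```python
# Illustrative only: the per-frame upload size below is an assumption, not a measurement.
# Once assets live in VRAM after a one-time upload, the steady-state per-frame traffic
# over PCIe (command buffers, constants, a little streaming) is small compared with
# what the link can move in 1/60th of a second.

PCIE3_X16_GB_S = 15.75           # usable per-direction bandwidth, PCIe 3.0 x16
FPS = 60
per_frame_upload_mb = 5.0        # hypothetical steady-state upload per frame

budget_per_frame_mb = PCIE3_X16_GB_S * 1024 / FPS   # what the link could move per frame
utilisation = per_frame_upload_mb / budget_per_frame_mb

print(f"Link budget per frame at {FPS} fps: ~{budget_per_frame_mb:.0f} MB")
print(f"Assumed per-frame upload: {per_frame_upload_mb} MB (~{utilisation:.1%} of the link)")
```

Under those assumptions the link budget is roughly 270MB per frame and the assumed upload uses a couple of percent of it, which is consistent with the benchmark results described above.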
niphilim222 Posted December 17, 2015

Even DirectX 12 enabled cards won't see that performance gain for a very, very long time as well. PCIe is here to stay. They simply are not pushing the cards to their limit yet. I give it 5 more years or more.
obobski Posted December 18, 2015

Even DirectX 12 enabled cards won't see that performance gain for a very, very long time as well. PCIe is here to stay. They simply are not pushing the cards to their limit yet. I give it 5 more years or more.

Exactly, and that "5 years or more" has been the case since AGP 8x came out - by the time AGP 8x became a bottleneck for the graphics card, PCIe 1.0 had already been out for a while and 2.0 was on the way, and as PCIe 1.0 is starting to become a bottleneck for cards, 2.0 is already out and 3.0 has been out for a little while (not all systems support 3.0 yet, but it's certainly gaining ground).

And it generally requires newer/more demanding software to necessitate a newer/faster bus, not newer hardware. Morrowind probably doesn't care if your graphics card has AGP 4x or PCIe 3.0, but Skyrim would care at least a little bit: I've tried it on PCIe 2.0 x4 vs x16 on the same card (a GeForce GTX 660) and it was perfectly playable on either, and the "jump" to x16 was a marginal performance difference. Newer synthetic benchmarks (stuff more demanding than Skyrim) saw a bigger performance difference.
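For rough context on that x4 vs x16 anecdote, here is a minimal sketch of what each link width could move per frame. The bandwidth figure is the published PCIe 2.0 per-lane rate; the assumed per-frame upload is a hypothetical illustration, not something measured in Skyrim.

```python
# Rough numbers behind the x4 vs x16 anecdote above. The per-frame upload is an
# assumed, illustrative value, not a measurement from Skyrim.

PCIE2_PER_LANE_GB_S = 0.5          # PCIe 2.0: 5 GT/s, 8b/10b -> ~500 MB/s per lane per direction
FPS = 60
assumed_upload_mb_per_frame = 5.0  # hypothetical steady-state upload per frame

for lanes in (4, 16):
    budget_mb = PCIE2_PER_LANE_GB_S * lanes * 1024 / FPS
    print(f"PCIe 2.0 x{lanes}: ~{budget_mb:.0f} MB available per frame at {FPS} fps "
          f"vs ~{assumed_upload_mb_per_frame} MB assumed upload")
```

With those assumed numbers, even the x4 link offers around 34MB per frame, several times the assumed upload, which would explain why the x4 vs x16 difference was marginal in a game of Skyrim's vintage while more demanding synthetic benchmarks showed a larger gap.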