Werne Posted October 22, 2013

"You do know that the only HD 7000 series GPUs are the 79xx, 78xx and 77xx cards? There are no other tiers in that series."

Correct, I forgot AMD renamed the rest of the series to 8xxx and released them as OEM. Sorry about that. And to correct myself, yes, the entire 7xxx series is built on GCN.

"Thor's research appears to just be reading an old report based on the 7950 when it was first released, and then assuming that it's the only GPU that uses the GCN architecture. That's not remotely researching which GPUs use GCN."

It's actually the 7970 he looked at; the 7950 he mentioned is the card he has (two of them in Crossfire, that is). Back when the cards were about to be released, there were rumors that only the 7970 would be built on the GCN architecture; there was even talk of the 7990 being the only GCN card, but that turned out to be nonsense once the cards hit the shelves.
Thor. Posted October 22, 2013

I see, thanks for the update. I was going by articles from when I bought my two 7950s; rumors were all over the place claiming that the only GCN card back then was the 7970, and this was long before Mantle was ever mentioned. I think I will get an R9 290X when they come out; the leaked specs do seem impressive. I hope they have figured out microstuttering, the one thing that plagues the 7900 series.
FiftyTifty Posted October 22, 2013

Eh, I got a 5 mixed up with a 7. No need to start a Catholic inquisition! GAWSH! *Ahem* But yeah, I think we've disMANTLEd the assertion that the 7970 is the only GCN card in the 7000 series.

Edit: Thor, when you have microstuttering, does recording via RadeonPro remove it? I had an issue like that with my 6670; RadeonPro's recording feature removed it, for some reason.
Thor. Posted October 22, 2013

RadeonPro does not remove it, unfortunately. Sad. :facepalm:

Also, I have this model of ASUS monitor, and it looks like G-Sync will be a monitor upgrade rather than strictly a GPU one: http://www.extremetech.com/gaming/169091-nvidias-g-sync-promises-better-faster-monitors-that-will-revolutionize-pc-gaming

ANNNDD DOUBLE YAY :dance: http://www.asus.com/Monitors_Projectors/VG248QE/ Keeping an eye out for this upgrade.
FiftyTifty Posted October 22, 2013

"The module works with GPUs that have Kepler architecture (so the GTX 660 and up)."

And this is going to be another flop, especially since you have to buy a specific monitor that supports it; you can't attach it to existing monitors.
Thor. Posted October 22, 2013

It is an existing-monitor upgrade; it says so right on the NVIDIA forums, and the article specifically states it. And I have that very monitor in question, so if it does fail I will be an early adopter. :D It's not like they are going to ship them with it installed; the monitor is almost a year old now, and I'm pretty sure there are newer models out by now. The one thing I am so tired of is microstutter; it plagues every game, and it's the thing that annoys me the most.
FiftyTifty Posted October 22, 2013

And I doubt there's a substantial number of folk who already have these specific monitors. Everyone I know who games on PC uses a monitor that's rather old; about five years old is the average. I can testify to that as well; I'm using a good ol' SyncMaster 932B. So in order to make your framerate appear smoother, you'll need to splurge on a monitor with a large price tag, about the same as a CPU, RAM, motherboard and GPU upgrade. And at that point you might as well just go for the platform upgrade instead.
Rennn Posted October 23, 2013

AMD has the edge with the R9 290X, particularly with Mantle factored in. That said, hardware PhysX is too much to lose, imo. I'm personally looking forward to G-Sync. Never having to deal with vsync or screen tearing again is worth taking into consideration, since I was going to upgrade my monitor anyway.
Thor. Posted October 23, 2013

From the articles about G-Sync, it does seem like NVIDIA is releasing kits for that same monitor: http://www.pcper.com/news/Graphics-Cards/NVIDIA-Announces-G-Sync-Variable-Refresh-Rate-Monitor-Technology

Also, the 780 Ti specs have been leaked, and it looks like 4K overkill; whether I buy NVIDIA or AMD, it should last me a lifetime. http://hothardware.com/News/AMD-Claims-upcoming-R9-290X-Will-Trounce-GTX-780-In-4K-Gaming-Offers-Benchmarks/

Also, my assumption may be correct. Quoting the article, though there is a catch, obviously:

"This technology will be available soon on Kepler-based GeForce graphics cards but will require a monitor with support for G-Sync; not just any display will work. The first launch monitor is a variation on the very popular 144 Hz ASUS VG248QE 1920x1080 display and, as we saw with 3D Vision, supporting G-Sync will require licensing and hardware changes. In fact, NVIDIA claims that the new logic inside the panel's controller is NVIDIA's own design, so you can obviously expect this to only function with NVIDIA GPUs. DisplayPort is the only input option currently supported. It turns out NVIDIA will actually be offering retrofitting kits for current users of the VG248QE at some yet-to-be-disclosed cost. The first retail sales of G-Sync will ship as a monitor + retrofit kit, as production was just a bit behind.

Using a monitor with a variable refresh rate allows the game to display 55 FPS on the panel at 55 Hz without any horizontal tearing. It can also display 133 FPS at 133 Hz without tearing. Anything below the 144 Hz maximum refresh rate of this monitor will be running at full speed without the tearing associated with the lack of vertical sync."
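To make the quoted behaviour concrete, here is a rough sketch (a toy model of my own, not code from the article) of why a game rendering a steady 55 FPS judders on a fixed 60 Hz v-synced panel but paces evenly on a variable-refresh panel. The function names and frame counts are made up purely for illustration.

```python
import math

def vsync_display_times(frame_ready_ms, refresh_hz=60.0):
    # Fixed refresh + vsync: a finished frame waits for the next scheduled refresh.
    period = 1000.0 / refresh_hz
    return [math.ceil(t / period) * period for t in frame_ready_ms]

def gsync_display_times(frame_ready_ms, max_hz=144.0):
    # Variable refresh: the panel scans out as soon as a frame is ready,
    # limited only by the panel's maximum refresh rate.
    min_gap = 1000.0 / max_hz
    shown, last = [], float("-inf")
    for t in frame_ready_ms:
        last = max(t, last + min_gap)
        shown.append(last)
    return shown

def intervals(times):
    # Time between consecutive frames actually appearing on screen.
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

# A game rendering at a steady 55 FPS (one frame every ~18.2 ms):
ready = [i * 1000.0 / 55.0 for i in range(1, 13)]
print(intervals(vsync_display_times(ready)))  # mostly 16.7 ms, with one 33.3 ms hitch -> judder
print(intervals(gsync_display_times(ready)))  # steady 18.2 ms, matching the render cadence
```

With the v-synced panel every frame snaps to the 16.7 ms refresh grid, so a frame gets held for an extra refresh several times a second; with variable refresh the panel simply follows the 18.2 ms render cadence, which is the smoothness the article is describing.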
FiftyTifty Posted October 23, 2013

Rennn, GPU physics can be done with OpenCL, which works with both GPU vendors. Example 'ere.

Thor, you must understand that only a rather small number of people went and bought these G-Sync-compatible monitors beforehand. And there are probably even fewer who will go and buy a G-Sync-compatible monitor just for the sake of G-Sync; you might as well have an entire platform upgrade considering the costs.
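On the OpenCL point, here is a minimal sketch of what vendor-neutral GPU physics can look like, assuming the pyopencl Python bindings and a working OpenCL driver are installed (this is a toy example of my own, not the one linked above): a single Euler integration step for a batch of particles under gravity, which should run on AMD and NVIDIA GPUs alike.

```python
import numpy as np
import pyopencl as cl  # assumes the pyopencl package is installed

# OpenCL C kernel: advance every particle by one time step under gravity.
KERNEL_SRC = """
__kernel void integrate(__global float4 *pos, __global float4 *vel, const float dt)
{
    int i = get_global_id(0);
    vel[i].y -= 9.81f * dt;   // gravity pulls the velocity down
    pos[i]   += vel[i] * dt;  // simple explicit Euler step
}
"""

ctx = cl.create_some_context()            # picks any available OpenCL device
queue = cl.CommandQueue(ctx)
program = cl.Program(ctx, KERNEL_SRC).build()

n = 1024
pos = np.zeros((n, 4), dtype=np.float32)       # particles start at the origin
vel = np.random.rand(n, 4).astype(np.float32)  # random initial velocities

mf = cl.mem_flags
pos_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=pos)
vel_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=vel)

# One 60 Hz physics tick across all particles in parallel on the GPU.
program.integrate(queue, (n,), None, pos_buf, vel_buf, np.float32(1.0 / 60.0))
cl.enqueue_copy(queue, pos, pos_buf)       # read the updated positions back
print(pos[:3])
```

Whether that is competitive with hardware PhysX for real games is a separate question, but the same kernel runs unchanged on either vendor's cards, which is the point being made above.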