
I am between 2 GPU cards, please help pick the right one (the best for



Mantle was never on PS4 and XBone in the first place.

It will really come down to how popular Frostbite 3 ends up being; I don't expect widespread adoption of the API.

Among large-budget releases, PhysX is also in The Witcher 3, Borderlands 2, Metro, Watch Dogs, and, as you said, Batman.

The effect there is minimal. Metro has better particle effects without hardware PhysX than 95% of other games have with it. It runs the same effects either way, scripted normally or simulated on an NVIDIA card, differing only in interactivity. You have to know exactly what to look for to see the difference.

 

Haven't played Borderlands 2. Most games with PhysX support run fairly primitive effects that just don't make use of it; some even ship with hardware PhysX off by default (e.g. the UE3 ones). The Witcher 3 is still in development, so I'm not sure how it will turn out, but my guess is no other game is going to go to the extreme Batman did - completely removing every animated game object out of spite - at least no game with ambitions.


Mantle was dropped by the Sony PS4 and Xbox One... http://www.extremetech.com/gaming/168671-xbox-one-will-not-support-amds-mantle-and-ps4-is-also-unlikely-is-mantle-doa

 

It seems that the other companies don't want to bother with GPU wars and monopoly games... so Mantle is going to end up DOA. I think Mantle is competing with DirectX? Mantle would be good for console exclusives (Sony and Nintendo), but Microsoft will do their best to ditch it.

 

Either way, I spoke with the distributor today and my final decision about my new GPU will be made next week. (I may also get the chance to check the R9 290 reviews (the non-X version), which is almost the same price as the GTX 770; that will help me decide.)

Microsoft doesn't want competition against their API, so of course they're not going to go with Mantle. Mantle is supposed to be easier to work with and give devs more options; for all we know it could be the next big thing. But with no benchmarks and no companies supporting it other than DICE, there's no word yet on whether it will be successful. If it offers the performance increase being speculated about, though, it could be a real game changer.

 

If you can afford a 290 then it's a much better option (as long as you get a card with a non-reference cooler). Don't mistake the 290 for an underclocked 290X, though; the 290 has fewer stream processors.


Thank goodness for the debacle that Mantle turned out to be; I was betting it would be a failure to begin with.

YUUPPP!!!! I'm also in a debate over which video card to get, so it's NVIDIA from now on. AMD has lost this round, considering the Mantle debacle. I'm getting the GTX 780 Ti for sure.


Currently, Microsoft Windows 7 and (heaven forbid) Windows Vista only support DX 11.0, not 11.1 or 11.2; only Windows 8 / 8.1 supports 11.1 / 11.2. Additionally, NVIDIA does not support DX 11.1 or DX 11.2, and only the newer/newest AMD cards support DX 11.1 / 11.2.
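If anyone wants to check what their own OS + driver combination actually exposes instead of going by spec sheets, here is a minimal sketch of my own (assuming a Windows 8-era SDK so the 11_1 enum exists, and linking against d3d11.lib) that asks D3D11CreateDevice for the highest feature level it will grant:

    #include <d3d11.h>
    #include <cstdio>

    int main() {
        // Ask for 11.1 first, fall back to 11.0.
        const D3D_FEATURE_LEVEL levels[] = { D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0 };
        const UINT count = sizeof(levels) / sizeof(levels[0]);
        ID3D11Device* device = nullptr;
        D3D_FEATURE_LEVEL achieved = D3D_FEATURE_LEVEL_9_1;

        HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                       levels, count, D3D11_SDK_VERSION,
                                       &device, &achieved, nullptr);
        if (hr == E_INVALIDARG) {
            // A pre-11.1 runtime (e.g. plain Windows 7) rejects an array that
            // mentions 11_1, so retry starting at 11_0.
            hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   levels + 1, count - 1, D3D11_SDK_VERSION,
                                   &device, &achieved, nullptr);
        }
        if (SUCCEEDED(hr)) {
            // 0xb100 = feature level 11.1, 0xb000 = 11.0
            std::printf("Highest feature level: 0x%x\n", (unsigned)achieved);
            device->Release();
        } else {
            std::printf("No D3D11 hardware device (hr = 0x%08lx)\n", (unsigned long)hr);
        }
        return 0;
    }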

 

If you are planning on installing multiple high-resolution patches to Skyrim, I highly recommend getting a card with at least 3 GB of VRAM. With all the high-resolution patches/mods I have installed, Skyrim used around 1.9 GB of VRAM at 1920 x 1080 and 2.1 to 2.2 GB at 2560 x 1600, to give you an idea.

 

If you plan on getting BF4 and running it at 1920 x 1080, a card with 3 GB or more of VRAM is a necessity; multiple users on other forums are reporting usage between 2.1 and 2.6 GB of VRAM at 1080p and 1440p, respectively.
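For the VRAM numbers you don't have to rely on second-hand forum reports either; on an NVIDIA card you can poll the driver yourself while the game is running (GPU-Z shows the same kind of readout). A rough sketch of mine using NVML, assuming nvml.h and the NVML library from the driver package are available:

    #include <nvml.h>
    #include <cstdio>
    #include <chrono>
    #include <thread>

    int main() {
        if (nvmlInit() != NVML_SUCCESS) {
            std::printf("NVML init failed\n");
            return 1;
        }
        nvmlDevice_t gpu;
        nvmlDeviceGetHandleByIndex(0, &gpu);      // first GPU in the system

        for (int i = 0; i < 60; ++i) {            // sample once a second for a minute
            nvmlMemory_t mem;
            if (nvmlDeviceGetMemoryInfo(gpu, &mem) == NVML_SUCCESS) {
                std::printf("VRAM used: %.2f GB of %.2f GB\n",
                            mem.used / 1073741824.0, mem.total / 1073741824.0);
            }
            std::this_thread::sleep_for(std::chrono::seconds(1));
        }
        nvmlShutdown();
        return 0;
    }

Run it in the background, play through the heaviest scenes you can find, and take the peak value rather than an average.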


A note has been added to the linked article which suggests Microsoft has (yet again) contradicted themselves:

 

From the article: "Note: Feedback we've gotten from other sources continues to suggest that Microsoft's low-level API for the Xbox One is extremely similar to Mantle, and the difference between the two is basically semantic. This doesn't square very well with Microsoft's own statements; we'll continue to investigate."

 

Mantle was dropped by the Sony PS4 and Xbox One... http://www.extremetech.com/gaming/168671-xbox-one-will-not-support-amds-mantle-and-ps4-is-also-unlikely-is-mantle-doa

 

It seems that the other companies don't want to bother with GPU wars and monopoly games... so Mantle is going to end up DOA. I think Mantle is competing with DirectX? Mantle would be good for console exclusives (Sony and Nintendo), but Microsoft will do their best to ditch it.

 

Either way, I spoke with the distributor today and my final decision about my new GPU will be made next week. (I may also get the chance to check the R9 290 reviews (the non-X version), which is almost the same price as the GTX 770; that will help me decide.)

 

We do not know what Mantle is capable of, as there haven't been any official reviews of the technology in use on the PC platform. Until trusted reviewers or AMD release actual, hands-on demonstrations of Mantle in action on the PC, any notion that the technology is dead, or any claim about its performance, is entirely speculation, IMO.


Skyrim used around 1.9 GB of VRAM at 1920 x 1080 and 2.1 to 2.2 GB at 2560 x 1600, to give you an idea.

If you plan on getting BF4 and running it at 1920 x 1080, a card with 3 GB or more of VRAM is a necessity; multiple users on other forums are reporting usage between 2.1 and 2.6 GB of VRAM at 1080p and 1440p, respectively.

That conclusion is not the way to go about it, however.

 

For a long time, VRAM usage would sit firmly at 100% everywhere; that was normal, and it accounted for some of the performance difference between cards. The rest of the data would be loaded/unloaded from primary RAM as needed. VRAM usage staying below 100% only started a few years ago, once cards began shipping with more VRAM than games could use.

 

So a peak VRAM use of 2.1 GB means the application has absolutely no use for anything above 2.1 GB; the extra will never even be accessed. Even without any cleanup at all, carrying all the trash it needed once and won't need again, the game has no use for more.

If you have 4 GB, 1.9 GB will go unused. If you have 2 GB, 0.1 GB will be unloaded when not needed.

 

The performance difference in such cases - 5-10% over capacity - is non-existent: zero fps. When VRAM peak usage is somewhat higher, e.g. capacity + 30%, a difference appears but remains small until peak use is much higher. So more VRAM is far from a necessity; it's more like no longer useless.
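To put rough numbers on that, here is the overage for the peaks quoted earlier in this thread against a 2 GB card (nothing measured here, just the figures people reported):

    #include <cstdio>

    int main() {
        const double card_gb = 2.0;                // e.g. a 2 GB GTX 770
        const double peaks[] = { 2.1, 2.2, 2.6 };  // Skyrim 1600p and BF4 peaks quoted above

        for (double peak : peaks) {
            double over = (peak - card_gb) / card_gb * 100.0;
            std::printf("peak %.1f GB on a %.0f GB card -> %.0f%% over capacity\n",
                        peak, card_gb, over);
        }
        // 2.1 GB -> 5% over, 2.2 GB -> 10% over: the range where the fps difference is nil.
        // 2.6 GB -> 30% over: where a small difference starts to show up.
        return 0;
    }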

 

If the cards cost the same, go for 4GB.

They don't cost the same:

2GB - http://www.newegg.com/Product/Product.aspx?Item=N82E16814121770 - $330 for Asus DirectCU II - very overclockable and quiet

4GB - http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=-1&IsNodeId=1&Description=gtx%20770%204gb&bop=And&Order=PRICE&PageSize=20 - $380 minimum and nothing there with good cooling.

 

Simply put, it's not worth the price difference when there are squarely better cards to look forward to if you have the cash - R9 290 and GTX 780 if they drop the price - and the GPU in question is now just upper midrange.

 

If that's too expensive.... in 95% of the games even looking to the future, your 2GB won't even be fully used. In the other 5%, VRAM amount will be far from your only or primary bottleneck. And maybe in some 1% it will be. But you're not shopping for a high-end card, and $50 is much better used being put towards your next GPU upgrade (which will come sooner than you think) than spent on that 1% of difference.


What FMOD is saying makes quite a bit of sense. Besides, you will never be able to stay on the cutting edge of technology no matter what choice you make, since technology is changing exponentially (everything is the NEXT thing). A 3 GB card would be a nice improvement, I'm sure, and NVIDIA is currently top dog as I see it. The new AMD card is not a bad idea if you can afford it, but being a new card I'd want them to work out the kinks first, and then you'd also miss out on ShadowPlay, which is a REALLY cool feature.


AMD supports the newer DX 11.1 and 11.2, but NVIDIA supports OpenGL 4.3 (AMD supports 4.2). I don't know the differences.
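If you're curious which OpenGL version your own driver actually reports, here's a small sketch (using GLFW purely as a convenient way to get a context; that library choice is my assumption, any GL context creation works) that reads GL_VERSION and GL_RENDERER:

    #include <GLFW/glfw3.h>
    #include <cstdio>

    int main() {
        if (!glfwInit()) return 1;
        glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);   // we only need a context, not a visible window
        GLFWwindow* win = glfwCreateWindow(64, 64, "gl-version-check", nullptr, nullptr);
        if (!win) { glfwTerminate(); return 1; }
        glfwMakeContextCurrent(win);

        std::printf("GL_VERSION : %s\n", (const char*)glGetString(GL_VERSION));
        std::printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));

        glfwDestroyWindow(win);
        glfwTerminate();
        return 0;
    }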

 

Right now I am in standby mode; I asked for one week to decide. The Radeon 290 (non-X) is coming out on Tuesday, and if the reviews are good I may get that, or else the GTX 770, or, if I go crazy, the GTX 780. (In my country they don't drop the prices that easily, but I do have about a 400 euro refund in goods; it's basically not money. I turn in my card and they let me pick any card, and if it's more expensive I pay the difference.)

 

I hate Windows 8, I just don't like it. I may get Windows 9, lol.


TBH I think any NVIDIA card with 4 GB is a waste; their chips are too weak to make use of it, and you would need 2x 770 4 GB for it to matter. You can read more about it in forums. Of course you can still get one, but... if you like NVIDIA, get a GTX 780; it got a massive price drop to compete with the new AMD line. If you like AMD, I personally would go for an R9 280X, just because the R9 290 and R9 290X run too hot for my taste, which also makes them bad overclockers.


The reason I want to see the 290 reviews is not the performance, because that is going to be better than the GTX 770... but the temps.

 

It has fewer SPUs and that should make it run a bit cooler, but time will tell. If it's around 75-79°C it will be OK for me, but more than that, no way. I don't like AMD more than NVIDIA; rather the opposite. My only consideration is the "future proof" idea. Nothing more.

 

My problem with the R9 280X is the build quality. My distributor offers only Gigabyte and Asus... The Gigabyte is about 290 euros and the cooler is cheap and noisy, because they used a cheaper version of the Windforce cooler. I avoid cheap build quality. So my only option for a 280X is the Asus AMD Radeon R9 280X DirectCU II Top (3 GB, GDDR5, 384-bit) with dual fan.

 

The card I turned in, due to temperature faults, is a GTX 480... Something happened to it and it hit 120°C with BSODs in games, and 80°C just sitting in Windows! Thankfully my card was still under warranty. So I will try to avoid hot cards this time around... I REALLY enjoyed my GTX 480, amazing card. But I hate that NVIDIA cut almost all the compute power out of their GPUs just to be able to sell some Quadros that NOBODY buys anymore.

 

I am not keen on OCing my cards, but I am very positive about buying dual cards... (SLI or CF).

 

I already use a Corsair TX850W PSU and I also have a brand new (haven't even opened it yet) Corsair AX850W (fully modular) for my new system, so I am covered for future SLI or CF.

