
Nvidia and Bethesda sitting in a tree


Moksha8088


Agree with Rooker76 here. I won't be getting an AMD card no matter what, because I've had bad experiences with them over the years, but Nvidia Experience and GameWorks are a complete joke. I NEVER install that crapsoft, only the drivers with PhysX and the other updates. Making us register an account in Experience to get new drivers is outrageous, and it won't happen. Customers will drop petition bombs over this and some might even sue them. I buy the hardware and have the right to get software support; they have no right to my personal info or anything like that.



Isn't it optional though? I've never had to do anything with Nvidia other than choose to let it update drivers every couple months. I switched over from AMD after about 12 years of dealing with terrible game support.


On the interwebz there's talk about getting new drivers via 3rd-party software portals; people are saying Nvidia won't let you officially download new drivers unless you're registered in Experience. I honestly don't think this will really happen; I think Nvidia will reconsider.


 


Yeah, you can totally still just go to nvidia's website and download it from the same utility that's been there for the last half decade at least. No signup, no nothing. These people don't know what they're talking about in the slightest.
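If you want to sanity-check what you've actually got installed, the driver itself ships with the nvidia-smi tool, no Experience account involved. A minimal sketch, assuming nvidia-smi is on your PATH (the query flags are the standard ones from that tool):

import subprocess

# Query the installed driver version straight from nvidia-smi, which ships
# with the driver itself -- no GeForce Experience, no account needed.
# Assumes nvidia-smi is on PATH.
result = subprocess.run(
    ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print("Installed driver:", result.stdout.strip())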

 

As for stuff being optimized for Nvidia? Yeah, it sucks if you have AMD. But it's not like AMD hasn't done the same thing. They developed their most recent new tech with DICE, and so Battlefield 4 and Dragon Age: Inquisition ran like f*#@ing trash on Nvidia hardware for months after launch. Neither had SLI support at launch, DAI didn't have Nvidia drivers until 48 hours after launch, and BF4 hitched every 2 seconds on 600-series cards. Tomb Raider was poorly optimized for Nvidia as well, along with a handful of other games that threw their lot in with AMD over the past few years. It's not big bad Nvidia curling its mustache all alone in the corner.

 

By all means, s#*! on Nvidia, but don't go acting like AMD is any better, or like they haven't almost entirely abandoned trying to compete with Nvidia and Intel hardware while they reap the residuals from their PS4 and XBO APUs. If we end up in a world where Nvidia runs a monopoly on cards and Intel runs a monopoly on chips, it will be because AMD couldn't be bothered to care / compete hard enough.


 

 

 

I hadn't seen that. Well, there's something to disable to get a whole bunch of frames back. I wonder why there's no sign of it on the player character? He has that bit of hair on his forehead and it doesn't move, and the females' hair doesn't move in chargen either. I'm not impressed with that hair; it's clipping through the collar. And I'm not singling out Fallout, it wasn't very good in Tomb Raider or The Witcher either. I don't think the technology is there yet; it's distracting more than anything else.

 

Hair tech in general just isn't there yet. It's barely there for proper CG, and all these physics demos they're shoving into games do is make s#*! float around for no reason. TW3 couldn't even manage 60fps maxed out on SLI Titans with that nonsense. It doesn't belong in games at the moment. None of that stuff should be there until it just works natively, but they can enable it on some 5GHz OC water-cooled SLI Titan demo machine for bullshots, so they shove it in there as a selling point. Personally, I prefer understated stuff like Piper's hair at the moment, because it does far less to break my immersion than a bunch of tessellated nonsense that looks like it's under water. At least that hair behaves like normal hair does with regard to gravity.
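For a sense of what these strand sims are actually doing under the hood, here's a bare-bones toy sketch: one strand modeled as a chain of points under gravity with fixed-length links, stepped with Verlet integration. Purely illustrative and nothing like the real middleware, which pushes thousands of strands with collision and shading on the GPU:

# Toy single-strand hair sim: a chain of points under gravity with fixed-length
# links, stepped with Verlet integration. Purely illustrative -- real hair
# middleware (HairWorks, TressFX) is a very different, GPU-heavy beast.
GRAVITY = -9.8          # m/s^2, straight down
LINK_LEN = 0.05         # rest length of each hair segment
DT = 1.0 / 60.0         # one 60fps frame

points = [[0.0, -i * LINK_LEN] for i in range(16)]   # strand hanging from the scalp
prev = [p[:] for p in points]

def step():
    # Verlet: new = pos + (pos - prev) + accel * dt^2; point 0 stays pinned to the head.
    for i in range(1, len(points)):
        x, y = points[i]
        vx, vy = x - prev[i][0], y - prev[i][1]
        prev[i] = [x, y]
        points[i] = [x + vx, y + vy + GRAVITY * DT * DT]
    # Constraint passes: pull each link back toward its rest length so the strand
    # doesn't stretch -- this is what keeps it behaving like hair instead of jelly.
    for _ in range(4):
        for i in range(len(points) - 1):
            ax, ay = points[i]
            bx, by = points[i + 1]
            dx, dy = bx - ax, by - ay
            dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
            corr = 0.5 * (dist - LINK_LEN) / dist
            if i > 0:                       # don't move the pinned root
                points[i] = [ax + dx * corr, ay + dy * corr]
            points[i + 1] = [bx - dx * corr, by - dy * corr]

for _ in range(60):                         # simulate one second
    step()
print("strand tip after 1s:", points[-1])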


NVIDIA and "Gameworks" is nothing new - they've been doing this same racket since the "Way It's Meant To Be Played" campaign a few years ago (which, as far as I know, was born out of the unholy alliances ATi was forming with a lot of devs back in the day, that led to tons of super-ATi optimized code that ran like junk on multiple generations of NV cards)). Where it's gotten "off the deep end with greed" (and I do agree 100% with that) is when NV is deliberately doing stuff that breaks non-NV systems, for example using PhysX and Hairworks and so forth - I remember reading a presentation recently that showed a newer game generating tons of superfluous (non-drawing) polys that would wreak havoc on AMD and Intel GPUs, but the NV driver was designed in such a way as to ignore that little "gift" and voila, "NVIDIA HAS SO MUCH PERFORMANCE." It's also been demonstrated a few times they use PhysX in this way, with "CPU PhysX" deliberately using the least optimized, worst performing paths available (and the "GPU PhysX" isn't even fully done on the GPU - the GPU does part of the work, and the rest is done via optimized SIMD instructions on the CPU; basically the graphics card is acting as a DRM key to push NV hardware sales and providing some computational assistance, but it's nothing like the original "full hardware offload" that Ageia promised us back in 2005). They've also started holding various features "hostage" - things like DSR or power management - want a new feature? Gotta keep buying GeForce cards. Sure some of it is architectural, but a lot of it is just artificial lock-outs to try and push sales (and this is an unfortunate shift from how they used to do things).

 

This isn't to say ATi/AMD haven't had close relationships with developers recently too, but I can't think of an example where a game that uses a lot of AMD tech (e.g. a Mantle game) is deliberately knee-capped without an AMD graphics card in the system. They also haven't gone down the road of feature lock-outs with their currently supported cards (although they've dumped a lot of relatively recent stuff more quickly than some people would like - e.g. I think the HD 4800 series only got about two years of full driver support, and the 2000/3000 series were completely left in the cold when it came to video acceleration and compute support).

 

Another thought I had, though, regarding why we see the GTX 400/500 series and Radeon HD 7000 series as minreq for a lot of games: AMD really only supports GCN going forward, and the VLIW/TeraScale GPUs (the HD 2000 thru HD 6000 series) have largely been left in the dark. NV, by contrast, aims at something like a 5-7 year lifecycle for all of their products, and Fermi (the 400/500 series) is still inside that window. The same thing is happening with DirectX: NV has listed Fermi and higher for DX12-compliant driver support, but AMD is only worrying about GCN and up (it's entirely possible some of their older DX11 cards could work, but it'll never happen because they're not being supported).

So it may be that Bethesda (or other devs, because I've seen that GTX 400/Radeon HD 7000 listing on other games recently as well) is basically saying "we're only going to officially support platforms that are currently officially supported with driver/software updates" instead of going down the slippery slope of trying to validate EOL (or nearly EOL) hardware.

A random example: Skyrim can (believe it or not) be run on GeForce 7 (not 700, 7 - you can go look this up on YouTube if you want to see it; it isn't pretty, but it does work) as long as the card has sufficient memory (there are a few that do), yet Bethesda lists GeForce 8 as the official requirement. That makes complete sense: GeForce 8 was on mainstream support until very recently, while GeForce 7 was axed in early 2013 ("but obob, there are 2015 drivers" -> those are mandatory security patches for Vista/7/8). I'm guessing big developers like Bethesda probably get some sort of heads-up as to what hardware is and isn't getting the axe, so knowing that GF7 was going the way of the dodo within (roughly) a year of Skyrim's launch, it made sense to just omit it. That *could* be what's happening with the GeForce 400/Radeon HD 7000 thing on some fronts.

 

 

My point is, it's really unfortunate if this is the case, because ultimately the consumer loses - even if you have the "right" piece of hardware. It'd be nice if "they" (being NV and AMD and so forth) would go back to trying to kill each other with better performance, better features, lower prices, etc instead of just trying to sabotage and destroy each other and using games to do it.

