
Fallout 4 (DX10 Support)


nemmay

Recommended Posts

I know this has probably been brought up on numerous sites, including this one. I apologize for not searching through existing topics; I just figured it would be better to start a new thread for brainstorming and actual mod development.

From a technical standpoint, and given our current capabilities for modifying Fallout 4, is it possible to modify the game to use DX10, even if that means stripping down or replacing effects and functionality that require DX11/DX12-capable GPUs?

There are quite a few people who have had to settle for playing the game on console and have been excluded from the modding scene that the Nexus sites foster. I've seen some amazing mods here, and I would love to be able to experience them at some point.

My rig should be able to handle Fallout 4 if I could just figure out a way to get it to support DX10 at a framerate better than 1 FPS. I'm confident I could manage 30-60 FPS, given that I can max all of Skyrim's graphical settings with the HQ texture packs and a metric megaton of modifications, with a smooth framerate that hovers between 40 and 60, and that's only because I went overboard with the visual modifications I used.

My GPU is ancient by current standards (a GTX 285), but I'm fairly confident it could handle this game if some kind of viable workaround were found. I have modding experience with the following: Morrowind, Oblivion, Skyrim, Fallout 3, Fallout: New Vegas, Fable TLC (PC and Xbox; to clarify, I mean the original Fable TLC release on PC and the original Xbox), KotOR 1-2 (PC and Xbox), Halo CE, Halo 2, and some private modifications for newer games from the last console generation that I respectfully decline to go into detail on.

I love PC gaming, and I love everything the Nexus community has helped provide for the games I enjoy. I've spent quite a lot of time playing around with various mods and creating my own content (my Halo energy sword modification for TLC was pretty legit, despite some minor flaws in the first version where the hilt was held like a regular sword, lol). Heck, I even tinkered with GTA: San Andreas on the original Xbox (the first released version, not the re-release with its various patches and the removal of the "hot coffee" mod). The original Xbox kept my thirst for changing and adding content to the games I love quenched until I finally owned a PC that would play Oblivion and Fallout 3.

Any technical details would be much appreciated. Also, I don't want simple "no, this can't be done" responses; I want constructive responses that explain why it can or cannot be accomplished. The easy method would be to purchase a new GPU; however, I'm broke and can barely walk on the best of days due to nerve damage in my lower back.


I highly doubt it.

DirectX is a set of APIs (Application Programming Interfaces), meaning it contains calls that the game's code itself hooks into.

 

Meaning, you would need to rewrite the version of the engine FO4 is using in order to accomplish what you're after. In addition, the source code is not available to us, nor could we legally obtain it in order to carry out such a rewrite.

Also, I highly doubt a 285 could run the game anyhow. It's far below the game's minimum requirements.

 

FO4 is also 64-bit and takes advantage of much more RAM than most games, which means that if the rest of your system is comparable to your GPU, it wouldn't matter what version of DirectX you're running anyhow. You're just behind the times.

Edited by cjthibs

The engine rather heavily uses compute shaders for several key visual components (primarily computing the lighting for the scene). DX10-level hardware is extremely limited in terms of compute, especially the older nvidia cards. It's long been time to upgrade, and I don't think that anyone is going to put in the large engineering effort to rewrite the shader library and corresponding interface code to make the engine DX10-compatible.

  • 7 months later...

I'd like to see more people like the OP, because if something like this were accomplished, it would be good for the modding scene, and some people with older cards would get to play the game when they otherwise couldn't. I hate to see that this thread died so quickly :/

 

 

Also, I highly doubt a 285 could run the game anyhow. It's far below the game's minimum requirements.

Honestly, it's not the power of the GPU that's the problem. I ran Fallout 4 fine with a GT 430, and that card may have DX11, but in terms of raw power it is NOTHING compared to even a very old DX10 card like an 8800 GTX.

 

 

I myself own two GTX 285s that I used in SLI, and even today they are beasts of cards. If Fallout 4 were DX10-compatible, they would probably run it at high settings at 1080p without breaking a sweat, although obviously some DX11 effects would be disabled, so it would look worse than Fallout 4 on high settings with DX11 effects.


Simply not going to happen because mod authors do not have access to the engine source code. "Downgrading" the FO4 engine to use DX10 instead of DX11 would require highly skilled programmers, to boot.

Edited by Reneer

Sorry, everything you have written is NONSENSE. When it comes to API support, 100% of the answer lies with the GPU drivers that Intel, AMD, and Nvidia provide. If they don't care to support a piece of hardware, YOU ARE OUT OF LUCK.

 

Years ago I was running an ATI X800 XT when BioShock came out. My GPU could have run the game in its sleep, but Nvidia had paid the devs to use a point update to the shader API that only its new cards supported, and ATI didn't care because they wanted to force existing customers to upgrade to the X1800/X1900 GPUs. So the X800 XT was artificially locked out from running the game.

 

Today we see the same thing with DX12. There is no such thing as baseline DX12 hardware, but Nvidia and AMD have only provided DX12 drivers for their most recent GPUs. ALL DX11 cards are capable of running DX12 titles, but because of drivers, only a fraction ever will.

 

I play FO4 on a 6870 (at 720p), and even getting that supported card running acceptably has been a pain in the backside (due to Beth artificially trying to cripple performance on 1GB cards at Nvidia's and AMD's behest). With the right "hacks" it runs great with near-best graphics, but this would have been IMPOSSIBLE if the 6870 driver did not support the needed APIs.

 

Your 285 needs binning, and badly; sorry if you were fooled into paying a fortune for it back in the day by Nvidia. It is much better to buy a good mid-range GPU and replace it often than to pay Nvidia's extortionate prices for their high-end product and try to hang on to it. Just watch what happens to the new 1080 and Titan X GPUs when Nvidia launches Volta (a TRUE DX12 architecture) next year. Nvidia will fall over itself to pay devs to use proper Vulkan and DX12 methods that make Pascal obsolete as soon as possible.

 

GPU drivers are among the most difficult pieces of software in the world to write (check Linux for proof). No one is going to get your unsupported 285 working by recoding its drivers, or by recoding Beth's back-end renderer.

 

Getting the X800 XT working in BioShock would have been infinitely easier, and that never happened.

 

Your time is money. New cards from Nvidia and AMD are making older but still capable cards cheaper than ever, and second-hand is cheaper still, for both cards and laptops. A 260X or 750 Ti, low-end "rubbish", is actually as fast as a PS4 console (admittedly, as long as your PC has a decent CPU). A 6870 could probably be had for less than 20 dollars second-hand. If you looked, you'd probably find someone who'd gift you one for free.

 

I mean, look how you even say in your post that you don't want people to tell you the bad news; what on Earth is that about? You say you are too "poor" to support your gaming hobby, yet how much does Fallout 4 cost?

 

If you are really serious about gaming, do what it takes to build a four-core i5 (any one), 8GB machine with a 460 or 470 card from AMD. This machine should be good for decent gaming for at least the next FIVE years, meaning the yearly cost of ownership should be little more than what ONE AAA game a year would cost you. Otherwise, beg for an old 560 Ti, 660 Ti, 6870, or the like, but be aware that 1GB-VRAM gaming is increasingly not worth the effort and can halve your framerates versus identical hardware with 2GB.

 

I don't care whether you like this answer or not, because this is THE answer for AAA gaming in 2016.


Zanity, wrong. As always.

 

Technically, any DX level can be emulated if the GPU lacks the appropriate hardware support. However, once you do that, the work is no longer done at the hardware level but at the software level, which slows everything down significantly.

 

So if you don't know something, just shut up. Please.

 

Oh, and FO4 uses a healthy chunk of VRAM. If you want to put on your tinfoil hat, feel free, but a lot of modern games nowadays eat VRAM like brekky.

