
Fallout 4 Optimization and Performance Systems Explained



Very nice. I want that script!

 

Your physics NIFs are a complete pain; I can't seem to edit the collisions properly with 3ds Max. I'm not an expert in that field, but collision data doesn't display properly with them or with SCOLs.

 

Also, having the pre-war Sanctuary bridge as a loose file crashes the CK when precombining meshes. Well, it generates fine, but it can't generate an ESP file. It's like it skips the bridge if it's packed but forces it to be included if it's loose?


Where do I get this magical script? :dance:


Wait a little; the script is done and I just need to write the guide for using it :happy:. It should be ready to release within the next day or so.

 

@SMB92, if you are talking about the precombineds, I doubt the import script was made to read them correctly. If you are talking about the xxx_physics.nif, it's the same as before, and I'm almost certain the import plugin can't read collision info (and we only have an official exporter, not an official importer).


I was starting to think I couldn't view the collision because I only have access to 3ds Max 2017. Like I said, I'm not a modelling expert. Maybe one day we'll have a tool to open the merged data and edit it...

 

I'm running the rest of Beantown's individual generations by area for the latest version; can't wait to play with... your tool :P


As far as I know, the import plugins can't read collision. Only the official export plugin (which can only be used with the 2013 version of 3ds Max) can generate/export collision info. Also, I'm pretty sure the import plugins can't read precombined meshes properly (NifSkope had to be updated to be able to read the vanilla ones, and Jon said it's going to take some time to get it to read the ones made by the CK).

 

I will finish the guide tomorrow and post it. I would do so now, but I am still recovering from finals and am typing this with my eyes closed half the time*. Should be done tomorrow evening at the latest.

 

*Surprisingly, fewer corrections are needed than half the time I type while wide awake. It's generally less coherent, though.


Here is the script (with guides probably more detailed than needed for anyone who actually should be using this script, but I figure overkill is better than dealing with people complaining my script broke their mod).

 

It is the result of me asking zilav about a better way to edit the XCRI record manually, and of him being absolutely awesome. He made the script; I tested it and provided feedback on what I wanted in it.

 

If you are doing any significant edits: holding Shift when you click OK while launching xEdit will prevent it from building reference info. That has a huge impact if you are making significant edits to the XCRI field, particularly for "busy" cells like the Mechanist's Lair.

 

Edit: Some things I've found out:

- Having a PCMB timestamp that matches your plugin's "Date modified" prevents the workaround from triggering (i.e. touching a placed reference [REFR record] that had been precombined won't disable previs/precombineds).

- Having a PCMB value that differs from vanilla but is still earlier than your "Date modified" will still let the workaround trigger.
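To make the behaviour above concrete, here is a minimal Python sketch of the check as I understand it. It only models the comparison described in the two points above; the function and variable names are placeholders, not actual engine symbols.

```python
from datetime import datetime

def previs_workaround_triggers(pcmb_timestamp: datetime,
                               plugin_date_modified: datetime) -> bool:
    """Illustrative model of the behaviour described above, not engine code.

    - PCMB equal to the plugin's "Date modified": the workaround does NOT
      trigger, so touching a precombined REFR won't disable previs/precombineds.
    - PCMB earlier than "Date modified" (even if it differs from vanilla):
      the workaround DOES trigger.
    """
    return pcmb_timestamp < plugin_date_modified

# A PCMB stamped before the plugin's last save lets the workaround fire:
print(previs_workaround_triggers(datetime(2017, 1, 1), datetime(2017, 6, 1)))  # True
# A PCMB that matches the plugin's "Date modified" does not:
print(previs_workaround_triggers(datetime(2017, 6, 1), datetime(2017, 6, 1)))  # False
```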


The fact that the public CK produces files 5-10 times larger explains the small stutter/pause I see when transitioning from an area with custom-generated data to vanilla. I would also imagine that the overhead lost in our generations accounts in some way for why I don't see a reasonable, or sometimes even noticeable, gain from the meshes, but then, as you say, it only generated so many new files.

 

The files being 5-10x larger does not necessarily explain the stutter/pause. BSPackedCombinedSharedGeomDataExtra (the vanilla block used in precombineds) is metadata; there is no geometry data. What it does is reference the actual refs and their NIFs in some way that none of us have yet decoded (some kind of hash), then go through the metadata, which defines said geometry in 1-N positions/rotations/scales, and then load and copy the geometry from the base mesh into those positions/rotations/scales.
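Roughly, the shared (metadata-only) block behaves like the sketch below. This is Python pseudodata for illustration only; all field and function names are mine, and the actual hash scheme is still undecoded.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Placement:
    """One placement of the base mesh: position, rotation, scale."""
    position: Tuple[float, float, float]
    rotation: Tuple[float, float, float]   # however the block actually encodes it
    scale: float

@dataclass
class SharedGeomEntry:
    """Rough model of BSPackedCombinedSharedGeomDataExtra as described above:
    no triangle data, just a reference (some kind of hash) plus 1-N transforms."""
    base_mesh_hash: int
    placements: List[Placement]

def assemble_at_load(entry: SharedGeomEntry,
                     load_geometry_by_hash: Callable[[int], object]) -> list:
    """At load time the engine has to look the base geometry up and copy it
    into every placement; that lookup-and-bake work is the runtime overhead."""
    base_geometry = load_geometry_by_hash(entry.base_mesh_hash)
    return [(base_geometry, p) for p in entry.placements]
```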

 

BSPackedCombinedGeomDataExtra, on the other hand (the block our precombineds use), after a cursory glance appears NOT to be solely metadata, and may in fact just be the fully baked geometry data (or partially baked geometry data). This means that the new block the CK makes for us actually has less overhead, because at load time (load screen or streaming) it doesn't have to look through the metadata, find the original geometry, and then bake the combined geometry into a new mesh. I say this because I looked through the byte patterns and I see clear triangle lists. What I do not know yet is whether the original NIF's geometry data has just been copied into this block to save the lookup, or whether the geometry data I'm seeing is the baked geometry data, i.e. the original mesh placed in 1-N arrangements based on how many refs are being combined.
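By contrast, if the CK-generated block really does store geometry, it would look more like this (again just an illustrative sketch with made-up field names, pending actual decoding):

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class BakedGeomEntry:
    """Rough model of BSPackedCombinedGeomDataExtra as interpreted above: the
    (fully or partially) baked geometry is stored directly in the block."""
    vertices: List[Tuple[float, float, float]]   # possibly already transformed
    triangles: List[Tuple[int, int, int]]        # the triangle lists seen in the bytes

def load_baked(entry: BakedGeomEntry):
    # Nothing to look up or bake at load time; hand the data straight over.
    return entry.vertices, entry.triangles
```

Either way, the lookup-and-bake step from the previous sketch disappears, which is where the "less overhead" reading comes from.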

 

The reason they did not use the latter block for all vanilla files is obvious: disk space. Baking the geometry data into the files would save on loading/streaming overhead but take up a ton of disk space. For modders it matters a lot less. Modding is still mostly PC-centric, especially for large mods that change so much they need to regenerate precombineds. We can take the disk space hit, but consoles can't.

 

So after I actually decode BSPackedCombinedGeomDataExtra I'll know for sure, but based on the byte patterns I'm 90% sure there is geometry being held in the block whereas previously there was none. If so, this is a simple case of prioritizing overhead vs. disk space: the former block prioritizes disk space, the latter overhead.

 

Also, 5-10x bigger might be a bit of a conservative estimate. I've compared files with the same filename, one vanilla and one regenerated, and sometimes the same block is 100x larger or more. Obviously this depends on how complex the geometry being baked into the file is.
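If you want to eyeball the size difference yourself, a quick comparison of matching filenames is enough. The directories below are hypothetical placeholders; point them at an extracted vanilla set and your regenerated set.

```python
import os

# Hypothetical paths: extracted vanilla precombineds vs. CK-regenerated ones.
VANILLA_DIR = r"C:\FO4\Extracted\Vanilla\Precombined"
REGEN_DIR = r"C:\FO4\Extracted\Regenerated\Precombined"

for name in sorted(os.listdir(REGEN_DIR)):
    vanilla_path = os.path.join(VANILLA_DIR, name)
    regen_path = os.path.join(REGEN_DIR, name)
    if not os.path.isfile(vanilla_path):
        continue  # only exists in the regenerated set, nothing to compare
    vanilla_size = os.path.getsize(vanilla_path)
    if vanilla_size == 0:
        continue
    ratio = os.path.getsize(regen_path) / vanilla_size
    print(f"{name}: {ratio:.1f}x the vanilla size")
```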

 

---

 

Edit: And as for the pausing, if this were on an HDD then I do imagine that the time to load the files would be longer than loading the metadata plus the time spent baking the meshes in real time. On an SSD I'm not so sure, especially NVMe ones. Putting the BA2s that contain precombineds, or the loose precombined files, onto a RAM disk could alleviate the stuttering if it is a stutter from disk reads.


Really good info there, jonwd7, awesome stuff. If that's true then that's good news; I love performance and I don't care if I fill a 500GB Evo to get it lol. Maybe that's why giggly sees so much performance improvement downtown with new gens, or at least partially so.

 

Edit: just to clarify, jonwd7, I'm running Fallout off a 960 Evo NVMe dedicated to it ;)

 

And thanks very much, VlitS and zilav, for that script. Shame I had to work 10 hours today lol, I would have spent that time playing with it. But if what jonwd7 says is true, I'm sticking with the full gens :P Xbox people will be really happy, though.


And just as a TL;DR to my post: if the different block does indeed hold the baked geometry instead of just metadata, then one can definitively say that it incurs less overhead while playing the game, because you are moving operations done on the metadata at runtime into the cost of baking instead.

 

What this does not necessarily mean is that less overhead = faster. This would have to be tested rigorously, and honestly I'm not sure the results would prove conclusive. The system barely makes a difference in some places anyway.

 

---

 

I'm actually a bit confused about the script myself, even though I'm the one who decoded the file side of it... :P If you remove unnecessary _OC files, how is file resolution done? Does it go backwards through your load order, checking every <Plugin.ESP>\Precombined\<blah>_OC.nif for a matching name? Or does it simply jump all the way down to vanilla? If it's the former, then I foresee potential issues: the filename in the lower-priority mod could match but not actually have the same content as vanilla. I don't know if this can happen via generation (same name but different content), but that doesn't preclude someone from manually editing their precombined files; one can manually tweak the positions of things in precombined files, for example.
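Here is a sketch of the fallthrough behaviour I'm asking about, purely to make the question concrete. This is not confirmed engine or script logic; the path layout just follows the <Plugin.ESP>\Precombined\<blah>_OC.nif pattern above, and all names are placeholders.

```python
import os
from typing import List, Optional

def resolve_oc_nif(load_order: List[str], meshes_dir: str,
                   nif_name: str) -> Optional[str]:
    """Walk the load order from highest to lowest priority and return the first
    plugin that ships a matching _OC.nif. A lower-priority plugin whose file
    merely shares the name (but not the content) would still win here, which
    is exactly the hazard described above."""
    for plugin in reversed(load_order):  # last-loaded plugin wins
        candidate = os.path.join(meshes_dir, plugin, "Precombined", nif_name)
        if os.path.isfile(candidate):
            return candidate
    return None  # nothing found: fall all the way through to vanilla

# Hypothetical usage:
# resolve_oc_nif(["ModA.esp", "ModB.esp"], r"C:\FO4\Data\Meshes", "00012345_OC.nif")
```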

 

Anyway, if that fallthrough example is possible (basically having _OC files from two or more different mods in the same cell), then what happens to the UVD files, which only know about the state of the cell from when the top-most mod baked it? You likely only have the masters plus your plugin loaded when baking, so the UVD really only knows about vanilla plus your mod.
