hexabit

Premium Member
  • Posts: 22

About hexabit

Profile Fields

  • Country: United States

  1. (Assuming you are also the one in the NifTools Discord discussing this matter) If you were working from the PR for dev7 like I suggested, you wouldn't have had to re-enable anything. That option was disabled to prevent people getting (at the time) dev4 for FO4, seeing the option, and believing they could use it. I only wrote export for BSTriShape, not import. "Works really well" is quite an overstatement. The code is riddled with issues, and you likely didn't correct all the things I did in the PR, like adjusting for nif.xml changes. You're probably also not using the correct XML, as it is likewise in PR status right now and the development version only comes with my builds on Discord. I will (finally) be releasing dev7 in a week or so, as it's gone through months of feature creep and testing.

     AFAICT you also did not heed my warning about your bhkRigidBody being corrupt. Your NIF version is 130 (FO4), but for bhkRigidBody that matches the "> 83 and 100" (Skyrim and SSE) condition, which nif.xml processes as "also Skyrim". However, the bhkRigidBody that Max exports has the Cinfo properties and data alignment of FO3 (34). Since bhkRigidBody is not technically a valid block in version 130 (FO4), I don't want to muck up the nif.xml version conditions solely for pre-Elric FO4 NIFs. I took a look at your bhkRigidBody in the supplied NIFs and indeed the info is wrong. You created the block with NifSkope and then filled in values, but that means the data alignment is that of Skyrim and not FO3, so Elric is actually reading the wrong values for things like Friction, Restitution, etc., because the data becomes misaligned after Time/Gravity Factor. So, I repeat: the data is completely wrong after Time/Gravity Factor, and you have to manually realign it (see the sketch below).

     I also brought up the issue of MOPP to you, and I don't see any kind of warning about it issued here. Anything beyond primitives/convex shapes in FO3 and Skyrim needs a bhkMoppBvTreeShape as the first shape, with the actual shape data below it. MOPP is a program which accelerates raycasting for each shape/face/triangle in the shape or shape collection. You can technically use collision without MOPP, but it will be monstrously slow. We tested on both FO3 and Skyrim by stripping their MOPP data, and FPS plummets and it becomes a stutter-fest. I brought up that you need to verify whether or not the lack of a MoppBvTreeShape with valid MOPP data has a negative impact on the Havok binary produced by Elric after conversion, i.e. does it also lack the acceleration, or did it hopefully remake it.
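     To make the misalignment concrete, here is a minimal Python sketch. The field layouts are schematic stand-ins (the real bhkRigidBody definitions in nif.xml have many more fields); the only point is that one extra float before Friction shifts every later read.

     ```python
     # Schematic only: NOT the real bhkRigidBody layouts from nif.xml, just
     # enough fields to show the shift after Time/Gravity Factor.
     import struct

     # Values as a Skyrim-aligned writer (e.g. NifSkope on a v130 NIF) stores
     # them: Time Factor, Gravity Factor, Friction, Restitution.
     raw = struct.pack("<ffff", 1.0, 1.0, 0.5, 0.4)

     # A reader expecting the FO3 (34) alignment has no Gravity Factor field,
     # so Friction and Restitution land one slot early.
     time_factor, friction, restitution = struct.unpack("<fff", raw[:12])

     print(friction, restitution)  # 1.0 0.5, not the intended 0.5 0.4
     ```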
  2. @VIitS Yeah, I brought up XPRI and _Physics in the *Edit thread on AFKmods, but I forgot to carry that over here. I've seen the HavokRoot node in tons of _OC files. I think a good rule is that if an _OC file (vanilla format) is >10KB, there is a good chance that it has a HavokRoot. Because of the prevalence of HavokRoot, I even thought that the physics from precombined references had to be in the _OC and that the _Physics was for other things, and after what you said, now I'm not so sure. I originally thought that the XPRI list was what goes into _Physics, but the lists for those do seem rather paltry. I have viewed the _Physics in CK Preview before with the collision wireframe, and it did seem to be a lot more than the XPRI lists, which are generally activators, FX and such. What I do notice is that what is in XCRI is never in XPRI and vice versa, so I guess the official way of saying what is in _Physics would be (XCRI + XPRI - the HavokRoots from the _OC files). I can see no real rhyme or reason for there being a HavokRoot instead of it going into the _Physics file, however. I see a lot of trees and stairs in the HavokRoots, but a lot of random objects too.

     Have you discovered the method to the number of _OC files that you get out of a cell? There is obvious spatial partitioning going on so as to keep the meshes that get combined close together; otherwise the bounding sphere that contains them can be as big as or larger than the cell. But then every cell should have 16 or 64 _OC files. Instead, the average seems closer to 32. Maybe 2x4x4, to make sure things are also vertically close? (I sketch what I mean below this post.) Basically, other than such a spatial partitioning scheme, is there any reason for the separate files that you are aware of? I also wish I was able to figure out the reason for the naming in the second section of the filename. Not to mention the hashes in the ExtraData block that actually point to the meshes.

     --- ...As for SMIM, I still think it's in the category of "Virtually Impossible" while keeping the precomb system on. I've long professed this by saying that it necessitates regenerating the precombined for every cell in the game, thus including 2GB+ of baked data. What I didn't know then was that the files we can generate are 10x larger than vanilla. Given the scope of an SMIM-like project, it's very unlikely that 90% of the _OC files would be unnecessary to package (20GB->2GB), so it still ends up worse than my original claims. Let's be hopeful and say that you can get the 20GB of data down to 4GB. Even before adding in the actual NIF files that you are replacing, that's one of the largest mods ever. Not to mention the sweeping incompatibilities this causes with anything else needing regenerated previs. I'm not even sure that the CK can regenerate the precombined for all affected cells when you replace an extremely common NIF (say a garbage pile NIF that has 7000 uses across nearly every cell). In the past the CK typically just crashes when given that kind of workload. :smile: Also, each time you change a single vertex or poly on an SMIM NIF, you may need to regenerate all 20GB+ of the precombined**, depending on the prevalence of the NIF in ref usage. For example, if you tweak a NIF with just a few uses, you only need to regenerate precombined for a few cells. If you tweak a NIF used in every cell, you need to regenerate the whole game.
     (**: Both versions of the data block in _OC files are directly reliant on the vertex and triangle counts, in addition to the LOD level information.)

     Given this, I think it would be much simpler to programmatically uncombine all references that use said super-common NIF with something like your script. I know that your script can uncombine a reference manually, but can it uncombine all references across all cells which use a certain NIF (or a certain STAT, etc.)? That would be extremely useful in this case. I guess the only downside is possibly doubled-up physics? SMIM-like mods normally don't change the collision shape anyway, and you would only uncombine the references which have thousands of uses per NIF, so I don't really see the big deal of having the physics loaded from both the precomb and the uncombined NIF, given it dramatically cuts down on how many affected cells have to be regenerated when you edit said NIFs.

     Taking that into account, you're basically creating a blacklist for certain NIFs, and any refs that use the blacklisted NIFs should somehow be considered totally vanilla when baking precombined. I forget the exact wording in the CK menus, but it's something like "regenerate precombined for all affected cells", and I'm assuming this looks at any loose files you have and determines which cells the overridden NIF is used in. In that case, I guess you can just move the NIF files that you plan on uncombining out of the Data folder. My reasoning for bothering to do this is also filesize. An SMIM-ified NIF file is probably going to be way bigger, and if the CK would otherwise be sticking that data into the _OC files, keeping those NIFs out means the _OC files stay a lot smaller because they'd still be vanilla.

     So, to summarize my ideas: the more super-commonly used NIFs that you uncombine (and keep out of the Data folder when baking), the:
       • Smaller the total size of the regenerated _OC will be, given that SMIM-ified files will be way larger (and that size difference gets multiplied by use count).
       • Fewer the cells with any non-vanilla NIFs in them that need to regenerate, meaning less time regenerating it all.
       • Fewer the times you have to regenerate precombined at all when editing a file's geometry. If it's a permanently uncombined file, you do not need to regenerate the precombined to get it working correctly in game.

     The only issue I can think of at the moment is that of initial setup. There may be a lot of cells where you only have one SMIM NIF at the time, and if it's an "uncombined/blacklisted" NIF you still need the cell record in your plugin to uncombine it. Also, I'm not certain that just moving the files out of Data works for the way the CK determines what cells are "affected" when you regen precomb in that way. I should really go into the CK sometime, as it's been so long I forget the exact terminology. :tongue:
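     Since I keep hand-waving about the partitioning, here is a rough Python sketch of the 2x4x4 guess. The 4096-unit exterior cell edge is real, but the grid shape, the vertical slab, and the one-_OC-per-non-empty-bucket rule are all assumptions on my part, not decoded behavior.

     ```python
     # Hypothetical partitioning: 4 x 4 x 2 = 32 buckets per cell.
     from collections import defaultdict

     CELL_SIZE = 4096.0      # exterior cell edge length in game units
     GRID_XY, GRID_Z = 4, 2
     Z_SPAN = 4096.0         # assumed vertical slab; cells have no real Z bound

     def bucket_index(pos, origin):
         """Map a ref position to an (x, y, z) bucket within its cell."""
         gx = min(int((pos[0] - origin[0]) / CELL_SIZE * GRID_XY), GRID_XY - 1)
         gy = min(int((pos[1] - origin[1]) / CELL_SIZE * GRID_XY), GRID_XY - 1)
         gz = min(max(int(pos[2] / Z_SPAN * GRID_Z), 0), GRID_Z - 1)
         return gx, gy, gz

     def partition(refs, origin):
         """Group combinable refs; each non-empty bucket would be one _OC file."""
         buckets = defaultdict(list)
         for ref in refs:
             buckets[bucket_index(ref["pos"], origin)].append(ref)
         return buckets  # at most 32 files; sparse cells produce fewer
     ```

     A scheme like this would also explain why the count floats around 32 instead of pinning at 16 or 64: empty buckets simply never get a file.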
  3. And just as a TL;DR to my post: if the different block does indeed hold the baked geometry instead of just metadata, then one can definitively say that this incurs less overhead while playing the game, because you are moving operations done on the metadata at runtime into the cost of baking instead. What this does not necessarily mean is that less overhead = faster. This would have to be tested rigorously, and honestly I'm not sure the results will prove conclusive. The system barely makes a difference in some places anyway.

     --- I'm actually a bit confused about the script myself, even though I'm the one who decoded the file side of it... :P If you remove unnecessary _OC files, how is file resolution done? Does it go backwards through your load order, looking at every <Plugin.ESP>\Precombined\<blah>_OC.nif for the matching name? Or does it simply jump all the way down to vanilla? If it's the former, then I foresee potential issues (a sketch of the lookup I mean is below). The filename in the lower-priority mod could match but not actually have the same content as vanilla. I don't know if this can happen via generation (same name but different content), but that doesn't preclude someone from manually editing their precombined files; one can manually tweak the positions of things in precombined files, for example. Anyway, if that fallthrough example is possible, basically having _OC files from 2+ different mods in the same cell, then what happens to the UVD files that only know about the state of the cell from when the top-most mod baked it? You likely only have the masters + your plugin loaded when baking, so the UVD really only knows about vanilla + your mod.
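     For clarity, this is the kind of fall-through I'm picturing. The directory layout just mirrors the path above, and "first match wins" is my assumption, not confirmed behavior of the script or the engine.

     ```python
     # Assumed fall-through: the highest-priority plugin that ships a matching
     # <Plugin.ESP>\Precombined\<name>_OC.nif wins; otherwise fall back to vanilla.
     from pathlib import Path

     def resolve_oc(data_dir, load_order, oc_name):
         for plugin in reversed(load_order):  # end of load order = highest priority
             candidate = data_dir / plugin / "Precombined" / f"{oc_name}_OC.nif"
             if candidate.exists():
                 # A matching filename does NOT guarantee vanilla-identical content.
                 return candidate
         return None  # nothing loose: falls through to the vanilla archives
     ```

     If that is really what happens, the UVD problem follows immediately: the winning _OC can come from a mod the UVD never saw at bake time.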
  4. The files being 5-10x larger does not necessarily explain the stutter/pause. BSPackedCombinedSharedGeomDataExtra (the vanilla block used in precomb) is metadata; there is no geometry data. What it does is reference the actual refs and their NIFs in some way that none of us have yet decoded (some kind of hash), then go through the metadata which defines said geometry in 1-N positions/rotations/scales, and then load and copy the geometry from the base mesh into those positions/rotations/scales. BSPackedCombinedGeomDataExtra, on the other hand (the block our precombs use), after a cursory glance appears NOT to be solely metadata, and may in fact just be the fully baked geometry data (or partially baked geometry data). This means that the new block the CK makes for us actually has less overhead, because at load time (load screen or streaming) it doesn't have to look through the metadata, find the original geometry, then bake the combined geometry into a new mesh. I say this because I looked through the byte patterns and I see clear triangle lists. What I do not know yet is whether the original NIF's geometry data has just been copied into this block so as to save the lookup, or whether the geometry data I'm seeing is the baked geometry data, i.e. the original mesh placed in 1-N arrangements based on how many refs are being combined.

     The reason that they did not use the latter block for all vanilla files is obvious: disk space. Baking the geometry data into the files would save on loading/streaming overhead but take up a ton of disk space. For modders it matters a lot less. Modding is still mostly PC-centric, especially large mods that change so much they need to regenerate precombined. We can take the disk space hit, but consoles can't. So after I actually decode BSPackedCombinedGeomDataExtra I'll know for sure, but based on the byte patterns I'm 90% sure there is geometry being held in the block whereas previously there was none. If so, this is a simple case of prioritizing overhead vs disk space: the former block prioritizes disk space, the latter overhead. Also, 5-10x bigger might be a bit of a conservative estimate. I've compared files with the same filename where one is vanilla and one is regenerated, and sometimes the same block is 100x larger or more. Obviously this depends on how complex the geometry is that is getting baked into the file.

     --- Edit: And as for the pausing, if this were on an HDD then I do imagine that the time to load the files would be longer than loading the metadata + the time spent baking the meshes in realtime. On an SSD I'm not so sure, especially NVMe ones. Putting the BA2s which contain precombined (or the loose precombined files) into a RAMdisk could alleviate the stuttering if it is a stutter from disk reads.
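     To put rough numbers on the tradeoff as I understand it, here is a toy Python contrast. These are illustrative structures, not the real BSPackedCombined* layouts, and the counts are made up.

     ```python
     # Toy contrast: metadata-only placement at load time vs geometry baked at
     # generation time. Not the real BSPackedCombined* layouts.
     import numpy as np

     base_mesh = np.random.rand(1000, 3).astype(np.float32)  # one shared source mesh

     # 50 combined refs, each just a placement (random translations here).
     transforms = []
     for _ in range(50):
         m = np.eye(4, dtype=np.float32)
         m[:3, 3] = np.random.rand(3) * 4096.0
         transforms.append(m)

     # "Shared" metadata path (vanilla block): store only the 50 transforms on
     # disk (~3 KB), then look up, transform, and copy the base mesh per ref at
     # load time.
     def place_at_load():
         return np.concatenate([base_mesh @ m[:3, :3].T + m[:3, 3] for m in transforms])

     # "Baked" path (what the byte patterns suggest): the same work done once at
     # generation time, with the result (~600 KB of vertices) stored on disk, so
     # loading is a straight read.
     baked = place_at_load()
     ```

     Even in this toy case the baked data is ~200x the metadata, which is in the same ballpark as the 100x+ block-size differences I've seen.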
  5. Can you please remove this from your post? As I already told you in the private CK beta forum, this is completely incorrect. It compresses just fine, and it is not "old and not supported" considering Archive.exe is what the CK calls on the command line when packing the BSAs. As you already shared with me in the private forum, you simply didn't know that you had to check the option "Compress Archive". And as far as conversion goes, it's completely irrelevant, as anyone can unpack their Skyrim BSAs with any number of programs and repack them using Archive.exe.
  6. At the time of release, my extractor produced identical files to ba2extract. I verified this on both general and texture BA2s, for each file in the archive, using a binary diff tool, and I'd like it if you could verify this yourself. If they actually differ for you, I'd like to know. As of 0.02 I now just treat BC5U as ATI2, as ba2extract can be told to do, but this is still wrong, at least in that the only tool which can really work with it doesn't display it correctly. I posted about that here.
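     If you want to reproduce the check, something like this is all it takes. The directory names are placeholders for wherever you extracted each tool's output.

     ```python
     # Hash every file in two extraction trees and report any path that differs.
     import hashlib
     from pathlib import Path

     def tree_hashes(root):
         return {p.relative_to(root): hashlib.sha256(p.read_bytes()).hexdigest()
                 for p in root.rglob("*") if p.is_file()}

     a = tree_hashes(Path("out_mine"))        # this extractor's output
     b = tree_hashes(Path("out_ba2extract"))  # ba2extract's output
     for rel in sorted(a.keys() | b.keys()):
         if a.get(rel) != b.get(rel):
             print("DIFFERS:", rel)
     ```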
  7. I will have a GUI version for extracting BA2s up on here shortly, which I couldn't have done without ianpatt. :) I'm just waiting on this UI library to compile statically so that I don't have to include the DLLs.
  8. Is that the correct link? I don't see anything about the .ba2 files there. It says "ripped" from the .ba2, but I believe the filepath strings are fully visible in the file with a hex editor, which doesn't mean that he actually unpacked the file. Edit: I have an extractor ready to go, and I'm just waiting to be told what the adjustments to the file header are, as well as the layout of the records. I have been told by the F4SE team that it's only a slight modification of BSA.
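     You can see those paths for yourself without unpacking anything; here is a quick strings-style scan. The 8-character minimum and the path filter are arbitrary choices, and the archive name is just an example.

     ```python
     # Scan a .ba2 for printable ASCII runs that look like file paths.
     import re

     with open("Fallout4 - Meshes.ba2", "rb") as f:  # example archive name
         data = f.read()

     for m in re.finditer(rb"[ -~]{8,}", data):      # runs of 8+ printable chars
         s = m.group().decode("ascii")
         if "\\" in s and "." in s:                  # crude "looks like a path" filter
             print(s)
     ```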
  9. No mod "causes" it; it's a vanilla bug. It's more common with mods because they are running the types of effects on the player which exacerbate the issue. The most commonly reported FX bugs have been fixed in USKP 1.3.0: the scripts have been edited to short-circuit if they are erroneously run on the player, which the vanilla game seems to love doing. The ones fixed were Draugr Eye FX, Ice Wraith FX, Steam Centurion FX, and Spriggan FX. I didn't think I was going to have to do many more, but apparently everyone has gotten every type of FX attached to them... More may be fixed in later USKP versions.