
DiodeLadder

Supporter
  • Posts

    294
  • Joined

  • Last visited



  1. Yeah, it's CullBone.bonename in the context of gun animations (for example, "CullBone.WeaponMagazine"). I suppose the engine handles weapons differently from AnimObjects? I think many of these annotations are context sensitive. Sorry about wasting your time.
  2. Honestly, the only annotations I know so far are the ones used in gun animations, and there aren't many of them. CullBone/UncullBone are used in the Lever Action Rifle reload (to hide/show cartridges) and in NPC reloads (they hide the entire mag). I've looked at your list, but yeah, without the contexts it's hard to know how to use them. I think the only way to learn these is by exporting XML files from the particular animations you are interested in.
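A quick way to survey which of these annotations a given animation actually uses is to grep the exported XML for them. A minimal sketch follows; the XML layout in the sample string is made up for illustration, and only the CullBone./UncullBone. naming comes from the posts above:

```python
import re

# Scan text exported from an animation (e.g. an hkx dumped to XML)
# for CullBone/UncullBone annotations. The exact XML layout varies
# by tool, so this simply greps for the annotation names themselves.
ANNOTATION_RE = re.compile(r"\b(CullBone|UncullBone)\.(\w+)")

def find_cull_annotations(text):
    """Return (annotation, bone) pairs found in exported text."""
    return ANNOTATION_RE.findall(text)

# Hypothetical exported snippet, just to show the shape of the output:
sample = '<annotation text="CullBone.WeaponMagazine"/> <annotation text="UncullBone.WeaponMagazine"/>'
print(find_cull_annotations(sample))
# -> [('CullBone', 'WeaponMagazine'), ('UncullBone', 'WeaponMagazine')]
```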
  3. Hi guys, if you are using 3DS Max, controlling visibility is much simpler. Here's how: 1) Select the object whose visibility you want to animate. 2) Open the Dope Sheet (Graph Editors -> Dope Sheet). 3) Select the object in the Dope Sheet and, from the "Edit" menu, select Visibility Track -> Add. This creates a visibility track. 4) Add keys to the track. Value 1.0 = visible, 0 = invisible. That's it. The exporter picks it right up and creates the necessary controllers for the Nif automatically. One issue, though: once you assign the BS Lighting FX shader, you can't preview the visibility animation in 3DS Max (it will work after the export, but there's no preview). You need to use 3DS Max's default materials while you create the keys. Controlling visibility in a Havok anim can be done via annotations: CullBone.**bonename** (make the bone invisible), UncullBone.**bonename** (make the bone visible)
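To make the stepped nature of those 1.0/0.0 keys concrete, here's a toy model of a visibility track in Python. The names and key layout are illustrative only, not the 3DS Max or Havok API:

```python
# Toy model of a visibility track: a list of (frame, value) keys,
# where 1.0 = visible and 0.0 = invisible. Engines typically hold
# the last key's value (stepped interpolation), which is what we
# sample here.

def sample_visibility(keys, frame):
    """Return the visibility value at `frame` with stepped interpolation."""
    value = 1.0  # objects default to visible before the first key
    for key_frame, key_value in sorted(keys):
        if key_frame <= frame:
            value = key_value
        else:
            break
    return value

keys = [(0, 1.0), (10, 0.0), (25, 1.0)]  # hide the object from frame 10 to 24
print(sample_visibility(keys, 15))  # -> 0.0 (hidden)
print(sample_visibility(keys, 30))  # -> 1.0 (visible again)
```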
  4. Think of Black Planes as objects that draw a black void; that's why you place them as the topmost objects. It's a very primitive system. Additionally, the "Hidden from Local Map" checkbox in the object reference window prevents objects from being included in the Local Map you see in the Pip-Boy.
  5. Ronin1971, you can still get older Blender versions. The main ones are of course the LTS versions (they get patches for two years): https://www.blender.org/download/lts/ I'm still on 3.3.x myself. Here's the link to all past release versions: https://download.blender.org/release/
  6. The fact that the Elric app has a "BC7 Threshold" setting tells me that Bethesda planned to use BC7 on some assets and opted not to in the released game, for whatever reason. What that means is up to interpretation, I guess. My guess is that it didn't quite work out on the console versions and on older PC GPUs. Personally, I have been using the Bethesda formats for environment assets and less important props, and BC7 for things that need to stand out. Sometimes you want uniformity with the vanilla assets, and other times you want your assets to stand out; testing both ways and deciding what's best for your project is the best approach, I think. If you are starting TODAY, though, remember that the next-gen update is coming soon. Considering that Starfield has much shorter loading times and better performance than Fallout 4 on my machine, I suspect the BC7 performance considerations may no longer apply after the coming update.
  7. While what Ronin said above is correct, in this particular case you need to be extra careful because an existing model is involved. The model needs to have exactly the same splits (you can merge vertices and mark those edges sharp while working, to make it easier to edit), and the triangulation needs to be done exactly the same way. You need the normal map used by the original model to work the same way on your modified model. Believe it or not, if the triangulation differs from how the normal map was originally baked on a curved surface, you will get weird-looking faces in game. Here's a very good video made by the former lead character artist at Eidos Montreal: This is just about the best video on YouTube on the subject. All her videos are pretty much in the league of "I can't believe this stuff is free", and well worth watching if you want to learn from a professional.
  8. I'd recommend Steinberg's Wavelab Elements. Wavelab has been around since we still used hardware samplers (musical instruments), and it's great for this kind of editing. The full version of Wavelab is pricey, as it has full authoring capability aimed at professional mastering houses, but the Elements version is much cheaper, and you can pick it up for around 50 bucks during sales. Here is what a Bethesda auto-firing SFX file looks like in Wavelab: The green markers are the loop start/end, and for each individual shot there are unnamed markers (the yellow ones), which I think are used to sync audio with the WeaponFire annotations in the hkx animation files. The last shot marker and the loop end marker are placed on the same shot, but 4 ms apart (as you can see in the top left section). 48 kHz, 16-bit, mono WAV; root = C3. The metadata says it was originally created with Sony Sound Forge 9.0, by the way (Sound Forge is now owned by Magix, not Sony). I think Sound Forge shows up in Humble Bundle sales from time to time, although I don't remember which version. I hope this helps.
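If you'd rather inspect those markers programmatically than in Wavelab, the standard RIFF "cue " chunk is easy to parse with the stdlib. A hedged sketch, not a full parser; note that loop points specifically may live in a "smpl" chunk instead, depending on the editor that wrote the file:

```python
import struct

# Minimal sketch of reading marker positions from a WAV's standard
# RIFF "cue " chunk. Each cue point is a 24-byte record whose last
# field is the sample offset of the marker.

def read_cue_points(data):
    """Return the sample offsets of all cue points in a WAV byte string."""
    assert data[:4] == b"RIFF" and data[8:12] == b"WAVE"
    pos = 12
    offsets = []
    while pos + 8 <= len(data):
        chunk_id, size = struct.unpack_from("<4sI", data, pos)
        body = data[pos + 8 : pos + 8 + size]
        if chunk_id == b"cue ":
            (count,) = struct.unpack_from("<I", body, 0)
            for i in range(count):
                # sample offset is the last of six 4-byte fields
                (offset,) = struct.unpack_from("<I", body, 4 + i * 24 + 20)
                offsets.append(offset)
        pos += 8 + size + (size & 1)  # chunks are word-aligned
    return offsets
```

Usage would be `read_cue_points(open("gunfire_loop.wav", "rb").read())` (the filename is hypothetical), returning something like a list of per-shot marker positions in samples.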
  9. Hello Issacreaper2, those split edges are the result of sharp/hard edges being exported in a way a game engine can handle. For example: the edges marked as "sharp" here (the light blue edges) will need to be split in the game engine, so that when smooth shading is applied to the whole thing, these edges are not affected by it. Because sharp edges increase the vertex count in engine (vertices along a sharp edge get duplicated), you may need to be careful about using them in older games. I'm using sharp edges everywhere here, though, since my assets are lightweight anyway. Wherever you have sharp/hard edges, you'll need UV seams on them, so that the baker handles those edges correctly when baking normal maps. I hope this helps.
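The in-engine cost of sharp edges can be sketched numerically: at each vertex, incident faces that share a non-sharp edge can share a smoothed normal and thus the vertex, while every separate group of faces needs its own copy. A toy calculation, not any exporter's actual logic:

```python
from collections import defaultdict

# Count how many vertices a mesh needs in-engine once sharp edges
# force splits, using a small union-find over the faces around each
# vertex (faces merge when they share a non-sharp edge).

def engine_vertex_count(faces, sharp_edges):
    """faces: vertex-index tuples; sharp_edges: set of frozenset pairs."""
    incident = defaultdict(list)    # vertex -> incident face ids
    edge_faces = defaultdict(list)  # edge -> face ids sharing it
    for fi, face in enumerate(faces):
        for i, v in enumerate(face):
            incident[v].append(fi)
            edge_faces[frozenset((v, face[(i + 1) % len(face)]))].append(fi)
    total = 0
    for v, fis in incident.items():
        parent = {fi: fi for fi in fis}
        def find(x):
            while parent[x] != x:
                x = parent[x]
            return x
        for e, efs in edge_faces.items():
            if v in e and e not in sharp_edges and len(efs) == 2:
                parent[find(efs[0])] = find(efs[1])
        total += len({find(fi) for fi in fis})  # one copy per group
    return total

# A cube: 8 vertices, 6 quads, 12 edges.
cube = [(0, 1, 2, 3), (7, 6, 5, 4), (0, 4, 5, 1),
        (1, 5, 6, 2), (2, 6, 7, 3), (3, 7, 4, 0)]
all_edges = {frozenset((f[i], f[(i + 1) % 4])) for f in cube for i in range(4)}
print(engine_vertex_count(cube, all_edges))  # all edges sharp -> 24
print(engine_vertex_count(cube, set()))      # fully smooth     -> 8
```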
  10. Texel density is measured in texture pixels per unit. I was just trying to say "try maximizing the UV island sizes", so nothing fancy really. The concept of texel density, however, is very important when creating 3D assets. For example, the UV islands for a receiver and a buttstock should ideally have the same texel density even if they are on separate materials, so that both parts have the same texture resolution on screen and one isn't blurrier than the other. Here's an example: here I have models with separate materials - receiver (4k texture), muzzle devices (2k), buttstock (2k), etc. - at more or less the same density (not perfect, but close enough, lol). The "color grid" image included in Blender is very handy for keeping an eye on texel density. There's a free Blender addon for matching texel densities across multiple parts, so you might want to check that out: https://mrven.gumroad.com/l/CEIOR Personally, I use the ZenUV addon, but it's a paid product: https://sergeytyapkin.gumroad.com/l/zenuv4?layout=profile ZenUV is amazing, though, and totally worth your money. Take a look at their demo videos.
In first-person shooters, you may want to consider how much screen real estate an asset will occupy. For example, a rear iron sight may need more polygons and higher texel density, since it may occupy a large part of your screen while aiming. The example above doesn't have such treatment because I was already packing a lot into one map, but it's something to consider, especially if you have a zoomed-in aim-down sight. Speaking of UV packing, this addon for Blender is really great to have: https://glukoz.gumroad.com/l/uvpackmaster3 I have the older version, which was cheaper; it looks like he raised the price.
Anyway, I think Bethesda's 10 mm pistol heavy barrel could serve as a nice example of how to do UV mirroring and use out-of-range UV coordinates for the mirrored islands. By "mirroring" I don't mean splitting the mesh at the center in this case, since symmetric top and bottom parts would have a "butterfly" look if done that way. The Bethesda artist mirrored the UVs only on the sides, and offset one side outside the regular UV coordinates (+1 on the X axis), so that when baking the normal map from high to low poly, one side would be ignored by the baker. The UV for the Eyebot is a great example as well (by using a lot of UV mirroring + overlapping, the Eyebot's 2k texture pretty much has the density of the 4k maps used on other assets), but in that asset it looks like the artist moved the out-of-bounds UV islands back to regular coordinates after baking. About Substance Painter, I have never seen it crash personally. I use the Steam version from 2019. ...It's gotten kind of long, but I hope someone finds it useful.
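The pixels-per-unit idea above can be written down directly: density is the square root of the UV-area-to-world-area ratio, times the texture resolution. A small sketch, assuming a square texture:

```python
import math

# Rough texel-density check for a single triangle: how many texture
# pixels per world unit its UV mapping yields.

def texel_density(tri_3d, tri_uv, tex_res):
    """Pixels per world unit for one triangle (square texture assumed)."""
    (ax, ay, az), (bx, by, bz), (cx, cy, cz) = tri_3d
    # world-space area = |cross product| / 2
    ux, uy, uz = bx - ax, by - ay, bz - az
    vx, vy, vz = cx - ax, cy - ay, cz - az
    cross = (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)
    world_area = math.sqrt(sum(c * c for c in cross)) / 2.0
    # UV-space area = |2D cross product| / 2
    (au, av), (bu, bv), (cu, cv) = tri_uv
    uv_area = abs((bu - au) * (cv - av) - (cu - au) * (bv - av)) / 2.0
    return math.sqrt(uv_area / world_area) * tex_res

# A 1 m x 1 m right triangle mapped to half of a full UV tile on a
# 2048 texture gives 2048 px/m, i.e. "full" density for that map:
tri = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
uv = [(0, 0), (1, 0), (0, 1)]
print(texel_density(tri, uv, 2048))  # -> 2048.0
```

Shrinking the UV island to a quarter of the tile area would halve the result, which is exactly the "one part blurrier than the other" mismatch described above.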
  11. DDS = DirectDraw Surface. DDS is part of DirectX and a Microsoft standard. For conversion, Bethesda used Microsoft's texconv.exe, with Bethesda's Elric app acting as the front end. You can find it in the Tools folder (the same folder where Elric is located). Elric takes the .TGA format and converts it to the DDS format of your choice. The TGA image format is often used in the game industry because of its uncompressed nature, and you can see Bethesda using it as the export format from the CK, for example. The vanilla game textures are:
_d = Diffuse (+Alpha): BC1 (no alpha), BC3 (with alpha)
_n = Normal: BC5 (Blue channel removed)
_s = Specular (Red channel) + Gloss (Green channel): BC5
Note that Bethesda uses the DirectX convention for normal maps, not OpenGL (which is used by Blender, for example). To convert the OpenGL format to DirectX, you need to flip the Green channel. Bethesda's Fallout 4 normal maps do not contain the Blue channel (I assume this was done for better compression when creating the .ba2). The _s texture files actually contain two separate maps, Specular (metallic) and Gloss (smoothness). An RGBA image file is better seen here as a container for four separate 8-bit numeric values; game devs often pack multiple texture maps into one file to reduce the number of files loaded in game. The DXT designation was deprecated in DX11, when Microsoft started using BC#. The old NVIDIA plugin for 32-bit Photoshop doesn't even import the Bethesda normal maps correctly (it adds the Blue channel on import, and it appears the wrong color profile gets applied). That plugin predates the current BC formats and is best avoided for Fallout 4. Anyway, in Elric you'll find the Texture Conversion section in the middle of the left column. At the top you'll see "Force Format", where you can set the DDS format of your choice. Check "Generate MipMaps", and keep "Quartering Threshold" and "BC7 Threshold" at "NONE". "Quartering" here means it creates a smaller resolution out of a larger one (input 4k -> output 2k, for example). Elric can batch convert TGA files to DDS. As for the texture compression format, the safest bet is what Bethesda used. For my own project, though, I've been using BC7 (d/n/s) for weapons/clothing and the vanilla formats for environment/props, as I feel BC7 pops out more against the vanilla formats, which are noisier to my eyes. In any case, you may want to maximize the texel density of your textures, so that if you have to lower the resolutions for consoles in the end, they will survive the conversion. Bethesda used a lot of UV mirroring/overlapping to maximize texel density, and their assets are a great source of learning in this regard. Reducing the number of material files loaded, by combining multiple elements where possible, can be a good idea (each material file adds draw calls in game). I hope this helps. Good luck.
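The Green-channel flip mentioned above (converting an OpenGL-style normal map to the DirectX convention) is simple enough to sketch on raw pixel data. In practice you'd do it in an image editor or with a converter switch such as texconv's -inverty; this is just to show what the operation is:

```python
# Invert the G channel of 8-bit RGBA pixels: g' = 255 - g.
# This converts between the OpenGL and DirectX normal-map
# conventions (the Y axis of the tangent-space normal flips).

def flip_green(pixels):
    """Invert the G channel of a list of (r, g, b, a) 8-bit pixels."""
    return [(r, 255 - g, b, a) for (r, g, b, a) in pixels]

flat = [(128, 128, 255, 255)]  # a "flat" normal pointing straight out
print(flip_green(flat))        # -> [(128, 127, 255, 255)]
```

Note that a perfectly flat normal barely changes (128 vs 127) because it sits at the midpoint; only tilted normals move visibly.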
  12. Hello Andyno, Yeah, it was your project I was thinking of, the Zone. I'm glad you got it done!
  13. I think I should have worded my comment about AMD better. I am actually running a 3-monitor setup with a 6700XT, and it's rock solid NOW, but the drivers before late July this year had problems with 2+ monitor setups. I had the same issue with the RX580 - for the first two years I owned it, the drivers had issues with multi-monitor setups, but they got resolved. AMD really sucks at writing good drivers at card launch, but once the drivers are solid, their cards are good. I'm grudgingly happy with my AMD card for now, lol. The DPC latency issue on NVIDIA cards may also have been resolved in the very recent drivers, so you may want to check that. DPC latency only matters in a DAW if you need very low-latency monitoring while recording under high CPU load - which can be necessary if you are using software instruments running in real time, but not if you are just editing sound effects. How important this is really depends on how much you are willing to push your system. With current CPUs, though, it's highly unlikely you'll get anywhere near maxing out CPU usage in a DAW. Going NVIDIA probably won't cause problems today. 64 GB RAM - in the context of music production, that much RAM is only necessary if you are using a full orchestral sample library template with multiple microphone positions turned on. If you are doing regular pop/rock/EDM type stuff, you really only need 16 GB.
  14. Hello jcdenton2012, I don't think you can fix this. I remember another member who made the same mistake a few years ago (his map was very large, too), and the answer he got back then was also "no". If you don't mind redoing the whole thing, perhaps you could group buildings and such into static collections/pack-in objects, and organize them with area-name prefixes so that you can work fast when recreating the locations. Or.... maybe you could just change your project's location to Seattle/Vancouver/Vancouver Island and keep the map.
  15. Hello SonAnima, my machine is far from the high-end stuff in 2023, but here it is: i7 2600K (OC'd to 4.2 GHz), 32 GB RAM, Radeon RX 6700 XT (12 GB VRAM). About Substance Painter: I've seen it take up a lot of RAM, but I don't think I've seen it approach anywhere close to 32 GB in my experience. I do use lots of layers, and perhaps I'm not the most efficient user, but I think 32 GB is okay for the stuff I do. I use an AMD GPU mostly because NVIDIA cards tend to have higher DPC latency. My PC is primarily used for music/audio work, and high DPC latency would be problematic in that context. Also, NVIDIA drivers tend to be not so great at supporting OpenGL, and I've seen some plugins have issues in the past. AMD, however, is terrible at supporting multiple monitors, and their GPUs tend to have weird issues like second-monitor flickering or system instability. AMD drivers have been very good since this summer, but that hasn't always been the case. If you are primarily a graphics artist, I'd recommend going NVIDIA. Blender, Substance, Autodesk stuff - they all have full NVIDIA GPU support. AMD is pretty much not competing in the graphics artists' space.