Everything posted by DiodeLadder

  1. Hello SonAnima, I'm sure you've followed the Rizzler's tutorial, but that's not working for you? I don't think I do anything differently here to get the annotations to work. Just to be sure, are you selecting the "Root" (in schematic view, you can find it in the top left area) when editing the annotations?
  2. Hi guys, those files are in "Fallout4 - Interface.ba2".
  3. Honestly, I don't remember. I said "a couple of vanilla guns", but I can't say I've tested everything, since I'm more of a first person player. It's been a long time since I did that testing, but I think it was the Deliverer that had the problem with third person aimed shots landing a little off to the left of the reticle, and another was also a ballistic weapon. I wouldn't be surprised if there were more guns behaving like that in third person, though, and the Musket is one of them.
  4. Karna, just so that there's no misunderstanding, what I wanted to say was: 1) There are a couple of unmodded Bethesda guns that shoot off to the left of the reticle in third person. I suspect this is caused by wrong metadata (AnimTextData) in the shipped game. You can see what this metadata is at the bottom of this page: https://falloutck.uesp.net/wiki/Subgraphs I say this because I can take the same Bethesda animations and use them with freshly generated metadata without any issues. 2) There are Bethesda gun models with iron sights not properly aligned on the X axis, and they need to be tilted slightly in the animations to look aligned in game, in first person view. So, I'm not saying that the animation affects the aiming mechanics of the game.
  5. Thanks South, that's interesting! In the case of weapon 3rd person anims, though, they are layered by nature, unlike the animObject use. For example, a regular pistol 3rd person animation would have 1) the walking / running animation, 2) the base pistol archetype animation (which takes over the upper body), and 3) an "assembly pose" which specifies how a particular gun is held for use with the #2 base pistol animation. These layers are blended using layer masks (which control which part of a layer gets blended into the main anim - for example, only the right arm layered on top of another base anim), and most of us working on gun mods are only adding the assembly pose, which is literally a pose made of 1 frame. We don't have much control over the 3rd person animations when working like this. So I think the issue with using your idea on weapons would be that blending and transitioning of multiple layers may not work correctly without the TextData, and also, I wouldn't go that deep and override the base anims just to make use of the annotation hack you suggest. I'd rather not mess with how Bethesda did things, because this part is frustrating enough as it is, lol. 3rd person gun animations actually involve a fair amount of guesswork as to which parts we have control over. For example, it took me a while to realize that I can't change the rotation of the right hand in a pistol assembly pose, because the hand rotation is hardwired into the base animation for recoil handling. Totally off topic. Sorry guys.
  6. I do suspect that is very much the case, too. I'm trying to think of any other peculiarities with the gun mechanics in this game, but what I've said already is all I can think of at the moment.
  7. CrowOfItachi, I saw the video, and I can tell you that that's definitely not the vanilla game behavior. My game is pretty much near vanilla because I don't want any other mods affecting my projects, and as I've said in my posts, aimed shots always go where I'm pointing.
  8. Karna, I'm sorry, but I'm not seeing what you are saying in your video, or in my video. Here are my findings from my own testing:

      Third person shots actually land a little to the left of the reticle on a few Bethesda guns in the unmodded game, if I remember correctly, but I think that is due to wrong AnimTextData for those particular guns (they probably updated the animations for those guns without updating the AnimTextData - this does not affect first person). In third person, the game actually calculates the bullet trajectory from the center of the screen, as if the shooter were at the center of the screen rather than where you see the player character model (this only affects shots when the target is very, very close). Also, there are a few Bethesda gun models with iron sights not properly aligned on the 3D models, but compensated for in the first person animations (which is the reason why using the Deliverer as the base for animating a custom gun can make it really difficult to align the sight properly in game).

      I've tested the accuracy of aimed shots on guns extensively, and aimed shots always go where I'm pointing. In my video, I've actually made some 60-80 m shots to demonstrate this. Aimed shots are definitely not luck based. I happen to be showing my own custom gun in the video, but the aiming works the same way for any vanilla gun with no mods used. If aimed shots had a cone of accuracy, or a "luck based system" (which I don't think exists for aimed shots), I'd have noticed while testing.

      I disagree with all of you, lol, but this is after countless hours of testing guns in FO4 specifically on this subject. I do agree that Far Cry is much easier, though, but I think that is because the hit box in that game seems to be more forgiving while characters are moving. In FC6, I found it ridiculously easy to make headshots, but that is because the game is compensating for my shots and not because I'm any good, lol.
  9. I might have much lower standards than you guys do, but I think the aiming in first person is just fine. This is a very old video I made before I learned to do my own animations, but here I test a few pretty long distance shots using iron sights only. Hip fire has a cone of accuracy, but aimed shots don't. At least I've never noticed it.
  10. I might not be thinking this thoroughly, but can you treat the camera as a weapon? "weaponFire" can trigger a muzzle flash, and that might be something you could use?
  11. Yeah, CullBone.<bone name> is how it's used in the context of gun animations (for example, "CullBone.WeaponMagazine"). I suppose the engine handles weapons differently from AnimObjects? I think many of these annotations are context sensitive. Sorry about wasting your time.
  12. Honestly, the only annotations I know so far are the ones used in gun animations, and those are not many. CullBone/UncullBone are used in the Lever Action Rifle reload (to hide/show cartridges) and in NPC reloads (they hide the entire mag). I've looked at your list, but yeah, without the contexts, it's hard to know how to use them. I think the only way to learn these is by exporting XML files from the particular animations you are interested in.
  13. Hi guys, if you are using 3DS Max, controlling visibility is much simpler. Here's how:
      1) Select the object whose visibility you want to animate.
      2) Open the Dope Sheet (Graph Editors -> Dope Sheet).
      3) Select the object in the Dope Sheet, and from the "Edit" menu, select Visibility Track -> Add. This creates a visibility track.
      4) Add keys to the track. Value 1.0 = visible, 0 = invisible.
      That's it. The exporter picks it right up and creates the necessary controllers for the Nif automatically. One issue, though: once you assign the BS Lighting FX, you can't preview the visibility animation in 3DS Max (it will work after the export, but there's no preview). You need to use 3DS Max's default materials while you create the keys. There's a rough scripted version of the same workflow below, if you prefer that. Controlling visibility in a Havok anim can be done via annotations: CullBone.**bonename** (make the bone invisible), UncullBone.**bonename** (make the bone visible)
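      For anyone who prefers scripting this instead of clicking through the Dope Sheet, here is a rough, untested sketch using 3ds Max's Python wrapper (pymxs). The object name and frame numbers are placeholders, and the idea that assigning a bezier_float controller to .visibility creates the track is an assumption that pymxs mirrors the usual MaxScript idiom, so treat it as a starting point rather than a recipe.

      ```python
      # Rough sketch (untested): create a visibility track and key it, the same thing
      # the Dope Sheet steps above do by hand. Object name and frames are placeholders.
      import pymxs
      from pymxs import runtime as rt

      obj = rt.getNodeByName("WeaponMagazine_mesh")   # hypothetical object name

      # Assumption: assigning a bezier_float controller creates the visibility track,
      # mirroring the MaxScript idiom "obj.visibility = bezier_float()".
      obj.visibility = rt.bezier_float()

      with pymxs.animate(True):                       # auto-key on
          with pymxs.attime(0):
              obj.visibility = 1.0                    # visible
          with pymxs.attime(10):
              obj.visibility = 0.0                    # invisible
      ```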
  14. Think of Black Planes as objects that draw a black void. That's the reason why you place them as the topmost objects. It's a very primitive system. Additionally, the "Hidden from Local Map" check box in the object reference window is for preventing objects from being included in the Local Map you see in the Pip-Boy.
  15. Ronin1971, you can still get older Blender versions. The main ones are of course the LTS versions (they get patches for 2 years): https://www.blender.org/download/lts/ I'm still on 3.3.x myself. Here's the link for all the past release versions: https://download.blender.org/release/
  16. The fact that the Elric app has a "BC7 Threshold" setting tells me that Bethesda was planning to use BC7 on some assets, and opted not to in the released game for whatever reason. What that means is up to our interpretation, I guess. My guess is that it didn't quite work out on the console versions and on older GPUs in PCs. Personally, I have been using the Bethesda formats for environment assets and less important props, and BC7 for things that need to stand out. Sometimes you want uniformity with the vanilla assets, and other times you want your assets to stand out. Testing both ways and deciding what's best for your project is the best approach, I think. If you are starting TODAY, though, you need to remember that the next-gen update is coming soon. Considering that Starfield has much shorter loading times and better performance than Fallout 4 does on my machine, I suspect all the BC7 performance considerations may no longer be necessary after the coming update.
  17. What Ronin said above is correct; in this particular case, however, you need to be extra careful because an existing model is involved. The model needs to have the exact same splits (you can merge vertices + mark those edges sharp while working, to make it easier to edit), and the way the triangulation is done also needs to be exactly the same. You need the normal map used by the original model to work the same way on your modified model. Believe it or not, if the triangulation is done differently from how the normal map was originally baked on a curved surface, you will get weird looking faces in game. One way to keep the triangulation predictable is sketched below. Here's a very good video made by the former lead character artist at Eidos Montreal: This is just about the best video on YouTube on the subject. All her videos are pretty much in the league of "I can't believe this stuff is free", and well worth watching if you want to learn from a professional.
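      On the triangulation point: one way to make the triangulation deterministic (my suggestion here, not something from the video) is to triangulate explicitly before baking and exporting, so the bake and the in-game mesh share the same triangles. A minimal Blender Python sketch, assuming the mesh you want to fix is the active object:

      ```python
      # Minimal Blender sketch: add an explicit Triangulate modifier so the triangulation
      # used for baking matches the triangulation of the exported mesh.
      import bpy

      obj = bpy.context.active_object                           # assumes the mesh is active
      mod = obj.modifiers.new(name="Triangulate", type='TRIANGULATE')
      mod.quad_method = 'FIXED'    # deterministic quad split instead of "Beauty"
      mod.ngon_method = 'BEAUTY'
      ```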
  18. I'd recommend using Steinberg's Wavelab Elements. Wavelab has been around since the days when we still used hardware samplers (musical instruments), and it's great for this kind of editing. The full version of Wavelab is pricey, as that one has full coding capability for professional mastering plants, but the Elements version is much cheaper, and you can pick it up for like 50 bucks during sales. Here is what a Bethesda auto-firing SFX file looks like in Wavelab: The green markers are the loop start/end, and for each individual shot there are unnamed markers (the yellow ones) which I think are used to sync audio with the WeaponFire annotations in the hkx animation files. The last shot marker and the loop end marker are placed for the same shot, but 4 ms apart (as you can see in the top left section). 48 kHz 16 bit mono WAV, root = C3. The metadata says it was originally created using Sony Sound Forge 9.0, by the way (it's now owned by Magix, not Sony). I think Sound Forge shows up in Humble Bundle sales from time to time, although I don't remember which version. If you'd rather inspect those markers with a script, there's a small sketch below. I hope this helps.
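      Here is a minimal Python sketch that walks the RIFF chunks of a WAV file and prints the cue-point positions (those markers are stored in the standard "cue " chunk). The file name is a placeholder, and marker label text (stored in the LIST/adtl chunk) is ignored for brevity.

      ```python
      # Minimal sketch: list cue-point positions stored in a WAV file's "cue " chunk.
      # File name is a placeholder; marker labels (LIST/adtl) are not parsed here.
      import struct

      def wav_cue_offsets(path):
          with open(path, "rb") as f:
              riff, _, wave = struct.unpack("<4sI4s", f.read(12))
              if riff != b"RIFF" or wave != b"WAVE":
                  raise ValueError("not a RIFF/WAVE file")
              offsets = []
              while True:
                  header = f.read(8)
                  if len(header) < 8:
                      break
                  chunk_id, size = struct.unpack("<4sI", header)
                  data = f.read(size + (size & 1))        # chunks are word-aligned
                  if chunk_id == b"cue ":
                      count = struct.unpack_from("<I", data, 0)[0]
                      for i in range(count):
                          # each cue point is 24 bytes; the last DWORD is the sample offset
                          point = struct.unpack_from("<II4sIII", data, 4 + i * 24)
                          offsets.append(point[5])
              return sorted(offsets)

      for sample in wav_cue_offsets("wpn_auto_fire_loop.wav"):   # hypothetical file name
          print(f"{sample} samples = {sample / 48000 * 1000:.1f} ms at 48 kHz")
      ```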
  19. Hello Issacreaper2, those split edges are the result of sharp/hard edges being exported in a way a game engine can handle. For example: the edges marked as "sharp" (the light blue edges) here will need to be split in the game engine, so that when smooth shading is applied to the whole thing, these edges are not affected by it. Because sharp edges double up the vertices along them in engine, you may need to be careful about using them in older games. I'm using sharp edges everywhere here, though, since my assets are lightweight anyway. Wherever you have sharp/hard edges, you'll need UV seams on them, so that the baker handles those edges correctly when baking normal maps (there's a small Blender sketch for this below). I hope this helps.
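      As a small illustration of that last point, here is a minimal Blender Python sketch (run in Object Mode, assuming the active object is the mesh and its sharp edges are already marked) that adds a UV seam to every edge flagged as sharp, so the baker splits the UV islands in the same places the exporter splits the vertices:

      ```python
      # Minimal Blender sketch: put a UV seam on every edge already marked sharp,
      # so normal-map baking splits the UVs where the exporter will split the vertices.
      # Run in Object Mode with the mesh as the active object.
      import bpy

      mesh = bpy.context.active_object.data
      for edge in mesh.edges:
          if edge.use_edge_sharp:
              edge.use_seam = True
      mesh.update()
      ```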
  20. Texel density is measured by how many pixels of texture you have per unit (a rough worked example is sketched below). I was just trying to say "try maximizing the UV island sizes", so nothing fancy really. The concept of texel density, however, is very important when creating 3D assets. For example, the UV islands for a receiver and a buttstock should ideally have the same texel density even if they are on separate materials, so that both parts have the same texture resolution on screen and one isn't blurrier than the other. Here's an example: here I have models with separate materials - receiver (4k texture), muzzle devices (2k), buttstock (2k), etc. - at more or less the same density (not perfect, but close enough, lol). The "color grid" image included in Blender is very handy for keeping an eye on texel density.

      There's a free Blender addon for matching texel densities across multiple parts, so you might want to check that out: https://mrven.gumroad.com/l/CEIOR Personally, I use the ZenUV addon, but it's a paid product: https://sergeytyapkin.gumroad.com/l/zenuv4?layout=profile ZenUV is amazing, though, and it's totally worth your money. Take a look at their demo videos.

      In first person shooters, you may want to consider how much screen real estate an asset will occupy. For example, a rear iron sight may need more polygons and higher texel density, since it may occupy a large part of your screen while aiming. The example above doesn't have such treatment because I was already packing a lot into one map, but it's something to consider, especially if you have a zoomed-in aim-down sight. Speaking of UV packing, this addon for Blender is really great to have: https://glukoz.gumroad.com/l/uvpackmaster3 I have the older version, which was cheaper, though. Looks like he raised the price.

      Anyway, I think Bethesda's 10 mm pistol heavy barrel could serve as a nice example of how to do UV mirroring and use out-of-range UV coordinates for the mirrored islands. By "mirroring" I don't mean splitting the mesh at the center in this case, since symmetric top and bottom parts would have a "butterfly" look if done that way. The Bethesda artist mirrored the UVs only on the sides, and offset one side outside the regular UV coordinates (+1 on the X axis), so that when baking the normal map from high to low poly, one side is ignored by the baker. The UV for the Eyebot is a great example as well (by using a lot of UV mirroring + overlapping, the Eyebot's 2k texture pretty much has the density of the 4k maps used on other assets), but on that asset it looks like the artist moved the out-of-bounds UV islands back to regular coordinates after baking.

      About Substance Painter, I have never seen it crash personally. I use the Steam version from 2019. ....It's gotten kinda long, but I hope someone finds it useful.
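      To put a number on "pixels per unit", here is a tiny back-of-the-envelope helper (my own illustration, with made-up example numbers): measure one edge in UV space and the same edge in world units, and the texel density is simply UV length x texture resolution / world length.

      ```python
      # Tiny illustration of the texel-density idea: pixels of texture per world unit
      # along one edge. The numbers below are made-up examples, not from any real asset.
      def texel_density(uv_length, world_length_m, texture_px):
          return (uv_length * texture_px) / world_length_m

      # e.g. a receiver edge: 10 cm long, spanning 0.25 of a 4k map
      receiver = texel_density(0.25, 0.10, 4096)
      # e.g. a buttstock edge: 20 cm long, spanning 0.25 of a 2k map
      buttstock = texel_density(0.25, 0.20, 2048)

      print(f"receiver:  {receiver:.0f} px/m")    # 10240 px/m
      print(f"buttstock: {buttstock:.0f} px/m")   # 2560 px/m -> will look blurrier next to the receiver
      ```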
  21. DDS = DirectDraw Surface. DDS is part of DirectX and a Microsoft standard. For conversion, Bethesda used Microsoft's texconv.exe, with Bethesda's Elric app acting as the front end. You can find it in the Tools folder (the same folder where Elric is located). Elric takes the .TGA format and converts it to the DDS format of your choice. The TGA image format is often used in the game industry because of its uncompressed nature, and you can see Bethesda using it as the export format from the CK, for example.

      The vanilla game textures are:
      _d = Diffuse (+ Alpha): BC1 (no alpha), BC3 (with alpha)
      _n = Normal: BC5 (blue channel removed)
      _s = Specular (red channel) + Gloss (green channel): BC5

      Note that Bethesda is using the DirectX format for normal maps, not OpenGL (which is used by Blender, for example). To convert the OpenGL format to DirectX, you need to flip the green channel (a small script sketch for this is below). Bethesda's Fallout 4 normal maps do not contain the blue channel (I assume this was done for better compression when creating the .ba2). The _s texture files actually contain 2 separate maps, Specular (metallic) and Gloss (smooth). An RGBA image file is better seen here as a container for 4 separate 8-bit numeric values. Game devs often pack multiple texture maps into 1 file to reduce the number of files being loaded in game. The DXT designation was deprecated in DX11, and Microsoft started using BC# then. The old NVIDIA plugin for the 32-bit Photoshop doesn't even import the Bethesda normal maps correctly (it adds the blue channel on import, and it appears the wrong color profile gets applied). That plugin predates the current BC formats, and is better avoided for Fallout 4.

      Anyway, in Elric, you'll find the Texture Conversion section in the middle part of the left column. At the top, you'll see "Force Format", and here you can set the DDS format of your choice. Check "Generate MipMaps", and keep the "Quartering Threshold" and "BC7 Threshold" at "NONE". "Quartering" here means it creates a smaller resolution out of a larger one (input 4k -> output 2k, for example). Elric can batch convert TGA files to DDS.

      As for the texture compression format, the safest bet is what Bethesda used. For my own project, though, I've been using BC7 (d/n/s) for weapons/clothing, and the vanilla formats for environment/props, as I feel that BC7 pops out more against the vanilla formats, which are noisier to my eyes. In any case, you may want to maximize the texel density of your textures, so that if you have to lower the resolutions for consoles in the end, they will survive the conversion. Bethesda used a lot of UV mirroring/overlapping to maximize texel density, and their assets are a great source of learning in this regard. Reducing the number of material files loaded by combining multiple elements can be a good idea if possible (each material file adds draw calls in game). I hope this helps. Good luck.
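      If you prepare your textures outside of Photoshop, the green-channel flip (OpenGL -> DirectX) and the specular/gloss packing are both trivial to script. A minimal sketch using Pillow, assuming simple 8-bit-per-channel TGAs; the file names are just placeholders:

      ```python
      # Minimal sketch (Pillow): flip the green channel of an OpenGL-style normal map
      # to the DirectX convention, and pack specular (R) + gloss (G) into one _s texture.
      # File names are placeholders; run this on the TGAs before feeding them to Elric.
      from PIL import Image, ImageOps

      # OpenGL -> DirectX normal map: invert the green channel
      normal = Image.open("gun_normal_opengl.tga").convert("RGB")
      r, g, b = normal.split()
      Image.merge("RGB", (r, ImageOps.invert(g), b)).save("gun_n.tga")

      # Pack specular into the red channel and gloss into the green channel
      spec  = Image.open("gun_specular.tga").convert("L")
      gloss = Image.open("gun_gloss.tga").convert("L")
      black = Image.new("L", spec.size, 0)
      Image.merge("RGB", (spec, gloss, black)).save("gun_s.tga")
      ```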
  22. Hello Andyno, Yeah, it was your project I was thinking of, the Zone. I'm glad you got it done!
  23. I think I should have worded the part about AMD better. I am actually running a 3 monitor setup with a 6700 XT, and it's rock solid NOW, but the drivers before late July this year had problems with 2+ monitor setups. I had the same issue with the RX 580 - for the first 2 years I owned it, their drivers had issues with multi-monitor setups, but those got resolved. AMD really sucks at writing good drivers at card launch, but once the drivers are solid, their cards are good. I'm grudgingly happy with my AMD card for now, lol. The DPC latency issue in NVIDIA cards may have been resolved in the very recent drivers as well, so you may want to check that. DPC latency only matters in a DAW if you need very low latency monitoring while recording under high CPU load - which can be necessary if you are using software instruments running in real time, but not if you are just editing sound effects. How important this is really depends on how much you are willing to push your system. With current CPUs, though, it's highly unlikely that you'll get anywhere near maxing out CPU usage in a DAW. Going NVIDIA probably won't cause problems today. 64 GB RAM - in the context of music production, this much RAM is necessary only if you are using a full orchestral sample library template with multiple microphone positions turned on. If you are doing regular pop/rock/EDM type stuff, you only really need 16 GB.
  24. Hello jcdenton2012, I don't think you can fix this. I remember another member who made the same mistake a few years ago (his map was very large, too), and the answer he got back then was also "no". If you don't mind redoing the whole thing, perhaps you could group buildings and such into static collections/pack-in objects, and organize them with an area name prefix so that you can work fast when recreating locations. Or.... maybe you could just change your project location to Seattle/Vancouver/Vancouver Island, and keep the map.
  25. Hello SonAnima, my machine is far from high end by 2023 standards, but here's what I'm running: i7 2600K (OC'd to 4.2 GHz), 32 GB RAM, Radeon RX 6700 XT (12 GB VRAM). About Substance Painter, I've seen it take up a lot of RAM, but I don't think I've ever seen it come anywhere close to 32 GB in my experience. I do use lots of layers, and perhaps I'm not the most efficient user, but I think 32 GB is okay for the stuff I do. I use an AMD GPU mostly because NVIDIA cards tend to have higher DPC latency. My PC is primarily used for music/audio work, and high DPC latency would be problematic in that context. Also, NVIDIA drivers tend to be not so great at supporting OpenGL, and I've seen some plugins having issues in the past. AMD is, however, terrible at supporting multiple monitors, and their GPUs tend to have weird issues like the 2nd monitor flickering or system instability. AMD drivers have been very good since this summer, but that hasn't always been the case. If you are primarily a graphics artist, I'd recommend going NVIDIA. Blender, Substance, Autodesk stuff - they all have full NVIDIA GPU support. AMD pretty much isn't competing in the graphics artists' space.