
receptor

Members
  • Posts

    9
  • Joined

  • Last visited

Nexus Mods Profile

About receptor

Profile Fields

  • Country
    Germany

receptor's Achievements

Rookie (2/14)

  • First Post
  • Week One Done
  • One Month Later
  • One Year In

Recent Badges

0

Reputation

  1. The threads about this on Steam and Reddit are really escalating; a lot of people seem to have high hopes for this patch. The heat in the discussions comes from an unexpected number of people who genuinely can't afford a new rig. The debate kind of exposed the worldwide divide in income and wealth, which is sad. One side is angry about people holding back progress, the other side simply can't afford the upgrade. Anyway, I think the state of things right now is:
     - Starfield uses a lot of AVX, not comparable to Cyberpunk, and a mod will most likely only run very slowly, if at all.
     - A patch by Beth is highly unlikely; they have other problems to solve. But I think it would be possible in theory.
     - However, there is one guy trying to do it via modding, maybe using just SDE, and it's WIP at the moment.
  2. Thanks for that interesting piece of info, where is this from? (Edit: from Reddit, I found it...) I knew that Cyberpunk didn't use many AVX instructions, but the extent of Starfield's use was unknown to me. If they really use it in 50% of their code, a mod will be complicated to do and quite slow, I agree. Regardless, it would be easy to do when compiling the code, because that is the point where you decide which instruction sets to compile for, and I don't think Beth is coding in assembler (a quick sketch of that compile-time choice is below). SSE can be used instead with only a minor performance loss in most cases. But that's not going to happen, I know.
     Makes me wonder why the engine is so badly multi-threaded, as CPU benchmarks suggest. Perhaps by using AVX they can make the most of a few cores to mitigate that. Anyway, that's just speculation, but a lot of devs say so:
     "... Unless your project is heavily dependent on 2 cores the change (in performance) should be minor. ..."
     "... If you're really squeezing your performance hard enough that you need AVX (i.e. you're in the late-game optimization stage and there's nothing else you can do to improve your throughput), you may as well write the "hot" loop in your code in x64 and then translate it to AVX intrinsics, which is the only way I've ever achieved significant performance improvements with it. ..."
     Using AVX seems more like a measure to squeeze out some performance when nothing else helps.
     Edit: Besides Cyberpunk there is also Horizon Zero Dawn, which had AVX removed without losing performance. Just to give another example I didn't know of before; I played through that game without noticing, because it was after they patched AVX out. Additionally, I checked which modern engines, like UE5, require AVX, and none of them do, mirroring the fact that zero games besides SF need it. In a multi-core environment, and with the FP32 horsepower of GPUs, AVX, which is a decade old in itself, is really not that beneficial, especially when you trade clock frequency for using it because of the heat.
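     To illustrate what I mean by deciding this at compile time, a minimal sketch (my own toy example with a made-up file name, not Bethesda's code): the same source compiles to AVX or SSE depending purely on a compiler switch.

       // sketch.cpp - hypothetical example, not from the game.
       // The compiler auto-vectorizes this loop with whatever ISA it is allowed to use.
       void scale(float* data, float factor, int n) {
           for (int i = 0; i < n; ++i)
               data[i] *= factor;
       }
       // MSVC:  cl /O2 /arch:AVX sketch.cpp   -> may emit AVX instructions
       //        cl /O2 sketch.cpp             -> x64 default is /arch:SSE2, no AVX
       // GCC:   g++ -O2 -mavx -c sketch.cpp   vs.   g++ -O2 -c sketch.cpp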
  3. Again, removing AVX will do nothing to performance, just as it didn't with Cyberpunk. Removing it, or adding a check for whether it's available, is as trivial as ticking a box in the compiler; there is little to no coding involved in this (a tiny sketch of such a check is below). Modding it afterwards is another thing, however.
     Your arguments are either from the 90s and technically don't apply here, or are already contradicted in reality by the fact that it was already done by modders for another, far more modern and taxing game (CP2077) without causing any trouble or performance loss. So you won't lose anything; why are you opposing this? Mind you, I'm not asking for Bethesda's precious time to make a patch, but for a mod. You don't even need to use it and you're fine. This game really has far more concerning issues than AVX at the moment, I agree.
     BTW: the discussion was not about being an AAA gamer, whatever that is supposed to mean, but about an AVX mod. However, I'm playing BG3, CP2077, Hogwarts, AC6 and so on just fine on my rig, at High/Ultra in 4K, some with FSR Quality and obviously without RT. Gaming is still mainly GPU-limited; any 6-core CPU around 4 GHz will do just fine for most GPUs.
     And then, also again, there is the fact that most CPUs downclock when using AVX, as it generates a lot of heat. This degrades performance, and whether the trade-off is beneficial in the end depends. It definitely reduces OC headroom and general stability because of the heat: "Since AVX instructions are wider and generate more heat, some Intel processors have provisions to reduce the Turbo Boost frequency limit when such instructions are being executed." *quote from Wiki. And that's what all modern AVX CPUs do when it's used.
     AVX is mainly for financial and scientific use cases; it's rarely used by applications a home user or gamer encounters and it certainly doesn't benefit games in a big way. As you stated, it's been around for a decade and nearly no game needs it. How come? Because the consoles support it, that can't be the reason...
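     For what adding a check looks like in practice, a tiny sketch using GCC/Clang built-ins (my own illustration, nothing taken from the game):

       #include <cstdio>

       int main() {
           __builtin_cpu_init();                      // initialize CPU feature detection (GCC/Clang)
           if (__builtin_cpu_supports("avx"))
               std::puts("AVX available, take the AVX code path");
           else
               std::puts("no AVX, fall back to the SSE path");
       }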
  4. Oh, believe me, I'm screaming at our beloved OS many times... lol! Really, I do know that software is compiled and that it's relatively easy to not use AVX in the process. No AVX is mentioned in the minimum specs, and my hardware exceeds the minimum specs in performance.
     Cyberpunk's 2.0 release got me back into that game recently and it runs just fine on high/ultra settings; I get 50-60 FPS in 4K with FSR on my Titan Xp. Every other modern game runs just fine, too (BG3, AC6, Hogwarts, etc.). My CPU (X58 Xeon @ 6x4.3 GHz / 24 GB) is perfectly fine for 60 Hz gameplay even today. In fact I could put in up to a 4070 Ti without bottlenecking it, as long as I stay around 60 FPS.
     Do you oppose this request because you think removing AVX will make Starfield worse? That's not the case; it's no problem to use it when available and not use it when it isn't. Additionally, there is little benefit from it in games. I think it's mainly the consoles, and to a lesser extent the fact that so few PC gamers own decent GPUs (mostly ...60 midrange cards), that hold games back, besides the immense demands in manpower and money a modern game has. Cheers ;)
  5. "... so it very well could (and likely does) have avx instructions... " "... Creation Engine (just SF), no other game with it needs AVX..." That´s what I said :wink: I know SFs version of the creation engine has it in it, but the others (Skyrim, etc.) not, which, in conjunction with cyberpunk, leads me to think a removal should be possible without problems. A lot of people ask for a DX11 version BTW, as I figured when you mentioned it. I couldn´t resist searching for it (lolz) and especially Intel GPU owners seem to attribute their bad performance to SFs bad implementation of DX12. Problems that arise when using a decades old engine and just upgrading it. Like the sun not rendering on AMD, even when it´s optimised for AMD. Concerning 32bit just a quote from a german forum: "... 32 Bit Engine auf 64 Bit laufen lassen und nicht mal ca 3,2 GB Ram verwenden was zu ständigen nachladen bei Tür öffnen führt so wie bei ESO führt ist keine neue Engine die 64 Bit verarbeitet und mehr Ram beanspruchen kann.Typisch Bethseda... " In short this translates to: The engine is 32bit in the end and that´s why we have all those loading screens. They just pimped it a bit to only run on 64bit.
  6. ... lol, just a troll, don't feed him. :wink:
     Removing AVX is entirely possible, don't worry. Whether it's doable via a mod is another question, as the .exe needs to be compiled without using AVX instructions, blabla, as stated above. AVX is not hardcoded into the Creation Engine (just SF); no other game using it needs AVX, and the engine is decades old, so it should run on decades-old hardware, which is still comparable to a Ryzen 5 3600X performance-wise :wink:
     However, I don't think Bethesda cares. Just look at the state of that game on PC, and still no patches in sight. Optimised for 30 FPS on Xbox, and nothing else really works correctly (HDR, gamma, sun rendering on AMD, performance, loading-screen hell, etc.). I playtested it in the meantime, and while it's kind of fun, it somehow feels quite old... outdated. The story/background didn't really pull me in either; everything is so copy/pasted from Firefly, Cyberpunk, The Expanse and Star Wars/Trek, which all do it better. And I really struggled to like even one of the companions.
  7. In case someone wants to try the SDE route: grab sde-external-9.24.0-2023-07-13-win and launch the game through it with a Starfield.cmd containing:
     "C:\Starfield\sde-external-9.24.0-2023-07-13-win\sde.exe" -ivb -- "C:\Starfield\Starfield.exe"
     This is damn slow, but you can try it. It was posted in the SF AVX thread on Reddit. I think the SDE tool can also be used to find the AVX calls in the code, but I have no idea how to proceed... Perhaps a solution could be to tell SDE to emulate just the AVX instructions and not the whole CPU, as that's what makes it so slow. But that seems to be impossible with this tool.
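     (If I read the Intel SDE documentation right, its -mix option, roughly sde.exe -mix -- Starfield.exe, writes an instruction-mix report broken down by ISA extension, which should show how much AVX the game actually executes. I haven't tried this myself, so take it as a pointer, not a recipe.)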
  8. I sent the guy, jensandree, a message... Let's hope he replies and has an idea. He is one of the guys who modded AVX out of Cyberpunk before the official patch. I don't know the other ones.
  9. It's really just about ticking the AVX box in the compiler, or rather not ticking it in this case. The code can use SSE or a software path instead. It will be a bit slower, but not by much, and for AVX-capable CPUs you can easily add a check, as mentioned before (a rough sketch of how such a fallback could be wired up is below). CP2077 did it without issues, so it would simply benefit everyone.
     This leads to another point concerning AVX: stability! If you have OCed your rig, AVX tends to crash an otherwise stable system. CPUs produce a lot more heat using AVX than without it, which leads to decreased stability for OCed CPUs.
     Removing AVX, or making it optional, would open the game up to a bunch of new players and give existing ones a more stable experience, all while taking nothing away in terms of performance; the game has other problems in that regard, meaning it's quite badly optimized anyway. Please, someone with the skills, mod AVX out of Starfield! ;)
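     A minimal sketch of what "use AVX when available, otherwise SSE" could look like in C++ (GCC/Clang specifics, made-up function names, nothing from the actual game):

       #include <immintrin.h>

       // Baseline path: plain loop, compiles for any x64 CPU (SSE2 baseline).
       static void add_sse(float* dst, const float* src, int n) {
           for (int i = 0; i < n; ++i) dst[i] += src[i];
       }

       // AVX path: only this function is compiled with AVX enabled (GCC/Clang target attribute).
       __attribute__((target("avx")))
       static void add_avx(float* dst, const float* src, int n) {
           int i = 0;
           for (; i + 8 <= n; i += 8) {                         // 8 floats per 256-bit register
               __m256 a = _mm256_loadu_ps(dst + i);
               __m256 b = _mm256_loadu_ps(src + i);
               _mm256_storeu_ps(dst + i, _mm256_add_ps(a, b));
           }
           for (; i < n; ++i) dst[i] += src[i];                 // scalar tail
       }

       // Pick the implementation once, based on what the CPU actually supports.
       using add_fn = void (*)(float*, const float*, int);
       add_fn pick_add() {
           __builtin_cpu_init();
           return __builtin_cpu_supports("avx") ? add_avx : add_sse;
       }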