
AVX FIX, similar to Cyberpunk 2077


LeauDesPates72400

Recommended Posts

Oh, believe me, I'm screaming about our beloved OS all the time... lol!

 

Really, I do know that software is compiled and that it's relatively easy to not use AVX in the process.

 

The minimum specs don't mention AVX, and my hardware exceeds the minimum specs in performance.

 

 

Cyberpunk's 2.0 release got me back into that game recently and it runs just fine on high/ultra settings. I get 50-60 FPS in 4K with FSR on my Titan Xp. Every other modern game runs just fine, too (BG3, AC6, Hogwarts Legacy, etc.). My CPU (X58 Xeon @ 6x4.3 GHz / 24 GB) is perfectly fine for 60 Hz gameplay even today. In fact I could put in up to a 4070 Ti without bottlenecking it, as long as I stay around 60 FPS.

 

Do you oppose this request because you think removing AVX will make Starfield worse? That's not the case; it's no problem to use it when available and not use it when it isn't. Additionally, there is little benefit from it in games.

I think it's mainly the consoles, and to a lesser extent the fact that so few PC gamers own decent GPUs (mostly ...60 midrange cards), that hold games back, besides the immense demands in manpower and money a modern game makes.

 

Cheers ;)


You're right that the game requirements don't explicitly call it out; however, they call out two minimum processors, so the overlapping instruction sets and extensions supported by both can reasonably be inferred to be fair game. Both processors (and all built in the last decade) support AVX, so it is reasonable for the game to make use of that hardware feature.

 

Do you even know what AVX does? You sit there saying it does nothing, but AVX is used fairly extensively in plenty of other applications that are math-heavy, like physics or rendering (you know, two areas of the game that are known to be performance problems already). To say it does nothing is a load of crap. Yes, they could remove AVX, but that would make the experience objectively worse for everyone else who has supported hardware.

 

Could they add an AVX fallback path for non-AVX hardware? Sure, but that's non-trivial work, and why do it when literally the only use case is hardware half a decade older than the minimum processors you support?

 

I get that a few people are upset that their systems, which otherwise still work for most cases, can't play the game, but that's just how tech advances: eventually you need to upgrade. This is hardly the first game to require a newer instruction set that older hardware doesn't support.


Here's a clue for the clueless. Many, many years ago (relevant since Beth eventually bought the company), a little company called id released a game called Quake. Let me tell you a story.

 

John Carmack had created two important game engines (frameworks that were used for multiple titles). The first was Wolfenstein, although that wasn't the first game to use the title. Wolfenstein had a rendering algorithm for perspective-correct textures on flat floors and flat walls at a 90-degree angle, with the camera fixed in angle relative to the floor. This worked well on Intel 486 processors at the time, and less well on the 386. Doom was a refinement of the rendering algorithms, with more complex geometry but otherwise the same camera limitations.

 

On Quake, Carmack was joined by a MUCH better coder (who left shortly afterwards, down to Carmack refusing to let him become a shareholder in id). Between them, they realised that the upcoming Pentium processor (actually the 586, but misnamed by Intel to sell the brand soon to be properly associated with the TRUE Pentium, namely the Pentium Pro and its desktop variant, the Pentium 2, both new RISC-architecture CPUs) had massively improved floating-point division instructions, because it inherited early the brand-new FPU (maths engine) of the true Pentium chips.

 

Well, according to the fools who bang on about AVX being of no significance, what possible purpose could a strong divide engine bring to gaming... sigh. Here's the thing. They wanted free mouselook, the ability of the mouse to freely move the first-person view in both axes. This requires true unrestricted perspective-correct texture mapping, which technically needs one DIVISION calculation per pixel: a computing nightmare. Quake could not afford this (but the first 'GPU', the 3DFX Voodoo, would soon release with exactly this per-pixel division feature). So what Carmack and Abrash (the true Quake genius) did was use a mathematical approximation function for a line of pixels, still dependent on the massive upgrade in floating-point division instruction speed.

 

The demo releases, and a clueless chump publishes on Usenet (still a popular forum system at that point) a multi-page technical document explaining that all Quake used the new Pentium instructions for was data MOVE operations, meaning Quake could work fine on older processors if the compile flag were changed. Does this ring any bells?

 

Today, not using AVX in your code is like not using the GPU and doing 'software' rendering. Do we want our AAA games to run like junk because fools won't upgrade, and then lie about why AVX instructions exist in the first place? If you do not have AVX in your CPU, then you are not an AAA gamer in the modern age, full stop. And code written to meet your needs is sub-optimal to a quite horrifying degree, and ruins the gaming experience on the gaming PCs owned by the 99.9% majority. Nothing could be more selfish and destructive to the industry.

 

We are not to blame because Intel cynically still sells variants of their CPUs missing essential features. Intel even sells desktop chips for vast prices with e-cores, cut-down cores whose weaker vector hardware led Intel to drop AVX-512 from those chips entirely (such code is supposed to run on the small number of p-cores). AMD pulls none of these tricks with its CPUs or GPUs. AMD has c-cores coming, but unlike Intel's, these are merely more compact cores with FULL and TOTAL hardware support for all modern instructions. People should buy AMD.

 

The PS5 and Xbox have AVX. Code written to use AVX is much faster, and can do many things far better, including in rendering. All coming AAA games will be ports from the current gen of consoles, and NO, the devs won't break their game code to meet the needs of those with outdated PC CPUs.


Again, removing AVX will do nothing to performance, as it didn't with Cyberpunk. Removing it, or adding a check for whether it's available, is as trivial as checking a box in the compiler. There is little to no coding involved in this. However, modding it out is another thing.

 

You guys' arguments are either from the 90s and technically don't apply here, or are already contradicted in reality by the fact that modders already did this to another, far more modern and taxing game (CP2077) without causing any trouble or performance loss.

 

So you won't lose anything; why are you opposing this? Mind you, I'm not asking for Bethesda's precious time to make a patch, but for a mod. You don't even need to use it and you're fine. This game really has far more concerning issues than AVX at the moment, I agree.

BTW: The discussion was not about being an AAA gamer, whatever that is supposed to mean, but about an AVX mod. However, I'm playing BG3, CP2077, Hogwarts Legacy, AC6 and so on on my rig just fine on high/ultra 4K, some with FSR quality and obviously without RT. Gaming is still mainly GPU-limited; any 6-core CPU around 4 GHz will do just fine for most GPUs.

 

 

And then, also again, there is the fact that most CPUs downclock when using AVX, as it generates a lot of heat. This degrades performance, and whether the tradeoff is beneficial in the end depends on the workload. It definitely reduces OC headroom and general stability because of the heat:

"Since AVX instructions are wider and generate more heat, some Intel processors have provisions to reduce the Turbo Boost frequency limit when such instructions are being executed." (quoted from Wikipedia). And that's what all modern AVX CPUs do when it's used.

 

AVX is mainly for financial and scientific use cases; it's rarely used by the applications a home user or gamer encounters, and it certainly doesn't benefit games in a big way. As you stated, it's been around for a decade and nearly no game needs it. How come? Because the consoles support it, that can't be the reason...

Edited by receptor

(Quoting receptor's reply above.)

Cyberpunk is a bad choice for your argument here. Do a slight bit of research and you will find that CP2077 was only flagging and checking for AVX, not really using it, and that's why it was easy to mod out. Starfield, by comparison, is actually using AVX instructions. To quote a modder already looking at a possible Starfield patch:

 

 

The reason why it was so easy for Cyberpunk 2077 to develop an AVX patch is because they only had checks in place for whether AVX was there, and AVX was only used for nonsense and empty calls. Starfield has more than 50% AVX instructions in its code, so running it on anything non-AVX is just not possible / very very very slow.

I am sorry for all the people who are waiting for AVX to get patched out, but that won't happen.

It's time to upgrade if you wanna play Starfield.

I will try to get the patch running at least a little bit stable and then release it, but don't keep your hopes up. The patch will run so slowly that you could use a Steam Deck and it would be 10x faster than the strongest non-AVX CPUs.

 

AVX is useful in cases where heavy use of vectors and floating point is required; some preliminary findings seem to indicate it's being used in Starfield for lighting and possibly physics, two areas that are heavy in FP use. As for why AVX is only just now starting to be used in video games: simple, it takes time to build a user base that can use new technology. 64-bit took years to become a requirement for games, or software in general, despite being generally available since 2005; it took MS until 2020 to stop selling 32-bit copies of Windows. It's the same reason we don't see a lot of games using ray tracing yet: the technology is still pretty new, and most of the games that incorporate it are partnerships with Nvidia to push new GPU sales.


 

(Quoting the reply above.)

 

My dear friend, yes, I do not expect anything more from Starfield. The game is so dull and impersonal that it passes by without regret. Personally, I will still play through the whole of Cyberpunk 2077 with its new update, with excellent optimization and with a new DLC, and, lo and behold, without AVX. And thanks to people like you, the community will keep getting fetid, unoptimized s#*! and being told: yes, look what "optimization" is, because AVX. Give examples of games with AVX that are optimized not for a 4090 but for the community as a whole.

Edited by vik24328

 

(Quoting the modder's statement above.)

 

 

Thanks for that interesting piece of info. Where is this from? (Edit: it's from Reddit, I found it...)

 

I knew that Cyberpunk didn't use many AVX instructions, but the extent of Starfield's use was unknown to me. If they really use it in 50% of their code, a mod will be complicated to do and quite slow, I agree.

Regardless, it would be easy to do when compiling the code, as that is when you decide which instruction sets to compile for; I don't think Beth is coding in assembler. SSE can be used instead with only minor performance loss in most cases. But that's not gonna happen, I know.

 

Makes me wonder why the engine is so badly multi-threaded, as CPU benchmarks suggest. But by using AVX they can make the most of a few cores to mitigate that, perhaps. Anyway, that's just speculation, but a lot of devs say so:

"... Unless your project is heavily dependent on 2 cores the change (in performance) should be minor. ..."

"... If you're really squeezing your performance hard enough that you need AVX (i.e. you're in the late-game optimization stage and there's nothing else you can do to improve your throughput), you may as well write the "hot" loop in your code in x64 and then translate it to AVX intrinsics, which is the only way I've ever achieved significant performance improvements with it. ..."

Using AVX seems more like a measure to squeeze out some performance when nothing else helps.

 

Edit:

Besides Cyberpunk there is also Horizon Zero Dawn, which got AVX removed without losing performance. Just to state another example I didn't know of before; I played through that game without noticing, because it was after they patched AVX out.

Additionally, I checked which modern engines, like UE5, require AVX, and none of them do, mirroring the fact that zero games besides SF need it.

In a multi-core environment, and with the FP32 horsepower of GPUs, AVX, which is a decade old in itself, is really not that beneficial, especially when you trade frequency for using it because of the heat.

Edited by receptor

The threads about this on Steam and Reddit are escalating quite a bit; a lot of people seem to have high hopes for this patch. The heat in the discussions stems from an unexpected number of people who really can't afford a new rig. The discussion kinda brought out the worldwide divide in income and wealth, which is kinda sad. One side is angry because of people holding back progress; the other side simply can't afford it.

 

 

Anyways, I think the state of things right now is:

- Starfield uses a lot of AVX, not comparable to Cyberpunk, and a mod will most certainly only run very slowly, if at all.

- A patch by Beth is highly unlikely; they have other problems to solve. But I think it would be possible in theory.

- However, there is one guy trying to do it via modding (maybe he just uses SDE), and it's WIP ATM.

Edited by receptor
