
Creation Kit: Optimum Hardware recommendation


Qrsr


Hello,

 

I'm wondering which currently available hardware gives the fastest rendering and processing results in the CK (world creation, world zooming/moving, etc.), so I can reach very high FPS while working in it. Which CPU/GPU/RAM would be advisable here?

 

Thank you in advance.

 

Edited post:

 

It turned out that FPS is not a very trustworthy metric; actual rendering (real-)time is far more interesting. I found RAM to be the main factor, and the GPU is not so important, with the following applied:

- DDR4/DDR5, 32 GB+

- L3 cache, 20 MB+

- NVMe SSD

 

File > Preferences...:

- Render Window: set "Grids to load" to 1. This must be done after each startup, since it apparently cannot be configured via the INI; at least modifying the INI does not work on my end.

 

Top Bar:
- disable "Toggle Light (A)"
- disable "Toggle Lights (6)"

 

Show/Hide:

- disable "Markers" / "Misc Markers"

- disable "Decals"

- disable "Object Hidden From Local Map"

 

Render Window:

- push "T" to set default angle on top of object

Edited by Qrsr

  • 4 months later...

Alright, I'm answering it myself, FYI...

- the best thing is to set "Grids to load" (uGrids) to 1, which must be done after each startup, since it cannot be configured via the INI

- press "T" to snap the camera to a top-down view of the selected object

- disable "Toogle Light (A)"
- disable "Toogle Lights (6)"

- disable "Markers"

- disable "Decals"

- disable "Object Hidden From Local Map"

 

With these settings the CK works very well even on a system with only 8 GB RAM, no dedicated GPU (iGPU only), and less than 10 MB of L3 cache.

 

The best you can do is upgrade to a system with as much RAM as possible, preferably 32 or 64 GB of DDR5, and as much L3 cache as you can afford. Either Intel or AMD, it doesn't matter; both offer great performance. More L3 cache is the best upgrade for your system, in and out of the game and in and out of the Creation Kit. A GPU is nice to have in game, but outside the game an iGPU is already more than good enough, even on 10-year-old PC setups. Current iGPUs can handle the CK perfectly fine as far as FO4 is concerned.

 

In game, an iGPU can even handle FO4 at UHD with some shadow tweaks and overclocking.

 

UHD Graphics 770 (13th Gen) Overclocked to 2500 MHz

https://www.youtube.com/watch?v=gN73ri7cSBQ

Radeon Graphics (Ryzen 7000) Overclocked to 3100 MHz

https://www.youtube.com/watch?v=aohuBCWwQUA

It doesn't matter what you run the CK on. Even on a NASA supercomputer you will get the same crap rendered. Perhaps raising the frequency of one single core changes something, but I did not notice any difference. I have an EPYC with 64 threads and 128 GB of RAM, and I didn't notice any difference compared to the 2nd-generation Intel with 8 GB of RAM on which I first launched the CK. Just like in the game itself: I gained all of ~1-2 FPS by switching from a Xeon 2680v3 with 32 GB RAM to the EPYC. Only the video card gives an FPS increase in the game, and the rest of the system is needed only to keep the video card fed. And we're also talking about a game from 8 years ago, for which even a weak modern build is a supercomputer by default.

Edited by South8028


I have 20+ MB of L3 cache and can run uGrids 7 without any interference in the CK, no lag, nothing, so yes, it does make a difference. In game, L3 cache is worth gold since it allows much higher uGrids values than the default uGrids 5. So yes, it makes a very HUGE difference, at least on my end.
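For reference, the in-game uGrids tweak is usually applied roughly like this, in Fallout4.ini or Fallout4Custom.ini under Documents\My Games\Fallout4 (treat it as a rough sketch: uGridsToLoad under [General] is the well-known setting, while keeping uExterior Cell Buffer at (uGridsToLoad + 1)^2 is the old rule of thumb from earlier Bethesda titles, so double-check it for FO4 and your setup):

[General]
; 7x7 grid of exterior cells around the player instead of the default 5x5
uGridsToLoad=7
; rule-of-thumb buffer size: (uGridsToLoad + 1)^2 = 64
uExterior Cell Buffer=64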

 

The most important factor is L3 cache though. Anything past 20 MB runs like butter.

 

See here:

 

That was back in the DDR3 era. With DDR4 it stopped scaling around DDR4-3000 on my 8700K: between the DDR4-3000 I run now and 3666 MHz there is zero FPS difference, it still dips to the same lows. The game did scale quite well with memory back in the day, but memory bandwidth on modern systems is high enough that it's back to being thread-limited. Going from 5 GHz on my 8700K down to 4.6 had a bigger impact than the RAM. So yes, it scales with memory up to a certain point, but not so much now. Latency can have an impact, as can L3 cache; in fact L3 cache is likely the biggest factor, since anything that doesn't fit into cache gets pushed out to system memory, so Ryzen X3D chips will likely see bigger gains. But the biggest benefit is if you use a core-affinity app and set the game to run on a fixed set of cores with no HT; it tends to smooth things out and stops the game from swapping between so many cores when it is single-thread bound anyway. So while the game will distribute load, the one important thread is the main culprit. If you can't scale clock speed, then scaling bandwidth helps, but suffice to say the game is still a buggy mess on ancient code.

 

As it is, with the 7950X3D at the same settings versus a 7950X with higher clocks but the same memory, the extra cache shows a 20-30% improvement in performance. The game wants to nom nom on more L3 cache to keep the CPU thread properly fed with data. Faster memory helps but will never compensate for that.

#65 https://www.techpowerup.com/forums/threads/fallout-4-9700k-5-1-ghz-32-gib-ddr4000-only-a-slight-fps-diff-between-1080ti-and-4090.307375/page-3#post-4998924
#72 https://www.techpowerup.com/forums/threads/fallout-4-9700k-5-1-ghz-32-gib-ddr4000-only-a-slight-fps-diff-between-1080ti-and-4090.307375/page-3#post-4999524
#74 https://www.techpowerup.com/forums/threads/fallout-4-9700k-5-1-ghz-32-gib-ddr4000-only-a-slight-fps-diff-between-1080ti-and-4090.307375/page-3#post-4999860
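As an aside on the core-affinity trick from the quote above, it can also be done on Windows without an extra app; this is only a sketch, and the 0x55 mask is an example for a 4-core/8-thread CPU where logical cores 0, 2, 4 and 6 sit on different physical cores, so adjust the mask (and the exe name, if you launch through F4SE) for your own setup:

:: launch the game pinned to logical cores 0, 2, 4 and 6 (hex affinity mask 55)
start "" /affinity 55 Fallout4.exe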

 

I don't know anyone who uses uGridsToLoad=5 today; just about everyone uses 7 regardless of their configuration, and many people use 9 if they are not afraid of the inconvenience. I have 64 MB of L3 cache (EPYC 7551P) and I don't see any improvement in the CK or in the game itself. In the CK render window the map still loads unbearably slowly; in the game I had 60 FPS everywhere (with ReShade/ENB) on my old Xeon with the same 1080 Ti, and at lower frame rates the CBP bones drop. Freezes on maps with a large number of objects still exist. I noticed that the only things that affect performance are the video card and the frequency of one core. Roughly speaking, a 6-core Ryzen at ~4+ GHz is preferable to a 32-core EPYC at 3 GHz. It is also obvious that the better your video card, the faster the rendering. I don't know exactly how many cores FO4 is able to use, but in the CK nothing depends on this. Subjectively, both 5 years ago and now (after ~7 PC builds), everything is bad in the CK.


 

The trick is to use uGrids 1 for most exterior cells; for interiors it doesn't matter. Still, I have no problems with the default setting (uGrids 5) in the CK when modding inside Boston city. The CK sometimes uses 16 GB of RAM on my end. I assume it wouldn't hurt to use as much RAM as possible, with good latency.

 

I don't know why you have problems with your setup in the CK. What you say would mean that with a better GPU you would get lower performance than with an iGPU.

 

The links I shared highlight which GPU is required, and at a certain point CPU, RAM and L3 cache will do the rest. As for FO4, the only things that matter, AFAIK, are single-core performance, RAM and GPU. I can get 60 FPS at 1080p with an Intel or AMD iGPU with some tweaking in the BIOS. So again, I have a lot of question marks around my head as to why your setup is not working. We are speaking about vanilla FO4, right? No magical mod packs with super-big texture packs?

 

To come back to the CK: since there is no prerendering, AFAIK, what matters is having plenty of fast RAM to fall back on. The current CPU generation, both Intel and AMD, will laugh at it; in fact, when monitoring load in the CK, all that adds up is RAM usage, the core itself is almost idle, and temps are below 40 °C with an air cooler and iGPU, no external GPU.


 


I have no problems with FPS in the game itself. There are freezes, short FPS drops when the Boston arrays are loading, but as far as I've read, everyone has this. I have problems in the CK, but these are problems with the CK code (as I wrote to you about), and they don't depend on the hardware. The CK by itself, regardless of the configuration of your PC, works poorly because it cannot fully use your processor. Accordingly, the video card is not able to use its potential in CK rendering. Roughly speaking, I don't see a significant difference in CK rendering between a GTX 1060 and a GTX 1080 Ti. Absolutely nothing changed in how the CK works, no matter what PC upgrades I did. CK rendering has always seemed to depend only on the frequency of one CPU thread and nothing else at all.

I don't know the peak FPS values in FO4, because my game (like probably everyone else's) is locked at 60 FPS. Naturally, I don't get values lower than 60 FPS, even with a bunch of mods, a bunch of my own high-poly meshes, ReShade and Fraps running. Less than 60 FPS is unacceptable, because at lower values the CBP bones of my custom body will drop (tits and ass end up lying on the ground). The freezes have this nature: frames with a large number of meshes eventually collapse to 30 FPS; I turn off Fraps and the frame rate is restored to 60 FPS. Without Fraps, the freezes are a momentary drop to 30 FPS when a new Boston array is loading.

Edited by South8028

Now I understand.

 

I think you can't trust all the numbers you see. I don't have a constant 60 FPS either, and that's perfectly fine. The short freezes always occur because the CK loads the next cell in the background, precisely when you navigate from one grid to another. That's why you should always set uGrids to 1: you then see the cell itself perfectly, and it becomes much clearer where the transition from one cell to another happens. It makes modding a hell of a lot easier. If you do it, the CK becomes "more stable" in terms of loading/rendering.

 

You can tweak some of your GPU settings via the CK INI and custom INI, but again, you don't need a dedicated GPU since the CK doesn't depend on it that much. RAM is the most important factor, since you don't do any prerendering, AFAIK. But you want a stable user experience, and if you run out of RAM the CK becomes very laggy. Enough RAM, stable CK.

 

One important thing, though: you need to change uGrids to 1 on each startup. I haven't found a way to force it via INI or anything else to this day. It's either hardcoded somehow, or the CK doesn't just use its own INIs; I assume the CK also uses the default FO4 prefs INI, where uGrids 5 is set.

 

Also worth noting: set the INI to read-only, since the CK, when it gets stuck or crashes, will mangle the data in that INI from time to time.
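A quick way to do that from the command line (just a sketch; which INI actually gets rewritten is an assumption on my part, so adjust the file name and path to whichever INI the CK mangles on your setup):

:: in the game folder: mark the CK prefs INI read-only so a crash cannot rewrite it
attrib +R CreationKitPrefs.ini
:: clear the flag again when you deliberately want to save new settings
attrib -R CreationKitPrefs.ini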

Edited by Qrsr


Maybe somewhere in the INI I need to specify the amount of RAM the CK can reserve? Because I did not notice any dependence of the CK on the amount of RAM.


You can set "Grids to load" in the Preferences window under the "Render Window" tab. You get to that window either by navigating to File > Preferences or by right-clicking in the render window and choosing "Render window properties". This way the grids to load should be saved even after a restart. Secondly, this also allows you to disable the background cell loader; if I remember correctly, it's the "Allow Render Window Cell Loads" option. That means you can navigate freely within your loaded cells without the CK loading additional cells. Since working with "Grids to load" set to just 1 prevents you from performing certain operations like landscaping, it's mostly better to load more cells, usually ~3-5.

 

Unfortunately, the CK itself has some problems when it comes to memory management, which means that certain things will take quite a while. Additionally, Bethesda relies heavily on the use of layers (which you can see in Boston downtown) to hide objects that are irrelevant to your current work. I would say FPS is usually not a problem in the CK if you keep the grids at a lower number depending on the cells you're working on, on whether you're rendering shadows or displaying markers (Bethesda used fairly high-poly models for markers, which slows everything down), and on if and how you're using layers.
