Kalell Posted January 1, 2018

- i7-4790K
- GTX 1080 Ti
- 16GB RAM @ 2133MHz
- 1080p at ultra
- No overclocks except for RAM

I'm getting strange CPU and GPU usage with Fallout 4. At 60 FPS the GPU usage sits in the mid 30s and the CPU usage sits in the mid 70s. That's pretty much what I would expect. When I get FPS drops in dense areas (usually into the 40s), the GPU usage drops into the mid 20s and the CPU usage drops into the mid 50s. That's the opposite of what I would expect. My VRAM usage tops out around 3GB and system RAM tops out around 8GB. I've looked everywhere for a solution and haven't found one, but a lot of other people have this problem. People here know more about the inner workings of this engine and its quirks, so I thought I'd post to see if anyone has a solution, or at least knows why it happens.
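For anyone trying to reason about readings like the ones above, a rough way to label a monitoring sample is: if FPS is below target while neither the CPU nor the GPU is saturated, something else is the limiter (a pegged main thread, I/O, memory latency, an engine cap). A minimal sketch of that heuristic; the thresholds are arbitrary illustrative assumptions, not measured values, and note that a total CPU figure like 55% can still hide one fully pegged core.

```python
def classify_sample(cpu_pct, gpu_pct, fps, target_fps=60.0,
                    busy_threshold=90.0):
    """Rough heuristic: label one CPU%/GPU%/FPS monitoring sample.

    busy_threshold is an illustrative assumption, not a calibrated value.
    """
    if fps >= target_fps:
        return "at target"
    if gpu_pct >= busy_threshold:
        return "gpu-bound"   # GPU saturated while FPS is below target
    if cpu_pct >= busy_threshold:
        return "cpu-bound"   # total CPU saturated (or close to it)
    # Neither unit is saturated yet FPS is low: the classic sign of a
    # different limiter (single pegged thread, I/O, memory, engine cap).
    return "other-limited"

# The pattern described above: FPS drops while BOTH usages fall.
print(classify_sample(cpu_pct=55, gpu_pct=25, fps=45))  # other-limited
```

The thread's symptom (both usages dropping together with FPS) lands in that last bucket, which is consistent with a single-thread or memory bottleneck rather than raw CPU or GPU load.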
jones177 Posted January 2, 2018

Hi. It's your CPU; it's working too hard. I have Fallout 4 installed on 2 rigs: one with an i7 2600k and a GTX 1080 running at 3440x1440, and one with an i7 6700k and a GTX 1080 Ti running at 4k. With a newish game around level 30 they both average 24% CPU usage, with peaks in Boston at around 42%. Above 42% CPU usage the frame rate drops and the GPU works less because it is waiting for the CPU (a bottleneck). At over 70% usage my CPUs would eventually heat throttle. Modded, this game runs extremely hot; I had to get a better cooler just for it. That may be what's happening to you, so check your temps.

Even at 4k I only get 50% GPU usage with the GTX 1080 Ti. When I had 2x 980 Ti I got 25% usage on each. To get the GPU to work harder you need ENB. You can get strange readings with a GTX 1080 Ti below 4k 60Hz or 1440p 144Hz. Also check for save game bloat; it can affect the CPU and I/O as well.

Later
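The usage and temperature figures traded in this thread can be captured on NVIDIA cards with nvidia-smi's CSV query mode (for example `nvidia-smi --query-gpu=utilization.gpu,temperature.gpu --format=csv -l 1`). A small sketch that parses that CSV output into numbers you can log or graph; the sample string stands in for a live call to the tool, and the values in it are made up.

```python
import csv
import io

# Example output of:
#   nvidia-smi --query-gpu=utilization.gpu,temperature.gpu --format=csv
# (sample values, not a live reading)
SAMPLE = """utilization.gpu [%], temperature.gpu
35 %, 62
27 %, 60
"""

def parse_gpu_log(text):
    """Return a list of (utilization_pct, temperature_c) tuples."""
    rows = list(csv.reader(io.StringIO(text)))
    out = []
    for row in rows[1:]:          # skip the header row
        if not row:
            continue
        util = float(row[0].strip().rstrip(" %"))
        temp = float(row[1].strip())
        out.append((util, temp))
    return out

print(parse_gpu_log(SAMPLE))  # [(35.0, 62.0), (27.0, 60.0)]
```

Logging samples like this alongside FPS makes it much easier to see whether drops line up with GPU load, GPU temperature, or neither.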
Deleted3082751User Posted January 2, 2018 Share Posted January 2, 2018 (edited) @ OP ensure the game is not using your Inbuilt Graphics Accelerator (the GPU Built into the i7 4970k), this would most definitely cause higher then average CPU Usage and could also very likely cause lower then average GPU Usage (and likewise lower then expected performance). also the Overclocked Ram may also play a part in the higher then Average CPU usage. since System Ram is controlled by the Processor (CPU). Edited January 2, 2018 by Guest Link to comment Share on other sites More sharing options...
Kalell Posted January 2, 2018 (edited)

Thing is, the CPU usage drops in demanding areas (from around 75% to 55%) instead of increasing (I probably wasn't clear enough about that in my OP). It's like the CPU just gives up. I imagine the FPS drops do have something to do with the CPU getting bogged down in some way, but it isn't from the usage being too high; if it were, the CPU usage would go up instead of down in demanding areas. My temps are good: playing FO4 my GPU is around 40c and my CPU is around 60c.

I made the RAM overclock specifically for FO4 and it increased performance in the demanding areas by about 10 FPS. It's the only thing I've found that actually helped, and I think it helped because it forced the game to use more of the CPU. I've also tried starting a new game with no mods and going straight to downtown Boston, but there wasn't much of a difference. I've used ENB in the past and performance was a bit worse, but I didn't check the usages. I think I'll try running the game at 4K and see what kind of results I get.

How do I ensure that my 4790K's integrated graphics aren't being used? I didn't even realize it was possible for the integrated graphics to be used at the same time as a graphics card. In games/programs that let you choose your graphics adapter, the integrated graphics aren't even an option on my PC.

Edited January 2, 2018 by Kalell
Deleted3082751User Posted January 2, 2018 Share Posted January 2, 2018 (edited) oh so the CPU usage drops in more demanding areas, right, this could be caused by heat, higher usage = higher temperatures, if the temperatures goes over a threshold the CPU will then throttle itself to prevent damage it may be insufficient power, higher usage = more power consumed, so what could be happening is your processor is throttling itself due to lack of power, in demanding scenarios. (however since you have a gtx 1080ti, i would like to believe that your PSU is at least 600 watts, thus eliminating this potential cause) 60 degrees is pretty hot for a CPU, and will start limiting its lifespan, if this temperature is constantly reached and prolonged. Graphics Cards are much more powerful then CPU and Work much harder in tasks that require them, hence the reason they will typically reach high temperatures, and as such they are built to withstand these high temperatures, especially for prolonged use. a graphics card could reach 70 - 80 degrees constantly, and last 3 - 5 years with this temperature being hit every single day (my old Radeon 7970 was reaching 70 degrees almost every day for 12 hours at a time, and it lasted 5 years, probably even longer but i gave it to a friend after owning it for 5 years), if a processor was to reach these same temperatures every day for long periods of time, they would die within weeks maybe even days. basically all usage and temps should be for the GPU instead of the CPU, it looks like your CPU is being used far more then the GPU, which should not be happening. for example, that 60 degrees should be the graphics card, and the 40 degrees should be for the CPU, and likewise the CPU usage you mention should be for the GPU instead. 
In my own heavily modded Fallout 4, my GPU is used far more than the CPU: I get a very consistent 100% usage on the GPU and 70 degrees, never higher, whereas the CPU usage is usually around 60% and temperatures are 40 degrees (all averages).

As for the integrated graphics: it should not be used by default if you have a dedicated GPU (which you do). However, it's still possible. I only mention it because of the low GPU usage and very high CPU usage you described; I put 2 and 2 together and concluded that the integrated graphics may be being used instead of the dedicated card (the 1080 Ti). The integrated graphics is probably only used for PhysX (NVIDIA debris for Fallout 4), which is why it is not listed as an option, but that is not to say it isn't acting as the GPU. Again, it's actually very unlikely, but from your post it seemed to make the most sense.

30% GPU usage is incredibly low, especially for a game like Fallout 4 which is both CPU and GPU intensive, and incredibly intensive when heavily modded with an ENB. An example: my Borderlands 2 GPU usage is 25%, and Borderlands 2 is a very light game on both the GPU and CPU (it is not demanding at all); it only uses 2 cores. Fallout 4 is considerably more demanding on both than Borderlands 2, so the fact that your GPU usage is very close to my Borderlands 2 usage would indicate a very underused GPU. Again, that is why I concluded that your Fallout 4 may actually be using the GPU built into your CPU and only using your 1080 Ti minimally (if that makes sense); the unusually high CPU usage would indicate that the integrated GPU is being used much more than your 1080 Ti.
To give an even better example: Borderlands 2 = 200 FPS (it would be even higher if it actually used all 4 cores and 8 threads of my CPU and the GPU usage was higher than 25%; that's 200 FPS from only 2 cores and 25% GPU usage). Fallout 4 = 115 FPS (uses all 4 cores and 8 threads, with the GPU at a constant 100% usage; that's 115 FPS on a very heavily modded game, and it would be far higher with fewer mods, but that level of usage is needed to hit it). So you can see the difference between the two games, which highlights the issue here.

What I would suggest: go into your BIOS and disable your onboard graphics, to see if that solves the issue, or at least makes it 1 less possible cause. I can't walk you through it, though, unless I know what motherboard you have.

Edited January 2, 2018 by Guest
Kalell Posted January 2, 2018 (edited)

Unless my sensors are wrong, temp isn't the issue. The 4790K won't start throttling until it hits 90c, though 77c is the highest you would actually want to run it at. It idles around 20c and I've never seen it go above 75c under full load during benchmarks. 60c is safe for a 4790K and won't cause throttling. I've had this CPU for three years, so if the sensors were wrong it would likely have died by now. My GPU never goes above 70c (custom fan curve). The 1000 series does start throttling a small amount when it hits 60c, but I don't think that's the culprit since I don't go above 40c in FO4. I have a 750w 80 Plus Gold PSU.

As for the GPU being utilized more than the CPU, that's true in almost every other game I have (unless it's CPU intensive). It's only Fallout 4 that has this large a gap between the two. I'll check my BIOS and see what's going on with the integrated graphics, but like you I'm pretty sure that isn't the issue.

Oh, and I tried 4K. It did increase the load on my GPU a good bit (to 100% at times), but CPU usage stayed the same. Oddly, the FPS I got was identical.

Edit: Initiate Graphic Adapter is set to PEG, so the integrated graphics are off. IGD Multi-Monitor is also disabled.

Edited January 2, 2018 by Kalell
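Since a few different throttle thresholds get quoted in this thread, a tiny helper that turns a temperature reading into headroom-before-throttle plus a coarse label can make the comparison concrete. The default thresholds below are simply the figures discussed in this thread for a 4790K, not authoritative values; check your own CPU's Tjunction max before relying on them.

```python
def thermal_headroom(current_c, throttle_c=100.0, comfort_c=77.0):
    """Degrees of headroom before throttling, plus a coarse status label.

    Defaults follow the 4790K figures discussed in this thread
    (illustrative assumptions, not vendor-verified limits).
    """
    headroom = throttle_c - current_c
    if current_c >= throttle_c:
        status = "throttling"
    elif current_c >= comfort_c:
        status = "hot"
    else:
        status = "ok"
    return headroom, status

print(thermal_headroom(60))  # (40.0, 'ok')
print(thermal_headroom(85))  # (15.0, 'hot')
```

On these numbers, the 60c reading reported above sits well inside the safe band, which matches the poster's conclusion that heat is not the limiter here.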
Deleted3082751User Posted January 2, 2018 Share Posted January 2, 2018 (edited) you are right, the i7 4970k throttles at 100 degrees, i stand corrected, i could not believe that this is the temperature that the CPU starts throttling, i mean it is the exact temperature of Boiling Water, i could not imagine a cpu even surviving until then. also are you sure that the 1000 series GPU throttles at 60 degrees, as i have the 1070 and have never noticed it at all. anywho 1. is your power options set to High Performance Mode in Nvidia Control Panel ?, if not, set it to High Performance Mode 2. You could also try changing your Windows Power Options to High Performance Mode as well 3. Have you Modified Fallout 4 Ini files at all ? 4. Do you have Nvidia's "Geforce Experience", if so undo any changes it made to your games, and delete it, it is not recommended to have this program as reported by a lot of people, due to it making bad optimizations to games, and its overlay can interfere with other programs, Such as Msi Afterburner, and Especially ENB. Note: In Nvidia Control Panel the Power management mode will be set to Optimal Power by default, this is obviously no good. and likewise windows default power options will also be set to optimal power, again no good, this can cause Core Parking, which in turn shuts of the CPU cores, which results in sub optimal performance, this will also have a negative impact on the GPU, as it will have to wait for the CPU. thats all i can think of atm Edited January 2, 2018 by Guest Link to comment Share on other sites More sharing options...
Kalell Posted January 2, 2018 (edited)

I got that 60c figure from Gamers Nexus, so I'm pretty sure it's accurate. It's only a slight throttle, though. Yup, I have High Performance set in both NVIDIA Control Panel and Power Options. I also used a program a few months ago to unpark my cores. I do have modified inis, but I've deleted them and let the game re-make them and it didn't make a difference. No GeForce Experience; I've had it before, but it causes more problems than it's worth.

Edited January 2, 2018 by Kalell
Appusle Posted June 26, 2019

Quoting an earlier post: "60 degrees is pretty hot for a CPU, and will start limiting its lifespan, if this temperature is constantly reached and prolonged. ... if a processor was to reach these same temperatures every day for long periods of time, they would die within weeks maybe even days."

Normally I don't necro posts, but this needs to be addressed; it's false information. 60c for a CPU is NOT too hot; in fact, it's perfectly fine. The 4th gen i7 has a TjMax of 105c, and of course the CPU will start to throttle if it nears that temperature too quickly. That entire section is complete and utter bullshit; whoever told you this is an idiot, honestly. CPUs will not die within weeks or days if they hit 70 to 80c. How do I know this? Well, I know a fair few people who run hefty OCs on their 9900Ks, sustaining about 70 - 75c every single day for months now without any issues. On top of that, every MacBook is "designed" to run at 100c pretty much constantly, since it doesn't ramp up the fan speed until it reaches or nears 100c; a classmate had a Mac, and every other review of a MacBook confirmed this. Only the new MacBook 2019 does a more decent job at cooling. Just because it hits the temperature of boiling water doesn't mean it can't survive. You obviously never looked into CPUs and how they work; you just mimic what other users may have said online and take it at face value.
Never in my entire life have I seen a CPU die within weeks because it ran at 60c; if it did, it's covered under warranty because it is a faulty product at that point. I can guarantee that no Intel stock cooler can keep their CPUs below 60c, and Intel, however dumb they sometimes are, will not ship a cooler that jeopardises their product to that extent. /Rant over. Again, sorry for the necro, but I just couldn't let it rest.
Guest deleted34304850 Posted June 26, 2019

It had rested for 18 months until you read it and triggered yourself.