dukemordred Posted December 10, 2015 (Author)

My CPU is an i7 @ 4.00 GHz, my GPU has 2 GB of VRAM on a 256-bit bus, and my RAM is 8 GB.

Anyway, the question was not "why do I have bad FPS?" It was: why is it so bad ONLY inside the Institute? I have explored every single location in the game and nowhere else does it come even close to that huge FPS drop.
Sdesser Posted December 12, 2015

Quote: "If you don't have an i7 or equivalent, your game is maxing out the CPU. You are most likely only seeing this in specific spots, and even then maybe only in one direction. Upgrading is the only long-term cure, and it's not just Fallout 4. Why only now, is this new? Not really; in fact this is normal. The last 10 years of 32-bit gaming on 64-bit machines led to the proverbial 'runs on a potato' quotes. Well, no longer can you just upgrade the GPU on that potato: now you must upgrade the potato as well. To run the latest AAA games you must now upgrade both, about every three years, so alternating upgrades every 18 months is once again the general guidance, just like it was when most PCs were 32-bit. It's not so much about being 64-bit itself; it's more about the massive increase in available memory that 64-bit provides. All modern CPUs had no problem with the 32-bit 4 GB limit. Now the increase makes those with less power fail to cope."

I'm running an i5 2500K (clocked at 3.4 GHz) and the game isn't using it all up. It spikes into the 90s and very rarely hits 100%, but only when loading into a new area. Normally it runs at around 60-70%.
NorwegianAction Posted December 14, 2015

Same problem here, though I'm running the game in 4K. It seems to be the lighting in the game that does it. Outside is fine, but when I enter a place with bright lights (like the Institute) the framerate drops from 60 to 30. i7 5820K and Titan Z.
Sdesser Posted December 14, 2015

Quote: "Same problem here, though I'm running the game in 4K. It seems to be the lighting in the game that does it. Outside is fine, but when I enter a place with bright lights (like the Institute) the framerate drops from 60 to 30. i7 5820K and Titan Z."

Yeah, lights certainly seem to be the cause. Reducing shadow quality had no effect on Institute FPS.
UhuruNUru Posted December 14, 2015

Quote: "...I'm running an i5 2500K (clocked at 3.4 GHz) and the game isn't using it all up. It spikes into the 90s and very rarely hits 100%, but only when loading into a new area. Normally it runs at around 60-70%."

I refer you to a site that actually tested for this; they are not the only ones to see this issue either: Fallout 4 CPU Benchmark: Major Impact on Performance - i3, i5, i7, & FX. Gamers Nexus do great performance analysis for PC. No offense, but I trust their published results over anyone's anecdotal evidence. As an i7 owner I'm less concerned about this for myself, but I saw the thread and suggest it may be a factor in your issues, if not the entire cause.
Sdesser Posted December 14, 2015

Quote: "...I refer you to a site that actually tested for this; they are not the only ones to see this issue either: Fallout 4 CPU Benchmark: Major Impact on Performance - i3, i5, i7, & FX. Gamers Nexus do great performance analysis for PC. No offense, but I trust their published results over anyone's anecdotal evidence, though as an i7 owner I'm less concerned about this for myself; I saw the thread and suggest it may be a factor in your issues, if not the entire cause."

As a university student of Information Processing, I trust my own results and expertise over a third-party web page I've never heard of. I'm not saying they're wrong; what I'm saying is that their results aren't necessarily related to the issue.

My first problem with the tests is that they were conducted only on "Ultra" settings. Individual effects and setting levels can make huge differences in performance. Considering that games these days are primarily built for the consoles, Ultra settings usually aren't very well optimized, and for that reason they are not a good baseline for judging general performance - especially on an old engine that was built long before some of the effects they put in the game were even invented, which can make those effects very inefficient and costly to run.

On the other hand, as I stated, my CPU isn't capped while playing and experiencing this issue. That could mean the game isn't utilizing the CPU as well as it should. If that's the case, the problem isn't the CPU, and your argument to "upgrade your potatoes" isn't relevant.

*Edit: P.S. Your i7 has pretty much identical gaming performance to my i5. The difference between an i5 and an i7 is that the i7 supports hyper-threading, which is an amazing technology if you're encoding video or the like, but it has little effect on current-gen gaming.
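[Editor's note: one wrinkle in the "my CPU isn't capped" argument is that an aggregate reading of 60-70% on a quad-core can still hide a single thread pinned at 100%, which is enough to bottleneck a largely single-thread-bound engine. A minimal sketch of checking this, assuming Python with the third-party psutil package installed; the watch_cores helper and its thresholds are my own illustration, not anything from the thread.]

# Sample per-core CPU load while the game runs (pip install psutil).
import time
import psutil

def watch_cores(duration_s=60, interval_s=1.0):
    """Print per-core utilization; one core pinned near 100% suggests a
    main-thread bottleneck even if the overall average looks modest."""
    end = time.time() + duration_s
    while time.time() < end:
        per_core = psutil.cpu_percent(interval=interval_s, percpu=True)
        total = sum(per_core) / len(per_core)
        flag = " <-- possible single-thread bottleneck" if max(per_core) > 95 else ""
        print(f"avg {total:5.1f}% | cores {per_core}{flag}")

if __name__ == "__main__":
    watch_cores()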
UhuruNUru Posted December 14, 2015

You can trust your own untested opinion over any published data; no one else can. Publicly available data can be scientifically checked for accuracy; your opinion can't be. Is your GPU even powerful enough not to cause a GPU bottleneck? Because if it isn't, your CPU stats are inaccurate and may be held down by a maxed-out GPU. As a university student, you should know the scientific method demands publication and independent verification; your "results" are worthless if they are unverifiable. Published data, whether correct or not, can be checked; unsubstantiated claims are meaningless. Others' testing methods are published. Gamers Nexus was just a site I know tests reliably, and I saw that they analysed the specific issue of CPU optimization with a GPU powerful enough that it wasn't causing any bottleneck at all. Unless you show your full results and method, it's untestable and worthless. To quote the Gamers Nexus test methodology:

"Test Methodology
We tested using our 2015 multi-GPU test bench, detailed in the table below. Our thanks to supporting hardware vendors for supplying some of the test components.
NVidia's unreleased Fallout 4 drivers were used for testing, including the Fallout 4 optimizations. We tested using our best single-GPU available, the EVGA GTX 980 Ti Hybrid. The objective was to reduce GPU bottlenecking as much as possible to present the absolute performance difference between the CPUs.
Game settings were configured to "Ultra" with "ultra" overrides where not selected (1080, 1440) and "Medium" (1080). Once we determined which settings provided a reasonable level of load for appropriate video cards, we forged forward testing those configurations on our suite of CPUs.
Each scenario was tested for 30 seconds identically, then repeated three times for parity. We tested in Diamond City, the first major township the player reaches. We found parts of Diamond City to produce highly intensive load, with a performance gap as wide as nearly 60% in some instances. This makes Diamond City a poorly optimized region of the game that represents a mixed load scenario; our test run begins with the camera pointed toward a heavily occupied region of the city, then moves around a much less intensive corridor. The result is a mixed GPU load that is 100% reproducible and representative of real-world play experiences.
Our above video shows the course we used. This was chosen for its reproducibility and reliability during test. Benchmarks which do not precisely emulate our course taken will vary in results, depending on what area of the game they were executed."

Now it may or may not be a factor in this issue, but the data is clear: the i5 is bottlenecked and the i7 isn't. I never said it was the cause of your issue, just a possible one. Unless you have a 980 Ti, your GPU will likely bottleneck before the CPU could, and it's irrelevant that the CPU is not maxed when the GPU bottleneck is the limiting factor.

As for an i7 being an i5 with hyper-threading, that's not quite true, as there is a range of both i5 and i7 parts, so your 3.4 GHz isn't the same as my 4.6 GHz i7. It's more accurate for matching CPUs, e.g. an i5 at 3.4 GHz versus an i7 at 3.4 GHz: as you said, the i7 has hyper-threading and the i5 does not.

The basic fact is that now we are playing 64-bit games with expanded memory access, and all CPUs, including i7s, will become bottlenecked much faster by new games. That wasn't true of 32-bit games played on 64-bit PCs, which didn't tax an i7 in any way and led to the general attitude that an i5, or even an i3, was all you needed for any game. Games are now starting to max out all cores, including the i7's hyper-threaded ones; the data is clear. The factors are the same as before the last generation: in the PS2/Xbox era, PC owners usually alternated upgrades between CPU and GPU. With the advent of the PS3/360 era that stopped, and for about 10 years only GPU upgrades have been required for PCs. That time of CPU dominance is over. These are basic facts that all gamers need to be aware of:

i3s are obsolete today.
i5s are borderline obsolete today.
i7s are fast becoming obsolete today.

Within a year or two at most, we will all need to upgrade the CPU and move to even better new hardware, whatever comes after the i-series. They may disappear completely for gaming, with VR needing even more power than ever. Clock speed may become the key difference, and games may be unplayable at Ultra with fewer than 8 cores (16 with hyper-threading). It's a new 64-bit playing field, with all the extra memory that provides; both GPU and CPU are needed for high-end gaming. The years of just upgrading the GPU are gone for good (at least until 128-bit PCs appear). Currently my i7 only has 4 cores (8 with hyper-threading). The days of i3 Ultra gaming are over; i5 and then i7 will soon follow down that same dodo trail to extinction.
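[Editor's note: for readers who want to reproduce this kind of benchmark themselves, the quoted methodology boils down to timing the same 30-second course several times and averaging the results. A minimal sketch of the arithmetic behind headline numbers like "average FPS" and "1% lows"; this is my own illustration with stand-in data, not Gamers Nexus' actual tooling, and assumes you already have per-frame times in milliseconds from whatever capture tool you use.]

# Compute average FPS and 1% lows from frame times (ms), averaged over runs.
from statistics import mean

def fps_metrics(frame_times_ms):
    """Return (average FPS, 1% low FPS) for one benchmark pass."""
    avg_fps = 1000.0 / mean(frame_times_ms)
    slowest_first = sorted(frame_times_ms, reverse=True)
    cutoff = max(1, len(frame_times_ms) // 100)   # worst 1% of frames
    one_pct_low = 1000.0 / mean(slowest_first[:cutoff])
    return avg_fps, one_pct_low

# Example: three passes of the same course, each a list of ms-per-frame (stand-in data).
runs = [
    [16.7, 17.1, 16.9, 33.4, 16.8],
    [16.6, 17.0, 17.2, 31.9, 16.9],
    [16.8, 16.7, 17.3, 32.8, 17.0],
]
per_run = [fps_metrics(r) for r in runs]
print("avg FPS:", round(mean(m[0] for m in per_run), 1))
print("1% low :", round(mean(m[1] for m in per_run), 1))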
raatorotta Posted December 14, 2015

What resolution do you use? I'm running an AMD FX-8320 and a GTX 760 with 12 GB of RAM. I haven't been to the Institute yet, but reading this I'm afraid I'll get lag. I play at 1600x900 on my 32" Full HD TV, though; any bigger resolution makes the text too small for my liking.
UhuruNUru Posted December 15, 2015

Quote: "What resolution do you use? I'm running an AMD FX-8320 and a GTX 760 with 12 GB of RAM. I haven't been to the Institute yet, but reading this I'm afraid I'll get lag. I play at 1600x900 on my 32" Full HD TV, though; any bigger resolution makes the text too small for my liking."

Generally, if your CPU meets the minimum requirements the game should run, but only with everything turned down to minimum settings. Most PC issues like this are actually self-inflicted, through the natural tendency to want the best settings. Only the latest kit should play any new game on Ultra+. If your PC merely meets the minimum requirements, always use the lowest settings. When most people have no problems but you and a few others do, and dropping to minimum settings doesn't solve your issue, it's possible only a component upgrade will.

I don't know AMD CPUs at all, but your graphics card is good enough, and the FX-8320 is listed as an 8-core CPU, so you appear to be well above minimum. With that GPU, though, Ultra+ is clearly too high. If you see issues, try turning everything to minimum and it should run fine.

PCs aren't like consoles; they range from weaker than a console to about 3 or 4 times better. That's why we have variable settings. We all want, and could use, Ultra+, but most of us must accept some loss of performance to run on Ultra. Only you can decide what you will accept and what you won't. The standard is 1080p @ 60 FPS; if your standard is higher, say 1440p @ 144+ FPS, then you need the latest kit. The 900p you use is less to render, and that may be all the difference you need.
NorwegianAction Posted December 15, 2015

I noticed that when I look at certain types of bright light that make me lag, both my GPUs drop from 100% to around 60%. It's just a really annoying thing; I hope they fix it in a patch, maybe...
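[Editor's note: a GPU utilization dip that coincides with the FPS drop is consistent with the GPU waiting on something upstream (the CPU or the engine) rather than being the bottleneck itself. One way to capture this alongside gameplay, assuming an NVIDIA card with nvidia-smi on the PATH; the log_gpu_utilization helper is my own sketch, not a tool mentioned in the thread.]

# Log GPU utilization once per second to a CSV while playing; stop with Ctrl+C.
import subprocess

def log_gpu_utilization(outfile="gpu_util.csv", seconds_between_samples=1):
    """Run nvidia-smi in its CSV query/loop mode and write samples to a file."""
    cmd = [
        "nvidia-smi",
        "--query-gpu=timestamp,utilization.gpu,utilization.memory",
        "--format=csv",
        "-l", str(seconds_between_samples),
    ]
    with open(outfile, "w") as f:
        subprocess.run(cmd, stdout=f)

if __name__ == "__main__":
    log_gpu_utilization()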