Bloodlust666 Posted May 6, 2013

The FPS dips are quite random, but the problem is I shouldn't be having any. Any ideas to improve my FPS?

AMD FX-8350 8-core processor
Dual AMD Radeon HD 7870 (Crossfire)
8 GB RAM

I've seen people saying they get 60-90 FPS on maximum settings in Skyrim with a quad-core i5 and SLI 560 Tis (something like that), so what's the deal with my rig? Why am I getting such bad dips? My cards are significantly better than 560 Tis, and I have double the processing power plus much newer technology. I should add that my game maxes out at 60 FPS according to Fraps; I have a plasma HDTV, which lets me see framerates much higher than that. The link below shows I should barely be dropping past 60 FPS, and should go much higher:

http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-GTX-660-vs-HD-7870-plus-HD-7790-HD-7850-GTX-650-Ti-BOOST/Skyrim-
Beriallord Posted May 6, 2013

Just because you have double the cores doesn't mean you have double the processing power, nor does it mean Skyrim even uses the extra cores you have over an i5 quad core. A CPU like an i5 2500K or 3570K outperforms your AMD in most games unless they are heavily multithreaded, and Skyrim isn't. Skyrim is dual threaded, which means your eight cores count for nothing against an Intel i5 quad core.

I get FPS drops mostly in the swamp area, and the drops change immediately depending on my field of view. It's a strange drop, I never found a solution for it, and I pretty much had to just deal with it in certain spots. Other than drops to around 35 FPS in the swamp, I stay pegged at 60 FPS. I should also mention that my Skyrim is heavily modded with texture packs, so it's well above maximum vanilla settings.

Skyrim is also heavy on CPU usage, because it uses the CPU to render shadows, which are an absolute atrocity in this game. You might want to try overclocking your CPU. Overclocking my i5 2500K to 4.5 GHz eliminated most of the dips below 60 FPS entirely, except for the swamp area.

Edit: the swamp area is a GPU bottleneck, because it uses 100% of my Crossfire 6970s and all 2 GB of my VRAM. I've heard of people using as much as 2.8 GB of VRAM in heavily modded Skyrim. I think it's the fog in that swamp area that is torturing my GPUs.
Bloodlust666 Posted May 6, 2013

The swamp area is what drops me the lowest as well, usually around 25 FPS. And yes, I understand the FX-8350 has double the cores of the i5 but not double the processing power. What I'm asking is why I'm dropping so low so often when I don't use any texture mods whatsoever. I run on max settings (aside from shadows, which I turned down because I can't see the difference between one crappy shadow and the next slightly crappier shadow), so my game should be even less demanding than yours with no texture mods.

If you look at the link in my first post, it shows my Crossfire cards going upwards of 180 FPS in Skyrim. That's not something I need, but the point is I'm unsure why I'm rarely at 60 FPS and regularly dip to 40-50, which is VERY noticeable because it's a rapid stop-and-return dip. Where is all the power those testers were getting going?

By the way, there has been recent testing of the new FX-8350 against the i5 and i7, and it has been shown to sometimes do better than both (not always, but it happens). Not to mention it runs at a 4.0 GHz stock clock. It's about $120 less for only about 4 FPS less on average, which isn't even a notable difference in-game. Here's another link; I suggest you look at this one and the one in my first post as well. I am NOT achieving optimal performance from my machine, and I have come here to figure out why (so for future reference, save the lectures and help resolve this issue).
Beriallord Posted May 6, 2013

All I know is I'm getting better FPS than you, with higher settings, and your GPUs are a little better than mine, or about equal. The biggest difference between us is the CPU: I'm using an i5 2500K overclocked to 4.5 GHz, which is a pretty decent overclock of around 30%.

What resolution are you running at? I'm at 1920x1080. If you're running a higher resolution than me, that could be the problem.

Also, here is a benchmark: http://www.bit-tech.net/hardware/2012/11/06/amd-fx-8350-review/6

The numbers are skewed a little to compensate for overclocks, but clock for clock the i5 2500K still handily beats the 8350 in Skyrim. In fact a stock i5 2500K running at 3.3 GHz beats an overclocked 8350 running at 4.8 GHz. No offense, but that is getting owned pretty hard. They are also using a GTX 690, which is a monster GPU, so their results aren't comparable to what you and I are getting; dips like the swamp area probably don't happen with a GTX 690 and 4 GB of VRAM. Since our GPUs are about equally powerful (or yours a little better), the only real difference between us is CPU power. I have 16 GB of DDR3 RAM, but I don't think that affects Skyrim performance in any meaningful way.

Basically, try overclocking your CPU. If you don't know how, there's a good forum that can help: overclock.net. They helped me overclock my i5 2500K.
Bloodlust666 Posted May 6, 2013

Overclocking is something I could definitely look into, but I'm going to opt against it for now; I've never had it in me to push my systems harder than they already run. Anyway, I'm trying to let Skyrim allocate more RAM using the [Papyrus] settings in SkyrimPrefs.ini, because I just updated Fallout: New Vegas to the 4GB version and now get 100% smooth FPS there in all situations. Sure, it won't be much, but hopefully it's enough to at least give me smooth rates in non-bottleneck situations. I'll randomly take FPS drops in spots where I don't at other times; my FPS is just variable. I'll keep some amount of consideration toward overclocking, but Skyrim will have to annoy me and push me to my limits before I do that to my computer. Thanks though, good discussion; it gave me more to think over. Other opinions are still welcome.
prod80 Posted May 6, 2013

You can force Skyrim to use all your cores instead of just two, but Skyrim is not really CPU limited, rather GPU limited.

"But anyways, I'm trying to let Skyrim allocate more RAM using [Papyrus] in SkyrimPrefs.ini" - you cannot allocate more RAM to Skyrim through the [Papyrus] settings. Please don't do that, you'll just break your game. That's a misconception the RCRN people put on their website, and a Papyrus developer has answered it as completely false and something that shouldn't be tried.

As for bad FPS, it can be a combination of things. You run Crossfire, but that doesn't increase your VRAM. FPS flooring mostly comes from running out of VRAM, which comes down to your Skyrim.ini settings for how many grids to load and your texture packs (if any). If you have modified uGridsToLoad in Skyrim.ini, set it back to default before anything else and then continue troubleshooting; if that's a big no-no for you, you'll simply end up disabling more mods than you actually need to. There are LOD/distant-detail mods out there that make distant terrain (including trees) look a whole lot better with minimal or no performance loss.

Then, if the issue is not resolved, look at your texture packs and check that you're not running something utterly insane like 4K texture mods. If you play at 1920x1080 you simply shouldn't use anything over 2K textures, as your screen is roughly 2K wide; anything more is just performance loss for you. The only objects that need higher-resolution textures are ones that, at 100% size, can fill your screen multiple times (dragons, trees) - e.g. a dragon that can fill your screen five times over would need a 10K texture to stay detailed at full size.
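For reference, these are roughly the vanilla Skyrim.ini values being talked about, plus the multi-threading lines that get passed around for "using all cores" (how much those threading lines actually help is debatable, and exact defaults can differ slightly between game versions, so treat this as a sketch and check your own file). Note the [Papyrus] block normally lives in Skyrim.ini, not SkyrimPrefs.ini:

; Skyrim.ini - rough vanilla defaults
[General]
uGridsToLoad=5                  ; default; raising this is what eats VRAM and floors your FPS
uExteriorCellBuffer=36          ; should stay at (uGridsToLoad+1)^2

; commonly circulated "use all cores" tweaks - effect is disputed
bMultiThreadMovement=1
bUseThreadedParticleSystem=1
bUseThreadedBlood=1
bUseThreadedMorpher=1

[Papyrus]
; these are script time/memory budgets, NOT a RAM allocation for the game
fUpdateBudgetMS=1.2
fExtraTaskletBudgetMS=1.2
fPostLoadUpdateTimeMS=500.0
iMinMemoryPageSize=128
iMaxMemoryPageSize=512
iMaxAllocatedMemoryBytes=76800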
Dweedle Posted May 6, 2013

Load order, please; some mods affect frame rate. Also, shadows are the biggest framerate killer I believe, so if you have those on a high setting they will likely cause issues. See the example below. If possible, list what texture packs you use too.
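If you want to experiment with the shadow side of it, the main knobs are in SkyrimPrefs.ini under [Display]. Something along these lines is a reasonable middle ground; the values here are just an illustration, not the vanilla defaults:

; SkyrimPrefs.ini - example of dialing shadows down (illustrative values)
[Display]
iShadowMapResolution=2048       ; Ultra uses 4096; halving this is the single biggest win
fShadowDistance=2500.0          ; Ultra pushes shadows much further out
fInteriorShadowDistance=3000.0
iBlurDeferredShadowMask=3       ; softens the blocky edges a bit once resolution is lowered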
bben46 Posted May 6, 2013

FPS is vastly overrated. I know people will argue that THEY can tell the difference between 50 and 60 FPS, but the scientific evidence says they can't. What they are actually seeing is the abrupt changes in FPS. The console version is capped at 30 FPS, and most console players will brag that they get a smoother game and that it looks fantastic. Unfortunately, capping the FPS causes slower loading on the PC when you change cells, for some reason.

Instead of shooting for the highest FPS possible, try for the smoothest game at a resolution you like. The higher the resolution (and the more high-resolution mods you use), the lower the FPS you should expect. I cap mine at 42 FPS: any higher cap and I start to see noticeably slower loading, any lower and I start to get FPS drops. At 42 FPS I have a very smooth game and very few drops, using an older i5 650 CPU and a GTX 550 Ti with 1 GB of VRAM and 8 GB of system RAM, with shadows turned way down.
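One way to do the capping itself is ENBSeries' built-in limiter (a driver-level tool or any external limiter works too). If I remember the section and key names right, it looks roughly like this; the file is enbseries.ini or enblocal.ini depending on the ENB version, and 42 is just my personal cap:

; ENB config - FPS limiter (key names from ENB's limiter; 42 is an example cap)
[LIMITER]
EnableFPSLimit=true
FPSLimit=42.0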
myztikrice Posted May 6, 2013

Where is this scientific evidence? There is clearly a difference between 50 FPS and 60 FPS. A constant 50 FPS is smoother than a framerate that jumps around 60, yes, but that's not what that sentence is saying.
Rennn Posted May 6, 2013

Assuming ideal conditions, anything above 24 FPS should be invisible to the human eye; every movie you've ever watched was recorded at 24 FPS. However, game engines aren't perfect, and the simple fact is that on a modern game engine there is a visible difference between 24 and 60 FPS, because frame latency, instability, and microstutter all decrease at higher framerates, leading to a smoother picture. It's not the framerate per se that's different at 60 FPS, as the human eye generally can't tell the difference between 24 and 60; other, more subtle and lesser-known factors separate 24 from 60, so in real-world use on a modern engine there is still a visible difference.
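To put rough numbers on the latency point:

1000 ms / 24 fps ≈ 41.7 ms per frame
1000 ms / 60 fps ≈ 16.7 ms per frame

So a single late or uneven frame at 24 FPS opens up a much bigger visible gap than the same hiccup would at 60 FPS, which is where most of the perceived difference comes from.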