FMod
Everything posted by FMod

  1. You can almost always straighten them out; it just takes a little finesse. Maybe 1% of bent pins are non-repairable. This should really be in a separate thread. Anyway:
- The Noctua NH-D14 is obsolete. Today, buy a Thermalright Macho HR-02, or Archon, or NZXT Havik 140.
- PC P&C is a reseller of mostly mediocre-quality PSUs, and the Turbo-Cool design is ancient. I don't know how good or bad it is, but it doesn't really matter when you can buy the better Corsair TX650V2 (or 850V2, though there's no real point) for under $100.
- Fancy RAM is useless; Samsung DDR3-1600 works just as well and overclocks better.
- OCZ drives work fine, their issues are mostly in the past, but 60GB is too small. You'd be better off with a cheaper HDD (WD Green or Hitachi) and a 120GB SSD.
  2. You could manually delete Flash Player (including from Firefox!) and reinstall an older version. Strange, it works fine for me. The Adobe Updater kept crashing for some reason, though; I deleted it a couple of days ago.
  3. It depends. The cards can load up their memory, but excess capacity doesn't help much without the bandwidth to use it. 2GB on a 256-bit bus has long been the norm, so the 7970, with a 384-bit bus at higher clocks, shouldn't lack for bandwidth. The 580 3GB, with its slower 384-bit bus, saw only partial benefit from the extra memory.

Why no overclock? Phenoms overclock pretty well. DDR2-800 really runs at 400MHz, so that's probably why the BIOS indicates 400 - the same way DDR3-1333 is really only 667MHz.

The 680 4GB just doesn't show any advantage over the 680 2GB except with multiple displays. The 670 is less powerful, so that should be even more the case. It just doesn't help, be it because of the narrow bus or other factors.

Meanwhile, the latest Nvidia drivers produce severe problems in Windows - crashing games, a lagging desktop, and worse: http://forums.nexusmods.com/index.php?/topic/688816-nvidia-30142-driver-problem/ It's been over a month with no fix yet, except using older drivers. So it's hard to tell which is worse. Historically ATI has been slower to update their drivers, and they definitely have far less leverage when it comes to having games optimized for their cards. That shows in the drivers too: game makers do more testing and debugging on NV cards, so AMD has to play catch-up. But the cards themselves have more raw power this time, so they can catch up, as has happened in the past. Driver issues mostly affect Crossfire and SLI; single cards work well in both camps. There, SLI is definitely more trouble-free than Crossfire.
  4. The better energy efficiency comes from Nvidia slashing 64-bit (DP) units in favor of 32-bit (SP) ones. For a while now, GPGPU has been a major driver behind GPU architecture - it was the reasoning behind Nvidia's own 8800 and Fermi architectures. AMD joined the game, and the 5000 series and above went all out. Then all of a sudden Nvidia drops DP GPGPU capability in the 600 series. Otherwise the new architectures (GCN and Kepler) are actually very similar.

Most GPGPU tasks use 64-bit calculations. However, a 64-bit unit needs more transistors, which take more space and consume more power. Going back to 32 bits reduces power consumption, but obviously you can only do 32-bit calculations then. Modern Radeons are 64-bit throughout. The 500 series had 1 in 4 units DP-capable; in the 600 series it's just 1 in 24, leaving DP performance at "compatibility only" level.

In games, one effect of the change is that with PhysX on, the 600 series often delivers lower framerates than the 500 series - as much as twice lower in some cases. It also kills the card's potential for any other task that requires double precision. If the 600 series takes off and they keep the same ratio in midrange chips, it might set consumer GPGPU development back a couple of years. It did let them set higher stock clocks at low wattage, but that counts as cheating in my book - they're shooting the whole industry in the foot to get sales right now. I'm just hoping it was a forced measure and not done on purpose.

GK104 is a rushed version of GK100. Nvidia planned to roll out the 600 series in December 2011, by Christmas. There were severe issues with low-level chip design, however, which didn't let that happen: the chips all came out malfunctioning. Around August they announced a serious delay. So, most likely, they took that time to quickly develop a cut-down variant, GK104, partially fixing the low-level issues and partially just shrinking the chip to make it easier to produce. The narrow memory bus and only 2GB almost certainly came from that. Hopefully the slashed 64-bit capability was also part of this rushed process. If not, they'll set GPGPU development back, which means lower compute performance and fewer GPU sales due to reduced usefulness. I've been sticking with Nvidia so far, but with the 600 series, I don't know what to think. It's not the product we were promised - they'd been talking about better computing capabilities, more versatility, rising GFLOPS/watt, and instead it's just a stopgap.

The memory bus in question is on the card itself, chip to GDDR5. The data doesn't pass through PCI-E - that's the whole point of storing it onboard. Though all new cards are PCI-E 3.0.
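The unit ratios above (1 in 4 DP-capable units on the 500 series vs 1 in 24 on the 600 series) translate into roughly a 6x drop in relative DP throughput, all else being equal. A quick back-of-envelope sketch - the ratios come from the post, everything else is plain arithmetic and ignores clocks and shader counts:

```python
# Relative double-precision (DP) throughput implied by the DP-unit ratios.
# This is a toy comparison: it assumes equal clocks and unit counts,
# which real GF110/GK104 chips do not have.

def dp_fraction(dp_capable_per_group, group_size):
    """Fraction of shader units that can issue a DP operation."""
    return dp_capable_per_group / group_size

gtx_500 = dp_fraction(1, 4)    # 500 series: 1 in 4 units DP-capable
gtx_600 = dp_fraction(1, 24)   # 600 series: 1 in 24

print(gtx_500 / gtx_600)  # ~6x lower DP:SP ratio on the 600 series
```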
  5. The OS and the game don't put much load on the HDD (<10%). What matters is that the bitrate of raw 2MP video is higher than the speed of most drives. Specifically, 2MP x 3 bytes x 60fps is 360MB/s, and even the few SSDs that can hit that speed can't sustain it. In all cases recording 720p makes more sense than 1080p. Youtube uses a lower bitrate for 1080p than is recommended for 720p, so it's only 1080p in name. The reduction in resolution isn't going to degrade game video quality much, especially for Skyrim, which is no eye candy in the first place. That's why I suggested 16GB with only half used for a ramdisk. One of the best programs I know of is VSuite Ramdisk: http://www.romexsoftware.com/en-us/vsuite-ramdisk/download.html Unfortunately the free version won't do, but you can use the enterprise version free for 15 days.
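The arithmetic behind that 360MB/s figure, assuming uncompressed RGB at 3 bytes per pixel (the specific resolutions below are illustrative):

```python
# Raw video data rate: width * height * bytes-per-pixel * fps.
# Uses 1 MB = 10**6 bytes, matching how drive speeds are advertised.

def raw_rate_mb_s(width, height, bytes_per_pixel=3, fps=60):
    """Uncompressed video data rate in MB/s."""
    return width * height * bytes_per_pixel * fps / 10**6

print(raw_rate_mb_s(1920, 1080))  # ~373 MB/s for raw 1080p60
print(raw_rate_mb_s(1280, 720))   # ~166 MB/s for raw 720p60
```

Exactly 2,000,000 pixels gives the round 360MB/s quoted above; a real 1920x1080 frame is slightly larger, hence ~373MB/s.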
  6. I'm aware of what's going on in the industry - not an insider, but I have to work with this enough. It's not clear what will happen to GK110; it may be released, it may not. GK104 just isn't that good: it's slower than GF110 (GTX 580) in computing tasks, not to mention Radeons, and its fairly low theoretical numbers mean it lacks scalability to high resolutions, multiple monitors, or more demanding tasks. It isn't bad, but it's not something to jump at.
  7. Actually, the CPU is going to make a lot of difference - it does most of the encoding work. Ideally you could look into LGA2011 and the i7-3930K, but that's a lot of money for a marginal improvement over the 2600K. 4GB of RAM is too little, however; you want 8GB. You also need a fast drive to store the recorded video and temporary files on. Using 16GB of RAM, setting aside 6-8GB of it for a RAM drive, and doing all your recording there should help. And work with the settings: use minimum compression, low quality, the fastest encoding you can get. Don't bother with 1080p; it's excess detail that will be lost on youtube anyway. Record at 720p, then upload it as 1080p - there should be little difference in quality, but a lot in FPS.
  8. The 670 4GB is a joke. Even the 680 4GB barely ever extracts a minimal benefit over its 2GB version. It's not all about memory size: the 670 also has a narrow 256-bit memory bus, which doesn't let it access the memory fast enough to make use of it. If you want high resolutions and massive textures, you'd be better off with a 7970, which has 3GB on a 384-bit bus. I'm personally waiting to see if GK110 lives up to expectations. While we're at it, if you do buy a 670, try to get one of the versions that use a 680 PCB, like the Asus DC2. Anyway, I doubt Skyrim loads up even 2GB of VRAM. Oblivion and even Morrowind have far more volume in texture mods and don't come close to it. Skyrim is a console game by default, and consoles only have 256MB.
  9. Pretty much every non-reference 7970 is already overclocked to 1GHz. And you want a non-reference one, from Sapphire or Asus, for their quieter cooling systems. The 7970 is only a little more expensive than the 7950 and you can overclock it further, so it's worth it.
  10. The change in performance is more like -3%. The only useful thing Virtu can do is let you plug your display into one adapter and render on another, at a cost of -1% to -10% performance. The "60% gain" works by detecting frames that haven't changed and displaying the same frame over and over, so your framerate climbs while the image stays the same. They seem to have toned it down; it used to be "400%" once. It does work in a very few circumstances, though - specifically when you have a static background at very low fps and do something that amounts to scrolling it. In that case it can produce smooth scrolling where honest rendering at 10 fps would not.
  11. They are real cores, they just aren't very fast cores. They do have shared x86 decoders, but that's just part of the architecture. So a Phenom II or FM1 Athlon is better. Never heard of that. It would be very strange, really, since a CPU just runs the instructions it's given; it either does that right or it doesn't. What it does have an issue with - or rather, the issue Windows 7 has with Bulldozer - is that it can't properly assign threads to its cores. Like Hyper-Threading's virtual cores, Bulldozer cores are paired into modules. The correct way to assign threads is Core 0, then Cores 2, 4, 6, then Core 1, then 3, 5, 7. It's a lot more complex than that in reality, but that's the general idea.
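The module-aware ordering described above can be sketched like this - an illustrative toy, not how the actual Windows scheduler hotfix works:

```python
# Thread placement order for a chip whose cores are paired into modules
# (e.g. an 8-core Bulldozer = 4 modules of 2 cores). Fill the first core
# of every module before doubling up on any module, so threads share
# module resources (like the decoder) only as a last resort.

def placement_order(num_cores, cores_per_module=2):
    """Return core indices in the order threads should be assigned."""
    order = []
    for offset in range(cores_per_module):              # 0 = first core of each module
        for module_start in range(0, num_cores, cores_per_module):
            order.append(module_start + offset)
    return order

print(placement_order(8))  # [0, 2, 4, 6, 1, 3, 5, 7]
```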
  12. You would need dedicated hardware to do it without performance loss. It's expensive; don't even bother. Your real goal is probably just youtube videos, so drop the resolution down to 540p or 720p - remember, youtube compression is so bad that you won't even have 540p-level detail in the final picture - and keep the settings from getting too high. If you want the youtube video as 1080p, capture at 540p (960x540; you'll need to add a custom resolution) and then upscale it to 1080p in video editing software. The end result won't be as good as 1080p throughout, but it's better than youtube SD.
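The reason 960x540 in particular works well: it maps to 1920x1080 by an exact factor of 2, so each source pixel becomes a clean 2x2 block with no fractional resampling. A quick sketch of the check:

```python
# Verify that a target resolution is an exact integer upscale of the
# source - the case where upscaling introduces no resampling blur.

def integer_scale(src, dst):
    """Return the uniform integer upscale factor from src to dst."""
    (sw, sh), (dw, dh) = src, dst
    assert dw % sw == 0 and dh % sh == 0 and dw // sw == dh // sh, \
        "not a uniform integer scale"
    return dw // sw

print(integer_scale((960, 540), (1920, 1080)))  # 2
```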
  13. Video capture severely stresses any system. Generally you need a special card and a separate system if you want to do it in full quality. Some settings can be adjusted in video capture to reduce the strain, generally you want as low a quality as possible, since youtube quality will be lower still. Still do.
  14. Can't you just buy the blue-LED speakers you like and swap in a red LED? LEDs are a dime a dozen, and some smaller stores with good service can even help you pick the right LED if you show them the original. That's unless someone actually knows of a pair exactly like you need. If not, replace the LED.
  15. Right off the bat it doesn't strike me as a good deal.
E6600: cost $80 new... now, maybe $25
Mobo: say $60 (used value <$30)
RAM: $30
GT440: $20 used, $50 new
HDD: $70
DVD: no idea, $20?
PSU: +$10
Case: $80
Was there something else?
  16. It's a little slower than a high-end quad-core Phenom. No match for an i5-2500K, obviously. But then it's half the price, with 60% of the CPU performance and 200% of the GPU performance vs the 2500K. Intel chips also have a GPU, but even if you don't care about its performance, there are the drivers. Nvidia drivers >= AMD GPU drivers >> Intel GPU drivers. They're bad, Intel knows they're bad, but since the people who care most have a discrete GPU, Intel chooses to care the least.

They're all direct current. Modular just means you can remove some of the cables; it doesn't really affect anything apart from aesthetics. I suppose you compared a $100 modular PSU with a $10 "came with the case" unit - but it's not modularity that made the difference.
  17. 1. About 3 watts for 2.5", up to 10 watts for 3.5". Peanuts; no effect on the PC.
2. SATA is permanent, although semi-hot-swappable. You must mean eSATA, which is permanent/temporary and hot-swappable. USB is completely plug-and-play. Firewire is somewhat better than USB, but you'll never notice. eSATA easily outperforms everything else, including USB 3.0. Since it's just SATA with another cable and PnP capability, you can use eSATA drives like internal ones and never know the difference. But it's also like internal in that eSATA drives don't normally go to sleep, while USB ones do, and USB is more compatible. USB 3.0 speed is usually enough for everything; 2.0 is noticeably slow.
3. It's just a tool; it either works as required or it doesn't.
  18. It may be a difference in where you measure the framerate. BTW, 5450M is actually slightly (by 3%) faster than regular 5450. Either way, tests have been done: http://www.notebookcheck.net/The-Elder-Scrolls-V-Skyrim-Benchmarks.66057.0.html 7450 is slightly faster than 5450 (it's also old-gen).
  19. You need an upgrade. My post here addresses the same issue from another poster: http://forums.nexusmods.com/index.php?/topic/697822-skyrim-problems/page__view__findpost__p__5534089 Basically there's no way you'll get acceptable performance out of 5450.
  20. You didn't have to post the whole log... Your PC has a very weak GPU, probably an integrated unit. No chance of smooth gameplay. It's in fact so weak that even the iGPU in an i3-2105 would be faster, and that's only marginally faster than the one in the i3-2100. Such video cards are usually bought for office PCs. If you live in the US, you should buy this card: http://www.newegg.com/Product/Product.aspx?Item=N82E16814127664 I mean this exact card at this exact retailer, preferably ordered tomorrow. At $100 for a factory-overclocked 7770 with good cooling it's a great deal; normally you can only get a stock 7750 for that money. At 1440x900 (your monitor's native resolution) you should be able to run Skyrim comfortably on high settings, and really most modern and upcoming games for a few years.
  21. I upgraded to 301.42 today, and browser typing got incredibly sluggish, slowing down the whole PC; windows were slow to move too. I went as far as rolling back Firefox, to no avail. Uninstalling 301.42 and going back to the 301.24 beta solved the problem. Just a heads-up.
  22. Better testing procedures and a better reputation - plus multiple other sources agreeing with it. That said, the sources don't actually disagree much. Both Guru3D and THG place the 7750 quite close to the 5770 most of the time, with the 7750 noticeably ahead in some modes. Guru3D's testing doesn't include Skyrim. Another source that did this comparison is Xbit: http://www.xbitlabs.com/articles/graphics/display/radeon-hd-7770-ghz-edition-hd-7750_8.html#sect0 Click on the large table. You can see how the 7750 is a bit faster than the 6770 or equal to it almost everywhere, the only exception being Dirt 3.
  23. That's not a reasonable argument. You don't mean to suggest that if people disagree with your source, their only recourse is to write letters to it, rather than to present another - including a better - source, do you? TPU is more reliable than either (especially THG!), and here's its page: http://www.techpowerup.com/reviews/Sapphire/HD_7770_Vapor-X/22.html Never mind the Vapor-X part. The current price match for the 5770/6770 is the 7770, which is much faster. But you can see how even the 7750 is at least a match for the 6770; in the overall score they are very close and generally even. edit: I have no interest in the argument itself, since I know what I'm stating to be correct to a high degree of certainty. As for the OP, like I said, he should either jump at the $170 560 Ti deal if that's his top dollar, or put together enough for a 7850 ($220-$250) if he has more set aside for less important PC upgrades.
  24. Not really. I'll direct you to the Anandtech testing: http://www.anandtech.com/show/5541/amd-radeon-hd-7750-radeon-hd-7770-ghz-edition-review/19 Look at the 5770 figures. The 6770 is a 5770 with a different label, as you can confirm here: http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units Overall the 7750 is about even and sometimes faster. In Skyrim the 7750 doesn't quite wipe the floor with the 6770 (5770)... although, actually, in some relevant settings it does. Price-wise, the cheapest 6770 or 5770 I've found on newegg (cba to search everywhere) costs $100 post-rebate. Just a couple of posts above, I found and linked a 7770 for the same $100 post-rebate. That's the same price, and the 7770 is considerably faster. The latest Catalyst dates to April 25, 2012 - that's not quite yearly. The latest Geforce driver release is a May 22 WHQL 301.42 rehash of the April 9 301.24 with a couple of extra game profiles. For customer support, you would normally contact your video card vendor, not the GPU maker directly.
  25. I used to go back and forth. Now I have both (two high-end NVs in one computer and two midrange Radeons across two others) and, for the love of all that is good and holy, can't tell the difference. Crossfire is where it gets tricky, but problems with a single GPU are exceedingly rare. And if you want to believe NV's drivers are flawless, check this thread: http://forums.nexusmods.com/index.php?/topic/688816-nvidia-30142-driver-problem/ This is simply not so. New 28nm GPUs deliver better performance at half the power, and overclock better if you want them to. Their price/performance ratio is the same or better, except for high-end units, where you pay a premium on top for getting the best there is.