Everything posted by obobski

  1. Why 1366x768? That's fairly low resolution these days. As far as what you want, ignoring all of your super-specific peripherals (which will run the price up, and there's nothing to be done about that unless you can budge on them), a decent Core i3/i5 with a higher-end graphics card (e.g. a 290X) should have no problems doing what you want at 1080p, and fit around $1000 price-wise. With all of your specific add-on peripherals, and starting from scratch (e.g. no monitor, no keyboard, no OS key), it'll go higher as a result (your peripherals alone can add another $1000 (or more) to the price, depending on what you mean by "high quality speakers" and "nice looking case"). Within an $800 budget, you'll either have to budge on your peripherals, or budge on performance, and potentially both depending on how much "extra" stuff you need.
  2. The "dual BIOS" thing on the graphics card has nothing to do with the system's BIOS - most recent AMD cards have a switch between two BIOS settings for power management and/or overclocking, but that's independent of the system's BIOS/firmware. Performance-wise a single Fury will be worse than twin 780s, and with the 9590 you'll experience generally lower performance than anything with an Intel processor (which is what a lot of modern benchmarks use, due to the 9590's age), however its much more appropriate for the PSU (assuming the PSU is still in good shape) and won't have any of the multi-GPU qurikiness of SLI/CrossFire (so performance may improve in some applications). Overall not a bad buy.
  3. That looks like a decent system spec-wise; not a huge fan of the PSU but it shouldn't be the end of the world.
  4. This kind of superlative hyperbole isn't constructive imho (especially when based on vague anecdotes and not quantified data). There is no "magic secret sauce" to having an SSD - it can improve disk-related operations as it offers higher throughput and lower latency than some other storage solutions; faster storage *can* be a benefit for certain tasks and that's been true since the beginning of fixed storage on PCs (and I say "can" instead of "will" because it really depends on the application and what it needs, and how the operating system handles storage - most modern operating systems do a great deal of buffering and prefetch to improve performance, for example). However it does absolutely nothing (at all) for computationally bound tasks (e.g. "frame rate") - just because the thing can "start up" faster does nothing beyond letting it "start up" faster, and the same goes for (in a gaming context) things like level loads. Honestly I'd dump the SSD over any other significant component (e.g. graphics, CPU, memory, PSU) if budget is a consideration - you can throw as many top of the line SSDs as you want at a Pentium Pro with S3 VIRGE graphics and it will still never run Skyrim, Windows 10, etc but a Broadwell Core i7 with GeForce GTX 980 with a mechanical hard drive will happily do both (and a lot more). (and yes, this is a hyperbolic example, but so is "not having this piece of computer hardware is equivalent to attempting suicide with a gun") As far as the remaining systems, #4 and #6 both look good to me. #6 has a better CPU (at least for gaming), and while the GTX 750 isn't the fastest kid on the block, it's not a slouch; #6 also appears to have a better warranty (I care about things like this, you may not). #4 has a better graphics card, but I'm somewhat skeptical of how useful that will be with the FX-4300. Out of curiosity: any reason you don't build the machine yourself? What's your overall budget like as well?
  5. Sentinel will silently throttle/stop the GPU on overheating (e.g. it can let it overheat, slow it down/stop it, let it cool, and resume functionality seamlessly) - it will complain about power on start-up if the AUX connector isn't present (but is required), but if the motherboard's power section fails (or the PSU fails) you may not get a Sentinel pop-up in Windows. All of this is independent of the drivers though, so even if the drivers (for some unknown reason) stopped cycling the fans on/off or spinning them up to appropriate levels (which, with nVidia OEM drivers, is something I can't say I've ever observed - of course with "hacked" or "modded" drivers anything is possible), Sentinel should still function. Without more information wrt warranty, fan condition, other hardware condition, etc, it's really tough to make any good judgment about this system or what it's doing, other than "not working." That PSU is on the "too small" side for this kind of hardware as well, and it doesn't matter if it measures very well - near 100% output on a long-term basis is no good for reliability. Without more information, this sounds to me like there's a lot of hardware involved that needs thorough testing, and some of it will likely need to be replaced, with more care/attention given to configuration and part selection at that time (e.g. if the goal is an FX-9590 and 780 Ti or 290X SLI/CrossFire, a larger PSU is appropriate, and significant attention needs to be paid to cooling).
  6. I just noticed everyone has dots under their user name - what do they mean/represent? There isn't the same number of dots for all users, which is why I'm curious.
  7. You could upgrade from the 270X to something more robust, like a 280/290/390 or GTX 960 (I'm not a huge 970 fan due to the memory bug); the CPU is basically as good as you're gonna get (see the links I posted). On the SSD: yes, it can help load times IF the application is stored on the SSD. In other words, if you installed FO4 on the SSD, it would improve load times if they're disk-bound; it's just a faster storage device, so anything constrained by storage speeds (e.g. load times) can benefit from it. It won't do anything for frame-rates, or other computationally-bound features, so you'll have to decide exactly how much you think "fast boot up" or "fast load times" is worth, as SSDs can get pretty expensive (and represent an awful $/GB proposition).
  8. On GPU upgrades, I'm not a big 970 fan due to the memory bug; I'd go with GTX 960 or a Kepler (if the prices are low). For AMD, the 290 and 390 are the same thing, so whichever is cheapest today is what I'd get; the 280 shouldn't be overlooked either. Unless there's *tons* of other peripherals, the 650W PSU should be fine with any of these cards - the 290/390 do not consume 300W+ 24x7 (and that 300W+ number is from a review that ran the card with Furmark and something else to like 110% TDP just to get that number); they're very power efficient in real-world usage.
  9. Except, and here's where we're picking at nits, no nVidia GPU since NV30 (GeForce FX) will allow itself to overheat in that manner - they all have a feature known as "System Sentinel" that will clamp on thermal over-limit events. This feature is independent of the drivers - it is meant to save the hardware in the event of fan or cooling system failure, or PSU failure (if the card stops receiving power via the auxiliary inputs it will also clamp). I have multiple GeForce cards that have been saved by this mechanism over the years, either due to fan failure or other hardware failure. For extra "fun" you can even run the card with no heatsink, and Sentinel will save the board (I tested this on an already damaged board; I don't suggest you try it on something you care about). Same as any modern CPU. If the fans have simply failed (do they spin up?), it's likely the card isn't allowing itself to engage its full p-state - this is fairly typical behavior of Sentinel in action (some lower power cards can dance in and out of full p-state and stutter their way through 3D applications, but a 780 Ti is not one of those cards). I would again strongly suggest looking at a warranty claim as opposed to just throwing these in the trash, even if it is just to replace the fans.
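If you want to see whether the card is actually being clamped rather than guess, here's a rough sketch (Python; it just shells out to the nvidia-smi tool that installs alongside the driver, and the query field names are the ones current builds accept, so treat it as illustrative):

    import subprocess, time

    # Poll reported temperature, performance state, and SM clock once per second for a minute.
    # Requires nvidia-smi to be reachable on the PATH (it ships with the NVIDIA driver).
    # A card that stays hot and never reaches P0 / full clocks under a 3D load is being throttled.
    for _ in range(60):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=temperature.gpu,pstate,clocks.sm",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        print(out)
        time.sleep(1)

Run that while the 780 Ti is under load and you'll see pretty quickly whether it's dancing in and out of its full p-state.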
  10. Something I forgot to add: have you updated your graphics drivers since Fallout 4 came out? I know this has been a significant factor for performance with previous Bethesda games, and usually AMD (and nVidia) are pretty good about releasing driver updates that improve performance (and this can go on for quite a while - iirc nVidia didn't "stabilize" Skyrim optimizations/performance until early 2013, for example). And on the 4770/4790:
http://www.anandtech.com/show/7963/the-intel-haswell-refresh-review-core-i7-4790-i5-4690-and-i3-4360-tested/9
http://www.anandtech.com/show/7963/the-intel-haswell-refresh-review-core-i7-4790-i5-4690-and-i3-4360-tested/11
http://www.anandtech.com/show/9320/intel-broadwell-review-i7-5775c-i5-5675c/9 (and Broadwell for the heck of it; some Haswell-compatible boards support Broadwell)
http://www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/16 (and Skylake for the heck of it; Skylake requires a new platform)
I wasn't kidding about "won't do much better (if at all)" - it's not worth the $300+ by any stretch of the imagination imho. Extra memory won't do anything unless we're running out of memory, which I'm skeptical is the case here. Would need more information about that.
  11. What graphics do you actually have? "200 Series" is too vague - it will have a specific name, like R9 290X, R9 280, R9 295X2, etc. Also, what resolution are you trying to run at? Anyways: you won't do much better (if at all) than the 4690 as far as the CPU goes, barring an overclock (and that's entirely "up in the air" as to what you get). Broadwell and Skylake just haven't brought that significant of a performance increase (we're talking like 1 FPS or less in most games, per Anandtech's benchmarks). If that's a 290 card in there, you won't do a whole lot better than that either - Fury, Titan X, Titan Z, 980 Ti, etc are somewhat faster, but not orders of magnitude, and certainly not worth the price (and if it's a 295X2, you already have the fastest graphics card on the market). A 280 is somewhat slower, but I'd still expect reasonable performance in newer games. On the RAM: is that properly set up in dual channel, or is it running as single channel? I'm assuming you have a 64-bit version of Windows as well. Overall, I don't see much that you could do apart from CrossFire, but even that wouldn't be leaps and bounds, and can cause issues in some games. On the storage: *facepalm* The SSD doesn't benefit anything that's computationally bound, or anything that isn't stored on it, and there's no point in having it as an "OS drive" - the sole benefit of an SSD is increased read/write speed and lower latency, and that only applies to data it contains. In this case, it means the computer will load Windows (on start-up) quickly, which is of little utility beyond being a neat party trick. It has absolutely no benefit for games, or other applications, because they're not stored on it. It's small though, so you probably can't load much else but Windows on there anyways. Oh well, live and learn... :mellow:
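If you're not sure how the DIMMs are populated, a quick way to list them (rough sketch; it just calls the stock wmic tool, which only reports what's installed in each slot - you still need the board manual to know which slots pair up for dual channel):

    import subprocess

    # List installed memory modules: bank/slot labels, capacity (in bytes), and speed.
    # wmic ships with Windows; two matched modules in paired slots = dual channel.
    result = subprocess.run(
        ["wmic", "memorychip", "get", "BankLabel,DeviceLocator,Capacity,Speed"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)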
  12. Any of these cards will be bottlenecked by the 9590 - the 390 isn't a bad card though. I'd strongly consider a bigger PSU - the 9590 is 200-300W (actual measured) and the 390s will be around that as well - that leaves something like 100W (at most) for hard-drives, fans, the motherboard, other peripherals, etc, and assumes the PSU running at 100% (or near to it) output when under load (which is not a good idea even with a high quality PSU). As far as the 780s - do they have a warranty you could claim? Honestly I'm kind of surprised when you say "drivers killed them" - in some twenty years of working on computers I've never seen a driver legitimately kill a piece of hardware. It can render a machine unusable by being unstable or incompatible, but actually "killing" a piece of hardware is like the million-dollar shot. That said, if you're running all that on the same 1kW PSU, I wouldn't be surprised if the PSU is in not-so-great shape after powering all of that under heavy load. Can you, or have you, tested the cards individually and/or in another machine?
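To put rough numbers on the PSU point (these are just the worst-case ballpark draws mentioned above, not measurements from your machine, and it assumes you'd be replacing the twin 780s with a pair of 390s):

    # Back-of-the-envelope power budget with worst-case estimates, not measured values.
    psu_rated_w = 1000            # the existing 1 kW unit
    fx_9590_w   = 300             # FX-9590 under heavy load, worst case
    r9_390_w    = 300             # per-card worst case under heavy load
    gpu_count   = 2               # a CrossFire pair of 390s

    load_w   = fx_9590_w + gpu_count * r9_390_w
    headroom = psu_rated_w - load_w
    print(f"CPU + GPUs: {load_w} W, leaving {headroom} W for the board, drives, fans, etc.")
    print(f"That's {100 * load_w / psu_rated_w:.0f}% of rated output before anything else is counted")

That lands at 900W and 90%+ sustained output on a 1kW unit, which is exactly why I'd size up.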
  13. You'll be largely CPU bound in many games, and that's just unfortunately the reality of the FX platform (even if you got a 9590 and dealt with the cooling and power requirements, assuming your motherboard could even run that chip). Upgrading the graphics card to a GTX 960 or 970 (or AMD equivalents, like the R9 270/280/290 series) wouldn't be a bad idea, per se, but the performance improvement won't help the CPU. To upgrade the CPU, that means a new motherboard, and (depending on your licence situation) a new copy of Windows, not to mention the time for a totally new build. If you're going that route, I'd go Intel Haswell with Z97 or Z97X; that should let you re-use your DDR3 RAM and other peripherals at least. Honestly though, I can't imagine an FX-8350 + 660 Ti being that bad with modern games; I was using a Core 2 Quad Q9550 + GTX 660 SC up until December/January without much complaint, and your system should be better than that across the board. Just my 2c. The SSD will do nothing for computationally bound tasks.
  14. Any reason you can't just roll back to an older, working, nVidia driver?
  15. Didn't see your reply until now (I admittedly don't check Nexus consistently every day), here's my reply:
- It's not so much "excessively hard" as it is "excessively tedious" - a completely open-ended build question has a legitimately unlimited number of possibilities. As far as brands I've used and liked, however:
For motherboards: Asus, ASRock, Intel (they make their own motherboards), Biostar, and Gigabyte from time to time. I avoid MSI (awful customer service and reliability IME) and no-name/new-to-market brands (nothing comes to mind off-hand, but basically I don't like being the guinea pig).
For graphics cards, nVidia: EVGA and PNY
For graphics cards, AMD: Sapphire, XFX, and PowerColor
For sound cards: Creative (historically Razer and M-Audio were good, but neither still makes sound cards that I'm aware of - lots of stuff on ebay though (you can probably still find Razer AC-1 cards for cheap on ebay, and as long as they have the breakout HD-DAI cable they're a steal - assuming you get a motherboard with a PCI slot))
For PSUs (and I can't stress enough how important a quality PSU is): PC Power & Cooling (today they're "FirePower Technology" but they'll always be PC Power & Cooling imo), Antec, and Enermax; there are other good suggestions here, and in general I'd say read jonnyguru and pick something that measured/tested well
For cases: Silverstone, Lian-Li, Antec, and believe it or not, Rosewill
For RAM: G.Skill, Kingston, Corsair, and Crucial
That should get you started in the right direction at least - ofc there are many other manufacturers that may (or do) make great products; I haven't tried every single product or every single brand under the sun, and admittedly I prefer to find a brand that makes quality stuff at reasonable prices and stick with them, vs experimenting with every build.
- On Skyrim: yes, $1300 should be more than good enough. To give you some perspective, my old rig (used it up until last December) had a Core 2 Quad Q9550, 8GB of DDR3-1333, and a GeForce GTX 660 (which replaced a Radeon HD 4890 after its cooler died), and it could run Skyrim on Ultra with FXAA quite nicely (the 4890 could almost do this too) - that's mostly 2008 hardware. Skyrim is not really that demanding of a game - it's not "lightweight" (e.g. it isn't WoW or CS that will run on positively anything), but you don't need a modern $10,000+ machine with Titan X SLI and 512GB of RAM and 96 CPUs and so forth to get good frame-rates. You can totally mess with the performance by adding mods (especially if you get into texture/mesh mods and other graphics "enhancements"), and I'd say that really just has to come down to a judgment call on your part. IMHO I'm fine accepting more or less vanilla graphics and/or turning down some settings to get better performance vs throwing a mountain of money at it.
- On the RAM: the "more RAM = faster PC" line is a myth that's older than time itself. I still remember when CompUSA would try to sell people that line directly. It's just not true. Applications basically don't care how much RAM your machine has, as long as it has enough - anything extra is just wasted surplus capacity. Skyrim itself can never use more than 2-4GB of RAM, and that's true of many other games too (conventional Win32 limits are 2GB; with LAA flags they can use 3GB on x86 and 4GB on x64 - that's "up to" btw, not "will always require").
I'm sure if you asked people, plenty of them were playing it back in 2011 on machines with 2-4GB of RAM, not 32-64, but memory gets progressively cheaper (per capacity) as time goes on. There's no downside to 32GB of RAM aside from cost - if your goal is gaming, 16GB is a good choice imho because it's around the sweet spot for pricing right now (and to do 32GB you're doing 4 DIMMs, so you could always add another 16GB in the future), and will give you more than enough memory for contemporary 64-bit games too (I'm assuming you're going with a 64-bit version of Windows), which generally request/specify 6-8GB of system memory on their boxes.
- On the Intel CPU: yes and no. Intel has tried to simplify their marketing and branding in recent years by doing the "i" thing, but ultimately I think it just ends up being confusing in new ways. Technically speaking the "i7" series is top of the line, but for gaming performance there's no point - top-tier i5s will be just as good (we're talking <5% differences in benchmarks, usually fractions of an FPS or less), and in many cases top-tier i3s can be perfectly suitable as well. Generally, today, i3 means dual-core, i5 means quad-core, and i7 means quad-core with HyperThreading - a lot of games still favor single-threaded performance (and this is why AMD falls so far behind; CMT favors high multi-threaded performance), so there are plenty of cases (e.g. Skyrim) where a fast dual-core can be perfectly competent. i5 is a good place to be, price-wise, and unless you're doing something that can actually derive some benefit from HyperThreading (e.g. Handbrake encodes) I'd just save the $100+ on the i7. Ditto for Broadwell/Skylake - they're showing basically el zilcho in performance increases for gaming, but you'll spend more to get that; why not save the money?
- On the GPU: nVidia is *extremely* popular on many forums (like to the point of having a borderline Apple-esque following). This doesn't mean nVidia is "good" or "bad." "Steam runs better with nVidia" sounds like such an Apple-esque statement: Steam isn't a game, Steam isn't even a 3D application - it's just a digital content delivery service. It will run fine on all manner of hardware, be it a top of the line nVidia/AMD graphics card, the rinky-dink Intel IGP in an Ultrabook, or even my cannot-game-to-save-its-life 3DLabs proline card. As far as games being better/worse with nVidia - I'd say it's a mixed bag. nVidia has "held games hostage" with things like Gameworks and PhysX (these are SDKs that nVidia owns and pushes on developers, and often these games run worse or without some features on non-nVidia hardware; I'm trying to say this in the least conspiracy-theorist-sounding manner I can), and if you're after games that rely heavily on nVidia IP then you're going to be forced into their ecosystem (see where the Apple comment is starting to make sense?). Skyrim is not one of those games, however; it has run great for me on multiple cards from nVidia and AMD. As far as what I'd suggest, I like bang-for-buck - the GTX 960 is not a bad example there (just like the 760 and 660 before it), but moving up in price I'd probably switch to AMD with the R9 290/390 series (it's the same GPU - just go with whatever is cheapest) since they'll generally keep pace with the higher-tier nVidia cards and tend to cost $100+ less (and let's not even get started on Titan).
For Skyrim, I'd feel generally confident saying: "none of this matters" - Skyrim will run (more or less) maxed out on high-end cards from 2008, and will (no joke) run on cards even older than that (Ultra on a 7900GTX: https://www.youtube.com/watch?v=wXTrK_GtG60 - before you say "man, it's so laggy!" - this card came out in 2005*, and the game is on maximum settings; if you dropped things down it'd become much more playable, and consider that even middling cards today are orders of magnitude faster than the 7900GTX). However, if we're going to talk newer games, like Fallout 4 or whatever else, you will need to give graphics more thought - I still think the 960 is a good starting place, and then decide if you want to go cheaper (and probably somewhat slower) or more expensive (and somewhat faster). Depending on what kind of monitor you're hooking up to, the 960 may even be overkill (e.g. if your monitor is <1080p that's a very different story than if your monitor is 2560x1440). I will add that both manufacturers can, and have, made excellent products (and I own and use examples of both), but I wouldn't want to chain myself to one or the other, because neither of them makes a consistently better (or worse) product. Currently (and for the last few years) they've been pretty well tied, ignoring nVidia's (arguably) anti-competitive tactics, and I'd just go for bang-for-buck at this point.
- I haven't tried Win10, but I have no complaints about Win8.1 when I've played around with it. My main gaming PC still uses Windows 7, mostly because I don't like constant upgrades; I still have some secondary machines running Vista without complaint as well. I've read about Win10's mandatory updates, and this is where I'd be somewhat leery of going Win10 + nVidia, as nVidia has kind of a nasty history of breaking game support with successive driver updates (back to "it's a mixed bag") - sure, an update may improve performance in some new game, or some game that current reviewers are fixated on for benchmarking, but it may break playability, features, performance, etc in some old game. Example: the Game Ready driver updates that improved Watch_Dogs performance (which was used in a lot of reviews to benchmark) broke shadows in The Sims 2. Other example: the driver family that introduced and improved SLI performance broke most Lithtech-based games. If you don't generally worry about games more than a year or two old (excepting iconic titles (e.g. Skyrim, WarCraft III, StarCraft, etc - stuff that will often be specifically singled out for support)) this probably wouldn't be an issue, but if you're looking at a wide and diverse range of games, being forced into nVidia driver updates is unappealing imho. This isn't to say AMD drivers are flawless, but IME the general trend on their side over the last 16 years has been one of improvement, while nVidia's is one of periodization. So if I were being forced into mandatory driver updates, I'd probably rather go with AMD. Or just go with Windows 7/8 and completely circumvent the problem - by 2017/2020/2023 (when Vista/7/8 go EOL, respectively) I'm hopeful that Microsoft will have rethought/improved the mandatory update model, that driver providers will have further improved their game, or that a third-party platform (e.g. OS X) will be able to really compete with Windows for gaming. I'm not anti-update, just anti-avoidable-system-breaking-stuff.
* EDIT for nitpicking: the 7900GTX itself came out in early 2006, but it's a refresh of the G70 (GeForce 7800), which was released in early 2005 and is directly based upon NV40 (GeForce 6800), released in early 2004 (they are quite similar, internally). The GeForce 6's primary competitor, the Radeon X series (for the 6800, this would be the X800 and X850), is incompatible with Skyrim (it doesn't support SM3.0), but the GeForce 7's primary competitor, the Radeon X1000 series (X1800, X1900, and X1950), would work with Skyrim (it does support SM3.0, and there are boards with 512MB) - I couldn't find a video example though.
  16. You used to be able to get the add-ons from the Bethesda website (some of them were free, some of them weren't), but I don't know if that's still a "thing" these days. What add-ons are you actually missing from the anthology? (I understood it to be complete) As far as crossing from Steam to non-Steam games, that may be a problem - iirc they're somewhat distinct builds due to Steam's DRM. I know that Steam will not let you "install over" a non-Steam game, so at the very best you'd be downloading the entire thing from Steam and then picking bits of it out for your disc install, and that still may not work. Honestly if you're going to the trouble of downloading the whole thing via Steam, I'd just go with that install moving forward.
  17. Re-install Steam on the new machine, and just copy-paste the Skyrim folder (the entire thing) into its Steam directory on the new machine. You may have to point Steam at the copy-pasted directory (it's been a while since I've had to do this, but it's very common-sensical once you're actually looking at it). If the machines are significantly different there may be some quirks, e.g. if you're going from WinXP to Win7 you may have to "take ownership" of the folder once it's pasted (http://www.howtogeek.com/howto/windows-vista/add-take-ownership-to-explorer-right-click-menu-in-vista/), and you'd be wise to let the auto-config re-run and write you a new .ini based on the new machine's capabilities and environment. You should, however, be able to transfer savegames without a problem (I've done this much more recently - as long as the overall game + mods is the same, it's just a straight-across transfer).
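If it helps, the copy itself is literally just a folder transfer; a rough sketch (the paths here are only examples - point them at wherever Steam and your old drive actually live):

    import shutil
    from pathlib import Path

    # Example paths only - adjust to the real Steam libraries on the old and new machines.
    old_skyrim = Path(r"E:\old_drive\Steam\steamapps\common\Skyrim")
    new_skyrim = Path(r"C:\Program Files (x86)\Steam\steamapps\common\Skyrim")

    # Copy the entire game folder; dirs_exist_ok (Python 3.8+) lets you re-run it to finish a partial copy.
    shutil.copytree(old_skyrim, new_skyrim, dirs_exist_ok=True)

    # Savegames live under Documents, not the Steam folder - copy those separately
    # (and let the launcher write fresh .ini files, as mentioned above).
    old_saves = Path(r"E:\old_drive\Users\you\Documents\My Games\Skyrim\Saves")
    new_saves = Path.home() / "Documents" / "My Games" / "Skyrim" / "Saves"
    shutil.copytree(old_saves, new_saves, dirs_exist_ok=True)

Then point Steam at the new folder and take ownership if Windows complains.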
  18. Asking for 3+ configurations per respondent is relatively "needy" - why don't you try putting something together yourself and then ask for critiques? I'm not at all trying to cast aspersions towards you, more trying to get you to recognize you're asking for a considerable time investment on other people's part with no compensation/etc on the back end. Given your budget, what you want is entirely possible - I'd personally go with an Intel CPU, DDR3 memory, and your choice of nVidia or AMD graphics (I have a slight preference towards AMD these days, but it doesn't matter too much either way). If Skyrim is your goal, you could easily do this build for more like $1000-$1300 (and if you have parts you can "recycle" from a previous machine you may save more, e.g. if you don't have to buy a case, drives, monitor, operating system, peripherals, soundcard, power supply, etc etc). If we/you have an idea of what parts you have, what parts you're looking at, etc, it can be tremendously helpful in guiding you. As far as the RAM, 32GB is a total waste for gaming; 16GB is also overkill but, given current memory prices, is probably a healthy place to be. For the whys and wherefores: the vast majority of games, including Skyrim, are 32-bit applications - they will NEVER use more than 4GB of RAM (no matter if you rub the engine with cheetah blood or what) as that's a limitation of 32-bit applications (and out of the box, Skyrim and other games don't have LAA flagged, which means they're confined to 2GB). While 64-bit games are trickling out, I've seen nothing that has system requirements higher than 8GB of memory (the 64-bit games I've seen thus far usually say 6-8GB for overall system memory); 16GB is overkill without being excessive. 32GB is just ridiculous. That isn't to say there aren't *other* kinds of tasks for which 32GB could make sense, but gaming just isn't really one of them. If you're doing a lot of DCC stuff, or running a lot of VMs, or other memory-intensive tasks like that (that will either have 64-bit applications, or multiple "big" 32-bit applications), the extra RAM can be useful - it really comes down to what you need. For Skyrim alone, 8GB of RAM is more than good enough, along with a decent Core i3 or Core i5 and a current mid-range (or better) graphics card (e.g. GeForce GTX 960). But like I said, given current memory prices, unless you're on a very tight budget (or just want to save money), going with 16GB isn't a big problem.
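If you ever want to verify whether a given .exe is actually LAA-flagged, it's just one bit in the PE header; a minimal sketch (the path at the bottom is only an example):

    import struct

    # IMAGE_FILE_LARGE_ADDRESS_AWARE bit in the COFF Characteristics field of a Windows executable.
    LAA_FLAG = 0x0020

    def is_large_address_aware(exe_path):
        with open(exe_path, "rb") as f:
            f.seek(0x3C)                                # e_lfanew: offset of the PE header
            pe_offset = struct.unpack("<I", f.read(4))[0]
            f.seek(pe_offset + 4 + 18)                  # skip the "PE\0\0" signature plus 18 bytes into the COFF header
            characteristics = struct.unpack("<H", f.read(2))[0]
        return bool(characteristics & LAA_FLAG)

    # Example path only - point it at the actual install.
    print(is_large_address_aware(r"C:\Program Files (x86)\Steam\steamapps\common\Skyrim\TESV.exe"))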
  19. NVIDIA and "Gameworks" is nothing new - they've been running this same racket since the "Way It's Meant To Be Played" campaign a few years ago (which, as far as I know, was born out of the unholy alliances ATi was forming with a lot of devs back in the day, which led to tons of super-ATi-optimized code that ran like junk on multiple generations of NV cards). Where it's gotten "off the deep end with greed" (and I do agree 100% with that) is when NV is deliberately doing stuff that breaks non-NV systems, for example using PhysX and Hairworks and so forth - I remember reading a presentation recently that showed a newer game generating tons of superfluous (non-drawing) polys that would wreak havoc on AMD and Intel GPUs, but the NV driver was designed in such a way as to ignore that little "gift" and voila, "NVIDIA HAS SO MUCH PERFORMANCE." It's also been demonstrated a few times that they use PhysX in this way, with "CPU PhysX" deliberately using the least optimized, worst performing paths available (and the "GPU PhysX" isn't even fully done on the GPU - the GPU does part of the work, and the rest is done via optimized SIMD instructions on the CPU; basically the graphics card is acting as a DRM key to push NV hardware sales while providing some computational assistance, but it's nothing like the original "full hardware offload" that Ageia promised us back in 2005). They've also started holding various features "hostage" - things like DSR or power management - want a new feature? Gotta keep buying GeForce cards. Sure, some of it is architectural, but a lot of it is just artificial lock-outs to try and push sales (and this is an unfortunate shift from how they used to do things). This isn't to say ATi/AMD haven't had close relationships with developers recently too, but I can't think of an example where a game that takes a lot of AMD tech (e.g. a Mantle game) is deliberately knee-capped without an AMD graphics card in the system. They also haven't gone down the road of feature lock-out with their currently supported cards (but they've dumped a lot of relatively recent stuff a lot more quickly than some people would like (e.g. I think the HD 4800 series only got about two years of full driver support, and the 2000/3000 series were completely left out in the cold when it came to video acceleration and compute support)). Another thought I had, regarding why we see the GTX 400/500 series and Radeon HD 7000 series as minimum requirements for a lot of games: AMD really only supports GCN going forward, and the VLIW4 and TeraScale GPUs (the HD 2000 through HD 6000 series) have largely been left in the dark. NV, by contrast, aims at something like a 5-7 year lifecycle for all of their products, and Fermi (the 400/500 series) is still within that. The same thing is happening with DirectX: NV has listed Fermi and higher for DX12-compliant driver support, but AMD is only worrying about GCN and up (and it's entirely possible some of their older DX11 cards could work, but it'll never happen because they're not being supported). So it may be that Bethesda (or other devs, because I've seen that GTX 400/Radeon HD 7000 listing on other games recently as well) is basically saying "we're only going to officially support platforms that are currently officially supported with driver/software updates" instead of going down the slippery slope of trying to validate EOL (or nearly EOL) hardware.
A random example: Skyrim can (believe it or not) be run on GeForce 7 (not 700, 7 - you can go look this up on YouTube if you want to see it; it isn't pretty but it does work) as long as the card has sufficient memory (there's a few that do), but Bethesda lists GeForce 8 as the official requirement. That makes complete sense: GeForce 8 was on mainstream support until very recently, while GeForce 7 was axed in early 2013 ("but obob, there are 2015 drivers" -> those are mandatory security patches for Vista/7/8), and I'm guessing that big developers like Bethesda probably have some sort of heads up as to what hardware is and isn't getting the axe, so knowing that GF7 was going the way of the dodo within (roughly) a year of Skyrim's launch, it makes sense to just omit it. That *could* be what's happening with the GeForce 400/Radeon HD 7000 thing on some fronts. My point is, it's really unfortunate if this is the case, because ultimately the consumer loses - even if you have the "right" piece of hardware. It'd be nice if "they" (being NV and AMD and so forth) would go back to trying to kill each other with better performance, better features, lower prices, etc instead of just trying to sabotage and destroy each other and using games to do it.
  20. Because this never, ever, happens with PC players? :psyduck: I do agree, I've wondered about the various texture/graphics/etc mods for consoles - even if the Xbox/PlayStation could run the "base game" on full max ultra, that doesn't mean they'll handle the N+3 year Ultra Realistic Photomagic ENB Deluxe FX with all 4K textures and 20GB of additional textures and meshes to give you 7300 varieties of hair and on and on (the same reasoning applies to PCs). But unlike PCs, the Xbox/PlayStation can't be upgraded. OTOH, having things like "Unofficial [whatever] Patch" or other vanilla-resources mods shouldn't be much of a problem (again, same reasoning taken from PCs), and may actually be of benefit to all sides (e.g. fixing bugged quests, non-working scripts, broken AI packages, misplaced objects, etc etc) where previously that didn't exist for console players. It will be interesting to see how it unfolds, for sure. To the original point, in reading that I did have kind of an alternate interpretation (and no, I'm not a lawyer and I'm not trying to "divine the contract" or anything): by setting themselves as the owner of such content, it may give them (Bethesda) a mechanism to take down content that would be objectionable/not allowable for something like Xbox Live; e.g. they could more easily kill pornographic content since they can say "we have sole rights to this, we're submitting a takedown notice." (And this is all a PR shuffle - the press will absolutely hold the publisher/developer accountable for a third party's actions (it has been proven time and again), so it makes sense that publishers/developers want some stronger mechanism to guard against it.) Or something along those lines. More broadly, I'm thinking this kind of language probably has more to do with Xbox/PlayStation licensing and "mods for consoles" than anything else...
  21. If it's not occurring on a new save, it sounds like the save is corrupted - and if TESVEdit can't fix it, that's pretty much game over. Go back to whichever save "a few hours ago" corresponds to and continue from there (I'd also make back-ups of working saves).
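For the back-ups, even something as simple as a timestamped zip of the Saves folder does the job (rough sketch; the backup destination here is just an example):

    import shutil, time
    from pathlib import Path

    # Zip the Skyrim Saves folder with a timestamp so there's always a known-good save to fall back on.
    saves_dir  = Path.home() / "Documents" / "My Games" / "Skyrim" / "Saves"   # default save location
    backup_dir = Path.home() / "Documents" / "SkyrimSaveBackups"               # example destination
    backup_dir.mkdir(exist_ok=True)

    stamp = time.strftime("%Y%m%d-%H%M%S")
    shutil.make_archive(str(backup_dir / f"saves-{stamp}"), "zip", root_dir=saves_dir)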
  22. +1 - you'd need to find an Ivy Bridge (that isn't the chipset; that's the CPU family) CPU for this board. There are options, but none that are in current production. You can see what's supported in that board from the Asus website: https://www.asus.com/Motherboards/P8B75M_LX/HelpDesk_CPU/ I'd go from there to ebay (or whatever you like) to find a chip - the 3570 is a great suggestion, but I have no idea what availability or pricing would be like. It may be that it's significantly cheaper to buy one model down (this is true of many other EOL platforms in my experience - resellers love to gouge "the best chip on this platform"). If the chip you end up with needs a newer BIOS, it's easy to update - just do it with the existing CPU in place, and then install the new one and you'd be set.
  23. Just to point out, that fan was not installed "backward...for some reason" - it's a downdraft design for a reason: the CPU isn't the only thing that needs cooling, and various components around the CPU (like the VRM sections) need airflow to prevent overheating. That said, if the case is able to provide airflow for that area of the PC it's fine - it doesn't HAVE to come from the CPU fan, but there is a reason that by default downdraft fans are downdraft fans. The original temps you listed aren't going to kill the chip, but the newer temps are much better - I'd just make sure you aren't going to sacrifice the motherboard for them.
  24. Welcome to the club; building a PC can be a very fun experience, but the first one is usually a bit more daunting. I'd highly suggest leaving an entire day available to yourself for this - don't rush or try to cut corners, and you should be in fine shape. To your itemized questions:
1) Modern CPUs, especially Intel ones that don't have pins, are pretty resilient, and can only be inserted into the motherboard one way; the heatsink may be a little annoying to install, but there's little risk of cracking the die or anything like that (this isn't the Athlon XP days). I'd always encourage people to educate themselves, so if you've got some videos or guides to watch/read, go for it! More info never hurts.
2) Technically yes, legally no. You can only activate an OEM key on one machine, once - what that means is that the key is tied to that machine. It cannot be legally transferred. You can re-install it as many times as you want on that original machine though. I'd also be a little leery of Win10 due to the forced driver updates (there's a thread on Nexus exemplifying why this is a very bad thing); go pick up Win7 x64 or (if you must) Win8.1 x64.
3) That could be trimmed significantly and achieve the same performance. The first and biggest thing I'd do is dump Skylake and the DDR4 right into the sea. It provides little to no measured, verifiable, objective performance upgrade for gaming - Haswell is as good as it gets and has been for over a year, and DDR4 is silly expensive compared to DDR3 for no measurable material gain. Save your money, get a socket 1150 board with a Z97 or Z97X chipset, and grab either an i5 4670/4690 or i7 4770/4790 (there's almost no difference here; only go for the i7 if you have something that can benefit from HyperThreading (e.g. you do tons and tons of video encoding)). Sources:
http://anandtech.com/show/9320/intel-broadwell-review-i7-5775c-i5-5675c/8
http://anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/15
(You will want to click around in both articles, but the salient point is there is no good reason to dump all that extra money on Broadwell or Skylake for gaming - Haswell and even Ivy Bridge are completely good enough even today.) I'd also dump that PSU for something both more robust and not Corsair (unless you move up to a higher-end series from Corsair); Corsair's entry-level stuff has dropped off dramatically in quality in recent years, and even if it hadn't, 450W is pretty slim for everything that you've got in there plus leaving upgrade potential. Dump the DDR4, Skylake, etc, and upgrade to a ~600W PSU. You also only have a single DIMM - that's not the best scenario. Ideally you want two of matched capacity, for dual channel operation. I'd change that up too (and again, with DDR3, on socket 1150). Otherwise it looks good - nice looking case, nice hard-drive, nice graphics card, Asus makes good motherboards (I'd honestly just switch to an Asus Haswell board; ASRock is also worth considering - just see what prices are like), G.Skill makes good RAM (if you want to comparison shop, Kingston, GeIL, and Corsair are all good too), and away you go.
  25. As much as I had hoped this wouldn't be the case, Windows 10 certainly seems to be living up to this (or worse; ME can actually be made stable and usable on some platforms). I was honestly looking forward to Win10 for a few features, but the forced driver installs (and the resulting issues with "always must have the newest driver") really are the last straw. I get the idea of mandatory updates for security and stability and consistency, and I'm guessing that for the vast majority of users the actual Microsoft-provided platform updates, and regular security updates from Oracle/Adobe/Google/Mozilla/etc, are only to their benefit, but forcing driver updates (and let's be clear here: this is not the first time, and won't be the last time, that nVidia releases a system-breaking driver) has very little upside and huge potential for problems (like when you're pushing a system-breaking driver).