Everything posted by obobski

  1. +1 on the 80-series chipsets and Haswell; you have to double-check that the specific board will support Haswell. The 90-series will support it (and Devil's Canyon) out of the box. No reason, imho, to go 4790 if you're just after gaming - a lot of benchmarks show the similarly clocked i5 (4690) performing the same in games; the only place where the 4790 will really shine is encoding/compiling/etc, where HyperThreading can actually provide some advantage. If you're not interested in overclocking, the S-suffix CPUs will use less power and run cooler as well.
     On the graphics card, GTX 960 and 980 don't have the 'memory bug', and there are also older nVidia cards that may still be available, like the GTX 770. Not sure what the problem with AMD is (my 290X has had no problems, and actually has some better compatibility with older games than my GTX 660SC with the newer nV drivers), but personal preference is personal preference (so get whatever you're comfortable with). If you don't mind shopping used, GTX 780s may also be worth looking at. SLI may also be a consideration, like dual 960s or some-such, depending on your budget and performance target.
     For the PSU, I'd probably look at FirePower, Antec, Enermax, EVGA, etc. Corsair's top-of-the-line models (AX) are still very solid, but I've seen some less-than-favorable reviews of the mid-range stuff (like the one you listed). Basically, check JonnyGuru or some other high-quality PSU reviews for the specific model you want (or just to find a top-rated model in the size range you need). Something in that 700-800W range will be fine for this build, and give you room to grow.
  2. The ini will be in My Documents/My Games, not in the Fallout 3 folder. You want to add the threaded AI line as well as iNumHWThreads=2 (the exact lines are sketched at the end of this list); at least that's what has fixed Fallout 3 on multi-core CPUs for me (I've tried setting iNumHWThreads to 4 on quad-core CPUs, and it doesn't seem to make any difference over 2).
  3. If it's not HyperThreading at fault (technically this shouldn't be a problem, as the motherboard's BIOS is supposed to enumerate logical cores last in the MADT, but that assumes everyone had their morning coffee and followed the directions - I've seen systems that don't do this right, so turning HTT off is worth checking), it may be some other application you have running that affects the 3D drawing (e.g. FRAPS, amBX, some recording applications, etc), which can really impact NV's performance. Also, you didn't mention it, but are you trying to force very high levels of AA or some other image-quality setting through the nVidia driver? That may also be causing trouble.
  4. I know back in the Windows XP days there was an application called 3DNA that did this - it came with a lot of PNY and MSI graphics cards. However, as far as I know, the company that made it went out of business and it doesn't work with Windows 7/8.
  5. If I'm not mistaken, that "7" icon shows that it's still a compressed file (turning on file extensions can help with navigating this kind of stuff) - it isn't uncommon for large libraries to be compressed multiple times ('nested') to save space and produce a single file for distribution. It may just need to be unpacked another time to get to the actual resources.
  6. If I'm not mistaken, the 4590 is faster than the 4460 (I just looked them up on ARK and that appears to be true), so I'd probably take the #3 option. The 290 is going to be somewhat faster than the GTX 960, but having the faster CPU will probably help more at the end of the day for Skyrim (it's fairly CPU dependent) and other games. You could always upgrade the graphics card down the line if the GTX 960 ends up being not enough for the settings you want, but the 960 is a pretty fast card in its own right, and should be no problem with Skyrim and other games.
  7. A solid state disk (or SSD for short) is a different type of hard-drive - instead of using a mechanical mechanism (spinning platters), it uses flash memory (hence 'solid state') to achieve much lower latencies and higher bandwidth. It will improve things like boot-up time and application load times (e.g. how quickly Fallout actually starts up, how quickly level transitions load, etc), but it cannot improve computationally bound tasks (e.g. running with higher levels of AA, higher resolutions, more demanding games, etc), so I wouldn't suggest it ahead of a graphics card upgrade; you would still end up limited by the GT 630. That said, it wouldn't be a bad upgrade alongside a graphics card. Also remember, as with any hard-drive upgrade, you will need to either clone your existing drive onto the new disk, or reformat/install Windows, re-install all of your applications, and transfer your files over; the SSD will do nothing whatsoever for files not contained on it.
     To your proposed upgrades: the GTX 770 is a very nice choice, and should have no problems with Fallout/Skyrim/etc at basically max settings (I don't know what resolution you want, or whether you want AA or AF, etc). The 750W PSU may be a little overkill, but if it fits in the case that wouldn't be a bad choice either. Make sure you go with a reputable manufacturer (e.g. Antec, Corsair, FirePower, Enermax, EVGA, etc) - a good place to look for reviews of PSUs is JonnyGuru (just google for it).
     You may consider some other nVidia cards as well: the GeForce GTX 660, 760, and 960 specifically. They may work with your existing 460W PSU, and any of them would still be fast enough to get you what you want (I've run my 660SC on a 450W PSU just to test it, and it works fine, for example). Switching to AMD (Radeon graphics) would also be a consideration - the only extra step would be uninstalling the nVidia drivers and installing the AMD drivers (this is really easy to do). There's no big reason to do this unless you find a really good price on a Radeon or want the gaming bundle (afaik they still offer that).
  8. Agreed - there's no good way to use a USB headset with a soundcard. There are some non-analog headphones and headsets that can be used with soundcards (Audio-Technica, for example, has a few models with TOSlink connections), and many wireless setups can be used with soundcards if the base station can accept a digital or analog input (Sennheiser, Audio-Technica, Sony, Koss, and JVC all produce examples). But USB implies that the headset (or whatever) has its own audio interface - so it's basically including the same function as the Sound Blaster in the bundle. Now, to your second question about 5.1 <-> 7.1: it generally would not be a problem. With a 7.1 output into 5.1 speakers, the 7.1 source should be configurable to have its rear surround channels disabled (e.g. an Audigy 2 ZS (which is 7.1) plugged into 5.1 speakers), and with a 5.1 source into 7.1 speakers/whatever, you will either get nothing on the rear surrounds, or the speaker/headset/etc may have some sort of upmix feature that will put something on the rear surround channels (some Logitech and Creative 7.1 sets do this). A lot of the wireless headphones I've referenced will have various DSP features as well, because they're generally designed for home theater, so you'd probably just want to take the 5.1 Dolby Digital Live output from the Sound Blaster and then let the base station unit perform whatever processing it does for the specific headphones.
  9. If it said "fulfilled by Amazon", Amazon will handle returns like normal, but if it was Corsair "through" Amazon (or Amazon Payments on another website), you're limited by whatever their return policy is (which probably means no free shipping to send it back). Of course, if they do something really jakey, Amazon CS tends to be pretty good, but honestly I'd be *amazed* if Corsair was bad to you. The PSU you picked is a little "cheap" for a TOTL build, but capacity-wise it should be fine (IOW there should be nothing wrong with it, but usually for TOTL builds you see more expensive boxes), and I doubt you'll have issues unless you just receive a bad unit, which can happen with anything; Corsair tends to be well-made stuff, so I wouldn't be too afraid of that. Of course, if the box arrives looking like it was run over by a tank, I wouldn't try plugging in the remains of the PSU, but if it's all in one piece, not dripping wet, etc, it should be fine. And if you're hooking up something like 16 hard drives, TECs, etc, you should have a more robust PSU (or two), but I'm assuming you have a pretty conventional build with a single graphics card, CPU, some RAM, a mainboard, and that's probably about it.
  10. Not to state the obvious, but at 10ft is there any reason you can't just buy a 15-20ft Ethernet cable? (In fact, is there any good reason you weren't doing this from the beginning?) Anyways, to the specific problem - have you checked to ensure that the drivers for your laptop's WiFi controller were properly re-installed, and then checked for updates? Have you ensured Windows itself is updated?
  11. I think I see part of the confusion, seeing the R9 295X2 at $1000 from those brands. There was a price drop recently to $999, so all 295X2s should be $999 or under. Sapphire and XFX are two of the best choices for AMD-based cards, and have been for a number of years. As far as I know XFX still offers a lifetime warranty on the card; I don't know about Sapphire. I agree on non-reference boards tending towards quality, to a point. One thing that can be obnoxious is finding replacement coolers - for example, I have a non-reference HD 4890 (PowerColor PCS) that will not mount the standard 4870/4890 cooler because of the placement of some components. That can also make some full-coverage waterblocks not fit. Otherwise it's a fantastic card (and the stock cooler on it isn't bad; I'm just using it as an example).
     You do realize that the Titan Z and R9 295X2 are dual-chip designs, right? You don't actually get 8GB or 12GB or whatever of memory for the application to use; you get half of that (effectively), because they're (they = AMD/nVidia marketing) stating total memory for both GPUs. So it's like saying GTX 780 SLI is "6GB" of RAM - from the application's perspective it isn't, it's still 3GB. Not that Skyrim could ever approach needing that much memory (some other games may, but worry about them in the future when and if they come out). Personally, you couldn't *give* me another dual-GPU card (I've owned a few over the years); permanent multi-GPU is not a good thing imho (and at least with CrossFire, you can probably expect some issues in Skyrim), nor is the extra heat production. If you want multi-GPU, add a second card to whatever you already have. Price-to-performance wise, adding another GTX 780 to your GTX 780 would be better than a 295X2 or Titan Z etc, because a GTX 780 should cost under $800 in pretty much any circumstance. nVidia also (still) has better frame-pacing, which is something to consider with multi-GPU.
     IMHO, if you're really running everything you like without problems, leave well enough alone, put your money away, and wait until a game comes along that crushes your system, and then worry about upgrading. Things like the Titan and 295X2 are an awful value proposition right now, because they're fairly dated (but the pricing hasn't caught up to that, and probably won't). If you really want to buy something new today, the GTX 980 wouldn't be a bad choice, but the performance improvement over the GTX 780 is not huge. Alternately, the R9 290 and 290X are becoming very affordable relative to their performance, and could make an interesting multi-GPU system (you can almost do triple-GPU for the price of the 295X2, assuming your motherboard, PSU, etc will support something like that - actually that's worth talking about even in the context of the 295X2, with its 500W TDP).
  12. A MARS that isn't thousands of dollars and limited production? And it will fit in a normal case? And doesn't have an external power brick? What is the world coming to. :teehee: For those who don't get what I'm referencing, this is how nutter-butters Asus dual-GPU boards have been in the past: http://images.numerama.com/img/s/h124831-360-360-asus-extreme-n7800-gt-dual2dhtv.jpg http://www.dvhardware.net/news/2012/asus_rog_mars_iii_dual_gtx_680_tpu.jpg :dance: I will give it that it only costs $650, is actually available for purchase, and brings more performance than the Titan, 780, etc without running the price even higher. And that kind of sanity is certainly needed in the modern GPU landscape (high-end cards really don't need to be $1000+). But otherwise I don't see the appeal over buying a pair of GTX 760s - it costs $150 more and has fewer monitor connectivity options and more rigorous cooling demands. If it ran more like $500-$550 I think it'd be a much better overall "thing"; maybe it'll drop with time. Honestly, I'm surprised nVidia has not released a reference-design dual-GPU board for the 700 series yet; I've heard rumors about a "GTX 790" but have not seen anything materialize. I'm also surprised Asus went with GTX 760s and made a reasonable product; I was actually half expecting them to put together a dual Titan in a 3-4 slot, 500W+ monstrosity... :facepalm:
  13. All excellent suggestions, as well. But yeah, 4GB of VRAM is nearly useless. I only recommended that because *technically* Skyrim can use 3GB of RAM, meaning 3GB is also mirrored into VRAM on DX9. People claim to get performance returns from above 2GB on a 760 when they use ridiculous amounts of HD textures in Skyrim. Not sure if it's true, but that was also one of the only overclocked 760s I could find so I suggested it. The 760 you recommended would almost certainly perform better.
     You've got that backwards. Skyrim can use up to 3GB of system memory (with IMAGE_LARGE_ADDRESS_AWARE and 4GT configured under 32-bit Windows; under 64-bit Windows the same flag will allow up to 4GB (http://msdn.microsoft.com/en-us/library/windows/desktop/aa366778(v=vs.85).aspx) - there's a quick command sketch for checking/setting the flag at the end of this list), but the overall application system memory footprint is NOT duplicated in VRAM. What is loaded into VRAM must, however, be backed by system memory under DX9 (and older). VRAM holds things the GPU chews on - not the entire application (that's what system memory is for). You do not need more VRAM to allow the application to consume a lot of system memory; you do need a lot of system memory to let the application consume a lot of VRAM. Here's a comparison between 2GB and 4GB GTX 770s: http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/ The 4GB card does nothing but cost more and look slick. Werne is absolutely right - there is no reason to need 4GB of VRAM unless you're doing GPU computing or workstation applications; for the most part, even 1GB is entirely sufficient for Skyrim, as long as the GPU behind it is up to the task (as a random example, the HD 5870). But marketers have ostensibly "won the hearts and minds" of customers, and now we've got all sorts of hoopla about how 2-3-4GB is required just to browse the web, run an HD monitor on the desktop, etc. It has truly gotten out of hand...
     Something that I noticed in the first post: the system as-is has mismatched memory modules, so it almost certainly is not utilizing dual-channel RAM. Might be worth taking a look at long term, especially if the motherboard is being replaced. The "fix" is to have equal DIMMs in each channel, and it will increase overall memory bandwidth. Replacing the graphics card is a much more important task at the present time, but the memory configuration would be a "next up" upgrade if it were me.
     The point about "other TES games" also seemed a bit pertinent to address - in general, if it can run Skyrim, it will run Oblivion and Morrowind better (because they have lower overall system requirements and are less complex). However, that doesn't mean that Morrowind will magically become a model citizen, close its memory holes, and run stable, or that some of the biggest/most ambitious mods for Oblivion won't still grind things to a halt because the engine just doesn't want to draw 400 NPCs all in one cell or deal with moving references or whatever else. Point is - yes you can (and should) upgrade, and yes you will realize considerable performance improvements, but some mods just wreck performance and the titles do have limits to what they'll handle smoothly. Just food for thought.
  14. Epic build... But I just hope you don't regret that monitor. :3 After my last 2 cheap monitors, I've made it a rule to spend as much on a monitor as I do on a video card, to balance the visuals. A $500 video card is a bit of a waste on a $200 monitor, for example. Your card will be putting out details and colors that you won't be able to see. :s
     Very nice build indeed! No idea if the monitor is good or not - have no experience with Asus monitors. That having been said, monitor quality doesn't seem to have a lot to do with price in my experience: I've got a Hannspree on the desk that cost (no joke) $89.99 and it has fantastic color/response (let me put it this way: it's good enough to run in a triple-head configuration with flat CRTs and not stand out). Granted the price was probably helped along by its design, but the hardware is solid. On the other hand I've seen $500+ monitors be complete and utter pieces of junk. I'd say it depends more on who made the monitor than how much it cost, but there's probably evidence to refute that claim too. I guess report back and let us know if the Asus works out or not - hopefully it's good. :blush:
  15. 6790 changes everything - that's a lysdexic moment on my part. Doh! Yes, the 270X is a biggun upgrade over that (roughly double the internal throughput; memory bandwidth isn't quite double though), but I agree with Werne on the 7870 - they're almost identical, so save the money by getting a 7870 (if possible). If the 4GB card is really only a few dollars more, why not? Don't expect it to be a big performance gain over the 2GB model, but again - for a few dollars, why not? Having said that, I'd still look through your modlist and load order, see if anything needs cleaning, etc - you may find some performance improvement there too. I'll say PowerColor is very good - I'd put them on par with Sapphire. Both have been around for ages, and both make good cards. PowerColor may be better known as the TUL Corporation (PowerColor is a sub-brand), if that helps international members identify them. Werne - I think Thor is probably onto something with the card pricing going up due to bitmining and distributed computing; supply and demand and all that. It's also probably "allowable" because of the runaway train that has been nVidia pricing in the last few years.
  16. He's running mods, and we have no idea how many. That said, mods can easily bog down any PC. Upgrading his graphics card might give him great performance, or it might do almost nothing. Really it depends on what kinds of mods he's running and how many.
     I think we're basically on the same page here - more information is needed about the software and potential conflicts (or bad load order) before we can talk hardware upgrades. I'd add that the specific choice of the 270X is unlikely to do much - the performance difference between them is very narrow.
  17. An HD 6970 should not be struggling with Fallout 3/New Vegas or Skyrim - not at all. Here's a TPU review of the 270X that has the 6970 included for comparison in Skyrim: http://www.techpowerup.com/reviews/AMD/R9_270X/21.html The 270X is *slightly* faster, but both are putting out very playable frame-rates (both are over 60 fps at 1600x900 - higher than your target). 68 vs 78 fps is not worth a few hundred bucks - you won't notice it, and if vsync is on it won't matter at all. If you have something going on that's dragging performance down hard enough to see stuttering/lag on the 6970, the 270X will very likely hit the same wall. I'd look at your load order, modlist, etc - there's probably something in there that's killing performance; a slightly faster graphics card is unlikely to resolve that (it may not give you any measurable benefit, depending on what's causing the hang-up, or it may give you a marginal improvement (as it would in the base game), which may or may not get you above the stuttering).
  18. Honestly, I can't say I'm surprised about the hard-drive - they've been known as "DeathStars" since the 1990s for a reason (and every time the division gets sold, it usually gets worse; first it was IBM, then Hitachi, now WD (as "HGST, a Division of WD")). Go with Seagate or WD - consistently more reliable, and they'll replace the drive if it comes DOA. I would be very leery of "hybrid" drives - you're lashing an SSD to a mechanical drive, so now you have the reliability concerns of both all rolled into one, and it lives or dies as one. If you want an SSD, get a separate SSD - you're using a desktop, and you have no reason to need to fit everything into a single 2.5" bay (like a laptop would).
     On the rest - no reason not to go Intel if it's what you want. The 4670 should have no problems whatsoever. ASRock makes good boards, but I don't have experience with that specific model - I'd assume you shouldn't have problems though. You may also want to consider a board with more of the larger PCIe slots - not necessarily for graphics cards, but for other kinds of expansion cards (x4 and x8 cards aren't as rare as they once were, and boards with an x4 or x8 slot wired into an x16 physical aren't that rare either - it means greater compatibility with whatever you might need to hook up). On the PSU compatibility thing - I'll defer to Werne; I'm not very familiar with the issue.
     Gaming performance, as Werne points out, is generally GPU limited - sure, if you had a really, really terrible CPU (like a first-generation Athlon64) it would be a big bottleneck, but the GPU makes a much bigger difference at the end of the day (e.g. if you put a GTX Titan with said Athlon64 it would still probably have a chance, whereas if it had a GeForce 6200 it'd have no chance - but having a newer CPU along with that Titan would be the best situation; by contrast, having an FX-9590 with a GeForce 6200 would still have no chance). I'd look at upgrading the graphics card if/when possible - I wouldn't worry so much about "generations" (unless major technical changes happen (e.g. API updates, major hardware fixes - like GF6->7 was a big deal), it's mostly marketing in action - sure, "newer is faster" is often true, but by how much and whether or not it's worth the money is another story); just look for something faster. The GTX 770, for example (which is basically a GTX 680 under all the branding - it's clocked a nudge faster), would be faster, and so would one of the higher-spec Radeon R9 cards. If you aren't playing at very high resolutions plus the very newest games, you may not need this upgrade, however (that doesn't mean ignore replacing unreliable/failing parts).
  19. Build looks better, but I still have a few nits to pick (sorry!):
     1. I'd drop the 4GB graphics card. You don't need that much video memory - nothing will benefit by it. Go with a 2GB 770 and save your money. Here's a comparison between them: http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/ 4GB doesn't look like it's doing a whole lot of anything besides costing more. You can even get an OC'd 2GB card and still save $100: http://www.amazon.com/PNY-GeForce-DisplayPort-PCI-Express-VCGGTX7702XPB/dp/B00CZ7Q028/ref=sr_1_5?ie=UTF8&qid=1391963606&sr=8-5 (Why PNY? Because I've always had good luck with them - that doesn't mean they're the only show in town though; this is truly just going with what I know - Asus, Gigabyte, PowerColor, XFX, HIS, etc all make good cards too.) You could save another ~$20 if you drop to a conventional GTX 770 (they're right around $300).
     2. The RAM is good - I have the same stuff in my machine and it's been great.
     3. Windows 8 is a fine choice - no problems there.
     4. Motherboard should be a fine choice - Asus has a good reputation over time, and it's also helpful if you know someone who has the same (or similar) hardware; it can make troubleshooting a lot easier. Same goes for the monitor - I forgot Asus made monitors actually. (Doh!)
     5. As far as tutorials go - some of the stuff you've mentioned has been handled automatically within Windows (like defrag, updates, etc) since Vista (and remember: we do not defrag SSDs - they don't need it, and won't benefit (but may wear faster due to it; it should not instantly kill anything though)); other stuff I'd say just look up an answer on a per-issue basis. If you've never worked on a PC before (like, at all) there's some "day one" stuff I'd suggest, like:
     1) Don't work on it with it plugged into the AC outlet
     2) Don't mount the motherboard raw to the case; use standoffs or you'll short it (and it won't start)
     3) Be mindful of ESD
     4) Go slow, be careful, etc - don't force things; if things aren't going how they should, stop and consider why; most hardware is pretty fragile at the end of the day (plan a few hours for a complete build, especially your first)
     5) When it comes to case fans, less is often more
     6) Windows installers write the MBR in real-time, so be sure you're making the changes you want to make (with brand new HDDs you hardly need to worry about this at all, but it's worth keeping in mind - it isn't like Ubuntu or similar where it will ask you to confirm before it writes)
     7) Make sure any important data is backed up before you do anything potentially damaging - losing hardware or hardware configurations is one thing, having it eat your 300 hour Skyrim save-game or term paper is an entirely different story.
  20. GRRRRRRRRRRRRRR! Forum software ate my reply!!! :mad: So now to retype it!
     1. Looks good.
     2. Won't work exactly that way - usually if features come down the development cycle that outmode hardware, it's based on hardware feature support - the entire GTX 700 series would fail together, not just the less-expensive parts. The benefit of the top-end cards is higher performance with maximum IQ/resolution settings - if you need to support a very high resolution (QHD or 4K or multi-monitor) gaming environment the GTX 780 or Titan would have advantages, but otherwise I'd save the money and go with the 770. Take that money and put it in the bank - in 2-3 years, if you need more graphics power, you'll do much better buying a new card (which will likely be double the performance of the 770) than having spent it all on the 780.
     3. Unless you want to watch Blu-ray movies on your PC (which is a hit-and-miss experience), I'd probably dump it; just go with a DVD drive - it'll let you load software with no problems, and it'll handle physical media for games (I'm not aware of any game that uses BD-ROM).
     4. See these articles on Wiki for comparison: http://en.wikipedia.org/wiki/Windows_7_editions http://en.wikipedia.org/wiki/Windows_8_editions If you're going with Windows 7, Home Premium or Professional are good choices (64-bit either way); for Windows 8 I don't remember Pro costing much more, so I'd probably splurge on it (I think it's only a $20-$30 upgrade). Both have relatively long support life-cycles on the current Microsoft roadmap; Windows 7 isn't "doomed" anytime soon.
     5. In my experience, Dell, Samsung, and Viewsonic are good places to start. Unless you really want a very high resolution display, I'd probably stick around 1080p - it'll mean less performance requirement on the graphics card, and still gives you a big enough display for work. If you need more workspace, I'd go with a second monitor, as opposed to a larger single monitor.
     6. What do the bad reviews say, and where are they coming from? Also, do they seem rooted in a specific time period - if they're all from, say, 8 months ago, and you read positive things now, the issue was probably fixed via BIOS update (and new shipping boards probably have that update too).
     The hard-drive configuration looks good to me - you may find you need more space down the road, but you can always add later. I'd put Windows on the SSD, and probably your favorite or most demanding games and applications (whether or not they realize benefits from it depends on how they access data from disk - but most applications will see at least some benefit from faster reads and lower-latency access times). For recording, you can go software (like the paid version of FRAPS) or hardware (like capture devices from Diamond (inexpensive and simple) or Matrox (expensive and complex)) - I'd probably start with software (it costs less) and move to hardware if you need the extra functionality and performance.
  21. I'm not entirely sure where you're going here, but output impedance is not a measure of quality for an amplifier (be it for speakers or headphones). Thor - I think the confusion is coming from the article you linked in context of what you're posting; the article is about impedance selection for car audio speakers, not anything to do with the quality of source electronics or home audio speakers. Nominal impedance (or any other kind of impedance; output, input, etc) doesn't dictate or speak to the quality of a speaker or sound system - it's just a specification that's useful for matching components to one another. Judging the quality of a pair of speakers is a somewhat more involved process - unfortunately there's no simple metric like WEI or 3DMark for that, at least currently.
  22. What's that? First time I hear the term.
     I'm with you on this one... I've re-read the post 3 times and still come away going "huh?" :wacko: I'm guessing by "ohm rating" what's meant or being read is the speaker's nominal impedance, and on the sound card either the rated impedance loading spec or its output impedance, but none of that is a direct measure of "quality."
  23. Serious question: have you ever played (or are you familiar with) Hitman or Tomb Raider? In both games the protagonist dual-wields pistols, and while it isn't the most accurate representation of either weapon handling or reloading (Tomb Raider probably is more convincing, since she has the "ammo backpack" thing (they did it in the movie too), whereas Hitman seems to just produce the magazines from thin air (even in the movie...)), they're videogames after all (and if you're going to suspend disbelief enough to accept the premise of either, the fact that Agent 47 and Lara Croft can handle a pair of .45s isn't that much of a leap). Now, whether or not Bethesda can/will pull it off just as well as IOI and Eidos have in the past, that's another story. But from a technical standpoint, I don't see why they couldn't if they really wanted to, given that it's been done before in a number of games (there are 5 Hitman games, and at least as many Tomb Raider games; both franchises have been around for a long while too).
     Alternately - what's to say it has to be dual-wielded guns? Why can't my character pick up two knives? That's a legitimate fighting tactic in some parts of the world. Or two sticks? Or why can't I box with two power fists? :devil: Finally - why should the game be a nanny and prevent my character from doing reckless things? I mean, I get that with something like Fallout 3 there are probably technical limits that prevent multi-weapon wielding; okay, I can accept that. But if it could be done, why not? It lets you do all sorts of other reckless things, like consume massive quantities of addictive drugs and alcohol, pick fights with significantly stronger and better equipped enemies, climb around on rusty and unstable rubble piles, release dangerous and unknown viruses into a fragile ecosystem, etc. Is pulling out a pair of guns really that much worse? Also, nuclear rocket RPGs (that the user survives at short range), drugs that instantly reverse the effects of radiation, and radioactive mutant soldiers are okay, but pulling out two guns and waving them around is too far? I guess I don't get it. :ohmy:
     No it's not, if the game is balanced in such a way that you end up at a disadvantage by not dual wielding then its inclusion has had a negative effect on anyone not wishing to run about like John McClane. If they must include dual wielding then there should be a significant accuracy penalty for using it; sadly I don't think Bethesda will go for that, all they care about is things looking cool.
     I honestly doubt this would be the case - Skyrim includes dual-wielding and it is not required to play the game, nor are characters who don't build the skill at any significant disadvantage; skipping it has no negative effect. It does incur its own penalties (you can't block), and the game's little "helper hints" even encourage you to weigh the pros and cons of increased damage output relative to not being able to block. Whether or not you dual-wield is based on your personal preferences and playstyle, but it certainly isn't required. I really could not foresee Bethesda doing something that so dramatically reduces the ability of players to play the game their own way. Also - what's the problem with things looking cool? It's a game after all - isn't that part of the point? (You know, to be fun.) :blush: I'm not arguing that in real life it's a bad thing to do with your pistols (there's even a MythBusters episode that tests it, so that you don't have to waste your own ammo trying it out), but if I wanted to shoot guns off with perfect realism, I could just go shoot guns off (I don't know if this is true in all parts of the world, so I guess this might be an overly broad statement - but in my situation, going to a shooting range isn't an unreasonable proposition). It's all the other stuff that I can't do and reasonably expect to survive - like fighting a mutated bear with a sledgehammer or taking on an army of robots with a BB gun. Being able to have two guns out at the same time doesn't seem like that much of a leap.
  24. Looked it over - a few thoughts come to mind:
     1) What's up with the RAM? You have two different sized modules going on there. It would work, but you'll get better performance with two of the same size module, running them in what's known as "dual channel" - I'd probably get two 4GB modules in a kit (you can snag such a thing on Amazon for $69.99 too). (There's a quick way to check what's currently installed sketched at the end of this list.)
     2) Do you really need a GTX 780? To determine: what games do you really want to play, and at what resolution?
     3) Do you really need Blu-ray read/write capability? A DVD read/write drive would be less than half that price. (And Blu-ray recordable media is very expensive for its size.)
     4) Do you really need Windows Ultimate? Most users are generally more than covered with Home Premium; Professional edition adds some networking and crypto features that very few people will likely ever touch, and I honestly forget what Ultimate adds beyond that - my point is, I'd probably cut down to 7 HP or 7 Pro and save $40-$100.
     5) I'm unfamiliar with the maker of the monitor you selected (by itself that probably doesn't mean much), but it seems like quite a lot of money for a monitor. ~$200 for a Samsung is probably more than good enough; if you want something ritzy (and with an equally ritzy warranty), look at the Dell Ultrasharp monitors.
     The PSU and case look fine, the CPU/mobo combo are compatible, and they will give you good performance. You don't seem to have a storage hard-drive in there; a single 128GB SSD is pretty limited. If you're right up against it on the budget, I'd probably cut the SSD and replace it with (at least) a 1TB hard drive. On the other hand, you could trim some of the "fat" off the machine (e.g. drop down to a less expensive version of Windows, a DVD writer, sort out the RAM, drop to a GTX 770, etc) and keep the SSD and add in some big hard drives for storage (or upgrade to a larger SSD or multiple SSDs, etc; you have a reasonably large budget and have a lot of options as a result).
     Oh, I meant to link some resources for building a PC too... This is somewhat older, but has some good info and a lot of pictures: http://electronics.howstuffworks.com/how-to-tech/build-a-computer.htm Also, I'd be very cautious about suggesting or dealing with Systemax (this means Tiger Direct, Circuit City, CompUSA, etc); in recent times their ResellerRatings rating has gone into the absolute toilet: http://www.resellerratings.com/store/Tiger_Direct (last time I pulled that up it was still over 2.5...sheesh) They used to be a great company to deal with, and I have no idea what happened, but apparently it was pretty significant.
  25. Thank you for the advice, but you're making the assumption that I'm not already into PC gaming. I have a medium-grade gaming PC (2GB GTX 660 GC, 8GB RAM, AMD Phenom II x4 at 3.2GHz) and a 1080p LED monitor. Needless to say, I run all my PC games at the native res of 1920x1080. I'm only looking for a low resolution screen specifically so that I don't have to subsample consoles any more than necessary. I don't mind if the image is downscaled, but I don't want it to be upscaled.
     True, I did make that assumption. :blush: Anyways - you aren't going to accomplish precisely what you want with the PS3 - it will give you 720p as a result of whatever happens internally (and that's that). Since you seem almost allergic to resolutions higher than 720p, you should probably go with an SDTV. A small computer monitor will still be 1366x768 or 1600x900 (it would work fine, but you've stated this is not acceptable), and most 720p TVs will not give you 1280x720 natively. Your best bet there is to go with something used; I'd look for a Wega (as previously stated). The KV-32FV16 (and the smaller KV-27FV16, etc) is a good choice - it has the 16:9 mode, will be under 720p, produces a sharp picture and good color (and can tweak its output), and is a big-ish screen. It will be glorious with the PS2; not so much with the PS3. There's a big list of various models on Wikipedia: http://en.wikipedia.org/wiki/FD_Trinitron_WEGA Note that all of the resolutions are interlaced. A lot of them don't have HDMI either.
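
A quick addendum on the Fallout 3 ini tweak in #2: this is a sketch from memory, so double-check it against your own file, but the usual multi-core fix is to open FALLOUT.INI (the copy under My Documents\My Games\Fallout3, not the one in the install folder) and make sure the [General] section contains these two lines:

     [General]
     bUseThreadedAI=1
     iNumHWThreads=2

If either line is already there, just edit the value rather than adding a duplicate, and keep a backup of the ini before changing anything.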
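
On the LARGEADDRESSAWARE/4GT point in #13: for anyone who wants to check or set the flag themselves, here's a rough sketch. It assumes you have the Visual Studio/Windows SDK command-line tools (dumpbin/editbin) installed and uses Skyrim's TESV.exe as the example; recent official patches already ship with the flag set, so check first, and back up the exe before touching it.

     rem Check whether the flag is set - look for "Application can handle large (>2GB) addresses"
     dumpbin /headers TESV.exe | findstr /i "large"

     rem Set the flag if it's missing (this modifies the exe - keep a backup)
     editbin /LARGEADDRESSAWARE TESV.exe

     rem 32-bit Windows only: enable the expanded user address space (4GT), then reboot
     bcdedit /set IncreaseUserVa 3072

Under 64-bit Windows that last step isn't needed - a 32-bit process with the flag set gets the full 4GB of user address space automatically.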
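
And on the mismatched-RAM comments in #13 and #24: a quick way to see what's installed in each slot before buying matched sticks (this is just the stock WMIC query run from a command prompt, nothing exotic):

     wmic memorychip get BankLabel,DeviceLocator,Capacity,Speed

Capacity is reported in bytes; if the modules show different capacities (or are sitting in the wrong slots per the motherboard manual), the board is most likely not running in dual channel.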