Everything posted by dave1029
-
There's already a mod for this.
-
Can someone make it so we can assign our tamed creatures to various tasks. If you need realism, make it deathclaws only, as lore says deathclaws are actually pretty intelligent. I want to be able to assign them as provisioners for supply lines.
-
New Computer Specs Recommendations
dave1029 replied to CopperHeadGhost's topic in Hardware and software discussion
My problem with Digital Storm and others like it is that they are just ridiculously more expensive. And usually, not always, something bottlenecks the system. For instance, they may put in a badass GPU but pair it with a CPU that's nowhere near the GPU's performance. I just use a local PC shop down the street. I do all the research, tell them these are the parts I want, blah blah blah, and they charge me a $70 flat fee and do it. The only reason I don't build it myself is that if the parts come in faulty, or something breaks during installation, it's no skin off my back. It's basically an insurance cost. -
$800 isn't anywhere near enough if you want to play these games at 1080p @ 60 fps. For a "solid" PC build, you're going to have to spend at least $1500. Here are some of the parts broken down for you:
Monitor: $150 for a good 1 ms HD one. These vary heavily in price; that number is just a guideline.
OS: $70 these days?
Keyboard: gaming $50, normal $15.
CPU: $250
GPU: $400
RAM: $120
PSU: $200
HDD/SSD: $150
Case: $70
Don't think I missed anything... but those are the prices you'd expect to run those older games at 60 fps @ ultra, and the newer ones at medium @ 60 fps. Don't forget, that's if you built it yourself... there are fees associated with someone assembling it for you.
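A quick sanity check on those numbers, just adding up the prices quoted above (a throwaway sketch using the gaming keyboard price; nothing here beyond the figures already listed):

```python
# Rough sanity check on the parts list above (gaming keyboard, prices as quoted).
parts = {
    "Monitor": 150,
    "OS": 70,
    "Keyboard (gaming)": 50,
    "CPU": 250,
    "GPU": 400,
    "RAM": 120,
    "PSU": 200,
    "HDD/SSD": 150,
    "Case": 70,
}

total = sum(parts.values())
print(f"Parts total: ${total}")                      # $1460, before any assembly fee
print(f"With a $70 local-shop build fee: ${total + 70}")  # $1530
```

Which lands right around the "at least $1500" figure once someone else assembles it.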
-
Necessity of better hardware for the future
dave1029 replied to dave1029's topic in Hardware and software discussion
Sometimes I look at screenshots and wonder how the player can see where they're going; it's OK artistically for screenshots, but I wouldn't want to play like that. Games don't need DOF: the eyes react to a virtual world on screen exactly the way they do in the real world, still focusing on the point of interest with the rest covered by peripheral vision. +1 -
Necessity of better hardware for the future
dave1029 replied to dave1029's topic in Hardware and software discussion
DOF doesn't even make sense in gaming. We can move our eyes to survey our entire FOV, but in a game the focal point doesn't follow your eyes, so DOF becomes an ugly, unrealistic hindrance. -
Necessity of better hardware for the future
dave1029 replied to dave1029's topic in Hardware and software discussion
I think the 295x2 and Titan-Z are the first single PCB cards to be able to handle ultra-realistic graphics @ resolutions higher than 1080p. For me, 4k looks way better than 1080p, but only slightly better than 1440p. And The Witcher 2, being one of the best looking games I've ever seen, runs above 60fps maxed (no ubersampling) on a single r9 290x @ 1440. So I think these newest cards can handle it, and it's only going to get better. -
Necessity of better hardware for the future
dave1029 replied to dave1029's topic in Hardware and software discussion
Oh, it's definitely the games, which is why we recently had this big bump in the need for better hardware with the new consoles. But my point is that games are getting pretty enough that having cutting-edge tech isn't giving as much extra eye candy as it used to. Imagine Skyrim on low *barf.* Watch Dogs on medium did not look terrible. Didn't look good, but the scenery wasn't bad enough to detract from the enjoyment of the game. -
I think we are finally at the point where increasingly better hardware is less important. Games on current top-end hardware look absolutely gorgeous, to the point where I can't imagine us calling them an eyesore in 5 years. Today I can't go back and play Oblivion because of the graphics, yet in 5 years I wouldn't see a problem going back and playing Skyrim, because the graphics look great with minimal mod usage. Thoughts?
-
I believe this game will set the tone for all "next gen" games. It looks... well... insane. It's reported as 20% larger than Skyrim with no load screens. If games like this can become the standard, then maybe we will stop getting half-assed games in the future. Of course the game isn't out yet, so this is all speculation, but the developers of this series have been good to their customers. They have no reason to lie or mislead.
-
Got the witcher 2 for $5. Having a blast with it.
-
Maybe they said AMD couldn't, or AMD just doesn't care enough to market it with HDMI 2.0 in mind, knowing that most of us who care about 4K are going with DP.
-
It's the exact same panel, only put in different cabinets and sold with different promotional materials. Samsung definitely has a prettier stand and it also claims 370 cd/m^2, which may indicate a brighter backlight. On the other hand the Asus stand is adjustable. Tough choice.

If the people behind HDMI cared one bit about the gaming market, it would support 120Hz, since physically that's what it does to deliver 3D pictures and every active 3D TV can run at least 120Hz, usually more.

You can believe what you want, doesn't really affect me. Displayport uses 1 to 4 lanes to send all-purpose micro-packets, padded to match the resolution, using self-clocking LVDS signaling, at a fixed frequency of 540 MHz, AC coupled differential voltage of 0.2, 0.4, 0.8 or 1.2V. HDMI uses 3 lanes, one each for red, green and blue, with a separate clock signal, to send time-separated video pixels and audio packets, using TMDS signaling, at a frequency matching the current resolution's pixel rate, up to 600 MHz, single-ended voltage of 2.8-3.3V with 0.15V to 0.8V swing. As you can see yourself, the signaling is entirely different and electrically incompatible.

Dual-mode displayport, which is what video cards use, is implemented by two separate transmitters, one for HDMI+DVI and one for Displayport, switching the port between them depending on which pins are shorted on the connector. They only share a connector to save room.

The only time a HDMI 1.4 device can be firmware upgraded to 2.0 is when it was designed with HDMI 2.0 in mind, but produced before 2.0 became official, thus had to have its 2.0 mode disabled for compliance. Were Nvidia and AMD chips designed this way? Not impossible, but it's been half a year already, ample time.

I guess we will just have to see. HDMI 2.0 was announced something like October 2013, and the 295x2/Titan-Z were released a couple of months ago. Perfectly feasible to think they were designed with 2.0 in mind. Just a waiting game at this point, I suppose.
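To put rough numbers on why the link version matters more than the cable here, a back-of-the-envelope pixel-clock check (a sketch only; 8-bit RGB, the standard CTA-861 4K60 timing, and the commonly cited 340/600 MHz TMDS ceilings for HDMI 1.4 and 2.0 are assumptions on my part, not from the posts above):

```python
# Back-of-the-envelope: can a given HDMI version carry 3840x2160 @ 60 Hz?
# Assumptions: 8-bit RGB, standard CTA-861 blanking (4400x2250 total timing),
# and the commonly cited TMDS character-rate ceilings for each HDMI version.

ACTIVE_W, ACTIVE_H, REFRESH = 3840, 2160, 60
TOTAL_W, TOTAL_H = 4400, 2250            # active + blanking (assumed CTA timing)

pixel_clock_mhz = TOTAL_W * TOTAL_H * REFRESH / 1e6   # ~594 MHz

hdmi_limits_mhz = {"HDMI 1.4a": 340, "HDMI 2.0": 600}

for version, limit in hdmi_limits_mhz.items():
    ok = pixel_clock_mhz <= limit
    print(f"{version}: limit {limit} MHz, 4K60 needs ~{pixel_clock_mhz:.0f} MHz -> {'OK' if ok else 'no'}")
```

On those assumptions, 4K@60 simply doesn't fit under the 1.4a ceiling no matter what cable is used, while 4K@30 (half the rate) does.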
-
It just does. Proven, tested, thousands of people run it on HDMI, and most people who buy high-end hardware today play on TVs, not monitors. You can look up reviews, look up specs. And didn't your card include an adapter?

A firmware upgrade through drivers would be like a firmware upgrade adding a turbocharger to a car that doesn't have one. You need new hardware for HDMI 2.0, much higher-spec modem designs; it's on the same technology level as 10GbE Ethernet. You can't firmware update your 1 Gbps network card to 10 Gbps, can you? Same for HDMI. While we're at it, 10GbE runs over the exact same copper as 1GbE, and that's a whole 10x speed bump - far better signaling hardware with near- and far-end crosstalk rejection makes the difference.

It's what I've been talking about, lackluster quality. Don't worry, it's not worse than other TNs, about average. It's just that there has never been a good full-size TN panel (a decent one was once made for Sony and Apple laptops). They're the low end of the market: never been 8-bit (forget 10-bit like the high-end panels are), never had decent color reproduction, and never could, because the colors shift with viewing angles. It's OK I guess if you haven't dealt with high-end monitors or home theater TVs. Otherwise... As they say, "once you go black, you can never go back". Same here: once you run a year with a screen that ranges from pitch black to pure bright R/G/B/W, you'll never go back to one that ranges from dark gray to pastel.

First off, why would I buy a $2k UHD gaming monitor when games aren't even benefiting from 4K as much as they could? I'm quite happy with this monitor until the high-end stuff becomes more affordable. Also, I still think you're wrong about the card. I don't see why firmware can't give the chip the new instructions. All they are changing is how the cables are transmitting the bandwidth. My card can already send that bandwidth, because DisplayPort works. It's not a limitation of the hardware. HDMI said firmware/hardware, implying that both could be used to make the change. Sony has also said firmware will make their existing devices HDMI 2.0. So I'm going to stick with it being possible for AMD/Nvidia drivers to fix the issue. HDMI would be committing suicide in the gaming market if they basically told everyone: "sorry, none of your GPUs work. You're going to have to buy the newest models for the 2.0 specification." Link an article saying my card or any similar card won't work with it, and I'll believe you.
-
Know what "backwards compatible" means?It means you can connect a HDMI x.y device to a HDMI a.b device and it will work. Not work as 2.0; work as the lower version of the two. Know what "connectors" are? Connectors are those physical plastic housings wrapped in sheet metal with copper wires inside. PHY layer. Of course they're the same. What defines HDMI version is interface on the chip. Well... try it.People have, it didn't work. 295x2 specifications state it only supports HDMI 1.4a. The chip doesn't have HDMI 2.0 support, putting two together doesn't change a thing, it's still the same chip as 290X. Same thing for Titan Z, same thing for 2x780Ti. Had an Asus PQ321 to test recently. I don't know what you have, but of all 4K computer screens that I've seen, PQ321 was among the best, it's got Sharp's 31.5" IGZO panel and full 10-bit color. The only 27" 4K panels I've heard of are cheap TN+ with 6-bit color, so I'll go out on a limb here and say that it's likely not better. PQ321 is almost good enough to use. Almost. But not quite. For one, it can't show black. So Metro 2033/LL - two out of maybe five games that do have the fine detail for 4K - are out; they're mostly black, so you want it in the palette. So-so uniformity, all colors washed out, can't tune gamma or they fade to gray. All in all, it looks like a 3 years old 2560x1440 monitor, only with slightly smaller pixels. Literally one built 3 years ago and never cleaned since. Others are worse, TN ones especially worse. Now, let's take... Sony XBR X900 4K TV. Whole other world. You can put a black image on it with one tiny bright spot, turn off the lights, and you'll never see where the screen border is. The bright spot will be blinding, everything else pitch black. Backlight is perfectly even, bright means blinding, no lag to speak of, colors just work. The kicker? Both cost about $3k, 32" PQ321 that looks like something unpleasant squeezed between two sheets of glass and 55" X900 that looks almost like a cinema screen minus all the people. But doesn't have displayport so won't do 4K@60. It's very nearly time for 4K, all it needs to move from purchasable tech demo stage to early adoption stage is suitable video cards. 20nm and new architecture to keep noise down, and of course HDMI 2.0. The next step is OLED 4K displays. Curved OLED 4K's have already been shown, some can even be bought. I haven't seen the best ones in person yet (few have), but from what reviewers agree on, they make X900 look like it has a dirty plastic bag stuck to its screen. Even brighter highlights, even deeper blacks, even faster response, even wider color gamut, perfect local contrast, and the curve helps immersion a lot. That will be the primetime for 4K. If the trends of previous display revolutions hold, it won't get better from that point on, just cheaper. I cannot find where the 295x2 supports HDMI at all. Only DP. And this is the monitor I have: http://www.newegg.com/Product/Product.aspx?Item=9SIA2RY1DK9756... Not the best. TN panel. But very affordable. Very clear, and with a lot of of color tweaking in the CCC, games just look absolutely gorgeous. I don't use ENB for various reasons in Skyrim, but this monitor makes it look like ENB with a few mods (like climates of tamriel). Just extremely cinematic. I would still like to know where the 295x2 supports HDMI 1.4a AND would like to know why a firmware upgrade through drivers could not make it support 2.0 if it is, in fact, needed. 
I mean, it sounds like they are just changing how the signal is sent through their Cat 2 cables.
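A toy illustration of the "work as the lower version of the two" point from the quoted post (purely illustrative; the function and the version list are mine, not any real driver or HDMI API):

```python
# Minimal sketch of what "backwards compatible" means for the link (illustrative,
# not a real API): the connection runs at the lower version of the two ends.
def effective_hdmi_version(source_version: str, sink_version: str) -> str:
    """Both ends advertise a version; the link operates at the minimum of the two."""
    order = ["1.3", "1.4", "1.4a", "2.0"]
    return min(source_version, sink_version, key=order.index)

print(effective_hdmi_version("1.4a", "2.0"))  # -> "1.4a": a 1.4a GPU on a 2.0 TV still talks 1.4a
```

So "backwards compatible" gets you a working picture, not the higher version's bandwidth.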
-
Been doing some more research, and it appears that the connectors don't have anything to do with it... but the monitor does. The monitor has to support HDMI 2.0. My 295x2 can support 4K @ 60 fps on an HDMI-only 4K TV... ONLY if the chip inside the TV allows for it. So, it looks like fmod and I were both wrong. Graphics chips can only send that much bandwidth over the cable if the receiving end of the signal will accept it.
Moral of the story: the Titan Z and 295x2 are just fine for future 4K gaming on DP or HDMI - just make sure the HDMI monitor/TV has a 2.0 specification.
Moral of the story 2: Don't buy a 4K TV until they are made with the 2.0 specification.
Moral of the story 3: Even if it has to do with the connector and I have been wrong all along, the 295x2 still wins because it only has Mini DisplayPort. With a Category 2 HDMI cable and a 2.0-capable display, the 295x2 will deliver 60 fps where the Titan-Z might not.
-
You're speaking with enough hubris that I'm not sure if I should reply, as this is not a specialist hardware forum after all. HDMI has everything to do with the "connector", or more precisely with the transceivers on either end. On the contrary, it's the cables that don't matter - HDMI 2.0 uses the exact same Cat2 cables as HDMI 1.4. Have you read your own link? Because it says that too.

The 295x2 does have HDMI interfaces, but they share connectors with its MiniDP ports. MDP->HDMI adapters are not converters; they do not process the signal, they only change the mechanical connector. When you use such an adapter, the video card sends an HDMI signal instead of a DisplayPort signal. It would have to send an HDMI 2.0 signal to establish an HDMI 2.0 connection. It can not. Only HDMI 1.4a is supported; thus it's impossible to connect to a 4K TV at 60 fps. In a perfect world DisplayPort is better and everyone would be using it, but the reality is that the best image quality is only found in TVs, and they stubbornly stick to HDMI only.

To you. I can't look at 30fps for more than a couple of minutes before I'd rather look elsewhere. I don't enjoy sharp lines showcasing every angle of old low-poly models, or blurry textures stretched over even more pixels. If you do, more power to you. I know some people who have grown up with analog TV sets (as have I, but somehow didn't pick up this bug) seem to value sharpness above all when buying a TV set, so they try to hunt for it even on new LCD sets that are all perfectly sharp by definition. Maybe you share the same values. Or maybe it's something else, doesn't matter.

I've been an early adopter for every display tech out there: Trinitrons, LCD monitors, plasma TV, wide color gamut, 2560x1600, active dimming LED, triple-screen setups. I will be for OLED 4K TV, in this order of importance, but not before there's something you can hook them up to without losing in image quality, screen size or framerate. You don't need to tell me what an increase in resolution looks like; been there, done that, even played around with 3840x2400 before it was cool and you had to hook up three cables to drive it. It may be your first display revolution; it isn't mine. I'm glad that you're excited, but please - acting like a white knight amidst unwashed masses about it gets outright comical.

Well, I can't convince you about the HDMI thing if you're unwilling to listen. *Cough*
Is HDMI 2.0 backwards compatible with HDMI 1.x? Yes, all HDMI versions are fully backward compatible with all previous versions.
Does HDMI 2.0 require new connectors? No, HDMI 2.0 uses the existing connectors.
-
Do explain. http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-titan-z/specifications
1 - 3840x2160 at 30Hz or 4096x2160 at 24Hz supported over HDMI. 4096x2160 (including 3840x2160) at 60Hz supported over Displayport.
This is HDMI 1.4a. It supports 3840x2160 at 30Hz - which is useless. Neither the Titan Z nor the 295x2 supports HDMI 2.0. 60Hz is supported via DisplayPort, but there is nothing worthwhile to connect to DisplayPort; all 4K TVs still only come with HDMI. 30 fps is not a comfortable framerate and it's well below the norm for PC gaming. So the extent of support for 4K today is:
* The only games that have the detail to benefit from 4K won't run smoothly in 4K even on dual-GPU setups.
* No gaming video cards support HDMI 2.0, so they only show 4K@60fps on small monitors with image quality below modern HT standards.

Are you seriously this dense? HDMI 2 has NOTHING to do with the connector. It has to do with the cable and how much bandwidth it can transfer. DisplayPort cables can run higher bandwidth than HDMI. 30 fps is definitely comfortable; it's what the consoles run. 60 fps is amazing. Anything less than 30 sucks though. You are just really ignorant about this subject. Also, the 295x2 doesn't even have an HDMI port. It features 4 Mini DisplayPorts. I.e., an MDP to HDMI 2 adapter plus an HDMI 2 cable will work :smile:.

Edit: You keep claiming that only certain games *benefit* from 4K. That could not be more wrong. Every game benefits; how much it benefits is variable. Of the games I've tested it with, all of them benefited immensely from the extra resolution. Sharper, crisper, and the textures looked better upscaled. Also, the ability to turn AA off is great. AA blurs the image, whereas 4K doesn't need AA, resulting in sharper, non-blurred images. Please educate yourself and/or pick up a 4K monitor and test it yourself. You'll feel like a fool afterwards.

Edit 2: Here you go for the cable, STRAIGHT from HDMI's website: http://www.hdmi.org/manufacturer/hdmi_2_0/
-
......"Beauty is in the eye of the beholder" is all I can say to that. Stretching low-poly models with low-res textures over twice the pixels only exposes how bad they are, with angled 'curves', blurry and sharp surfaces all mixed together. In my experience, old games actually look better on a CRT that's lower in resolution and smoothed a bit by display technology. If you think Oblivion at 4K looks similar to Crysis 3 at 1080p, well, there's no argument I can make. It's the same as saying an average schoolkid's art class painting on canvas looks similar to Michelangelo's in a print. I guess to someone it does. Battlefield 4 isn't.Crysis 3 isn't. Metro Last Light isn't. What is... Oblivion? Sounds like you don't have a 4k monitor. Also, the cards do support HDMI 2.0. Why? Because it has to do with the chord, not the connector. Battlefield 4 runs at 72 fps @ 4k maxed. Crysis 30fps. Metro 45fps. Skyrim 60fps. Limited? Funny.
-
Games look way better at 4K, and I'm on a 27-inch monitor. What are you talking about? Even graphically pathetic games like Oblivion look similar at 4K to current games at 1080p. *Using a 295x2. And limited 4K abilities? Games are running at 60 fps @ 4K. Wouldn't call that limited.
-
You should read the whole thread; we've just discussed it on page 1. Partial support - most new features will still require new hardware. The one promised to be supported is not even user-oriented, but only concerns how calls through the WDDM stack are made; it's completely black-boxed and invisible to the end user. Basically, all this means is that DX12 games won't require two completely separate renderers like the DX9 and DX11 renderers in modern games... well, it will require separate DX9 and DX12 renderers if you want to support hardware older than GCN, but at least not three (DX9, DX11 and DX12).

I give DX12 a few years before it is actually useful. It took DX11 a while before it became better. By then, I'll have a new card anyway. Plus, all Gaming Evolved titles will have Mantle.
-
Remember: the night DX12 is released, however cool your GPUs are and however many of them you have, there will be a "poof" and a cloud as they turn into pumpkins. They'll still fetch a few dimes on the dollar on craigslist from people who just want a cheap beater, of course, but real pumpkins can be cooked and eaten too. The only situation where buying 2 cards buys you more time until upgrade than 1 is when a game is about to be released tomorrow (as in, very very soon), and you have it on preorder, and you know with confidence that it will not run well at your current resolution on a single GPU. Currently, for 1080p, such games do not exist. If you might buy a new display later, see above about how your cards might go "poof" before you actually buy the screen. If you think they're going to start developing such a game and you might want to play one of them, see above, except your cards will definitely go "poof" well before the game is out. If you want to be future proof, set up a personal fund for instantly preordering the next top GPU as soon as it's announced. If you have to feel future proof, but it's mandatory that you do so without upgrading your components, lock yourself up in a bunker without internet or TV so that you don't learn about any new developments.

You do know that the HD 7900 series (AMD) through present, and the GTX 600 series (Nvidia) through present, will be getting DX12 support?
-
It's 4GB of VRAM: a 4GB pool per GPU which is mirrored, same as two 290Xs in Crossfire, since the R9 295X2 is permanent Crossfire in a small package (the same CF protocols and rules apply). If you have 1GB of VRAM used on one GPU, all that data is mirrored into the VRAM of the other GPU as well, occupying 2GB out of the 8 total. If you have 3GB used out of 4 on a GPU, it occupies 6GB total. You seem to lack fundamental knowledge about how Crossfire and dual-GPU cards work - yes, there is physically 8GB worth of GDDR5 on the PCB, and no, there's only 4GB usable due to mirroring.

To clarify - VRAM in CF operates differently than dual-channel system memory. System RAM increases bandwidth and doubles the memory pool since it's designed to stripe data and store it across two modules, which reduces access to each module, boosting bandwidth and reducing latency. VRAM, on the other hand, is mirrored, since each GPU needs to be fed the same data and they can't operate out of the same pool (data rewriting would cause conflicts). And even if they were able to, sharing the pool would effectively halve the bandwidth and increase latency since it would need an insanely fast interconnect bus, meaning the card would be about the same speed or slower than a single-GPU card due to twice the I/O access to the pool.

And FYI, even 4GB of VRAM is unnecessary unless you're running a console-ported mess like Watch Dogs (which stutters on dual-GPU cards and CF/SLI anyway, even a Titan Black which has 6GB of the stuff) at 4K. VRAM is a marketing gimmick; people actually think having tons of VRAM makes the card go faster. It doesn't, though it does limit memory OC potential due to having more GDDR5 modules, which brings the potential of one being crap and not allowing the others to reach higher clocks. It's why cards with fewer VRAM modules generally reach higher memory clocks, a good example being a 1GB 7850 vs a 2GB 7850, where the 1GB card often reaches up to 10% higher VRAM clocks despite having the same memory module manufacturer and models. Every module has different yields and the worst one determines the highest overclock; that's why people buy dual/triple/quad-channel memory kits instead of separate modules - they are more likely to have similar yields, but not necessarily.

Seeing as how you lack knowledge about what the card actually is, the most likely scenario is that you are lying. Either that or you got it without knowing what it is, in which case I feel sorry for your wallet. And you mention idle temperatures, which were never relevant in any way and never will be. My card idles at 28C on the GPU and that doesn't mean anything at all; it's the load temperature that matters (62C on GPU, 64C on VRMs). And I never said the GPUs will overheat, I said the VRMs will overheat once the card is overclocked, and it will begin dropping boost clocks at 95C even on stock (and it can easily reach 95C on the VRMs at stock). It's damn near unavoidable unless you either have a full water block on it, play games for less than 1h at a time, or run on a 60Hz 1080p screen (but then you're completely wasting money). How well you cool the GPUs will not affect the VRM temperatures, since those are air-cooled by the central axial fan on the shroud, completely separate. By the way, I have no idea what you're trying to say with "GPUs are exposed" and "heat can escape". Doesn't make much sense to me without any further details. A guy I know has a 7990 hooked up to a single 720p monitor, so I can honestly see people buying the 295X2 and Z for sub-2M-pixel resolutions. I sometimes feel bad for laughing at those people, but 99.9% of the time it amuses me.

It's 59C under load. And fine, VRAM doesn't *stack*, but it effectively does: the cards use half as much VRAM each in xfire as a single card would. I've noticed this. AC4, for instance: with one card it's about 2.5GB of VRAM, with xfire it's 1.3GB per card. The card is extremely cool and quiet under heavy load. It is probably because my case has really good cooling; regardless, the VRAM and GPUs are cooled via liquid cooling. In a benchmark, the card never throttled over a 2h max-load period. It's not permanent xfire - I can turn one of the GPUs off. I have to specify it in the application settings, as it defaults to xfire always on. So I can play Watch Dogs. I got it for one reason: 4K gaming. So my wallet is hurting, but it is a happy hurt.
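A tiny toy model of the mirroring described in the exchange above (illustrative only; the function, the fixed 4 GB per-GPU pool, and the assumption that every resource is mirrored to both GPUs are mine, not anything queried from an actual driver):

```python
# Toy model of VRAM usage on a dual-GPU card like the R9 295X2 (assumptions:
# AFR Crossfire, all resources mirrored to both GPUs, 4 GB per GPU).

PER_GPU_POOL_GB = 4
NUM_GPUS = 2

def vram_footprint(game_working_set_gb: float) -> dict:
    """Return usable headroom and total physical VRAM occupied by mirroring."""
    if game_working_set_gb > PER_GPU_POOL_GB:
        raise ValueError("Working set exceeds the usable per-GPU pool")
    return {
        "usable_pool_gb": PER_GPU_POOL_GB,                        # not 8 GB
        "physical_occupied_gb": game_working_set_gb * NUM_GPUS,   # one mirrored copy per GPU
        "headroom_gb": PER_GPU_POOL_GB - game_working_set_gb,
    }

print(vram_footprint(1.0))  # 1 GB used -> 2 GB of the 8 GB physically occupied
print(vram_footprint(3.0))  # 3 GB used -> 6 GB physically occupied, 1 GB headroom
```

The point of the sketch is just that the usable pool stays at the per-GPU size no matter how much GDDR5 is soldered to the board.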
-
Is there something wrong with the 780 Ti? That's pretty close to my top-end budget.

The 780 Ti is fine, but if you want to be future proof, you should get a dual GPU. It really depends on what resolution you plan to play at and how much *eye candy* you want. If it's just 1080p and you don't mind turning down your graphics as time goes by, then get the 780. If you want to keep graphics maxed and only turn down the resolution as time goes by, then get a dual GPU. The only problem is that Nvidia hasn't released the 790 yet, and the only good AMD dual GPU is the 295x2, which is $1,500. So either wait on the 790, or get the 295x2. If those are out of your budget, go for the 780 Ti.