Everything posted by dave1029

  1. I used to think this way. I made a mod a long time ago for the old Skyrim that added all the unique booze to vendor leveled lists. It was unique at the time, but I removed it because it had so many downloads yet so little feedback. It got like 5 endorsements and a couple of thank-you comments, but nobody seemed to care. Then I made a slightly overpowered shout mod and that got like 1,500 and counting. So idk, man. I just make what I want, and if people like it, they like it. If they don't, screw 'em. I make changes for myself, not for others. Though I'll happily share.
  2. Once you're at 100 Enchanting and have your god equipment, enchanting loses its purpose. What would be astronomically cool is if black souls could be bartered away via a master-level ritual spell that summons a powerful undead follower who can be fully interacted with like a normal follower (trade items, be ordered to attack, etc.) but just doesn't speak. As the player, we don't practice true necromancy; we just summon. True necromancy is trading filled souls for power, and that doesn't exist in the current state of the game. A spell that could also grant attributes or temporary powerful bonuses would be awesome. Simple, powerful bone followers would satisfy me, and if I knew how to do it, I'd make it myself, but I imagine that takes a quest with scripting to work, and I don't know how to script.
  3. This is infuriating me to no end. I'm trying to make Dead Thrall a lesser power, since it's annoying to switch spells every time I kill someone, but the cast sound doesn't carry over. It's completely silent. Any help?
  4. There's already a mod for this.
  5. Can someone make it so we can assign our tamed creatures to various tasks? If you need realism, make it deathclaws only; lore says deathclaws are actually pretty intelligent. I want to be able to assign them as provisioners for supply lines.
  6. My problem with Digital Storm and others like it is that they're just ridiculously more expensive. And usually, not always, something bottlenecks the system. For instance, they may put in a badass GPU but pair it with a CPU that's nowhere near the GPU's performance. I just use a local PC shop down the street. I do all the research, tell them these are the parts I want, blah blah blah, and they charge me a $70 flat fee and do it. The only reason I don't build it myself is that if the parts come in faulty, or something breaks during installation, it's no skin off my back. It's basically an insurance cost.
  7. $800 isn't anywhere near enough if you want to play these games at 1080p @ 60 fps. For a "solid" PC build, you're going to have to spend at least $1,500. Here are some of the parts broken down for you: Monitor: $150 for a good 1 ms 1080p one (these vary heavily in price; that number is just a guideline). OS: ~$70 these days. Keyboard: $50 gaming, $15 normal. CPU: $250. GPU: $400. RAM: $120. PSU: $200. HDD/SSD: $150. Case: $70. I don't think I missed anything... but those are prices you'd expect to pay to run those older games at 60 fps @ ultra, and the newer ones at 60 fps @ medium. Don't forget, that's if you build it yourself... there are fees if someone else assembles it for you.
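For reference, those guideline numbers do add up to roughly the figure quoted above. Here's a minimal Python sketch that just sums them; every price is the ballpark number from the list, not a real quote:

```python
# Rough total for the parts list above; every figure is the ballpark
# guideline price from the post, not a real quote.
parts = {
    "Monitor (1 ms, 1080p)": 150,
    "OS": 70,
    "Keyboard (gaming)": 50,
    "CPU": 250,
    "GPU": 400,
    "RAM": 120,
    "PSU": 200,
    "HDD/SSD": 150,
    "Case": 70,
}

total = sum(parts.values())
print(f"Self-build total: ${total}")               # Self-build total: $1460
print(f"With a ~$70 assembly fee: ${total + 70}")  # With a ~$70 assembly fee: $1530
```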
  8. Sometimes I look at screenshots and wonder how the player can see where they're going. It's OK artistically for screenshots, but I wouldn't want to play like that. Games don't need DOF; the eyes react to a virtual world on screen exactly the way they do in the real world: they still focus on the point of interest, with the rest covered by peripheral vision. +1
  9. DOF doesn't even make sense in gaming. In real life we can move our eyes to survey our entire FOV, but in games you can't move your eyes to shift focus, so DOF becomes an ugly, unrealistic hindrance.
  10. I think the 295x2 and Titan Z are the first single-PCB cards able to handle ultra-realistic graphics at resolutions higher than 1080p. For me, 4K looks way better than 1080p, but only slightly better than 1440p. And The Witcher 2, one of the best-looking games I've ever seen, runs above 60 fps maxed (no ubersampling) on a single R9 290X at 1440p. So I think these newest cards can handle it, and it's only going to get better.
  11. Oh, it's definitely the games, which is why we recently had this big bump in the need for better hardware with the new consoles. But my point is that games are starting to get pretty enough that cutting-edge tech isn't giving as much extra eye candy as it used to. Imagine Skyrim on low. *barf* Watch Dogs on medium did not look terrible. It didn't look good, but the scenery wasn't bad enough to detract from the enjoyment of the game.
  12. Witcher 3

    That's true, but the game we saw footage of was running on the Xbox One, not on PC. I think it will be the best-looking game released to date when it debuts next February.
  13. I think we're finally at the point where increasingly better hardware matters less. Games on current top-end hardware look absolutely gorgeous, to the point where I can't imagine us calling them an eyesore in five years. Today I can't go back and play Oblivion because of the graphics, yet in five years I wouldn't see a problem going back and playing Skyrim, because the graphics look great with minimal mod usage. Thoughts?
  14. Witcher 3

    I believe this game will set the tone for all "next gen" games. It looks... well... insane. It's reported to be 20% larger than Skyrim with no load screens. If games like this can become the standard, then maybe we'll stop getting half-assed games in the future. Of course the game isn't out yet, so this is all speculation, but the developers of this series have been good to their customers. They have no reason to lie or mislead.
  15. Got The Witcher 2 for $5. Having a blast with it.
  16. Maybe they said AMD couldn't, or AMD just doesn't care enough to market it with HDMI 2.0 in mind, knowing that most of us who care about 4K are going with DP.
  17. It's the exact same panel, only put in different cabinets and sold with different promotional materials. Samsung definitely has a prettier stand, and it also claims 370 cd/m^2, which may indicate a brighter backlight. On the other hand, the Asus stand is adjustable. Tough choice. If the people behind HDMI cared one bit about the gaming market, it would support 120 Hz, since physically that's what it does to deliver 3D pictures, and every active-3D TV can run at least 120 Hz, usually more. You can believe what you want; it doesn't really affect me. DisplayPort uses 1 to 4 lanes to send all-purpose micro-packets, padded to match the resolution, using self-clocking LVDS signaling at a fixed frequency of 540 MHz, with an AC-coupled differential voltage of 0.2, 0.4, 0.8, or 1.2 V. HDMI uses 3 lanes, one each for red, green, and blue, with a separate clock signal, to send time-separated video pixels and audio packets, using TMDS signaling at a frequency matching the current resolution's pixel rate, up to 600 MHz, with a single-ended voltage of 2.8-3.3 V and a 0.15 V to 0.8 V swing. As you can see yourself, the signaling is entirely different and electrically incompatible. Dual-mode DisplayPort, which is what video cards use, is implemented with two separate transmitters, one for HDMI+DVI and one for DisplayPort, switching the port between them depending on which pins are shorted on the connector. They only share a connector to save room. The only time an HDMI 1.4 device can be firmware-upgraded to 2.0 is when it was designed with HDMI 2.0 in mind but produced before 2.0 became official, and thus had to have its 2.0 mode disabled for compliance. Were Nvidia and AMD chips designed this way? Not impossible, but it's been half a year already, ample time. I guess we will just have to see. HDMI 2.0 was announced around October 2013, and the 295x2/Titan Z were released a couple of months ago. It's perfectly feasible to think they were designed with 2.0 in mind. Just a waiting game at this point, I suppose.
  18. It just does. Proven, tested; thousands of people run it on HDMI, and most people who buy high-end hardware today play on TVs, not monitors. You can look up reviews, look up specs. And didn't your card include an adapter? A firmware upgrade through drivers would be like a firmware upgrade adding a turbocharger to a car that doesn't have one. You need new hardware for HDMI 2.0, much higher-spec modem designs; it's on the same technology level as 10GbE Ethernet. You can't firmware-update your 1 Gbps network card to 10 Gbps, can you? Same for HDMI. While we're at it, 10GbE runs over the exact same copper as 1GbE, and that's a whole 10x speed bump; far better signaling hardware with near- and far-end crosstalk rejection makes the difference. It's what I've been talking about: lackluster quality. Don't worry, it's not worse than other TNs, about average. It's just that there has never been a good full-size TN panel (a decent one was once made for Sony and Apple laptops). They're the low end of the market, have never been 8-bit (forget the 10-bit that high-end panels are), and have never had decent color reproduction; they never could, because colors shift with viewing angles. It's OK, I guess, if you haven't dealt with high-end monitors or home theater TVs. Otherwise... As they say, "once you go black, you can never go back." Same here: once you run a year with a screen that ranges from pitch black to pure bright R/G/B/W, you'll never go back to one that ranges from dark gray to pastel. First off, why would I buy a $2k UHD gaming monitor when games aren't even benefiting from 4K as much as they could? I'm quite happy with this monitor until the high-end stuff becomes more affordable. Also, I still think you're wrong about the card. I don't see why firmware can't give the chip the new instructions. All they are changing is how the cables transmit the bandwidth. My card can send the bandwidth, because DisplayPort works; it's not a limitation of the hardware. HDMI said firmware/hardware, implying that both could be used to make the change. Sony has also said firmware will make their existing devices HDMI 2.0. So I'm going to stick with it being possible for AMD/Nvidia drivers to fix the issue. HDMI would be committing suicide in the gaming market if they basically told everyone: "Sorry, none of your GPUs work. You're going to have to buy the newest models for the 2.0 specification." Link an article saying my card or any similar card won't work with it, and I'll believe you.
  19. Know what "backwards compatible" means? It means you can connect an HDMI x.y device to an HDMI a.b device and it will work. Not work as 2.0; work as the lower version of the two. Know what "connectors" are? Connectors are those physical plastic housings wrapped in sheet metal with copper wires inside. PHY layer. Of course they're the same. What defines the HDMI version is the interface on the chip. Well... try it. People have; it didn't work. The 295x2 specifications state it only supports HDMI 1.4a. The chip doesn't have HDMI 2.0 support, and putting two together doesn't change a thing; it's still the same chip as the 290X. Same thing for the Titan Z, same thing for 2x780Ti. Had an Asus PQ321 to test recently. I don't know what you have, but of all the 4K computer screens I've seen, the PQ321 was among the best; it's got Sharp's 31.5" IGZO panel and full 10-bit color. The only 27" 4K panels I've heard of are cheap TN+film with 6-bit color, so I'll go out on a limb here and say that yours is likely not better. The PQ321 is almost good enough to use. Almost. But not quite. For one, it can't show black. So Metro 2033/LL, two out of maybe five games that do have the fine detail for 4K, are out; they're mostly black, so you want black in the palette. So-so uniformity, all colors washed out, and you can't tune gamma or they fade to gray. All in all, it looks like a 3-year-old 2560x1440 monitor, only with slightly smaller pixels. Literally one built 3 years ago and never cleaned since. Others are worse, TN ones especially. Now, let's take... the Sony XBR X900 4K TV. A whole other world. You can put a black image on it with one tiny bright spot, turn off the lights, and you'll never see where the screen border is. The bright spot will be blinding, everything else pitch black. The backlight is perfectly even, bright means blinding, there's no lag to speak of, and the colors just work. The kicker? Both cost about $3k: the 32" PQ321 that looks like something unpleasant squeezed between two sheets of glass, and the 55" X900 that looks almost like a cinema screen minus all the people. But it doesn't have DisplayPort, so it won't do 4K@60. It's very nearly time for 4K; all it needs to move from the purchasable-tech-demo stage to the early-adoption stage is suitable video cards. 20nm and a new architecture to keep noise down, and of course HDMI 2.0. The next step is OLED 4K displays. Curved OLED 4Ks have already been shown; some can even be bought. I haven't seen the best ones in person yet (few have), but from what reviewers agree on, they make the X900 look like it has a dirty plastic bag stuck to its screen. Even brighter highlights, even deeper blacks, even faster response, even wider color gamut, perfect local contrast, and the curve helps immersion a lot. That will be the prime time for 4K. If the trends of previous display revolutions hold, it won't get better from that point on, just cheaper. I cannot find where the 295x2 supports HDMI at all. Only DP. And this is the monitor I have: http://www.newegg.com/Product/Product.aspx?Item=9SIA2RY1DK9756... Not the best. TN panel. But very affordable. Very clear, and with a lot of color tweaking in the CCC, games just look absolutely gorgeous. I don't use ENB for various reasons in Skyrim, but this monitor makes it look like ENB with a few mods (like Climates of Tamriel). Just extremely cinematic. I would still like to know where the 295x2 supports HDMI 1.4a, AND I would like to know why a firmware upgrade through drivers could not make it support 2.0 if it is, in fact, needed.
I mean, it sounds like they are just changing how the signal is sent through their Cat 2 cables.
  20. I've been doing some more research, and it appears that the connectors don't have anything to do with it... but the monitor does. The monitor has to support HDMI 2.0. My 295x2 can support 4K @ 60 fps on an HDMI-only 4K TV... ONLY if the chip inside the TV allows for it. So it looks like fmod and I were both wrong. Graphics chips can only send that much bandwidth over the cable if the receiving end of the signal will accept it. Moral of the story: the Titan Z and 295x2 are just fine for future 4K gaming on DP or HDMI; just make sure the HDMI-equipped monitor/TV has the 2.0 specification. Moral of the story 2: Don't buy a 4K TV until they are made with the 2.0 specification. Moral of the story 3: Even if it does come down to the connector and I have been wrong all along, the 295x2 still wins because it only has Mini DisplayPort, which, with a Category 2 HDMI cable and a 2.0-capable display, will deliver 60 fps where the Titan Z might not.
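A minimal sketch of that takeaway, assuming the usual rule that a link only runs a mode if both the source (the GPU's output) and the sink (the display's input) support it; the device labels and refresh numbers below are illustrative assumptions, not values pulled from any spec sheet:

```python
# A 4K link only runs what BOTH ends support: the GPU's output and the
# display's input. The refresh rates (Hz) below are illustrative assumptions.
SOURCE_4K_HZ = {"HDMI 1.4a output": 30, "HDMI 2.0 output": 60, "DisplayPort 1.2 output": 60}
SINK_4K_HZ = {"HDMI 1.4 TV": 30, "HDMI 2.0 TV": 60, "DisplayPort monitor": 60}

def max_4k_refresh(source: str, sink: str) -> int:
    """The 4K refresh rate the link can actually run: the lower of the two ends."""
    return min(SOURCE_4K_HZ[source], SINK_4K_HZ[sink])

# A GPU limited to HDMI 1.4a stays at 30 Hz even on an HDMI 2.0 TV...
print(max_4k_refresh("HDMI 1.4a output", "HDMI 2.0 TV"))                # 30
# ...and an HDMI 2.0 TV only reaches 60 Hz if the GPU side can also do 2.0.
print(max_4k_refresh("HDMI 2.0 output", "HDMI 2.0 TV"))                 # 60
print(max_4k_refresh("DisplayPort 1.2 output", "DisplayPort monitor"))  # 60
```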
  21. You're speaking with enough hubris that I'm not sure if I should reply, as this is not a specialist hardware forum after all. HDMI has everything to do with the "connector", or more precisely with the transceivers on either end. On the contrary, it's the cables that don't matter; HDMI 2.0 uses the exact same Cat2 cables as HDMI 1.4. Have you read your own link? Because it says that too. The 295x2 does have HDMI interfaces, but they share connectors with its MiniDP ports. MDP-to-HDMI adapters are not converters; they do not process the signal, they only change the mechanical connector. When you use such an adapter, the video card sends an HDMI signal instead of a DisplayPort signal. It would have to send an HDMI 2.0 signal to establish an HDMI 2.0 connection. It cannot. Only HDMI 1.4a is supported; thus it's impossible to connect to a 4K TV at 60 fps. In a perfect world DisplayPort is better and everyone would be using it, but the reality is that the best image quality is only found in TVs, and they stubbornly stick to HDMI only. To you. To you. I can't look at 30 fps for more than a couple of minutes before I'd rather look elsewhere. I don't enjoy sharp lines showcasing every angle of old low-poly models, or blurry textures stretched over even more pixels. If you do, more power to you. I know some people who grew up with analog TV sets (as did I, but somehow didn't pick up this bug) and seem to value sharpness above all when buying a TV set, so they try to hunt for it even on new LCD sets that are all perfectly sharp by definition. Maybe you share the same values. Or maybe it's something else; it doesn't matter. I've been an early adopter for every display tech out there: Trinitrons, LCD monitors, plasma TV, wide color gamut, 2560x1600, active-dimming LED, triple-screen setups. I will be for OLED 4K TV, in this order of importance, but not before there's something you can hook them up to without losing in image quality, screen size, or framerate. You don't need to tell me what an increase in resolution looks like; been there, done that, even played around with 3840x2400 before it was cool and you had to hook up three cables to drive it. It may be your first display revolution; it isn't mine. I'm glad that you're excited, but please, acting like a white knight amidst the unwashed masses about it gets outright comical. Well, I can't convince you about the HDMI thing if you're unwilling to listen. *Cough* Is HDMI 2.0 backwards compatible with HDMI 1.x? Yes, all HDMI versions are fully backward compatible with all previous versions. Does HDMI 2.0 require new connectors? No, HDMI 2.0 uses the existing connectors.
  22. Do explain. http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-titan-z/specifications 1 - 3840x2160 at 30Hz or 4096x2160 at 24Hz supported over HDMI. 4096x2160 (including 3840x2160) at 60Hz supported over Displayport. This is HDMI 1.4a. It supports 3840x2160 at 30 Hz, which is useless. Neither the Titan Z nor the 295x2 supports HDMI 2.0. 60 Hz is supported via DisplayPort, but there is nothing worthwhile to connect to DisplayPort; all 4K TVs still come with HDMI only. 30 fps is not a comfortable framerate, and it's well below the norm for PC gaming. So the extent of support for 4K today is: * The only games that have the detail to benefit from 4K won't run smoothly in 4K even on dual-GPU setups. * No gaming video cards support HDMI 2.0, so they can only show 4K@60fps on small monitors with image quality below modern HT standards. Are you seriously this dense? HDMI 2.0 has NOTHING to do with the connector. It has to do with the cable and how much bandwidth it can transfer. DisplayPort cables can run higher bandwidth than HDMI. 30 fps is definitely comfortable; it's what the consoles run. 60 fps is amazing. Anything less than 30 sucks, though. You are just really ignorant about this subject. Also, the 295x2 doesn't even have an HDMI port. It features 4 Mini DisplayPorts. I.e., an MDP-to-HDMI 2.0 adapter plus an HDMI 2.0 cable will work :smile:. Edit: You keep claiming that only certain games *benefit* from 4K. That could not be more wrong. Every game benefits; how much it benefits is variable. Of the games I've tested it with, all of them benefited immensely from the extra resolution. Sharper, crisper, and the textures looked better upscaled. Also, the ability to turn AA off is great. AA blurs the image, whereas 4K doesn't need AA, resulting in sharper, non-blurred images. Please educate yourself and/or pick up a 4K monitor and test it yourself. You'll feel like a fool afterwards. Edit 2: Here you go for the cable, STRAIGHT from HDMI's website: http://www.hdmi.org/manufacturer/hdmi_2_0/
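To put rough numbers on the 30 Hz vs. 60 Hz split in those specs: a 3840x2160 frame with standard CEA-861 blanking is about 4400x2250 clocks, HDMI 1.4's TMDS clock tops out around 340 MHz, and HDMI 2.0 allows up to 600 MHz. Those figures are my assumptions drawn from the published specs, not from the posts above; here's a back-of-the-envelope sketch:

```python
# Does 3840x2160 RGB 8-bit fit under the HDMI TMDS clock ceiling?
# Total frame size includes blanking (assumed CEA-861 timing: 4400x2250
# clocks per frame); ceilings are the published per-version maximums.
TOTAL_H, TOTAL_V = 4400, 2250
HDMI_1_4_MAX_MHZ = 340  # max TMDS clock under HDMI 1.4
HDMI_2_0_MAX_MHZ = 600  # max TMDS clock under HDMI 2.0

def pixel_clock_mhz(refresh_hz: int) -> float:
    """Pixel clock needed for 4K at the given refresh rate, in MHz."""
    return TOTAL_H * TOTAL_V * refresh_hz / 1e6

for hz in (30, 60):
    clk = pixel_clock_mhz(hz)
    print(f"4K@{hz}: {clk:.0f} MHz pixel clock | "
          f"HDMI 1.4: {'fits' if clk <= HDMI_1_4_MAX_MHZ else 'too fast'} | "
          f"HDMI 2.0: {'fits' if clk <= HDMI_2_0_MAX_MHZ else 'too fast'}")
# 4K@30: 297 MHz -> fits under HDMI 1.4's 340 MHz ceiling
# 4K@60: 594 MHz -> needs HDMI 2.0's 600 MHz ceiling
```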
  23. "Beauty is in the eye of the beholder" is all I can say to that. Stretching low-poly models with low-res textures over twice the pixels only exposes how bad they are, with angled 'curves' and blurry and sharp surfaces all mixed together. In my experience, old games actually look better on a CRT that's lower in resolution and smoothed a bit by the display technology. If you think Oblivion at 4K looks similar to Crysis 3 at 1080p, well, there's no argument I can make. It's the same as saying an average schoolkid's art-class painting on canvas looks similar to a Michelangelo in a print. I guess to someone it does. Battlefield 4 isn't. Crysis 3 isn't. Metro Last Light isn't. What is... Oblivion? Sounds like you don't have a 4K monitor. Also, the cards do support HDMI 2.0. Why? Because it has to do with the cable, not the connector. Battlefield 4 runs at 72 fps @ 4K maxed. Crysis at 30 fps. Metro at 45 fps. Skyrim at 60 fps. Limited? Funny.
  24. Games look way better at 4K, and I'm on a 27-inch monitor. What are you talking about? Even graphically pathetic games like Oblivion look similar at 4K to current games at 1080p. *Using a 295x2. And limited 4K abilities? Games are running at 60 fps @ 4K. I wouldn't call that limited.
  25. You should read the whole thread; we just discussed it on page 1. Partial support: most new features will still require new hardware. The one feature promised to be supported isn't even user-oriented; it only concerns how calls through the WDDM stack are made, and it's completely black-boxed and invisible to the end user. Basically, all this means is that DX12 games won't require two completely separate renderers like the DX9 and DX11 renderers in modern games... well, they will require separate DX9 and DX12 renderers if you want to support hardware older than GCN, but at least not three (DX9, DX11, and DX12). I give DX12 a few years before it is actually useful. It took DX11 a while before it became better. By then, I'll have a new card anyway. Plus, all Gaming Evolved titles will have Mantle.