
Nvidia finally loses.



Are you seriously this dense? HDMI 2 has NOTHING to do with the connector. It has to do with the cable and how much bandwidth it can transfer. DisplayPort cables can run higher bandwidth than HDMI.

 

Also, the 295x2 doesn't even have an HDMI port. It features 4 Mini DisplayPorts, i.e. an MDP to HDMI 2 adapter plus an HDMI 2 cable will work :smile:.

 

Edit 2: Here you go for the cable, STRAIGHT from HDMI's website: http://www.hdmi.org/manufacturer/hdmi_2_0/

You're speaking with enough hubris that I'm not sure if I should reply, as this is not a specialist hardware forum after all.

 

The HDMI version has everything to do with the "connector", or more precisely with the transceivers on either end. On the contrary, it's the cables that don't matter - HDMI 2.0 uses the exact same Category 2 cables as HDMI 1.4. Have you read your own link? Because it says that too.

 

 

The 295x2 does have HDMI interfaces, but they share connectors with its Mini DisplayPorts. MDP->HDMI adapters are not converters; they don't process the signal, they only change the mechanical connector. When you use such an adapter, the video card sends an HDMI signal instead of a DisplayPort signal.

It would have to send an HDMI 2.0 signal to establish an HDMI 2.0 connection. It cannot. Only HDMI 1.4a is supported; thus it's impossible to connect to a 4K TV at 60 fps.
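
To make the bandwidth gap concrete, here's a rough back-of-the-envelope sketch - illustrative only, using the commonly quoted nominal figures, not anything from the card's documentation:

# Rough illustration of why 4K @ 60 Hz doesn't fit in HDMI 1.4 but does in HDMI 2.0.
# Treat this as a sketch with nominal numbers, not a spec reference.

# 4K (2160p60) total timing including blanking, per CTA-861: 4400 x 2250 pixels
pixel_clock_hz = 4400 * 2250 * 60          # ~594 MHz

bits_per_pixel = 24                         # 8 bits per channel, RGB
tmds_overhead = 10 / 8                      # 8b/10b encoding on the TMDS link

required_gbps = pixel_clock_hz * bits_per_pixel * tmds_overhead / 1e9

hdmi_14_limit_gbps = 10.2                   # HDMI 1.4 aggregate TMDS bandwidth
hdmi_20_limit_gbps = 18.0                   # HDMI 2.0 aggregate TMDS bandwidth

print(f"4K @ 60 Hz needs ~{required_gbps:.1f} Gbps on the wire")
print("Fits HDMI 1.4?", required_gbps <= hdmi_14_limit_gbps)   # False
print("Fits HDMI 2.0?", required_gbps <= hdmi_20_limit_gbps)   # True

Roughly 17.8 Gbps needed versus a 10.2 Gbps ceiling - which is why HDMI 1.4 tops out at 4K at 30 Hz for full-quality video.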

 

In a perfect world DisplayPort, being the better interface, is what everyone would be using, but the reality is that the best image quality is only found in TVs, and they stubbornly stick to HDMI only.

 

 

30 fps is definitely comfortable. ...

Sharper, crisper, and the textures looked better upscaled.

To you.

To you.

 

I can't look at 30fps for more than a couple minutes before I'd rather look elsewhere.

I don't enjoy sharp lines showcasing every angle of old low-poly models or blurry textures stretched over even more pixels.

 

If you do, more power to you.

 

 

I know some people who grew up with analog TV sets (as I did, though somehow without picking up this bug) seem to value sharpness above all when buying a TV set, so they hunt for it even on new LCD sets that are all perfectly sharp by definition. Maybe you share the same values. Or maybe it's something else; it doesn't matter.

 

I've been an early adopter of every display tech out there: Trinitrons, LCD monitors, plasma TVs, wide color gamut, 2560x1600, active-dimming LED, triple-screen setups.

I will be for OLED 4K TVs too, but not before there's something you can hook them up to without losing image quality, screen size or framerate (in that order of importance).

 

You don't need to tell me what an increase in resolution looks like; been there, done that, even played around with 3840x2400 before it was cool, back when you had to hook up three cables to drive it. It may be your first display revolution; it isn't mine. I'm glad that you're excited, but please - acting like a white knight amidst the unwashed masses gets outright comical.


 


Well I can't convince you about the HDMI thing if you're unwilling to listen.

 

*Cough*

Is HDMI 2.0 backwards compatible with HDMI 1.x?

Yes, all HDMI versions are fully backward compatible with all previous versions.

 

Does HDMI 2.0 require new connectors?

No, HDMI 2.0 uses the existing connectors.


HDMI 2.0 allows for greater 4K capabilities, with a bandwidth limit of around 18 Gbps, plus upgraded 3D support.

 

Here is a more in-depth spec sheet.

 

http://www.extremetech.com/computing/165639-hdmi-2-0-released-18gbps-of-bandwidth-allowing-for-4k-60-fps-32-audio-channels

 

Best to buy cheap right now, since HDMI 2.0 gear is freaking expensive.


Been doing some more research, and it appears that the connectors don't have anything to do with it... but the display does. The monitor has to support HDMI 2.0. My 295x2 can do 4K @ 60 fps on an HDMI-only 4K TV... ONLY if the chip inside the TV allows for it. So it looks like fmod and I were both wrong. Graphics chips can only send that much bandwidth over the cable if the receiving end will accept it.

 

Moral of the story: the Titan Z and 295x2 are just fine for future 4K gaming on DP or HDMI - just make sure the HDMI-connected monitor/TV has the 2.0 specification.

Moral of the story 2: don't buy a 4K TV until they are made to the 2.0 specification.

Moral of the story 3: even if it does come down to the connector and I've been wrong all along, the 295x2 still wins because it only has Mini DisplayPort - with a Category 2 HDMI cable and a 2.0-capable display, the 295x2 will deliver 60 fps while the Titan Z might not.


Is HDMI 2.0 backwards compatible with HDMI 1.x?

Yes, all HDMI versions are fully backward compatible with all previous versions.

Does HDMI 2.0 require new connectors?

No, HDMI 2.0 uses the existing connectors.

Know what "backwards compatible" means?

It means you can connect an HDMI x.y device to an HDMI a.b device and it will work. Not work as 2.0; work as the lower version of the two.

 

Know what "connectors" are?

Connectors are those physical plastic housings wrapped in sheet metal with copper wires inside. The PHY layer. Of course they're the same.

What defines the HDMI version is the interface on the chip.
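
A minimal sketch of what "backward compatible" means in practice - this is a simplification (real devices exchange capabilities via EDID rather than bare version labels, so the names here are just shorthand for the illustration):

# Simplified illustration: a link can only run features both ends support.
CAPABILITIES = {
    "HDMI 1.4a": {"max_tmds_gbps": 10.2, "4k60_rgb": False},
    "HDMI 2.0":  {"max_tmds_gbps": 18.0, "4k60_rgb": True},
}

def effective_link(source_version: str, sink_version: str) -> dict:
    """The link works, but only with the capabilities common to both ends."""
    src, snk = CAPABILITIES[source_version], CAPABILITIES[sink_version]
    return {
        "max_tmds_gbps": min(src["max_tmds_gbps"], snk["max_tmds_gbps"]),
        "4k60_rgb": src["4k60_rgb"] and snk["4k60_rgb"],
    }

# HDMI 1.4a graphics card into an HDMI 2.0 TV: backward compatible, but no 4K@60.
print(effective_link("HDMI 1.4a", "HDMI 2.0"))
# {'max_tmds_gbps': 10.2, '4k60_rgb': False}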

 

 

Been doing some more research, and it appears that the connectors don't have anything to do with it... but the display does. The monitor has to support HDMI 2.0. My 295x2 can do 4K @ 60 fps on an HDMI-only 4K TV... ONLY if the chip inside the TV allows for it.

Well... try it.

People have, it didn't work.

 

The 295x2's specifications state it only supports HDMI 1.4a. The chip doesn't have HDMI 2.0 support, and putting two of them together doesn't change a thing; it's still the same chip as the 290X.

Same thing for the Titan Z, same thing for 2x 780 Ti.

 

 

Had an Asus PQ321 to test recently. I don't know what you have, but of all the 4K computer screens I've seen, the PQ321 was among the best; it's got Sharp's 31.5" IGZO panel and full 10-bit color. The only 27" 4K panels I've heard of are cheap TN ones with 6-bit color, so I'll go out on a limb here and say that yours is likely not better.
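
For scale, here's the simple arithmetic behind those bit-depth figures - just illustrative math, not measurements of any particular panel:

# Colors representable at a given per-channel bit depth (3 channels, RGB).
def color_count(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

for bits in (6, 8, 10):
    print(f"{bits}-bit per channel: {color_count(bits):,} colors")
# 6-bit:        262,144 colors (typical cheap TN, usually dithered up)
# 8-bit:     16,777,216 colors
# 10-bit: 1,073,741,824 colors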

 

The PQ321 is almost good enough to use. Almost. But not quite. For one, it can't show black. So Metro 2033/LL - two of the maybe five games that actually have the fine detail for 4K - are out; they're mostly black, so you want black in the palette. So-so uniformity, all colors washed out, and you can't tune the gamma or they fade to gray. All in all, it looks like a 3-year-old 2560x1440 monitor, only with slightly smaller pixels. Literally one built 3 years ago and never cleaned since.

 

Others are worse, the TN ones especially.

 

 

Now, let's take... Sony XBR X900 4K TV.

Whole other world.

You can put a black image on it with one tiny bright spot, turn off the lights, and you'll never see where the screen border is. The bright spot will be blinding, everything else pitch black. Backlight is perfectly even, bright means blinding, no lag to speak of, colors just work.

 

The kicker? Both cost about $3k: the 32" PQ321 that looks like something unpleasant squeezed between two sheets of glass, and the 55" X900 that looks almost like a cinema screen minus all the people. But the X900 doesn't have DisplayPort, so it won't do 4K@60.

 

It's very nearly time for 4K; all it needs to move from the purchasable-tech-demo stage to the early-adoption stage is suitable video cards: 20nm, a new architecture to keep noise down, and of course HDMI 2.0.

 

The next step is OLED 4K displays. Curved 4K OLEDs have already been shown, and some can even be bought. I haven't seen the best ones in person yet (few have), but from what reviewers agree on, they make the X900 look like it has a dirty plastic bag stuck to its screen. Even brighter highlights, even deeper blacks, even faster response, even wider color gamut, perfect local contrast, and the curve helps immersion a lot.

 

That will be prime time for 4K. If the trends of previous display revolutions hold, it won't get better from that point on, just cheaper.

Edited by FMod

Coming from playing on the Vita, I very much agree OLED is the next big step. Too bad it will be 3 years before they come down to a price I could afford.

Edited by Thor.

 


 

I cannot find where the 295x2 supports HDMI at all. Only DP. And this is the monitor I have: http://www.newegg.com/Product/Product.aspx?Item=9SIA2RY1DK9756... Not the best. TN panel. But very affordable. Very clear, and with a lot of color tweaking in the CCC, games just look absolutely gorgeous. I don't use ENB in Skyrim, for various reasons, but this monitor makes it look like ENB with a few mods (like Climates of Tamriel). Just extremely cinematic. I would still like to know where it says the 295x2 supports HDMI 1.4a, AND I'd like to know why a firmware upgrade through drivers couldn't make it support 2.0 if that is, in fact, needed. I mean, it sounds like they are just changing how the signal is sent through the same Cat 2 cables.


I cannot find where the 295x2 supports HDMI at all. Only DP.

It just does. Proven, tested; thousands of people run it over HDMI - most people who buy high-end hardware today play on TVs, not monitors.

You can look up reviews, look up specs. And didn't your card include an adapter?

 

 

I would still like to know where it says the 295x2 supports HDMI 1.4a, AND I'd like to know why a firmware upgrade through drivers couldn't make it support 2.0 if that is, in fact, needed. I mean, it sounds like they are just changing how the signal is sent through the same Cat 2 cables.

A firmware upgrade through drivers would be like a firmware upgrade adding a turbocharger to a car that doesn't have one. You need new hardware for HDMI 2.0 - much higher-spec transceiver designs, on the same technology level as 10GbE Ethernet.

 

You can't firmware-update your 1 Gbps network card to 10 Gbps, can you? Same for HDMI. And while we're at it, 10GbE runs over the exact same copper as 1GbE, and that's a whole 10x speed bump - far better signaling hardware with near- and far-end crosstalk rejection makes the difference.

 

 

And this is the monitor I have: http://www.newegg.com/Product/Product.aspx?Item=9SIA2RY1DK9756... Not the best. TN panel. But very affordable. Very clear, and with a lot of color tweaking in the CCC, games just look absolutely gorgeous.

It's what I've been talking about: lackluster quality. Don't worry, it's not worse than other TNs - about average.

 

It's just that there has never been a good full-size TN panel (a decent one was once made for Sony and Apple laptops). They're the low end of the market: never 8-bit (forget the 10-bit that high-end panels are), never decent color reproduction - and they never could have it, because colors shift with viewing angle.

 

It's OK, I guess, if you haven't dealt with high-end monitors or home theater TVs. Otherwise... as they say, "once you go black, you can never go back". Same here: once you spend a year with a screen that ranges from pitch black to pure bright R/G/B/W, you'll never go back to one that ranges from dark gray to pastel.


 

First off, why would I buy a $2k UHD gaming monitor when games aren't even benefiting from 4K as much as they could? I'm quite happy with this monitor until the high-end stuff becomes more affordable.

Also, I still think you're wrong about the card. I don't see why firmware can't give the chip the new instructions. All they're changing is how the signal is transmitted through the cables; my card can already send that much bandwidth, because DisplayPort works, so it's not a limitation of the hardware. HDMI said firmware/hardware, implying that either could be used to make the change. Sony has also said firmware will make their existing devices HDMI 2.0. So I'm going to stick with it being possible for AMD/Nvidia drivers to fix the issue. HDMI would be committing suicide in the gaming market if they basically told everyone: "Sorry, none of your GPUs work. You're going to have to buy the newest models for the 2.0 specification." Link an article saying my card or any similar card won't work with it, and I'll believe you.

