
Nvidia finally loses.


Recommended Posts

4K monitors are coming down in price. If I were to get a 4K monitor, it would be an Asus.

It's the exact same panel, only put in different cabinets and sold with different promotional materials.

 

Samsung definitely has a prettier stand and it also claims 370 cd/m^2, which may indicate a brighter backlight. On the other hand the Asus stand is adjustable. Tough choice.

 

 

HDMI would be committing suicide in the gaming market if they basically told everyone: "Sorry, none of your GPUs work. You're going to have to buy the newest models for the 2.0 specification."

If the people behind HDMI cared one bit about the gaming market, it would support 120 Hz, since that's physically what it already does to deliver 3D, and every active-3D TV can run at least 120 Hz, usually more.

 

You can believe what you want, doesn't really affect me.

 

I don't see why firmware can't give the chip the new instructions. All they are changing is how the cables transmit the bandwidth. My card can clearly push that bandwidth, because DisplayPort works.

 

DisplayPort uses 1 to 4 lanes to send all-purpose micro-packets, padded to match the resolution, using self-clocking LVDS signaling at a fixed frequency of 540 MHz, with an AC-coupled differential voltage of 0.2, 0.4, 0.8, or 1.2 V.

 

HDMI uses 3 lanes, one each for red, green, and blue, plus a separate clock signal, to send time-separated video pixels and audio packets, using TMDS signaling at a frequency matching the current resolution's pixel rate, up to 600 MHz, with a single-ended voltage of 2.8-3.3 V and a 0.15-0.8 V swing.

 

As you can see, the signaling is entirely different and electrically incompatible. Dual-mode DisplayPort, which is what video cards use, is implemented with two separate transmitters, one for HDMI/DVI and one for DisplayPort, switching the port between them depending on which pins are shorted on the connector. They only share a connector to save room.
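
To put rough numbers on the bandwidth side of it, here's a back-of-the-envelope sketch in Python; the 4K blanking figures are the standard CEA-861 timings and the link limits are the commonly quoted ones, so treat it as an estimate rather than spec-exact math:

# Rough check of why 4K60 is out of reach for HDMI 1.4 but fine for HDMI 2.0 and DP.
# Blanking values below are the standard CEA-861 4K timings; link limits are the
# usual quoted figures.

def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=560, v_blank=90):
    """Approximate TMDS pixel clock in MHz for a given video mode."""
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

uhd60 = pixel_clock_mhz(3840, 2160, 60)   # ~594 MHz
uhd30 = pixel_clock_mhz(3840, 2160, 30)   # ~297 MHz

print(f"4K60 needs ~{uhd60:.0f} MHz of pixel clock, 4K30 needs ~{uhd30:.0f} MHz")
print("HDMI 1.4 tops out around 340 MHz, HDMI 2.0 at 600 MHz.")
print("DisplayPort HBR2 (4 lanes x 5.4 Gbit/s) carries ~17.28 Gbit/s of payload")
print("after 8b/10b coding, which is why 4K60 over DP already works.")

That's the gap a firmware update would somehow have to bridge.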

 

The only time a HDMI 1.4 device can be firmware upgraded to 2.0 is when it was designed with HDMI 2.0 in mind, but produced before 2.0 became official, thus had to have its 2.0 mode disabled for compliance. Were Nvidia and AMD chips designed this way? Not impossible, but it's been half a year already, ample time.

Edited by FMod

I guess we will just have to see. HDMI 2.0 was announced something like October 2013, and the 295X2/Titan Z were released a couple of months ago. It's perfectly feasible that they were designed with 2.0 in mind. Just a waiting game at this point, I suppose.


Except the silicon itself in the Titan Z was designed back in 2011 and released in 2012 as the Tesla K20. They just put two of them on the same board. Seeing how gaming wasn't even the point of that chip, and how long ago it was designed, it's clear-cut.

 

AMD's silicon is a lot more recent, post-HDMI 2.0, but then why wouldn't they trumpet it as a feature at launch? Sony released firmware updates enabling HDMI 2.0 on its 4K TVs in November 2013, and other manufacturers did the same.

 

Realistically, someone may eventually make an active converter (the current ones are all low-spec), and that will solve it, though likely with some extra lag. By that time, though, most people who care will have new silicon anyway.


Maybe they were told they couldn't, or AMD just doesn't care enough to market it with HDMI 2.0 in mind, knowing that most of us who care about 4K are going with DisplayPort anyway.


HDMI 2.0 silicon is expensive enough that even Sony, one of the companies behind HDMI, puts only one 2.0 port on its 4K TVs; the rest of the ports are HDMI 1.4b.

 

DisplayPort is much cheaper to implement for the same data rate, so it's a non-issue there. We're not talking big money, a few bucks at most, but a few bucks matter in a price war.

High-quality 4K screens are still too expensive, video cards are still slow at 4K, and there is still zero 4K movie content you can buy, so very few people would care.

 

 

I've always believed we should have done 2560x1440 movies and TVs first; the 4K transition would have gone much more smoothly, and you'd already have 1440p content, which definitely looks better on a 4K screen than 1080p does.

(Due to the way video is compressed, upscaling by 1.5x doesn't degrade it, only improves it. "2:1" is meaningless except for calming down uneducated customers, who don't realize that nobody does pixel doubling anyway because it's ugly as hell. And Blu-ray discs already have enough spare capacity that H.264 1440p would fit just fine.)
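
Quick napkin math on that capacity claim. The 25 Mbit/s baseline is just an assumed typical Blu-ray AVC video bitrate, and I'm simply scaling it by pixel count, so take it as an estimate:

# Estimate how much 1440p H.264 video fits on a 50 GB Blu-ray, assuming the
# bitrate scales with pixel count from an assumed ~25 Mbit/s 1080p encode.

pixels_1080p = 1920 * 1080            # ~2.07 million pixels
pixels_1440p = 2560 * 1440            # ~3.69 million pixels
baseline_mbps = 25                    # assumed typical 1080p Blu-ray video bitrate

scaled_mbps = baseline_mbps * pixels_1440p / pixels_1080p   # ~44 Mbit/s
bd50_mbit = 50 * 8 * 1000             # BD-50 capacity in megabits (decimal)

hours = bd50_mbit / scaled_mbps / 3600
print(f"1440p at ~{scaled_mbps:.0f} Mbit/s -> roughly {hours:.1f} hours on a BD-50")

That works out to about two and a half hours of video, enough for a typical feature film with room left over for the audio tracks.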


I agree it's stupid the way the industry made the TVs first instead of the content. What's the point of 4K if there's no content available for it? There isn't even 2K content, so why jump to 4K?


Oh, they do it for theaters, just not for the public. Theaters don't use home HD resolutions, and certainly not home Blu-rays.

 

It wouldn't even cost extra to provide movies in higher resolution. Even today, most movies are shot on film, not digitally, and a scanned film frame is 2048x1556, which translates pretty nicely into 2560x1080 and even 3840x1600. More than half of the scanned pixels (3.2 million) are wasted in the 1920x800 format (1.5 million) used today.

 

So even for standard cheap 2K scans, a 2560-wide screen is a much better fit than a 1920-wide one. Image processing can very faithfully recover 2560 horizontal pixels from the vertical resolution that would otherwise be thrown away. (1:1 is for chumps; 2048x1556 -> 2560x1080 actually makes for a sharper and more detailed picture than keeping the width at 2048 would.)
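
Just to sanity-check those pixel counts, plain arithmetic on the resolutions mentioned above:

# Pixel counts for a full-aperture 2K film scan vs. the usual 1920x800 release
# frame and the proposed 2560x1080 target.

scan_2k  = 2048 * 1556    # ~3.19 million pixels
hd_scope = 1920 * 800     # ~1.54 million pixels
wide_uw  = 2560 * 1080    # ~2.76 million pixels

print(f"2K scan:    {scan_2k / 1e6:.2f} Mpx")
print(f"1920x800:   {hd_scope / 1e6:.2f} Mpx ({hd_scope / scan_2k:.0%} of the scan)")
print(f"2560x1080:  {wide_uw / 1e6:.2f} Mpx ({wide_uw / scan_2k:.0%} of the scan)")

So the 1920x800 release really does throw away more than half of what the scanner captured, while 2560x1080 keeps most of it.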

 

And then, of course, you can scan film in 4K. Plenty of movies are shot on high-quality film, not to mention IMAX at an effective 8K+. Digital cameras can run 4K to 5K resolutions too, and not just for The Hobbit; even some TV series, like the recent season of House of Cards (which is great, BTW), are being shot in 4K.

So the content's there, just not in the retail chain.


Yet you still don't see that in Blu-ray form, just the stuff downscaled to 1080p. Why not uncompressed 2K or 4K video? Surely with today's Blu-ray disc sizes it wouldn't be impossible to do.


Um, no, it's the other way around. HDMI 2.0 is the port, not the cable. Look at TVs: only one of Sony's 4K TVs even has HDMI 2.0. All HDMI cords can work with HDMI 2.0, so it's not the cord that matters, it's the port.


Speaking of 4K, can anyone recommend a 4K HD receiver, preferably with SD, 1080p, and 4K upscaling, at a decent price? Trust me, HD receivers are the way to go when it comes to upscaling. I might as well be future-proof when it comes to home theatre.

Edited by Thor.
