There has long been an outcry on AVS about the 10-bit color depth issue.
"Rumor is the new HDMI 2.0 spec has been finalized and approved. A lot of industry members are pissed off about it and reportedly the big boys bought out some smaller companies to get enough votes to have their way including one big boy I am very surprised at which reportedly voted against the interest of some of its companies elements. the standard as approved is not 12 bit but only 10 bit. No 4:4:4 but 4:2:0. I don't think even 4:2:2 and only 60. There is quite a bit of discussion about this in the reduser forum under the Odemax in the fall thread. No display port. See Stacey Spears posts starting on page 10 or so."
post by mark haflich, avs
"HDMI 2.0 will have a total data rate limit of 18 Gbps and that sets a hard limit as to max. bit depth vs. resolution that it will be able to support. for a given chroma subsampling scheme. Remember that rec. BT.2020 for UHD also includes 8K video as well as 4K and HDMI 2.0 was not intended to supporting everything defined by BT2020.
Increase bit-depth directly results in increased data rates, so there will be a limit, its just a question of where it is. Even though video will be using the bulk of the data capacity of HDMI, HDMI's data rate budget must include the capacity to carry uncompressed multichannel audio and control information.
Working within the data rate limit, for supporting 4K video the real question is what combinations of bit depths and chroma subsampling schemes (e.g., 4:2:0 = lowest data rates and 4:4:4 = highest data rates) can technically be supported and to what extent has politics within the HDMI Forum artificially constrained the HDMI 2.0 supported capabilities to something less than what would be technically viable."
post by Ron Jones, avs
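To put Ron Jones's point in concrete terms, here is a minimal back-of-the-envelope sketch (my own illustration, not from the quoted posts) that checks which bit-depth/chroma-subsampling combinations fit a 4K/60 signal within HDMI 2.0's 18 Gbps link. It assumes CTA-861 4K timing (4400 × 2250 total pixels per frame including blanking) and 8b/10b TMDS encoding overhead, and it ignores the audio and control overhead Ron Jones mentions:

```python
# Back-of-the-envelope HDMI 2.0 bandwidth check (an illustrative sketch).
# Assumes CTA-861 4K timing (4400 x 2250 total pixels including blanking)
# and 8b/10b TMDS encoding, leaving 18 Gbps * 8/10 = 14.4 Gbps for data.

LINK_GBPS = 18.0                    # HDMI 2.0 total TMDS rate
PAYLOAD_GBPS = LINK_GBPS * 8 / 10   # usable rate after 8b/10b encoding

TOTAL_W, TOTAL_H, FPS = 4400, 2250, 60    # 3840x2160@60 with blanking
PIXEL_RATE = TOTAL_W * TOTAL_H * FPS      # ~594 MHz pixel clock

# Average samples per pixel across Y/Cb/Cr for each subsampling scheme
SUBSAMPLING = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

for depth in (8, 10, 12):
    for name, samples in SUBSAMPLING.items():
        gbps = PIXEL_RATE * depth * samples / 1e9
        fits = "fits" if gbps <= PAYLOAD_GBPS else "exceeds"
        print(f"{depth:>2}-bit {name}: {gbps:5.2f} Gbps "
              f"({fits} {PAYLOAD_GBPS:.1f} Gbps)")
```

Under these assumptions, 8-bit 4:4:4 just squeezes in at about 14.3 Gbps, while 10-bit 4:4:4 needs roughly 17.8 Gbps of payload, which is why the "hard limit" described above pushes 10- and 12-bit 4K/60 toward 4:2:2 or 4:2:0.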
"If these discussed limitation are really true, it will be a huge disappointment. They took almost two year to put together this spec, and it is already obsolete before it is released as it does not have the full support for 4K.
Why is it so impossible for these short sited people to plan a bit ahead and build in extra capacity (bandwidth)?
The cost increase per HDMI chip is going to be miniscule in relations to the cost of a 4K projector/receiver (whether HDMI chip cost $1 or $10 or $30 on a $3,000 piece of equipment is totally irrelevant; $30 is 1% of $3,000 - completely irrelevant)
From consumer's point of view, the HDMI consortium would be much better holding off another year and build in capacity for full 8k support, including 3D (without any artificial constraints in the chip).
If broadcasters are concerned about current bandwidth, they can use lower standards in the way they transmit content. Just because the HDMI chip has full support for for 4:4:4 16 bit color, does not mean they could not broadcast content in 4:2:0 and 10 bits to start with and then transition to 4:4:4, 16 bit color as bandwidth capacity becomes available. But do not cripple chip support artificially.
post by Dionyz, avs