By James DeRuvo (doddleNEWS)
In the music world, unless they have “golden ears,” most people can’t really tell the difference between so-called “lossless” audio and your garden-variety MP3. 4K faces a similar challenge, in that most viewers won’t see the difference between 1080p and Ultra HD (UHD) unless they have a screen so large they’d have to mortgage the house to afford it. That’s the finding of a recent study comparing how viewers perceive the two formats. And it could be a harder problem to solve than price or bandwidth.
“Ultra HD is upon us and the images were indeed amazing, but have we thought about the transport methods to get such high data rates to so many different devices? Have we considered visual acuity, or what the eye can actually resolve? On a 50-inch panel, you need to be about three meters away to fully see 1080p.” – James Chen, managing director, Kortz, at CEDIA
Last Sunday for the Super Bowl, CBS broadcast the big game with over 80 high-def cameras in 1080p and six 4K cameras, used mostly for instant replay. But the difference was lost on the average viewer without a TV over 60 inches – which would account for roughly 99.9% of the worldwide audience. Why? Because of the simple limitations of the human eye, which cannot tell the difference at an average viewing distance of a few meters.
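The acuity argument can be sketched with a little arithmetic. A common rule of thumb (an assumption here, not a figure from the article) is that a normal eye resolves detail down to about one arcminute; beyond the distance at which a single pixel subtends that angle, extra resolution is invisible. A rough calculation for a 50-inch 16:9 panel:

```python
import math

def max_useful_distance_m(diagonal_in, vertical_pixels, aspect=(16, 9)):
    """Rough distance (metres) beyond which the eye can no longer
    resolve individual pixels, assuming ~1 arcminute of acuity."""
    w, h = aspect
    # screen height in metres, derived from the diagonal
    height_m = diagonal_in * 0.0254 * h / math.hypot(w, h)
    pixel_m = height_m / vertical_pixels          # height of one pixel
    one_arcmin_rad = math.radians(1 / 60)         # assumed eye acuity
    return pixel_m / math.tan(one_arcmin_rad)

for pixels, label in [(1080, "1080p"), (2160, "4K UHD")]:
    print(f'50\" {label}: ~{max_useful_distance_m(50, pixels):.1f} m')
```

Under that one-arcminute assumption, a 50-inch 1080p set is already pixel-perfect from roughly two meters away, and 4K only pays off at about half that distance – consistent in spirit with Chen’s three-meter figure, which presumably uses slightly different acuity numbers.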
“You’re…saying that smaller 4K TVs are viable. How much smaller?” asks Geoffrey Morrison of CNET. “Well, not 50 inches. Probably not 60 inches either. These are the sizes people are buying…”
Part of the problem is that we were taught watching TV too close is bad for our eyes. The fact is that our focusing distance is set by biology, not technology, and our eyes can only see so well – fortunately for our wallets. Most people aren’t going to buy a 70-110″ screen just to get the 4K experience. That could spell doom for the manufacturers at CES who announced 4K TVs with great fanfare and award-winning design. It may be that Panasonic had data suggesting the same, which justified sitting out the 4K game (except for a tablet – but that has the same issue). And it may also explain the absence of the long-rumored Apple iTV, which may have missed its window.
But the physical limitations of 4K flat screens may present an opportunity for 4K projectors. Data also suggests that most consumers consider 70″+ TVs not only too costly, but something of an eyesore in the living room. A projector offers the best of both worlds, allowing larger images when desired with virtually no footprint in the living room. The flip side is that users need a costly screen to get the best possible image – lighting up a bare wall simply won’t do the job, especially after laying out five figures for the 4K privilege.
Then there’s the issue of bandwidth, which we’ve discussed before. According to Chen, for viewers to enjoy 4K as broadcast or streaming, they’d need enough bandwidth to handle an 8.91 Gb/s pipeline. At that rate, most Internet data caps would be eaten up in less time than it takes to watch a single episode of The Big Bang Theory. That’s not going to happen while Internet providers refuse to update their out-of-date business models.
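The data-cap claim checks out with back-of-the-envelope math. Taking Chen’s 8.91 Gb/s figure at face value (it describes an uncompressed pipeline; real streaming services compress far below this) and assuming a hypothetical 300 GB monthly cap – the cap size is our assumption, not the article’s – the cap is gone in minutes:

```python
def cap_exhaustion_minutes(cap_gigabytes, rate_gbps):
    """Minutes of viewing before a data cap (in gigabytes) is
    consumed at a given stream rate (in gigabits per second)."""
    cap_gigabits = cap_gigabytes * 8      # bytes -> bits
    seconds = cap_gigabits / rate_gbps
    return seconds / 60

# Hypothetical 300 GB monthly cap vs Chen's 8.91 Gb/s figure
print(f"{cap_exhaustion_minutes(300, 8.91):.1f} minutes")
```

That works out to about four and a half minutes – well short of even a single ~20-minute sitcom episode, which is the article’s point.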
That leaves little room for widespread adoption, which is what’s needed to make 4K and beyond affordable. We may be seeing the end of the technological road as a result – and it may come down to simple biology.
Hat Tip – SCRI