Do higher resolutions for display systems make sense? In a word, no, but let me explain why.

Consumer display technology continues to advance at a lightning pace, and it's hard to keep up with all that is happening. One of the trends we have all seen over the last decade is the push for higher and higher resolutions. Yet, is anyone asking whether we should be pursuing these higher resolutions?

The resolution counts push forever higher, but how many pixels do we really need?

We like to think that more is usually better, so why should total pixel counts be any different?

This leads to a consideration of different variables involved in how we perceive resolution, particularly because our own eyesight has a limited maximum "resolution." Once we reach that limit, anything higher is wasted on us.

We cannot think about the resolution of our eyesight in the same way we do pixel-based imagery. Our eyes have rods and cones that transmit pieces of our field of view to the brain, where they are reassembled into a complete image. Couple that with the fact that only the center of our field of view can resolve high detail, and it becomes harder to translate resolution into human-eyesight terms. And it doesn't stop there; our eyes are imperfect as well. Near-sightedness, far-sightedness, and other visual conditions decrease acuity, which means less ability to resolve detail.

There are complex mathematical models implying that someone with 20/20 visual acuity has as much as 576 megapixels of "resolution" in their eyes. However, that number represents the total imagery captured, from both the periphery and the center of our field of view. A single glance captures far less, perhaps in the 5-15 megapixel range, which still isn't bad. But remember all the issues our eyes can have.

When we factor this 15-megapixel maximum into our thinking, we realize that even perfect human eyesight would max out somewhere between 2.5K and perhaps 5.5K of display resolution. So why push beyond this limitation?
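
To see where numbers like that come from, note that for a 16:9 frame, total pixels = width × height, with height being 9/16 of the width. Here's a minimal sketch converting a megapixel count into an equivalent horizontal resolution (the 16:9 shape is an assumption, and the 5 and 15 MP inputs are just the range cited above):

```python
import math

def equivalent_16x9_width(megapixels: float) -> int:
    """Horizontal pixel count of a 16:9 frame containing the given megapixels."""
    # pixels = w * h and h = w * 9/16, so w = sqrt(pixels * 16/9)
    return round(math.sqrt(megapixels * 1e6 * 16 / 9))

for mp in (5, 15):
    print(f"{mp} MP at 16:9 is about {equivalent_16x9_width(mp)} pixels wide")
# 5 MP  -> ~2981 pixels wide (call it "3K")
# 15 MP -> ~5164 pixels wide (call it "5K")
```

That lands in the same ballpark as the 2.5K-5.5K range above.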

One can think of a number of reasons why higher resolutions make sense, such as the need for high-detail imagery for lab research, astronomy, surveying, and the like. But for our purposes of home theater or commercial display? Not so much, if we're being honest with ourselves.


When we look at 1080p and its roughly 2-megapixel resolution, it can seem quite small compared to the massive 33 megapixels that 8K brings; the quick tally below bears those figures out. However, what does science teach us about the display sizes and viewing distances needed to appreciate the differences between resolutions?
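
Those megapixel counts are just width times height. A quick tally (treating "480p" as 854×480, a 16:9 assumption, and "4K" as consumer UHD rather than DCI 4K):

```python
resolutions = {
    "480p": (854, 480),
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "4K": (3840, 2160),
    "8K": (7680, 4320),
}
for name, (w, h) in resolutions.items():
    print(f"{name:>5}: {w * h / 1e6:4.1f} MP")
# 1080p works out to ~2.1 MP and 8K to ~33.2 MP, as cited above
```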

Let's say that your room is blessed with a 16:9, 150-inch projection screen. According to viewing distance standards from THX and the Society of Motion Picture & Television Engineers (SMPTE), the viewing distance necessary to appreciate the difference between resolutions is as follows (the sketch after this list shows how figures like these can be derived):

  • 480p = 44 feet viewing distance
  • 720p = 29 feet viewing distance
  • 1080p = 20 feet viewing distance
  • 4K = 9 feet viewing distance
  • 8K = 5 feet viewing distance
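
These figures are consistent with a simple visual-acuity model: a viewer with 20/20 vision resolves detail down to about one arcminute, so the farthest useful viewing distance is where a single pixel subtends one arcminute. A minimal sketch of that model (the 1-arcminute threshold and the 16:9 geometry are my assumptions here; THX and SMPTE layer their own criteria on top of math like this):

```python
import math

ARCMIN = math.radians(1 / 60)  # ~20/20 acuity limit: one arcminute of detail

def max_viewing_distance_ft(diagonal_in: float, vertical_pixels: int) -> float:
    """Farthest distance (feet) at which individual pixels are still resolvable."""
    height_in = diagonal_in * 9 / math.hypot(16, 9)  # height of a 16:9 screen
    pixel_in = height_in / vertical_pixels           # height of a single pixel
    return pixel_in / math.tan(ARCMIN) / 12          # inches -> feet

for name, rows in [("480p", 480), ("720p", 720), ("1080p", 1080),
                   ("4K", 2160), ("8K", 4320)]:
    print(f"{name:>5}: {max_viewing_distance_ft(150, rows):.0f} ft")
# Prints roughly 44, 29, 20, 10, and 5 feet -- in line with the list above
```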

As you can see, fully appreciating the difference between 1080p and 4K requires a more than 50% reduction in viewing distance! Consider, also, how large a 150-inch diagonal image looks from only 9 feet away! If your home theater could support sitting that close to a screen this size, then moving to 4K could work for you. In most use cases, however, it makes no sense, because once you are farther than 9 feet from the screen, you will see no appreciable difference in detail over 1080p.

You might think that getting a larger screen would help, and it does, to a point. Let's take a fairly large 220-inch diagonal, 16:9 screen and run the numbers again (rerunning the earlier sketch with the new diagonal reproduces these):

  • 480p = 64 feet
  • 720p = 43 feet
  • 1080p = 29 feet
  • 4K = 14 feet
  • 8K = 7 feet
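
For the curious, the same acuity sketch reproduces this list too:

```python
# Reusing max_viewing_distance_ft() from the earlier sketch
for name, rows in [("480p", 480), ("720p", 720), ("1080p", 1080),
                   ("4K", 2160), ("8K", 4320)]:
    print(f"{name:>5}: {max_viewing_distance_ft(220, rows):.0f} ft")
# Prints roughly 64, 43, 29, 14, and 7 feet, matching the list above
```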

As you can see, at 4K resolution, adding another 70 inches of diagonal size yields a paltry 5-foot increase in maximum viewing distance versus the 150-inch screen. This means that if you are planning 4K projection for even a modestly deep space, say 40 feet, the screen needed for viewers at the back to appreciate 4K would have to be far larger than the space could likely accommodate.

I believe it's also important to remember that many projectors on the market claim 4K capability but do not sport native 4K imaging chips. They rely on what I consider somewhat deceptive tactics, such as pixel-shifting.

So why is this so, and do we have to follow the THX or SMPTE standards? Let's answer the first question: viewing distance matters because the higher your resolution, the smaller the pixels get. The smaller the pixel, the closer you physically need to be to the image to appreciate the ensuing boost in fine detail.
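
To make that concrete, here's how pixel size shrinks on a fixed 150-inch screen as resolution climbs (same 16:9 geometry assumption as before):

```python
import math

height_in = 150 * 9 / math.hypot(16, 9)  # ~73.5" tall for a 150" 16:9 screen
for name, rows in [("1080p", 1080), ("4K", 2160), ("8K", 4320)]:
    print(f"{name:>5}: {height_in / rows * 25.4:.2f} mm per pixel")
# 1080p ~1.73 mm, 4K ~0.86 mm, 8K ~0.43 mm -- each step halves the pixel
```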

If you have worked with most LED video walls to date, you can appreciate that they look better from farther away. The illusion of a cohesive image is easier to pull off at a distance. But get close enough to a video wall and, unless we are talking about a very tight dot pitch, you'll see the individual pixels, and the illusion fails.
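
The same one-arcminute threshold gives a rough "blend distance" for LED walls: the distance at which a given pixel pitch stops being individually resolvable (the pitch values below are purely illustrative):

```python
import math

def blend_distance_m(pitch_mm: float) -> float:
    """Distance at which a pixel pitch drops below ~1 arcminute and blends."""
    return pitch_mm / 1000 / math.tan(math.radians(1 / 60))

for pitch in (1.2, 2.5, 4.0):
    print(f"{pitch} mm pitch blends at ~{blend_distance_m(pitch):.1f} m")
# 1.2 mm -> ~4.1 m, 2.5 mm -> ~8.6 m, 4.0 mm -> ~13.8 m
```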

So I ask again: Do we have to follow the THX or SMPTE standards? The easy answer is no. These are standards designed for theaters and ideal scenarios. Anyone who has ever worked with projection systems knows how difficult it can be to realize ideal circumstances. But does this mean we should throw out what they say?

Not necessarily. We start with the standard, and then we look at what is often used in the field and ultimately what will work for your specific use case.

You might, at this point, be thinking that higher resolutions are pointless for display, and I would tend to agree. However, I can think of one use case where it could make sense, and that is when you want to utilize several different windows within a single frame, such as you may see at large touring shows or with ultra-wide displays. For example, think about the Las Vegas Sphere: there is such high resolution built into that system (256 million pixels) that they can do virtually anything they want visually, both inside and out, without loss in resolution.

Remember, also, that moving to a 4K display means investing in an entire infrastructure to support that resolution: signal cabling, switchers or receivers, and the displays themselves. This can quickly make the entire idea of 4K financially, technically, or logistically unfeasible, particularly if you need to realize some kind of ROI from the expenditure.
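
A quick back-of-the-envelope on uncompressed video data rates hints at why the infrastructure has to change (active pixels only, 8-bit RGB, ignoring blanking, HDR bit depths, and chroma subsampling):

```python
def raw_video_gbps(width: int, height: int, fps: int, bits_per_pixel: int = 24) -> float:
    """Uncompressed active-pixel data rate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

print(f"1080p60: {raw_video_gbps(1920, 1080, 60):.1f} Gbps")
print(f"4K60:    {raw_video_gbps(3840, 2160, 60):.1f} Gbps")
# ~3.0 vs ~11.9 Gbps -- a 4x jump your cabling and switching must absorb
```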

Future-proofing isn't a good argument either, for all the reasons listed above. Remember, at a fixed viewing distance you have to massively increase the image size to appreciate the difference, and the sheer cost of scaling your display system up that large is unrealistic.

In conclusion, ever-higher resolutions are being pushed on us by manufacturers clamoring to meet demand from consumers who believe more is always better, but that should not drive us to succumb to the trend. I understand that 4K is here to stay; however, projector manufacturers still offer a large variety of native WUXGA laser projectors that let you retain a 1080p-class display. And in many cases, that will be more than enough.

Tim Adams is president and chief systems designer for Timato Systems, an audio/video integration company specializing in servicing the sound, lighting, video, projection and live-streaming needs of churches and other houses of worship. He can be reached at info@timatosystems.com.

Comments (4)
David Sobers Posted Jun 5, 2024 3:38 PM PST
A good and timely article.

I would enjoy Tim or Ron adding the comparison of digital images versus the gold standards of the past - 35 mm and 70 mm film. Also, the higher pixel count fits in well with the common home theater environment where viewing from a 5' or 7' distance is desirable or even necessary to many viewers in terms of their limited room size and the continuing interest in video immersion where the viewer becomes part of the scene. The immersion factor is one of the strong attractions for the headset users as those manufacturers continue to improve their resolution, field of view and 3D effects. I have enjoyed sky diving, shark feeding, plane piloting, satellite orbiting, and the like thanks to YouTube's many offerings. The larger screen with the accompanying high pixel count, high contrast and color fidelity, and immersive sound allows a group of participants to enjoy the experience, a la IMAX and Cinema on your home video wall. A video wall does not have to be LED based, it can be projected as well.

The escape that surround sound and surround video offers is enticing. We are still some time away from residential holograms, but the immersion lure is there, and the industry will satisfy it as long as the demand is present. I look forward to the ride.
Tod Posted Jun 5, 2024 7:41 PM PST
Oftentimes, up to 1/3 of the projected 16:9 image (I'm talking movies) is covered by masking black bars. Wasted detail and luminance. For material produced for home consumption (streaming providers), could we somehow convince the producers and directors to fill the entire 16:9 display, regardless of the resolution? Then they can mask it when it's shown in theaters instead. If masking home video is so great, then why don't they mask sporting events?
Rob Sabin, Editor Posted Jun 6, 2024 8:50 AM PST
@Tod, this is an unfortunate byproduct of the conflict between the movie industry's longstanding widescreen format and the 16:9 aspect ratio settled on for HDTV. Most televisions have an aspect ratio picture mode that will zoom a widescreen movie image to fill the screen, but at the expense of either sacrificing some content or introducing some level of geometric distortion. Back in the day of 4:3 TV sets, broadcasters (as well as airlines) regularly practiced "pan and scan" when adapting widescreen movies for viewing, where the entire film was re-recorded with decisions being made along the way as to what relevant content should actually be shown to the viewer. A two-shot of actors speaking to each other would become a series of bouncing one-shots, and so forth.

The only truly effective fill technique I've seen is found in the topline madVR Envy video processors, which applies very subtle correction in places it won't be easily noticed to provide consistent screen aspect ratio no matter the native aspect ratio of the content.
Ryan Posted Jul 6, 2024 3:30 AM PST
You didn't really touch on monitors. I'll tell you for a fact I have a 4K 144Hz 28" and a 5K 60Hz 27", and even from 18"-24" away I can tell the 5K is much sharper. Looking from 8" away, the 4K doesn't look really sharp anymore, whereas the 5K is pretty sharp. I can't wait for them to make an 8K 144Hz 28".
