4K, 8K, 16K: Stewart Filmscreen Is There and Ready When You Are!

By Alan C. Brawn CTS, ISF, ISF-C

Just as we are beginning to embrace the 4K phenomenon in displays, we are already seeing the rumblings, dare I say the emergence, of 8K and even 16K displays on the not-too-distant horizon. While some seem content with 1920 x 1080, or full HD, evidence shows 4K is here to stay, at least until the next evolution in resolution takes hold.

We have all seen articles with titles like “Is 4K Really Necessary?” and “Barriers to 4K.” Many of these articles talk about the availability of 4K source material and the bandwidth needed to stream 4K, while others note that 4K is best viewed at close proximity and that there is little visible difference at longer viewing distances. There is no argument that source material and available bandwidth must evolve before we can embrace full 4K adoption, but the issue of visible differences begs for more discussion centered on the science of visual acuity.

Understanding Visual Acuity and Pixel Density

Let’s start with pixels. AV experts and video enthusiasts know that the more pixels a screen of a given size has, the smaller each pixel becomes, allowing more overall picture information to be displayed. This is true from a technical standpoint, but too simplistic to truly explain the relevance of greater pixel counts. It ultimately boils down to what the human eye can resolve. So, just what can we see? Referring to the Snellen eye chart used to measure 20/20 human vision, we learn that the human eye resolves one arc minute of information. An arc minute is a subdivision of an arc degree: there are 360 arc degrees in a complete circle and 60 arc minutes in each arc degree.

As you can see from the diagram below, 20/20 vision corresponds to resolving 1 arc minute: each stroke of the letter E on the 20/20 line of the chart subtends exactly that angle, which is why someone with 20/20 vision can just make it out. In short, that’s the finest detail our eyes actually detect, but let’s do a little more math and relate it to pixels.

[Image: Snellen eye chart diagram illustrating the 1 arc minute limit of 20/20 vision]
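To make the arc minute concrete, here is a minimal sketch in Python; the viewing distances (1, 3, and 6 meters) are illustrative choices, not figures from this article. It converts an angle of 1 arc minute into the smallest physical detail a 20/20 eye can resolve at each distance:

```python
import math

def resolvable_detail_mm(distance_m: float, arcmin: float = 1.0) -> float:
    """Smallest detail (in mm) that subtends `arcmin` arc minutes at `distance_m` meters."""
    theta = math.radians(arcmin / 60.0)   # 1 arc minute = 1/60 of a degree
    return distance_m * math.tan(theta) * 1000.0  # meters -> millimeters

# What a 20/20 eye can just resolve at typical viewing distances:
for d in (1.0, 3.0, 6.0):
    print(f"{d:.0f} m: {resolvable_detail_mm(d):.2f} mm")
# 1 m: 0.29 mm, 3 m: 0.87 mm, 6 m: 1.75 mm
```

A pixel smaller than those figures at the corresponding distance is already at the limit of what the eye can distinguish.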

There are 10,800 arc minutes in 180 degrees of viewing. Since the eye resolves one arc minute, an image filling that entire horizontal field would need no fewer than 10,800 pixels across to match it. Achieving that resolution with 1920 x 1080 projectors would require 10,800 divided by 1,920, or about 5.6 projectors edge to edge; vertically, a field of view of roughly 117 degrees (about 7,000 arc minutes) works out to 7,000 divided by 1,080, or about 6.5 projectors, and that is before blending losses. Rounding up, we would need a 6 x 7 grid, or 42 of these HD projectors, at minimum, to match human acuity. Clearly that is very expensive and not practical for most applications, but the example gives us some idea of what the human eye can actually see. Now, those 4K, 8K, and 16K displays begin to make more sense, because human vision really can detect a difference.
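The same arithmetic as a short sketch: the 180-degree horizontal field and the 1920 x 1080 projector resolution come from the discussion above, while the roughly 117-degree vertical field is an assumption back-solved from the 7,000 arc minute figure:

```python
import math

ARCMIN_PER_DEG = 60

h_arcmin = 180 * ARCMIN_PER_DEG  # 10,800 arc minutes across the horizontal field
v_arcmin = 7_000                 # vertical figure used above (~117 degrees of view)

cols = h_arcmin / 1920           # HD projectors needed edge to edge: ~5.6
rows = v_arcmin / 1080           # HD projectors needed top to bottom: ~6.5

print(f"{cols:.1f} across x {rows:.1f} high")
print(f"minimum grid: {math.ceil(cols)} x {math.ceil(rows)} = "
      f"{math.ceil(cols) * math.ceil(rows)} projectors")  # 6 x 7 = 42
```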

Image Resolution: More than Just Pixel Density

Turning our attention to the display devices that create what we see, and keeping in mind the “holy grail” of what the human eye can resolve, let’s explore resolution from another perspective. Image resolution correlates directly with the amount of detail in an image: higher resolution means greater image detail, and more detail brings us closer to the limit of visual acuity.
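As a rough illustration, the sketch below (the 2.5 m screen width is an assumed example) computes the viewing distance at which one pixel subtends exactly 1 arc minute, the point beyond which a 20/20 eye can no longer pick out individual pixels:

```python
import math

ONE_ARCMIN_RAD = math.radians(1 / 60)

def acuity_distance_m(screen_width_m: float, horizontal_pixels: int) -> float:
    """Distance at which one pixel subtends exactly 1 arc minute."""
    pixel_pitch = screen_width_m / horizontal_pixels
    return pixel_pitch / math.tan(ONE_ARCMIN_RAD)

# Illustrative 2.5 m (~100 in) wide screen at three common resolutions:
for name, px in (("Full HD", 1920), ("4K", 3840), ("8K", 7680)):
    print(f"{name}: pixels blend together beyond ~{acuity_distance_m(2.5, px):.1f} m")
# Full HD: ~4.5 m, 4K: ~2.2 m, 8K: ~1.1 m
```

Each doubling of horizontal resolution halves that distance, which is why higher pixel counts matter most for large screens viewed up close.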