December 9, 2024



8K vs 4K TVs: Double-blind study by Warner Bros. et al reveals most consumers can’t tell the difference

I’ve long maintained that the value of 8K displays is not in the increased pixel count. There’s a limit to the resolution that humans can discern on video screens at normal seating distances, and increasing the pixel density beyond that limit offers no advantage.
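To make that limit concrete, here is a back-of-the-envelope sketch (my own illustration, not part of the study) that estimates the distance beyond which a viewer with roughly 20/20 acuity, about one arcminute of angular resolution, can no longer distinguish adjacent pixels. The 88-inch screen size matches the display used in the study; both it and the resolutions are just parameters.

```python
import math

ARCMINUTE_RAD = math.radians(1 / 60)  # rough 20/20 acuity limit, ~0.00029 rad

def pixel_resolving_distance(diagonal_inches, horizontal_pixels, aspect=(16, 9)):
    """Distance in meters beyond which adjacent pixels subtend less than
    one arcminute and can no longer be told apart by a typical viewer."""
    aw, ah = aspect
    width_m = diagonal_inches * 0.0254 * aw / math.hypot(aw, ah)
    pixel_pitch_m = width_m / horizontal_pixels
    return pixel_pitch_m / ARCMINUTE_RAD

# 88-inch screen, like the LG 88Z9 used in the study
for label, h_pixels in [("4K", 3840), ("8K", 7680)]:
    d = pixel_resolving_distance(88, h_pixels)
    print(f"{label}: pixels blend together beyond ~{d:.1f} m ({d / 0.3048:.1f} ft)")
```

On those assumptions, an 88-inch 4K panel's pixels become unresolvable from about 1.7 meters away, and the 8K panel's from less than a meter, which is why any advantage of 8K at normal seating distances is worth testing rather than assuming.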

But where, exactly, is that limit? More specifically, do 8K displays offer any benefit in terms of perceived detail compared with 4K under normal viewing conditions? In collaboration with Pixar, Amazon Prime Video, LG, and the American Society of Cinematographers (ASC), Warner Bros. recently addressed this question in a well-designed, double-blind study to see if people could discern a difference between 4K and 8K with a variety of content.

For the purposes of this article, “4K” refers to a resolution of 3840 x 2160, and “8K” refers to a resolution of 7680 x 4320. As you might already know, these terms are something of a misnomer; to be technically accurate, “4K” really means 4096 x 2160 and “8K” means 8192 x 4320. But the terms have been co-opted by the consumer-electronics industry to refer to the corresponding television resolutions, so that’s how I’ll use them here.
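As a quick bit of arithmetic (mine, not the study's), doubling both dimensions quadruples the pixel count, which is why 8K is often described as four times the resolution of 4K:

```python
pixels_4k = 3840 * 2160   # 8,294,400 pixels per frame
pixels_8k = 7680 * 4320   # 33,177,600 pixels per frame
print(pixels_8k / pixels_4k)  # 4.0
```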

Content selection and preparation

A total of seven clips were prepared, each in native 8K and about 10 seconds long with no compression. Two clips from Warner Bros.’ Dunkirk (8K scans of 70mm film) included a closeup on a character and a wide shot of the beach. Animated clips from Pixar’s Brave and A Bug’s Life were rendered in 8K for this study. And two clips from the Amazon live-action series The Tick—one in a cave and the other in a spaceship—were shot in 8K on a Red digital-cinema camera, as was a clip of nature footage shot by Stacey Spears.

All seven clips were also natively HDR and encoded in HDR10. Fig. 1 lists the HDR10 statistics for each clip.

[Fig. 1 table image courtesy of Warner Bros.]

Fig. 1: All seven clips were encoded in HDR10. This table lists the MaxFALL (Maximum Frame-Average Light Level) and MaxCLL (Maximum Content Light Level) of each clip. As you can see, the clips represent a wide range of average and maximum light levels.
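For readers unfamiliar with those two HDR10 metadata fields, here is a rough illustration of how they relate to the content (a conceptual sketch over hypothetical per-pixel light levels, not the actual mastering workflow, which defines the values over each pixel's maximum RGB component): MaxCLL tracks the brightest single pixel anywhere in the clip, while MaxFALL tracks the brightest frame average.

```python
import numpy as np

def hdr10_light_levels(frames):
    """Return (MaxCLL, MaxFALL) from per-pixel light levels in nits.

    frames: iterable of 2-D arrays, one per video frame, where each value is
    that pixel's light level in cd/m^2 (nits).
    """
    max_cll = 0.0   # brightest single pixel in the whole clip
    max_fall = 0.0  # highest frame-average light level in the whole clip
    for frame in frames:
        max_cll = max(max_cll, float(frame.max()))
        max_fall = max(max_fall, float(frame.mean()))
    return max_cll, max_fall

# Toy example: three dim frames plus one brighter frame with a specular highlight
frames = [np.full((4, 4), 50.0) for _ in range(3)]
bright = np.full((4, 4), 120.0)
bright[0, 0] = 900.0  # single bright highlight
frames.append(bright)
print(hdr10_light_levels(frames))  # (900.0, 168.75)
```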

Each clip was also downscaled to 4K using the industry-standard Nuke post-production software. Then, the 4K clips were “upscaled” back to 8K using the Nuke cubic filter, which essentially replicates each pixel into a 2x2 block with just a bit of smoothing, so the final image is effectively 4K within an 8K “container.”
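To picture what that duplicate-and-smooth step does, here is a minimal sketch of the pixel-replication part (nearest-neighbor doubling; the slight smoothing applied by Nuke's cubic filter is omitted, and this is my illustration rather than the study's actual pipeline):

```python
import numpy as np

def upscale_2x_nearest(frame):
    """Double a frame's resolution by repeating each pixel as a 2x2 block."""
    return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

# Toy-sized stand-in; a real 4K frame would have shape (2160, 3840, 3)
frame_4k = np.random.rand(4, 8, 3)
frame_8k = upscale_2x_nearest(frame_4k)
print(frame_4k.shape, "->", frame_8k.shape)  # (4, 8, 3) -> (8, 16, 3)
```

The result has 8K dimensions but carries no more real detail than its 4K source, which is exactly the property the test needed: the only meaningful difference between the two versions of each clip is effective resolution.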

Why upscale the 4K versions back to 8K? Because both versions of each clip would be played on the same 8K display in random order (more on that in a moment). To switch between the 4K and 8K versions seamlessly, without HDMI handshake hiccups or the display momentarily flashing the input signal’s resolution, both versions had to “look like” 8K to the display.

Test equipment and procedure

The display on which all clips were shown was an LG 88Z9 88-inch 8K OLED TV, which I reviewed here. All clips were loaded onto a Windows PC with an 18-core Intel i9 CPU, SSD RAID storage, and an Nvidia 1080 Ti GPU. A Blackmagic 8K Pro video interface sent the video over four 12G-SDI links to four AJA Hi5-12G SDI-to-HDMI converters, each outputting HDMI 2.0. Those four HDMI 2.0 signals were fed to an Astrodesign SD-7075, which combined them into a single HDMI 2.1 bitstream sent to the TV.
