Inevitably, every year around CES at least one company hypes a new acronym that could be the next big breakthrough in TV display technology – or just a proprietary rebranding of existing tech. What those terms actually mean is almost always perplexing. So let’s take a moment to clear up the confusion and explain what all the different terms, new and old, really mean.

What Are the Different Types of TV Screens?

There are two main types of display technology that dominate the television marketplace – OLED and LED. And now, for those who have won the lottery, there’s MicroLED to add to the list. While the names look similar, the technologies work quite differently, and each has its own benefits and drawbacks.

OLED TVs

OLED (Organic Light-Emitting Diode) TVs were introduced by LG back in 2013, and for a while it was the only manufacturer making them. Sony joined the party in 2017 (not counting the 11-inch XEL-1 in 2007), and we are now seeing more companies enter the fray. Vizio released its first OLED late in 2020, and Panasonic announced its first at CES 2021 (others are available around the world but haven’t yet entered the US marketplace). OLEDs are electroluminescent, meaning that when a pixel is sent an electrical charge it emits its own light, with the brightness dependent on the strength of the signal. Stronger signal, brighter pixel.

The huge benefit of this is that when a pixel isn’t charged, it’s completely off, so OLEDs can attain absolute black. Since the light from each individual pixel can be controlled, the bright sections of an image can be accurately pinpointed, and there isn’t the blooming effect seen on many LED displays. It’s not all good news, though. OLEDs can’t get as bright as LEDs, they have a (slight) potential for burn-in, and they’re more expensive than LED televisions.

LED TVs

LED TVs are LCD TVs with an LED backlight that sits either around the edge of the panel (edge-lit) or directly behind the screen (direct-lit) in clusters or zones. Edge-lit TVs are the least expensive, while direct-lit TVs cost more. A third approach that puts LEDs behind the screen is full-array local dimming (FALD), which has much finer control over where the light shines on screen. All of these are susceptible to blooming, where light bleeds out from bright spots on screen, with edge-lit being the most susceptible and FALD the least. A FALD backlight is separated into zones and, when done well, mitigates blooming and gives the best contrast you can get from an LED TV. These TVs can have hundreds of small LEDs providing the light for the image.

There’s now also mini-LED technology, first released by TCL last year, which can pack more than 25,000 tiny LEDs into one display (TCL now calls its tech OD Zero). The extra LEDs allow for more backlight zones with finer control, so less blooming. LG and Samsung have their own mini-LED versions named QNED and Neo QLED, respectively.

The LCDs control how much of the LED backlight is allowed to pass through to your eyes. But because the light is blocked rather than turned off completely, LED TVs cannot achieve the same absolute black level as OLEDs. They’ve gotten dramatically better over the past five to ten years, but still can’t compete. LED TVs’ picture quality also suffers the further off-axis you get from center. But their light output absolutely blows OLED out of the water.
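
To get a feel for how zone-based local dimming works, here’s a minimal sketch (illustrative only, not any manufacturer’s actual algorithm) that simply drives each backlight zone at the level of the brightest pixel it has to show. With only a handful of zones, a single bright highlight lights up a big chunk of the screen, which is exactly the blooming described above; thousands of mini-LED zones shrink that area dramatically.

```python
# Minimal sketch of zone-based local dimming (illustrative only, not any
# manufacturer's actual algorithm). Each backlight zone is driven by the
# brightest pixel it has to show; more zones = finer control = less blooming.

def zone_backlight_levels(frame, zones_x, zones_y):
    """frame: 2D list of pixel brightness values in 0.0-1.0."""
    height, width = len(frame), len(frame[0])
    zone_h, zone_w = height // zones_y, width // zones_x
    levels = []
    for zy in range(zones_y):
        row = []
        for zx in range(zones_x):
            # The brightest pixel inside this zone decides the zone's LED level.
            pixels = [
                frame[y][x]
                for y in range(zy * zone_h, (zy + 1) * zone_h)
                for x in range(zx * zone_w, (zx + 1) * zone_w)
            ]
            row.append(max(pixels))
        levels.append(row)
    return levels

# A tiny "image": one bright highlight in an otherwise dark frame.
frame = [[0.02] * 8 for _ in range(8)]
frame[1][6] = 1.0

print(zone_backlight_levels(frame, zones_x=2, zones_y=2))
# [[0.02, 1.0], [0.02, 0.02]] -- with only 4 zones, the whole top-right quarter
# lights up for one bright pixel (blooming). With more zones, the lit area shrinks.
```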

MicroLED TVs

We’ve been hearing about MicroLED from Samsung for a couple of years, and now the company is finally releasing three models in 2021 (LG commercially released one last fall, although its primary use will likely be signage). It’s the first new TV display technology to hit the consumer market since the aforementioned Sony XEL-1 more than ten years ago. Each pixel has three microscopic LEDs that produce its color and brightness. Because each pixel is controlled independently, they can turn off to achieve absolute black like an OLED, and there’s no LCD layer in front of the LEDs to block the light. The tech combines the benefits of both OLED (black level) and LED (brightness) without any danger of burn-in. But like any new technology, MicroLED is ungodly expensive and will be for a few years.

What Are Quantum Dots?

Quantum dot tech isn’t its own display technology like OLED, LED, and MicroLED. It’s an additional layer that can be added to LED or, potentially, OLED displays to increase brightness and improve the colors a display can achieve. The layer is made up of nanoparticles that, when hit with a light source, are excited and re-emit light, creating more brightness and a wider color space (more on that later). In addition to the generic Quantum Dot Technology label, there are a few names manufacturers give their own proprietary processes – including QLED (Samsung), Triluminos (Sony), and NanoCell (LG).

What’s the Deal With HDR?

When we think of HDR (High Dynamic Range), it’s usually in terms of contrast ratio, but it also refers to color range. The expanded color range is called wide color gamut, or WCG. The purpose is to more accurately recreate on a television the brightness and color we experience in real life. There’s still a long way to go, but HDR gets us closer than before.

Display brightness, or luminance, is measured in units called candelas per square meter (cd/m²), also known as nits. SDR displays usually have a brightness of up to around 300 nits (although a properly calibrated display is set to 100 nits). HDR brightness goes up from there, with some current TVs capable of outputting 3,000 nits.

The target color range for HDR is called BT.2020 (SDR uses Rec.709). No display can yet achieve all the colors in the BT.2020 color space, although each improvement gets us closer. You might also see the DCI-P3 color space referred to, which sits between Rec.709 and BT.2020 and is what’s used for movie theater presentations.
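
To put some rough numbers on where DCI-P3 sits, here’s a back-of-the-envelope sketch that compares the area each gamut’s red, green, and blue primaries enclose on the CIE xy chromaticity diagram. Reviewers measure gamut coverage in more rigorous ways, so treat this purely as an illustration of the relative sizes.

```python
# Rough back-of-the-envelope comparison of color gamut sizes using the
# red/green/blue primaries each standard defines on the CIE xy diagram.
# Real gamut-coverage numbers are computed more carefully than this.

def triangle_area(primaries):
    """Shoelace formula for the triangle formed by three (x, y) primaries."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

gamuts = {
    "Rec.709": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":  [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "BT.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

rec709 = triangle_area(gamuts["Rec.709"])
for name, primaries in gamuts.items():
    ratio = triangle_area(primaries) / rec709
    print(f"{name}: {ratio:.2f}x the area of Rec.709")
# DCI-P3 lands in between: noticeably bigger than Rec.709 (~1.4x),
# still well short of BT.2020 (~1.9x).
```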

There are a few different formats in which HDR is delivered to our TVs. Not all content is delivered in every format, and not all formats are supported by every TV.

HDR10

HDR10 is the basic flavor of HDR that all HDR TVs can accept. It has static metadata, including the brightness of the single brightest pixel in the content, or MaxCLL (Maximum Content Light Level), the highest average brightness of any one frame, or MaxFALL (Maximum Frame-Average Light Level), and color point information. Since it’s static, there’s a single set of information that pertains to the entirety of the content as opposed to individual scenes or frames.
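
To make those two values concrete, here’s a small conceptual sketch of what MaxCLL and MaxFALL describe: the single brightest pixel anywhere in the content, and the highest average brightness of any one frame. Real mastering tools compute these from the actual video signal, so this is just the idea in miniature.

```python
# Conceptual sketch of HDR10's static metadata values. Real mastering tools
# compute these from the actual video; this just shows what the numbers mean.

def hdr10_static_metadata(frames):
    """frames: list of frames, each a flat list of pixel light levels in nits."""
    # MaxCLL: the single brightest pixel anywhere in the content.
    max_cll = max(max(frame) for frame in frames)
    # MaxFALL: the highest *average* brightness of any one frame.
    max_fall = max(sum(frame) / len(frame) for frame in frames)
    return {"MaxCLL": max_cll, "MaxFALL": max_fall}

# Three tiny "frames": a dark scene, a dark scene with one specular highlight,
# and a uniformly bright scene.
frames = [
    [5, 5, 5, 5],
    [5, 5, 5, 1000],
    [400, 400, 400, 400],
]

print(hdr10_static_metadata(frames))
# {'MaxCLL': 1000, 'MaxFALL': 400.0}
# One set of numbers for the whole movie, no matter how different the scenes are.
```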

Dolby Vision and HDR10+

Dolby Vision and HDR10+ both use dynamic metadata, so the metadata adjusts on a scene-by-scene or frame-by-frame basis. HDR10+ supports brightness levels up to 4,000 nits and 10-bit color depth. Dolby Vision, on the other hand, allows for a brightness of 10,000 nits and 12-bit color depth. It’s also proprietary technology and requires content providers and display manufacturers to pay for a license. Even so, it is supported by more studios and manufacturers than HDR10+, which might be on its way out after losing support from Fox Studios last year.
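
Here’s a tiny sketch of why that scene-by-scene metadata matters. The data below is made up for illustration, and the real Dolby Vision and HDR10+ formats carry far more than a single peak value per scene, but it shows how static metadata forces one number onto every scene while dynamic metadata lets a dark scene be described as dark.

```python
# Illustration of static vs. dynamic HDR metadata. The structures here are
# made up for clarity; real Dolby Vision / HDR10+ metadata is far richer.

scenes = {
    "moonlit alley":    [1, 2, 3, 8],          # peak pixel nits per shot
    "neon cityscape":   [600, 850, 1000],
    "daylight rooftop": [300, 450, 500],
}

# Static (HDR10): one peak value covers the entire movie.
static_peak = max(max(shots) for shots in scenes.values())
print("static peak for everything:", static_peak)        # 1000

# Dynamic (Dolby Vision / HDR10+ style): a peak value per scene, so the TV's
# tone mapping knows the moonlit alley never goes anywhere near 1,000 nits.
dynamic_peaks = {name: max(shots) for name, shots in scenes.items()}
print("per-scene peaks:", dynamic_peaks)
```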

HLG HDR Explained

HLG, or Hybrid Log-Gamma, was developed for use in broadcast television. A bunch of TVs support it, but in the United States there’s still very little content broadcast with HLG. DirecTV uses it on its 4K channels. It’s backwards compatible, so if your TV doesn’t support it, you’ll still receive the signal in SDR.

Because current TVs can’t actually display the full capabilities of HDR, they use tone mapping. Basically, this takes the HDR brightness and color information and adjusts it to fall within the constraints of the TV. Some TVs do this better than others.
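
As a rough picture of what tone mapping does, here’s a minimal sketch that passes through everything the TV can comfortably show and rolls the rest off toward the panel’s peak brightness instead of hard-clipping it. Actual TVs use far more sophisticated curves, and that’s where the differences between them come from.

```python
# Minimal tone-mapping sketch: content mastered brighter than the TV's peak
# gets rolled off toward that peak instead of hard-clipping. Real TVs use much
# more sophisticated curves; this just shows the basic idea.

def tone_map(nits, display_peak, knee=0.75):
    """Map a content brightness (in nits) into what the display can show."""
    knee_point = display_peak * knee
    if nits <= knee_point:
        return nits                      # dark and midtone detail passes through
    # Compress everything above the knee into the remaining headroom.
    excess = nits - knee_point
    headroom = display_peak - knee_point
    return knee_point + headroom * (excess / (excess + headroom))

# 4,000-nit mastered highlights shown on a 700-nit TV.
for nits in [100, 500, 1000, 4000]:
    print(f"{nits:>5} nits in content -> {tone_map(nits, display_peak=700):.0f} nits on screen")
# 100 -> 100, 500 -> 500, 1000 -> 653, 4000 -> 692: highlights are squeezed
# toward the panel's limit rather than all clipping to 700.
```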

HGiG

The difference in tone mapping capabilities can cause issues for gamers competing, or just casually playing, on different TVs. Over the past year or so there have been more murmurings about HGiG, the HDR Gaming Interest Group, with HGiG settings being implemented on displays such as LG OLEDs. The group of companies that make up HGiG – including Microsoft, Sony, Vizio, LG, TCL, Panasonic, Warner Bros. Interactive, EA, and Activision – has proposed guidelines to help optimize game HDR performance across platforms and displays. To properly utilize those guidelines, every part of the chain needs to support it – TVs, consoles, and games. While there is support from some TVs and the next-gen consoles, game support is currently sparse.
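
The basic idea, sketched below (this is an illustration of the concept, not the actual HGiG guidelines), is that the game tone maps its own output to the peak brightness found during the console’s HDR calibration, and a TV set to its HGiG option then shows that signal as-is instead of running its own tone mapping over it a second time.

```python
# Sketch of why HGiG settings exist (illustrative, not the actual guidelines).
# The game already tone maps to the peak brightness found during the console's
# HDR calibration; an HGiG-aware TV then shows that signal as-is instead of
# running its own tone mapping over it a second time.

def game_tone_map(nits, calibrated_peak):
    return min(nits, calibrated_peak)      # the game handles the roll-off

def tv_tone_map(nits, tv_peak):
    # Crude stand-in for a TV's own tone-mapping curve.
    return nits * 0.85 if nits > tv_peak * 0.7 else nits

calibrated_peak = tv_peak = 700
highlight = 1200                                      # what the game wants to show

signal = game_tone_map(highlight, calibrated_peak)    # 700 nits out of the console
print("HGiG on: ", signal)                            # 700 - mapped exactly once
print("HGiG off:", tv_tone_map(signal, tv_peak))      # 595.0 - dimmed a second time
```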

Entering the Age of HDMI 2.1

As gamers, we’ve all been awaiting the arrival of HDMI 2.1. Now with the PS5, Xbox Series X/S, high-end graphics cards, and a smattering of TVs, it’s here. An enormous benefit of HDMI 2.1 over previous specifications is its increase in bandwidth, up to 48Gbps from the 18Gbps of HDMI 2.0. More bandwidth means higher resolution (up to 10K), higher frame rate (up to 120Hz), or a combination of the two. But there are other parts to the specification that could be included in a television’s HDMI 2.1 port.
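
For a sense of why that bandwidth jump matters, here’s a quick back-of-the-envelope calculation of the raw video data rate for a few modes, counting active pixels only. Real HDMI links also carry blanking intervals and encoding overhead, so the true requirements are somewhat higher, but the rough proportions hold.

```python
# Back-of-the-envelope video data rates (active pixels only; real HDMI links
# also carry blanking intervals and encoding overhead, so actual requirements
# are somewhat higher than these raw numbers).

def raw_gbps(width, height, fps, bits_per_component, components=3):
    bits_per_second = width * height * fps * bits_per_component * components
    return bits_per_second / 1e9

modes = {
    "4K60, 8-bit":   (3840, 2160, 60, 8),
    "4K120, 10-bit": (3840, 2160, 120, 10),
    "8K60, 10-bit":  (7680, 4320, 60, 10),
}

for name, (w, h, fps, bits) in modes.items():
    print(f"{name}: ~{raw_gbps(w, h, fps, bits):.1f} Gbps raw")
# 4K60 8-bit (~11.9 Gbps) squeezes into HDMI 2.0's 18 Gbps; 4K120 10-bit
# (~29.9 Gbps) and 8K60 (~59.7 Gbps, which needs compression) are HDMI 2.1 territory.
```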

HDMI VRR

Variable Refresh Rate, or VRR, isn’t something new to gaming. We’ve seen it on computers in the form of G-Sync and FreeSync for years, and even the Xbox One added support for FreeSync midway through its lifespan. It lets the display match its refresh rate to the frame rate the console or PC is actually delivering, which eliminates screen tearing and smooths out judder when frame rates fluctuate. It’s important to understand that, while G-Sync and FreeSync can still be supported on a TV, the VRR we’re talking about here is HDMI VRR, the version built into the HDMI 2.1 specification.

What is ALLM?

Auto Low Latency Mode, or ALLM, is a nice quality-of-life upgrade that causes your TV to automatically switch to its best pre-determined gaming settings when it detects a game source. The nice thing about this is that a TV’s game mode turns off all the extra processing the TV usually does constantly (although I suggest you disable much of it anyway), which lowers input lag.

HDMI ARC vs HDMI eARC

ARC/eARC, or the Audio Return Channel, is only useful to those who are using external speakers instead of a TV’s speakers (please, let that be most of you). It sends audio from your TV to your AVR or soundbar without the need for a separate audio-only cable, like an optical cable. It can also turn your TV on when a source powers up, or change your AVR to the proper source input. eARC, or enhanced Audio Return Channel, is a better version of traditional ARC. It can pass higher-quality audio streams – including uncompressed 7.1, Dolby Atmos (technically possible with ARC), and DTS:X – and adds lip sync correction.

Just because a TV has HDMI 2.1 doesn’t mean that all of its HDMI ports are 2.1, or that VRR, ALLM, and eARC are supported on every HDMI input. For instance, a TV I’m currently testing only supports eARC on HDMI 3 and VRR/ALLM on HDMI 4. It’s a frustrating situation that will hopefully be rectified by future TVs.

Don’t Forget About the ATSC 3.0 Tuner

At CES 2020 we began to see TVs with ATSC 3.0, known as NextGen TV, and as the year rolled on they were released to the masses. It’s a built-in tuner for over-the-air broadcasts capable of 4K, HDR, and high frame rates. All you need is an antenna. If you only get your content through streaming apps or cable subscriptions, then NextGen TV won’t matter to you. But it does have a significant benefit over Netflix, Disney+, or DirecTV. It’s free.
