Tom explores the history and evolution of the HDMI interface and the features it may, or may not, support.
Featuring Tom Merritt.
A special thanks to all our supporters–without you, none of this would be possible.
Thanks to Kevin MacLeod of Incompetech.com for the theme music.
Thanks to Garrett Weinzierl for the logo!
Thanks to our mods, Kylde, Jack_Shid, KAPT_Kipper, and scottierowland on the subreddit.
Send us email at feedback@dailytechnewsshow.com
Episode Script
I have an HDMI cable but is it the right one?
What the heck is HDMI 2.1?
Do I need new cables?
Confused?
Don’t be. Let’s help you Know A Little More about HDMI 2.1.
Let’s start with HDMI itself: the High-Definition Multimedia Interface.
It transmits uncompressed video and either compressed or uncompressed audio to a compatible device.
In other words, it’s digital video.
It’s a proprietary system, but it’s an implementation of the EIA/CEA-861 standard. That’s why the old DVI standard is interoperable with HDMI; it’s just that DVI doesn’t carry audio.
In fact, before HDMI most systems carried audio and video separately. You may remember the old yellow composite video plug that dangled alongside the red and white stereo RCA plugs, for instance.
HDMI added audio so you could use just one cable. And it also supports Consumer Electronics Control, or CEC, which lets devices control each other over the cable. Hence your TV can turn itself on when the Roku tells it to, for example.
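If you’re curious what that looks like from code, here’s a minimal sketch using the third-party python-cec bindings. It assumes your machine is connected over HDMI through a CEC-capable adapter, like a Raspberry Pi or a Pulse-Eight USB adapter; it’s an illustration, not the only way to do this.

```python
# A minimal CEC sketch, assuming the third-party python-cec
# bindings (pip install cec) and a CEC-capable HDMI adapter,
# such as a Raspberry Pi or a Pulse-Eight USB adapter.
import cec

cec.init()                         # open the default CEC adapter
tv = cec.Device(cec.CECDEVICE_TV)  # the TV sits at logical address 0

tv.power_on()  # the same kind of "wake up" message a Roku sends
tv.standby()   # and the "go to sleep" equivalent
```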
There have been seven versions of HDMI: 1.0, 1.1, 1.2, 1.3, 1.4, 2.0 and, most recently, 2.1, which we’re going to talk about in depth later. They all use the same form factor, so they all look essentially the same and can work in the same ports. But the newer versions have added improved capacity and performance, higher resolution and color space support, and advanced features like 3D and an Ethernet data channel.
And HDMI isn’t a public standard, but it’s also not controlled by just one company. The founders were Hitachi, Panasonic, Philips, Silicon Image, Sony, Thomson (the parent of RCA), and Toshiba. Development of HDMI began April 16, 2002, and products with HDMI started shipping in late 2003. Control was further diversified when the HDMI Forum was founded on October 25, 2011. It is an open organization that manages HDMI development and includes 83 members. More than 1,700 companies have adopted HDMI.
Oh, and HDMI integrates the HDCP copy protection developed by Intel, which is now administered by an Intel subsidiary, Digital Content Protection LLC.
There’s a rabbit hole we could go down of varying types of connectors and cables but we’ll try to keep it simple.
There are technically five types of connectors. Type B, a dual-link design, was never used in products because the maximum bandwidth of HDMI 1.3 and later makes the “dual” link unnecessary.
Type C is the Mini connector, a smaller version meant for mobile devices; you’ll see it on cameras. There’s also Type D, an even smaller version that looks like micro-USB. And Type E is for automotive systems, with a lock to stop it from vibrating loose and some protection against moisture and dirt.
But almost all the cables you’ll see are Type A. That’s the one you most likely see on your TV and desktop monitor.
Those are the connectors, the ports and the ends of the cable.
There are also different kinds of cables. Before HDMI 2.1 there were basically three groups: Standard, for up to 1080i; High Speed, for up to 4K; and Premium High Speed, for up to 18 gigabits per second with HDR. All of those can come with or without Ethernet support. There’s also a standard automotive HDMI cable. And along with HDMI 2.1 there’s now an Ultra High Speed HDMI cable, good for 8K and even 10K at 120 hertz.
Keep in mind the cables don’t match up with the HDMI versions. Cable type only affects bandwidth, and therefore the maximum resolution and refresh rate. The ports determine whether you can do things like 3D, variable refresh rate, and so on. So any cable will work if it fits the connector, but it may not enable all the features of the device. And older cables can sometimes still handle newer features.
Yes, that’s confusing. Your port is HDMI 1.4, but there’s no such thing as an HDMI 1.4 cable. The cable just has a bandwidth capacity, right? And cables generally don’t label whether they are High Speed, Premium High Speed, and so on. So the second-best method I’ve found to figure out what cables I have is to plug them in and see if the features work right. Which is annoying. The best method, and the most wasteful, is to responsibly recycle all your current cables and buy new ones you’re sure are the right ones.
But how do you even buy the right ones? You can pretty much ignore everything the description says except the bandwidth. If it says 18 gigabits per second or higher, you’re good to go for 4K 60 hertz with HDR. If you’re sure you have an HDMI 2.1 port on your TV and you want to do 4K at 120 fps, look for the cables that say 48 Gbps. Older cables can still do 4K, just not at 120 fps.
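If it helps to see that as a simple lookup, here’s a rough sketch in Python. The bandwidth ratings are the published figures for each cable certification tier; the helper function itself is just for illustration, not any official API.

```python
# Published bandwidth ratings for the HDMI cable certification tiers.
# pick_cable() is an illustrative helper, not an official API.
CABLE_TIERS = [
    ("Standard",           4.95),  # enough for 720p / 1080i
    ("High Speed",         10.2),  # enough for 4K at 30 Hz
    ("Premium High Speed", 18.0),  # 4K at 60 Hz with HDR
    ("Ultra High Speed",   48.0),  # 4K at 120 Hz, 8K, 10K
]

def pick_cable(required_gbps: float) -> str:
    """Return the slowest certified tier that covers the required rate."""
    for name, rated_gbps in CABLE_TIERS:
        if required_gbps <= rated_gbps:
            return name
    return "no certified HDMI cable is fast enough"

print(pick_cable(18))  # 4K 60 Hz with HDR -> Premium High Speed
print(pick_cable(40))  # 4K at 120 fps     -> Ultra High Speed
```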
Also, try to buy only certified cables, since companies that certify their cables can be punished for lying about what the cables can do. And keep in mind that certified cables cannot market themselves as HDMI 2.1 or any other HDMI version number, so if you see that, be suspicious. There’s an official app for verifying genuine certified cables on both iOS and Android.
Which leads us to HDMI 2.1. Do you need a new TV or monitor that supports it?
Here’s what it means.
HDMI 2.1 increases bandwidth support from 18 to 48 gigabits per second. It can support resolutions up to 10K and frame rates up to 120 frames per second. It supports BT.2020 and up to 16 bits per color for wide color gamut.
Remember, each feature increases bandwidth. If you go from 1080p to 4K, you’re sending more data. If you increase from 30 fps to 60 fps, you’re also increasing the data. Do both 4K and 60 fps and you’ve increased the data a lot more. Add in data for HDR and wide color gamut and you see what I mean. HDMI 2.1 has higher bandwidth, so it can do more things at once.
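To put rough numbers on that, here’s a back-of-the-envelope calculation in Python. It counts raw active pixels only; a real HDMI link also carries blanking intervals and encoding overhead, so actual link rates run noticeably higher. But you can already see why 18 gigabits per second covers 4K 60 with HDR while 4K 120 needs a bigger pipe.

```python
# Back-of-the-envelope data rate for uncompressed video:
# width x height x frames per second x bits per pixel.
# Raw active pixels only; real links add blanking and encoding
# overhead, so actual rates are noticeably higher than these.

def raw_gbps(width: int, height: int, fps: int, bits_per_pixel: int) -> float:
    return width * height * fps * bits_per_pixel / 1e9

print(raw_gbps(1920, 1080, 30, 24))   # 1080p 30 fps, 8-bit: ~1.5 Gbps
print(raw_gbps(3840, 2160, 30, 24))   # 4K has 4x the pixels: ~6.0 Gbps
print(raw_gbps(3840, 2160, 60, 24))   # double the frame rate: ~11.9 Gbps
print(raw_gbps(3840, 2160, 60, 30))   # 10-bit color for HDR: ~14.9 Gbps
print(raw_gbps(3840, 2160, 120, 30))  # 4K 120 fps with HDR: ~29.9 Gbps
```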
But if you were paying attention, you’ll remember me saying that the cables are identified by bandwidth, while the connectors are the ones with the HDMI 2.1 designation, because they also add features.
HDMI 2.1 connectors can support the following features.
eARC, or Enhanced Audio Return Channel. ARC lets audio go two ways over HDMI without needing a second audio cable. It can do stereo and compressed 5.1 surround sound. eARC, however, can also do uncompressed 5.1 and 7.1 surround sound, as well as object-based audio like Dolby Atmos. And since audio doesn’t use as much bandwidth, eARC just needs the devices to have HDMI 2.1; an older High Speed cable with Ethernet can carry it.
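A quick back-of-the-envelope number shows why audio fits so easily: even fully uncompressed 7.1 lands right around eARC’s actual ceiling of about 37 megabits per second.

```python
# Even uncompressed 7.1 surround is tiny next to video:
# 8 channels of 24-bit PCM sampled at 192 kHz.
channels, sample_rate_hz, bits_per_sample = 8, 192_000, 24
audio_mbps = channels * sample_rate_hz * bits_per_sample / 1e6
print(f"{audio_mbps:.1f} Mbps")  # ~36.9 Mbps, versus 18,000+ Mbps for video
```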
Next feature, Dynamic HDR. Yes. I know it technically means Dynamic High Dynamic Range. Twice the dynamism!
HDR is metadata that tells a device like a TV how to treat an entire video file or stream. One set of instructions that it applies to every frame to improve the color range.
Dynamic HDR, on the other hand, can set new rules for each frame, though in practice it usually sets them by scene. So HDR treats a person under a bridge in broad daylight the same as a campfire at night. Dynamic HDR will have different rules for each, to make each scene pop even more.
HDMI 2.0 offered partial support for Dynamic HDR at 4K 60 fps, but HDMI 2.1 offers Dynamic HDR up to 4K at 120 fps.
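If it helps to picture the difference, here’s a purely illustrative sketch. The field names are invented for the example; they don’t come from any HDR spec.

```python
# Purely illustrative: static HDR ships one set of instructions for
# the whole stream; dynamic HDR can re-describe every scene.
# These field names are made up, not taken from any spec.
static_hdr = {"peak_brightness_nits": 1000, "applies_to": "entire stream"}

dynamic_hdr = [
    {"scene": "under a bridge in broad daylight", "peak_brightness_nits": 1000},
    {"scene": "campfire at night",                "peak_brightness_nits": 300},
]
```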
And finally, Variable Refresh Rate, or VRR. This reduces tearing, that thing where you see part of one frame and part of another at the same time and the scene is all jagged for a split second. That’s usually an issue in games, because the game console isn’t creating frames at exactly the frame rate the TV expects. ’Cause it’s not a pre-recorded video, it’s a dynamic game that you’re affecting with your amazing game-playing skills.
Variable Refresh Rate lets the console vary the rate at which it creates frames and send them when they’re done. This reduces image artifacts like tearing. If you’re familiar with Nvidia G-Sync or AMD FreeSync, it’s similar; both of those were originally designed around DisplayPort. But newer graphics cards like the GeForce RTX 3080 and Radeon RX 6800 XT support Game Mode VRR over HDMI. And it also works on older cables, as long as you have HDMI 2.1 devices on both ends.
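Here’s a toy simulation of why a fixed refresh rate leads to tearing. The frame times are made up, and real games and displays are more complicated, but the shape of the problem is the same.

```python
import random

REFRESH_MS = 1000 / 60  # a fixed 60 Hz display redraws every ~16.7 ms

def frame_time_ms() -> float:
    # A game's render time varies with scene complexity (made-up range).
    return random.uniform(12.0, 22.0)

# Count frames that aren't finished when the fixed redraw starts;
# those get shown half-old, half-new, which is the tear.
torn = sum(1 for _ in range(1000) if frame_time_ms() > REFRESH_MS)
print(f"Fixed 60 Hz: about {torn} of 1000 frames miss the redraw")
# With VRR the display waits for each finished frame instead, so no tear.
```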
A companion feature of HDMI 2.1 is Quick Frame Transport, or QFT, which shortens the delay in sending active video, so there’s less time between you pressing the button and the laser hitting the alien’s face.
There’s also Auto Low Latency Mode, or ALLM, which can automatically enable and disable game mode when you’re playing a game rather than making you dig through the display’s menu. And there’s Quick Media Switching, or QMS, which eliminates the black screen when a source switches video formats, like moving from 24 fps to 60 fps.
Now one last thing to keep in mind. Just because HDMI 2.1 CAN support a feature doesn’t mean the device with the HDMI 2.1 certified connector supports it. To get certified as HDMI 2.1 you just have to support one of the features. One. The connector just makes those features possible. The device still has to implement it. So, for example, it’s possible to get a TV with HDMI 2.1 that doesn’t support eARC because the TV maker didn’t implement it. Or it might do eARC but not high frame rates. And a TV might support HDMI 2.1 and have five HDMI ports, but only one of them is HDMI 2.1.
It’s technically possible to add some of these features in a firmware upgrade, but almost nobody expects device makers to do that.
I say this not to confuse you but to remind you that HDMI 2.1 is not a shortcut for “has all the features.” When you’re buying the device make sure it supports the features you want.
So if you want 4K at 120 frames per second, especially if you have a PlayStation 5 or Xbox Series X. If you want Dynamic HDR. Or if for some reason you really want 8K at 60 fps. Then yeah, HDMI 2.1 might be for you.
In other words, I hope you know a little more about HDMI 2.1.