HDMI 2.0 is a key component of today’s computers and TVs. Here we go through what the technology actually is and when you really need it.
High-Definition Multimedia Interface (HDMI) is one of several standards for transmitting image and sound over a cable. The first version of HDMI was released at the end of 2002 and has been developed and refined considerably over the years. The basic idea is still the same: sending picture and sound over a single cable. The amount of data the interface can handle has, however, increased significantly, and so have the image- and sound-related functions that are supported. From 2015 onwards, HDMI 2.0 is the version to look for.
HDMI 2.0, the latest version, brings several improvements over previous versions that matter on today’s TV and monitor market. For users of, for example, 4K Ultra HD resolution (3,840 x 2,160 pixels), HDMI 2.0 is almost a must.
When do you need HDMI 2.0 and what’s new?
HDMI is currently used on everything from graphics cards, monitors and TVs to other home electronics. The general rule of thumb is that HDMI 2.0 is required when you want to use 4K resolution at 60 Hz (frames per second) and HDR.
HDMI 2.0 can transfer larger amounts of data over the same physical cables as HDMI 1.4. With HDMI 1.4, 4K resolution was locked to 30 hertz (a maximum of 30 frames per second), simply because no more data could be sent over the interface. With HDMI 2.0, the increased bandwidth raises the refresh rate at 4K to 60 hertz, which is critical not least for computer users and gamers. The higher frame rate gives much smoother motion on screen and a more immersive experience. It is noticeable already when you move the mouse on the desktop, and not least in fast-paced action games.
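As a back-of-the-envelope illustration of why the extra bandwidth matters, the sketch below assumes the nominal link rates of 10.2 Gbit/s for HDMI 1.4 and 18 Gbit/s for HDMI 2.0, plus the TMDS 8b/10b coding that leaves 80% of the link usable for video, and compares them against the raw pixel data rate of 4K at 30 and 60 Hz:

```python
# Back-of-the-envelope check (nominal figures assumed): why 4K at 60 Hz
# needs HDMI 2.0. HDMI carries 10 bits on the wire per 8 bits of video
# (TMDS 8b/10b), so roughly 80% of the link rate is usable for pixels.

LINK_RATE_GBPS = {"HDMI 1.4": 10.2, "HDMI 2.0": 18.0}  # total TMDS bandwidth
TMDS_EFFICIENCY = 8 / 10

def video_rate_gbps(width, height, hz, bits_per_pixel=24):
    """Raw data rate of the active pixels, ignoring blanking intervals."""
    return width * height * hz * bits_per_pixel / 1e9

for hz in (30, 60):
    need = video_rate_gbps(3840, 2160, hz)
    for version, link in LINK_RATE_GBPS.items():
        usable = link * TMDS_EFFICIENCY
        verdict = "fits" if need <= usable else "does NOT fit"
        print(f"4K @ {hz} Hz needs {need:.1f} Gbit/s -> "
              f"{verdict} in {version} ({usable:.1f} Gbit/s usable)")
```

The ~11.9 Gbit/s that 4K at 60 Hz requires clears HDMI 2.0's usable rate but not HDMI 1.4's, which is exactly why the older version was locked to 30 hertz.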
Another exciting addition in HDMI 2.0 is High Dynamic Range (HDR), which enhances the contrast between dark and bright parts of the picture, resulting in more realistic images. The feature requires a monitor or TV with HDR support. According to independent studies, HDR delivers a larger perceived image improvement than 4K resolution, and it is used today in everything from film and TV to video games.
HDMI 2.0 also adds support for delivering two video streams to the same screen, and up to four audio streams can be sent to the same sink. Together, these features make it possible for two people to watch different programs at the same time on the same TV.
Better colors and more audio channels
Compared to HDMI 1.4, you are also rewarded with a much larger color palette thanks to a wider color gamut, where support for Rec. 2020 delivers better image quality with less banding and other compression artifacts. There is also official support for the cinematic 21:9 aspect ratio.
The increased bandwidth also allows up to 32 separate audio channels at 48 kHz each, up from 8 audio channels in the previous version of HDMI. HDMI 2.0 also supports Dolby Atmos, with separate height channels for a greater sound experience.
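For scale, even this expanded audio side barely dents the available bandwidth. A quick calculation (assuming 24-bit samples, a common depth used here purely for illustration) shows that the full 32 channels need only tens of megabits per second:

```python
# Rough audio bandwidth for HDMI 2.0's maximum channel count.
# The 24-bit sample depth is an assumption for illustration.
channels = 32
sample_rate_hz = 48_000
bits_per_sample = 24

audio_mbps = channels * sample_rate_hz * bits_per_sample / 1e6
print(f"{audio_mbps:.1f} Mbit/s")  # well under 1% of an 18 Gbit/s link
```

In other words, the jump from 8 to 32 channels costs almost nothing next to the video stream.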
Graphics card support and the future
To utilize the latest version of HDMI, both your picture source (graphics card, Blu-ray player, game console, etc.) and your display (monitor, TV, projector and others) must support the technology. For most products this is relatively easy to find out, but for graphics cards it can be trickier.
Nvidia graphics cards have supported HDMI 2.0 since the 900 series launched in 2014, that is, graphics cards based on the Maxwell architecture such as the GTX 970 and GTX 980.
AMD took longer to implement HDMI 2.0 on their graphics cards. Not until the RX series launched in 2016 did Radeon graphics cards gain support for the latest HDMI standard.
What do you do if your screen or TV has HDMI 2.0 support but your graphics card does not?
If your screen or TV supports HDMI 2.0 but your graphics card does not, there is no cause for panic. The HDMI connector has not physically changed, so graphics cards with support for, for example, HDMI 1.4 are compatible with HDMI 2.0 displays. However, you lose the newer functions such as 4K at 60 Hz and are instead limited to 4K at 30 Hz.
Since no physical changes have been made to the HDMI connectors, you do not have to throw away your old picture cables; they still work with any HDMI port. Existing HDMI High Speed cables are fully compatible with HDMI 2.0 and its increased bandwidth requirements.
If you use a screen or TV with HDMI 2.0 support but a graphics card without it, you can rely on DisplayPort (DP) instead. If your graphics card has a DP 1.2 or newer output, you can use an adapter between it and the HDMI 2.0 input on your monitor. DisplayPort 1.2 also supports 4K resolution at 60 Hz and can, with a matching adapter, feed your HDMI 2.0 TV with the right signal.
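The rules from the last few paragraphs can be sketched as a small, hypothetical helper (the function name and the lookup table are illustrative, not from any real API): the link runs at the capability of the weaker end, and a DP 1.2 output with an adapter can stand in for an HDMI 2.0 source.

```python
# Illustrative sketch of the compatibility rules described above.
# Maximum 4K refresh rate each connection type can deliver:
MAX_4K_HZ = {"HDMI 1.4": 30, "HDMI 2.0": 60, "DP 1.2 + adapter": 60}

def max_4k_refresh(source, display):
    """The weaker end of the link decides the highest usable 4K refresh rate."""
    return min(MAX_4K_HZ[source], MAX_4K_HZ[display])

print(max_4k_refresh("HDMI 1.4", "HDMI 2.0"))          # old card limits to 30 Hz
print(max_4k_refresh("DP 1.2 + adapter", "HDMI 2.0"))  # full 60 Hz via the adapter
```

The key design point is the `min`: upgrading only one end of the chain never raises the ceiling, which is why the adapter route matters for older cards.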
The future and Thunderbolt / USB Type-C
The HDMI interface has several competitors on the market today; on the PC side, we have long alternated between HDMI and DisplayPort. Two other competing interfaces are Thunderbolt and USB Type-C. Both are used increasingly in computers, not least by Apple in their MacBook family. Both can use the small, reversible Type-C connector and transmit large amounts of data as well as power between devices. At the time of writing, Thunderbolt and USB Type-C are more of a complement to the HDMI interface, but in some cases they can completely replace HDMI and other cables. Generally, the HDMI standard and its bulkier cables are mostly used in home electronics. This is probably a trend that will continue, with formats such as DisplayPort and Thunderbolt / USB Type-C taking over more of the PC market.
HDMI 2.1, expected to be the next HDMI standard, has not yet been finalized and has no firm launch date at the time of writing.