RTX Video HDR: Nvidia’s AI gives ordinary web videos HDR look

Artificial intelligence automatically generates HDR images from SDR sources in web browsers

Last year, Nvidia introduced a feature called RTX Video Super Resolution, which uses the GPU to upscale and enhance web video with a DLSS 1.0-like filter utilising artificial intelligence (you can use this upscaler in VLC Media Player as well). This technology has now been extended with RTX Video HDR, again an AI filter, which recreates (simulates) an HDR component for an ordinary video, adding high dynamic range visuals.

HDR, or high dynamic range imaging, means that a game, or in this case a video (since RTX Video HDR is a feature designed to post-process video content), can display a wider range of both color hues and brightness than is normally possible in a computer image. Normally, when you’re working on a PC, you get content in the sRGB color gamut with a certain standard brightness range (typically up to 400 cd/m², though you’ll probably have your LCD’s brightness turned down lower unless you’re working on a laptop outside in the sun).

The high dynamic range means that more saturated colors that lie outside the sRGB space can be displayed, as well as higher brightness (up to 1000 cd/m² or even more). For example, a shot of the sky, sun, fire, or other luminous objects can locally display more intense brightness that more closely matches the real-world experience, whereas in standard (SDR) display, even the most blindingly bright objects get clamped to the same normal brightness range.
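The clamping described above can be sketched in a few lines. This is a toy illustration with assumed peak-brightness numbers, not any real tone-mapping standard:

```python
# Toy illustration: in SDR, scene luminances above the display's peak
# all collapse to the same maximum, so a 5000-nit sun and an 800-nit
# cloud edge can become indistinguishable, while an HDR display with
# a higher peak keeps them apart. Peak values are assumptions.

SDR_PEAK_NITS = 400   # assumed SDR monitor peak brightness
HDR_PEAK_NITS = 1000  # typical HDR10 display peak

def clamp_to_display(nits, peak):
    """Clamp a scene luminance (cd/m²) to what the display can show."""
    return min(nits, peak)

scene = {"sun": 5000.0, "cloud_edge": 800.0, "grass": 120.0}

sdr = {k: clamp_to_display(v, SDR_PEAK_NITS) for k, v in scene.items()}
hdr = {k: clamp_to_display(v, HDR_PEAK_NITS) for k, v in scene.items()}

# In SDR the sun and the bright cloud edge collapse to the same value,
# while HDR still separates them.
```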

Most web video, and generally most video you’ll see anywhere, uses just regular SDR color and brightness. Historically, HDR is a relatively recent thing, most material has been shot (or rendered, in the case of computer animation) without HDR information, and at the same time most monitors and TVs can only display SDR images, so there’s little incentive for any sort of full transition to HDR video.

RTX Video HDR

However, the Nvidia RTX Video HDR feature allows you to play any video material with automatic enhancement that simulates a real HDR image, or more accurately “estimates” it from the SDR content. As is usually the case with Nvidia, a pre-trained neural network (AI) is used to do this, trained to generate HDR images from SDR images. Nvidia presumably did this in a similar way to how it trained DLSS: it started with a corpus of HDR videos that were converted to SDR (i.e., stripped of the extra color gamut and brightness beyond the SDR/sRGB color space).

This created a corpus of image pairs, where for each SDR image you know what its HDR version should look like (because you have the original saved). Using these pairs, you then gradually train the AI to generate an image with HDR enhancements from an input SDR image, meaning it expands the colour gamut and adds brightness in hopefully appropriate places. This “hallucination” (as it’s sometimes jokingly called) is then compared to the original and scored based on some similarity metric, and training proceeds on those scores so that the fabricated HDR images gradually come to more closely resemble the real HDR originals.
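The training loop described above can be caricatured in a few lines. This is a deliberately minimal sketch, where the “model” is a single gain parameter fitted by gradient descent against the known HDR originals; Nvidia’s actual network and loss are of course far more complex and not public:

```python
# Sketch of the SDR→HDR training idea: the "model" predicts
# hdr ≈ g * sdr with one gain parameter g, and training nudges g so
# predictions match the stored HDR originals (MSE as the "similarity
# score"). All numbers here are illustrative assumptions.

# Paired corpus: HDR originals and the SDR versions derived from them
# (here by a fixed 1/2.5 brightness reduction standing in for tone-mapping).
hdr_originals = [100.0, 400.0, 900.0]
sdr_inputs = [v / 2.5 for v in hdr_originals]

g = 1.0    # model parameter, naive initialization
lr = 1e-6  # learning rate

for step in range(2000):
    # Gradient of the MSE loss w.r.t. g:
    # d/dg mean((g*s - h)^2) = mean(2*s*(g*s - h))
    grad = sum(2 * s * (g * s - h)
               for s, h in zip(sdr_inputs, hdr_originals)) / len(sdr_inputs)
    g -= lr * grad

# g converges toward the true expansion factor 2.5
```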

This is probably best compared to the problem of artificially colorizing black-and-white photos and movies, although that is probably a much harder task than automatically estimating the HDR component of an SDR image: in the case of HDR enhancement, the input doesn’t lack nearly as much information, and the AI can probably exploit recognizing objects like clouds, the sun, or flames in the image more reliably than such recognition can be used for recolorizing.

The RTX Video HDR feature then applies this pre-trained neural network to the video you’re playing back, so you get an estimated HDR version of the video. The effect can be likened to increasing color saturation and contrast (and maximum brightness), but the adjustment should be adaptive and local.
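For intuition, the naive alternative to such adaptive processing would be one fixed saturation gain applied to every pixel, which can be sketched with Python’s standard colorsys module; RTX Video HDR’s adjustments, by contrast, are supposed to vary from region to region of the image:

```python
# A naive GLOBAL enhancement for comparison: one fixed saturation gain
# for every pixel. This is what RTX Video HDR is said NOT to be; its
# boosts are supposed to be adaptive and local (stronger on e.g. sky
# or flames, weaker elsewhere). The gain value is an arbitrary example.
import colorsys

def global_saturation_boost(rgb, gain=1.3):
    """Uniformly boost the saturation of one RGB pixel (floats in 0..1)."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return colorsys.hsv_to_rgb(h, min(s * gain, 1.0), v)

# A dull reddish pixel becomes more saturated: red stays, green/blue drop.
r, g, b = global_saturation_boost((0.5, 0.25, 0.25))
```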

Of course, it’s hard to tell how faithful and realistic Nvidia’s AI filter is without trying it out extensively. There is an inherent risk with such filters that they will oversaturate and overbrighten things that shouldn’t be as bright or saturated, and conversely miss some objects that should be bright or have strongly saturated colours. Even an “unfaithful” image can be attractive at first glance, because humans tend to prefer exaggerated contrast and colour saturation (similar to the case with artificial sharpening), but if the filter gets things wrong, the result will be an unnatural-looking scene.

Demo of the effect that RTX Video HDR can have (Source: Nvidia preview video)

Available in Chrome-like browsers, for all GeForce RTX cards

The RTX Video HDR feature should apparently work alongside RTX Video Super Resolution (AI upscaling), and is again available in Google Chrome-based web browsers (including MS Edge) on the Windows platform. It can be turned on in the same section of the Nvidia Control Panel, where a “High Dynamic Range” option has now been added. As a prerequisite for activating this HDR filter, it is apparently necessary to have upscaling turned on, so you must have “Super Resolution” enabled alongside this checkbox.

Enabling RTX Video HDR in Nvidia Control Panel

In order for the filter to do its job, you need an HDR monitor, more specifically a monitor that supports the HDR10 (or HDR10+) standard, since the output will be encoded in this format. HDR display mode must also be enabled in the operating system.

The filter uses acceleration on GeForce GPU tensor cores, so it is only available for owners of GeForce RTX 2000, 3000 and 4000 series cards (this time the support includes the oldest Turing generation).

RTX Video HDR cannot yet be used for local video playback. RTX Video Super Resolution has already been implemented in VLC (and also MPC-BE), but adding an HDR component probably needs some extra work within the player, whose rendering output module has to manage the HDR output, and that is not a trivial matter. But there’s probably a reasonable chance that this feature will eventually make its way into these classic video players as well, and not stay exclusive to YouTube-like services and other online content.

Source: Nvidia

English translation and edit by Jozef Dudáš

