RTX Video HDR: Nvidia’s AI gives ordinary web videos HDR look

Artificial intelligence automatically generates an HDR image from SDR sources in web browsers

Last year, Nvidia introduced a feature called RTX Video Super Resolution, which uses the GPU to upscale and enhance web video with a DLSS 1.0-like filter powered by artificial intelligence (you can use this upscaler in VLC Media Player as well). This technology has now been extended with RTX Video HDR, again an AI filter, this time one that recreates (simulates) an HDR component for ordinary video, adding high dynamic range visuals.

HDR, or high dynamic range imaging, means that a game, or in this case a video (since RTX Video HDR is a feature designed to postprocess video content), can display a wider range of both color hues and brightness than is normally possible in a computer image. Normally, when you’re working on a PC, you’ll get content with the sRGB color gamut and a certain standard brightness range (typically up to 400 cd/m², though you’ll probably have the brightness on your LCD turned down lower than that, unless you’re working on a laptop outside in the sun).

High dynamic range means that more saturated colors lying outside the sRGB space can be displayed, as well as higher brightness (up to 1000 cd/m² or even more). For example, a shot of the sky, the sun, fire, or other luminous objects can locally display more intense brightness that more closely matches the real-world experience, whereas in standard (SDR) display, even the most blindingly bright objects get clamped to the same normal range of brightness.
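To illustrate that clamping effect in plain numbers, here is a minimal sketch; the peak-brightness figures (roughly 400 cd/m² for a typical SDR monitor, 1000 cd/m² for an HDR10 display) and the scene values are assumptions used only for the example.

```python
import numpy as np

# Assumed peak brightness values, for illustration only (not measured figures):
SDR_PEAK_NITS = 400.0    # a typical SDR monitor
HDR_PEAK_NITS = 1000.0   # a common HDR10 target

# Scene luminance of a few objects in cd/m² (nits): shadow, skin, bright sky, sun glint
scene_nits = np.array([50.0, 250.0, 800.0, 4000.0])

sdr_shown = np.clip(scene_nits, 0.0, SDR_PEAK_NITS)  # everything above SDR peak becomes the same white
hdr_shown = np.clip(scene_nits, 0.0, HDR_PEAK_NITS)  # bright objects keep more of their relative intensity

print(sdr_shown)  # -> 50, 250, 400, 400: the bright sky and the sun end up equally bright
print(hdr_shown)  # -> 50, 250, 800, 1000: the sun still stands out against the sky
```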

Most web video, and generally most video you’ll see anywhere, is just regular SDR color and brightness. Historically, HDR is a relatively recent thing, and most material has been shot (or rendered, in the case of computer animation) without HDR information; at the same time, most monitors and TVs can only display SDR images, so there’s little incentive to start some sort of full transition to HDR video.

RTX Video HDR

However, the Nvidia RTX Video HDR feature allows you to play any video material with automatic enhancement that simulates a real HDR image, or more accurately “estimates” it from the SDR content. As is usually the case with Nvidia, a pre-trained neural network (AI) is used to do this, trained to generate HDR images from SDR images. Nvidia presumably did this in a similar way to how it trained DLSS: it started with a corpus of HDR videos that were converted to SDR (i.e., stripped of the extra color gamut and brightness beyond the SDR/sRGB color space).

This created a corpus of images where, for each SDR image, you know what its HDR version should look like (because you have the original saved). Using these pairs, you then gradually train the AI to generate an image with HDR enhancements from an input SDR image – which means the AI extends the colour gamut and adds brightness in hopefully appropriate places. This “hallucination” (as it’s sometimes jokingly called) is then compared to the original and scored (based on some similarity metric), and training continues based on those similarity scores, so that the fabricated HDR images gradually come to resemble the real HDR originals more closely.
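For readers who want to picture the training loop described above, here is a minimal sketch in PyTorch. The tiny network, the L1 similarity metric and the names used are placeholder assumptions for illustration only; Nvidia’s actual architecture and loss function are not public.

```python
import torch
import torch.nn as nn

class SdrToHdrNet(nn.Module):
    """Toy convolutional network standing in for the real (unknown) SDR-to-HDR model."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, sdr):
        return self.body(sdr)

def train_step(model, optimizer, sdr_batch, hdr_batch):
    """One step: the SDR batch was derived from the HDR batch (tone-mapped / gamut-clipped),
    so the ground-truth HDR frame is known for every SDR input."""
    optimizer.zero_grad()
    hdr_pred = model(sdr_batch)                        # the "hallucinated" HDR estimate
    loss = nn.functional.l1_loss(hdr_pred, hdr_batch)  # placeholder similarity metric
    loss.backward()
    optimizer.step()
    return loss.item()

model = SdrToHdrNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
# for sdr_batch, hdr_batch in paired_dataset: train_step(model, optimizer, sdr_batch, hdr_batch)
```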

This is probably best compared to the problem of artificially colorizing black-and-white photos and movies, although that is probably a much harder task than automatically estimating the HDR component for an SDR image – in the case of HDR enhancement, the input doesn’t lack nearly as much information, and the AI can probably exploit recognizing objects like clouds, the sun, or flames in the image more reliably than such recognition can be used for recolorizing.

The RTX Video HDR feature then applies this pre-trained neural network to the video you’re playing back, so you get an estimated HDR version of the video. The result can be likened to increasing color saturation and contrast (and maximum brightness), but the effect should be adaptive and local.
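Conceptually, such a filter sits between the video decoder and the display, processing one frame at a time on the GPU. The sketch below is purely illustrative: play(), present_hdr10() and the frame format are made up for the example (any SDR-to-HDR network, such as the toy one above, could be passed in as model), and this is not how Nvidia’s driver actually hooks into the browser.

```python
import torch

def present_hdr10(hdr_frame: torch.Tensor) -> None:
    """Placeholder for the renderer step that would encode the frame to HDR10 (PQ) and display it."""
    pass

@torch.no_grad()
def play(frames, model: torch.nn.Module) -> None:
    """frames: an iterable of 1x3xHxW SDR tensors with values in [0, 1]."""
    model = model.eval().cuda()
    for sdr_frame in frames:
        hdr_frame = model(sdr_frame.cuda())  # per-frame, local gamut/brightness expansion
        present_hdr10(hdr_frame.cpu())       # hand the estimated HDR frame to the display path
```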

Of course, it’s hard to tell how faithful and realistic Nvidia’s AI filter can be without trying it out extensively. There is a risk with such filters that they will oversaturate and overbrighten even things that shouldn’t be as bright or saturated, and conversely miss some objects that should be bright or have strongly saturated colours. Even an “unfaithful” image can be attractive at first glance, because humans tend to prefer exaggerated contrast and colour saturation (it’s similar to the case with artificial sharpening), but if the filter gets things wrong, the result will be an unnatural-looking scene.

Demo of the effect that RTX Video HDR can have (Source: Nvidia preview video)

Available in Chrome-like browsers, for all GeForce RTX cards

The RTX Video HDR feature should apparently work alongside RTX Video Super Resolution (AI upscaling), and is again available in Google Chrome-based web browsers (including MS Edge) on the Windows platform. It can be turned on in the same section of the Nvidia Control Panel, where a “High Dynamic Range” option has now been added. As a prerequisite for activating this HDR filter, it is apparently necessary to have upscaling turned on, so you must have “Super Resolution” enabled alongside this checkbox.

Enabling RTX Video HDR in Nvidia Control Panel

In order for the filter to do its job, you need to use an HDR monitor, more specifically a monitor that supports the HDR10 (or HDR10+) standard. The output will therefore be encoded in this format. HDR display mode must also be enabled in the operating system.

The filter uses acceleration on GeForce GPU tensor cores, so it is only available for owners of GeForce RTX 2000, 3000 and 4000 series cards (this time the support includes the oldest Turing generation).

RTX Video HDR cannot yet be used for local video content playback. RTX Video Super Resolution has already been implemented in VLC (and also MPC-BE), but adding an HDR component probably needs some extra work within the player, whose rendering output module has to manage the HDR output, and that is not a trivial matter. But there’s probably a reasonable chance that this feature will eventually make its way into these classic video players as well, rather than staying exclusive to YouTube-like services and other online content.

Source: Nvidia

English translation and edit by Jozef Dudáš

