Memory prices to double in Q1 2026 compared to year-end

Once again, we return to a topic that is probably as popular in the PC world as artificial intelligence (and is close buddies with it): memory prices. TrendForce has released its outlook on how DRAM prices—that is, system memory and graphics memory—are expected to develop in Q1 2026. Unfortunately, it appears that the explosive price surge is not abating, and the three months we are now entering could be the worst yet. Read more “Memory prices to double in Q1 2026 compared to year-end” »

Our visit at Nvidia: AI game assistants and G-Sync Pulsar LCDs

Yesterday, Nvidia hosted a Central European media briefing where it showcased its latest technological developments live. Unsurprisingly, the spotlight was on innovations unveiled or launched last month at CES 2026—namely the G-Sync Pulsar technology for monitors and DLSS 4.5, both of which we have already covered in separate articles—as well as a demo of the ACE technology, which aims to bring AI-driven assistants into games. Read more “Our visit at Nvidia: AI game assistants and G-Sync Pulsar LCDs” »

AMD AI Bundle: Ready-made AI tools integrated into Radeon drivers

These days, one could paraphrase an old joke from the communist era: artificial intelligence is everywhere, and you’re almost afraid to open a can of food in case you find it there as well. Now AI jumps out at you if you own a Radeon GPU. AMD has long been criticized for lagging behind Nvidia in artificial intelligence, and it is responding with a new “AI Bundle” that it is introducing into the graphics card drivers. Read more “AMD AI Bundle: Ready-made AI tools integrated into Radeon drivers” »

DLSS 4.5 brings 6× frame generation and better image quality

Last night, Nvidia held its keynote ahead of CES 2026, which—as usual—focused only on AI, robotics and the enterprise market rather than consumer PCs and devices. Even so, the company does have something new for gamers. It is introducing a new generation of its AI-based game rendering—DLSS 4.5. It places even greater emphasis on interpolated frame generation, but the improvements go beyond that as well. Read more “DLSS 4.5 brings 6× frame generation and better image quality” »

AI pays better: Micron kills Crucial, exits consumer memory & SSDs

The next casualty of the massive price hikes and memory shortages fuelled by the AI bubble is, ironically, an actual memory vendor. More precisely, a business that used to support them, but which now isn’t lucrative enough compared to the booming AI datacenter chip business. In a way, it’s symbolic of how the PC industry’s pursuit of AI profits is currently squeezing out “ordinary” everyday customers—both consumers and professionals. Read more “AI pays better: Micron kills Crucial, exits consumer memory & SSDs” »

AMD got massive GPU order from OpenAI. Had to pay for it dearly

In recent weeks, we’ve seen SoftBank, Nvidia, and the U.S. government pour money into Intel in exchange for a significant stake in the chip industry’s former hegemon. Now a similar scenario is playing out with AMD, albeit in a stranger form. The company has announced a contract with OpenAI that, on one hand, has AMD securing a larger slice of the AI market, but on the other, hands a substantial piece of AMD over to OpenAI. Read more “AMD got massive GPU order from OpenAI. Had to pay for it dearly” »

GeForce RTX 6090? Nvidia reveals first next generation Rubin GPU

You may have noticed that while Nvidia keeps its gaming GPU roadmap under wraps and avoids talking about it ahead of launches, it does the opposite with AI server GPUs. Those are often unveiled up to a year before release—Grace CPUs were announced two years early. Now these two approaches may have converged. Despite Blackwell only having debuted this year, Nvidia has already announced the first GPU of the follow-up architecture, Rubin. Read more “GeForce RTX 6090? Nvidia reveals first next generation Rubin GPU” »

Cooperative Vectors in DirectX to use Blackwell Neural Shaders

Nvidia recently talked about new features for GeForce graphics cards – primarily the RTX Remix modding platform leaving beta and the first games using Nvidia ACE. The company has another announcement: Neural Shaders, one of the architectural innovations in Blackwell GPUs, will be coming to DirectX. Microsoft is adding a Cooperative Vectors function to this API, which GeForce RTX 5000 series cards will support precisely through their Neural Shaders. Read more “Cooperative Vectors in DirectX to use Blackwell Neural Shaders” »

Better, more capable than expected: RDNA 4 architecture deep dive

Earlier unofficial leaks initially didn’t paint the RDNA 4 architecture as a major new design, suggesting it was more akin to an RDNA 3 bugfix – except for the new ray tracing units. But it turns out that was a big misconception, as RDNA 4 is a significant upgrade that leaves no GPU subsystem untouched, going far beyond just adding new ray tracing units. It also brings enhanced AI acceleration and redesigned compute units (shaders). Read more “Better, more capable than expected: RDNA 4 architecture deep dive” »

Nvidia boosts RTX Video Super Resolution performance, adds HDR

When Nvidia unveiled the GeForce RTX 5000 graphics cards in January, various new features were presented (though not all of them are exclusive to these new GPUs), most notably DLSS 4, which can generate more interpolated frames. We’ve devoted a separate article to Blackwell’s features, but now that the GPUs have started selling (albeit in limited quantities), we see that there are some additional new features that have flown under the radar. Read more “Nvidia boosts RTX Video Super Resolution performance, adds HDR” »

Blackwell: GeForce RTX 5000 architecture and innovations [Analysis]

Nvidia’s new graphics cards – the GeForce RTX 5090 and RTX 5080 – won’t be out until the 30th, but the NDA is over and the first reviews of the top-of-the-line RTX 5090, which we also tested, are out. In this article, we take a look at the Blackwell architecture that powers these new GPUs and its new features and functions: DLSS 4, the compute unit architecture and features of the GPUs, as well as the software side of this new generation. Read more “Blackwell: GeForce RTX 5000 architecture and innovations [Analysis]” »

GeForce RTX 5090 gets Chinese D version, no performance reduction

A year ago, due to US sanctions aimed at limiting China’s access to powerful AI acceleration, Nvidia began selling the cut-down RTX 4090D instead of the GeForce RTX 4090 in that market, because the high-end gaming GPU already exceeded the imposed performance limits. As expected, the new Blackwell generation will face the same problem, and Nvidia is preparing a special RTX 5090D Chinese SKU, but reportedly with full performance. Read more “GeForce RTX 5090 gets Chinese D version, no performance reduction” »

UDNA: Next-gen architecture will unite AMD’s gaming and AI GPUs

The next-generation AMD GPUs with the RDNA 4 architecture should be coming soon. The company has now confirmed the rumors that high-end models will not be released in this generation and that it will cover only part of the performance and price range. But the company also discussed its long-term roadmaps for the Radeon and Instinct GPU architectures. It seems we could be in for a change as significant as the transition from GCN to RDNA. Read more “UDNA: Next-gen architecture will unite AMD’s gaming and AI GPUs” »

Mobile Zen 5 is here: Ryzen AI 300 “Strix Point” SoC detailed

The Ryzen AI 300 mobile CPUs with the Zen 5 architecture officially launched on Sunday. There’s a lot of news to go along with it: a third model has been added to form the top of the range, and we have learned various other architectural details of the laptop version of Zen 5 (and Zen 5c), including information about the implementation of AVX-512, which, as leaked before, will have lower performance than the fully 512-bit desktop Ryzen 9000. Read more “Mobile Zen 5 is here: Ryzen AI 300 “Strix Point” SoC detailed” »

Gigabyte SSD brings back SLC NAND, lasts 109,500 write cycles

The boom (or bubble?) around AI has brought many things, among them interesting news for those missing SSDs based on MLC and SLC NAND Flash, which was pricier but had better performance and, crucially, a much longer lifespan, so you didn’t have to worry about wearing out the SSD. Gigabyte is now launching an SSD that is officially designed for AI applications, but not just for them – its main asset is precisely SLC recording. Read more “Gigabyte SSD brings back SLC NAND, lasts 109,500 write cycles” »