
The Last of Us Part 2 demonstrates that even with 8 GB of VRAM, it’s possible to achieve 4K gaming at max settings, thanks to its efficient asset-streaming technique. So, why aren’t more games adopting this smart optimization strategy?

When Sony’s masterpiece The Last of Us Part 1 appeared on the humble PC two years ago, I hoped it would become a watershed moment in the history of console ports. Well, it was, but for all the wrong reasons—buggy and unstable, it hogged your CPU and GPU like nothing else, and most controversially of all, it tried to eat up way more VRAM than your graphics card has. It’s fair to say that TLOU1’s watershed moment cemented the whole ‘8 GB of VRAM isn’t enough’ debate.

Most of those issues were eventually resolved via a series of patches, but like so many big-budget, mega-graphics games, if you fire it up at 4K on Ultra settings, the game will happily try to use more VRAM than you actually have. The TLOU1 screenshot below, showing the built-in performance HUD, is from a test rig using an RTX 3060 Ti with 8 GB of VRAM; I've confirmed that memory usage figure with other tools, and the game is indeed trying to use around 10 GB of VRAM.

(Image credit: Sony Interactive Entertainment)
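For reference, here's the sort of quick cross-check you can run alongside a game on an Nvidia card like the 3060 Ti in that rig. It's a small sketch of my own (not one of the specific tools behind the figures above) that polls the GPU-wide memory counters through Nvidia's NVML library. It reports the whole card rather than a single process, so when a game over-commits you don't see a 10 GB number here; you see the 8 GB of VRAM pinned near full while the excess typically spills into shared system memory.

```cpp
// Minimal GPU-wide VRAM poll via Nvidia's NVML (link against nvml.lib or
// libnvidia-ml). This reports the whole card, not one game's allocations.
#include <nvml.h>
#include <cstdio>

int main() {
    if (nvmlInit() != NVML_SUCCESS) return 1;

    nvmlDevice_t gpu;
    if (nvmlDeviceGetHandleByIndex(0, &gpu) == NVML_SUCCESS) {
        char name[NVML_DEVICE_NAME_BUFFER_SIZE];
        nvmlDeviceGetName(gpu, name, sizeof(name));

        nvmlMemory_t mem;
        if (nvmlDeviceGetMemoryInfo(gpu, &mem) == NVML_SUCCESS) {
            printf("%s\n", name);
            printf("  VRAM used : %.2f GB\n", mem.used / 1073741824.0);
            printf("  VRAM total: %.2f GB\n", mem.total / 1073741824.0);
        }
    }

    nvmlShutdown();
    return 0;
}
```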

So when I began testing The Last of Us Part 2 Remastered a couple of weeks ago, the first thing I monitored, once I'd progressed far enough into the game, was the amount of graphics memory it was trying to allocate and how much it was actually using. To do this, I used Microsoft's PIX on Windows, a developer tool that lets you analyse in huge detail exactly what's going on under a game's hood in terms of threads, resources, and performance.
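PIX makes that allocated-versus-actually-used distinction very clear, and it's the same distinction a well-behaved streaming system has to respect at runtime. As a purely illustrative sketch (my own, and nothing to do with how Naughty Dog's engine is actually written), this is the kind of budget check a D3D12 game can make through IDXGIAdapter3::QueryVideoMemoryInfo before streaming in another batch of assets; the CanStreamIn helper and the 90% safety margin are hypothetical placeholders.

```cpp
#include <windows.h>
#include <dxgi1_4.h>

// Hypothetical helper: can we afford to upload another 'bytesNeeded' of
// assets without pushing this process's local VRAM usage past a safety
// margin below the budget Windows currently grants it? Staying under that
// line is what keeps an 8 GB card from demoting resources to system RAM
// and stuttering.
bool CanStreamIn(IDXGIAdapter3* adapter, UINT64 bytesNeeded,
                 double safetyMargin = 0.9)
{
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info)))
        return false; // play it safe if the query fails

    return info.CurrentUsage + bytesNeeded <
           static_cast<UINT64>(info.Budget * safetyMargin);
}
```

A real streamer would also register for budget-change notifications and drop lower-priority mips when the budget shrinks, but the principle is the same: allocate against the budget Windows reports, not against the sticker size of the card.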

