What sparked the discussion about the amount of VRAM in graphics cards?

The year 2023 has given us plenty of reasons to think about it. Earlier optimization failures, such as Arkham Knight or Fallout 76, were relatively rare and each sparked its own controversy; in 2023 the problem became systemic. The trend started in January with Forspoken, Square Enix's new AAA project. It turned out that the game, about an ordinary person transported into a fantasy world, struggled to run smoothly with less than 8 GB of video memory. Alex Battaglia of Digital Foundry spent two weeks testing it and still couldn't find workable graphics settings for cards with less than 8 GB of VRAM, which, according to Steam's hardware survey, is what the majority of PC players own.

History seems to be repeating itself with releases like Redfall, Gotham Knights, and the recent Starfield. These games are not exactly "next-gen," yet they place surprisingly heavy demands on the graphics card and its VRAM. Many gamers are puzzled: how come Cyberpunk 2077, released three years ago, looks better and gets by with 8 GB of VRAM? Where is the promised photorealism in Forspoken? And why does Gotham Knights look worse than Batman: Arkham Knight from eight years ago while struggling even on top-tier RTX 4080 and 4090 cards?

What is VRAM and what is it responsible for?

Due to a shortage of video memory, the 8 GB RTX 3070 was unable to load all the textures, unlike the formally weaker RTX 2080 Ti with its 11 GB of VRAM.

To simplify greatly, VRAM is specialized memory that acts as a kind of buffer holding the textures, shaders, geometry and other data the game needs in order to render what you see on screen. When there isn't enough of it, the impact on your gaming experience shows up in several key ways.

Poor performance and "slideshows": When the graphics card cannot fit all the necessary textures and data into video memory, performance drops. FPS falls to extremely low values and the game becomes practically unplayable. This is the most obvious symptom.

Stutters and uneven frame times: Insufficient video memory can also cause stutters and uneven frame times, making the in-game image jerk, especially in scenes with complex objects and constant player movement (a rough numeric illustration follows below). In the recent PC release of God of War, older 4 GB Radeon cards managed a stable 60 FPS at Full HD on medium settings, but combat revealed a noticeable deterioration: even with the counter showing 45-50 FPS, the poor frame times made the picture visibly jerky.

Reduced texture quality: When VRAM is low, the game may dynamically reduce texture quality to compensate. Textures might pop in right before your eyes (hello, potato-people in Cyberpunk 2077), load in low resolution, or start flickering.
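To make the frame-time point concrete, here is a tiny illustrative sketch (the numbers are invented for demonstration, not taken from any benchmark). The trace averages out to roughly 47 FPS, which looks acceptable on a counter, yet the two 65 ms hitches, the sort of pause that happens when textures have to be pulled in from system RAM, are exactly what the eye registers as stutter.

```cpp
// Illustrative only: a made-up frame-time trace where the average FPS looks
// fine, but isolated spikes (e.g. textures being swapped in over the bus)
// cause visible stutter.
#include <algorithm>
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    // Frame times in milliseconds: mostly ~16-17 ms, with two long hitches.
    std::vector<double> frame_ms = {16, 17, 16, 16, 17, 16, 65, 16, 17, 16,
                                    16, 17, 16, 65, 16, 17, 16, 16, 17, 16};

    double total_ms = std::accumulate(frame_ms.begin(), frame_ms.end(), 0.0);
    double avg_fps  = 1000.0 * frame_ms.size() / total_ms;
    double worst_ms = *std::max_element(frame_ms.begin(), frame_ms.end());

    std::cout << "Average FPS: " << avg_fps << "\n";   // ~47 FPS, looks playable
    std::cout << "Worst frame: " << worst_ms << " ms ("
              << 1000.0 / worst_ms << " FPS)\n";       // ~15 FPS, feels like a hitch
    return 0;
}
```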

Tests: 8 GB vs 16 GB in 2023 games

Radeon RX 6800 vs GeForce RTX 3070.

How much video memory (VRAM) you need depends on the specific games and the resolution you play at. If you mostly play less demanding titles, especially online games where graphics take a back seat, what matters is simply whether your card delivers comfortable FPS: games like Dota or CS:GO run smoothly even on integrated graphics such as Radeon Vega or Intel Iris, which borrow "shared memory" from system RAM. For the latest resource-hungry projects at high graphics settings, however, the amount of VRAM becomes crucial.
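If you are curious what your own card reports as dedicated VRAM versus the "shared memory" it can borrow from RAM, here is a minimal sketch using the Windows DXGI API; the choice of DXGI is simply an assumption for illustration, since Vulkan and other APIs expose similar queries.

```cpp
// Minimal sketch: enumerate GPUs via DXGI and print their memory figures.
// Windows only; build with MSVC and link against dxgi.lib.
#include <dxgi.h>
#include <iostream>

#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory),
                                 reinterpret_cast<void**>(&factory))))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        // DedicatedVideoMemory is the on-board VRAM; SharedSystemMemory is
        // RAM the GPU may borrow, which is what integrated graphics rely on.
        std::wcout << desc.Description << L": "
                   << desc.DedicatedVideoMemory / (1024ull * 1024ull)
                   << L" MB dedicated, "
                   << desc.SharedSystemMemory / (1024ull * 1024ull)
                   << L" MB shared\n";
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```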

An interesting example of how video memory affects a graphics card's performance was the much-discussed matchup of the popular GeForce RTX 3070 against the far less hyped Radeon RX 6800. At launch, both cards cost almost the same and produced almost identical results: in most tests the gap in FPS did not exceed 8%, and the only real difference was the amount of memory. AMD was generous with 16 GB of VRAM, while NVIDIA limited itself to 8 GB, a decision some experts criticized back at the announcement.

Official comparison of Radeon RX 6800 and GeForce RTX 3070 from AMD.

Two years later the critics have been proven right, and the two cards now sit in effectively different performance classes. In the PC port of The Last of Us Part I, the Radeon outperformed its competitor by nearly 50%. In Hogwarts Legacy, even after a major performance patch, the RX 6800 delivered nearly double the FPS of the RTX 3070 at 1440p; before the patch, the RTX 3070 was virtually unplayable.

Things get more intriguing in The Callisto Protocol at both Full HD and 2K, where the two cards perform similarly until ray tracing is enabled. At that point the GeForce RTX 3070 collapses to an average of 10 FPS with dips to 4 FPS, making gameplay a real challenge, while the Radeon RX 6800 with its 16 GB holds a reliable average of 56 FPS and never drops below 38 FPS even in busy scenes. The pattern repeats in A Plague Tale: Requiem, Resident Evil 4, and Marvel's Spider-Man: Miles Morales, where the NVIDIA card, which should have the advantage thanks to its hardware RT cores, falls behind instead.

What more can we say, when even Doom Eternal from 2020, run at 2K with Ultra Nightmare settings, brings any reasonably powerful 8 GB card to its knees. The same RTX 3070 suffocates at those settings, barely managing an average of 26 FPS, while the Radeon RX 6800 doesn't even break a sweat at nearly 100 FPS. That's almost a fourfold difference, Carl! (Sorry.)

So why have the requirements of modern games increased?

The most photorealistic stone in the history of video games, courtesy of Forspoken.

The surge in memory requirements can be attributed, on the one hand, to the spread of 2K and 4K monitors and, on the other, to slipping quality standards and the complacency of many AAA developers. In older games that were once considered visually impressive, developers routinely used tricks to keep the engine fast, such as reusing a small set of textures across many objects. Now a developer like Todd Howard might simply suggest upgrading your computer, insisting their game is perfectly optimized.

But seriously, modern high-budget games are increasingly chasing photorealism. Even a mundane stone in a forest may now carry several distinct high-quality textures to give it a unique appearance. When crafting lifelike landscapes, developers opt for 4K and even 8K textures so that details look natural even when the player examines an object up close. To grasp the scale: where a game model used to get by with around 5,000 polygons, Unreal Engine 5 with its Nanite virtualized geometry system treats figures from 500 thousand to 2 million polygons as the norm.
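To put rough numbers on the texture side, here is a back-of-the-envelope sketch; it deliberately ignores block compression and mipmap streaming, which real engines rely on heavily, so treat it as an upper bound on what a single uncompressed RGBA8 texture would occupy.

```cpp
// Back-of-the-envelope VRAM cost of one uncompressed RGBA8 texture.
// Real engines use block compression (BCn) and mipmap streaming, so actual
// numbers are several times lower, but the scaling with resolution holds.
#include <cstdint>
#include <iostream>

int main() {
    const std::uint64_t bytes_per_pixel = 4;                 // 8-bit R, G, B, A
    const std::uint64_t sides[] = {1024, 2048, 4096, 8192};  // 1K .. 8K

    for (std::uint64_t side : sides) {
        std::uint64_t bytes = side * side * bytes_per_pixel;
        std::cout << side << "x" << side << "  ->  "
                  << bytes / (1024.0 * 1024.0) << " MB\n";
    }
    // Output: 1K = 4 MB, 2K = 16 MB, 4K = 64 MB, 8K = 256 MB. Each step up in
    // resolution quadruples the footprint, and a scene uses hundreds of textures.
    return 0;
}
```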

Because the Texture Streaming Pool was overloaded, the polygons of this wall simply did not load, leaving it looking like something out of Minecraft or an old-school game from the late 90s.

Demand for VRAM has also surged because object geometry has become far more complex. Take a brick wall: viewed from an angle, the individual bricks are supposed to visibly protrude, so that it reads as more than a flat surface with a brick texture. In Unreal Engine 5 the engine itself takes over much of this texture and detail generation, which streamlines the artists' workflow but drives VRAM requirements up further. When VRAM runs out and the Texture Streaming Pool buffer is maxed out, textures simply stop loading, and objects are left as low-polygon shapes with missing textures.
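Geometry adds its own bill on top of the textures. The sketch below is a crude estimate built on illustrative assumptions (roughly one vertex per triangle and about 32 bytes per vertex for position, normal and UV), not any engine's real data layout, but it shows how the jump from an old-style 5,000-polygon prop to Nanite-scale meshes changes the memory picture.

```cpp
// Crude estimate of vertex-buffer size for meshes of different density.
// Assumes ~1 vertex per triangle and 32 bytes per vertex (position, normal,
// UV); real formats and index buffers change the exact numbers, not the trend.
#include <cstdint>
#include <iostream>

int main() {
    const std::uint64_t bytes_per_vertex = 32;
    const std::uint64_t triangle_counts[] = {5'000, 500'000, 2'000'000};

    for (std::uint64_t tris : triangle_counts) {
        std::uint64_t bytes = tris * bytes_per_vertex;
        std::cout << tris << " triangles  ->  ~"
                  << bytes / (1024.0 * 1024.0) << " MB of vertex data\n";
    }
    // ~0.15 MB for the old-style prop versus ~61 MB for a 2M-triangle mesh,
    // before any textures; systems like Nanite cluster and stream this data
    // precisely so it does not all have to sit in VRAM at once.
    return 0;
}
```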

Character creation has become more demanding too. In older games like GTA: Vice City, Tommy's entire model was covered by a single texture containing his trousers, Hawaiian shirt and sneakers. Compare that to Baldur's Gate 3 or The Callisto Protocol, with their realistic faces, naturalistic hair and almost lifelike facial animation.

So how much video memory do you need in 2023?

The influence of video memory in tests of Radeon graphics cards.

According to a developer working with Unreal Engine 5, the outlook for video memory is bleak, driven by the demands of next-gen gaming. Major studios are focused on the PlayStation 5 and Xbox Series X, consoles that give games roughly 12 GB of fast memory to work with along with hardware decompression and SSD streaming. Getting a game built around the PS5's 12 GB down to a PC with 8 GB of VRAM without sacrificing quality is a hard optimization task, and with development already long and expensive, studios rarely have the time for it. On top of that, the headline features of Unreal Engine 5, such as high-resolution textures and meshes, Nanite, and Lumen global illumination, carry a significant resource cost of their own, as the first UE5 projects have already shown.

Even if you have no plans to move to a higher resolution, lean heavily on ray tracing, or play exclusively on "ultra" settings, a graphics card with 16 GB of VRAM is the most sensible and future-proof choice. Games like Forspoken and Hogwarts Legacy already list 12 GB as the minimum for "Full HD + Ultra," and Resident Evil 4 Remake at those settings often refused to start on 8 GB systems and crashed regularly. The once-standard 8 GB may soon simply stop being enough, bringing constant slowdowns, freezes and crashes, along with an endless hunt for acceptable graphics settings.

As for a specific model, in our opinion the smarter move is to hunt for cards from two or three years ago at sensible prices: on the NVIDIA side, the 12 GB versions of the RTX 2060 and RTX 3060 or an RTX 2080 Ti with 11 GB; on the Radeon side, the RX 6800 mentioned above, as well as the lower-tier RX 6700 XT and 6750 XT.