Indiana Jones and the Great Circle is the latest game to expose the limitations of GPUs with less than 12GB of VRAM, particularly impacting cards like the RTX 4060 and RX 7600, which struggle to maintain smooth frame rates even at 1080p.
Following the trend set by The Last of Us Part I on PC, Indiana Jones demonstrates the growing need for larger VRAM buffers in modern AAA games. Benchmarks from ComputerBase reveal that the game requires more than 8GB of VRAM for a smooth experience at Full HD resolution with medium or higher graphics settings.
With "Hyper" settings and ray tracing enabled at 1080p, the RTX 4060 and RTX 3060 Ti (8GB) fail to average even 30 FPS, with 1% lows dipping below 20 FPS. The RX 7600 and RTX 4060 Ti (8GB) fare slightly better, averaging around 30 FPS, but still suffer from poor 1% lows.
Interestingly, the RX 6700 XT (12GB) significantly outperforms the RTX 3080 (10GB), despite having lower theoretical rasterization and ray tracing performance. This highlights the impact of limited VRAM. The Arc A770 (16GB) also performs better than the RTX 4060 Ti (8GB), further emphasizing the importance of VRAM capacity.
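To see why 8GB buffers fill up so quickly, a rough back-of-the-envelope estimate of texture memory helps. The figures below are illustrative assumptions (4K material maps, BC7-style compression at one byte per texel, a hypothetical count of resident materials), not measurements from the game's actual assets:

```python
# Back-of-the-envelope VRAM estimate for streamed textures.
# All numbers are illustrative assumptions, not the game's real asset sizes.

def texture_bytes(width, height, bytes_per_texel, mipmapped=True):
    """Size of one texture; a full mip chain adds roughly one third on top."""
    base = width * height * bytes_per_texel
    return int(base * 4 / 3) if mipmapped else base

# One 4K material with albedo/normal/roughness maps, block-compressed
# at ~1 byte per texel (BC7-class formats):
per_material = 3 * texture_bytes(4096, 4096, 1)

# Assume a scene keeps 100 unique materials resident at once:
total = 100 * per_material
print(f"{per_material / 2**20:.0f} MiB per material, "
      f"{total / 2**30:.1f} GiB for 100 materials")
```

Even with aggressive block compression, textures alone can consume over 6GiB here, before counting the framebuffer, geometry, and the BVH structures that ray tracing requires, which is why 8GB cards start swapping over PCIe and 1% lows collapse.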
As games continue to push graphical boundaries with higher fidelity textures and more demanding ray tracing effects, the performance gap between cards with less than 12GB of VRAM and those with 12GB or more is likely to widen. Hopefully, future generations of GPUs from NVIDIA and AMD will address this issue by offering at least 12GB of VRAM as a standard.