
Is it worth the upgrade from 48GB to 60GB VRAM?

Reddit r/LocalLLaMA / 3/28/2026

💬 Opinion · Ideas & Deep Analysis · Tools & Practical Usage

Key Points

  • The post asks whether upgrading from 48GB of VRAM (two RTX 3090s) to 60GB (by adding a 12GB RTX 3080) actually benefits any specific models or workloads.
  • It frames the decision as a tradeoff between the potential gains of the extra VRAM and the effort and complexity of adding a third GPU to the system.
  • The author is looking for concrete use cases where 60GB would meaningfully improve training/inference feasibility, speed, or context/model sizing compared to 48GB.
  • The discussion centers on practical, local AI/LLaMA-style GPU memory constraints rather than a new product release or benchmark results.

My system currently has two 3090s (48GB VRAM) and 128GB of system RAM. I have an extra 3080 12GB sitting around, and I'm wondering if there are any models or use cases where 60GB would be an improvement. My concern is that I don't want to go through the hassle of the hardware modifications required to add a third video card if there's no real use case at that memory level.

submitted by /u/CBHawk
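
To gauge where the extra 12GB could matter, a rough back-of-envelope estimate of inference memory (weights plus KV cache) is useful. The sketch below is illustrative, not from the post: the architecture numbers (80 layers, 8 KV heads via GQA, head dimension 128) assume a Llama-3-70B-class model, and the 4.5 bits/weight figure approximates a 4-bit quantization with overhead.

```python
# Back-of-envelope VRAM estimate for local inference: weights + KV cache.
# All numbers below are assumptions for illustration, not measurements.

def weights_gb(params_b: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB for `params_b` billion parameters."""
    return params_b * bits_per_weight / 8

def kv_cache_gb(layers: int, kv_heads: int, head_dim: int,
                context: int, bytes_per_elem: int = 2) -> float:
    """KV cache for one sequence: 2 tensors (K and V) per layer,
    each kv_heads * head_dim wide, one entry per context token."""
    return 2 * layers * kv_heads * head_dim * context * bytes_per_elem / 1e9

# Hypothetical 70B-class model at ~4-bit quantization (4.5 b/w with overhead);
# assumed GQA architecture: 80 layers, 8 KV heads, head_dim 128.
w = weights_gb(70, 4.5)              # ≈ 39.4 GB of weights
kv = kv_cache_gb(80, 8, 128, 32768)  # ≈ 10.7 GB at 32k context, fp16 cache
print(f"weights ≈ {w:.1f} GB, kv ≈ {kv:.1f} GB, total ≈ {w + kv:.1f} GB")
```

Under these assumptions the total lands near 50GB: over the 48GB of two 3090s, but inside 60GB. That is roughly the class of workload (a 4-bit 70B model at long context) where a third card could earn its slot. The caveat this estimate does not capture is speed: splitting layers onto a slower 12GB 3080 typically drags down tokens per second for the layers it hosts.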
