Best VRAM Per Dollar
For AI workloads, VRAM capacity matters more than raw speed. This leaderboard ranks every GPU by how much VRAM you get per dollar spent — the most important metric for budget-conscious AI builders.
Based on current street prices, not MSRP. The best value right now is the NVIDIA Tesla P40 at $13/GB.
How to Read This
$/GB — how much each gigabyte of VRAM costs you. Lower is better. A card at $12.50/GB gives you twice the VRAM capacity per dollar vs one at $25/GB.
BW/$ — memory bandwidth per dollar (GB/s per $). Higher means faster AI inference per dollar spent. Important for tokens-per-second performance, not just fitting models.
Why VRAM/$ matters — if a model doesn't fit in VRAM, it spills to system RAM which is 10-50x slower. Buying more VRAM per dollar means running larger, smarter models at usable speeds.
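Both metrics are simple ratios, so they're easy to compute yourself from whatever street prices you find. A minimal sketch, using placeholder prices for illustration (the Tesla P40 price below is chosen to match the $13/GB figure above; the others are hypothetical, not live market data):

```python
# Each entry: (name, street price in $, VRAM in GB, memory bandwidth in GB/s)
# Prices are illustrative assumptions, not current listings.
cards = [
    ("Tesla P40",     312, 24, 347),
    ("RTX 3060 12GB", 250, 12, 360),
    ("RTX 3090 24GB", 750, 24, 936),
]

for name, price, vram_gb, bandwidth in cards:
    cost_per_gb = price / vram_gb      # $/GB  -- lower is better
    bw_per_dollar = bandwidth / price  # GB/s per $ -- higher is better
    print(f"{name}: ${cost_per_gb:.2f}/GB, {bw_per_dollar:.3f} GB/s per $")

# Rank by $/GB ascending, i.e. best VRAM value first
ranked = sorted(cards, key=lambda c: c[1] / c[2])
print("Best $/GB:", ranked[0][0])
```

Note the two metrics can disagree: the P40 wins on $/GB here, while the 3090's much higher bandwidth gives it the better GB/s-per-dollar figure, which is why the leaderboard shows both columns.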
Key Takeaways
- Used data center cards (Tesla P40, A100) often dominate $/GB rankings — but they lack display output and need server chassis
- The RTX 3060 12GB and RTX 3090 24GB are the best consumer $/GB picks on the used market
- AMD cards offer good $/GB but lack CUDA; most AI tooling assumes it, so they're mainly a gaming pick unless you're willing to work with ROCm
- New-gen cards (RTX 50 series) have worse $/GB than used cards, but better bandwidth and features