NVIDIA A100 80GB

NVIDIA · Data Center

~$8,000 used · $15,000 MSRP

The NVIDIA A100 80GB is the data center GPU that powered the AI revolution. With 80GB of HBM2e memory at over 2 TB/s of bandwidth, it runs nearly any consumer LLM without quantization, including 70B models at 8-bit precision (70B weights at full FP16 are roughly 140 GB, which takes two NVLinked cards). Originally $15,000+, used A100s are now available for around $8,000. They require a server chassis or a PCIe adapter and have no display output. For AI builders with the budget and technical skill, a used A100 offers unmatched VRAM capacity.
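As a sanity check on which precisions fit in 80GB, here is a rough weight-footprint estimate. This is a sketch with a hypothetical helper name; it counts weight bytes only and ignores KV cache, activations, and framework overhead, which add several more GB in practice.

```python
def weight_vram_gb(params_billions: float, bits_per_param: int) -> float:
    """VRAM needed for model weights alone, in GB (1 GB = 1e9 bytes)."""
    return params_billions * 1e9 * bits_per_param / 8 / 1e9

# 70B at FP16 needs ~140 GB of weights -- more than one 80GB A100.
print(weight_vram_gb(70, 16))  # 140.0

# 70B at 8-bit is ~70 GB (fits on one card); at 4-bit (Q4), ~35 GB.
print(weight_vram_gb(70, 8))   # 70.0
print(weight_vram_gb(70, 4))   # 35.0
```

By the same arithmetic, anything up to roughly 35B parameters fits on a single A100 at full FP16 with headroom for the KV cache.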

Best For: Running the largest AI models with zero compromises on quality
Verdict: The ultimate AI GPU if you can handle the server-grade form factor.

AI: 10/10
Gaming: 1/10

Specifications

VRAM: 80GB HBM2e
Memory Bandwidth: 2,039 GB/s
CUDA Cores: 6,912
Boost Clock: 1,410 MHz
TDP: 300W
Power Connector: 1× 8-pin
Length: 267mm
Form Factor: Dual Slot
Release Year: 2021
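That memory bandwidth figure matters because single-stream LLM generation is roughly bandwidth-bound: every generated token streams all of the model's weights from HBM once. A back-of-envelope ceiling, using the spec-table bandwidth and an assumed ~40 GB for a Q4-quantized 70B model:

```python
def max_tokens_per_sec(bandwidth_gb_s: float, weight_bytes_gb: float) -> float:
    """Bandwidth-bound upper limit on decode speed: one full weight
    read per generated token. Real throughput is lower (KV cache reads,
    kernel overhead), but this sets the ceiling."""
    return bandwidth_gb_s / weight_bytes_gb

# A100 80GB: 2,039 GB/s; Q4 70B weights assumed ~40 GB.
print(round(max_tokens_per_sec(2039, 40), 1))  # 51.0 tokens/s ceiling
```

This is why HBM cards like the A100 decode large models far faster than consumer GPUs with similar compute but a third of the bandwidth.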

AI Capabilities

Unrivaled: 80GB VRAM

Run 70B+ models, no compromises. The AI power user's dream.

Can run (Q4 quantized)

Llama 3.1 70B, Llama 3.1 8B, Qwen 2.5 72B, Qwen 2.5 32B, Qwen 2.5 14B, Mistral 7B, DeepSeek R1 70B, FLUX.1 Dev, Stable Diffusion XL, Stable Diffusion 3.5 Large, HunyuanVideo, CogVideoX-5B, Mochi 1, LTX Video, Stable Video Diffusion, Wan Video 14B, Codestral 22B, Qwen 2.5 Coder 32B, LLaVA 1.6 34B, AlphaFold 2, ESMFold (ESM-2 15B), ESM-2 3B, scGPT, RFdiffusion, Fine-tune Llama 8B, Fine-tune Llama 70B, Train SDXL LoRA, Train FLUX LoRA

Recommended system RAM for AI: 160GB+ (2x GPU VRAM for model overflow)
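The 160GB figure follows the site's 2× rule of thumb: host RAM at twice total GPU VRAM, so checkpoints can stage in system memory before loading onto the card. A minimal sketch of that rule (helper name is hypothetical):

```python
def recommended_ram_gb(gpu_vram_gb: int, num_gpus: int = 1) -> int:
    """System RAM sizing rule of thumb: 2x total GPU VRAM, so full
    model checkpoints can be staged in host memory during loading."""
    return 2 * gpu_vram_gb * num_gpus

print(recommended_ram_gb(80))     # 160 -- one A100 80GB
print(recommended_ram_gb(80, 2))  # 320 -- an NVLinked pair
```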

Pros

  • 80GB of HBM2e: runs nearly any consumer model unquantized
  • Over 2 TB/s memory bandwidth
  • NVLink for multi-GPU scaling
  • Available used for ~$8,000

Cons

  • No display output on most models
  • Requires a server chassis or PCIe adapter
  • No gaming drivers
  • High used-market prices
