LLM VRAM Calculator | GPU Requirements Tool
Calculate GPU requirements for LLM deployment. Estimate VRAM needs, memory bandwidth, and performance for running LLMs locally or in production. Free, open-source LLM VRAM calculator for DeepSeek, Llama, and Qwen models.
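As a rough illustration of the kind of estimate such a calculator produces, the sketch below sums weight memory and KV-cache memory and adds a flat overhead factor. The interface, function names, example model figures, and the 10% overhead are illustrative assumptions, not the tool's actual formulas.

```ts
// Minimal sketch of a first-order VRAM estimate (illustrative, not the
// calculator's real implementation).

interface ModelSpec {
  paramsBillion: number; // total parameter count, in billions
  numLayers: number;     // transformer layers
  numKvHeads: number;    // KV heads (grouped-query attention aware)
  headDim: number;       // dimension per attention head
}

// Bytes per weight for common quantization levels.
const BYTES_PER_PARAM = { fp16: 2, int8: 1, int4: 0.5 } as const;

function estimateVramGiB(
  model: ModelSpec,
  quant: keyof typeof BYTES_PER_PARAM,
  contextLen: number,
  batchSize: number,
  kvBytesPerElem = 2, // fp16 KV cache
): number {
  const GiB = 1024 ** 3;

  // Weights: parameter count times bytes per parameter.
  const weightBytes = model.paramsBillion * 1e9 * BYTES_PER_PARAM[quant];

  // KV cache: 2 (K and V) * layers * kv_heads * head_dim * tokens * batch * bytes.
  const kvBytes =
    2 * model.numLayers * model.numKvHeads * model.headDim *
    contextLen * batchSize * kvBytesPerElem;

  // Flat ~10% overhead for activations, runtime context, fragmentation (assumption).
  return ((weightBytes + kvBytes) * 1.1) / GiB;
}

// Example: an 8B-class model, 4-bit weights, 8k context, batch size 1.
const model8b: ModelSpec = { paramsBillion: 8, numLayers: 32, numKvHeads: 8, headDim: 128 };
console.log(estimateVramGiB(model8b, "int4", 8192, 1).toFixed(1), "GiB");
```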
Model
Workload
Preset
Hardware (optional)
Results
Calculations are physics-based estimates. See our Data Inclusion Policy for details.
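One common reading of "physics-based" is a memory-bandwidth bound on decode speed: each generated token requires reading the model weights (and the KV cache) from VRAM, so tokens per second cannot exceed bandwidth divided by bytes moved per token. The sketch below is an assumption about that bound, not necessarily this tool's exact method.

```ts
// Rough upper bound on decode throughput from memory bandwidth alone
// (illustrative assumption, not the calculator's exact formula).
function maxTokensPerSecond(
  weightBytes: number,     // bytes of model weights resident in VRAM
  kvBytesPerToken: number, // KV-cache bytes read per generated token
  memBandwidthGBps: number // GPU memory bandwidth, GB/s
): number {
  const bytesPerToken = weightBytes + kvBytesPerToken;
  return (memBandwidthGBps * 1e9) / bytesPerToken;
}

// Example: 4 GB of 4-bit weights plus ~1 GB of KV reads per token
// on a GPU with ~1000 GB/s of bandwidth.
console.log(maxTokensPerSecond(4e9, 1e9, 1000).toFixed(0), "tok/s upper bound");
```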