
DeepSeek Models: Reasoning AI You Can Run Locally

DeepSeek made headlines with R1, a reasoning model whose chain-of-thought performance rivals GPT-4-class systems. While the full 671B model needs server-class hardware, distilled versions from 7B to 70B bring strong reasoning and coding to your local machine. The DeepSeek R1 distills are among the best choices if you need chain-of-thought problem solving on a Mac.

Developer

DeepSeek AI

Models

4

Size Range

7B – 671B

RAM Range

10 – 400 GB

Key Features

Best-in-class reasoning with R1 models
Strong coding performance
Mixture-of-Experts architecture (V3)
Distilled versions run locally on modest hardware

All DeepSeek Models

Model                          Size   Quant    VRAM     Min RAM  Best For            Quality
DeepSeek-R1 Distill Qwen 7B    7B     Q4_K_M   5.5 GB   10 GB    Reasoning, Coding   85
DeepSeek-R1 Distill Qwen 14B   14B    Q4_K_M   11 GB    22 GB    Reasoning, Quality  92
DeepSeek-R1 Distill Llama 70B  70B    Q4_K_M   42 GB    48 GB    Reasoning, Quality  97
DeepSeek-R1 671B               671B   Q4_K_M   380 GB   400 GB   Reasoning, Coding   100

Device Compatibility

Which DeepSeek models can run on each device class, based on minimum RAM requirements.

Model                                iPhone    Air       Pro        Studio     Mini
DeepSeek-R1 Distill Qwen 7B (7B)     Possible  Possible  Excellent  Excellent  Excellent
DeepSeek-R1 Distill Qwen 14B (14B)   No        Possible  Possible   Excellent  Possible
DeepSeek-R1 Distill Llama 70B (70B)  No        No        Possible   Possible   Possible
DeepSeek-R1 671B (671B)              No        No        No         Possible   No
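A compatibility check like the one in the table above can be sketched as a simple RAM-threshold rule. This is a minimal sketch, not the site's exact rubric: the 2x-headroom cutoff for "Excellent" and the device RAM figures passed in the examples are illustrative assumptions.

```python
# Sketch: classify whether a DeepSeek model fits a machine, based on the
# minimum RAM figures from the table above. The 2x-headroom rule for
# "Excellent" is an assumption, not the site's exact methodology.

MIN_RAM_GB = {
    "deepseek-r1-distill-qwen-7b": 10,
    "deepseek-r1-distill-qwen-14b": 22,
    "deepseek-r1-distill-llama-70b": 48,
    "deepseek-r1-671b": 400,
}

def compatibility(model: str, device_ram_gb: int) -> str:
    """Return 'Excellent', 'Possible', or 'No' for a model/device pair."""
    need = MIN_RAM_GB[model]
    if device_ram_gb >= 2 * need:
        return "Excellent"  # comfortable headroom for OS and context
    if device_ram_gb >= need:
        return "Possible"   # fits, but expect memory pressure
    return "No"

# Example: a hypothetical 36 GB machine runs the 7B distill comfortably.
print(compatibility("deepseek-r1-distill-qwen-7b", 36))   # Excellent
print(compatibility("deepseek-r1-distill-qwen-14b", 16))  # No
```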

RAM Requirements

DeepSeek-R1 Distill Qwen 7B: 5.5 GB (min 10 GB)
DeepSeek-R1 Distill Qwen 14B: 11 GB (min 22 GB)
DeepSeek-R1 Distill Llama 70B: 42 GB (min 48 GB)
DeepSeek-R1 671B: 380 GB (min 400 GB)

Frequently Asked Questions

Can I run DeepSeek R1 locally?
Yes. The distilled versions (7B, 14B, 32B, 70B) run locally with Ollama. The 7B distill needs 10GB RAM and is a great reasoning model for laptops.
What is the difference between DeepSeek R1 and V3?
R1 is a reasoning-focused model trained with reinforcement learning. V3 is a massive 671B MoE general model. For local use, R1 distilled models are the practical choice.
How much RAM does DeepSeek R1 need?
DeepSeek R1 7B distill needs 10GB RAM. The 14B distill needs 22GB, the 32B needs 24GB, and the 70B needs 48GB. All use Q4 quantization.
Is DeepSeek R1 good for coding?
Excellent. DeepSeek R1 distilled models show strong coding performance, especially at 14B and above. The chain-of-thought reasoning helps with complex debugging and architecture questions.
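Once a distill is pulled with Ollama (e.g. `ollama pull deepseek-r1:7b`), it can be queried through Ollama's local HTTP API. A minimal sketch, assuming the Ollama daemon is running on its default port (11434) and exposing the documented `/api/generate` endpoint:

```python
# Sketch: query a locally running DeepSeek R1 distill via Ollama's HTTP API.
# Assumes Ollama is installed, its daemon is running on the default port,
# and the model has been pulled with `ollama pull deepseek-r1:7b`.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Request body for Ollama's /api/generate endpoint (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    """Send a prompt and return the model's full response text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example usage (needs a live Ollama daemon, so not executed here):
#   print(ask("deepseek-r1:7b", "Why is the sky blue? Answer briefly."))
```

Note that R1 distills emit their chain-of-thought inside `<think>` tags before the final answer, so callers may want to strip that section from the response.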
