LLM Trust: AI Model Discovery Platform
DeepSeek V3
671B
DeepSeek's 671B-parameter Mixture-of-Experts (MoE) model, with 37B parameters active per token. It matches GPT-4o on many benchmarks.
open-source
moe
reasoning
code
128k-context
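The card above lists 671B total parameters but only 37B active; a minimal arithmetic sketch of what that MoE ratio means for per-token compute (figures taken from the description, nothing else assumed):

```python
# MoE models route each token through a subset of experts, so only a
# fraction of the total weights participate in any one forward pass.
# Figures from the card above: 671B total, 37B active per token.
TOTAL_PARAMS_B = 671
ACTIVE_PARAMS_B = 37

active_fraction = ACTIVE_PARAMS_B / TOTAL_PARAMS_B
print(f"Active per token: {active_fraction:.1%} of total parameters")
```

So each token touches roughly 5.5% of the weights, which is why a 671B MoE model can have per-token inference cost closer to a ~37B dense model.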