LLM Trust: AI Model Discovery Platform
DeepSeek Coder V2
A 236B-parameter Mixture-of-Experts (MoE) coding model that rivals GPT-4 Turbo on code tasks.
Tags: open-source, code, moe, coding-specialist, 128k-context