LLM TrustAI Model Discovery Platform

    SmolLM2 1.7B


Hugging Face's ultra-compact 1.7B-parameter model, best in class for its size.

open-source · tiny · efficient · edge · apache-2.0
    Architecture

    smollm

    Parameters

    1.7B

    Context Length

    8,192 tokens

    License

    Apache 2.0

    About SmolLM2 1.7B

SmolLM2 1.7B was trained on 11 trillion tokens of curated data. Despite its small size, it outperforms many larger models on a range of tasks and is well suited to resource-constrained environments such as edge and on-device deployment.
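When deploying in constrained environments, the 8,192-token context window listed above becomes the main prompt budget. A minimal sketch of a pre-flight budget check (the 4-characters-per-token ratio is a rough heuristic assumption for English text; exact counts require the model's own tokenizer):

```python
# Rough prompt-budget check against SmolLM2's 8,192-token context window.
# CHARS_PER_TOKEN is a heuristic assumption; real token counts come from
# the model's tokenizer, not from character length.
CONTEXT_LENGTH = 8_192
CHARS_PER_TOKEN = 4  # rough average for English text (assumption)

def fits_in_context(prompt: str, max_new_tokens: int = 512) -> bool:
    """Estimate whether prompt + generation budget fits the context window."""
    est_prompt_tokens = len(prompt) // CHARS_PER_TOKEN + 1
    return est_prompt_tokens + max_new_tokens <= CONTEXT_LENGTH

print(fits_in_context("Summarize this paragraph."))          # True
print(fits_in_context("x" * 40_000, max_new_tokens=2_000))   # False
```

A check like this is only a guard rail for batching and truncation decisions; for production use, tokenize the prompt with the model's tokenizer and compare the actual length.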

    Author

    Community

    Category

    text generation

    Downloads

    1,234,567
