LLM TrustAI Model Discovery Platform

    Falcon 180B

    180B

    TII's massive 180B model trained on 3.5T tokens of RefinedWeb data.

    Tags: open-source, massive, reasoning
    Architecture: falcon
    Parameters: 180B
    Context length: 2,048 tokens
    License: Falcon-180B TII License
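
A 2,048-token context length means any input longer than that must be shortened before inference. A minimal sketch of one common approach, a sliding window that keeps only the most recent tokens (the `fit_to_context` helper is hypothetical, for illustration; real usage would tokenize with the model's own tokenizer):

```python
# Minimal sketch: fit a token-id sequence into a 2,048-token context window
# by keeping only the most recent tokens (simple sliding window).
CONTEXT_LENGTH = 2048

def fit_to_context(token_ids: list[int], limit: int = CONTEXT_LENGTH) -> list[int]:
    """Return the sequence unchanged if it fits, else the last `limit` tokens."""
    if len(token_ids) <= limit:
        return token_ids
    return token_ids[-limit:]
```

Keeping the tail of the sequence preserves the most recent context, which is usually what matters for next-token prediction; other strategies (keeping the head, or summarizing dropped text) trade off differently.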

    About Falcon 180B

    Falcon 180B is one of the largest openly released language models. Trained primarily on web data (RefinedWeb), it performs well across general language tasks and reasoning benchmarks.
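
To get a sense of the scale, a back-of-envelope estimate of the memory needed just to hold 180B parameters at common precisions (weights only; this deliberately ignores activations, KV cache, and framework overhead, so real requirements are higher):

```python
# Back-of-envelope weight-memory estimate for a 180B-parameter model.
# Weights only: ignores activations, KV cache, and framework overhead.
PARAMS = 180e9  # 180 billion parameters

def weight_memory_gb(bytes_per_param: float) -> float:
    """Approximate weight storage in gigabytes (1 GB = 1e9 bytes)."""
    return PARAMS * bytes_per_param / 1e9

for label, bpp in [("fp16/bf16", 2.0), ("int8", 1.0), ("4-bit", 0.5)]:
    print(f"{label:>9}: ~{weight_memory_gb(bpp):,.0f} GB")
```

At fp16 this works out to roughly 360 GB of weights alone, which is why serving a model of this size typically requires multiple accelerators or aggressive quantization.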

    Author: Community
    Category: text generation
    Downloads: 432,109