Abacus.ai:

We recently released Smaug-72B-v0.1, which has taken first place on the Open LLM Leaderboard by HuggingFace. It is the first open-source model to achieve an average score above 80.

  • Infiltrated_ad8271@kbin.social · 9 months ago

    I tested it with a 16GB model and barely got 1 token per second. I don’t want to imagine what it would be like if that 16GB had to come from swap instead, let alone 130GB.
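
    For context on the sizes mentioned above, here is a back-of-envelope, weights-only estimate for a 72B-parameter model at common precisions. It assumes the 130GB figure refers to roughly the fp16 weight footprint, and it ignores KV cache and other runtime overhead, so treat the numbers as rough:

    ```python
    # Rough weights-only memory estimate for a ~72B-parameter model.
    # Ignores KV cache, activations, and framework overhead.

    PARAMS = 72e9  # approximate parameter count of a 72B model

    bytes_per_param = {
        "fp16 / bf16": 2.0,
        "8-bit quant": 1.0,
        "4-bit quant": 0.5,
    }

    for name, bpp in bytes_per_param.items():
        gib = PARAMS * bpp / 1024**3
        print(f"{name:>12}: ~{gib:.0f} GiB of weights")
    ```

    This prints roughly 134 GiB for fp16, 67 GiB for 8-bit, and 34 GiB for 4-bit, which lines up with the ~130GB figure for the unquantized weights and shows why even a heavily quantized 72B model won’t fit in 16GB of RAM without spilling to swap.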