Hey everyone, I’m interested in using a local LLM (large language model) on a Linux system to create a long story, but I’m not sure where to start. Does anyone have experience with this or know of any resources that could help me get started? I’d love to hear your tips and suggestions. Thanks!

  • Nyfure@kbin.social · 7 months ago

    How much time do you have? Because even small models will take a lot of time on that kind of hardware to spit out a long text…
    And the small models aren’t that great. I think the current best and most economical option would be a Mistral, Mixtral, or Dolphin model.
    If you’ve got the power, Nous-Capybara is very good and “only” 34B parameters (loading it alone needs around 40GB of memory).
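    If you want a concrete starting point, here’s a minimal sketch (my own, not something from this thread) using llama-cpp-python with a quantized GGUF model. The filename, context size, and thread count are placeholders you’d adjust for your hardware:

    ```python
    # A minimal sketch: generating a long story in chunks with llama-cpp-python
    # and a quantized GGUF model. MODEL_PATH, n_ctx, n_threads are placeholders.
    from llama_cpp import Llama

    MODEL_PATH = "mistral-7b-instruct.Q4_K_M.gguf"  # hypothetical file you downloaded

    llm = Llama(
        model_path=MODEL_PATH,
        n_ctx=8192,    # context window; a long story needs room for prior text
        n_threads=8,   # roughly the number of physical CPU cores you have
    )

    prompt = "Write the opening chapter of a long fantasy story about a wandering cartographer.\n\n"
    story = ""

    # Generate in chunks, feeding the story so far back in as context.
    # Note: once prompt + story exceeds n_ctx tokens, you would need to
    # truncate or summarize the earlier text before continuing.
    for _ in range(4):
        out = llm(prompt + story, max_tokens=512, temperature=0.8)
        story += out["choices"][0]["text"]

    print(story)
    ```

    On a CPU-only box this will be slow for the reasons above, so start with a small quantized model and only move up to something like a 34B model if you have the RAM and patience for it.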