• vale@sh.itjust.works · edit-2 · 1 month ago

    Take a look at Ollama.ai; just follow the installation instructions. A decent GPU is recommended, and the models are around 10 GB iirc.
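
    A minimal sketch of what that looks like on Linux, assuming you use Ollama's official install script and pull `llama2` as an example model (any model from their library works the same way):

    ```shell
    # Install Ollama via the official script (Linux; macOS has a downloadable app)
    curl -fsSL https://ollama.ai/install.sh | sh

    # Download a model (this is the multi-GB step)
    ollama pull llama2

    # Chat with it interactively in the terminal
    ollama run llama2
    ```

    Without a supported GPU it falls back to CPU, which works but is noticeably slower.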