(sorry if anyone got this post twice. I posted while Lemmy.World was down for maintenance, and it was acting weird, so I deleted and reposted)

  • PeterPoopshit@lemmy.world · 42 points · edited · 1 year ago

    Download and install llama.cpp from its GitHub repository, then go to huggingface.co and download one of the Wizard Vicuna Uncensored GGUF models. It’s the most obedient and loyal one and will never refuse even the most ridiculous request. Use the --threads option to specify more threads for higher speed. You’re welcome.
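    The steps above might look roughly like this; exact details vary by llama.cpp version (older builds produce a `./main` binary, newer ones `llama-cli`), and the model filename here is just a hypothetical example, not a specific file from the source:

    ```shell
    # Clone and build llama.cpp (assumes git and a C/C++ toolchain are installed)
    git clone https://github.com/ggerganov/llama.cpp
    cd llama.cpp
    make

    # Download a GGUF model from huggingface.co into ./models
    # (example filename below is illustrative; pick an actual Wizard Vicuna
    # Uncensored GGUF file from a model page on Hugging Face)

    # Run interactively, using more CPU threads for higher speed;
    # binary may be ./main or ./llama-cli depending on version
    ./main -m models/wizard-vicuna-uncensored.Q4_K_M.gguf --threads 8 -p "Hello"
    ```

    Rule of thumb: set `--threads` to around the number of physical CPU cores; going past that usually doesn’t help.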