this post was submitted on 08 Jun 2023
LocalLLaMA
Community to discuss LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
llama.cpp is crashy on my computer; it didn't even compile.
Huh, that's interesting. If llama.cpp doesn't work, try https://github.com/oobabooga/text-generation-webui, which (tries to) provide a more user-friendly experience.
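If the problem is specifically that the C++ build fails, the prebuilt llama-cpp-python wheel is another low-effort thing to try (text-generation-webui can use it under the hood for GGML models). A minimal sketch, assuming `pip install llama-cpp-python` succeeds; the model path below is just a placeholder for a GGML file you actually have:

```python
# Minimal sketch: load a GGML model through the llama-cpp-python bindings and
# run a short completion. The model path is a placeholder, not a real file.
from llama_cpp import Llama

llm = Llama(model_path="path/to/your-model.ggmlv3.q5_1.bin")
out = llm("Q: Why does the sun rise in the east? A:", max_tokens=64)
print(out["choices"][0]["text"])
```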
it launches just fine, but when loading a model it says something like "successfully loaded none"
Have you put your model in the "models" folder in the "text-generation-webui" folder? If you have, then navigate over to the "Model" section (button for the menu should be at the top of the page) and select your model using the box below the menu.
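If you want to double-check that the webui can actually see a file there, here is a quick sketch; it assumes the repo was cloned into the current directory, so the models live under `text-generation-webui/models` (adjust the path if yours differs):

```python
from pathlib import Path

# Assumed layout: repo cloned into the current directory, models under ./text-generation-webui/models
models_dir = Path("text-generation-webui") / "models"

if not models_dir.is_dir():
    print(f"{models_dir} does not exist - check where you cloned the repo.")
else:
    # Anything listed here should show up in the "Model" dropdown after a refresh.
    found = sorted(p for p in models_dir.rglob("*") if p.suffix in {".bin", ".ggml", ".safetensors"})
    if not found:
        print(f"No model files in {models_dir}, which is why the webui loads 'none'.")
    for p in found:
        print(f"{p.name}  ({p.stat().st_size / 1e9:.1f} GB)")
```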
I tried to download an example one, 'cause I don't have any models, but it failed.
I'd recommend the model Wizard-Vicuna-7B-Uncensored (I know, the name is practically a sentence: https://huggingface.co/TheBloke/Wizard-Vicuna-7B-Uncensored-GGML). The direct download link is here: https://huggingface.co/TheBloke/Wizard-Vicuna-7B-Uncensored-GGML/blob/main/Wizard-Vicuna-7B-Uncensored.ggmlv3.q5_1.bin
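If the in-browser download keeps failing, the huggingface_hub client can fetch the same file from a script instead; a minimal sketch, assuming `pip install huggingface_hub` and that the webui was cloned into ./text-generation-webui (adjust `local_dir` otherwise):

```python
from huggingface_hub import hf_hub_download

# repo_id and filename come straight from the links above; local_dir is an
# assumption about where text-generation-webui lives on disk.
path = hf_hub_download(
    repo_id="TheBloke/Wizard-Vicuna-7B-Uncensored-GGML",
    filename="Wizard-Vicuna-7B-Uncensored.ggmlv3.q5_1.bin",
    local_dir="text-generation-webui/models",
    local_dir_use_symlinks=False,  # copy the actual file instead of symlinking into the HF cache
)
print(f"Saved to {path}")
```

After that, the file should appear in the webui's "Model" dropdown the same as if it had been placed in the models folder by hand.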