SkySyrup

joined 1 year ago
[–] SkySyrup@sh.itjust.works 4 points 1 year ago (1 child)

Hi, thanks for replying to this post! I originally had a longer version of this, but I accidentally clicked on "cancel" instead of "reply" because I'm still getting used to this interface.

I understand where you're coming from. I'm in no way familiar with bird biodiversity, specifically in Germany where I live, but I'll try to present (what I think is) an argument against this:

We can't successfully keep the cats out of the garden long-term. We can do it for a few days, but then a neighbour lets them out or they lift up the catflap backwards or something else frees them. It's, unfortunately, a pointless venture.

There's also the issue of shrinking living options for the birds in our city's center ("Altstadt"). Homeowners have started putting up nets and barriers in places where birds have been nesting for literal centuries, and because of that, the birds are becoming increasingly desperate and coming to our garden.

We've decided to leave this the way it is for now because

  1. Most of the birds still make it. About 80% of the birds that don't fall out of the nest survive, and even the ones that do fall have a good chance.
  2. Helmut's bird count has stayed about the same for the past decade: it's always 1-2 birds a year.
  3. The birds simply have no other place to go. If we don't let the cats out, they find a way out anyway; if we don't let the birds in, 2-4 nests don't get built.

I understand if you don't agree. We've simply done what we think is best, and you're completely within your rights to disagree.

 

He's 15 years old now, and his ears really bother him, but he still brutally murders birds in our garden.

the fur on the sofa is from the other cats lol

[–] SkySyrup@sh.itjust.works 1 point 1 year ago (1 child)

I hope this model gets fine-tuned the way the original LLaMA did; fine-tuning really improves it!

[–] SkySyrup@sh.itjust.works 0 points 1 year ago (2 children)

Have you put your model in the "models" folder inside the "text-generation-webui" folder? If so, navigate to the "Model" section (the menu button should be at the top of the page) and select your model in the box below the menu.
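In case the expected layout is unclear, here's a minimal sketch; the folder name comes from the webui repo, but the model file name is just a placeholder for whatever model you downloaded:

```shell
# Illustrative only: create the models folder and drop your model file in it.
# "my-model.ggml.bin" is a made-up stand-in for your actual model file.
mkdir -p text-generation-webui/models
touch text-generation-webui/models/my-model.ggml.bin
ls text-generation-webui/models
```

After that, the model should show up in the webui's "Model" dropdown once you refresh it.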

[–] SkySyrup@sh.itjust.works 0 points 1 year ago (4 children)

Huh, that's interesting. If llama.cpp doesn't work, try https://github.com/oobabooga/text-generation-webui, which tries to provide a more user-friendly experience.

[–] SkySyrup@sh.itjust.works 1 point 1 year ago* (last edited 1 year ago)

Hi, I'm happy to see you're willing to give LLaMA a try! If you want GPU-accelerated processing, what you can do depends on your OS and hardware. If you have an Nvidia card, you can use cuBLAS; instructions here: https://github.com/ggerganov/llama.cpp#cublas . I don't have experience with other cards, but I'll try to help if issues arise!

Also, for more ease of use, try text-generation-webui (https://github.com/oobabooga/text-generation-webui). Well, ease of use until you want GPU acceleration, because then you'll need to look at https://github.com/oobabooga/text-generation-webui/blob/main/docs/llama.cpp-models.md#gpu-acceleration to set that up with LLaMA.

33B and 65B models seem to be the best for storytelling and writing.

 

This Community is new, but I plan to expand it and partially mirror posts from r/LocalLLaMA on Reddit.

[–] SkySyrup@sh.itjust.works 1 point 1 year ago* (last edited 1 year ago) (6 children)

Hi, sure, thank you so much for helping out! As for LLaMA, I would point you at llama.cpp (https://github.com/ggerganov/llama.cpp), which is the absolute bleeding edge but also has pretty useful instructions on its page (https://github.com/ggerganov/llama.cpp#usage). You could also use Kobold.cpp, but I don't have any experience with it, so I can't help you if you run into issues.
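For reference, a first run usually looks something like this (a sketch based on the llama.cpp usage docs linked above; the model file name and prompt are placeholders, and you have to supply your own model):

```shell
# Build llama.cpp, then run the main binary interactively:
#   -m  path to a quantized model file you downloaded yourself
#   -p  the starting prompt
#   -n  number of tokens to generate
make
./main -m ./models/ggml-model-q4_0.bin -p "Once upon a time" -n 256
```

The README covers more options (interactive mode, sampling parameters), so it's worth reading through once the basic run works.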

 

Hi, you've found this ~~subreddit~~ Community, welcome!

This Community is intended to be a replacement for r/LocalLLaMA, because I think we need to move beyond centralized Reddit in general (though the API changes are obviously a factor too).

I will moderate this Community for now, but if you want to help, you are very welcome, just contact me!

I will mirror or rewrite posts from r/LocalLLaMA for this Community for now, but maybe we could eventually all move to this Community (or any Community on Lemmy; seriously, I don't care about being mod or "owning" it).

[–] SkySyrup@sh.itjust.works 5 points 1 year ago

I've distro-hopped a LOT, but I always come back to Fedora because it's super stable, gives me no issues, and doesn't get in my way when I want to screw around.