this post was submitted on 30 Sep 2022
Self-hosting
Hmm, not sure how much can be done with these. Most ML stuff you can self-host requires CUDA or OpenCL, i.e. a GPU.
I am planning to set up a LibreTranslate instance on an old CUDA-enabled gaming laptop turned server soon:
https://github.com/LibreTranslate/LibreTranslate
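For anyone curious, querying such an instance is just a POST to its `/translate` endpoint. Here is a minimal sketch using only the Python standard library; the base URL (`http://localhost:5000`, LibreTranslate's default port) is an assumption, and some deployments also require an `api_key` field in the payload:

```python
# Sketch of calling a self-hosted LibreTranslate /translate endpoint.
# The base_url is an assumption (LibreTranslate's default); adjust for
# your own deployment, and add an "api_key" field if yours requires one.
import json
import urllib.request


def build_translate_request(text, source="auto", target="en",
                            base_url="http://localhost:5000"):
    """Build the HTTP request for LibreTranslate's POST /translate API."""
    payload = json.dumps({
        "q": text,            # text to translate
        "source": source,     # source language code, or "auto" to detect
        "target": target,     # target language code
        "format": "text",
    }).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/translate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def translate(text, **kwargs):
    """Send the request and return the translated string."""
    req = build_translate_request(text, **kwargs)
    with urllib.request.urlopen(req) as resp:
        # The API responds with JSON containing a "translatedText" field.
        return json.loads(resp.read())["translatedText"]
```

Usage would be something like `translate("Hallo Welt", source="de", target="en")` once the instance is up.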
It would also be cool to have an auto-translate button on Lemmy posts, backed by the LibreTranslate API, like the one that exists for Discourse forums.