Selfhosted

A simple question for this community: what are you self-hosting? It's probably fun to hear from each other what services we are running.

Please mention at least the service (e.g. e-mail) and the software (e.g. Postfix). Extra bonus points for also mentioning the OS and/or hardware (e.g. Linux distribution, Raspberry Pi, etc.) you are running on.

[–] behohippy@lemmy.world 4 points 1 year ago (1 children)

Stable Diffusion (Stability AI version), text-generation-webui (WizardLM), a text embedder service with spaCy, BERT, and a bunch of sentence-transformers models, Pi-hole, OctoPrint, Elasticsearch/Kibana for my IoT stuff, Jellyfin, Sonarr, FTB Minecraft (customized pack), a few personal apps I wrote myself (to-do lists), SMB file shares, qBittorrent and Transmission (one dedicated to Sonarr)... Probably a ton of other stuff I'm forgetting.
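
For anyone curious, the embedder service is the least off-the-shelf part of that list. A minimal sketch of that kind of setup with sentence-transformers behind a FastAPI endpoint might look like this (the model name and route are just examples, not my exact config):

```python
# Minimal sketch of a self-hosted text embedding endpoint.
# Model name and route are illustrative, not a specific deployment.
from fastapi import FastAPI
from pydantic import BaseModel
from sentence_transformers import SentenceTransformer

app = FastAPI()
model = SentenceTransformer("all-MiniLM-L6-v2")  # example model

class EmbedRequest(BaseModel):
    texts: list[str]

@app.post("/embed")
def embed(req: EmbedRequest):
    # encode() returns one vector per input string
    vectors = model.encode(req.texts).tolist()
    return {"embeddings": vectors}
```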

[–] Kaerey@lemmy.world 2 points 1 year ago (2 children)

Do you have a GPU in there with the Stable Diffusion? If not, how's it working? I'm debating moving to a machine I can't guarantee my spare GPU will fit in.

[–] behohippy@lemmy.world 1 points 1 year ago

Yep, I'm using an RTX 2070 for that right now. The LLMs are just running on the CPU.
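
For the CPU side, the usual trick is a quantized llama.cpp-style model. A bare-bones sketch with the llama-cpp-python bindings might look like this (the model path is hypothetical; any quantized WizardLM build would do):

```python
# Minimal sketch of CPU-only LLM inference with llama-cpp-python.
# The model file path below is hypothetical.
from llama_cpp import Llama

llm = Llama(model_path="./wizardlm-13b-q4.bin", n_threads=8)
out = llm("Q: What is self-hosting? A:", max_tokens=64, stop=["Q:"])
print(out["choices"][0]["text"])
```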

[–] jerrimu@lemmy.world 0 points 1 year ago (1 children)

Without a GPU, it's pretty horrible.

[–] Kaerey@lemmy.world 1 points 1 year ago

That's what I was afraid of. I was gifted a Dell VRTX blade chassis and am trying to figure out how to shove my spare 2080 Super in there and get it powered.