this post was submitted on 13 Nov 2023
Homelab
If you are talking about AI and gaming workloads, you are going to need a GPU, so form factor is important. I squeezed a 1050 Ti into my Dell R720xd for a gaming VM a few years ago and it was great, but nowhere near what is needed for running even simple inference on open-source models. I'd consider more of a gaming rig for AI workloads. See /r/localllama.
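For a rough sense of why a card like the 1050 Ti falls short, here's a back-of-envelope sketch (not a benchmark, and it ignores KV cache and activation overhead): weight memory alone is roughly parameter count times bytes per weight, and the 1050 Ti only has 4 GB of VRAM. The model sizes and quantization levels below are just illustrative.

```python
# Back-of-envelope check: can a given card hold an open-weight model's weights?
# Rough illustration only; ignores KV cache, activations, and runtime overhead.

def weight_memory_gb(n_params_billion: float, bits_per_weight: float) -> float:
    """Approximate memory needed just for the model weights, in GB."""
    bytes_per_weight = bits_per_weight / 8
    return n_params_billion * 1e9 * bytes_per_weight / 1e9

vram_gb = 4.0  # GTX 1050 Ti has 4 GB of VRAM

for params_b, bits, label in [(7, 16, "7B @ fp16"),
                              (7, 4, "7B @ 4-bit"),
                              (13, 4, "13B @ 4-bit")]:
    need = weight_memory_gb(params_b, bits)
    verdict = "barely fits (before overhead)" if need < vram_gb else "does not fit"
    print(f"{label}: ~{need:.1f} GB of weights -> {verdict} in {vram_gb:.0f} GB VRAM")
```

A 7B model at fp16 wants ~14 GB just for weights, and even a 4-bit quant leaves almost no headroom on a 4 GB card, which is why a beefier GPU (or lots of unified memory) makes life easier.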
I ended up picking up last year's top-of-the-line Mac Studio with 128 GB of shared memory on sale at Micro Center for AI inference workloads. I'm using its onboard 10GbE to connect to my R720 via SFP+ to Cat 7/RJ45, and I dig that. I have many secondary use cases for the Mac Studio, which is a pleasure to have at my desk. If I were a PC gamer, I might have gone that route.
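As a sketch of what "simple inference on open-source models" can look like on a Mac Studio's unified memory, here's one common route using llama-cpp-python with Metal offload. This isn't necessarily what I run; the model name and path are placeholders, and it assumes llama-cpp-python was installed with Metal support and a quantized GGUF model is already downloaded.

```python
# Minimal sketch: local LLM inference on Apple Silicon via llama-cpp-python.
# Assumes a Metal-enabled build and a quantized GGUF model on disk.
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-2-13b-chat.Q4_K_M.gguf",  # hypothetical local path
    n_gpu_layers=-1,  # offload all layers to the GPU (Metal on Apple Silicon)
    n_ctx=4096,       # context window size
)

out = llm("Q: What is a homelab good for?\nA:", max_tokens=128)
print(out["choices"][0]["text"])
```

The appeal of the Mac route is that the 128 GB of shared memory lets you load much larger quantized models than a typical consumer GPU's VRAM would allow, at the cost of raw GPU throughput.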