this post was submitted on 15 Nov 2023

Homelab

Wondering if anyone has a feel for the power efficiency of older server hardware. I'm reading posts from people who say they have an R710 with lots of hard drives and that it IDLES at 160W with 8 drives in it. So if you take the hard drives out of the equation, it's probably still around 120W. Is that just how inefficient old computers are, kind of like incandescent bulbs being less efficient than LED bulbs? And how efficient is the R730 compared to the R710?

My 6-year-old desktop computer idles at 60W with a GPU and 30W without it. That seems like a huge difference. It works out to something like $70 more per year to run an R710 than my old desktop with the GPU. Is that correct?
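Here's the back-of-the-envelope math I'm using, assuming a hypothetical electricity rate of $0.13/kWh and ~120W idle for the R710 without drives versus ~60W for the desktop (both numbers are guesses, so adjust for your own rate and measurements):

```python
# Rough yearly idle-power cost comparison; rate and wattages are assumptions
HOURS_PER_YEAR = 24 * 365        # 8760 hours of 24/7 idle
RATE_USD_PER_KWH = 0.13          # hypothetical electricity price

def yearly_cost(idle_watts):
    """Annual electricity cost of idling 24/7 at the given wattage."""
    return idle_watts * HOURS_PER_YEAR / 1000 * RATE_USD_PER_KWH

print(f"R710 (~120W idle):   ${yearly_cost(120):.0f}/yr")
print(f"Desktop (~60W idle): ${yearly_cost(60):.0f}/yr")
print(f"Difference:          ${yearly_cost(120) - yearly_cost(60):.0f}/yr")
```

At that rate the 60W gap comes out to roughly $68/year, so $70 seems about right; against a 160W idle with drives, the gap is closer to $115/year.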

[–] androidusr@alien.top 1 points 1 year ago (2 children)

Holy cow. What's driving half of that wattage? Is it the 32 sticks of RAM? Or the 4 CPUs?

Your server is 75% of my entire house power, including my server.

[–] hodak2@alien.top 1 points 1 year ago

I forgot there's also a GTX 1650 in there.

But honestly, I'm fairly sure the majority of the power draw is the 4 CPUs.

96 cores and 192 threads on older architectures were a bit of a power suck. If I had it all to do over again, I would for sure have gotten an EPYC chip instead.

[–] wireframed_kb@alien.top 1 points 1 year ago

Well, a DDR4 RAM stick uses roughly 2-4W, so 32 sticks is 64-128W on their own. The 4 CPUs don't help either. :)

Given it's only 512GB, you could get there with just eight 64GB modules, which would save a bit of power, but that's also a lot of money to put into what is, after all, somewhat obsolete hardware. (And I say that running a v4 Xeon as well :P)
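As a rough sketch of what that consolidation might save, using the same assumed 2-4W per DIMM from above (real figures vary by module type):

```python
# Estimated DIMM power for 512GB total RAM, using an assumed 2-4W per stick
WATTS_PER_DIMM = (2, 4)   # assumed low/high idle draw per DDR4 module

for sticks in (32, 8):
    low, high = (sticks * w for w in WATTS_PER_DIMM)
    print(f"{sticks:2d} x {512 // sticks}GB modules: ~{low}-{high}W")
```

On paper that's somewhere around 50-100W trimmed, though the per-stick numbers are guesses and high-capacity LRDIMMs tend to draw a bit more per module, so the real saving would likely be smaller.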