this post was submitted on 13 Aug 2023
TechTakes
Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.
This is not debate club. Unless it’s amusing debate.
For actually-good tech, you want our NotAwfulTech community
from the BI article:
oh man I can’t wait for this grift tech to strain our global IC manufacturing capability and create chip shortages and eventually a mountain of e-waste when it switches to being hosted on ASIC-powered nodes that can’t be used for anything but this specific grift tech nobody asked for (and of course as long as the grift keeps going, they’ll come up with “better” models that need bigger ASICs, creating more e-waste…)
what fucking year is it? I’m having deja vu
e: not to mention, making this shit ASIC-reliant means it’s incredibly easy to gatekeep who’s able to run these models via patents, licensing, and hostile pricing. a lot of the promptfans who think we’re one hardware accelerator away from running these on mobile devices aren’t just ignoring engineering — they’re arguing in the exact opposite direction of the cloud enshittification the industry has actually been pushing towards for years
I just wanna buy computer hardware for a reasonable amount of money, man. Can't I try a video game from the current decade for once in my life?
If promptfans love ASICs so much, why not go commit corporate espionage at Broadcom and leak their networking chips' firmware sources (in Uplink the video game)?
from yesterday: https://mastodon.social/@LukaszOlejnik/110872362221245876
this being just an ARM server should hopefully limit its potential as e-waste, assuming the cores aren’t absolute dogs for non-LLM workloads? but of course this being attached to AI hype will ensure its price tag is at least 10x what’d typically be considered reasonable for 144 ARM cores on a fast interconnect
e: some more details on the chip from before they went all-in on AI hype. it’s a nice datacenter chip that’ll host a whole lot of vCPUs, but there’s nothing architecturally innovative about it that I can see, other than the performance numbers going up
I kinda doubt it'd be a rigid single-purpose ASIC like a miner. AI people like to fiddle with things too often for that, I think.