this post was submitted on 27 Jul 2024
350 points (100.0% liked)

Programmer Humor


top 12 comments
[–] DashboTreeFrog@discuss.online 20 points 3 months ago (1 children)

Ah, they're digging their own graves!

Saw this like three times before I realized. Am dumb

[–] sukhmel@programming.dev 10 points 3 months ago* (last edited 3 months ago) (1 children)

I don't think it's that you're dumb; I think the first interpretation depends on one's view of the matter.

AI is quite a new phenomenon. Sure, everyone is an expert now, but we still don't know what it will end up doing to society.

[–] pkill@programming.dev 2 points 3 months ago* (last edited 3 months ago)

Good luck to AI-based legal "solutions" startups. I hope they and their customers are generously insured to cover the fallout of such blatantly ignorant stupidity, which completely discards our current subject-matter expertise. That expertise clearly shows the error rate is too high, and with the law you're either right or you're not.

[–] leisesprecher@feddit.org 19 points 3 months ago (4 children)

I wonder what will happen with all the compute once the AI bubble bursts.

It seems like gaming made GPU manufacturing scale enough for GPUs to be used as general-purpose compute. Then Bitcoin pumped billions into the market, driving prices per FLOP down, and AI reaped the benefit of that once crypto moved to ASICs and later crashed.

But what's next? We've got more compute than we could reasonably use. The factories are already there, the knowledge and techniques exist.

[–] magic_lobster_party@kbin.run 7 points 3 months ago

It will be used for more AI research probably.

Most of the GPUs belong to the big tech companies like OpenAI, Google, and Amazon. AI startups rarely buy their own GPUs (often they're just using the OpenAI API). I don't think big tech will have any problem figuring out what to do with all that GPU compute.

[–] BehindTheBarrier@programming.dev 4 points 3 months ago

Compute becomes cheaper and larger undertakings happen. LLMs are huge, but new tech keeps moving things along. The key component of LLMs, the transformer, is getting new competition that may surpass it, both for LLMs and for other machine learning uses.

Otherwise, cheaper GPUs for us gamers would be great.

[–] abbadon420@lemm.ee 3 points 3 months ago

I'll buy a couple of top-tier GPUs from a failed startup on eBay to run my own AI at home.

[–] msgraves@lemmy.dbzer0.com 2 points 3 months ago

I think open source will build actually useful integrations, thanks to the available compute.

[–] pkill@programming.dev 6 points 3 months ago (3 children)

Why does the hype always have to be a kick in the eye for gamers? Previously it was shitcoins, now it's hallucination engines... I'm afraid of what's up next.

[–] zagaberoo 9 points 3 months ago

Turns out massively parallel computation has applications beyond video rendering.

[–] technom@programming.dev 1 points 3 months ago

That makes me wonder! All these new GPU uses are enormous energy hogs. Is gaming like that too?

There's always AMD.