this post was submitted on 18 Sep 2023
312 points (100.0% liked)

Programmer Humor


Post funny things about programming here! (Or just rant about your favourite programming language.)

top 15 comments
[–] quicken@aussie.zone 47 points 1 year ago (2 children)

Just chuck more data at it and hope for the best! It's a pretty fun strategy even if it fails a lot

[–] Swedneck@discuss.tchncs.de 18 points 1 year ago (1 children)

I'm convinced they're just into machine learning because of the funny mistakes they make

[–] nottheengineer@feddit.de 19 points 1 year ago

That's honestly one of the best parts about it. Reading error messages and thinking logically is boring compared to trying to guess how this stupid LLM got the idea to spit out what it did.

[–] CanadaPlus@lemmy.sdf.org 6 points 1 year ago* (last edited 1 year ago)

ML honestly sounds like a maddeningly dull profession to me because of this. It's a cool technology but jiggling hyperparameters and then waiting would grate on me.

[–] crow 30 points 1 year ago (1 children)

Also drowning: "knowing how your code works"

[–] TonyTonyChopper@mander.xyz 4 points 1 year ago

can't wait till chat gpt or its maintainers start injecting Skynet and other trojans

[–] nottheengineer@feddit.de 21 points 1 year ago (3 children)

Any task that can be expressed as mostly translation is a good task to try with an LLM.

And you know what? Stakeholders tend to love LLMs, so have fun with your complicated problems while I build them by using the ancient technique of slapping some boilerplate together and combining it with the new ways of pasting error messages into chatgippity.

[–] h3ndrik@feddit.de 33 points 1 year ago* (last edited 1 year ago) (2 children)

I don't think that was the point. The thing is, people replace calculators with that...

  • User: Assistant?
  • Assistant: * BEEP *
  • User: What is 21 divided by three?
  • Assistant: 52, my master.

Thing is, they only get some results right and hallucinate others. And you're doing billions of matrix multiplications just to calculate 2+1.
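To put the matrix-multiplication point in rough numbers, here's a back-of-envelope sketch. The 7B parameter count and the common ~2 FLOPs-per-parameter-per-generated-token rule of thumb are illustrative assumptions, not measurements of any particular model:

```python
# Back-of-envelope: cost of "computing" a one-token arithmetic answer
# with a dense transformer LLM vs. doing the arithmetic directly.
# Assumes ~2 FLOPs per parameter per generated token (rule of thumb)
# and an illustrative 7B-parameter model.
params = 7_000_000_000
flops_per_token = 2 * params
tokens_generated = 1           # the single answer token, e.g. "7"

llm_flops = flops_per_token * tokens_generated
direct_flops = 1               # one integer addition

print(f"LLM: ~{llm_flops:.0e} FLOPs, direct: {direct_flops} FLOP")
```

On these assumptions that's about ten billion floating-point operations to replace a single addition, which is the hammer-versus-excavator point in numbers.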

Sure. You can go to a construction site with only your one favorite tool. And use it for everything. And it's impressive to open a glass bottle of beer with a hammer and such. But I can guarantee you, you'll be slower digging that hole than the guys using a proper tool like an excavator.

[–] P1r4nha@feddit.de 9 points 1 year ago

And: you don't solve any fundamental problems if you don't have the data for it. If the information isn't in your data, the network will start guessing and it will be horrible.

[–] nottheengineer@feddit.de 3 points 1 year ago (1 children)

That's not a translation problem, so LLMs are terrible for it.

Always use the right tool for the job. If there are a lot of nails to be hammered, you need a guy with a hammer.

[–] h3ndrik@feddit.de 4 points 1 year ago

Yeah, that was kind of my point. I think the meme picture means people throw it at everything. No matter what. And the next logical thing would be to strip the computer scientist out of the picture. We have Github Copilot now ;) Let AI decide if AI is the proper tool.

[–] fmstrat@lemmy.nowsci.com 8 points 1 year ago

Translation, but not categorization. Trying to get reliable and, more importantly, predictably accurate metadata from an LLM without serious training is a pain. ML algorithms are far better for this, but they certainly take more brainpower (in my experience so far).
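For contrast, a classical baseline for that kind of metadata categorization can be tiny and fully predictable. This is a minimal multinomial Naive Bayes sketch in pure Python; the toy labels and snippets are made up for illustration:

```python
import math
from collections import Counter, defaultdict

# Toy training data: assign a metadata category to short text snippets.
train = [
    ("invoice due payment total", "finance"),
    ("payment receipt amount", "finance"),
    ("meeting agenda schedule", "planning"),
    ("schedule calendar meeting", "planning"),
]

# Count word occurrences per label.
word_counts = defaultdict(Counter)
label_counts = Counter()
vocab = set()
for text, label in train:
    label_counts[label] += 1
    for w in text.split():
        word_counts[label][w] += 1
        vocab.add(w)

def classify(text):
    """Multinomial Naive Bayes with add-one smoothing."""
    scores = {}
    total_docs = sum(label_counts.values())
    for label in label_counts:
        total_words = sum(word_counts[label].values())
        score = math.log(label_counts[label] / total_docs)
        for w in text.split():
            score += math.log(
                (word_counts[label][w] + 1) / (total_words + len(vocab))
            )
        scores[label] = score
    return max(scores, key=scores.get)

print(classify("payment due"))       # → finance
print(classify("meeting schedule"))  # → planning
```

Same input, same output, every time, and you can inspect exactly why it chose a label, which is the predictability the comment is asking for.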

[–] TehPers 3 points 1 year ago

There are a disproportionately large number of people who get one pretty demo and think LLMs are the solution to everything. Even for translations, I'd be interested to see how accurate the major models are in real world scenarios. We've been struggling hard to find any practical usage of LLMs that doesn't require the user to be able to verify the output themselves.
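One workaround for the verification problem is to only accept machine-checkable output. Here's a sketch of a validate-and-retry loop; `fake_llm` is a deterministic stand-in for a real model call (names and outputs are hypothetical):

```python
import json

def fake_llm(prompt, attempt):
    # Stand-in for a real model call. Returns malformed output on the
    # first attempt and valid JSON afterwards, to exercise the retry path.
    outputs = ['Sure! Here is the JSON: {oops', '{"sentiment": "positive"}']
    return outputs[min(attempt, 1)]

def get_validated(prompt, retries=3):
    """Retry until the model output parses and matches the expected shape."""
    for attempt in range(retries):
        raw = fake_llm(prompt, attempt)
        try:
            data = json.loads(raw)
        except json.JSONDecodeError:
            continue  # malformed output: ask again
        if isinstance(data.get("sentiment"), str):
            return data  # machine-checkable, no human reviewer needed
    raise ValueError("model never produced valid output")

print(get_validated("Classify: 'great product'"))  # → {'sentiment': 'positive'}
```

This doesn't make the *content* correct, of course; it only catches structurally invalid output, so a human (or downstream check) is still needed for anything where a wrong-but-well-formed answer matters.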

[–] nx2@feddit.de 20 points 1 year ago

Nvidia stock goes brrrr

[–] kitonthenet@kbin.social 5 points 1 year ago

Modular design