this post was submitted on 28 Jan 2025

Technology

cross-posted from: https://lemmy.ml/post/25282200

[–] leisesprecher@feddit.org 16 points 2 days ago (2 children)

DeepSeek showed that actually putting thought into the architecture achieves much more than just throwing more hardware at the problem.

This means (a) there will be much less demand for hardware, since much more can be run locally on regular consumer devices, and (b) the export restrictions don't really work and instead force China to create actually better models.

That means a lot of the investments into the thousands of AI companies are in jeopardy.

[–] artificialfish@programming.dev 1 points 28 minutes ago

I think “just writing better code” is a lot harder than you think. You actually have to do research first, you know? Our universities and companies do research too. But I guarantee that using R1's techniques on more compute would follow the scaling law too. It's not either/or.

[–] bobs_monkey@lemm.ee 9 points 2 days ago (1 children)

Realistically, the CCP is probably throwing a lot of money at developers to get something good going and available, and US companies are whining about how it's not fair. The fact of the matter is that a solid product is available for much cheaper, and US companies are now crying foul. Guess what, a superior product made of good code (people) beats out just throwing money at hardware, who'd've gone and thunk it.

[–] leisesprecher@feddit.org 12 points 2 days ago (2 children)

What I find really fascinating here is that OpenAI, Meta, etc. seem to be structurally incapable of actually innovating at this point.

I mean, reducing training costs by literally an order of magnitude just by writing better software is astonishing and shows how complacent the large corporations have gotten.

[–] artificialfish@programming.dev 1 points 28 minutes ago* (last edited 27 minutes ago)

Meta? The one that released Llama 3.3? The one that actually publishes its work? What are you talking about?

[–] algorithmae@lemmy.sdf.org 1 points 1 day ago

You can write off hardware purchases; paying for skilled devs is like pulling teeth.