Alternative title: CEO of failing company makes shocking statement in order to attract investors.
He might believe this but developers aren’t going anywhere. Development is just becoming more abstract. This has happened in fits since forever.
alternative alternative title: Grifter makes nonsensical promise in order to select for the most gullible users and investors, a la Nigerian prince scam
Nothing to see here. Let’s all move on.
If that is true, that is not something to look forward to. It means that ... oh ok this is c/singularity, carry on, lol
lol
Are they renaming the role? LLM supervisor?
Why do you assume the AI of tomorrow is going to be the same LLM of today? I don't think AI is going to need supervising for much longer.
Because everything needs supervision; even the president of the United States is supervised by judges and voters.
AI will still need supervision in 5 years. Even a completely autonomous agent will need a supervisor.
Anything thrown together in 5 years will be terrifying. ChatGPT is, effectively, a lobotomised speech centre of an AI. Google have made significant progress on the visual systems. However, these are small parts of a true AI.
In order to translate what a human wants into working code, you need to do a lot of thinking. You need to fill in a lot of gaps and remap from a manager's mindset to actual code abstractions. To do this reliably, an AI would have to be operating at or near sentience level, with a LOT of additional structures to stabilise it.
Such an AI would likely be a perfect demo of the "paperclip maximiser" problem.
As a software engineer who uses the existing, limited tools daily: programmers actually will definitely be gone. I do think there will be something along the lines of an "LLM supervisor" / "person who fixes the bugs in LLM code that it can't fix itself", but in drastically lower numbers.
Probably for every 100 programmers now there will be a few, and it will mostly be the engineers, not the developers. Most developers right now are given a series of bug or simple feature-request tickets that LLMs can already do the majority of the work on. I can't imagine what that looks like in 5-10 years.
I think efficiency will be improved but so much time is spent on things other than physically typing out code that I don't think we'll see that many programmers getting eliminated. I do some programming at my job (DevOps), and far more time is spent on design decisions, requirements gathering, and testing than actually writing any code.
It will just be a new job title... solution designer. I'm personally not afraid.
In all parts of the world? Bullshit
Lol.... Lmao