this post was submitted on 05 Feb 2024
138 points (100.0% liked)


Ok, let me give a little bit of context. I'll turn 40 in a couple of months and I've been a C++ software developer for more than 18 years. I enjoy coding, and I enjoy writing "good" code: readable and so on.

However, for the past few months I've become really afraid for the future of the job I like, given the progress of artificial intelligence. Very often I don't sleep at night because of this.

I fear that my job, while not completely disappearing, will become a very boring one consisting of debugging automatically generated code, or that it will disappear altogether.

For now, I'm not using AI. A few colleagues do, but I don't want to because, one, it removes a part of the coding I like, and two, I have the feeling that using it is cutting off the branch I'm sitting on, if you see what I mean. I fear that in the near future, people not using it will be fired because management sees them as less productive...

Am I the only one feeling this way? I get the impression all tech people are enthusiastic about AI.

top 50 comments
[–] bloopernova@programming.dev 51 points 1 year ago (1 children)

There's a massive amount of hype right now, much like everything was blockchains for a while.

AI/ML is not able to replace a programmer, especially not a senior engineer. Right now I'd advise you to do your job well and hang tight for a couple of years to see how things shake out.

(me = ~50-year-old DevOps person)

[–] wito@lemmy.techtailors.net 8 points 1 year ago

Great advice. I would add: learn to leverage those tools effectively. They're a great productivity boost. Another side effect, once they become popular, is that some skills we already have will be harder to learn, so they might be in higher demand.

Anyway, make sure you put aside enough money to not have to worry about such things 😃

[–] mozz@mbin.grits.dev 21 points 1 year ago

I think all jobs that are pure mental labor are under threat to a certain extent from AI.

It's not certain when real AGI will arrive, but it certainly seems possible that it'll be soon, and if you can pay $20/month to replace a six-figure software developer, then yes, a lot of people are in trouble. As with other revolutions like this, not all of it will be "AI replaces engineer"; some of it will be "engineer who can work with the AI and complement it to be productive replaces engineer who can't."

Of course that's cold comfort once it reaches the point that AI can do it all. If it makes you feel any better, real engineering is much more difficult than a lot of other pure-mental-labor jobs. It'll probably be one of the last to fall, after marketing, accounting, law, business strategy, and a ton of other white-collar jobs. The world will change a lot. Again, I'm not saying this will happen real soon. But it certainly could.

I think we're right up against the cold reality that a lot of the systems that currently run the world don't really care if people are taken care of and have what they need in order to live. A lot of people who aren't blessed with education and the right setup in life have been struggling really badly for quite a long time no matter how hard they work. People like you and me who made it well into adulthood just being able to go to work and that be enough to be okay are, relatively speaking, lucky in the modern world.

I would say you're right to be concerned about this stuff. I think starting to agitate for a better, more just world for all concerned is probably the best thing you can do about it. Trying to hold back the tide of change that's coming doesn't seem real doable without that part changing.

[–] Lmaydev@programming.dev 19 points 1 year ago (1 children)

I use AI heavily at work now. But I don't use it to generate code.

I mainly use it instead of googling and skimming articles to get information quickly and allow follow up questions.

I do use it for boring refactoring stuff though.

In its current state it will never replace developers. But it will likely mean you need fewer developers.

The speed at which our latest juniors can pick up a new language or framework by leaning on LLMs is quite astounding. It's definitely going to be a big shift in the industry.

At the end of the day our job is to automate things so tasks require less staff. We're just getting a taste of our own medicine.

[–] domi@lemmy.secnd.me 5 points 1 year ago

I mainly use it instead of googling and skimming articles to get information quickly and allow follow up questions.

I do use it for boring refactoring stuff though.

Those are also the main use cases I use it for.

Really good for getting a quick overview of a new topic, and also really good at proposing different solutions/algorithms when you describe the issue to it.

Doesn't always respond correctly but at least gives you the terminology you need to follow up with a web search.

Also very good for generating boilerplate code. Like here's a sample JSON, generate the corresponding C# classes for use with System.Text.Json.JsonSerializer.
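
To make that concrete, here's a minimal sketch of the kind of round trip meant here. The sample JSON and the Order/Item class names are made up for illustration; only JsonSerializer and its options are the real System.Text.Json API:

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

// Hypothetical sample JSON you might paste into the assistant:
// { "orderId": 42, "customer": "Acme", "items": [ { "sku": "A-1", "qty": 3 } ] }

// The kind of boilerplate classes it would hand back:
public class Order
{
    public int OrderId { get; set; }
    public string Customer { get; set; } = "";
    public List<Item> Items { get; set; } = new();
}

public class Item
{
    public string Sku { get; set; } = "";
    public int Qty { get; set; }
}

public static class Demo
{
    public static void Main()
    {
        string json = "{ \"orderId\": 42, \"customer\": \"Acme\", \"items\": [ { \"sku\": \"A-1\", \"qty\": 3 } ] }";

        // PropertyNameCaseInsensitive maps the camelCase JSON keys onto the PascalCase properties.
        var options = new JsonSerializerOptions { PropertyNameCaseInsensitive = true };
        Order? order = JsonSerializer.Deserialize<Order>(json, options);

        Console.WriteLine(order?.Customer); // prints "Acme"
    }
}
```

The classes are pure typing grunt work, which is exactly why it's a good fit for an LLM: tedious to write, trivial to review.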

Hopefully the hardware requirements will come down as the technology gets more mature or hardware gets faster so you can run your own "coding assistant" on your development machine.

As an example:

Salesforce has been trying to replace developers with "easy to use tools" for a decade now.

They're no closer than when they started. Yes, the new, improved Flow Builder and Omni Studio look great initially in the simple little preplanned demos they make. But they're very slow, unsafe to use, and generally impossible to debug.

As an example, a common use case is: a sales guy wants to create an opportunity with a product. They show off how Omni Studio lets an admin create a set of independently loading pages that let them:
• create the opportunity record, associating it with an existing account number;
• add a selection of products to it.

But what if the account number doesn't exist? It fails. It can't create the account for you, nor prompt you to do it in a modal. The opportunity page only works with the opportunity object.

Also, if the user tries to go back, it doesn't allow them to delete products already added to the opportunity.

Once we get actual AIs that can handle context and planning, then our field will be in danger. But so long as we're going down the glorified-chatbot route, it isn't.

[–] MajorHavoc@programming.dev 15 points 1 year ago

I'm both unenthusiastic about A.I. and unafraid of it.

Programming is a lot more than writing code. A programmer needs to set up a reliable deployment pipeline, or write a secure web-facing interface, or make a usable and accessible user interface, or correctly configure logging, or identity and access, or a million other nuanced, pain-in-the-ass tasks. I've heard some programmers occasionally decrypt what the hell the client actually wanted, but I think that's a myth.

The history of automation is somebody finds a shortcut - we all embrace it - we all discover it doesn't really work - someone works their ass off on a real solution - we all pay a premium for it - a bunch of us collaborate on an open shared solution - we all migrate and focus more on one of the 10,000 other remaining pain-in-the-ass challenges.

A.I. will get better, but it isn't going to be a serious, viable replacement for any of the real work in programming for a very long time. Once it is, Murphy's law and history teach us that there'll be plenty of problems it still sucks at.

[–] Turun@feddit.de 14 points 1 year ago

If you are afraid of the capabilities of AI, you should use it. Take one week to use ChatGPT heavily in your daily tasks. Take one week to use Copilot heavily.

Then you can make an informed judgement instead of being irrationally scared of some vague concept.

[–] BolexForSoup@kbin.social 13 points 1 year ago* (last edited 1 year ago)

Thought about this some more, so I figured I'd add a second take to more directly address your concerns.

As someone in the film industry, I am no stranger to technological change. Editing in particular has radically changed over the last 10 to 20 years. There are a lot of things I used to do manually that are now automated. Mostly what it's done is lower the barrier to entry and speed up my job after a bit of pain learning new systems.

We've had auto-coloring tools since before I began and colorists are still some of the highest paid folks around. That being said, expectations have also risen. Good and bad on that one.

Point is, a lot of times these things tend to simplify/streamline lower level technical/tedious tasks and enable you to do more interesting things.

[–] cobra89 9 points 1 year ago* (last edited 1 year ago)

I'm gonna sum up my feelings on this with a (probably bad) analogy.

AI taking software developers' jobs is the same thinking as microwaves taking chefs' jobs.

They're both just tools to help you achieve the same goal easier/faster. And sometimes the experts will decide to forego the tool and do it by hand for better quality control or high complexity that the tool can't do a good job at.

[–] daniyeg@lemmy.ml 8 points 1 year ago (1 children)

I'm still in uni, so I can't really comment on how the job market is reacting, or will react, to generative AI. What I can tell you is that it has never been easier to half-ass a degree. Almost any code, report, or essay handed in has come from an LLM, and none of it makes sense or barely works. The only people not using AI are the ones without access to it.

I feel like it was always like this and everyone slacked as much as they could, but I just can't believe it; it's shocking. The lack of fundamental, basic knowledge has made working with anyone on anything such a pain in the ass. Group assignments are dead. Almost everyone else's work comes from a ChatGPT prompt that didn't describe their part of the assignment correctly; as a result, not only is it buggy as hell, but when you actually decide to debug it, you realize it doesn't even do what it's supposed to do, and now you have to spend two full days implementing every single part of the assignment yourself because "we've done our part".

Everyone's excuse is "well, university doesn't teach anything useful, why should I bother when I'm learning on my own?", and then you look at their project and it's just another boilerplate React calculator app in which, you guessed it, most of the code is generated by AI. I'm not saying everything in college is useful, or that you're a sinner for using somebody else's code; by all means, dodge classes and copy-paste stuff when you don't feel like doing it. But at least give a damn about the degree you're putting your time into, and don't dump your work on somebody else.

I hope no one carries this kind of sentiment toward their work into the job market. If most members of a team use AI as their primary tool to generate code, I don't know how anyone can trust anyone else on that team, which means more and longer code reviews and meetings, and thus slower production. Add to this bootcamps getting more scammy and most companies giving up on junior devs, and I really don't think the software industry is heading in a good direction.

[–] shasta@lemm.ee 2 points 1 year ago

I think I will ask people if they use AI to write code when I am interviewing them for a job and reject anyone who does.

[–] kent_eh@lemmy.ca 7 points 1 year ago (1 children)

Have you seen the shit code it confidently spews out?

I wouldn't be too worried.

[–] fievel@lemm.ee 5 points 1 year ago

Well, I have seen it. I even code reviewed it without knowing; when I asked my colleague what happened, he said, "I used ChatGPT. I'm not sure I understand what this does exactly, but it works." I must confess that after the code review comments, not much was left of the original stuff.

[–] A1kmm@lemmy.amxl.com 7 points 1 year ago (1 children)

Programming is the most automated career in history. Functions / subroutines allow one to just reference the function instead of repeating it. Grace Hopper wrote the first compiler in 1951; compilers, assemblers, and linkers automate creating machine code. Macros, higher level languages, garbage collectors, type checkers, linters, editors, IDEs, debuggers, code generators, build systems, CI systems, test suite runners, deployment and orchestration tools etc... all automate programming and programming-adjacent tasks, and this has been going on for at least 70 years.

Programming today would be very different if we still had to wire up ROM or something like that, and even if the entire world population worked as programmers without any automation, we still wouldn't achieve as much as we do with the current programmer population + automation. So it is fair to say automation is widely used in software engineering, and greatly decreases the market for programmers relative to what it would take to achieve the same thing without automation. Programming is also far easier than if there was no automation.

However, there are more programmers than ever. It is because programming is getting easier, and automation decreases the cost of doing things and makes new things feasible. The world's demand for software functionality constantly grows.

Now, LLMs are driving the next wave of automation in the world's most automated profession. However, progress is still slow: without building massive, very energy-expensive models, outputs often need a lot of manual human-in-the-loop work. They are great as a typing assist to predict the next few tokens, and sometimes to spit out a common function that you might otherwise have been able to get from a library. They can often answer questions about code, quickly find things, and help you find the name of a function you know exists but can't remember the exact name for. And they can do simple tasks that involve translating well-specified natural language into code. But in practice, trying to use them for big, complicated tasks is currently often slower than just doing it without LLM assistance.

LLMs might improve, but probably not so fast that it is a step change; it will be a continuation of the same trends that have been going for 70+ years. Programming will get easier, there will be more programmers (even if they aren't called that) using tools including LLMs, and software will continue to get more advanced, as demand for more advanced features increases.

[–] LarmyOfLone@lemm.ee 1 points 1 year ago

AI powered spreadsheets are going to be the next big technology for programmers :D

[–] Yerbouti@lemmy.ml 7 points 1 year ago (1 children)

I'm a composer. My Facebook is filled with ads like "Never pay for music again!". It's fucking depressing.

[–] cobra89 2 points 1 year ago* (last edited 1 year ago)

Good thing there's no Spotify for sheet music yet... I probably shouldn't give them ideas.

[–] r00ty@kbin.life 7 points 1 year ago

I don't think software developers or engineers alone should be concerned. Code is just what people see all the time: Chat-GPT generating code, and people thinking it means developers will be out of a job.

It's true, I think, that AI tools will be used by developers and engineers. This is going to mean companies reduce headcounts when they realise they can do more with less. I also think it will make the role less valuable and unique (that was already happening, but it will happen more).

But, I also think once organisations realise that GPTx is more than Chat-GPT, and they can create their own models based on their own software/business practices, it will be possible to do the same with other roles. I suspect consultancy businesses specializing in creating AI models will become REALLY popular in the short to medium term.

Long term, it's been known for a while we're going to hit a problem with jobs being replaced by automation, this was the case before AI and AI will only accelerate this trend. It's why ideas like UBI have become popular in the last decade or so.

[–] arthur@lemmy.zip 6 points 1 year ago

Man, it's a tool. It will change things for us, and it is very powerful, but it's still a tool. It does not "know" anything; there's no true intelligence in the things we now call "AI". For now, it's really useful as a rubber duck: it can make interesting suggestions, help you explore big code bases faster, and even be useful for creating boilerplate. But the code it generates is usually not very trustworthy and of lower quality.

The reality is not that we will lose our jobs to it, but that companies will expect more productivity from us using these tools. I recommend you try ChatGPT (the best in class for now) and try to understand its strengths and limitations.

Remember: this is just autocomplete on steroids; it does more than the regular version, but it makes the same types of errors.

[–] purpleprophy@feddit.uk 6 points 1 year ago (1 children)

This might cheer you up: https://visualstudiomagazine.com/articles/2024/01/25/copilot-research.aspx

I don't think we have anything to worry about just yet. LLMs are nothing but well-trained parrots. They can't analyse problems or have intuitions about what will work for your particular situation. They'll either give you something general copied and pasted from elsewhere or spin you a yarn that sounds plausible but doesn't stand up to scrutiny.

Getting an AI to produce functional large-scale software requires someone to explain precisely the problem domain: each requirement, business rule, edge case, etc. At which point that person is basically a developer, because I've never met a project manager who thinks that granularly.

They could be good for generating boilerplate, inserting well-known algorithms, generating models from metadata, that sort of grunt work. I certainly wouldn't trust them with business logic.

[–] fievel@lemm.ee 3 points 1 year ago

I think you raise a very good point about explaining the problem... Even we "smart humans" often have great difficulty seeing the point while reading PM specs...

[–] tunetardis@lemmy.ca 6 points 1 year ago

As a fellow C++ developer, I get the sense that ours is a community with a lot of specialization that may be a bit more difficult to automate out of existence than web designers or what have you? There's just not as large a sample base to train AIs on. My C++ projects have ranged from scientific modelling to my current task of writing drivers for custom instrumentation we're building at work. If an AI could interface with the OS I wrote from scratch for said instrumentation, I would be rather surprised? Of course, the flip side to job security through obscurity is that you may make yourself unemployable by becoming overly specialized? So there's that.

Our company uses AI tools as just that, tools to help us do the job without having to do the boring stuff.

Like, I can now just write a comment about state for a modal, and it will auto-generate the repetitive code instead of me having to write const [isModalOpen, setIsModalOpen] = useState(false); myself.

Or if I write something in one file, it can reason that I am going to be using it in the next file, so it can generate the code I would usually type. I still have to solve problems; it's just that I can do it quicker now.

[–] ulkesh 5 points 1 year ago* (last edited 1 year ago) (1 children)

I'm less worried and disturbed by the current thing people are calling AI than I am by the fact that every company seems to be jumping on the bandwagon with zero idea how it can and should be applied to their business.

Companies are going to waste money on it, drive up costs, and make the consumer pay for it, causing even more unnecessary inflation.

As for your points on job security: your trepidation is valid, but premature by numerous decades, in my opinion. The moment companies start relying on these LLMs to do their programming for them is the moment they inevitably end up with countless bugs and no one smart enough to fix them, including the so-called AI. LLMs seem interesting and useful on the surface, and a person can show many examples of this, but at the end of the day they're regurgitating fed content based on rules and measures with knob-tuning. I do not yet see objective, strong evidence that they can effectively replace a senior developer.

[–] knightly@pawb.social 3 points 1 year ago

Companies are going to waste money on it, drive up costs, and make the consumer pay for it, causing even more unnecessary inflation.

The "AI" bubble will burst this year, I'd put money on it if I had any.

The last time we saw a bubble like this was "Web3" and we all know how that turned out.

[–] fievel@lemm.ee 4 points 1 year ago

I'd like to thank you all for your interesting comments and opinions.

I see a general trend of not being too worried, because of how the technology works.

The worrisome part is what capitalism and management might think, but that's just an update of the old joke: "A product manager is a guy who thinks 9 women can make a baby in 1 month." And anyway, if it weren't this, there would be something else; it's how our society is.

Now I feel better, and I understand that my initial fear of this technology and rejection of it was perhaps a very bad idea. I really need to start using it a bit in order to get to know it. I've already found some useful use cases that can help me (getting inspiration while naming things, generating some repetitive unit test cases, using it to help figure out well-known APIs, ...).

[–] Chozo@kbin.social 4 points 1 year ago

Your job is automating electrons, and now some automated electrons are threatening your job.

I have to imagine this is similar to how farmers felt when large-scale machinery became widely available.

[–] olbaidiablo@lemmy.ca 4 points 1 year ago

AI allows us to do more with less just like any other tool. It's no different than an electric drill or a powered saw. Perhaps in the future we will see more immersive environment games because much of the immersive environment can be made with AI doing the grunt work.

[–] eugenia@lemmy.ml 4 points 1 year ago (2 children)

I disagree with the other posts here saying that you're overreacting. I think AI will replace most jobs (maybe as many as 85% at some point). Consider becoming a plumber or an electrician. Until robots become commonplace, 20 years from now, you will have a job that AI can't touch much. And people won't run out of asses or gaming, so those will be stable professions for quite a while. You can still code in your free time, as a hobby. And don't cry over the lost revenue of being a programmer, because that will happen to everyone affected by AI. You'll just have another job while the others won't. That's the upside.

I understand that this comment is not what people want to hear with their wishful thinking, so they'll downvote it. But I gotta say it how I see it. AI is the biggest revolution since the industrial revolution.

[–] knightly@pawb.social 3 points 1 year ago

"AI" is a bubble. A lot of these concerns will go away this year once the bean-counters do the math and realize that the benefits of running generative neural networks aren't worth the costs.

A single chatGPT query costs about 50-500 times as much energy as a pre-Bard Google search, to say nothing of the engineering time needed to build the models. And, since LLM outputs can't be trusted, the end users will still need writers and developers to go over everything and check for hallucinations.

The trajectory here closely mimics "Web3", when people thought that massively redundant distributed ledgers were going to be the next big thing, despite the fact that traditional electronic ledgers beat the blockchain in literally every aspect of performance, efficiency, and security.

Soon, "AI" will be just as synonymous with "plagirism" as "cryptocurrency" is with "scam".

[–] ParsnipWitch@feddit.de 1 points 1 year ago

The difference is that the industrial revolution created a lot of new jobs with better pay, while AI doesn't. I see people suggesting that this has happened before and that it will soon turn the economic situation into something much better, but I don't see that at all. Just because it's also a huge revolution doesn't mean it will have the same effects.

As you wrote, people will have to switch to manual jobs like laying bricks and wiping butts. The pay in these jobs won't increase just because more people have to work them.

[–] CanadaPlus@lemmy.sdf.org 4 points 1 year ago

Give Copilot or similar a try. AI is pretty garbage at the more complex aspects of programming, but it's great at simple boilerplate code. At least for me, that doesn't seem like much of a loss.

[–] Lath@kbin.social 3 points 1 year ago

If you are, it should be because you're working for the wrong people: those who don't understand what's what and only seek profit religiously.

Thanks for the readable code though.

[–] howrar@lemmy.ca 3 points 1 year ago* (last edited 1 year ago)

If your job truly is in danger, then not touching AI tools isn't going to change that. The best you can do for yourself is to explore what these tools can do for you and figure out whether they can help you become more productive, so that you're not first on the chopping block. Maybe in doing so you'll find other aspects of programming that you enjoy just as much and that don't yet get automated away by these tools. Or maybe you'll find that they're not all they're hyped up to be, which should ease your worry.

[–] l0st_scr1b3 3 points 1 year ago (1 children)
[–] fievel@lemm.ee 2 points 1 year ago

I probably should have used an LLM to help me write a clearer question :D

[–] ParsnipWitch@feddit.de 3 points 1 year ago

Your fear is justified insofar as some employers will definitely aim to reduce their workforce by implementing AI workflows.

When you have worked for the same employer all this time, perhaps you don't know, but a lot of employers do not give two shits about code quality. They want cheap and fast labour, and having fewer people churn out more is a good thing in their eyes, regardless of (long-term) quality. It may sound cynical, but that is my experience.

My prediction is that the income gap will increase dramatically because good pay will be reserved for the truly exceptional few. While the rest will be confronted with yet another tool capitalists will use to increase profits.

Maybe very far down the line there is a blissful utopia where no one has to work anymore. But between then and now, AI would have to get a lot better. Until then, it will mainly be used by corporations to justify hiring fewer people.

[–] FaceDeer@kbin.social 3 points 1 year ago (1 children)

I'm in a similar place to you career-wise. Personally, I'm not concerned about becoming just a "debugger." What I expect this job to look like in a few years is much the same as now, except I've got a completely free team of "interns" that do all the menial stuff for me. Every human programmer will become a lead programmer, deciding what our AIs do for us and putting it all together into the finished product.

Maybe a few years further along the AI assistants will be good enough to handle that stuff better than we do as well. At that point we stop being lead programmers and we all become programming directors.

So think of it like a promotion, perhaps.

[–] dutchkimble@lemy.lol 1 points 1 year ago (1 children)

Why would there be a team of AIs in this scenario under the human, instead of just one AI entity?

[–] FaceDeer@kbin.social 1 points 1 year ago (1 children)

The role of the human is to tell the AI what it's supposed to do. If you're worried about AI that's sophisticated enough to be completely self-directed, then you're worrying about AGI, which will be so world-changing that piddly little concerns such as "what about my job?" are pretty trivial.

[–] dutchkimble@lemy.lol 1 points 1 year ago (1 children)

No, I meant keeping the human directing things but they could do it with one AI under them

[–] FaceDeer@kbin.social 1 points 1 year ago (1 children)

Well yes, then. That's what I said. You'd be a programmer who had free underlings doing whatever grunt work you directed them to.

Or are you questioning my use of the term "team" for the AIs? LLMs are specialized in various ways, you'd likely want to have multiple ones that handle different tasks.

[–] dutchkimble@lemy.lol 1 points 1 year ago

Yeah I meant the team part. Learnt something new, I thought all AI was more or less equal!

[–] LordGimp@lemm.ee 2 points 1 year ago (1 children)

As a welder, I've been hearing for 20 years that "robots are going to replace you" and "automation is going to put you out of a job", yadda yadda. None of you code monkeys gave a fuck about me and my job, but now it's a problem because it affects you and your paycheck? Fuck you lmao, good riddance to bad garbage.

Weirdly hostile, but OK. It's like any other tool that can be used to accelerate a process. Hopefully at some point it's useful enough to streamline the minutiae of boring tasks that a competent intern could do. Not sure who is specifically targeting welders?

If it frees up your time to focus on more challenging stuff or stuff you enjoy, isn't that a good thing? Folks are dynamic and will adjust, as we always have.

Don't think there's a good excuse to come at someone with animosity over this topic.

I am on the product side of things and have created some basic proof-of-concept tools with AI that my bosses wanted to sell off. There's no way I'll be able to service or maintain them. It's incredibly impressive that I could even get this output.

I am not saying it won't become possible, but I lack the fundamental knowledge and understanding to make anything beyond the most minor adjustments, and AI is still quite bad at addressing only specific issues or, god forbid, expanding code without fully rewriting the whole thing and breaking everything else.

For our devs I see it as a much-improved and less snide Stack Overflow and Google. The direct conversational nature really speeds things up with boilerplate code, and since they actually know what they are doing, it's amazing. Not only that, but we used to have devs copy-paste from online searches without fully understanding the snippets. Now the AI can explain it in context.

[–] Damage@feddit.it 1 points 1 year ago

If this follows the path of the industrial revolution, it'll get way worse before it gets better, and not without a bunch of bloodshed

[–] BolexForSoup@kbin.social 1 points 1 year ago

To answer your question directly: The debate has been going on in the broader public since ChatGPT 3 dropped

To answer how you're feeling: that's valid, because a lot of deep pockets seem not to care at all about the ethical considerations.

[–] coolin 1 points 1 year ago

I think your job in your current form is likely in danger.

SOTA foundation models like GPT-4 and Gemini Ultra can write, execute, and debug code with special chain-of-thought prompting techniques, and large-scale process verification on synthetic data plus RL search for correct outputs will make this 10x better. The silver lining is that I expect this to require an absolute shit ton of compute, constantly generating LLM output hundreds of times for each internal prompt over multiple prompts, and possibly taking longer than an ordinary software engineer. I suspect early full-stack developer LLMs will mainly be used for a few very tedious coding tasks, and SWEs will be cheaper for a fair length of time.

I expect it will be 2-3 years before this happens, so for that short period I expect workers to be "super-productive" by using LLMs in the coding process. But I expect the crossover point when the LLM becomes better to come quite soon, perhaps in the next 5 years as compute requirements go down.