>> The computers will come for all of our jobs eventually, but those of us who refuse or decline to embrace the most powerful creative tools we’ve ever been given will be the first to fall.
It's being mandated by almost all companies. We're forced to use it, whether it produces good results or not. I can change one line of code faster than Claude Code can, as long as I understand the code. Someday, I'll lose understanding of the code, because I didn't write it. What am I embracing?
Times change, you have to learn new tools or get left behind. Nobody is asking if you like it, even expecting your personal preferences to matter and be taken seriously is a serious example of privilege.
I'm serious. Your boss comes to you and tells you to use AI, how is that any different from the foreman of an experienced carpentry / framing crew coming up to his old pros and handing them a pneumatic nail gun, and telling them to put down the hammers? You think they didn't complain, these men who pride themselves on being able to drive a nail with fewer swings than others? These men who are super fast and experienced with hammers and took pride in that, even enjoyed their work, do you think they were happy when their boss, some unskilled management bitch, showed that he could drive nails faster than they could just by pulling a trigger? They hated it! And it didn't matter, they had to adapt. Why should programmers be more privileged?
Your boss tells you to use AI because he gave it a task that took you a day to complete and the AI did it in five minutes. Your boss doesn't care about your skill with an obsolete craft, or the art or aesthetic qualities of coding by hand, or even the supposed quality benefits of doing it by hand. None of that matters when the new tool can demonstrably work faster. Your boss sees it himself, then sees you complaining, and he then sees you as an old veteran complaining about the new way of things. Complain all you want, you either learn to keep up or you'll get left behind.
> Someday, I'll lose understanding of the code, because I didn't write it.
I've been wading through vast corporate codebases I never wrote and yet had to understand for the past 20 years. This isn't any different, and AI tools help with that understanding. A lot! The tools and techniques are out there.
It's a skill set just like coding. You can embrace an elevated workflow where you can forget about the specific syntax and focus on the architecture and integration. It takes time to intuit what exactly the models are bad at, so you can foresee hallucinations and prevent them from happening in the first place. Yes you can write 1 line faster than Claude, but what about 10 lines? 100? 1000?
> Yes you can write 1 line faster than Claude, but what about 10 lines? 100? 1000?
Bingo. One quick edit when you already know what needs to be done is trivial, that means nothing. What happens when you have to write a new feature and it will take hundreds of lines of code? Unless you're an elder god of programming, the LLM will lap you easily.
Interesting observation. There is a difference, however. Pre-AI, each human programmer generally understood the code they wrote, so there were many humans who each understood some part of the code. Post-AI, presumably, there will be no humans who understand any of the code. Sure, we will understand its syntax, but the overall architecture of applications may become so complicated that no human can understand it in practice.
You're not embracing it, you're forced to accept it because the nature of employer-employee relationships has a fundamental power differential which makes it exploitative.
You helped build the company, you should own a proportional part of it.
The issue with the current system is that only the people who provide money, not the people who provide work, get to own the result. Work (and natural resources) is where value comes from. Their money came from work as well but not only their work, they were in a position of power which allowed them to get a larger cut than deserved. Ownership should, by law, be distributed according to the amount and skill level of work.
Then people wouldn't worry about losing their jobs to automation - because they'd keep receiving dividends from the value of their previous work.
If their previous work allowed the company to buy a robot to replace them, great: they now get the revenue from the robot's work while being free to pursue other things with their newly free time.
If their previous work allowed an LLM to be trained or rented to replace them, great, they get the revenue from the LLM's work...
Crapitalism didn't let up for the benefit of the old veteran framers who didn't want to use nail guns, why should it let up for the benefit of old veteran programmers who don't want to use LLMs? We aren't special. Expecting a shakeup of society's whole economic system just to preserve your preference for old tools is totally out to lunch.
People used to sell capitalism as something that gave freedom to individuals. Now it's just the thing that forces us to act against what we believe is best for ourselves and society at large. Anyone who expresses distaste for that is, of course, out to lunch.
I'm not selling capitalism. I'm telling you that society is indifferent to your desire to program in the old ways, we're not going to start a worker's revolution for the sake of programmers who don't like coding agents. You can either adapt, or get left behind.
Since in my system, you cannot buy ownership of a corporate person (just like you cannot buy a natural person, for good reasons), that severely limits how such a situation could arise in the first place.
You still get paid a salary, you can still save up or invest it; it's just that money only buys you ownership according to how much work (time × skill) you put in to make that money.
Any system based on market competition in which your scenario realistically happens was probably so degenerate it would end up being replaced (whether democratically or by force). My system, AFAICT, is strictly (in the mathematical sense) better than the current implementation of capitalism, it just has extra precautions against buying power. What it boils down to is you want a perfect system while I am proposing a system that's better than the current state and you reject it based on not being perfect.
BTW a part of your comment is a condescending personal attack which doesn't add to the discussion and is against the guidelines.
My feeling is that AI is not real coding; it is coding-adjacent. Project Management, Sales, Marketing, Writing Books About KanBan, AI Programming, User Interface Design, Installing Routers are coding-adjacent. AI is not real coding any more than The Sims is homemaking. You can use AI and hang with the tech guys and get your check but you are going to be treading water and trying to be liked personally to stay where you are. No question it's a job, but no, it's not coding.
My thinking is that high level languages like C aren't real coding. If you don't even know what ISA the software will be run on, then you need to get the fuck off my lawn!
Yes, "An LLM is just a new higher level programming language", sure. A new programming language with no tractable guarantees about the behavior of any particular program, including in practice no guarantees that the same source code (in this new programming language) will reliably produce the same behavior. This is very different from traditional programming languages and how we can reason about them.
(Yes, one can write C programs with undefined behavior, but C does also have many well-defined properties which allow people to reason about well-defined C programs logically reliably.)
It used to be that old farts made a big deal out of knowing the ins and outs of assembly programming and sneered at the kids who only knew "high level" languages like C, why, those damn kids didn't even understand the machine code their compilers emitted, didn't even know or care that it was so obviously suboptimal.
Really, your attitude is not new. Even your "this is why it's different this time" is just a mirror reflecting the past.
I can't speak personally to what it was like to be a C developer in the early days of that language, but when I started out as a Ruby on Rails developer over a decade ago I was definitely told by some people that it didn't count as 'real programming' because of how much was abstracted away by the framework.
Mocking? I'm quoting exactly the sort of thing that used to be said in earnest in the 80s and 90s. What you're doing now is exactly the same thing, there's no difference at all. It's the same reaction, born of the same old-man instinct to bitch about the kids going soft. Yawn.
Both Algol and Lisp were from the 60s. I think programmers and computer scientists were already acquainted enough with high-level programming languages not to equate using C with going soft.
Also software was always about domain knowledge and formal reasoning. Coding is just notation. Someone may like paper and pen, and someone may like a typewriter, but ultimately it’s the writing that matters. Correctness of a program does not depends on the language (and the cpu only manipulate electric flow).
I argue against AI because most of its users don’t care about the correctness of their code. They just want to produce lots of it (the resurgence of the flawed LoC metric as a badge of honor).
> Both Algol and Lisp were from the 60s. I think programmers and computer scientists were already acquainted enough with high-level programming languages not to equate using C with going soft
Most programmers didn't have access to anything so fancy as those; in fact, most programmers considered access to a C compiler an extravagant flex well into the 80s. By the 90s compilers were widely available, and a new generation of programmers who had started out with C encountered older programmers who had started out writing assembly for their Z80s, the latter of whom were constantly sneering at kids going soft.
Starting when I was about 10, I learned BASIC (not a real language, I was constantly told), Z80 assembly, then C (a "high level" language) for the m68k. I encountered all of the attitudes I now see ITT being thrown at programmers using coding agents. Hell, I would bet money on some of those exact same old farts still lurking IRC channels, occasionally waking up like Rip Van Winkle to grumble about kids taking old ubiquitous technology for granted now.
Right? Some of us used to read hex digits off printed paper dumps to debug mainframe memory (like me), but we can be excited about AI and embrace it, too.
From my perspective, knowing how it gets down to machine code makes it more useful and easier to control, but that doesn't mean I want to stop writing English now that we can.
Sorry to see you're getting mocked. I hate both the (current) low quality and the exploitation aspects of AI more than anyone. However, I don't understand your post. What is real coding according to you?
> If you interpret these examples to mean that any person can write down any list of requirements along with any user interface specs, and the AI will consistently produce a satisfactory product, then I’d agree programmers are toast.
I think the road to this is pretty clear now. It’s all about building a harness such that the AI can write something and get feedback from type checks, automated tests, runtime errors, logs, and other observability tools. The majority of software is fairly standardized UI forms running CRUD operations against some backend or data store and interacting with APIs.
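The loop described above can be sketched roughly like this. Everything here is illustrative: `ask_model` is a hypothetical stand-in for a real LLM call, and the "checks" are a single smoke test where a real harness would run type checkers, test suites, and log scrapers.

```python
import subprocess
import sys

def ask_model(prompt: str) -> str:
    # Hypothetical stand-in for a real LLM call; any vendor API would
    # slot in here. For this sketch it returns a trivially correct module.
    return "def add(a, b):\n    return a + b\n"

def harness(task: str, max_rounds: int = 3):
    """Generate code, run the checks, and feed failures back until green."""
    feedback = ""
    for _ in range(max_rounds):
        code = ask_model(task + feedback)
        with open("candidate.py", "w") as f:
            f.write(code)
        # The feedback channel: a smoke test here, but type checks,
        # unit tests, runtime errors, and logs plug in the same way.
        check = subprocess.run(
            [sys.executable, "-c",
             "import candidate; assert candidate.add(2, 3) == 5"],
            capture_output=True,
            text=True,
        )
        if check.returncode == 0:
            return code  # all checks passed
        feedback = "\nYour previous attempt failed with:\n" + check.stderr
    return None  # gave up after max_rounds
```

The interesting design question is what goes in the feedback channel; the model call itself is the boring part.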
I am also of the opinion that a lot of this can be solved in time with a harness. And I wholeheartedly agree that there is a class of web app, now trivialized, that can carry anything from a mom-and-pop shop up to the 'enterprise' (80% of our architecture seems to center on the same pattern at my $DAYJOB) and run just fine if they accept some of the vibes.
This type of work seems to be happening in the orchestrator projects that pop up here every once in a while, and I've just been waiting for one with a pipeline 'language' malleable enough to work for me that I can then make generic enough for a big class of solutions. I feel doomed to make my own, but I feel like I should do my due diligence.
> The majority of software is fairly standardized UI forms running CRUD operations against some backend or data store and interacting with APIs.
Have you ever looked at Debian’s package list?
Most CRUD apps are just a step above forms and reports. But there’s a lot of specific processing that requires careful design. The whole system may be tricky to get right as well. But CRUD UI coding was never an issue.
DDD and various other architecture books and talks are not about CRUD.
Roughly 20 years from Deep Blue to AlphaZero. I don't think that is comparable, though. The use of deep neural networks is what made the machines, starting with AlphaZero, dominant again. I.e., we're already in the new paradigm.
> any person can write down any list of requirements along with any user interface specs
Isn't this just a new programming language? A higher level language which will require new experts to know how to get the best results out of it? I've seen non-technical people struggle with AI generated code because they don't understand all the little nuances that go into building a simple web app.
A programming language with no tractable guarantees about the behavior of any particular program, including in practice no guarantees that the same source code (in this new programming language) will reliably produce the same behavior. This is very different from traditional programming languages and how we can reason about them.
> The computers will come for all of our jobs eventually, but those of us who refuse or decline to embrace the most powerful creative tools we’ve ever been given will be the first to fall.
The most powerful creative tool I've ever been given is my body, including my brain. Yeah, AI is cool, too.
For anyone who relies on their knowledge of the business, of gathering requirements: know that eventually your customers will be at least as good as you at that skill.
After all, they are the ones asking the questions. And they are not dumbasses who don't learn. They are also motivated to learn to adapt to AI.
IMO the best value of humans right now is to provide skills as fuel for the future. Once we burn up, the new age shall come.
I think customers will say: "I don't want to try to come up with (correct) requirements. I'd rather hire this SW firm that specializes in that skill."
What is often overlooked is that we are not trying to just produce programs, we are trying to produce "systems", which means systems where computers and humans interact beneficially.
In the early days of computers, "System Analyst" was a very cool job-description. I think it will make a comeback with AI.
> What is often overlooked is that we are not trying to just produce programs, we are trying to produce "systems", which means systems where computers and humans interact beneficially
People do overlook that. Software is written to solve a problem. And it interacts with other software and data from the real world. Everything is in flux. Which is why you need both technical expertise (how things work) and domain knowledge (the purpose of things) to react properly to changes.
Creating such systems is costly (even in the AI age), which is why businesses delegate parts that are not their core business to someone else.
"There is a confirmation bias at work here: every developer who has experienced such a remarkable outcome is delighted to share it. It helps to contribute to a mass (human) hallucination that computers really are capable of anything, and really are taking over the world."
This is survivorship bias, a form of sample bias.
Confirmation bias is a form of motivated reasoning where you search for evidence that confirms your existing beliefs.
> "We will forever be useful!" As a sounding cry against radical transformation. I hope that's the case, but some of these pieces just seem like copium.
Yeah, no shade to the author, it was a well-written piece, but these arguments increasingly seem like a form of self-soothing to me. On current trajectories I don't see how AI won't swallow up the majority of the software development field in ways that completely devalue not just software engineers, but everyone up the stack in software-focused companies, eventually removing the need for most of those companies to even exist.
(Why have a software company employing someone who is an expert at gathering customer requirements to feed to the LLM when the customer can just solve their own problem with an LLM? They'll build the wrong thing the first time... and the fifth time... and maybe the tenth time... but it'll still be way faster for them to iterate their own solution than to interface with another human).
It'll take longer for senior engineers to suffer than juniors, and longer for system architects to suffer than senior engineers, and so on up the chain but I'm just not seeing this magical demarcation line that a lot of people seem to think is going to materialize prior to the rising tide of AI starting to go above their own neck.
And this is likely to all happen very quickly, it was less than a year ago when LLMs were awful at producing code.
> The time may come, perhaps even soon, when AI takes over programming completely. But in the mean time, a programmer who embraces AI, yet is skeptical about everything it creates, is better-equipped than any comparably-skilled human in programming history.
The premise is flawed.
Your company has 3 layers: workers - managers - owners. When AI (actual AI, not "AI") takes over, that'll change into 2 layers: AI - owners.
How you're equipped doesn't matter because you're out of the picture. Not economically viable. Irrelevant.
I have a friend. He's very passionate about trains, and he worked his entire career in and around them. Some time ago we had an incident where the entire railway in the region had to be shut down for some time because the computer managing the traffic broke. A big fuckup, because lots of people rely on the railway. My friend instantly started complaining with visible satisfaction: "See? Computers are shit! Back in my days everything was done manually and that was great! We never had such outages!" What my friend is incapable of understanding is that it's simply not possible to manage traffic at that scale using humans, and it's much better and safer to do it with computers. When he was young, there was much less traffic, and delays were much more frequent.
The core issue is that he doesn't really understand computers, and because he's old, there's no chance he'll learn a new technology, so he's very distrustful towards computers. Not to mention that modern UX is about enshittification, so it does take some skill to navigate the technology, especially mobile - he trips over things that I consider extremely basic like "this is an ad, don't click this".
I have a similar feeling when reading discussions about AI on this website. Most people here refuse to appreciate AI because "back in my days...". Well, the future is here, old man, adapt or perish. Programming using AI is a completely new paradigm that requires a completely new approach and new skills. Either you learn them or you'll be left behind, whether you like it or not - back in the 2010s all you had to do to get a lucrative software job was to show that you knew how to use Google, because most people refused to. I suspect that when the dust settles, something similar will happen with AI, and people who know how to extract maximum value from it will be rewarded handsomely.
> Just a few years ago, AI essentially could not program at all. In the future, a given AI instance may “program better” than any single human in history. But for now, real programmers will always win.
For how long? Do I get to feel smug about this for 10 days, 10 weeks, or 10 years? That radically changes the planned trajectory of my life.
These posts are just programmers trying to understand their new place in the hierarchy. I'm in the same place and get it, but also truisms like 'will always win' is basically just throwing a wild guess at what the future will look like. A better attitude is to attempt to catch the wave.
TFA's author is literally saying it may happen. He's using AI so he already caught the wave. He's augmenting himself with AI tools. He's not saying "AI will never surpass humans at writing programs". He writes:
" At this particular moment, human developers are especially valuable, because of the transitional period we’re living through."
You and GP are both attacking a strawman: it's not clear why.
We're seeing endless AI slop, enshittification, and lower uptime for services day after day.
To anyone using these tools seriously on a daily basis, it's totally obvious that there are, TODAY, shortcomings.
TFA doesn't talk about tomorrow. It talks about today.
> I'll lose understanding of the code, because I didn't write it.
What about code that other (human) engineers write? Do you not understand that code too because you didn't write it?
Have you read Coding Machines[0]?
BTW, that guy received an Oscar for coding. Oh, how far we have fallen since those days, and how far we have yet to go...
[0]: https://www.teamten.com/lawrence/writings/coding-machines/
What about the people who didn't work for the one company that became the only company?
You still don't make sense, mate.
Yes, a better system would be great. Half-baked ideas only stand in its way.
Attitude as old as time itself.
Yes, "An LLM is just a new higher level programming language", sure. A new programming language with no tractable guarantees about the behavior of any particular program, including in practice no guarantees that the same source code (in this new programming language) will reliably produce the same behavior. This is very different from traditional programming languages and how we can reason about them.
(Yes, one can write C programs with undefined behavior, but C does also have many well-defined properties which allow people to reason about well-defined C programs logically reliably.)
It used to be that old farts made a big deal out of knowing the ins and outs of assembly programming and sneered at the kids who only knew "high level" languages like C, why, those damn kids didn't even understand the machine code their compilers emitted, didn't even know or care that it was so obviously suboptimal.
Really, your attitude is not new. Including even your "this is why it's different this time", its just a mirror reflecting the past.
(And the behavior of any given C implementation is completely defined.)
You mock, but not very persuasively. You seem to be relying on a silly idea you don't even believe in: that someone, once, made fun of C programming.
I have some bad news for you
>AI is not real coding any more than The Sims is homemaking.
Your analogy is bad. The programmer and the AI both produce working code. The other poster's response was correct.
I believe that someone can get it to produce working code. Whenever I press the button though, I'm not getting great results.
The AI does not, in fact, produce working code.
> I argue against AI because most of its users don’t care about the correctness of their code.
This is remarkably sloppy for someone who codes. No facts, just opinion, claimed with confidence.
> No facts, just opinion, claimed with confidence
Strong opinions, loosely held.
What I’ve seen seems to confirm that opinion, so I’m still holding on to it.
> Strong opinions, loosely held.
Yes doctor, but enough about my stool.
Right? Some of us used to read hex digits off printed paper dumps to debug mainframe memory (like me), but we can be excited about AI and embrace it, too.
From my perspective, knowing how it gets down to machine code makes it more useful and easier to control, but that doesn't mean I want to stop writing English now that we can.
I think it should be called dice-coding, not vibe-coding. You roll the LLM dice, and sometimes they come up with the right-looking program on top.
Since the dice are heavily loaded, this happens quite often, which makes people think the dice can program.
Sorry to see you're getting mocked. I hate both the (current) low quality and the exploitation aspects of AI more than anyone. However, I don't understand your post. What is real coding according to you?
> If you interpret these examples to mean that any person can write down any list of requirements along with any user interface specs, and the AI will consistently produce a satisfactory product, then I’d agree programmers are toast.
I think the road to this is pretty clear now. It’s all about building the harness now such that the AI can write something and get feedback from type checks, automated tests, runtime errors, logs, and other observability tools. The majority of software is fairly standardized UI forms running CRUD operations against some backend or data store and interacting with APIs.
I am also of the opinion that a lot of this can be solved in time with a harness. And I wholeheartedly agree that there is a class of web app that has been trivialized, one that could let a mom-and-pop shop run just fine at 'enterprise' level (80% of our architecture seems to center around the same pattern at my $DAYJOB) if they accept some of the vibes.
This type of work seems to be happening in the orchestrator projects that pop up here every once in a while, and I've just been waiting for one with a pipeline 'language' malleable enough to work for me that I could then make generic enough for a big class of solutions. I feel doomed to write my own, but I feel like I should do my due diligence first.
> The majority of software is fairly standardized UI forms running CRUD operations against some backend or data store and interacting with APIs.
Have you ever looked at Debian’s package list?
Most CRUD apps are just a step above forms and reports. But there’s a lot of specific processing that requires careful design. The whole system may be tricky to get right as well. But CRUD UI coding was never an issue.
DDD and various other architecture books and talks are not about CRUD.
In chess, engines have long been stronger than humans, but for a long time a (super) grandmaster with an engine was still better than an engine alone.
Roughly 20 years from Deep Blue to AlphaZero. I don't think that is comparable though. Use of deep neural networks was what made the machines dominant again, starting with AlphaZero. I.e. we're already in the new paradigm.
> a (super) grandmaster with an engine was still better than an engine alone.
any examples? I'd love to watch how that went
> any person can write down any list of requirements along with any user interface specs
Isn't this just a new programming language? A higher level language which will require new experts to know how to get the best results out of it? I've seen non-technical people struggle with AI generated code because they don't understand all the little nuances that go into building a simple web app.
A programming language with no tractable guarantees about the behavior of any particular program, including in practice no guarantees that the same source code (in this new programming language) will reliably produce the same behavior. This is very different from traditional programming languages and how we can reason about them.
> The computers will come for all of our jobs eventually, but those of us who refuse or decline to embrace the most powerful creative tools we’ve ever been given will be the first to fall.
The most powerful creative tool I've ever been given is my body, including my brain. Yeah, AI is cool, too.
> But people are less likely to share all the times the AI failed in some ridiculous way.
I have not noticed this: people love sharing AI failing in ridiculous ways.
For anyone who relies on their knowledge of the business, of gathering requirements: know that eventually your customers will be at least as good as you at this skill.
After all, they are the ones asking the questions. And they are not dumbasses who don’t learn; they are also motivated to adapt to AI.
IMO the best value of humans right now is to provide skills as fuel for the future. Once we burn up, the new age shall come.
I think customers will say: "I don't want to try to come up with (correct) requirements. I'd rather hire this SW firm that specializes in that skill."
What is often overlooked is that we are not trying to just produce programs, we are trying to produce "systems", which means systems where computers and humans interact beneficially.
In the early days of computers, "System Analyst" was a very cool job-description. I think it will make a comeback with AI.
> What is often overlooked is that we are not trying to just produce programs, we are trying to produce "systems", which means systems where computers and humans interact beneficially
People do overlook that. Software is written to solve a problem. And it interacts with other software and data from the real world. Everything is in flux. Which is why you need both technical expertise (how things work) and domain knowledge (the purpose of things) to react properly to changes.
Creating such systems is costly (even in the AI age), which is why businesses delegate parts that are not their core business to someone else.
> There is a confirmation bias at work here: every developer who has experienced such a remarkable outcome is delighted to share it. It helps to contribute to a mass (human) hallucination that computers really are capable of anything, and really are taking over the world.
This is survivorship bias, a form of sample bias.
Confirmation bias is a form of motivated reasoning where you search for evidence that confirms your existing beliefs.
I'm observing that there is some kind of status quo bias nearly uniformly being surfaced by the programming community right now.
I myself have feelings like this, as a software engineer by trade.
"We will forever be useful!" as a rallying cry against radical transformation. I hope that's the case, but some of these pieces just seem like copium.
> "We will forever be useful!" as a rallying cry against radical transformation. I hope that's the case, but some of these pieces just seem like copium.
Yeah, no shade to the author, it was a well-written piece, but these arguments increasingly seem like a form of self-soothing to me. On current trajectories I don't see how AI won't swallow up the majority of the software development field in ways that completely devalue not just software engineers, but everyone up the stack in software-focused companies, eventually removing the need for most of the companies to even exist.
(Why have a software company employing someone who is an expert at gathering customer requirements to feed to the LLM when the customer can just solve their own problem with an LLM? They'll build the wrong thing the first time... and the fifth time... and maybe the tenth time... but it'll still be way faster for them to iterate their own solution than to interface with another human).
It'll take longer for senior engineers to suffer than juniors, and longer for system architects to suffer than senior engineers, and so on up the chain but I'm just not seeing this magical demarcation line that a lot of people seem to think is going to materialize prior to the rising tide of AI starting to go above their own neck.
And this is likely to all happen very quickly, it was less than a year ago when LLMs were awful at producing code.
logic, rigor, judgement and taste
> The time may come, perhaps even soon, when AI takes over programming completely. But in the mean time, a programmer who embraces AI, yet is skeptical about everything it creates, is better-equipped than any comparably-skilled human in programming history.
The premise is flawed.
Your company has 3 layers: workers - managers - owners. When AI (actual AI, not "AI") takes over, that'll change into 2 layers: AI - owners.
How you're equipped doesn't matter because you're out of the picture. Not economically viable. Irrelevant.
> Speaking of goodness, I share the majority opinion that AI is generally good
I would like to know how the author concluded that this is the majority opinion.
Good read.
I have a friend. He's very passionate about trains, and he worked his entire career in and around them. Some time ago we had an incident where the entire railway in the region had to be shut down for a while because the computer managing the traffic broke. Big fuckup, because lots of people rely on the railway. My friend instantly started complaining with visible satisfaction: "See? Computers are shit! Back in my day everything was done manually and that was great! We never had such outages!". What my friend is incapable of understanding is that it's simply not possible to manage traffic at scale using humans, and it's much better and safer to do this with computers. When he was young, there was much less traffic, and delays were much more frequent.
The core issue is that he doesn't really understand computers, and because he's old, there's no chance he'll learn a new technology, so he's very distrustful towards computers. Not to mention that modern UX is about enshittification, so it does take some skill to navigate the technology, especially mobile - he trips over things that I consider extremely basic like "this is an ad, don't click this".
I have a similar feeling when reading discussions about AI on this website. Most people here refuse to appreciate AI because "back in my day...". Well, the future is here, old man, adapt or perish. Programming using AI is a completely new paradigm that requires a completely new approach and new skills. Either you learn them or you'll be left behind, whether you like it or not. Back in the 2010s, all you had to do to get a lucrative software job was show that you knew how to use Google, because most people refused to do even that. I suspect that when the dust settles, something similar will happen with AI, and people who know how to extract maximum value from it will be rewarded handsomely.
> Just a few years ago, AI essentially could not program at all. In the future, a given AI instance may “program better” than any single human in history. But for now, real programmers will always win.
For how long? Do I get to feel smug about this for 10 days, 10 weeks, or 10 years? That radically changes the planned trajectory of my life.
These posts are just programmers trying to understand their new place in the hierarchy. I'm in the same place and get it, but truisms like 'will always win' are basically just throwing a wild guess at what the future will look like. A better attitude is to attempt to catch the wave.
TFA's author is literally saying it may happen. He's using AI so he already caught the wave. He's augmenting himself with AI tools. He's not saying "AI will never surpass humans at writing programs". He writes:
> At this particular moment, human developers are especially valuable, because of the transitional period we’re living through.
You and GP are both attacking him on a strawman: it's not clear why.
We're seeing endless AI slop, enshittification, and lower uptime for services day after day.
To anyone using these tools seriously on a daily basis, it's totally obvious that there are, today, shortcomings.
TFA doesn't talk about tomorrow. It talks about today.
To be fair, the author phrased his point poorly in a way that invites confusion:
> "But for now, real programmers will always win."
"for now ... always", not a good phrasing.