So, in the 80s it took a team of ~10 people at Namco around a year and a half to develop Pac-Man. Nowadays a single engineer could probably rebuild it solo in a day or a weekend. Does that mean engineers are out of jobs? No. Software gets easier to build, but humans always want more.
So in X years, when building a Figma might take one or two people a week of development effort, society will expect "better" (more complex) software. And there I will be, ready to build that software (probably using AI as yet another tool). The AI available in X years will probably do wonders, but it won't be a silver bullet for the software demanded in those days. Our needs are always ahead of what technology can provide; that's why technology keeps evolving. It's silly to think that we'll reach a ceiling and that a given tech will do everything from that point onwards.
I've been a super early adopter of AI. I basically had something like an MCP server up in the early days of ChatGPT (basically a bunch of prompts that guided the model to spit out specific data for the wrapper to catch and execute).
I can safely say that the way things are heading, AI is not going to take my job as a software engineer. Nobody is building AI. Everyone is building effectively explicitly coded agents that all still generally can't reason.
The only thing that AI has improved is the time from knowing what you want to getting it into working code. Which is significant. But for any time improvements that this gives, it also means that things can be written wrong faster, which means more time spent later on undoing it.
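The wrapper described above can be sketched roughly like this: the prompt tells the model to wrap any command in a sentinel block, and the host program parses and dispatches whatever it finds. The sentinel format, handler names, and everything else here are hypothetical illustrations, not the commenter's actual setup.

```python
import json
import re

# Hypothetical sentinel: the prompt instructs the model to emit commands
# as JSON inside <action>...</action> tags.
ACTION_BLOCK = re.compile(r"<action>(.*?)</action>", re.DOTALL)

def handle_search(query: str) -> str:
    # Stand-in for a real capability the wrapper would expose.
    return f"searched for {query!r}"

HANDLERS = {"search": handle_search}

def dispatch(model_output: str) -> list[str]:
    """Find sentinel-wrapped JSON actions in the model's text and run them."""
    results = []
    for raw in ACTION_BLOCK.findall(model_output):
        action = json.loads(raw)
        handler = HANDLERS[action["name"]]
        results.append(handler(action["argument"]))
    return results

reply = 'Sure, let me look. <action>{"name": "search", "argument": "pac-man"}</action>'
print(dispatch(reply))  # -> ["searched for 'pac-man'"]
```

This is essentially what modern tool-calling APIs formalize; early on, people approximated it with prompt conventions and regex parsing like the above.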
From my data of ~1.8M job postings, listings titled "Software Engineer" are down ~25% and "Senior Software Engineer" listings are down ~10%, comparing the first two months of 2026 with the same period in 2025.
We have no idea what the world's going to look like in 5 years. Maximize your ability to adapt, grow, learn and get things done. Any plan you make today is going to be worse than a plan made 3 years from now with more information.
I saw the writing on the wall when ChatGPT was first released. I'm a good 2 years into a pivot to become an online creator who makes weird web experiences and content.
I am knee-deep in a challenge to build 25 projects in 25 weeks. I'm working on project #20 today.
I hope to build an audience through Patreon, sponsorships, and monetizing some of the projects. Maybe a micro SaaS or two? Maybe a newsletter or two?
My plan is to learn how to use AI and be proficient with it, so they will be more likely to keep me, as opposed to those who refuse to use it. Some people on my team still refuse to use AI, thinking you can't trust anything since it's all hallucinations.
There's a lot of clarity to be gained by decomposing the question of "a living" into its parts.
How do you plan to continue the processes of gaining sustenance and economic power as needed? Do you own the land or capital needed for your sustenance or do you need to trade for them? If you need to trade, what assets or capabilities do you have to trade?
How well are these guarded against theft, cyberattack, romance scams against you, quasi-legal expropriation and changes in the legal system that eliminate your status of having property rights?
We'd better hope that robotic replacements for human hands aren't soon in coming.
The premise is flawed. I don't see AI taking jobs, because companies generally want to grow productivity and would rather just make their existing employees do more; all productivity saving technologies have always been net positive in the amount of jobs created.
> all productivity saving technologies have always been net positive in the amount of jobs created
To what extent has the net increase in jobs been because there have just been more people who needed to work in order for society to not collapse?
Population growth is slowing (world population is expected to peak around 2080). To some, AI feels like a different sort of "productivity enhancer" than we've seen in the past.
I don't think the person's premise is flawed. It's more that you just disagree with it.
The counterargument to that is what happened with horses. Since domestication, every advance of human civilization led to there being more horses. Until cars were invented and improved, which almost overnight eliminated 90% of the horses in use.
So the fact that in the past new technologies have created new jobs is not a guarantee that AI will create new jobs.
On top of that, look at what happened during the industrial revolution in Britain. You'd have a village with 2,000 workers producing clothes or the materials to make them. A rich man opens a factory in that village that employs 200 people and produces more than the whole village did before. 90% are unemployed, and the 10% who work in the factory have far worse working conditions. Studies of graves from that era show that average height declined during the industrial revolution, because conditions in the factories were far worse than what people had before. Hence the Luddite movement; but as the rich owned the mass media, the Luddites were portrayed as crazy. Eventually, many years later, new and better jobs did appear.
I'm trying to save money and could take an early retirement in 2-3 years if necessary. If I'm fired before that, and I can't find a job as an SWE or a manager, I think I could teach.
But who knows if your job will be gone. There could be more jobs in order to fix the tech debt and AI slop. I'm using Claude daily to write all my code, I wouldn't bet my life on the fact I'm more productive individually, and I don't think my team/company is. But we'll see.
IMHO no one knows what the heck is going to happen. We'll probably have to adapt to the situation as needed... In other words, it's too hard to predict, so it's hard to come up with a plan ahead of time?
I remember saying ~2 years ago that people should probably assume the role they're in would be their last programming job. I feel like that was an unreasonably good prediction given almost everyone on HN at the time was arguing that LLMs were just stochastic parrots.
I suspect people will still be needed to create software for some time but increasingly it won't be software engineers who have spent decades learning syntax. It will be a relatively poorly paid role compared to today. If you're happy to be paid a fraction of what you're paid today, perhaps you can transition into a vibe coder role.
There will probably be some more senior people maintaining projects, but the number of people like this you'd realistically need at any org is probably in the single digits. The main hiring criterion will be that they're very personable, not the typical anti-social engineer type. The autistic 10x engineer's value is basically zero in this new AI economy.
Longer term (~10-20 years) we're all dead anyway so your priority probably shouldn't be to optimise for income or prestige.
There's nothing you can do against civilizational movement. Most people exist because of other people. When those other people lose jobs, so will you.
I plan to do the same thing I did the last several times new automation took my old job: carry right along making software.
Jevons' paradox tells us that AI will create more demand for software than ever. I see no threat to my career here.
> Maybe a micro SaaS or two? Maybe a newsletter or two?
No offence, but I'm not sure a pivot into micro SaaS software development or newsletter writing is much of a plan for AI job destruction.
Probably right, but I'd rather be 1-2 years into figuring this out than starting now. I have a lot of traction and I've built up an audience.
UBI
fingers crossed!
If AI takes enough jobs from people, we will all get together, start a revolution, and overthrow our respective governments. Desperate people will do desperate things to survive. There are always more of us than them.
I truly hope that it happens in my lifetime. I really am sick of this shitty world.
It's not what I was promised 69 years ago when I was born.
The governments may have by that time armies of drones and robots, controlled by a few loyal people or AI.