The next two years of software engineering

(addyosmani.com)

191 points | by napolux 15 hours ago ago

189 comments

  • maciejzj 2 hours ago ago

    TBH, it all feels like a huge gamble at this point. Neither skills, education, institutional ties, nor current employment can guarantee a stable foundation for life.

    This hits harder depending on how much money, social capital, or debt you accumulated before this volatility began. If you’ve paid off your debts, bought a house, and stabilized your family life, you’re gambling with how comfortable the coming years will be. If you’re a fresh grad with student debt, no house, and no social network, you’re more or less gambling with your life.

    • schnitzelstoat an hour ago ago

      I felt a lot safer when I was a young grad than now that I have kids to support and I can't just up and move to wherever the best job opportunity is or live off lentils to save money or whatever.

      • maciejzj an hour ago ago

        Yeah, kids change the landscape a lot. On the other hand, if you don't have any personal ties, it's easier to grab opportunities, but you are unlikely to build any kind of social network while chasing jobs all over the country/world.

        Either way, there is little to no path toward the "family + place to live + stable job" model.

    • rwmj 25 minutes ago ago

      Work on becoming Financially Independent. The best time to start was when you started your career, the second best time to start is now.

      • pepperball 23 minutes ago ago

        Yeah really seems like the only way to win (or rather not lose) is simply not to play.

        At this point I’ve realized I need to cast all other ambitions aside and work on getting some out-of-the-way land that I own.

  • afro88 an hour ago ago

    > The bottom line: Junior developer hiring could collapse as AI automates entry-level tasks

    If AI automates entry-level tasks starting today, that just means "entry-level" means something different now. It doesn't mean entry-level ceases to exist. Entry-level as we know it may end, but not entry-level in general.

  • babblingfish 12 hours ago ago

    My experience hasn't been that LLMs automate coding, just that they speed it up. It's like I know what I want the solution to be and I'll describe it to the LLM, usually for specific code blocks at a time, and then build it up block-by-block. When I read Hacker News, people are talking like it's doing much more than that. It doesn't feel like an automation tool to me at all. It just helps me do what I was gonna do anyway, but without having to look up library function calls and language-specific syntax.

    • Aurornis 12 hours ago ago

      > My experience hasn't been that LLMs automate coding, just that they speed it up.

      This is how basically everyone I know actually uses LLMs.

      The whole story about vibecoding and LLMs replacing engineers has become a huge distraction from the really useful discussions to be had. It’s almost impossible to discuss LLMs on HN because everyone is busy attacking the vibecoding strawman all the time.

      • miki123211 4 hours ago ago

        As a professional programmer, I think both are useful in different scenarios.

        You're maintaining a large, professional codebase? You definitely shouldn't be vibe coding. The fact that some people are is a genuine problem. You want a simple app that you and your friends will use for a few weeks and throw away? Sure, you can probably vibe code something in 2 hours instead of paying for a SaaS. Both have their place.

        • iknowSFR 2 hours ago ago

          I’m seeing vibe coding redefine what the product manager does: specifically, adding solution execution to the PM's existing strategy and decision-making responsibilities. The PM puts solutions in front of a customer and sees what sticks, then hands the concept over to engineering to bake into the larger code base. The primary change here is no longer relying on interviews and research to make product decisions that engineering spends months building, only to have them flop when they hit the market. The PM is now required to build and test dozens of solutions before anything makes its way to engineering resources. How engineering builds the overall solution is still under their control, but the fit is validated before it hits their desk.

        • phn 2 hours ago ago

          I think the problem starts with the name. I've been coding with LLMs for the past few months, but most of it is far from "vibed": I am constantly reviewing the output and guiding it in the right direction. It's more like a turbocharged code editor than a "junior developer", imo.

      • falloutx 3 hours ago ago

        > The whole story about vibecoding and LLMs replacing engineers has become a huge distraction

        Because the first thing that comes from individual speed-up is not engineers making more money but there being fewer engineers. How many fewer is the question: would they be satisfied with 10%, 50%, or maybe 99%?

        • spacebanana7 3 hours ago ago

          Generally the demand for software engineers has increased as their productivity has increased, looking back over the past few decades. There seems to be effectively infinite demand for software from consumers and enterprises so the cheaper it gets the more they buy.

          If we doubled agricultural productivity globally we'd need to have fewer farmers because there's no way we can all eat twice as much food. But we can absolutely consume twice as much CSS, try to play call of duty on our smart fridge or use a new SaaS to pay our taxes.

          • zelphirkalt 11 minutes ago ago

            Oh but we can absolutely let all that food go to waste! In many places unbelievable amounts of food go to waste.

            Actually, most software either is garbage or goes to waste at some point too. Maybe that's too negative. Maybe one could call it rot or becoming obsolete or obscure.

          • lelanthran 2 hours ago ago

            > Generally the demand for software engineers has increased as their productivity has increased, looking back over the past few decades. There seems to be effectively infinite demand for software from consumers and enterprises so the cheaper it gets the more they buy.

            I see this fallacy all the time but I don't know if there is a name for it.

            I mean, we used to make fun of MBAs for saying the same thing, but now we should be more receptive to the "Line Always Goes Up" argument?

            • kasey_junk 2 hours ago ago

              Jevons paradox, and it’s not a fallacy. It’s an observable behavior. The problem is it’s not predictive.

              • lelanthran 2 hours ago ago

                > Jevons paradox and it’s not a fallacy. It’s an observable behavior. The problem is it’s not predictive.

                I was referring specifically to this point, which, IMHO, is a fallacy:

                >>> There seems to be effectively infinite demand for software from consumers and enterprises so the cheaper it gets the more they buy.

                There is no way to use the word "infinite" in this context, even if qualified, that is representative of reality.

                • rowanajmarshall an hour ago ago

                  As much as I appreciate the difference between literal infinity and consumers' demand for software, there's just so much bad software out there waiting to be improved that I can't see us hitting saturation soon.

                • falloutx an hour ago ago

                  The consumer internet is mostly propped up by white-collar people buying stuff online and clicking on ads. Once the cutting starts, the whole internet economy just becomes a money-swapping machine between 7 VC groups.

                  The demand for paid software is decreasing because these AI companies are saying "Oh, don't buy that SaaS product because you can build it yourself now".

          • Ragnarork 2 hours ago ago

            This reasoning is flawed in my opinion, because at the end of the day the software still has to be paid for (by the people who want/need to make a living out of it), and customers' wallets are finite.

            Our attention is also a finite resource (24h a day max). We already see how this has been the cause of the enshittification of large swathes of software like social media, where grabbing attention for a few seconds more drives the main innovation...

          • kace91 2 hours ago ago

            The demand for software has increased. The demand for software engineers has increased proportionally, because we were the only source of software. This correlation might no longer hold.

            Depending on how the future shapes up, we may have gone from artisans to middlemen, at which point we're only in the business of added value and a lot of coding is over.

            Not the Google kind of coding, but the "I need a website for my restaurant" kind, or the "I need to aggregate data from these Excel files in a certain way" kind. Anything where you'd accept cheap and disposable. Perhaps even the traditional startup, if POCs are vibecoded and engineers are only introduced later.

            Those are huge businesses, even if they are not present in the HN bubble.

            • falloutx 2 hours ago ago

              > "I need a website for my restaurant" kind, or the "I need to aggregate data from these excel files in a certain way" kind

              I'm afraid those kinds of jobs were already gone by 2015. No-code website makers have been available since then, and if you can't do it yourself you can just pay someone on Fiverr and get it done for $5-50 at this point; it's so efficient even AI won't be more cost-effective than that. If you have $10k saved you can hire a competitive agency to maintain and build your website. This business has been completely taken over by low-cost Fiverr automators, and by agencies for high-budget projects. Agencies have become so good now that they manage websites for everyone from Adidas to Lando Norris to your average mom & pop store.

      • throwaway6734 39 minutes ago ago

        I do software engineering at a research-oriented institution, and there are some projects I can now prototype without writing a line of code. The productivity benefits are massive.

      • latexr 3 hours ago ago

        > It’s almost impossible to discuss LLMs on HN because everyone is busy attacking the vibecoding strawman all the time.

        What’s “the vibecoding strawman”? There are plenty of people on HN (and elsewhere) repeatedly saying they use LLMs by asking them to “produce full apps in hours instead of weeks” and confirming they don’t read the code.

        Just because everyone you personally know does it one way, it doesn’t mean everyone else does it like that.

        • Chris2048 3 hours ago ago

          I'd assume the straw-man isn't that vibe-coding (vbc) doesn't exist, but that all/most ai-dev is vbc, or that it's ok to derail any discussion on ai-assisted dev with complaints applicable only/mainly to vbc.

          • latexr 3 hours ago ago

            Neither of those would be a strawman, though. One would be a faulty generalization and the other is airing a grievance (could maybe be a bad faith argument?).

            https://en.wikipedia.org/wiki/Faulty_generalization

            Though I get that these days people tend to use “strawman” for anything they see as a bad argument, so you could be right in your assessment. Would be nice to have clarification on what they mean.

            • Chris2048 2 hours ago ago

              Hmm, if the purpose of either is so an "easier" target can be made, I think it could still qualify as a straw-man; I think an accusation of straw-manning is in part an accusation of another's intent (or bad faith - not engaging with the argument).

              • latexr an hour ago ago

                > Hmm, if the purpose of either is so an "easier" target can be made, I think it could still qualify as a straw-man

                Good point.

                > I think an accusation of straw-manning is in part an accusation of another's intent (or bad faith - not engaging with the argument).

                There I partially disagree. Straw-manning is not engaging with the argument but it can be done accidentally. As in, one may genuinely misunderstand the nuance in an argument and respond to a straw man by mistake. Bad faith does require bad intent.

      • kylecazar 12 hours ago ago

        Half strawman -- a mudman, perhaps. Because we're seeing proper experts with credentials jump on the 'shit, AI can do all of this for me' realization blog post train.

        • Chris2048 3 hours ago ago

          Which experts?

          • kylecazar 18 minutes ago ago

            Well, I have a lot of respect for antirez (Redis), and at the time of my writing this comment he had a front page blog post in which we find:

            "Writing code is no longer needed for the most part."

            It was a great post and I don't disagree with him. But it's an example of why it isn't necessarily a strawman anymore, because it is being claimed/realized by more than just vibecoders and hobbyists.

          • nl 3 hours ago ago

            Does Linus Torvalds count?

            • ruszki 2 hours ago ago

              When has he stated that he uses AI like that? The last I heard about him a month ago, he specifically stated that he was not interested in AI to write code: https://www.zdnet.com/article/linus-torvalds-ai-tool-maintai...

              • Philpax 2 hours ago ago

                3 days ago: https://github.com/torvalds/AudioNoise/blob/main/README.md

                > Also note that the python visualizer tool has been basically written by vibe-coding. I know more about analog filters -- and that's not saying much -- than I do about python. It started out as my typical "google and do the monkey-see-monkey-do" kind of programming, but then I cut out the middle-man -- me -- and just used Google Antigravity to do the audio sample visualizer.

                • darkwater an hour ago ago

                  For me there are two noteworthy things in that repo:

                  * the README was clearly neither written by an LLM nor LLM-aided

                  * he still uses GPLv2 (not 3) as the license for his works

        • eaurouge 12 hours ago ago

          So another strawman?

    • conartist6 36 minutes ago ago

      The best advice to juniors is "do not use AI!"

      Dunno why the author thinks an AI-enhanced junior can match the "output" of a whole team, unless he means in generating lines of code, which is to say tech debt.

      Being able to put a lot of words on screen is not the accomplishment in programming. It usually means you've gone completely out of your depth.

    • noufalibrahim 5 hours ago ago

      I'm somewhere in between myself. Before LLMs, I used to block a few sites that distracted me by adding entries to the /etc/hosts file on my work machine, mapping them to 127.0.0.1. I also made the file immutable so that it would take a few steps for me to unblock the sites.

      The next step was to write a cron job that would reapply the chattr +i and rewrite the file every 5 minutes. Sort of an enforcer. I used Claude (web) to write this and cut/pasted it, just because I didn't want to bother with bash syntax that I've learned and forgotten several times.
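      A minimal sketch of what such an enforcer could look like, assuming a POSIX shell; the site name, function name, and paths are illustrative, since the comment doesn't show the actual script:

```shell
#!/bin/sh
# Sketch of the "enforcer": re-add blocked sites to a hosts file and
# re-apply the immutable flag. Site names and paths are illustrative.
enforce() {
  hosts="$1"; shift
  chattr -i "$hosts" 2>/dev/null || true   # clear the immutable bit (needs root)
  for site in "$@"; do
    # append a 127.0.0.1 mapping unless the site is already listed
    grep -q " $site\$" "$hosts" || printf '127.0.0.1 %s\n' "$site" >> "$hosts"
  done
  chattr +i "$hosts" 2>/dev/null || true   # make the file immutable again
}

# Demo on a scratch file rather than the real /etc/hosts:
tmp=$(mktemp)
enforce "$tmp" news.ycombinator.com
cat "$tmp"   # -> 127.0.0.1 news.ycombinator.com
```

      A crontab entry like `*/5 * * * * /usr/local/bin/enforcer.sh` would then reapply the block every five minutes, as described.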

      I then wanted something stronger and looked at publicly available things like pluckeye but they didn't really work the way I wanted. So I tried to write a quick version using Claude (web) and started running it (October 2025). It solved my problem for me.

      I wanted a program to use aider on, and I started with this. Every time I needed a feature (e.g. temporary unblocks, preventing tampering and uninstalling, blocking in the browser, violation tracking, etc.), I wrote out what I wanted and had the agent do it. Over the months, it grew to around 4k lines (a single file).

      Around December, I moved from aider to Claude Code and continued doing this. The big task I gave it was to refactor the code into smaller files so that I could manage context better. It did this well and added tests too (late December 2025).

      I added a helper script to update URLs to block from various sources. Vibe-coded too. Worked fine.

      Then I found it hogging memory because of some crude mistakes I vibe-coded early on, and fixed that. Cost me around $2 to do so (Jan 2026).

      Then I added support to lock the screen when I crossed a violation threshold. This required some Xlib code to be written. I'm sure I could have written it, but it's not really worth it: I know what to do, and doing it by hand wouldn't really teach me anything except the innards of a few libraries. So I added that.

      So, in short, this is something that's 98% AI-coded, but it genuinely solves a problem for me and has helped me change my behaviour in front of a computer. My research didn't turn up any companies that offer this as a service for Linux. I know what to do but don't have the time to write and debug it. With AI, my problem was solved and I have something which is quite valuable to me.

      So, while I agree with you that it isn't an "automation tool", the speed and depth it brings to the environment have opened up possibilities that didn't previously exist. That's the real value, and the window through which I'm exploring the whole thing.

      • falloutx 3 hours ago ago

        It seems alright, but I wonder if it crashes the economy for the vast majority of internet businesses. I personally run some tool websites, like ones to convert images or cut videos, and the traffic for now seems stable, but my tools don't target devs. Most likely you didn't actually need it, but who am I to judge; I just find myself doing random projects because it "takes less time".

    • jvans 12 hours ago ago

      I notice a huge difference between working on large systems with lots of microservices and building small apps or tools for myself. The large-system work is what you describe, but for small apps and tools I resonate with the "automates coding" crowd.

      I've built a few things end to end where I can verify the tool or app does what I want and I haven't seen a single line of the code the LLM wrote. It was a creepy feeling the first time it happened but it's not a workflow I can really use in a lot of my day to day work.

    • trueismywork 6 hours ago ago

      You can think of LLMs as a higher level language for whatever programming language you are using, but informal with ambiguous grammar.

      • noufalibrahim 5 hours ago ago

        I don't think that works. The fact that it can produce different output for the same input, its usage of tools, etc., don't really fit into the analogy or mental model.

        What has worked for me is treating it like an enthusiastic intern with his foot always on the accelerator pedal. I need to steer and manage the brakes; otherwise it'll code itself off a cliff and take my software with it. The most workable model is a pair programmer. For trivial changes and repeatedly "trying stuff out", you don't need to babysit. For larger pieces, it's good to make each change small and review what it's trying.

        • therealpygon 2 hours ago ago

          I feel like some of the frontier models are approaching run-of-the-mill engineer who does dumb stuff frequently. That said, with appropriate harnessing, it’s more like go-karts on a track; you can’t keep them out of the wall, but you can reset them and get them back on a path (when needed). Not every kart ends up in the wall, but all of them want to go fast, so the better defined the track is the more likely the karts will find a finish line. Certainly more likely than if you just stuck them in a field with no finish line and said “go!”.

          • noufalibrahim 2 hours ago ago

            I don't really know if I agree with you but the analogy is really good. :)

      • 334f905d22bc19 3 hours ago ago

        "On the foolishness of 'natural language programming'" - prof. dr. Edsger W. Dijkstra

        https://www.cs.utexas.edu/~EWD/transcriptions/EWD06xx/EWD667...

        • djeastm 16 minutes ago ago

          >We would need all the intellect in the world to get the interface narrow enough to be usable, and, in view of the history of mankind, it may not be overly pessimistic to guess that to do the job well enough would require again a few thousand years.

          It seems it only took until about 2023 or so

    • antonymoose 12 hours ago ago

      It’s a better Google for me. Instead of searching AWS or StackOverflow, it hallucinates a good-enough output that I can refactor into what I need.

      • bryanrasmussen 6 hours ago ago

        The reason it is better is that with search you have to narrow things down to one specific part of what you are trying to do at a time. For example, if you need a unique-id-generating function as part of a task, you first search for that; then, if you need to make sure whatever gets output is a responsive 3-column layout, you search for that; and then you write code to glue the pieces together into what you need. With AI you can ask for all of this together, get something about as good as the searched-for results would have been, and do your glue code and fixes as you normally would.

        If a bit of functionality would have taken four searches, it trims the time requirement by roughly three of them.

        It does, however, remove the benefit of having done the searches, which is that you see the various results and may find that a secondary result is better. You no longer get that benefit. Tradeoffs.

        • thunspa 38 minutes ago ago

          I resonate with the phrase: "You never learn to ask good questions"

    • lovich 3 hours ago ago

      All I know is that firing half my employees and never hiring entry level people again nets me a bonus next quarter.

      Not really sure why this article is talking about what happens 2 years from now since that’s 8 times longer than anything anyone with money or power cares about.

      • falloutx 3 hours ago ago

        What a benevolent bossman here, keeping 50% of the jockeys around this quarter. He is probably sacrificing one of his yachts for this.

        • thfuran 3 hours ago ago

          He’s keeping some around so he can fire half again next quarter for another bonus. That’s the sort of forward-thinking strategic direction that made him the boss man.

          • cppluajs 2 hours ago ago

            So log(N) times the bonus. Very smart boss here.

    • petesergeant 11 hours ago ago

      I’m doing both. For production code that I care about, I’m reading every line the LLM writes, correcting it a lot, chatting with an observer LLM who’s checking the work the first LLM and I are writing. It’s speeding stuff up, it also reduces the friction on starting on things. Definitely a time saver.

      Then I have some non-trivial side projects where I don’t really care about the code quality, and I’m just letting it run. If I dare look at the code, there’s a bunch of repetition. It rarely gets stuff right the first time, but that’s fine, because it’ll correct it when I tell it it doesn’t work right. Probably full of security holes, code is nasty, but it doesn’t matter for the use-cases I want. I have produced pieces of software here that are actively making my life better, and it’s been mostly unsupervised.

  • Havoc 7 minutes ago ago

    One of the better analyses of this question, I think.

    On the optimistic-take side: I suspect it might turn out that software gets infused into more niches, but I'm not sure it follows that this helps on the job-market side. Or, put differently, demand for software and demand for SWEs might decouple somewhat for much of that additional software demand.

  • osigurdson 10 hours ago ago

    >> The skillset is shifting from implementing algorithms to knowing how to ask the AI the right questions and verify its output.

    The question is, how much faster is verification alone vs. writing the code by hand? You gain a lot of understanding when you write the code yourself, and understanding is a prerequisite for verification. The idea seems to be that a quick "LGTM" review is all that should be needed. That's fine as long as you understand the tradeoffs you are making.

    With today's AI you either trade speed for correctness or you have to accept a more modest (and highly project specific) productivity boost.

    • kace91 2 hours ago ago

      And there are a ton of human incentives here to take shortcuts in the review part. The process almost pushes you to drop your guard: you spend less physical time observing the code while you write, you get huge chunks of code dropped on you, iterations change a lot of code at once so it's hard to keep a mental model, and there's FOMO involved about the speed gain you're supposed to get... We're going to see worse review quality just as a matter of the UX and friction of the tool.

      • judahmeek an hour ago ago

        Yes! It depends on the company, of course, but I think plenty of people are going to fall for the perverse incentives while reviewing AI output for tech debt.

        The perverse incentives being that tech debt is non-obvious & therefore really easy to avoid responsibility for.

        Meanwhile, velocity is highly obvious & usually tied directly to personal & team performance metrics.

        The only way I see to resolve this is strict enforcement of a comprehensive QA process during both the planning & iteration of an AI-assisted development cycle.

        But when even people working at Anthropic are talking about running multiple agents in parallel, I get the idea that CTOs are not taking this seriously.

  • misja111 2 hours ago ago

    > Senior developers: Fewer juniors means more grunt work landing on your plate

    I'm not sure I agree with that. Right now as a senior my task involves reviewing code from juniors; replace juniors with AI and it means reviewing code from AI. More or less the same thing.

    • thw_9a83c 2 hours ago ago

      > More or less the same thing.

      Worse. The AI doesn't share any responsibility.

      • cheschire 2 hours ago ago

        And can’t be mentored by the senior except in some ersatz flat text instruction files.

        • icedrift 2 hours ago ago

          And the mistakes AI makes don't carry the same code smells that juniors' mistakes do. There are markers in human code that signal how well the author understood the problem; AI code more often looks correct at a glance even when it's horribly problematic.

    • girvo 2 hours ago ago

      The juniors get better and closer to the ideal that my team requires via this process. Current AIs don’t, not in the same way.

    • groguzt 11 minutes ago ago

      Currently my job as a junior is to review vibe code that was "written" by seniors. It's just such bullshit, and they make mistakes I wouldn't even have dared to make in my first year of school.

  • FrustratedMonky 2 minutes ago ago

    Maybe a harsh criticism. The article seemed to be all over the place, maybe because the subject is also all over the place. I agree with everything; it's just that it seemed like the same story we've been in for a while.

    Wasn't the main takeaway generally "study everything even more than you were, talk to/network with everybody even more than you were, and hold on. Work more, more, more"?

  • Jean-Papoulos 3 hours ago ago

    >The flip scenario: AI unlocks massive demand for developers across every industry, not just tech. Healthcare, agriculture, manufacturing, and finance all start embedding software and automation.

    I find this one hard to believe. Software is already massively present in all these industries and has already replaced jobs. The last step is complete automation (i.e. drone tractors that can load up at a hub, go to the field, and spray all by themselves), but the bottleneck for this isn't "we need more code"; it's real-world issues that I don't see AI helping to solve (political ones, notably).

    • Eupolemos 2 hours ago ago

      I think there's a good chance demand goes up in Europe.

      We are going to need to de-risk our software dependencies, and Germany is going to need to use computers.

      Germany is going to be crazy, I think.

      • cheschire 2 hours ago ago

        Germans were so quick to revert back to paper after COVID that it felt like one of the only reasons they came out of lockdown eventually was to get paper back.

        The Gewerkschaft (trade union) tactics to resist AI are what I’m really interested in seeing.

    • alansaber 2 hours ago ago

      Agree, people were already worried about the excessive focus on software over physical technology well before LLMs significantly reduced the barrier to entry.

  • stack_framer 8 hours ago ago

    Funny that he mentions people not pivoting away from COBOL. My neighbors work for a bank, programming in COBOL every day. When I moved in and met them 14 years ago, I wondered how much longer they would be able to keep that up.

    They're still doing it.

    • bossyTeacher 6 hours ago ago

      The market can stay irrational longer than you can stay solvent

      • bryanrasmussen 6 hours ago ago

        it sounds like these people are staying solvent as long as the market stays irrational.

      • webdevver 3 hours ago ago

        To be fair, that COBOL program has probably been working for 30 years (maybe even longer than that). That's unusually reliable and long-lived for a software project.

        The only real contender in this regard is the Win32 API, and that did get used in enterprise for a long time too, before the major shift to cloud and Linux in the mid-2010s.

        Ultimately the proof is in the real-world use, even if it's ugly to look at. I'd say, even as someone who is a big fan of Linux, if I were given a 30-year-old obscure software stack that did nothing but work, I would be very hesitant to touch it too!

  • hncoder12345 7 hours ago ago

    Sometimes I wonder if I made the wrong choice with software development. Even after getting to a senior role, according to this article, you're still expected to get more education and work on side projects outside of work. Am I supposed to want to code all the time? When can I pursue hobbies, a social life, etc.?

    • johnfn 7 hours ago ago

      To put it very directly: if you are OK with being good but not exceptional at your job, this is totally fine. If you want to be exceptional you will probably need to put in the extra work. Not everyone is OK with this tradeoff, and it's totally fine to "just" be good and care more about having outside hobbies, a social life, etc.

      • hncoder12345 an hour ago ago

        I had a period of time where I really wanted to be exceptional. I spent many hours studying and working on side projects but it just never really clicked. I think I'm decent at what I do for work but more complicated topics (graphics programming, low level memory management, etc.) just seem to not stick, no matter how many hours I put into studying. Sometimes it feels like I'm forcing this career but after this many years it's hard to give it up. I do still enjoy it but I don't think I'll ever really get it.

    • tumetab1 3 hours ago ago

      Since you're getting into a senior role, learn the mantra: it depends :D

      The usual trade-off of a well-paid software development job is a lack of job security and always learning - the skill set is always changing, in contrast with other jobs.

      My suggestion: stop chasing trends and start listening to mature software developers to get a better perspective on what's best to invest in.

      And why is the mantra always true?

      You can find a stable job (slow-moving company) doing basic software development, just learn something new every 4 years, and then change companies.

      Or never change company and be the default expert, because everyone else is changing jobs; get job security, work fewer hours, and have time within your job to uplift your skills.

      Or keep chasing the latest highly paid jobs/trends by sacrificing your off time.

      What's the best option for you? Only you know; it depends on your own goals.

    • francisofascii 43 minutes ago ago

      >I made the wrong choice with software development.

      I wonder what the best decision would have been. What job is AI-immune and has a stable 40-hour week, no overtime, and decent pay? Teaching? Nursing?

    • qsera 3 hours ago ago

      >I made the wrong choice with software development.

      If you didn't like working with computers, then you (and another gazillion people who choose it for the $$$) probably made the wrong choice.

      But totally depends on what you wanted to get out of it. If you wanted to make $$$ and you are making it, what is the problem? That is assuming you have fun outside of work.

      But if you wanted to be the best at what you do, then you've got to love what you're doing. Maybe there are people who have superhuman discipline, but for normal people, loving what they do goes a long way towards that end.

      • _heimdall an hour ago ago

        > If you didn't like working with computers, then you probably made the wrong choice.

        This doesn't match what I have seen in other industries. Many auto mechanics I know drive old Buicks or Fords with the 4.6L V8 because the cars are reliable, and the last thing they want to do on a day off is work on their own car. I know a few people in other trades like plumbers, electricians, and chefs, and the pattern holds pretty well for them as well.

        You can enjoy working with computers and also enjoy not working in your personal time.

        • hncoder12345 an hour ago ago

          Exactly this. I love writing code and solving problems. In my 20s and very early 30s I worked a lot of long hours and tried my best to always be learning new things and upskilling but it's never ending. It's hard sometimes to not look back and think about the hours I spent on code instead of building stronger friendships and relationships.

      • menaerus 2 hours ago ago

        > If you didn't like working with <insert anything>, then you ...

        This type of argument can hold for any profession and yet we aren't seeing this pattern much in other white-collar professions. Professors, doctors, economists, mechanical engineers, ... it seems like pretty much everybody made the wrong choice then?

        I think this is the wrong way to look at it. OP says that he invested a lot of time into becoming proficient in something that today appears to be very close to partial extinction.

        I think that the question is legit, and he's likely not the only person asking oneself this question.

        My take on the question is the ability to adapt and learn new skills. Some will succeed, some will fail, but staying in the status-quo position will more likely lead to failure than success.

        • hncoder12345 an hour ago ago

          Your first point hits the nail on the head. We are expected to have side projects and to keep up with new things (outside of work) but most other jobs don't have that. I would be okay with my work sending me off for additional training, on company time, but I don't want it to consume the time I have left after work.

    • jedberg 7 hours ago ago

      It's funny you should ask this. When I started out, 30 years ago, here were the answers you'd get from most people:

      > Am I supposed to want to code all the time?

      Yes.

      > When can I pursue hobbies,

      Your hobby should be coding fun apps for yourself

      > a social life, etc.

      Your social life should be hanging out with other engineers talking about engineering things.

      And the most successful people I know basically did exactly that.

      I'm not saying y'all should be doing that now, I'm just saying, that is in fact how it used to be.

      • misja111 2 hours ago ago

        > And the most successful people I know basically did exactly that.

        Well that depends heavily on how you define successful. Successful in life? I would tend to disagree, unless you believe that career is the only thing that counts. But even when career is concerned: the most successful people I know went on from being developer to some high end management role. The skills that brought them there definitely did not come from hanging out with other engineers talking about engineering things.

      • gofreddygo 3 hours ago ago

        Not my experience at all. The very notable engineers I know didn't do their most notable work because of engineering or coding skills. Instead, it was finding interesting problems, thinking a bit differently about something, doing something about it, and being approachable and available all along that made the difference.

        If all they did was code all the time, write code for fun and interacted mostly with other similar people, they probably wouldn't be the first choice for these projects.

      • MyFirstSass 3 hours ago ago

        That's not true at all.

        The ones who ace their careers are, for the most part, people who are fun, driven, or psychos - all social traits that make you good in a political game.

        Spending lots of time with other socially awkward types talking about hard math problems or whatever will get you nowhere outside of some SF fantasy startup movie.

        I'd say it's especially important for the more nerdy (myself included) to be more outgoing and do other stuff like sales, presentations, design/marketing or workshops - that will make you exceptional, because you then have the "whole package" and understand the process and other people.

      • KaiserPro 4 hours ago ago

        > You social life should be hanging out with other engineers talking about engineering things.

        Fuck. That.

        I worked at a FAANG; the successful people weren't the ones who did engineering, they were the ones who did politics.

        The most successful people were the ones that joined at the same time as the current VP.

        Your hobbies need to be fun, to you. Not support your career. If it's just there to support your career, it's unpaid career development, not a hobby. Should people not code in their free time? That's not for me to decide. If they enjoy it, and it's not hurting anyone, then be my guest.

        Engineers are generally useless at understanding what's going on in the real world, and they are also quite bad at communicating.

        do. fun. things.

        • hncoder12345 an hour ago ago

          I love your last point. I asked this question because I used to be the person that would spend 4+ hours after work every day trying to keep up with new tech and working on side projects. But now, I've gotten into art and it's really changed my perspective on things like this. I've spent many hours doing, as you call it, unpaid career development instead of pursuing hobbies, building up my friendships, and in general just having fun. It feels like I've taken life so seriously and I don't have much to show for it.

        • izacus 4 hours ago ago

          You just sound very angry your career isn't fun to you. I'm sorry.

  • ch4s3 12 hours ago ago

    > junior developer employment drops by about 9-10% within six quarters, while senior employment barely budges. Big tech hired 50% fewer fresh graduates over the past three years.

    This study showing a 9-10% drop is odd[1] and I'm not sure about their identification criteria.

    > We identify GenAI adoption by detecting job postings that explicitly seek workers to implement or integrate GenAI technologies into firm workflows.

    Based on that MIT study it seems like 90+% of these projects fail. So we could easily be seeing an effect where firms posting these GenAI roles are burning money on the projects in a way that displaces investment in headcount.

    The point about "BigTech" hiring 50% fewer grads is almost orthogonal. All of these companies are shifting hiring towards things where new grads are unlikely to add value, building data centers and frontier work.

    Moreover, the TCJA of 2017 caused software development costs to stop counting as immediate R&D tax write-offs (I'm oversimplifying), starting in 2022. This surely has more of an effect than whatever the "GenAI integrator role" postings correlate to.

    [1] https://download.ssrn.com/2025/11/6/5425555.pdf

    • wefzyn 10 hours ago ago

      AI became very popular suddenly. This is something that wasn't in anyone's budget. I believe cost savings from hiring freezes and layoffs are to pay for AI projects and infrastructure.

      • ch4s3 10 hours ago ago

        Right, so you shift budget away from other things. The "study" looked at AI integration job listings; you have to budget for those.

    • garbawarb 11 hours ago ago

      Hiring was booming until about 2020 though.

      • ch4s3 11 hours ago ago

        The TCJA change (of 2017) went into effect in 2022, I should have been more clear.

        • garbawarb 10 hours ago ago

          I didn't know that but that makes perfect sense. A lot of layoffs and outsourcing coincided with that. Are there any signs it'll be reintroduced?

          • ch4s3 10 hours ago ago

            It was, late last year.

  • reedf1 3 hours ago ago

    All well-documented knowledge fields will be gone if software goes. Then the undocumented ones will become documented, and they too will go. The best advice to junior devs is get a hands on job before robotic articulating sausages are perfected and humans become irrelevant blobs of watery meat.

  • danieltanfh95 4 hours ago ago

    The most useful thing juniors can do now is use AI to rapidly get up to the speed with the new skill floor. Learn like crazy. Self learning is empowered by AI.

    Engineers > developers > coders.

    • kace91 2 hours ago ago

      AI has a lot of potential as a personal, always on teaching assistant.

    It's also a 'skip intro' button for the friction that comes with learning.

    You're getting a bug? Just ask the thing rather than spending time figuring it out. You don't know how to start a project from scratch? Ask for scaffolding. Your first boss assigns you a ticket? Better not to screw up - hand it to the machine just to be safe.

      If those temptations are avoided you can progress, but I'm not sure that lots of people will succeed. Furthermore, will people be afforded that space to be slow, when their colleagues are going at 5x?

      Modern life offers little hope. We're all using uber eats to avoid the friction of cooking, tinder to avoid the awkwardness of a club, and so on. Frictionless tends to win.

    • haspok 2 hours ago ago

      That is quite some wishful thinking there. Most juniors won't care; they'll just vibe-code their way through.

      It takes extra discipline and willpower to force yourself to do the painful thing if there is a less painful way to do it.

    • auggierose 2 hours ago ago

      Scientists > engineers > developers > coders

    • amrocha 30 minutes ago ago

      Because employers famously hire based on skill and not credentials or experience

  • austin-cheney 14 hours ago ago

    I have been telling people that, titles aside, senior developers were the people not afraid to write original code. I don’t see LLMs changing this. I only envision people wishing LLMs would change this.

    • CSSer 14 hours ago ago

      I almost think what a lot of people are coming to grips with is how much code is unoriginal. The ones who've adjusted the fastest were humble to begin with. I don't want to claim the title, but I can certainly claim the imposter syndrome! If anything, LLMs validated something I always suspected: the amount of truly unique, relevant-to-success code in a given project is often very small. More often than not, it's not grouped together either; most of the time it's tailored to a given functionality.

      For example, a perfectly accurate Haversine distance is slower than an optimized one with tradeoffs. LLMs have not yet become adept at identifying the need for those tradeoffs in context, well or consistently, so you end up with generic code that works but isn't great. Can the LLM adjust if you explicitly instruct it to? Sure, sometimes! Sometimes it gets caught in a thought loop too. Other times you have to roll up your sleeves and do the work like you said, which often still involves traditional research or thinking.
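      A minimal sketch of that Haversine tradeoff (Python; the function names and test coordinates are my own illustration, not from the article): the exact formula pays for several trig calls per point pair, while a flat-earth approximation is cheaper and perfectly adequate over short distances.

```python
import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Exact great-circle distance via the haversine formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def equirectangular_km(lat1, lon1, lat2, lon2):
    """Cheaper flat-earth approximation; error is negligible at city scale."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    x = math.radians(lon2 - lon1) * math.cos((phi1 + phi2) / 2)
    y = phi2 - phi1
    return EARTH_RADIUS_KM * math.hypot(x, y)
```

      Over a city-scale distance the two agree to within meters; whether that error budget is acceptable is exactly the kind of contextual judgment being described.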

    • HarHarVeryFunny 14 hours ago ago

      I disagree.

      1) Senior developers are more likely to know how to approach a variety of tasks, including complex ones, in ways that work, and are more likely to (maybe almost subconsciously) stick to these proven design patterns rather than reinvent the wheel in some novel way. Even if the task itself is somewhat novel, they will break it down in familiar ways into familiar subtasks/patterns. For sure, if a task does require some thinking outside the box, or a novel approach, then the senior developer might have better intuition on what to consider.

      The major caveat to this is that I'm an old-school developer who started professionally in the early 80's, a time when you basically had to invent everything from scratch, so there is certainly no mental block to having to do so. I'm also aware there is at least a generation of developers that grew up with Stack Overflow and has much more of a mindset of building stuff using cut and paste, and less of having to sit down and write much complex/novel code themselves.

      2) I think the real distinction of senior vs junior programmers, that will carry over into the AI era, is that senior developers have had enough experience, at increasing levels of complexity, that they know how to architect and work on large complex projects where a more junior developer will flounder. In the AI coding world, at least for time being, until something closer to AGI is achieved (could be 10-20 years away), you still need to be able to plan and architect the project if you want to achieve a result where the outcome isn't just some random "I let the AI choose everything" experiment.

      • austin-cheney 13 hours ago ago

        I completely agree with your second point. For your first point, my experience tells me the people least afraid to write original code are the people least opposed to reinventing wheels.

        The distinguishing behavior is not about the quantity of effort involved but the total cost after consideration of dependency management, maintenance time, and execution time. The people that reinvent wheels do so because they want to learn and they also want to do less work on the same effort in the future.

      • BoiledCabbage 12 hours ago ago

        > in the early 80's, a time when you basically had to invent everything from scratch, so certainly there is no mental block to having to do so, and I'm aware there is at least a generation of developers that grew up with stack overflow and have much more of a mindset of building stuff using cut an paste, and less having to sit down and write much complex/novel code themselves.

        I think this is really underappreciated and was big in driving how a lot of people felt about LLMs. I found it even more notable on a site named Hacker News. There is an older generation for whom computing was new. 80s through 90s probably being the prime of that era (for people still in the industry). There constantly was a new platform, language, technology, concept to learn. And nobody knew any best practices, nobody knew how anything "should work". Nobody knew what anything was capable of. It was all trying things, figuring them out. It was way more trailblazing / exploring new territory. The birth of the internet being one of the last examples of this from that era.

        The past 10-15 years of software development have been the opposite. Just about everything was evolutionary, rarely revolutionary. Optimizing things for scale, improving libraries, or porting successful ideas from one domain to another. A lot of shifting around deck chairs on things that were fundamentally the same. Just about every new "advance" in front-end technology was this. Something hailed as ground breaking really took little exploration, mostly solution space optimization. There was almost always a clear path. Someone always had an answer on stack overflow - you never were "on your own". A generation+ grew up in that environment and it felt normal to them.

        LLMs came about and completely broke that. People who remembered when tech was new, had potential, and nobody knew how to use it loved that: here is a new alien technology, and I get to figure out what makes it tick, how it works, how to use it. On the flip side, people who were used to there being a happy path, or a manual to tell you when you were doing it wrong, got really frustrated at there being no direction - feeling perpetually lost, with things not working the way they wanted.

        I found it especially ironic being on hacker news how few people seemed to have a hacker mindset when it came to LLMs. So much was, "I tried something it didn't work so I gave up". Or "I just kept telling it to work and it didn't so I gave up". Explore, pretend you're in a sci-fi movie. Does it work better on Wednesdays? Does it work better if you stand on your head? Does it work differently if you speak pig-latin? Think side-ways. What behavior can you find about it that makes you go "hmm, that's interesting...".

        Now I think there has been a shift very recently of people getting more comfortable with the tech - but still was surprised of how little of a hacker mindset I saw on hacker news when it came to LLMs.

        LLMs have reset the playing field from well manicured lawn, to an unexplored wilderness. Figure out the new territory.

        • Terr_ 12 hours ago ago

          To me, the "hacker" distinction is not about novelty, but understanding.

          Bashing kludgy things together until they work was always part of the job, but that wasn't the motivational payoff. Even if the result was crappy, knowing why it was crappy and how it could've been better was key.

          LLMs promise an unremitting drudgery of the "mess around until it works" part, facing problems that don't really have a cause (except in a stochastic sense) and which can't be reliably fixed and prevented going forward.

          The social/managerial stuff that may emerge around "good enough" and velocity is a whole 'nother layer.

        • layer8 10 hours ago ago

          No, the negative feelings about LLMs are not because they are new territory, it’s because they lack the predictability and determinism that draw many people to computers. Case in point, you can’t really cleverly “hack” LLMs. It’s more a roll of the dice that you try to affect using hit-or-miss incantations.

          • bossyTeacher 6 hours ago ago

            >the negative feelings about LLMs are not because they are new territory, it’s because they lack the predictability and determinism that draw many people to computers

            Louder for those turned deaf by LLM hype. Vibe coders want to turn a field of applied math into dice casting.

        • hooverd 8 hours ago ago

          an unexplored wilderness that you pour casino chips into (unless you're doing local model stuff yea yea)

        • bossyTeacher 6 hours ago ago

          >I found it especially ironic being on hacker news how few people seemed to have a hacker mindset when it came to LLMs

          You keep using the word "LLMs" as if Opus 4.x came out in 2022. The first iterations of transformers were awful. Gpt-2 was more of a toy and Gpt-3 was an eyebrow-raising chatbot. It has taken years of innovations to reach the point of usable stuff without constant hallucinations. So don't fault devs for the flaws of early LLMs

  • athrowaway3z 3 hours ago ago

    I've been saying for a decade that one of the fundamental issues with SWE in the average company is that management does not seem to understand that SWE is a management-level job. It's not assembly-line work. It requires reorganizing, streamlining, stakeholders, etc. - in code and data - which directly affect people, much the same as any other management role. There are just fewer issues with HR and more with CDNs or CVEs.

    > A CEO of a low-code platform articulated this vision: in an “agentic” development environment, engineers become “composers,”

    I see we'll be twisting words around to keep avoiding the comparison.

  • megamix 4 hours ago ago

    The most important question is: who will get paid the most? I don't think the future of software engineering will be attractive if all you do is more work for the same or even less pay. A second danger is that too much reliance on AI tools will centralise knowledge, and THAT is the scariest thing. Software systems will need to perform for a long time, so having juniors on board and people who understand software architecture will be massively important. Or will all software crash when this generation retires?

    • falloutx 4 hours ago ago

      The people who don't lose their jobs won't be in a great spot either: there won't be a guarantee that they will never lose their jobs, they will continue to live on a wobbly and uncertain foundation, and they'll get fired for the first "no" they say to management. If software engineering falls, all the related industries will fall too, creating a domino effect that none of the execs can imagine right now.

      • menaerus 2 hours ago ago

        I really do wonder what sort of economic change is coming to us, because companies will hypothetically need to hire fewer people to sustain the same output as today. They can basically do that today - so it's not even hypothetical anymore - it just needs some time to take off.

        The question IMO is: who will be creating the demand on the other side for all of these goods produced if so many people are left without jobs? UBI, redistribution of wealth through taxes? I'm not so convinced about that ...

        • Ray20 an hour ago ago

          > The question IMO is, who will be creating the demand on the other side for all of these goods produced if so many people will be left without the jobs?

          There is no reason why people will be left without jobs. Ultimately, a "job" is simply a superstructure for satisfying people's needs. As long as people have needs and the ability to satisfy them, there will be jobs in the market. AI changes nothing in those aspects.

        • falloutx 2 hours ago ago

          UBI is just a pipe dream. The rich are clutching their pearls even harder.

      • hoss1474489 3 hours ago ago

        > there wont be a guarantee that they will never lose their jobs, they will continue to live on the wobbly and uncertain foundation

        The people who lose their jobs prove this was always the case. No job comes with a guarantee, even ones that say or imply they do. Folks who believe their job is guaranteed to be there tomorrow are deceiving themselves.

  • tigrezno 5 hours ago ago

    The next two years of software engineering will be the last two years of software engineering (probably).

    • amelius 4 hours ago ago

      I don't see the market flooded yet with software that was "so easy to build using LLMs".

      Last year was, as it seems, just a normal year in terms of global software output.

      • steve1977 3 hours ago ago

        If anything, looking at for example what Microsoft has been releasing, it's been a year below average (in terms of quality).

      • falloutx 3 hours ago ago

      You are not looking in the right places. GitHub repo counts have been high since 2020 because there are companies and individuals who run fork scripts, so AI can't match those numbers.

      But on Product Hunt, the numbers tell a different story: 5000+ projects in the first week of January alone, versus roughly 4000 in all of January 2018.

        • amrocha 25 minutes ago ago

          That doesn’t mean industry output is high, it means people are starting new products.

          Has the output of existing companies/products increased substantially?

          Have more products proven successful and started companies?

          Hard to say, but maybe a little.

      • cmpxchg8b 3 hours ago ago

        This is such a stupid argument. A very significant amount of code never makes it into the public sphere. None of the code I've written professionally in the last 26 years is publicly accessible, and if someone uses a product I've written they likely don't care if it was written with the aid of an LLM or not.

        Not to mention agent capabilities at the end of last year were vastly different to those at the start of the year.

        • amelius 2 hours ago ago

          Even if a portion of software is not released to the general public, you'd still expect an increase in the amount of software released to the general public.

          Even if LLMs became better during the year, you'd still expect an increase in releases.

    • kubb 5 hours ago ago

      Please don’t get my hopes up. Adaptable people like me will outcompete hard in the post-engineering world. Alas, I don’t believe it’s coming. The tech just doesn’t seem to have what it takes to do the job.

      • falloutx 3 hours ago ago

        Some related fields will be gone too. And the jobs which will remain will be impossible to get.

        • menaerus 2 hours ago ago

          > And the jobs which will remain will be impossible to get.

          Exactly my thoughts lately ... Even by yesterday's standards it was already very difficult to land a job, and by tomorrow's standards it appears as if only the very best of the best, and those in positions of decision-making power, will be able to keep their jobs.

  • burnermore 5 hours ago ago

    Something is very odd about the tone of this article. Is it mostly AI-written? There are a lot of references and info, but I'm feeling far more disconnected from it.

    For the record, I was genuinely trying to read it properly, but it became unbearable by mid-article.

    • nerdsniper 5 hours ago ago

      Yes, lots of AI style/writing in this article. I wouldn't necessarily discredit an article just based on stylization if the content was worth engaging with ... but like you mentioned, when the AI is given too much creative control it goes off the rails by the middle and turns into what the kids call "AI slop".

      It resembles an article, it has the right ingredients (words), but they aren't combined and cooked into any kind of recognizable food.

      • burnermore 5 hours ago ago

        Thanks a lot for taking the time to confirm. Not hating on AI slop or anything, but I do genuinely feel that if he/she/they had invested time in writing it, people would consume and enjoy it better.

        It's hard to put my finger on it, but it lacks soul, the "it" factor, or whatever you want to call it. It feels empty in a way.

        I mean, this is not the first AI-assisted article I'm reading, but usually it's to a negligible level. Maybe it's just me. :)

    • gofreddygo 3 hours ago ago

      100% has that AI slop smell.

      Intro... Problem... (The bottom line... What to do about it...) Looped over and over. And then, finally...

      I want to read it, but I can't get myself to.

  • Eong 12 hours ago ago

    Love the article. I struggled with my new identity and thus had to write https://edtw.in/high-agency-engineering/ for myself, but I also came to the realisation that the industry is shifting, especially for junior engineers.

    Curious about how the Specialist vs Generalist theme plays out, who is going to feel it more *first* when AI gets better over time?

  • mellosouls 15 hours ago ago

    On the junior developer question:

    A humble way for devs to look at this, is that in the new LLM era we are all juniors now.

    A new entrant with a good attitude, curiosity and interest in learning the traditional "meta" of coding (version control, specs, testing etc) and a cutting-edge, first-rate grasp of using LLMs to assist their craft (as recommended in the article) will likely be more useful in a couple of years than a "senior" dragging their heels or dismissing LLMs as hype.

    We aren't in coding Kansas anymore, junior and senior will not be so easily mapped to legacy development roles.

    • snovv_crash 4 hours ago ago

      Sorry, but no. Software engineering is too high-dimensional for there to be a rulebook the way there is for building a bridge. You need to develop taste, much like high-level Go players do. This is even more critical as LLMs start to spit out code at an ever higher rate, allowing entropy to accumulate much faster and letting unskilled people paint themselves into corners.

      I think of it a bit like ebike speed limits. Previously to go above 25mph on a 2-wheeled transport you needed a lot of time training on a bicycle, which gave you the skills, or you needed your motorcycle licence, which required you to pass a test. Now people can jump straight on a Surron and hare off at 40mph with no handling skills and no license. Of course this leads to more accidents.

      Not to say LLMs can't solve this eventually, RL approaches look very strong and maybe some kind of self-play can be introduced like AlphaZero. But we aren't there yet, that's for sure.

  • qsera 3 hours ago ago

    I would like to see how things will be when using AI requires half of a dev's current paycheck.

  • jillesvangurp 42 minutes ago ago

    Change is a constant for software engineers. It always has been. If your job is doing stuff that should be automated, either you are automating it or you are not a very good software engineer.

    A few key fallacies at play here.

    - Assuming a closed world: we'll do the same amount of work but with fewer people. This has never been true. As soon as you meaningfully drop the price of a unit of software (pick your favorite), demand goes up and we'll need more of them. It also opens the door to building software that previously would have been too expensive. That's why the number of software engineers has consistently increased over the years, despite a lot of stuff getting a lot easier over time.

    - Assuming the type of work always stays the same. This too has never been true. Stuff changes over time. New tools, new frameworks, new types of software, new jobs to do. And the old ones fade away. Being a software engineer is a life of learning. Very few of us get to do the same things for decades on end.

    - Assuming people know what to ask for. AIs do as you ask, which isn't necessarily what you want. The quality of what you get correlates very much with your ability to ask for it. The notion that you get a coherent bit of software in response to poorly articulated, incoherent prompts is about as realistic as getting a customer to produce coherent requirements. That never happened either. Converting customer wishes into maintainable/valuable software is still a bit of a dark art.

    The bottom line: many companies don't have a lot of in house software development capacity or competence. AI doesn't really help these companies to fix that in exactly the same way that Visual Basic didn't magically turn them into software driven companies either. They'll use third party companies to get the software they need because they lack the in house competence to even ask for the right things.

    Lowering the cost just means they'll raise the ambition level and ask for more/better software. The type of companies that will deliver that will be staffed with people working with AI tools to build this stuff for them. You might call these people software engineers. Demand for senior SEs will go through the roof because they deliver the best AI generated software because they know what good software looks like and what to ask for. That creates a lot of room for enterprising juniors to skill up and join the club because, as ever, there simply aren't enough seniors around. And thanks to AI, skilling up is easier than ever.

    The distinction between junior and senior was always fairly shallow. I know people in their twenties who were labeled senior barely out of college, maybe on their second or third job. It was always a bit of a vanity title, awarded early because of the high demand for SEs of any kind. AI changes nothing here. It just creates more opportunities for people to use tools to work their way up to senior level quicker. And of course there are lots of examples of smart young people who managed to code pretty significant things and create successful startups. If you are ambitious, now is a good time to be alive.

  • zqna 6 hours ago ago

    My question: will the people who were building crappy, brittle software, full of bugs and other suboptimal behavior (the main thing slowing the evolution of that software), now begin writing better software because of AI? Answering yes implies that the main cause of those problems was that those developers didn't have enough time to analyze them or to build protection harnesses. I would strongly argue that was not the case; the main cause is intellectual and personal in nature: an inability to build abstractions, to follow up on root causes (and thus acquire the necessary knowledge), or to avoid being distracted by some new toy. In 2-5 years I expect the industry to go into panic mode, as there will be a shortage of people who can maintain the drivel that is now being created en masse. The future is bright for those with the brains; they just need to wait this out.

  • streetcat1 9 hours ago ago

    For some reason this misses two important points:

    1) The AI code maintenance question: who will maintain the AI-generated code?

    2) The true cost of AI: once the VC/PE money runs out and companies charge the full cost, what happens to vibe coding?

    • haspok 2 hours ago ago

      Perhaps thinking about AI generated code in terms of machine code generated by a compiler helps. Who maintains the compiled program? Nobody. If you want to make changes to it, you recompile the source.

      In a similar fashion, AI generated code will be fed to another AI round and regenerated or refactored. What this also means is that in most cases nobody will care about producing code with high quality. Why bother, if the AI can refactor ("recompile") it in a few minutes?

    • NitpickLawyer 7 hours ago ago

      I think this post is a great example of a different point made in this thread. People confuse vibe-coding with llm-assisted coding all the time (no shade for you, OP). There is an implied bias that all LLM code is bad, unmaintainable, incomprehensible. That's not necessarily the case.

      1) Either you, the person owning the code, or you + LLms, or just the LLMs in the future. All of them can work. And they can work better with a bit of prep work.

      The latest models are very good at following instructions. So instead of "write a service that does X" you can use the tools to ask for specifics (e.g. write a modular service that uses concept A and concept B to do Y. It should use x y z tech stack. It should follow this ruleset, these conventions. Before testing, run these linters and these formatters. Fix every env error before testing. etc.).

      That's the main difference between vibe-coding and llm-assisted coding. You get to decide what you ask for, and you get to set the acceptance criteria. The key point that non-practitioners always miss is that once a capability becomes available in these models, you can layer it on top of previous capabilities and get a better end result. Higher instruction adherence -> better specs -> longer context -> better results -> better testing -> better overall loop.
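      To make the prep work concrete: many agent tools read a project-level instruction file. A minimal sketch of one is below; the file name, stack, and commands are hypothetical, so adapt them to your own tool and repo:

      ```markdown
      # Agent rules (example sketch)

      ## Stack and structure
      - TypeScript, Node 20; services live under `services/`, shared code under `packages/`.
      - New services must be modular: one module per domain concept, no cross-module imports.

      ## Conventions
      - Follow the repo lint and formatter configs; run `npm run lint` and `npm run format` before testing.
      - Fix every environment or type error before running the test suite.

      ## Acceptance criteria
      - Every new endpoint ships with unit tests and one integration test.
      - No task is done until `npm test` passes locally.
      ```

      The point is that you, not the model, encode the acceptance criteria; the file captures exactly the kind of specifics described above.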

      2) You are confusing the fact that some labs subsidise inference costs (in exchange for data, usage metrics, etc.) with the true cost of inference for a given model size. You can already get a good indication of what the cost is today for any given model size. Third-party inference shops exist today, and they are not subsidising the costs (they have no reason to). You can do the math as well and figure out an average cost per token for a given capability. And those open models are out; they're not going to change, and you can get the same capability tomorrow or in 10 years (and likely at lower cost, since hardware improves, the inference stack improves, etc.).
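      The math really is back-of-the-envelope. A minimal sketch, where the per-million-token prices are illustrative assumptions, not any provider's actual rates:

      ```python
      def task_cost(input_tokens: int, output_tokens: int,
                    price_in_per_m: float, price_out_per_m: float) -> float:
          """Dollar cost of one task at given per-million-token prices."""
          return (input_tokens * price_in_per_m +
                  output_tokens * price_out_per_m) / 1_000_000

      # Example: a coding task reading ~50k tokens of context and writing
      # ~5k tokens, at assumed rates of $0.50/M input and $2.00/M output.
      cost = task_cost(50_000, 5_000, 0.50, 2.00)
      print(f"${cost:.3f} per task")  # $0.035 per task
      ```

      Swap in whatever rates a third-party inference shop actually charges and you get a floor on the unsubsidised cost per task.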

  • xkcd1963 an hour ago ago

    Please, dear developers, be as lazy as possible and use LLMs. The amount of bugs that get shipped affords me a comfortable life in opsec.

  • bradleyjg 12 hours ago ago

    The bottom up and top down don’t seem to match.

    Where is all the new and improved software output we’d expect to see?

  • PraddyChippzz 13 hours ago ago

    The points mentioned in the article, regarding the things to focus on, are spot on.

  • mawadev 5 hours ago ago

    I mean, it's pretty simple: management will take bad quality (because they don't understand the field) over hiring and paying more employees any day. Software engineering positions will shrink and become unrecognizable: one person expected to do the work of multiple departments to stay employed. People may leave the field or won't bother learning it. When critical mass is reached, AI will be paywalled and rug-pulled. Then the field evens itself out again over a long, expensive period for every company that fell for it, lowering expectations back to reality.

    • falloutx 3 hours ago ago

      This is truly the problem: you either get fired or you get to work 10x more to survive. The only question is how many of us will be in the first group and how many in the second. It's a lose-lose situation.

      • mawadev 3 hours ago ago

        Exactly. Some jobs moved from database, backend, frontend, and devops to "fullstack", which means four jobs for the pay of one. People do that job, but with only 8-10 hours in a day the quality is as expected. I think overall people will try to move out of the field, no matter how much of a force multiplier AI might be. It's simply a worse trade to carry so much responsibility and burden when you can work in IT, or outside of it, in a less cognitively demanding field with set hours and expectations for the same pay (in the EU; a very hyperbolic statement, tbh). Especially when the profit you bring in dwarfs your compensation, with all the frustrations that come with knowing that and being kept down on the corporate ladder.

  • tommica 7 hours ago ago

    One thing that fucks with juniors is the expectation of paying for subscriptions to AI models. If you need to know how the AI tools work, you have to learn them with your own money.

    Not everyone can afford it, and so a field that was once proud of needing just a computer and internet access to teach yourself is turning into a subscription service.

    • falloutx 3 hours ago ago

      This is why opencode is giving unlimited free access to one or two models.

    • boulos 6 hours ago ago

      You can get by pretty well with the ~$20/month plans for either Claude or Gemini. You don't need to be doing the $200/month ones just to get a sense of how they work.

      • tommica 5 hours ago ago

        Again, not everyone can afford it, and it becomes a hurdle. Computers are acquirable, but an extra $20 a month might not be.

        And yes, that plan can get you started, but when I tested it, I managed to get 1 task done, before having to wait 4 hours.

    • ares623 6 hours ago ago

      If the AI gets so good then they shouldn’t need to pre-learn.

  • globular-toast 4 hours ago ago

    This article suggests it is specialists who are "at risk", but as much more of a generalist I was thinking the opposite and starting to regret not specialising more.

    My value so far in my career has been my very broad knowledge of basically the entire of computer science, IT, engineering, science, mathematics, and even beyond. Basically, I read a lot, at least 10x more than most people it seems. I was starting to wonder how relevant that now is, given that LLMs have read everything.

    But maybe I'm wrong about what my skill actually is. Everyone has had LLMs for years now and yet I still seem better at finding info, contextualising it and assimilating it than a lot of people. I'm now using LLMs too but so far I haven't seen anyone use an LLM to become like me.

    So I remain slightly confused about what exactly it is about me and people like me that makes us valuable.

    • cowl 2 hours ago ago

      LLMs have read EVERYTHING, yes. That includes a lot of suboptimal solutions, repeated mantras about past best practices that are no longer relevant, thousands of blog posts about how to draw an owl by drawing two circles and leaving the rest as an exercise to the reader, etc.

      The value of a good engineer is their current-context judgment, something LLMs cannot do well.

      Second point, something that is mentioned occasionally but not discussed seriously enough: the Dead Internet Theory is becoming a reality. The supply of good, professionally written training material is by now exhausted, and LLMs will start to feed on their own slop. See how little the LLMs' core competency increased in the last year, even with the big expansion in their parameter counts.

      Babysitting LLM's output will be the big thing in the next two years.

    • falloutx 3 hours ago ago

      I mean, there is no strat that saves you 100% from it. The layoffs are kind of random, based on teams they don't see any vision for, or engineers who don't perform. Generalising is better imo.

  • keybored 4 hours ago ago

    Is there a Jeopardy for guessing prompts? Give an executive summary of GenAI trends where GenAI is the destiny and everything reacts to it. Touch on all "problems". Don't be divisive by making hard proclamations. Summarize in a safe way by appealing to the trope of the enthusiastic programmer who dutifully adapts to the world around them in order to stay "up to date"; the passive drone that accepts whatever environment they are placed in and never tries to change it. But add insult to injury by paradoxically concluding that the only safe future is the one you (individual) "actively engineer".

    I’m not saying that this was prompted. I’m just summarizing it in my own way.

  • gassi 12 hours ago ago

    > Addy Osmani is a Software Engineer at Google working on Google Cloud and Gemini

    Ah, there it is.

    • falloutx 3 hours ago ago

      Ahhhh, this is like that guy who works on Claude Code and runs 100 agents at the same time to replace 100 juniors. Everyone is convinced he will be the last software engineer on earth.

  • ahmetomer 15 hours ago ago

    > Junior developers: Make yourself AI-proficient and versatile. Demonstrate that one junior plus AI can match a small team’s output. Use AI coding agents (Cursor/Antigravity/Claude Code/Gemini CLI) to build bigger features, but understand and explain every line if not most. Focus on skills AI can’t easily replace: communication, problem decomposition, domain knowledge. Look at adjacent roles (QA, DevRel, data analytics) as entry points. Build a portfolio, especially projects integrating AI APIs. Consider apprenticeships, internships, contracting, or open source. Don’t be “just another new grad who needs training”; be an immediately useful engineer who learns quickly.

    If I were starting out today, this is basically the only advice I would listen to. There will indeed be a vacuum in the next few years because of the drastic drop in junior hiring today.

    • falloutx 3 hours ago ago

      And you think juniors aren't doing this? At this point everyone in the market does more vibe coding than those who are not in the market. The market is saturated mostly because execs are cutting jobs, not because juniors aren't good.

    • ares623 6 hours ago ago

      What? That's written like "men writing women": not putting yourself in the shoes of a junior who has no context and almost no opportunities.

  • wakawaka28 14 hours ago ago

    The outlook on CS credentials is wrong. You'll never be worse off than someone without those credentials, all other things equal. Buried in this text is some assumption that the relatively studious people who get degrees are going to fall behind the non-degreed, because the ones who didn't go to school will out-study them. What is really going to happen generally is that the non-degreed will continue to not study, and they will lean on AI to avoid studying even the few things that they might have otherwise needed to study to squeak by in industry.

    • falloutx 3 hours ago ago

      The fundamentals of CS don't change and are more valuable to learn for the long term. Vibe coders think they can bypass everything because they can ask a machine to write them a todo list.

      • wakawaka28 3 hours ago ago

        I think you're right but it's more like the theory and other thinking skills are harder to pick up on your own than particular technologies. You definitely still ought to learn both theory and particular tech skills, as they are not interchangeable. A person who only knows pure CS is difficult to employ as an engineer because programming entails particular technological skills.

  • doug_durham 14 hours ago ago

    The author has a bizarre idea of what a computer science degree is about. Why would it teach cloud computing or dev ops? The idea is you learn those on your own.

    • Ekaros 3 hours ago ago

      I am not sure about devops, but cloud computing likely has a lot of science behind it when done properly. Cloud platforms are no less complex systems to reason about than code. And I mean understanding and designing cloud platforms, not deploying code to them.

    • happytoexplain 14 hours ago ago

      If that's "the idea", then clearly we need a more holistic, useful degree to replace CS as "the" software degree.

      • kibwen 13 hours ago ago

        Despite what completely uninformed people may think, the field "computer science" is not about software development. It's a branch of mathematics. If you want an education in software development, those are offered by trade schools.

        • AnimalMuppet 12 hours ago ago

          What I want is for universities to offer a degree in Software Engineering. That's a different field from Computer Science.

          You say that belongs in a trade school? I might agree, if you think trade schools and not universities should teach electrical engineering, mechanical engineering, and chemical engineering.

          But if chemical engineering belongs at a university, so does software engineering.

          • collingreen 12 hours ago ago

            Plenty of schools offer software engineering degrees alongside computer science, including mine ~20 years ago.

            The bigger problem when I was there was undergrads (me very much included) not understanding the difference at all when signing up.

          • xboxnolifes 10 hours ago ago

            Many do. Though, the one I'm familiar with is basically a CS-lite degree with software specific project design and management courses.

            Glad I did CS, since SE looked like it consisted of mostly group projects writing 40 pages of UML charts before implementing a CRUD app.

          • none2585 11 hours ago ago

            Saying this as a software engineer that has a degree in electrical engineering - software "engineering" is definitely not the same as other engineering disciplines and definitely belongs in a trade school.

            • menaerus an hour ago ago

              Right, because the guy sitting next to me designing a PCB for the next rPi revision is so much more of an engineer than the guy designing a distributed computing algorithm? It shows that you only dealt with the trivial parts of SE. There are very complex areas in both disciplines, and as easily as I can find trivial things in SE I can do the same for EE. Let's just not pretend it's science fiction when it's not.

              • none2585 an hour ago ago

                Developing a distributed computing algorithm I think would squarely fall into CS. Engineering is the application of stuff like that.

          • pkaye 11 hours ago ago

            My university had Electrical Engineering, Computer Engineering, Software Engineering and Computer Science degrees (in additional to all the other standard ones.)

          • mxkopy 12 hours ago ago

            Last I checked ASU does, and I’m certain many other universities do too.

      • throwaway7783 13 hours ago ago

        The degree is (should be) about CS fundamentals and not today's hotness. Maybe a "trades" diploma in CS could teach today's hotness.

      • wrs 13 hours ago ago

        Cloud computing is not some new fundamental area of computer science. It’s just virtual CPUs with networks and storage. My CS degree from 1987 is still working just fine in the cloud, because we learned about CPUs, virtualization, networks, and storage. They’re all a lot bigger and faster, with different APIs, but so what?

        Devops isn’t even a thing, it’s just a philosophy for doing ops. Ops is mostly state management, observability, and designing resilient systems, and we learned about those too in 1987. Admittedly there has been a lot of progress in distributed systems theory since then, but a CS degree is still where you’ll find it.

        School is typically the only time in your life that you’ll have the luxury of focusing on learning the fundamentals full time. After that, it’s a lot slower and has to be fit into the gaps.

      • wakawaka28 14 hours ago ago

        There has to be a balance of practical skills and theory in a useful degree, and most CS curricula are built that way. It should not be all about random hot tech because that always changes. You can easily learn tech from tutorials, because the tech is simple compared to theory. Theory is also important to be able to judge the merits of different technology and software designs.

    • tibbar 14 hours ago ago

      Why is this necessarily true?

      • sys_64738 14 hours ago ago

        A CS degree is there to teach you concepts and fundamentals that are the foundation of everything computing related. It doesn't generally chase after the latest fads.

        • tibbar 12 hours ago ago

          Sure, but we need to update our definitions of concepts/fundamentals. A lot of this stuff has its own established theory and has been a core primitive for software engineering for many years.

          For example, the primitives of cloud computing are largely explained by papers published by Amazon, Google, and others in the mid-to-late '00s (Dynamo, Bigtable, etc.). If you want to explore massively parallel computation or container orchestration, it is natural to do that using a public cloud, although of course many of the platform-specific details are incidental.

          Part of the story here is that the scale of computing has expanded enormously. The DB class I took in grad school was missing lots of interesting puzzle pieces around replication, consistency, storage formats, etc. There was a heavy focus on relational algebra and normal forms, which is just... far from a complete treatment of the necessary topics.

          We need to extend our curricula beyond the theory required to execute binaries on individual desktops.

          • zipy124 40 minutes ago ago

            This is mostly software engineering not computer science though. That is but a small sub-section of computer science.

          • michaelsalim 9 hours ago ago

            I do agree that the scale has expanded a lot. But this is true of any other field. Does that mean you need to learn everything? At some point it becomes unfeasible.

            See doctors for example, you learn a bit of everything. But then if you want to specialise, you choose one.