60 comments

  • raesene9 18 hours ago

    I really don't understand why CEOs like this who are AI boosters try to justify the idea that this will create more developers, when that seems exceedingly unlikely, even if they are correct in their predictions of productivity increases.

    The quote from the article ("if you 10x a single developer, then 10 developers can do 100x.") implies that companies have 100x the things to be developed productively. What if that isn't the case?

    What if companies actually have about as many systems as they need? If you really can 10x your existing developers, that would point to a cut, not an increase.

    Of course, the follow-on from that for suppliers with per-seat licensing is that they'll need to find some other way to monetize if there are fewer seats to be sold. I guess they could start charging for AI agents as "developer seats"....

    • ben_w 18 hours ago

      This line is the difference between your thinking and people like them:

      > What if companies actually have about as many systems as they need

      They think ~"companies don't have as many systems as they need".

      My opinion? Even without GenAI, our sector has too many cooks.

      I've not valued most of the stuff that was added to any consumer facing software product or service in the last decade or so. Facebook has, if anything, regressed since 2007. YouTube (itself, no comment on the content) peaked sometime around 2016. SwiftUI (2019) still isn't an improvement over UIKit. Google Maps is very marginally worse than it was in 2018. MacOS has improved so slowly since it was named after cats (2012) that it's hard to even care about the aggregated changes since then (even if I ignore all the bugs*), and similarly for Google Translate since they bought and integrated Word Lens in 2014.

      Entirely new things (like ChatGPT or Stable Diffusion) are sometimes genuinely interesting and a novel departure from what came before, but that's very rare.

      * https://benwheatley.github.io/blog/2025/06/19-15.56.44.html

      • queenkjuul 17 hours ago

        macOS has very much changed for the worse since those days imo. System preferences is a cluttered mess, they've made it nigh impossible to run unsigned binaries, they've been changing CLI tools without documenting them which has broken some of my build scripts.

        Though we got dark mode, so, maybe actually it is better

        • ben_w 17 hours ago

          > Though we got dark mode, so, maybe actually it is better

          One of the bugs I didn't list in that blog post was that dark mode sometimes only partially activates — I can get some (not even all!) dark mode icons on an otherwise light mode display.

    • silverlake 18 hours ago

      There is a mountain of code that needs to be written but can't be, due to cost. A project that needs 100 developers may be prohibitively expensive. But 10 10x developers would be within budget. Think health care, manufacturing, government, finance, etc.

      • ath3nd 16 hours ago

        There is also a mountain of code that shouldn't have been written, and increasingly it's AI-generated.

    • WithinReason 18 hours ago

      > what if that isn't the case?

      Why wouldn't it be? I believe programming is very much susceptible to the Jevons paradox. I'm coding things I never would have started working on a year ago, because they either would have taken too long or would have been simply impossible without an LLM (like an MCP server).

      • raesene9 18 hours ago

        I'm not saying more code won't be produced. I'm saying companies might only want, and make money off, a certain number of systems with a given amount of code, and therefore they'll need to employ fewer developers to create that code, if the productivity enhancements are borne out.

        For example, say you're the CEO of one of the many, many companies that view development costs as an overhead. Your CTO comes to you and says "hey, we've just 10x'd the productivity of all our developers". Do you think that conversation leads to the CEO saying "great, hire more developers"?

        • sundaeofshock 18 hours ago

          To make it even more explicit: if the company has 100 developers, the CEO’s response will be “Great! Lay off 90 developers.”

          • ath3nd 16 hours ago

            Great, more work for us as consultants and security researchers when your vibe-coded system leaks its API keys into a public S3 bucket.

            • raesene9 16 hours ago

              (speaking at least for myself) I'm not suggesting this is a good outcome, more that having seen the way many companies handle development and IT, it's quite a likely one.

              FWIW I'd agree that (for now at least) it's challenging to get LLM/AI created code to avoid security vulnerabilities without careful prompting and review. Whether that's a fixable problem or whether it'll just generate mountains of new CVEs, bug bounties and ransomware attacks, remains to be seen.

    • ryandvm 14 hours ago

      > implies that companies have 100x the things to be developed productively, what if that isn't the case?

      Good point. I think that's demonstrably already a problem. Software products are notoriously prone to enshittification in part because successful companies build up vast development teams that have to somehow justify their existence with continuous feature churn.

  • agos 19 hours ago

    I see a worrying trend of CEOs using openly hostile language towards employees, even in those companies who should be a bit better. I wonder if they have an endgame and what it is. Voluntary resignation?

    • JohnFen 17 hours ago

      Yes, the mask is off and the contempt that these CEOs have for people is no longer hidden.

    • dude250711 18 hours ago

      Cognitive dissonance: mates at CEO retreats say it's the next industrial revolution, people on the ground say it's just another power tool in the box.

    • ath3nd 16 hours ago

      There is a great article by Tom Renner about the "inevitability" of LLMs.

      https://tomrenner.com/posts/llm-inevitabilism/

      Of course, phrased in adversarial language, these self-proclaimed gurus and "leaders" are basically saying: "We know everything and you know very little, get on board or get left behind", all to boost their sales. Many of these promised futures didn't work out:

      - Elon's self driving cars

      - Elon's Boring Tunnel

      - NFTs (mega flop)

      - Web3/blockchain (it happened but wasn't the revolution and future they were proclaiming)

      These are just salespeople, they have no idea what they are talking about, they just want to create a need for their product and sell it to you.

      Would you listen to Linus Torvalds (https://www.theregister.com/2024/10/29/linus_torvalds_ai_hyp...) or to some guy you've never heard of so far? Same for the creator of curl, a tool you use every day: https://www.theregister.com/2025/07/15/curl_creator_mulls_ni.... Nah, this guy is talking BS, like many before him.

    • surgical_fire 18 hours ago

      Nothing new. People who climb the corporate ladder to C-level are all sociopaths. They always have nothing but contempt for the peasantry they unfortunately have to employ to make their millions in compensation.

      It's only trendy now to say those things publicly, without the PR and media-training filter.

  • runemadsen 18 hours ago

    I just spent 25 minutes trying to install a command-line tool called citus_dump that Cursor told me could help port data between Postgres Citus clusters, only to find out that the tool doesn't exist and that the language model just made it up. AI-assisted coding is definitely enticing, but we have a long way to go until you can trust AI-generated code in production environments.

  • bitmasher9 19 hours ago

    I’d like to hear this from a developer that went through this path, instead of someone selling AI.

    • JohnFen 17 hours ago

      Not from a single developer, but from numerous devs chosen as a statistically representative sample.

      • bitmasher9 10 hours ago

        I actually want a seasoned developer that has 10xed with AI to give a conference talk about how they use AI with a live coding demonstration. If the productivity utopia is a hard path few travel then there might be a delay until the majority of devs get there. I want one or two people to show me the way.

        Right now I think it might just be a 1.5x boost, and most of the people claiming it’s more are just shovel sellers overstating the quality of nearby mines.

  • herbst 19 hours ago

    Anyone remember when GitHub still stood for development culture and community, in a way?

  • GOTO95 18 hours ago

    I simply got off of GitHub; no need to get out of my career. Why would I use a service whose CEO publicly threatens me?

  • uptownJimmy 19 hours ago

    Surely GitHub can (and should) find someone less unhinged to run the company.

  • frou_dh 18 hours ago

    I know some operations that have not even embraced source control yet and are still running, so I'm skeptical that being a laggard will be so devastating.

  • goalieca 18 hours ago

    I’ve been around long enough to see that 10x developers aren’t writing 10x code.

    • JohnFen 17 hours ago

      I've been around long enough to notice that there is no such thing as a "10x developer". Not in any numbers greater than a rounding error, anyway.

  • jaredcwhite 2 hours ago

    > Either you embrace AI, or get out of this career.

    OK. I quit. Bye!

    • jaredcwhite 2 hours ago

      How it started: platform starts up to help people share the code they wrote with each other.

      How it's going: platform wants to stop people from sharing the code they wrote with each other.

  • soraminazuki 17 hours ago

    > Dohmke argues computer science education is obsolete. "Students will rely on AI to write increasingly large portions of code. Teaching in a way that evaluates rote syntax or memorization of APIs is becoming obsolete."

    I'm not taking advice from a person who thinks computer science is about, uh, memorizing syntax and APIs.

    It's also hilarious how he downplays fundamental flaws of LLMs as something AI zealots, the truly smart people, can overcome by producing so much AI slop that they turn from skeptics into ...drumroll... AI strategists. lol

  • alex_suzuki 19 hours ago

    Domain name checks out.

  • vouaobrasil 18 hours ago

    I already got out of the tech career. But not because I didn't embrace AI – even though I never will. It's exactly because of people like Thomas Dohmke. What an irritating and contemptible individual.

    • assword 18 hours ago

      > What an irritating and contemptible individual.

      That’s just a majority of people in the industry. If there’s any reason to leave, that’s it.

      • vouaobrasil 18 hours ago

        True, but the degree to which they are so is proportional to the power and wealth they have.

  • lowsong 19 hours ago

    A blog post published by a company that's built its business on AI (Final Round AI), extrapolating from a comment by a CEO who's also in the business of AI (GitHub) and actively wants to sell you more expensive AI options, tells me I must use AI. It would be hard to be more biased if you tried.

  • benterix 18 hours ago

    What is funny is that these poor Github employees wouldn't have to go through this shit[0] if they hadn't been bought by Microsoft.

    [0] https://news.ycombinator.com/item?id=44050152

  • assword 18 hours ago

    I’d love to; unfortunately there are no other fields for me to run off to that are going to pay remotely close to middle-class money.

    If I could find something that even pays half of what I made as a developer, I’d consider it.

  • Trasmatta 18 hours ago

    AI is more capable of replacing a shitty CEO than a good senior developer

  • jstummbillig 18 hours ago

    Well, in terms of it being a career that seems obviously true (idk about the timeline).

    The developers resisting AI have a hard time understanding that "artisanal" is not a real label for software developers. We are not glass blowers. We are not artists. We solve problems, and there are infinitely many sufficiently good ways to solve a problem. Having opinions about how is only helpful insofar as it helps solve the problem. It has no other purpose to the people paying for the product. If your opinion makes the good-enough product more expensive, then you make your work less desirable.

    As long as people are competing in the labor marketplace, that is just not a winning tactic.

    • benterix 18 hours ago

      I would agree with you if they were significantly better; right now these tools can just drive you insane.[0]

      [0] https://news.ycombinator.com/item?id=44050152

    • davydm 18 hours ago

      All well and good if the AI agents actually solve the problem (not just delete your database or skip your tests) and do so at a lower overall cost (from the business perspective) - which also seems unlikely considering the studies showing:

      1. Open-source devs who worked on 246 (iirc) problems with AI assistance all projected that they would save around 20% of their estimated time, but all went over by about the same margin. Salaries cost money; people are paid per _time unit_, not per deliverable (though if we were, I think there'd be a lot fewer AI-first devs out there - see the sunk time cost above)

      2. Studies of open-source code monitoring error rates in AI-assisted repos showed a much higher defect rate than for those "artisanal" coders you disdain so much - this is likely a corollary to (1), where devs have to debug AI-generated code that they could have written faster themselves

      And the whole argument forgets about the upskilling issue that will become more prevalent in the future: if we're replacing low-skill workers with AI agents, even assuming that those agents can do a better job (and when you actually _measure_ it, instead of just _thinking_ it, it turns out they don't), we end up with a massive skill gap, as no one upskills once they're replaced.

      It's good for me - I'll probably still be able to get work when I'm 80 (anyone remember y2k?) - but it's bad for the industry as a whole, and definitely bad for any entry-level coders who were going to become great, but couldn't, because they couldn't learn on the job they didn't have.

      As for competing in the labor market - we'll see how that goes, as more and more companies are starting to realise that the "savings" promised to them by AI code agents not only don't materialise, but are often actually _losses_. Again, see (1) and (2) above. I don't have links, but they should be relatively easy to find - I found them all right here.

      "ai" in it's current incarnation (glorified token predictors) will never surplant skilled devs, simply because it cannot understand anything. It can't understand the domain, it can't understand the users, it can't understand the code. It has no concept of understanding. All it knows is "sequence a,b,c likely is followed by d", just on a larger scale. If it can do a fully-functional, unbuggy bit of code for you, it's simply because it's seen that exact code before, and in that case, it's robbed you of learning anything because the journey to finding that code, or the docs that spawned it, is part of the learning process. Much like we always scolded devs who copy-pasta'd from Stack Overflow, we should hold these agents to the same standard.

      Please, please, at least if you're going to use these tools, _don't trust them_. Scrutinise everything. Personally, having to double check a confidently wrong junior dev all day (one which won't learn from its mistakes either) sounds like a step down from actually creating stuff.

  • noodletheworld 18 hours ago

    > You know what else we noticed in the interviews? Developers rarely mentioned “time saved” as the core benefit of working in this new way with agents.

    > They were all about increasing ambition. We believe that means that we should update how we talk about (and measure) success when using these tools

    I’m struggling to understand what this means.

    • alex_suzuki 18 hours ago

      I read it as "Don't concern yourself if you're not seeing any benefits on the bottom line (time saved), you dumb peon. You should be reaching for the stars and single-handedly slopping out the next Photoshop with the insane power we are giving you."

    • rsynnott 18 hours ago

      "increasing ambition" is a _far_ vaguer metric than "time saved", and thus a more useful metric to boosters if "time saved" isn't really working out for them.

  • sghiassy 17 hours ago

    Yeah Linus! You better let AI write all the kernel code now… because nothing could ever go wrong with that

    I use arch Linux btw

  • zamalek 18 hours ago

    I wish his own developers would take him up on the advice, but the market out there is tough, and jobs where CEOs'/CTOs' eyes aren't rolling into the back of their heads from AI hype are rare.

  • mr90210 19 hours ago

    > Our latest field study with *22 developers* who…

    This is the person they have running GitHub?

    (Yes, Ad Hominem)

    • dude250711 19 hours ago

      How much vision do you expect from a salaried non-founder outsider like this?

      He will just follow Satya's movements blindly.

      • mr90210 17 hours ago

        Good point. I expect a person in such a position to know when not to say anything.

        A world with fewer developers is likely to harm the business he runs. Regardless, his tweet adds no value to anyone, including him.

  • tjpnz 18 hours ago

    Would their CEO be prepared to step into a car knowing that all of its safety-critical systems were vibe-coded?

  • davydm 18 hours ago

    another muppet has entered the arena

    no, you don't have to "get out of this career", but it's probably prudent to get out of these "AI-first" companies before the company (or at least the department) implodes and you don't have a job anyway

  • jmclnx 18 hours ago

    I have advice for this CEO too, but I like not being flagged :)

    Thanks to what Microsoft did to GitHub, I moved to GitLab. So I doubt I will ever take advice from a CEO, never mind this person. FWIW, most CEOs come out of marketing, so that alone tells you how much he really knows about development.

  • 1oooqooq 19 hours ago

    says guy with NVDA stocks...

  • tempfile 18 hours ago

    > if you 10x a single developer, then 10 developers can do 100x

    I think he might be drunk

    • NoGravitas 16 hours ago

      And not at the Ballmer Peak, either.

  • giraffe_lady 19 hours ago

    No.

  • ath3nd 18 hours ago

    Nah dawg, I'll pass. /s

    I think it's reasonable to be sceptical of a guy who:

    - sells you a product (Copilot) that depends on him selling you the story that LLMs are the future

    - has a strong vested interest in Microsoft's bet on LLMs succeeding

    On his own platform, developers I look up to far more have gone the other direction and disabled AI PRs and/or bug bounties on their repos:

    - Daniel Stenberg (curl) https://www.theregister.com/2025/07/15/curl_creator_mulls_ni...

    - Linus (linux) https://www.theregister.com/2024/10/29/linus_torvalds_ai_hyp...

    - or socket.dev's quest to prevent copilot generated PRs (https://socket.dev/blog/oss-maintainers-demand-ability-to-bl...) this might have struck a nerve

    So who should I believe? A guy whose name I'd never even heard (apparently the GitHub CEO's name is Thomas Dohmke, I just learned), or a bunch of legends whose tools I have been using since I was a child?

    No to slop! Down with slop.