13 comments

  • richk449 2 hours ago

    A colleague asked me for advice on being a manager. Amongst other advice, I suggested he read High Output Management.

    He came back a few days later and said that ChatGPT knew what was in the book - was it okay if he just read the summary?

    I wasn’t sure what to say. It’s probably true that ChatGPT can summarize all the main points from the book. But it has always been easy to find the key points from books online. The hard part of being a manager is figuring out how to take the obvious instructions and act on them consistently.

    Maybe some people can do that just by reading the summary. For me, though, reading the whole book is important. I find myself thinking back to the examples used to illustrate the points. And I find that seeing ideas repeated in different ways as I read the book helps make them part of my mental framework. I read lots of interesting ideas in quick articles, but they rarely stick with me unless there is a specific translation to action.

    I ended up telling my colleague that it was up to him to decide how he learns best. If it were me, I’d need the book. But he needs to know his own learning system.

  • atrettel 4 hours ago

    Generative AI is just the new "bottom" in terms of quality. All that you have to do to compete against it is to be a little better than it. The question to me really is whether the quality of this new "bottom" is adequate. Sometimes it is for some people and for some applications, and sometimes it is not.

    I do not use it myself because I am a researcher and I often ask questions that don't have a lot of "training data" yet. And even if an area is well covered in terms of "training data", often there is a lot of "know how" that really isn't written down in an easily digestible form. It is passed verbally or through examples in person. So the idea that the "training data" is complete is also not true in general.

    Many other people in this thread have already covered that books are much more structured and organized than any answer generative AI gives you. Let me discuss another reason why books still matter. Books can give you a wider view than the "consensus" that something like ChatGPT gives you. I know a lot of books in my field that derive results in different ways, and I often find value in these different approaches. Moreover, suppose that only one book answers the question that you want answered but others gloss over that subject. Generative AI likely will not know precisely what one random book said on the subject, but if you were searching through multiple books on the subject yourself, you likely would pick up on this difference.

    Relevant Paul Graham quote [1]:

    > We can't all use AI. Someone has to generate the training data.

    [1] https://x.com/paulg/status/1635672262903750662

  • codingdave 11 hours ago

    LLMs do not generate new content; they just shuffle old content together in new ways. So no, it does not kill an industry of people creating new original content. And authors only need to worry about it if they were not adding anything new to the world to begin with, and were instead relying on marketing to sell re-packaged existing content.

    • runjake 6 hours ago

      As we progress into our inevitable AI future, I have to wonder whether good source materials (like books) will more or less die out and AI-generated content will be shuffled so much that it’s nonsensical and useless, thereby kicking off a new cycle of human-generated output.

      I never left RSS, but social media like TikTok and X have me wondering whether I’m ever reading human output or just consuming and interacting with AI systems.

      I recently had a very red-pill dystopian experience where I figured out that someone I interacted with on X was an AI. It slipped up with a response I recognized as a common LLM idiom, and further inquiries confirmed it. It turns out that a lot of X in particular is AI bots.

      • galfarragem 5 hours ago

        I suspect 80% of my X followers are AI bots.

  • al_borland 11 hours ago

    Books exist for people who want in depth information with the full context, in an organized manner.

    Short forms have always been available, be it blog posts, Wikipedia articles, Cliffs Notes, or other such things. Books survive because source material is needed to generate all of those other things, and those short-form versions don’t cut it for everyone. I don’t see LLMs as any different.

    A book can tell you something you didn’t know. With an LLM you need to know enough to ask.

    • benrutter 11 hours ago

      Yeah I completely agree with this! I really like books for learning because they do exactly this.

      Take the Rust book: you get a neat and organised collection of the majority of Rust features you're likely to use.

      An LLM might be handy in answering questions like "Why is the borrow checker failing this code?" but that's a really different proposition to getting a detailed and complete summary of Rust's key features. It could maybe output something along these lines, but I think the output would be considerably less usable and reliable than a book.
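      To make the borrow-checker question concrete, here is a minimal (hypothetical) sketch of the kind of thing people paste into an LLM: borrowing an element of a `Vec` and then pushing to it won't compile, because the push may reallocate the buffer and invalidate the borrow. Copying the value out instead satisfies the checker:

      ```rust
      fn main() {
          let mut scores = vec![10, 20, 30];

          // Copy the value out: i32 is Copy, so no borrow is held.
          // `let first = &scores[0];` would instead keep an immutable
          // borrow alive and make the `push` below a compile error.
          let first = scores[0];

          scores.push(40); // mutable borrow is fine now

          println!("first = {}, len = {}", first, scores.len());
      }
      ```

      The book walks through *why* the rule exists (aliasing plus mutation), which is exactly the context a one-off LLM answer tends to skip.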

      • skydhash 5 hours ago

        The moment you gain some expertise in a subject, LLMs fail horribly. You'll have a mental model of the thing you're working with, and an LLM won't be able to solve what you can't, since that often requires a deep dive into the internals. That's when you want a complete reference/manual nearby. As for boilerplate, most experts have project templates or can extract one from an old project.

  • galfarragem 5 hours ago

    I would rephrase it as: AI is shrinking the market for average human "creatives". Unless you are an outlier, adapt or perish.

  • latexr 11 hours ago

    Authors compete by being competent, doing research, and outputting factual information. Or just, you know, being original. In a world where LLMs can’t even differentiate between a recipe and an old Reddit joke and tell you to put glue on pizza, it is absurd to think they “killed the book industry”.

    What’s with this bloody obsession of killing other products and industries? Every time someone farts in tech, everyone starts shouting that it just killed something else. Calm down. Relax a little bit and get some perspective. You’re drowning yourself in the Kool-Aid.

    LLMs did not kill the book industry, just like Bitcoin did not kill the world’s financial system.

  • Mehticulous 11 hours ago

    The smell of paper, ink and binding glue.

    The feel of quality paper.

    The way the spine cracks when you first open a book.

    The way the spine creases after you've read a book a few times.

  • nonrandomstring 11 hours ago

    I think you might be confusing different activities that look similar. Search, research, exploratory reading, browsing, fact checking, cross referencing, debunking, genealogy, making etymological and epistemological connections are all different things. As an author and researcher I produce and consume a lot more types of connections and paths than a simple neural net that can make fast associations on past training material can offer. YMMV.