122 comments

  • ndiddy a day ago ago

    The other day I read this piece on how AI is already being used in schools, and it left quite an impression on me. https://archive.is/IW4B3

    > The Chromebooks, which the students use in every class and for homework, came pre-installed with an all-ages version of Gemini, a suite of A.I. tools. When my daughter, who is in sixth grade, begins writing an essay, she gets a prompt: “Help me write.” If she is starting work on a slide-show presentation, the prompt is “Help me visualize.” She shoos away these interruptions, but they persist: “Help me edit.” “Beautify this slide.” The image generator is there, if she’d ever wish to pull the plug on her imagination. The Gemini chatbot is there, if she ever wants to talk to no one.

    I'm not as anti-AI as the author of the piece, and I think that AI could have a role as a teaching aid. It's infinitely patient and it's able to adapt to a student's needs better than a textbook. Still, I hate the idea of students being encouraged to entirely offload their cognitive work onto an online service rather than think for themselves. The point of making fifth graders write essays, make art, design presentations, etc. isn't the end product, it's that they now have the experience of having done the assignment. I would rather see students get taught how to think creatively, analyze a piece of writing, coherently explain an opinion, or draw a picture on their own, instead of giving this up in exchange for the nebulous skill of being "AI native" (aka being able to ask a computer to produce work for you).

    • NewsaHackO a day ago ago

      Yeah, I cannot imagine how anyone could learn anything well with access to AI. I am grateful that I finished my schooling before AI hit mainstream, because it is just too easy to turn your brain off and just AI a question before thinking about it. Great for getting things done, useless for learning. I guess hallucinations still keep us on our toes.

      • fasterik a day ago ago

        "Useless for learning" is just wrong. I've found LLMs immensely useful for directing my learning projects. Of course, a lot of the actual learning must come from doing things and puzzling through them myself. But I now find LLMs to be indispensable in finding out what I need to learn to accomplish a task, finding keywords to search on Wikipedia or in textbooks, and answering questions when I'm confused about something.

        • NewsaHackO a day ago ago

          Part of the difference in your case is the motivation for learning. Many of us in grade school had a motivation to get good grades/pass a class outside of the pursuit of knowledge. Even for those of us that really liked to learn, it was usually directed at a certain subject matter and not everything that we would need to be successful as adults (I loved math, but would never willingly write an essay if I could get away with it). Because grade school kids are "forced" to learn things they do not want to, they always look for the easiest way to get through the material, and AI provides a way to do this.

          • fasterik a day ago ago

            I agree with your general point, but if people are going to use AI regardless, the question is whether we should teach young people how to use it effectively. If they don't learn this, they're more likely to use it in a way that hampers their development.

            Now, I don't know at what level that should begin. Probably somewhere around the high school level, when they're learning to do research projects and synthesize information from multiple sources, is when teaching AI literacy will be most important.

            • ryanobjc a day ago ago

              What value to a person does teaching "how to use it effectively" deliver?

              How does that benefit their development, learning, society as a whole?

              Before you start in with "it'll help them get a job", full stop - education as a public good isn't strictly vocational technician work. It's not a work training for companies.

              • fasterik 21 hours ago ago

                For the same reason that we should teach people how to use a library, or a search engine, or an academic database. The tools for information retrieval are constantly evolving, and in a democratic society it's important that people learn how to educate themselves on a continuous basis throughout their lives. If you use AI properly, you can learn things that you wouldn't have had the time or skillset to learn otherwise.

        • cmiles74 a day ago ago

          It's worth remembering that this isn't that. What the poster describes is constant pushing from Chrome OS, designed to train dependence on the tools and to encourage students to essentially check out of the education process. In my opinion this is definitely useless for learning.

      • basch 6 hours ago ago

        Using AI in an intentional way with purpose and direction should be great for practicing thinking.

        The right way to teach children to use AI is to teach them to scope, filter, design process, edit, refine. How to ask a question, how to think through steps, how to use language to describe all these things. Each kid has something that can think and respond as fast as they input.

        The goal should be to perfect sequence and iteration, not skip to final output.

        These skills also should NOT be framed in some kind of "teaching AI" as much as teaching communication and critical thinking and analysis. It is the exact same skills you need to solve problems and interact with humans.

      • jazzcomputer 18 hours ago ago

        I'm an adult with a fairly balanced view of AI and I find it difficult to learn coding without occasionally using AI to help me navigate to the most relevant bits of MDN or help me check if my thinking on an approach is correct (it's all entry level stuff so should be well represented in the training data).

        I find it easy to get into a long chat with an LLM about some project I'd like to try and what's involved with it. I find it easy to get into a chat with an LLM about a lot of things as a kind of unproductive excursion that my brain tells me at the time is 'useful'. I'm of average will, so I dread to think how this will work out with children who get to 'partner up with AI to assist them' or whatever marketing speak is used to obfuscate their goals. Then combine that with social developmental issues or below average focus.

        It's bleak because the more entangled they get with the system the more they'll seek to push back regulations.

      • Reubend 15 hours ago ago

        > Yeah, I cannot imagine how anyone could learn anything well with access to AI.

        You must not have much of an imagination then. Or maybe you're just being overdramatic? AI is arguably the best way to learn most subjects now. Frontier models have made a lot of progress on reducing hallucinations, and AI can teach you at whatever pace you're capable of learning it. There are very few topics it can't teach, and it can go into more depth than you'll find in any textbook.

        • dingaling 15 hours ago ago

          "and it can go into more depth than you'll find in any textbook"

          How does it manage that, when it only knows as much as is written in textbooks?

      • esafak a day ago ago

        I would not say that. My child asks the AI factual questions the same way she would ask an adult. That's one kind of learning. There are others, of course.

        • NewsaHackO a day ago ago

          When I say AI, I obviously don't mean using AI like people used to use search engines. Of course asking it factual questions like it is an encyclopedia is okay.

    • Joeri a day ago ago

      It’s such a lazy way of integrating AI as well, as if they asked AI to do it.

      Why has no one tackled the Young Lady’s Illustrated Primer? We know what AI-enhanced education should be, and we finally have the tech to build it.

      • sollewitt 17 hours ago ago

        The missing bit is a representation of knowledge, and a way to represent a learner’s comprehension.

        Even if you shortcut by synthesizing a textbook in every major topic - that’s just one arbitrary representation, and the way topics overlap is outside of the material.

        I am very interested in this though, if anyone has references to relevant research I’m all ears.

      • tadfisher a day ago ago

        "You're absolutely right, Nell. I shouldn't have confused ethylene glycol with propylene glycol. Would you like to know more about funeral services?"

    • techjamie a day ago ago

      > If she is starting work on a slide-show presentation, the prompt is “Help me visualize.” She shoos away these interruptions, but they persist: “Help me edit.” “Beautify this slide.”

      To be fair, making slideshows sucks and I've never met anyone that actually enjoys the experience. I'm sure some people out there enjoy it, but anything that gets me out of PowerPoint faster is a win in my book.

      • neilv a day ago ago

        If you care about the information and communication, and you think you can do a good job of the slide deck if you think through it for this venue and audience -- and maybe even have new insights by going through the process -- then it can be enjoyable/rewarding.

        But I've also seen situations in which the presenter doesn't care, or the slides are just a backdrop for some better communication/selling/maneuvering they're doing, or they know the information is bogus or the presentation pointless, or they know the audience doesn't care, or for everyone it's just a meeting to be able to say you had a meeting.

        I'd guess that at least half the current use of LLMs is for "cheating on your homework" tasks, in which the person prompting it simply doesn't care -- whether it's for schoolwork, professional work, or socializing.

      • Izkata 17 hours ago ago

        There was one I did have fun making:

        Back around 2007 it was about AI, and for part of it I'd memorized like 2-3 minutes of what I was going to say, along with careful timing. The plan was that I'd start wandering around, including around the back of my laptop so I wasn't looking at the projector or laptop screen - and during that time, robot characters would wander onto the screen and start running around. The idea was I wouldn't look back until right after they hid themselves. I think I'd even put a spot in the middle where I could glance back while they were hidden, then they'd come back out.

        I don't remember why, but I never got to present it.

      • dubya a day ago ago

        David Byrne seems to like it: https://newsarchive.berkeley.edu/news/media/releases/2005/03...

        I haven't seen his actual slide deck anywhere online though.

      • alphabeta3r56 a day ago ago

        There are two parts to making a slide show: 1. visualizing what you want to show, and 2. finding the tools to show it.

        Developing 1 means you need to start with pen and paper, without any pollution from existing tools. You might read and experiment beforehand and use as many tools as possible (including AI), but at least once before you make the final version you should sit down and just think from scratch about what you want to show.

        Then you can use AI for 2. Teaching kids 1 is very difficult while simultaneously giving them access to AI since they are too young to develop that self control.

      • Semaphor a day ago ago

        Back when I created them (high school), I enjoyed it, because it was about making an appealing presentation about the data we researched.

    • ekjhgkejhgk a day ago ago

      > I'm not as anti-AI as the author of the piece,

      It's not about being anti-AI, it's about being anti-distractions in education.

      These companies don't want to raise "AI literacy", they want to get to their future users young.

    • butlike a day ago ago

      It's the same thing as shooing away Clippy, right? I don't know, I'm a little out of the loop. I do feel like there's some societal backlash to technology that's cascading down to the younger generations now that the negativity of social media is bearing fruit. Right?

    • red_admiral a day ago ago

      We had chromebooks in schools before AI - or iPads, depending on the area. We're about to repeat that disaster.

      • gwerbin a day ago ago

        It's never been about making education better. It's about building and legally mandating a money hose directly from your town property tax levy to accounts receivable at Google and Microsoft and OpenAI.

    • nektro a day ago ago

      dystopian :( i hope schools put more pressure on keeping that off their devices. or switch to neos.

    • iusadfkjasdf 17 hours ago ago

      [dead]

    • biophysboy a day ago ago

      [dead]

  • samagragune a day ago ago

    The conflict of interest is pretty obvious. OpenAI, Google, and Microsoft are backing a bill that funds teaching kids to use... OpenAI, Google, and Microsoft products. "AI literacy" as defined in the bill is literally "the ability to use artificial intelligence effectively." That's not literacy, that's onboarding lol. Real digital literacy teaches how systems work, who profits from them, and how to think critically about them. This bill will in practice hand curriculum design to the same vendors who endorsed it. Teaching kids to prompt ChatGPT is not the same as teaching them to understand what ChatGPT is. Nobody funding this wants the latter.

    • devanshranjan a day ago ago

      we'd have raised a generation of users, not builders. That's exactly what's about to happen with AI if this passes as written.

      • andrekandre a day ago ago

        well, tv got that ball rolling a few generations ago; people learn by doing, not by watching something else do it for them. now ai will not just tell them or show them, but do it for them as well, it would seem...

        • duzer65657 a day ago ago

          this is what apple products did as well. They turned something that took some effort and desire (i.e. using a computer) into streamlined entertainment. Watch someone use their phone; it's passive, one-way consumption.

          • andrekandre 17 hours ago ago

            apple is an especially sad case in some respects, as steve used to call the mac "a bicycle for the mind" and NeXT was originally a computer/os for higher ed... but with products like the iphone, locking it down and keeping things passive is the best way to monetize it (sadly)

        • rexpop 18 hours ago ago

          [flagged]

    • gosub100 a day ago ago

      wouldn't it be fun to hear the lobbyists' reactions to "Great idea, we'll run an instance of DeepSeek locally in the school's computer lab, trained with data from each individual school for a personalized experience. Thanks for the money, guys!"

  • schnitzelstoat a day ago ago

    It reminds me of the 'IT Literacy' classes we had when I was in high school where they just taught us to use Microsoft Office products.

    • Kadecgos a day ago ago

      A lot of those were definitely sponsored by MS and co. as well, but at least you did learn a practical, transferable, adaptable skill. You'll come out of that with experience using the features and structures of a general-purpose OS, as well as the workflow of mode-based production software (in some cases). Excel at least is also just such a powerful 'everything' tool that I'm not even that mad about it.

      'AI Literacy' is just very much not that at all and is just state-mandated brain rot.

      • HeWhoLurksLate a day ago ago

        I started learning how to make PowerPoint presentations and present them in kindergarten, and I'm incredibly thankful for that. More broadly, building a slide deck is a critical part of public speaking and presenting, and helps kids out a lot.

        In third grade I got taught how to type properly and hit 60-70 WPM, which is roughly where I still type to this day when doing tasks that require thinking instead of just doing a pre-compiled speed benchmark.

        Kids really need to learn the fundamentals of things, but on the other hand some of the same arguments came out when calculators were going mainstream and classes just evolved to take the new tools into account. I think eventually we'll see the same thing happen with AI, but I'm not sure what that will look like for every case yet. Probably more paper and pencil work tbh

        • strange_quark a day ago ago

          I hate the calculator argument. Kids still need to learn how to do basic arithmetic by hand. There's a reason that CAS calculators are banned on standardized tests. Even in college, I had classes where profs would force us to do complex calculus by hand even though Mathematica could spit out the answer. Understanding things from first principles is important, and probably even more so with AI!

      • maniiijiii a day ago ago

        [dead]

    • linguae a day ago ago

      We had the same requirement at my high school in Sacramento back in the early 2000s. I was given the option to test out of it, since I already knew how to use Office, which I had been using at home since fifth grade for reports and presentations. I had to study harder for Excel and Access, since most high school students don’t need sophisticated spreadsheets or databases, but I passed the exam on my first attempt.

      A far better computer literacy course was the one I took at Sacramento City College as a dual-enrollment student in summer 2004, which was the prerequisite to programming courses. Even though I already knew how to program in QBASIC, Visual Basic 6 and C++, I still had to take this course. Anyway, we learned very basic computer architecture (the roles of the CPU, memory, storage, buses, etc.), the role of the operating system and the difference between it and applications, computer networking, the Web (with an introduction to HTML and CSS), the history of computing, and a brief introduction to programming, with exercises in C++ and even Scheme (the professor showed us his copy of SICP and threatened students who talked during his lectures with Scheme homework assignments).

      It was a fun class. The professor knew I was a Linux fan, but I had a hard time downloading a distro at home due to my having dial-up. He gave me some FreeBSD install CDs. I became a fan of FreeBSD since, and exploring FreeBSD led me down a rabbit hole where I devoured the history of Unix and BSD. By the time I graduated from high school, I wanted to be a systems software researcher like Ken Thompson and Dennis Ritchie. This shaped my early career; I’ll never forget meeting Marshall Kirk McKusick my senior year of college at USENIX FAST 2009.

      Turned out that computer literacy course I was required to take at Sacramento City College despite having computer literacy had far-reaching impacts in my life.

    • arjie a day ago ago

      One of the bright lights of that class was knowing how to bring up the "Flight Sim" easter egg in Excel.

    • gensym a day ago ago

      Any sort of "X Literacy" raises red flags for me. Actual _literacy_ - as in, being able to critically read and comprehend stuff of sufficient complexity - is basically a superpower that makes learning all these other skills possible, and it seems to be in terribly short supply.

    • ThatMedicIsASpy a day ago ago

      Logo, MS Office, Counter-Strike 1.0-1.6, PHP, War§ow, Quake, ..

      01010101 0123456789ABCDEF AND OR XOR, ..

    • red_admiral a day ago ago

      It sounds like you actually learned something in your class, though?

    • jcgrillo a day ago ago

      Mavis Beacon Teaches Typing

    • lloydatkinson a day ago ago

      I had the same experience in the UK around 2005 to 2011, I wonder if it's the same everywhere?

      I feel that my experience was far worse and bordering on the absurd and bureaucratic. We spent years following instructions, taking screenshots of us opening specific windows and dialogs in Office etc, saving all these screenshots into a Word document, and then printing the document.

      To be clear, it was every single action you took. Moved the mouse to "Insert"? Don't click it yet, take a screenshot of your mouse on the "Insert" button, and then click it, and take a screenshot of the menu that opened. Then, take more screenshots of moving your mouse to buttons and lists in dialogs that opened. Then, take a screenshot of the document with the thing you just inserted.

      Now, write several paragraphs in detail about what you just did. Print everything, and that includes both the document you just created for the exercise and the document writing about the document-creating exercise, with all its dozens of screenshots.

      Each individual printed piece of paper needed to be kept in a plastic wallet, which was then kept in document folder. In the end we had multiple of these document folders that were without a doubt a complete waste of paper and time.

      The argument was that it was needed in case the exam board decided it needed to double-check the teachers' scores, which I think never happened even once. There was never a reason given for why each individual piece of paper needed to be put in a plastic wallet.

      This was during a period when CS education at schools had essentially vanished from the curriculum for decades; it was added back after I'd finished school.

      Words cannot describe how much I despised the entire ordeal. There simply are not enough words to describe the total absurdity of an IT class requiring screenshots of clicking buttons and being printed onto paper.

      While the teacher was trying to explain how to add PowerPoint transitions, I was writing scripts that would fetch currency conversions and graph them, because I was that bored. One time I wrote a terrible "chat" system on some free shared HTML/PHP hosting, using meta-tag-based auto-refreshing of the chat history, so a few class friends could talk across the room.

      • rogual 16 hours ago ago

        Fellow former British schoolkid here. One part that really sticks in my memory about "IT" class was when they were preparing us for an exam that asked "which of these are functions of an image editor" and we had to memorize that, I think "fill tool" was, "pen tool" wasn't, "adjust brightness" was, and so on, without reason or reference to reality. There was just a list and you had to know it.

        I imagine these people were delighted when a Big Computer Company offered to step in and design a curriculum for them.

      • schnitzelstoat 12 hours ago ago

        Yeah, my experience was from the UK between like 2002 and 2007.

        Speaking with my younger cousins it seems nowadays they have the opportunity to learn actual programming and so on.

        We just got Doom (the 1995 one) and Street Fighter 2 to work via LAN and played that during the class, one person would do the actual work each lesson so we still had something to hand in.

        Getting the LAN to work wasn't easy so I suppose it taught me that!

      • MagicMoonlight a day ago ago

        And yet that generation knows how to use computers, and the current generation doesn’t

  • fantasizr a day ago ago

    This is a step beyond drug dealers giving you the first sample for free. It's an attempt at legally mandated injection sites.

    • mghackerlady a day ago ago

      Not the first time this has happened. There was a big push for schools to teach Windows and Microsoft Office while conveniently ignoring that other things exist. Nowadays some have moved to the Google office suite, which isn't that much better.

      • fantasizr a day ago ago

        the textbook companies give the hard sell too but it's more honorable with traditional palm greasing and what not

    • a day ago ago
      [deleted]
  • marricks a day ago ago

    > Young people increasingly hate AI[1], and children already struggle with AI-enabled harassment that traumatizes them and disrupts their learning. And studies show kids are offloading learning onto AI models, undermining their education and social development.

    [1] https://www.theverge.com/ai-artificial-intelligence/920401/g...

    The coyote is already running beyond the cliff so indoctrinating kids won't save them from an AI winter 6-18 months away.

    • bigyabai a day ago ago

      I swear that I read this same "6-18 months" timeframe 3 years ago.

      • marricks a day ago ago

        Honestly, yeah, fair point. There's enough money in tech to keep the party going for who knows how long.

        The housing market was unsustainable for a long time and went even further up before it crashed. The smart people who called that have lost a lot of money making reasonable bets against things which should have crashed.

        The difference now is rich people have positively stupid amounts of money to keep stupid things going for even stupidly longer periods of time.

      • a day ago ago
        [deleted]
      • gosub100 a day ago ago

        that's true, but if calling a crash were that easy, it would have been priced into the market already. Regardless of when the crash occurs, I think we can all agree that the current form of AI is financially unsustainable. How much would you tolerate if you paid its true cost? $300/mo, $500/mo, $1k? For intern-level slop.

  • moolcool a day ago ago

    This is way too close to the Simpsons joke about the periodic tables provided by Oscar Mayer

    https://www.youtube.com/watch?v=pohXWbMrXZI

  • fyrn_ 17 hours ago ago

    I'd back a bill to ban AI from schools in many contexts; just like with phones, I think it's pretty obvious what the result will be.

    I guess, like with phones, we'll all have to pretend it's not obvious for ten years until we have overwhelming scientific evidence, then wait another ten for US policymakers to start talking about forming a committee to design a study to develop a plan.

  • rebolek a day ago ago

    Of course they will back it. Nice source of income.

    • excrementfan a day ago ago

      [flagged]

      • kerkeslager a day ago ago

        So what? AI companies are buying laws, and a lot of people thinking that's true doesn't make it less true.

  • saidinesh5 a day ago ago

    Putting all the cynicism aside... it's amazing how much the way we deal with information has changed within our lifetimes.

    When I was younger, to solve a problem, we had to memorize a large amount of information. Or know someone who does. Or visit libraries and pray they have a book on what you need.

    Then came the internet. All of that memorizing was replaced by web searches. You just focus on solving the problem, figuring out what you don't know and searching for that.

    Now, it feels like we're automating the searching, connecting the dots and most of the problem solving. We focus on the high level problem description, verification of the results.

    I wonder what they'd be adding to this curriculum.

  • tsoukase a day ago ago

    AI, in the form of LLMs, should be used as an augmenting tool and not as a substituting one. The human must conceive the idea, design the solution and fill most of the gaps. The AI will only refine, improve and suggest options upon an already existing base. As a parent I promote such a use to my kids, rather than ban AI which is futile and might become dangerous in the future.

  • 17 hours ago ago
    [deleted]
  • wtetzner a day ago ago

    If AI worked as advertised then "AI Literacy" would just be "Literacy".

  • techteach00 a day ago ago

    As a teacher, if permitted to teach about and have students use chat bots, I think I'd focus on prompting first.

    The best results I read about on here using LLMs have to do with prompt mastery, I think.

  • kmeisthax a day ago ago

    If by "AI literacy" they mean "learning how AI works and how to use it effectively", then this probably would wind up backfiring. Because when you improve people's AI literacy, they use it less. They don't swear off it, but because they know what it is and is not good for, they are way more cautious in their application of AI.

    Of course, they probably plan to do to education what iPads did to education: deskill children. Apple successfully obliterated the concept of a file from a generation of students by making them do their computing in a straitjacket. I can only imagine how an AI-first or AI-only educational curriculum could make kids even worse at using computers.

    • techjamie a day ago ago

      > They don't swear off it, but because they know what it is and is not good for, they are way more cautious in their application of AI.

      Like the time I was given a swollen tablet at work to dispose of and had to play phone tag to get an answer on what to do with it or how dangerous it was. And my coworker asked "if [I] tried asking AI?" I said I am not relying on ChatGPT for something that might explode; I'll wait for the person who's paid to tell me about this thing that might explode.

  • slopinthebag a day ago ago

    Gotta get em hooked while they're young.

  • HomeDeLaPot a day ago ago

    Maybe a more general focus on getting students to practice critical thinking and fact-checking would be better. AI could be addressed as a small part of that, since chatbots are everywhere and students need to know how to filter out their BS.

    But are NSF grants really necessary for this? To what degree is this funneling taxpayer money to buy ChatGPT subscriptions and advertise to students by getting them to use AI in the classroom?

  • wiseowise a day ago ago

    Got to train serfs early!

  • claysmithr a day ago ago

    Proof the people running things are stupid I guess.

  • PunchyHamster a day ago ago

    So far AI is funding illiteracy in schools

  • red_admiral a day ago ago

    I remember "media literacy", "digital literacy" and "smartphone literacy". Why is no-one pointing out the obvious?

  • classified 10 hours ago ago

    You bet they do. Everything that normalizes AI use generates profit for them.

  • caconym_ a day ago ago

    I'm going to do everything in my power to keep this dog shit technology away from my daughter for as long as I possibly can. I can imagine implementations that I might be willing to consider for educational applications, but given these companies' demonstrated and profound lack of restraint in cramming AI into literally everything they sell, I almost can't believe anyone who doesn't work for them is suggesting that we should expose children to their products in the same context where they are supposed to be learning.

    Almost.

    • iugtmkbdfil834 a day ago ago

      I can already see a fight on the horizon, and it won't be easy. My kid is in a Catholic school and they do try to instill some good habits, but I am seeing some teachers just kind of checking out. Previously, it was: watch a movie in class. Now it will be: play with AI. At that point, I might have to withdraw her from the class. If she is going to use it anyway, I might as well have a say in it.

  • nathan_compton a day ago ago

    This is the reason I recently ran for my kids' school board. I use AI every day and I think there is a lot of utility there, but I don't want it anywhere near my kids' school. Honestly, I don't think kids need to even lay eyes on a screen until they are in high school.

  • nalekberov a day ago ago

    What is ‘AI Literacy’? How to prepare a prompt for maximum token efficiency?

    • wiseowise a day ago ago

      Where to buy the subscription, how to convince parents to buy Pro instead of Plus, prevent original thought as early as possible, so they stay addicted - sorry I meant empowered - asap.

  • cavino a day ago

    The thing about AI is it'll teach you how to use it (aka 'AI literacy').

  • stuaxo 11 hours ago

    FFS.

    This makes me glad that the crappy Chromebooks given to kids at my daughter's school aren't used.

  • rexpop 17 hours ago

    Cool, Taylorism 2.0

  • lofaszvanitt 18 hours ago

    The US really wants to ruin itself and conjure a cyberpunk-like reality... the elite there seemingly went south.

  • Devasta a day ago

    The Chromebook has already been an unmitigated disaster for computer literacy, this will only make it worse.

  • jmclnx a day ago

    What a big waste of $. For example, how did the 'coding' schools go? AI literacy will go the same way.

    How about funding something useful, like real literacy, as in reading books? That will help kids out far better than "AI literacy".

  • sublinear a day ago

    It will be interesting to see the backlash to this one.

  • Darwins_Toffees a day ago

    Imagine telling parents that the new teacher they hired to teach their children just makes shit up like 30% of the time.

  • whateveracct 16 hours ago

    stop it

  • righthand a day ago

    I thought AI was so easy to use that no one would have to be trained? Are they going to teach the kids to steal copyrighted data? And write AI slop articles? And to evangelize useless side projects as time savings?

    • frangonf a day ago

      Kids don't need to be trained in AI but the models do need to be trained by kids.

    • noobermin a day ago

      The drug dealers get to get them hooked young.

      • spwa4 a day ago

        Come on, AI can work both ways. It's easy to use AI to greatly increase your knowledge of a subject. It's also easy to use AI to prevent yourself from having to learn anything.

        Both kinds of students will exist.

        • kerkeslager a day ago

          > It's easy to use AI to greatly increase your knowledge of a subject.

          It's actually not.

          It's easy to get an AI to say a lot about a subject, but that doesn't mean anything the AI said was true. There's a significant risk that the AI has simply hallucinated the information, and now you "know" a bunch of false ideas about the subject, which is worse than not knowing anything about it.

          • veber-alex a day ago

            Right, because without AI everything you read on the internet is 100% true and correct.

            Learn how to use AI properly just like any tool and you can benefit.

            • contagiousflow a day ago

              Can you explain the differences between using AI "properly" and "improperly" for learning?

              • veber-alex a day ago

                Double-check what the AI tells you. Apply common sense instead of blindly trusting everything. If it's something technical in nature, try to verify and test it.

                I treat AI as any other information I see online, with the added value that it's customized exactly to my needs, and it works pretty well for me.

        • xienze a day ago

          > Both kinds of students will exist.

          Yeah, and I'm betting there are gonna be a whole lot more "press the button to have all your work done for you" students than "work hard" students. FFS, even before all this there has been an alarming number of students attending college who have to take remedial classes.

    • zamadatix a day ago

      It's K-12, so I'm honestly not going to take that dunk, as satisfying as it'd be; plenty of things which seem blazingly obvious/intuitive to adults are a complete mystery to kids for whom being able to read to learn (instead of the other way around) is a recent development.

      Unfortunately, the AI literacy big tech companies want to push won't align very well with the AI literacy kids need. It'd be like ad literacy for K-12 being pushed by Google - obviously what's delivered would not match what the kids actually needed.

    • SpicyLemonZest a day ago

      If you're curious about these questions, you'll be happy to review the links from the source article, which include statements from two Senators and the head of the largest US teachers' union about what they hope kids will learn.

      • righthand a day ago

        Will the kids who miss the important parts of training miss out on being able to use AI effectively? It should be easy enough for them to use without training…

        • SpicyLemonZest a day ago

          Why do you presume that it should be easy enough for them to use without training? Keyboards are a pretty simple technology, and serve as a subset of the primary interface to most modern AI models, but training is still required to use them well. A user who's never learned proper keyboard skills will type much more slowly and with much more frustration than you or I can, and that will have meaningful impacts on their ability to perform tasks requiring a keyboard.

          It's just a kind of training that's receded into the background as "normal", and that many of us who enjoy recreationally typing out comments on the Internet self-taught.

          • righthand a day ago

            I didn't learn writing, speaking, or research skills from typing out comments on the internet. I was required to use handwritten note cards up until I graduated high school (heck, we even had blue-book tests in college). The first paper I ever wrote was handwritten. When we did start using computers, none of those skills were altered by passive internet chatting.

            So AI training is going to be a basic communication course? Because AI is sold as being easy to use without training and as modeled after existing human social constructs, hence artificial intelligence.

  • hsuduebc2 a day ago

    Of course they do, when it must be taught on their products, which will hook users over time and make them some money.

  • rvz a day ago

    Why think for yourself when you have ChatGPT, Claude and Gemini to do all the thinking for you?

    The owners of these systems do not even use the technologies that they are creating.

    The deskilling programme will continue until morale improves.

  • pessimizer a day ago

    This is entirely backwards. AI should be used as a tool to tutor kids. Kids shouldn't be learning about AI. I thought the point of AI was that people didn't have to know anything to talk to it. Not to cheat at writing exercises.

    Writing exercises that children produce in school are immediately thrown into the trash after being graded and reviewed. The product is supposed to be better educated children, not better written papers.

  • charcircuit a day ago

    The entirety of school should eventually be replaced with just this one class. AI is able to teach people anything they may want or need to know, and it can design effective ways for people to study. Being able to use, interpret, and work together with AI is going to be one of the most important skills of the 21st century.

    • Darwins_Toffees a day ago

      You know why most kids don't do this already? Because they don't know what they don't know. Telling a 2nd grader to go learn anything they want is not going to have the result you apparently think it will.

      • charcircuit a day ago

        But the AI does. It can create plans for a target career. It can evaluate English skills. It can provide suggestions based on the person's interests.

    • wiseowise a day ago

      > Being able to use, interpret, and work together with AI is going to be one of the most important skills of the 21st century.

      But I thought the models are so good we don’t need humans anymore?

    • armitron a day ago

      This level of naivety is characteristic of certain SV types for whom wishful thinking is the order of the day. We're already living through the disastrous effects of the "social media" revolution, and this is going to be much more of the same, with even worse negative effects on society.

      Just imagine what this will do to critical thinking, interpersonal relationships and family dynamics in a country where illiteracy is rapidly climbing. I don't think it's a stretch to write that if the unrestrained capitulation to big tech, whatever the societal costs, continues, we're setting ourselves up for generational and class-based conflict that will rip our country to pieces.

    • AnimalMuppet a day ago

      Maybe so. Still, learning how to tell when the AI is blowing smoke is going to be an important skill, and I'm not sure that AIs are going to be great at teaching that to you.

      And learning when other people (AI salespeople, say) are blowing smoke is also an important skill. Again, I'm not sure that AIs are great at teaching that.

  • arjie a day ago

    There is a class of such thing that could be useful. I will likely be teaching my children this literacy myself. Obviously the interstitial pop-ups don't work, and the next generation will not be coming to this technology from the point of view of watching it develop. They will see it as having always existed and while they may be appropriately sceptical, I suspect they will be far more trusting of it. So some degree of understanding the mechanics will probably allow them to learn to treat this technology appropriately.

    After all, it's nigh magical stuff. A machine that talks to you in common language and is almost always right. If you weren't already prepared for it, you would trust it implicitly. When Wikipedia first came onto the scene, people behaved this way there too. They would believe it was entirely correct. But at some point there was a concerted effort in pedagogy to say things like "You can't cite a Wikipedia article" and that one simply-remembered rule allowed for children to be forced to treat it as an aggregator.

    Naturally, setting up a fund for this is nearly always a bad structure. Earmarked funds have a bad habit of being written primarily as vehicles to transfer money to pet constituencies. Teachers unions and so on are always advocating for these because that's what funds the complex ecosystem of teacher educators, certification and curriculum development programs, and so on. This is just social welfare by another means. Funds should be used flexibly to meet some outcome. Earmarked funds also have a habit of ratcheting up: when there is no need for programs, they continue to exist, and bleed money from the actual work product of education - informed students.

    I get why these articles are always written in this style, but I really would appreciate some better news media. Students hate a lot of things; their opinion is mostly moot as to whether a subject is a good thing to learn or not. And all this polemic style of "shoehorn" and so on is completely unnecessary, and just makes me treat this whole thing as the equivalent of some partisan Twitter post.

    But the one thing I did appreciate is a link to the text of the bill.