• krisevol@lemmus.org · 2 months ago

      That’s not what the data says. These kids are going to outpace traditional learning kids by miles.

      • Really? Because the data I’ve seen says the exact opposite and that Gen Z is the first generation of people dumber than the generation before them. These kids are already fucked and AI is going to make it even worse.

        • krisevol@lemmus.org · 2 months ago

          That generation is fucked, yes. There is no fixing that with AI. This is for young Gen A or Gen Beta. Gen Z is already too old for this to be useful for them.

        • Pinto, the Bean@lemmy.world · 2 months ago

          I’ve seen says the exact opposite and that Gen Z is the first generation of people dumber than the generation before them.

          Do you have a citation for this?

            • Pinto, the Bean@lemmy.world · 2 months ago

              https://en.wikipedia.org/wiki/New_York_Post

              The New York Post (NY Post), founded as the New York Evening Post (originally New-York Evening Post), is an American conservative[3] daily tabloid newspaper published in New York City.

              The Post has been criticized since the beginning of Murdoch’s ownership for sensationalism, blatant advocacy, and conservative bias. In 1980, the Columbia Journalism Review stated that the “New York Post is no longer merely a journalistic problem. It is a social problem—a force for evil.”[9] The Post has been accused of contorting its news coverage to suit Murdoch’s business needs, in particular avoiding subjects which could be unflattering to the government of the People’s Republic of China, where Murdoch has invested heavily in satellite television.[63]

              On October 14, 2020, three weeks before the 2020 United States presidential election, the Post published a front-page story purporting to reveal “smoking gun” emails recovered from a laptop abandoned by Hunter Biden at a computer repair store in Wilmington, Delaware.[105] The only sources named in the story were Trump personal attorney Rudy Giuliani and strategy advisor Steve Bannon.[105] The story came under heavy criticism from other news sources and anonymous reporters at the Post itself for “flimsy” reporting, including questions about the reliability of its sourcing and the lack of outreach to either Hunter Biden or the Joe Biden campaign for pre-publication comment.[106][107]

              Right-wing tabloid. You fell for Republican propaganda.

      • thebestaquaman@lemmy.world · 2 months ago

        The only research I’ve seen on using LLMs in a school setting found that the kids who were given access to an LLM performed a bit better on exercises than those without. At the same time, their reported learning experience was a lot better. But when they finally got a test assignment, the kids who had been using LLMs during exercises flopped and performed significantly worse than those who hadn’t.

          • zkfcfbzr@lemmy.world · 2 months ago

            • “AI should serve as a scaffold for cognitive construction rather than a substitute.”
            • “…the teacher’s role is shifting from knowledge transmission to instructional design and behavioral facilitation… Teachers must develop digital literacy and data fluency while acting as safeguards against over‑automation, ensuring that human judgment and educational values mediate AI adoption.”
            • “…while AI offers efficiency and feedback advantages, traditional teaching remains essential for tasks requiring cultural interpretation, discourse depth, and emotional connection. A blended model—AI for repetitive or procedural tasks and teachers for critical discourse—appears most effective.”

            This study explicitly does not advocate for replacing teachers with AI, and repeatedly cautions against doing so.

              • zkfcfbzr@lemmy.world · 2 months ago (edited)

                Ironically… so did I 🙃 But I hand-verified everything it said, and adjusted the quotes.

            • krisevol@lemmus.org · 2 months ago

              And the school that is opening will still have human “guides” so I’m curious how it will work out. I agree it should be a mix of AI and human, and not fully AI.

          • Nurse_Robot@lemmy.world · 2 months ago

            These findings highlight both the promise and the limitations of AI in language education, underscoring the importance of teacher facilitation and thoughtful design of human–AI interaction to support deep and sustainable learning.

            The problem is there are no teachers in this scenario, at least that’s my understanding.

            • krisevol@lemmus.org · 2 months ago

              You’re right, they will have “guides” instead of teachers. This might be too far, but we won’t know until they try it. A mix of human and AI teachers would probably be best.

      • CTDummy@aussie.zone · 2 months ago

        AI hasn’t even been around long enough for any meaningful data to be collected, surely. Also, post this “data” you’ve now twice claimed exists.

          • CTDummy@aussie.zone · 2 months ago

            This is for college students (aka students educated enough to learn on their own already), reads like a promotion for AI, has a limited sample size, and does not translate to school kids at all. From the study itself:

            Finally, the study’s limitations include its single-institution sample, short duration, and reliance on proxy behavioral indicators. Ethical concerns around informed consent, data privacy, and AI dependency also warrant closer attention. Future research should pursue longer-term and cross-institutional designs, employ multimodal behavioral measures, and develop governance frameworks that align technical gains with equity, autonomy, and critical capacity.

            This “”study”” seems to spend more time opining on AI learning frameworks than actually measuring scores on standardised testing and only dedicates a minimal amount of the paper to the results. It also states in paper that higher achieving college students saw less benefits (poorer performing student, AI can bump your grades enough to be noticeable for a unit/pass an exam).

            Did you read this study, or just google something in order to provide a study? This study does not support the claim that these kids will “outpace traditional learning kids by miles”.

            • krisevol@lemmus.org · 2 months ago

              No, the end part was my own opinion. I do believe classrooms that embrace AI will outperform traditional learning classrooms by a mile.

              Yes, the study is limited; AI learning is very new. Want me to pull out a study from 20 years ago with decades of proven data?

                • CTDummy@aussie.zone · 2 months ago (edited)

                You said the data says otherwise which you then used to support that opinion. The data doesn’t say otherwise.

                Want me to pull out a study from 20 years ago with decades of proven data?

                Almost like that was in my original comment, which you then replied to with a study as if it were compelling, so spare me the sassy comment. Don’t claim the data says otherwise when it doesn’t if you don’t want to be called out on it.

            • Deebster@infosec.pub · 2 months ago

              It’s also for learning English, which is something a large language model is probably the most suitable for. It’s not going to be much use teaching music or drama.

        • TheTechnician27@lemmy.world · 2 months ago (edited)

          Why are you hounding them for the data? They would swear on their honor that Grok said it, and that’s somehow not enough for you. They even asked a follow-up “Are you sure?”, to which Grok reaffirmed its findings. Maybe you should be practicing law if you want to act like you care so much about “evidence”.

    • UnderpantsWeevil@lemmy.world · 2 months ago (edited)

      Traditionally, these pilot programs operate as a marketing program rather than an educational program.

      You’re going to see a class of students enter the system with enormous supplemental aid and resources. The AI will be included but largely incidental. The students will be cherry-picked for media optics, rather than randomly selected from within the school district. Tons of paid professionals will write long-winded hagiographies about the affordability and effectiveness of the program. Some Ivy League University or Fortune 500 business will make a big show of admitting the most charismatic and saleable student graduates.

      Then the program will be rolled out to the rest of the country as quickly and sloppily as possible. AI will be jammed down people’s throats. You’ll get an earful about stupid idiot parents hysterically complaining about their dumb baby children, because they’re afraid of The Terminator movies. This will be book-ended with Steven Pinker and Bill Gates calmly explaining how AI turns dummies into geniuses. A string of movies and TV shows will be released about kids getting AI education and becoming too smart (and time traveling or getting magic powers or some other silly bullshit).

      The YIMBY coalition of very informed TV nerds will be assembled to scream at anyone who doesn’t like AI. If you don’t like AI you’re Ableist or a Bigot or Not Serious About Education. Meanwhile, we’ll get an earful about how certain migrants and POCs are incapable of learning from AI because of their inferior genes. School districts will be told to either adopt AI or lose their funding / get taken over by the state / federal agencies. National media will be saturated with “AI is normal” media content until people stop resisting.

      And all of this will culminate in more school privatization, more public education defunding, and more militant policing of young people. Because that’s always been the real end goal.

              • Akasazh@lemmy.world · 2 months ago

                It was a kerny joke about how in most fonts the r and n together read as an m, which is about kerning again.

                • fartographer@lemmy.world · 2 months ago

                  Oops, I remembered them backwards. I thought kerning was vertical spacing and leading was horizontal. You’d think I’d remember that from the typography class that I failed.

                  Turns out that I was too incorrect to appreciate your joke. I appreciate it now, but can’t bring myself to laugh at it since it’s now saturated in my embarrassment. But your pun got a good chuckle out of me!

    • altkey (he\him)@lemmy.dbzer0.com · 2 months ago

      I’d be pretty surprised if ambassadors of AI technologies sent their own kids there; usually you hear the opposite about the bosses of tech giants. And the concern the author of the article had, that her neurodivergent daughter could benefit from this more than from a traditional learning environment, leaves me even less inspired: the failing of the latter is generally the lack of teachers and of 1-on-1 personal work with a student, while this “personalized” approach removes teachers even further.

  • UnspecificGravity@piefed.social · 2 months ago

    I love that we act like creating a whole school’s worth of idiots who don’t know how to think is a small enough risk that it’s worth just yoloing on something like this.

  • pelespirit@sh.itjust.works · 2 months ago

    Bean bag chairs, soundproof booths and portable whiteboards make Alpha Schools look more like a tech startup than a school. But its biggest break from tradition is not the design: It’s an education model that uses artificial intelligence to teach core subjects while adults in the room serve as “guides,” not teachers.

    • Zephorah@discuss.online · 2 months ago

      More importantly, the kids are taught how Google, Palantir, Meta, and Microsoft want them to be taught.

      • Obinice@lemmy.world · 2 months ago

        One day people will look back on your comment and be impressed at how you predicted the future four great nations of the world.

    • T156@lemmy.world · 2 months ago

      This feels like it’s trying to skirt unions/regulations. The “teachers” aren’t actually teachers; they’re just guides to make sure the students and AI stay on track.

      • pelespirit@sh.itjust.works · 2 months ago

        I hadn’t thought of that; it always comes down to money. They’ll get the worst part of teaching: security and “guiding” the misbehaving kids.

  • chosensilence@pawb.social · 2 months ago

    i need this fucking bubble to pop. Disney cut their deal with OpenAI; they were planning on investing $1B into OpenAI lol. let’s go, pop already. this is insanity, a fucking AI school? our children are already suffering and struggling in school; this would make everything 10x worse.