Does AI actually help students learn? A recent experiment in a high school provides a cautionary tale.

Researchers at the University of Pennsylvania found that Turkish high school students who had access to ChatGPT while doing practice math problems did worse on a math test compared with students who didn’t have access to ChatGPT. Those with ChatGPT solved 48 percent more of the practice problems correctly, but they ultimately scored 17 percent worse on a test of the topic that the students were learning.

A third group of students had access to a revised version of ChatGPT that functioned more like a tutor. This chatbot was programmed to provide hints without directly divulging the answer. The students who used it did spectacularly better on the practice problems, solving 127 percent more of them correctly compared with students who did their practice work without any high-tech aids. But on a test afterwards, these AI-tutored students did no better: students who just did their practice problems the old-fashioned way, on their own, matched their test scores.

  • vin@lemmynsfw.com · 15 days ago

    Did those using the tutor AI spend less time on learning? That would have been worth measuring.

    • Gestrid@lemmy.ca · 15 days ago

      No, I think the point here is that the kids never learned the material, not that AI taught them the wrong material (though there is a high possibility of that).

      • utopiah@lemmy.world · 15 days ago

        Yes, but there is indeed a deeper point: if the AI is to be used as a teaching tool, it still has to give genuinely useful advice, not good-sounding advice that might actually be wrong. LLMs can feed you wrong final answers, but they can also make poor suggestions about the process itself. So both things are problematic: how the tool is used, and its intrinsic limitations.

  • Mr_Dr_Oink@lemmy.world · 15 days ago

    Because AI, and Google searches before it, is not a substitute for having knowledge and experience. You can learn by googling something and reading about how it works so you can figure out answers for yourself. But googling for answers alone will not teach you much. Even if it solves a problem, you won't learn how, and you won't be able to fix something in the future without googling the answer again.

    If you don't learn how to do something, you won't be experienced enough to know when you are doing it wrong.

    I use Google to give me answers all the time when I'm problem solving. But I have to spend a lot more time after the fact learning why what I did fixed the problem.

    • prosp3kt@lemmy.dbzer0.com · 15 days ago

      Nope, it doesn't work like that. Sometimes you need someone to explain, especially in math. YouTube can take that spot, but not always.

      • Mr_Dr_Oink@lemmy.world · 15 days ago

        That's what I am saying. You need to learn it. If someone explains it to you, then you are learning. If someone just gives you the answer, then you don't understand it, so you are worse at the thing itself.

        You agree with me…

    • Rivalarrival@lemmy.today · 15 days ago

      Paradoxically, they would probably do better if the AI hallucinated more. When you realize your tutor is capable of making mistakes, you can’t just blindly follow their process; you have to analyze and verify their work, which forces a more complete understanding of the concept, and some insight into what errors can occur and how they might affect outcomes.

        • blackbirdbiryani@lemmy.world · 15 days ago

          Because a huge part of learning is actually figuring out how to extract and summarise information from imperfect sources to solve related problems.

          If you use ChatGPT as a crutch because you're too lazy to read between the lines and infer meaning from text, then you're not exercising that particular skill.

          • billwashere@lemmy.world · 15 days ago

            I don't disagree, but that's like saying using a calculator will hurt your understanding of higher-order math. It's a tool, not a crutch. I've used it many times to help me understand concepts just out of reach. I don't trust anything LLMs say implicitly, but it can and does help me.

            • obbeel@lemmy.eco.br · 15 days ago

              ChatGPT's hallucinations inspire me to search for real references. They teach us that we cannot blindly trust what we are told. Teachers, by contrast, will commonly insist that they are correct.

            • WordBox@lemmy.world · 15 days ago

              Congrats, but there's a reason teachers ban calculators… and it's not always for the pain.

              • Skates@feddit.nl · 15 days ago

                There are many reasons for why some teachers do some things.

                We should not forget that one of them is “because they’re useless cunts who have no idea what they’re doing and they’re just powertripping their way through some kids’ education until the next paycheck”.

                • Zoot@reddthat.com · 15 days ago

                  Not knowing how to add 6 + 8 just because a calculator is always available isn't okay.

                  I have friends in my DnD session who have to count the numbers together on their fingers, and I feel bad for them. Don't blame a teacher for banning a calculator because they want you to be a smarter, more efficient, and more productive person.

              • assassin_aragorn@lemmy.world · 15 days ago

                In some cases I’d argue, as an engineer, that having no calculator makes students better at advanced math and problem solving. It forces you to work with the variables and understand how to do the derivation. You learn a lot more manipulating the ideal gas formula as variables and then plugging in numbers at the end, versus adding numbers to start with. You start to implicitly understand the direct and inverse relationships with variables.

                Plus, learning to directly use variables is very helpful for coding. And it makes problem solving much more of a focus. I once didn’t have enough time left in an exam to come to a final numerical answer, so I instead wrote out exactly what steps I would take to get the answer – which included doing some graphical solutions on a graphing calculator. I wrote how to use all the results, and I ended up with full credit for the question.

                To me, that is the ultimate goal of math and problem solving education. The student should be able to describe how to solve the problem even without the tools to find the exact answer.
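
                The variables-first habit can be sketched in code (a hypothetical Python illustration of my own; the function and values are not from any exam):

                ```python
                # Ideal gas law: P * V = n * R * T.
                # Rearranged symbolically first: V = n * R * T / P, so the
                # direct (n, T) and inverse (P) relationships are visible
                # before any numbers appear.

                R = 8.314  # gas constant, J/(mol*K)

                def volume(n_mol: float, temp_k: float, pressure_pa: float) -> float:
                    """V = n*R*T/P, obtained by rearranging the formula, not by trial arithmetic."""
                    return n_mol * R * temp_k / pressure_pa

                # Plug in numbers only at the end: 1 mol at 300 K and 1 atm.
                v = volume(1.0, 300.0, 101325.0)
                print(round(v, 4))  # roughly 0.0246 m^3

                # The inverse relationship: doubling the pressure halves the volume.
                assert abs(volume(1.0, 300.0, 2 * 101325.0) - v / 2) < 1e-12
                ```

                Working with the named variables first, numbers last, is exactly the habit being described.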

              • billwashere@lemmy.world · 15 days ago

                Take a college physics test without a calculator if you wanna talk about pain. And I doubt you could find a single person who could calculate trig functions or logarithms longhand. At some point you move past having to prove you can do arithmetic. It's just not necessary.

                The really interesting question here is whether an LLM is useful as a study aid. It looks like more research is necessary. But an LLM is not smart. It's a complicated next-word predictor, and they have been known to go off the rails for sure. And this article suggests it's not as useful as you might think for new learners.

                • WordBox@lemmy.world · 9 days ago

                  Chem is a long-forgotten memory, but trig… it's a matter of precision to do by hand. Very far from impossible… I'm pretty sure you learn about precision before trig, maybe algebra I or II. E.g., can you accept pi as 3.14? Or 3.14xxxxxxxxxxxxxxxxxxxxxxxxxx?

                  Trig is just rad with pi.
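
                  The precision point can be made concrete: trig by hand is just a truncated series, and the answer is only as good as the digits you carry (a throwaway Python sketch of my own, not tied to any curriculum):

                  ```python
                  import math

                  def sin_by_hand(x: float, terms: int) -> float:
                      """Taylor series sin(x) = x - x^3/3! + x^5/5! - ..., truncated."""
                      return sum((-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
                                 for k in range(terms))

                  # Accepting pi as 3.14 and carrying three terms gives "by hand" precision:
                  approx = sin_by_hand(3.14 / 6, 3)   # should be near sin(30 degrees)
                  exact = math.sin(math.pi / 6)       # 0.5
                  print(round(approx, 4), round(exact, 4))  # agree to about 3 decimal places
                  ```

                  Carry more digits of pi and more terms, and the hand computation converges on the calculator's answer.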

  • Saki@lemmy.blahaj.zone · 15 days ago

    I mean, is it really that surprising? You’re not analyzing anything, an algorithm just spits text at you. You’re not gonna learn much from that.

    • daniskarma@lemmy.dbzer0.com · 15 days ago

      In the study, they said they used a modified version that acted as a tutor: it refused to give direct answers and gave hints toward the solution instead.

      • BakerBagel@midwest.social · 15 days ago

        So it’s still not surprising since ChatGPT doesn’t give you factual information. It just gives you what it statistically thinks you want to read.

    • Cryophilia@lemmy.world · 15 days ago

      Which, in a fun bit of meta, is a decent description of artificial “intelligence” too.

      Maybe the real ChatGPT was the children we tested along the way

  • Kusimulkku@lemm.ee · 15 days ago

    I've found AI helpful when asking it to explain stuff: why is the problem solved like this, why did you use this and not that, could you put it in simpler terms, and so on. Much like you might ask a teacher.

    • JackbyDev@programming.dev · 15 days ago

      To an extent, but it's often just wrong about stuff.

      It's been a good second step for things I have questions about when I can't immediately find good search results. I don't wanna get off topic, but I have major beef with Stack Overflow, and posting questions there makes me anxious as hell: I'll do so much due diligence to make sure a question is clear, reproducible, and not a duplicate, only for it to still get closed. It's a major fucking waste of my time. Why put all that effort in when it's still going to get closed?? Anyways, ChatGPT never gets mad at me. Sure, it's often wrong as hell, but it never berates me or makes me feel stupid for asking a question. It generally gets me close enough on topics that I can search for other terms in search engines and get different, more helpful results.

    • NιƙƙιDιɱҽʂ@lemmy.world · 15 days ago

      I think this works great if the student is interested in the subject, but if you’re just trying to work through a bunch of problems so you can stop working through a bunch of problems, it ain’t gonna help you.

      I have personally learned so much from LLMs (although you can’t really take anything at face value and have to look things up independently, but it gives you a great starting place), but it comes from a genuine interest in the questions I’m asking and things I dig at.

      • utopiah@lemmy.world · 15 days ago

        I have personally learned so much from LLMs

        No offense, but that's exactly what the article is highlighting: students, even the good ones, believe they learned. Once it's time to pass a test designed to evaluate whether they actually did, the results aren't that positive.

    • Homescool@lemmy.world · 15 days ago

      Yep. My first interaction with GPT pro lasted 36 hours and I nearly changed my religion.

      AI is the best thing to come to learning, ever. If you are a curious person, this is bigger than Gutenberg, IMO.

  • randon31415@lemmy.world · 15 days ago

    Kids who use ChatGPT as a study assistant do worse on tests

    But on a test afterwards, these AI-tutored students did no better. Students who just did their practice problems the old fashioned way — on their own — matched their test scores

    Headline: People who flip coins have a much worse chance of calling it right if they call heads!

    Text: Studies show that people who call heads when flipping coins have an even chance of getting it right compared with people who do it the old-fashioned way of calling tails.
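
    The coin-flip framing is easy to sanity-check with a quick simulation (a throwaway Python sketch of my own; nothing here comes from the study):

    ```python
    import random

    random.seed(42)  # reproducible flips

    def success_rate(call: str, flips: int = 100_000) -> float:
        """Fraction of fair coin flips that match one fixed call."""
        hits = sum(random.choice(("heads", "tails")) == call for _ in range(flips))
        return hits / flips

    # Both callers hover around 0.5 -- neither strategy "does worse".
    print(success_rate("heads"))
    print(success_rate("tails"))
    ```

    Equal outcomes reported as a dramatic headline is exactly the satire being made.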

    • IzzyScissor@lemmy.world · 15 days ago

      You skipped the paragraph where they used two different versions of LLMs in the study. The first statement is regarding generic ChatGPT. The second statement is regarding an LLM designed to be a tutor without directly giving answers.

      • randon31415@lemmy.world · 15 days ago

        I didn't skip it. If you are going to use a tool, use it right. “Study shows using the larger plastic end of a screwdriver makes it harder to turn screws than just using fingers to twist them. Researchers caution against using screwdrivers to turn screws.”

  • ???@lemmy.world · 15 days ago

    Yeah, because it's just like having their dumb parents do their homework for them.