Researchers compare math progress of almost 1,000 high school students
Does AI actually help students learn? A recent experiment in a high school provides a cautionary tale.
Researchers at the University of Pennsylvania found that Turkish high school students who had access to ChatGPT while doing practice math problems did worse on a math test compared with students who didn’t have access to ChatGPT. Those with ChatGPT solved 48 percent more of the practice problems correctly, but they ultimately scored 17 percent worse on a test of the topic that the students were learning.
A third group of students had access to a revised version of ChatGPT that functioned more like a tutor. This chatbot was programmed to provide hints without directly divulging the answer. The students who used it did spectacularly better on the practice problems, solving 127 percent more of them correctly compared with students who did their practice work without any high-tech aids. But on a test afterwards, these AI-tutored students did no better. Students who just did their practice problems the old-fashioned way — on their own — matched their test scores.
I don't even know if this is ChatGPT's fault. This would be the same outcome if someone just gave them the answers to a study packet. Yes, they'll have the answers because someone (or something) gave them the answers, but they won't know how to get those answers unless someone teaches them. Surprise: for kids to learn, they need to be taught. Shocker.
I've found ChatGPT to be a great learning aid. You just don't use it to jump straight to the answers; you use it to explore the gaps and edges of what you know or understand. Add context and details, not final answers.
The study shows, though, that once you remove the LLM, the benefit disappears. If you rely on an LLM to break things down or add context and details for you, you don't learn to do those things on your own.
I used it to learn some coding, but without it, I couldn't replicate my own code. It's a struggle, and I don't think using it as a teaching aid is a good idea yet, maybe ever.
There are lots of studies out there, and many of them contradict each other. A study with references contributes to the discussion, but it isn't the absolute truth.
I wouldn't say this matches my experience. I've used LLMs to deepen my understanding of a topic I'm already skilled in, when I'm just trying to understand something nuanced. Being able to interrogate them on a very specific question, where I can appreciate the answer, is really useful and definitely sticks with me beyond the chat.