We Need to Change the Way We Let Students Use AI


                                                            Picture: Steve Johnson   

    Is generative AI helping students learn or just helping them finish assignments faster? That question gained national attention after MIT researchers released a study nicknamed “Your Brain on ChatGPT,” which found lower neural connectivity among students who used ChatGPT while writing essays. As reported in The Chronicle of Higher Education’s article The Student Brain on AI, the findings sparked fears that AI might be damaging students’ thinking. But the researchers themselves cautioned against that conclusion. Their study does not show that AI makes us less intelligent. It shows that our brains operate differently when we use tools.

    The MIT study used EEG scans to measure brain activity while 54 students wrote SAT-style essays. Students who used ChatGPT showed lower connectivity during the task and produced more homogeneous essays, often converging on similar ideas. That raises an important concern: AI may subtly shape not just how we write, but what we think. At the same time, cognitive psychologists quoted in the Chronicle piece emphasize that "cognitive offloading," relying on tools to handle mental tasks, isn't new. We offload memory to calculators, search engines, and even sticky notes. Offloading a task does not necessarily damage intelligence overall.

    According to the Reuters report "Google commits $1 billion for AI training at US universities," Google has pledged $1 billion over three years to expand AI access across American higher education. More than 100 universities have already signed on, and students will receive free access to advanced AI tools like Gemini. AI is no longer just an optional app students experiment with; it's becoming institutional infrastructure. If AI can reduce originality in certain contexts, what happens when every student has unlimited access? I feel like this is already the case with ChatGPT: even the free version is quite advanced. In a Harvard physics course, a carefully designed AI tutor improved student engagement and test performance. The difference was in design. The AI gave hints and guided thinking rather than simply generating answers.

    That distinction is also emphasized in Stanford University’s student guide AI and Your Learning: A Guide for Students. The guide reminds students that learning depends on what they think and do. AI produces responses based on statistical patterns, not understanding. It can deliver incorrect information, misattribute sources, or reflect biases in its training data. Because of this, Stanford advises students to treat AI use like receiving help from another person. Always check course policies, disclose use, and avoid delegating substantial work to the tool.

    Most importantly, the Stanford guide connects AI use to learning science. Research shows that deep learning strategies, like explaining ideas in your own words, comparing concepts, and self-quizzing, are more effective than passive strategies like rereading. Does AI support these deeper strategies, or replace them? If AI generates a summary before you read, you might save time, but you may also miss the cognitive effort that builds understanding. If AI quizzes you and prompts reflection, it might enhance engagement. I think that could be a great way to teach people how to use AI properly instead of relying on it entirely to hand us answers to the questions we ask.

    What surprised me across all three sources is how consistent one theme is: context matters. The MIT study suggests early reliance on AI may reduce cognitive engagement, but later strategic use might enhance it. The Harvard tutor study shows that guided AI can support learning. Stanford's guide reinforces that students must remain active participants in their own learning process. AI itself is not the problem; passive use is. As universities invest billions and tech companies integrate AI into every platform, the responsibility increasingly shifts to students and educators. The question is whether we will use it as a shortcut that replaces thinking, or as a scaffold that strengthens it.

