I think AI has upsides, especially in furthering research at a pace never seen before. However, when it comes to classes, I do think easily accessible LLMs run the risk of students cognitively offloading, passing even the smallest (and sometimes most crucial) design and structure choices for their assignments onto the model. For some assignments I can understand why a student would use an LLM, especially to take an idea and ask it to expand on it, but for furthering your development as a student, it doesn't really help. I thought the Dinsmore and Fryer reading delved really deep into why that is: “They do not replace the need for humans to learn and practice the necessary processes to complete a task themselves in order to build on those processes to continue to develop in a field, domain, or topic” (Dinsmore & Fryer 7).
Although I agree that overuse of LLMs stunts your growth as a student, I believe the onus is on schools to figure out how to incorporate AI into their curricula. AI is rapidly evolving, and, for better or worse, we can't do anything about that.
An AI workshop I attended in Columbus a few days ago actually featured a professor from the University of Cincinnati, who talked about how a big focus of theirs is making students more AI-literate and how they're using AI agents to help personalize learning for their students.