Post 5: Academic use of AI

When it comes to AI use in the classroom, the line between “enough” and “too much” is not so clear. According to Dinsmore and Fryer [1], the most important part of the learning process is also the part that using AI tends to skip over: the struggle phase. There is an argument to be made that this is evidence AI does not belong in the classroom, end of story, and I do not disagree. As a student, my end goal is not just to get a degree but also to use what I’ve learned in my endeavors. A degree is something you can put on your resume as proof that you did the work and know things that others don’t. AI clearly muddies this process. Once students have obtained their degrees, they can be hired on the strength of those credentials. They are no longer “preparing,” so the use of AI in the workplace is a much more open discussion. Traditionally, whoever gets things done the fastest is the “better” worker, provided both are doing the job at equal levels of quality. In that case, AI is actively cutting down on wasted time and inefficiency; it is not sacrificing any level of quality, since it is not the one determining the quality.

Works Cited

[1] Dinsmore, Dan L., and Luke K. Fryer. “What Does Current genAI Actually Mean for Student Learning?” Learning and Individual Differences, vol. 125, 2026, article 102834, https://doi.org/10.1016/j.lindif.2025.102834.

Post 5: Academic AI

I think AI has upsides, especially in furthering research at a pace never seen before. However, when it comes to classes, I do think easily accessible LLMs run the risk of students cognitively offloading, passing off even the smallest (and sometimes most crucial) design and structure choices for their assignments. For some assignments I can understand why a student would use an LLM, especially to take an idea and ask it to expand on it, but for furthering your development as a student, it doesn’t really help. I thought the Dinsmore and Fryer reading delved really deep into why that is: “They do not replace the need for humans to learn and practice the necessary processes to complete a task themselves in order to build on those processes to continue to develop in a field, domain, or topic” (Dinsmore & Fryer 7).
Although I do agree that overuse of LLMs does stunt your growth as a student, I do believe the onus is on schools to figure out how to incorporate AI into their curriculums. AI is rapidly evolving – which, for better or worse, we can’t do anything about.
An AI workshop I attended in Columbus a few days ago actually featured a professor from the University of Cincinnati, who talked about how a big focus of theirs is making students more AI-literate and how they’re using agents to help personalize learning for their students.

Post 5 Academic AI

When it comes to AI use in the classroom, in your professional lives, how do we determine how much is too much? 

When it comes to AI in the classroom, the most challenging part is determining the appropriate level of its involvement in the educational process. At first glance, AI seems to be a shortcut, but after this week’s readings, it became clear to me that there is another side to the coin.

One line that stuck with me, that “there are no shortcuts” in education, got me to reflect on how I use AI in my studies. I realize that if it does the job for me, I do not learn anything, because I am simply repeating answers generated by an algorithm.

However, I believe that AI itself is not bad. On the contrary, the problem lies in people’s excessive use of it. In this regard, for me, AI stops being beneficial when I stop working on the assignment and let it work for me.

In summary, when using AI, one should find the right balance between treating it as a tool and abusing it as a substitute for one’s own effort.

How can we (as a college) best communicate those issues to students – for example, in institutional or course policies or guidelines?

First, you have to understand what these “issues” are. The main issue is that students are using AI to come up with ideas instead of drawing on their prior knowledge. However, we have to understand the motive behind a student’s desire to use AI, which is to pass a class and get a good grade. When the curriculum is unengaging and overwhelming, with rigorous grading, students aren’t enticed to learn and will likely use AI in the process. Dinsmore and Fryer indirectly address my point when they describe how “These trends illustrate the need to possess a wide range of cognitive and motivational attributes (e.g., knowledge, interest, engagement, strategies) to be able to produce positive learning outcomes, whether that be reading, problem solving, or many other activities” (2).

This suggests that students have an innate desire to use their prior knowledge but learn best when they are engaged and interested. Having said this, it is evident that schools need more creative learning and more opinion-based questions, where students can pick their own brains and come up with ideas they feel strongly about, which would heavily discourage the use of AI.

Dinsmore, Dan, and Luke K. Fryer. What Does Current GenAI Actually Mean for Student Learning?, 5 Mar. 2025, https://doi.org/10.31219/osf.io/f8z56_v1.

Post 5 – Academic AI


I think AI is most helpful right up until it starts doing the thinking for us. When the only real work is Ctrl+C and Ctrl+V, then it is too much. I think the line is crossed when AI replaces the skills we’re supposed to build. If it writes, summarizes, or analyzes everything for us, then we’re not actually learning those processes. Dinsmore and Fryer (2026) say “there are no shortcuts” to developing knowledge and expertise. That directly challenges the idea that AI can just “free us” to think at a higher level.

I think the biggest gray area is intent. Using AI to check your understanding or generate ideas can support learning, but using it to produce answers and finished work can replace learning entirely. The tool isn’t the problem; the way it’s used is.

I think this will matter even more in future jobs. AI can make work faster, but if people rely on it too much, they risk losing the ability to think independently and to execute their jobs on their own. The goal shouldn’t be to avoid AI, but to use it in a way that still makes you think.

Dinsmore, D. L., & Fryer, L. K. (2026). What does current genAI actually mean for student learning? Learning and Individual Differences, 125, 102834. https://doi.org/10.1016/j.lindif.2025.102834

Prompt 5 – Academic AI

When it comes to AI use in the classroom, in your professional lives, how do we determine how much is too much?

Regarding the use of AI in the classroom, LLMs should serve as tools that help facilitate learning, such as generating ideas, finding relevant sources (checked by you), and summarizing. What AI should not do is act as your primary learning source, standing in as a teacher when you are trying to learn. “Expertise provides the clearest use case for automating less complex tasks for the expert (such as summarizing) since the expert should possess the requisite knowledge to spot inconsistencies in a genAI summary” (Dinsmore & Fryer, 2026). It becomes too much when generative AI is used as a primary learning tool by kids who haven’t fully developed their learning and cognitive processes. Additionally, in the professional realm, some jobs require the use of generative AI for tasks such as video, image, and text generation. Using AI to help facilitate work is generally seen as good; employees just need to be aware of how best to use these tools. Too much is when the job requirements are completed strictly through AI, treated as the expert in place of one’s own learning and knowledge.

Reference: Dinsmore, D. L., & Fryer, L. K. (2026). What does current genAI actually mean for student learning? Learning and Individual Differences, 125, 102834. https://doi.org/10.1016/j.lindif.2025.102834