Academic writing

The use of AI can be characterized by the extent to which it replaces individual learning and creativity. We use AI so much in our professional and personal lives that, in some ways, we undermine our own creativity. AI can be useful, for instance, when brainstorming or outlining ideas, but if professionals or students depend on it to handle all of the thinking or writing, it becomes a crutch that impedes growth. Generally, AI should complement human labor rather than take its place.

This week’s readings made me realize I should not rely on AI to do my work; I should only use it for brainstorming ideas. For example, we should not rely on AI to summarize a reading or a paper, as it can miss small details that we need to know.

Over time, carefully planned AI policies can help shape a workforce that is tech-savvy, morally anchored, and creatively empowered. In the future, AI might even start training humans for their job roles.

Academic writing

This week’s readings made me think more seriously about how AI fits into both school and work. It’s easy to rely on it too much, especially when you’re stuck or just being lazy. For me, it’s okay to use AI to help organize thoughts or brainstorm ideas, but not to do all the work for you. Tang et al. (2023) stress the importance of transparency and of understanding that AI is a tool, not a collaborator, noting that “Generative AI tools are nonlegal entities and are incapable of accepting responsibility and accountability for the content.” That hits home because AI can’t replace you. If we’re not honest about how we use AI, it raises big questions about authorship and integrity.

The grey areas with AI are hard to ignore. It can spark new ideas and help shape your work, but it also blurs the line between your own thinking and what the AI generated. That’s why we need clear, supportive guidelines: not rules that punish, but ones that encourage honesty. AI has real potential in lots of fields, but that only works if we’re transparent about how we’re using it.

https://sigmapubs.onlinelibrary.wiley.com/doi/pdfdirect/10.1111/jnu.12938

Week 5: Academic Writing

This week’s readings got me thinking: How much is AI not just in our classrooms, but also in our future careers? It feels like we’re navigating a brand-new world, and the lines haven’t been drawn yet.

One area that stood out from the paper “Augmenting the Author” is the issue of transparency. The authors highlight the “depth of transparency in researchers’ access and utilization of AI” and the concerns this raises about the “reliability and credibility of AI-generated text.” This really resonates when we think about assignments or professional reports. How do we know if the work is truly our own or mostly AI-generated?

A key grey area lies in defining authentic authorship and intellectual contribution when AI is involved. For example, in one of my previous classes, an alumnus from Pfizer told us that they sometimes use AI to generate mock drug syntheses, which they then refine and test. I found this interesting in relation to this week’s discussion, but I didn’t get a chance to ask about it at the time; AI usage and credited contributions can be a touchy subject.

As the researchers point out, there’s a concern that reviewers (or instructors) might end up “validating AI’s work” instead of the author’s. Moving forward, clear guidelines emphasizing transparency about AI use will be crucial. We need to foster a culture where acknowledging AI assistance isn’t seen as a weakness but as a step toward ethical and responsible innovation. 

Link: https://dspacemainprd01.lib.uwaterloo.ca/server/api/core/bitstreams/40009604-deb5-4250-a341-ceccb3c6561c/content

Academic writing and AI

When I first asked ChatGPT to outline my lecture, I felt the thrill of amplified creativity, but that lasted only the first few times. Balancing AI’s power means acknowledging its role without letting it subsume ours. As Tang et al. remind us, “transparency in declaring the use of generative AI is vital to uphold the integrity and credibility of academic research writing.”

In practice, the grey zones are everywhere: Is summarizing primary sources with AI good scaffolding or an academic shortcut? Can we allow AI-driven captioning in class for accessibility but forbid it in formal assessments? Policies must draw clear boundaries, but that is difficult given how little we know about how far AI’s capabilities will extend. Transparent guidelines, akin to IRB declarations, could require students and faculty to note AI-assisted sections, letting evaluators focus on original insight. AI could also be normalized as an assistive tool across all forms of assignments, with in-person, discussion-based evaluations prioritized instead.

Ultimately, defining “too much” is less about word count and more about intent. If AI amplifies human potential and honors ethical guardrails, it’s a tool; if it replaces responsibility, it’s overreach.

Week 5: Academic Writing

Generative AI has quickly become one of the most transformative technologies of our time since its mainstream debut in 2022. By early 2023, ChatGPT had gained over 100 million users, leading to widespread integration of AI into our daily lives. Using AI is not black and white; there are many grey areas, especially around ethics and creativity, which humans are still navigating.

Our readings this week raised an interesting point: “Generative AI tools are nonlegal entities and are incapable of accepting responsibility and accountability for the content within the manuscript. Additional ethical concerns beyond authorship issues include copyright implications arising from the use of third-party content, conflict of interest, and the broader concept of plagiarism which encompasses not only verbatim text copying but also the replication of ideas, methods, graphics, and other forms of intellectual output originating from others” (Lund et al., 2023). Lund and his co-authors point out that AI can’t be held responsible for what it creates. What happens if something goes wrong, and who would be held accountable? If AI writes something based on thousands of existing texts, who actually owns that work? As a student, it’s easy to blur the line between getting support and cutting corners. AI is convenient because you can ask it to think critically for you, which defeats the purpose of learning. College is where students are supposed to develop their voice and their ability to communicate complex ideas.

For me, the key to developing my own perspective on academic and professional use of AI is transparency and balance. I think AI can be a great tool if we’re honest about how we use it and make a conscious effort not to let it replace our own thinking. Just like with calculators, it’s not about banning the tech but about learning how to use it responsibly and with intention.

Creative Use for AI

In last week’s lab, the class used AI to engage in creative thinking, starting by generating a poem. At first the result was very generic because of weak prompting, but after I gave it more specific details about what kind of poem it was supposed to write, the LLM’s responses became more sophisticated. The same applied to other applications, such as building a DnD character in my case. I have not played DnD before, but I am aware of the creative freedom the journeys can take, so I wanted to see what ChatGPT would create for me. I asked ChatGPT for a character name, backstory, and suggested items.
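To make the prompt-refinement idea concrete, here is a minimal sketch of how the same vague-versus-specific contrast could be scripted against a chat model. In the lab we simply typed prompts into the ChatGPT interface, so this is only an illustration; it assumes the OpenAI Python SDK, an API key in the environment, and an illustrative model name.

    # Sketch only: assumes the OpenAI Python SDK (pip install openai) and an
    # OPENAI_API_KEY set in the environment; the model name is illustrative.
    from openai import OpenAI

    client = OpenAI()

    def ask(prompt: str) -> str:
        # Send a single prompt and return the model's reply text.
        response = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content

    # A weak, underspecified prompt tends to come back generic.
    print(ask("Write me a poem."))

    # Adding specifics, like asking for a DnD character with a name, backstory,
    # and suggested items, usually produces a more sophisticated response.
    print(ask("Create a DnD character for a first-time player: give a name, "
              "a short backstory, and three suggested starting items."))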

When it comes to this creative response, I have to say I do not agree that something needs to be novel in order for it to be creative. A lot of this response has pieces in it that are, on their own, not original. But when you consider all of the pieces put together to make this unique character, that is when you get novelty, surprise, and value all in one (Arriagada & Arriagada-Brunea, 2022). The surprise factor comes from the character’s backstory, alongside the value, and more value can be expected if you intend to use this character during a campaign. That is why I thought this response was great from a creative outlook. DnD is full of creativity, and you could seemingly regenerate the same prompt and get an endless number of characters to choose from. This method of using AI in a field that is already very creative is a perfect use for it. Creating characters this way can cut down on the time spent brainstorming, or even eliminate it entirely, while still producing a unique character to take on campaigns.