Week 2 – Reflection on Ethics of AI – Let Me Introduce You to My Friend, ChatGPT?

When I thought about the ethical issues involved in AI before this class, the first things that came to mind were the invasion of privacy (e.g., the use of personal data or surveillance systems), the displacement of humans in various jobs (e.g., in data entry and analysis), and plagiarism (i.e., using AI tools to create work without giving credit).

This week’s lectures, the class readings, and discussions with peers have all taught me that the ethical issues of AI are far more wide-reaching and complex. What I did not know before, and what therefore really struck me, was that a conversation with an LLM like ChatGPT can reportedly consume about a bottle’s worth of water for data-center cooling. (Coincidentally, a friend of mine posted this on her Instagram this week: https://www.instagram.com/zerowastestore/reel/DHB4s3UyIX0/). As we learned in the article by Bender et al. (2021), the people who suffer the most from the negative environmental consequences of LLMs are usually those who benefit the least from their progress, which highlights the intersectionality of ethical issues in LLMs.

After class, I continued reflecting on potential areas of conflict. One area we haven’t really discussed yet, but that is certainly relevant, is the use of AI to provide psycho-social support, such as in psychological counseling and perhaps even as a replacement for social and romantic relationships. This is an area of AI use that surfaced long ago in utopian and dystopian films (e.g., Her, 2013) but, as a quick online search shows, no longer exists only in the movies.

Several articles (e.g., Ping, 2024) have investigated the use of AI technology, such as AI chatbots, in psychological counseling. Scholars have also examined the potential of AI for social and romantic relationships, focusing, for example, on how people can form meaningful relationships with chatbots (e.g., Pan & Mou, 2024). Considering that we live in a world in which individuals seem to increasingly prioritize self-fulfillment over (romantic) relationships, this area of AI seems particularly interesting to me.

With the last ChatGPT update, I have noticed that the chatbot has become more personal, for instance, by using smileys and asking questions like, “Let me know if I can help you with x or y….”

  • What do you think about the use of AI in areas like counseling or as a replacement for or addition to social relationships?
  • Do you use neutral or friendly language when engaging with an AI chatbot?

I’m looking forward to hearing your opinions on this topic! 🙂

Sources:

  • Bender, E. M., McMillan-Major, A., Gebru, T., & Shmitchell, S. (2021). On the dangers of stochastic parrots: Can language models be too big? Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency (FAccT ’21), 610–623. https://doi.org/10.1145/3442188.3445922
  • Her. (2013). IMDb. https://www.imdb.com/title/tt1798709/
  • Pan, S., & Mou, Y. (2024). Constructing the meaning of human–AI romantic relationships from the perspectives of users dating the social chatbot Replika. Personal Relationships, 31(3), 1090–1112. https://doi.org/10.1111/pere.12572
  • Ping, Y. (2024). Experience in psychological counseling supported by artificial intelligence technology. Technology and Health Care, 32(6), 3871–3888. https://doi.org/10.3233/THC-230809
