There has been a lot of both enthusiasm and doubt about ChatGPT since OpenAI first introduced this natural language processing (NLP) chatbot to a wider audience in November 2022. ChatGPT allows users to conduct searches and create text by asking questions and engaging in humanlike conversations with the chatbot.
As long as we treat artificial intelligence solely as an “idiot savant” – something that is extremely good at one task, and one task only – it does not pose a real threat to knowledge workers. But ChatGPT has shown that many of these jobs can in fact be replaced by artificial intelligence (AI) colleagues.
In educational institutions, the emergence of this kind of technology has brought a new challenge. We are now trying to catch up with this development by creating regulations and instructions on how to deal with NLP applications. But should we concentrate solely on regulating how students use AI? Or should we greet the opportunities of artificial intelligence with open minds and acknowledge its importance as a tool for business professionals?
Should we trust a hallucinating assistant?
If we accept the fact that students are going to use ChatGPT, we should educate them to critically assess the information provided by artificial intelligence. ChatGPT is known to suffer from hallucinations, which means that it sometimes fabricates facts. Students should be able to formulate their prompts in such a manner that they get the information or the results they are actually looking for.
Students need digital literacy, which is the ability to find, evaluate, and communicate information on digital platforms. According to Weimann-Sandig (2023), students should have enough knowledge about the topic they are searching for to use the right words and judge whether the resulting information is correct. They should also have a basic understanding of how to do research and where to find reliable sources.
To teach students how to use ChatGPT as a tool for gathering information, we have asked them to form a preliminary understanding of a given topic by using artificial intelligence tools. This challenges students to phrase their prompts in such a way that they get usable results. The difficulty of this kind of exercise can be adjusted by either providing students with ready-made questions or asking them to formulate the prompts themselves. The results are then critically assessed and discussed during the lectures.
As teachers, we emphasize that the text ChatGPT provides is not something to copy and paste into one’s assignment, but a starting point for the research and thinking processes.
If you cannot beat it, embrace it
Even though it is sometimes biased or prone to hallucinations, ChatGPT is a powerful tool for searching for information and composing text. As it is easily accessible and widely discussed, it would be naive to think that students would not take advantage of it while preparing their written assignments.
As Murray and Williamson (2023) state in their paper presented at the EDULEARN23 conference, this confronts us with new dilemmas of plagiarism and assessment. How should we treat assignments that are (partly) written by artificial intelligence instead of our human students? When students use ChatGPT in their essays or other written assignments, whose knowledge are we assessing? How do we ensure fair evaluation for all students if some of them take advantage of artificial intelligence, whereas others rely solely on reference materials and their own prior knowledge of the subject at hand?
We are not able to prevent students from using AI tools. By making the use of artificial intelligence as difficult as possible, we create an environment that puts students in unequal positions: some use AI against instructions, while others do the work themselves as advised. Instead of prohibiting the use of artificial intelligence tools, we should design our assignments more carefully.
Artificial intelligence is good at gathering information and composing, for example, texts based on given rules. But it is not as good at applying knowledge to real-life situations. If we ask our students to apply the knowledge and assess how well they can do that, the original source of the knowledge is no longer relevant.
As academic institutions, we have certain things we need to teach our students about references and sources, but otherwise we might proudly start practicing applied sciences.
References
Murray, J. & Williamson, A. 2023. To embrace, or not to embrace: ChatGPT is the question. EDULEARN23 Proceedings.
Weimann-Sandig, N. 2023. Digital literacy and artificial intelligence – Does ChatGPT introduce the end of critical thinking in higher education? EDULEARN23 Proceedings.