Chatbots

Brief description 

Chatbots such as ChatGPT are examples of Large Language Models (LLMs) that can answer questions and generate text that closely resembles something produced by a human. The arrival of chatbots has encouraged us to reflect on how we and our students work with text within an academic context.

Should we use chatbots at university? 

We are educating students for a labour market increasingly characterised by technology that uses AI. By staying at the forefront of this technology and teaching our students how to use chatbots, we can prepare them for their future on the labour market.

Of course, like all technology, chatbots have their limitations and there are several ethical aspects we need to consider when using them. That’s why it’s essential that we learn to use them with integrity – which is also something we can teach our students.  

How can chatbots be used at university? 

Chatbots can be used as a valuable resource to generate ideas, creative inputs and text. As such, they also have the potential to support members of teaching staff and students in their academic work, provided we approach them consciously, reflectively and critically.

Here are some examples of what chatbots can be used for at university and some things to consider when using them.

Teaching students about chatbots

Your students need to know how they may and may not use chatbots. Have a discussion with your students about the limitations of chatbots and the ethical implications of using them. It’s important that students develop a better understanding of how they can use chatbots in a responsible way. It could be relevant to discuss: 

  • How to use chatbots responsibly within the framework of the course/subject and in a way that does not constitute cheating. This also relates to the general advice and guidelines on academic standards – which students can read more about on AU Studypedia.
  • Plagiarism and cheating. Students should be aware of AU’s rules on plagiarism and ensure that they provide correct references when taking text from other sources, including chatbots. 
  • Chatbots and reliability. Students should be aware that they cannot trust all the information they get from chatbots. Chatbots such as ChatGPT are trained largely on text from the internet, which is not always correct. It is therefore important that students check the reliability of the information elsewhere.
  • How to reference chatbots. Students should know how to reference their chosen chatbot if they decide to refer to it, quote from it or include information from it in their assignment. If they use ChatGPT, they can reference it as follows: “OpenAI. (2021). ChatGPT. Retrieved from openai.com/blog/chatgpt/”.
  • The limitations of chatbots. Students should be aware that chatbots are a tool and not a replacement for their own critical thinking and writing. They are always required to exercise their own assessment and expertise when using chatbots. 
  • How the students use chatbots. Ask students how they use chatbots so that you can learn more about their practices. Perhaps you can find an interesting way to integrate this into your teaching. 
  • Materials for students. Students can find more information and inspiration on how to use chatbots in connection with their degree programme on AU Studypedia.

Using chatbots in the classroom

You can use chatbots either as a resource to improve the students’ academic and study competencies or as a current topic that you examine from an academic perspective. There are many ways to do this.

Here are a few suggestions: 

Chatbots as a current topic 

  • Discuss the use of chatbots from an academic angle. Take advantage of the fact that chatbots are a current topic and discuss their use from an academic perspective, for example a legal, philosophical, ethical, pedagogical or technical perspective. 
  • Take the Turing test. Ask the students to write answers to a number of academic questions before class. In class, ask the students to guess whether a given answer has been written by one of their fellow students or by a chatbot. You can do this using a Mentimeter poll or a show of hands. (The Turing test is a way of determining whether a machine can exhibit intelligent behaviour indistinguishable from that of a human.)
  • Bust the myths: confirm or reject the chatbot’s answer. Students should adopt a critical approach to the chatbot’s answers and, if relevant, find sources for its answers to specific questions. This will train the students’ critical awareness and give them an insight into the chatbot’s limitations. 

Chatbots as a resource 

  • Conduct a feedback activity. The students write a piece of text and ask the chatbot for suggestions to improve it (focusing on specific feedback points, if possible). After this, the students improve their own text. If relevant, this could form part of a larger feedback process that includes subsequent peer or teacher feedback. 
  • Ask good academic questions. Students ask the chatbot different versions of the same academic question and reflect on the chatbot’s answers. They can reflect on the types of answers they receive, whether the chatbot understood the question when it was reformulated, or whether the chatbot gave generic answers. By doing this, the students will train themselves to ask good academic questions. 
  • Analyse the chatbot’s answer from a particular academic perspective, for example its ability to add to a rhetorical argument, to produce a specific text genre, or to translate text.   
  • Generate precise formulations. Students can ask the chatbot to suggest ways to vary the title of a text or to change the angle of a research question or hypothesis. 
  • Test how much the students know. Students define or describe a topic and then compare their text with the chatbot’s text. By comparing the texts, students can identify knowledge overlap, knowledge gaps and ways of communicating knowledge. 

 

Getting to know chatbots as a teacher

Try out chatbots like ChatGPT for yourself and experiment with ways to use them within your academic field. It’s a good idea to get to know the technology so that you can teach the students about it and be a good role model for how chatbots can be used within your subject. Here are just a few examples of how you can work with chatbots: 

  • Get feedback on your own text by asking a chatbot for suggestions for improvement. 
  • Get help brainstorming titles or formulations of research questions. 
  • Avoid writer’s block by letting a chatbot inspire you to start writing about a specific topic. 
  • Get a brief introduction to a topic, concept or theory, which can act as a starting point for your own, more in-depth study. 
  • Get an objective view of data by asking the chatbot to identify patterns in a text. 

Using chatbots to support students’ academic work

Chatbots can be a valuable resource for students’ academic work. Of course, students should not use chatbots to produce their text for them, but, used wisely and responsibly, chatbots can help improve students’ understanding of academic content and assist them with the writing process. Students can use chatbots like ChatGPT to:

Get feedback on their own text

Students can give the chatbot a text excerpt, such as a report or an analysis, and ask the chatbot to give feedback on the grammar, coherence and clarity of the text.

Suggest alternative formulations

Students can ask the chatbot to suggest ways to vary the title of a text or to change the angle of a research question or hypothesis. 

Get started on their writing

The chatbot can help students get started on their writing by giving them initial suggestions for sentences or paragraphs, which the students can then work on. This can be helpful for students who often experience writer’s block. 

Understand a topic better

A chatbot can be a good tool for students to get a quick overview of a complicated topic, theory or concept before they investigate it in more detail and depth. 

Assist with the reading process

Students can use chatbots to get a basic understanding of a primary text, especially if the text is complicated. This could make it easier for students to relate to the text and to know what they should take from it and how they should approach it. Students can learn more about reading strategies on AU Studypedia.

Avoiding cheating when using chatbots

Rules regarding chatbots for exams

As a general rule, students are not allowed to use ChatGPT and similar LLMs in connection with exams unless this is explicitly stated in the course description. If ChatGPT and other LLMs are allowed for an exam, it is important that students are aware that the chatbot counts as a source.

  

Using chatbots as a source

If ChatGPT and other LLMs are allowed for exams, students should be aware that the same requirements for quotations and source references apply as for all other sources. Otherwise, their use will be considered plagiarism. Students can read more about AU's rules for exams as well as the use of chatbots on AU Studypedia.

What can you do as a teacher?

In order to reduce the risk that students use chatbots to cheat, consider the following: 

  • Try chatbots for yourself. Get to know the technology so that you understand what it can and cannot do. 
  • Reformulate exam questions. Ask the chatbot to answer a previous exam question. If it produces an academically correct answer, try reformulating the question so that you start to see which questions the chatbot can and cannot answer satisfactorily. 
  • Use other modalities, such as images and videos. This makes it more difficult for the student to use chatbots, which rely on text input. 
  • Ask questions of a higher taxonomic level. Chatbots do better when answering questions of a low taxonomic level. Consider asking exam questions of a higher taxonomic level. The chatbot’s answer to these questions will often be weaker and lack the required context and academic insight.  
  • Hold oral examinations whenever possible. 
  • Reference chatbots correctly when they are referred to or quoted. For example, you can reference ChatGPT in the following way: “OpenAI. (2021). ChatGPT. Retrieved from https://openai.com/blog/chatgpt/”.
  • Ask students to attach appendices to their assignments. If students choose to use their conversations with chatbots as part of their assignment, consider asking students to attach these conversations as appendices. This can help to clarify how the student has used the chatbot.  

Examples of how teachers use chatbots

We asked some teachers how they use ChatGPT in connection with their teaching.

Here are some of the things they said: 

Examples of how students can use chatbots

We asked some students how they use ChatGPT in connection with their degree programme. Here are some of the things they said: 

Limitations of chatbots 

Like any other technology, there are both advantages and disadvantages to using chatbots. It is important to be aware of the limitations of chatbots and to adapt the way you use them accordingly. Here are some examples of the advantages and disadvantages of chatbots. 

Advantages

  • Provides access to a vast amount of information
  • Allows you to extract points from, refer to, summarise or shorten a text
  • Offers explanations that are easier to understand than those in a textbook and can thus help students approach a complicated topic
  • Provides an objective view of data by identifying patterns in a text
  • Offers suggestions for how to improve a text and can thus help students develop as writers

Disadvantages

  • Out-of-date information. Chatbots generate their output based on a limited dataset that may not be entirely up-to-date. For example, ChatGPT’s output is based on data from the internet harvested before a specific date (currently before 2021). This means that if your question refers to events or developments after this date, the chatbot’s output may be inaccurate. As a general rule, a chatbot’s ‘knowledge’ of contemporary events will be limited. The same goes for extremely esoteric or specialised subjects and information. 
  • A skilled imposter. Chatbots can generate a lot of different kinds of text, but they’re not always accurate – and they won’t tell you when they’re making stuff up. They have been trained to generate coherent, fluent texts which often sound very authoritative and convincing. Despite the authoritative tone, these texts are not necessarily correct, and can contain errors, omissions, inaccuracies and even false information that is presented as fact. 
  • Limited subject knowledge. Chatbots don’t necessarily have the advanced knowledge of your subject that you as a student are expected to have. This can affect their ability to analyse and interpret data correctly.  
  • Lack of context. If your question doesn’t provide the chatbot with enough context, the chatbot will fill in the gaps itself, which can result in output that is unrelated to the original question or that is too generic to be useful.  
  • No sources cited. It’s impossible to see what data the chatbot is basing its output on. Its responses are generated based on patterns in text; it doesn’t use sources per se. This means its output is based on general texts from the internet on the subject in question, not on research or surveys.   

Contact

Please contact the editors at AU Educate if you have any questions about the content of the platform or if you need consultation on your teaching from one of the many skilled professionals at the Centre for Educational Development.