Updated: Apr 11
After long months of not writing posts here on my own blog, I'm back with a ChatGPT-related one. Read on to find out why I was over the moon to receive (and detect) a task submission written by AI!
I've been playing around with the OpenAI-developed tool for a while, looking at the potential of using artificial intelligence in materials writing and lesson preparation. You can read how I used ChatGPT in a lesson here, and how I was experimenting with another tool called Twee here. I also had the opportunity to do a short presentation on the dangers and benefits of using AI in higher education for one of the faculties at my university.
These are the things that I've noticed about this phenomenon so far:
ChatGPT and similar services are definitely here to stay.
Instead of trying to kill, penalise, or ban these tools, teachers should find alternative ways of assessing, and students should be taught how to use them for learning, not just for getting an assignment out of the way.
In Bloom's taxonomy (remember, understand, apply, analyse, evaluate, create), ChatGPT can easily perform any task on the first three levels, if not partially on the fourth as well. This means we need to move on to more cognitively demanding tasks when designing assessment.
It can simplify so many time-consuming tasks, but my favourite is sample text creation. There have been many occasions when I spent hours trying to find a text online that would go with the topic of my lesson. But now I just write a good prompt and get the result in seconds!
Yesterday I received the first AI-written task submission from one of my university students, and I couldn't be happier. Why? It made me realise how important it is for teachers to use ChatGPT regularly, because that's how they'll learn to detect it - often more reliably than online detection tools.
So what gave this submission away?
The task (for my teacher trainee students) was to evaluate a written letter at B2 level: find strengths and weaknesses, support them with specific examples, and suggest two things to improve, with clear action points for the future. These are the things that made me suspicious about the assignment:
The starting sentence - If you've asked ChatGPT a few things, you must have noticed that the first sentence is almost always something like "Based on your question/request, here is my response/analysis:"
The style - Having other student-written submissions for comparison helps a lot here: that's how you get a feel for the fact that ChatGPT-written responses tend to be professionally neutral in style (unless you ask for something else). They are also stylistically consistent throughout, which human students don't always manage.
Grammar and vocabulary - Again, it's really worth knowing your students' abilities and skills, because you will quickly see that someone who cannot form complex sentences in speaking without making mistakes will probably not produce a completely fluent and error-free text in writing.
Repetitions - ChatGPT tends to repeat the key words that were given in the prompt.
Being vague - This tool is very good at producing responses that seem to require some "thinking", but you can easily see its deficiencies (for now) when you ask more specific questions. My student's response for the "suggestions for improvement" part was well written but had no real substance.
Tips for the future
To be honest, I wasn't actually that joyful when I received this response, because of course it hurt a little that my student just wanted to get the task over with quickly and didn't see how it could add to their skillset. But I've drawn my conclusions for the future:
Design more hands-on tasks that require students to specify and justify what they're thinking or doing
Put more emphasis on what happens in class - Give students project tasks and make them show their skills in class (after all, with teacher trainees the most important thing is that they can actually do what they're taught)
Have a conversation with them within the task submission - I didn't penalise this student for having used ChatGPT; I just asked them to justify certain parts of the text and add detail where I think it's needed. Asking further questions might make them think more. (Of course, they could write those answers with AI as well, but I'm hoping they'll at least engage with my comments to some extent.) If they decide not to do anything more, they will keep their low score, which they didn't actually get for using ChatGPT but for not doing what I asked for.
Ask for a ChatGPT method section - It's no problem if they decide to use the service, but ask your students to explain how they went about it: what their prompt was, what they decided to modify, and how satisfied they were with the result.
Use ChatGPT myself - The best way to get better at sensing that something might be fishy is to use the tool regularly yourself: you essentially train yourself to see what it can do and how.
What's your reaction to your students using ChatGPT for submissions?
Further reading:
ChatGPT and the Future of University Assessment https://katelindsayblogs.com/2023/01/16/chatgpt-and-the-future-of-university-assessment/?fbclid=IwAR1loIRmm0f_chyKJcsyC1Trto_0tIXa_dKx6Oz646qWHCNXWYehjH-_hAs
Updating university syllabi for ChatGPT https://medium.com/@rwatkins_7167/updating-your-course-syllabus-for-chatgpt-965f4b57b003