The WSU Office for Teaching and Learning (OTL) has gathered suggestions for using ChatGPT from POD listserv discussions and chats. Here are some ideas to consider:
Embrace the tools used in the real world. One rule of thumb for teaching is to ask whether and how a task would be done in the real world or workplace. Would professionals use AI to help with that task? Then let students use it, too. Of course, that's not always feasible or desirable.
Have students generate something with AI and then discuss how to improve it, or how it could be used in the professional world to speed up tedious or repetitive tasks.
AI (so far) is not very good at scaffolding work from one assignment to another, so writing assignments that build on prior work make it more difficult to rely on AI for meaningful content. Topic proposals, intro paragraphs, drafts and revisions: any kind of scaffolding is difficult to fake with AI.
AI (so far) has no concept of "indexicality": it can't refer to external objects in a meaningful way. If your writing prompt is "based on the annotated bibliography your peers shared last week, write about the trends or themes..." AI has absolutely no idea what to do with a prompt like that.
AI (so far) doesn't know how to synthesize knowledge or make inferences. If you ask it to write a book report or summarize an article, it can do that surprisingly well. If you ask it to compare and contrast the themes of two different books, it struggles to do that coherently and tends to just stack two book reports on top of each other.
OpenAI has released a classifier trained to distinguish between text written by a human and text written by AIs from a variety of providers.