OpenAI’s advice on writing with AI
Writing with AI is a contentious topic. Some believe that any use of AI is inappropriate, while others are fine with having AI write complete documents. My opinion falls in the middle: AI is great for refining your writing, but it’s sketchy to represent AI’s writing as your own. OpenAI, the company that brought generative AI (GAI) into public consciousness, agrees.
To understand this better, let’s look at OpenAI’s “A Student’s Guide to Writing with ChatGPT.” The document lays out OpenAI’s advice on how to use GAI (or ChatGPT, since it’s OpenAI) to enhance your writing ethically. I’ve been saying this for a couple of years now. As educators and as users, our mantra should be assist, not replace. That applies to virtually any application of GAI in higher ed. GAI is a helper, not a servant (although the line between the two can be blurry at times).
You should check out the article for yourself, but I want to give a brief overview of OpenAI’s advice, along with my commentary on that advice. OpenAI’s advice is in italics and my comments are in plain text.
Delegate citation grunt work to ChatGPT.
I like the spirit of this advice. Generally, it’s a good idea to offload grunt work if you can. By grunt work, I mean those mundane aspects of a task that need to be done, but that aren’t rewarding or enjoyable. Typically, you don’t add much value to those activities either. Formatting citations certainly falls into the grunt work category for me. But some caution is in order. If citation formatting is important, you (or the student) should double-check ChatGPT’s accuracy. GAI can also check whether all of the cited references appear in the bibliography and vice versa, although I’m not sure I’d rely totally on AI for this.
Quickly get up to speed on a new topic.
This is a great suggestion. GAI is fantastic at giving you the gist of a new topic, especially if you use tools like Perplexity.ai that cite their sources. Google Learn is another great tool for this. What makes this suggestion even better is that it reinforces the idea of getting AI to help you rather than write for you. I wish OpenAI had mentioned the importance of fact-checking and not taking AI’s responses as gospel, though.
Get a roadmap of relevant sources.
Hmm … I had to think about this one. To be fair, the document does mention the need to double-check, but it doesn’t mention the hallucination problem. With ChatGPT’s recently released web search feature, ChatGPT is better at this than it used to be, but I’d still use something like Perplexity.ai. Also, I’m not entirely confident that AI will surface the full range of sources on a topic, but whether this matters depends on the exact nature of the task.
Complete your understanding by asking specific questions.
This is solid advice. I love to use GAI for this sort of thing, although I often like Claude better than ChatGPT. AI can be a great virtual colleague for helping you think through ideas. This isn’t exactly what OpenAI was talking about here, but it’s related. One twist on this advice is to enter your understanding of something, then ask AI if there are any gaps. Chances are, if you can write cogently about a topic, you understand it. But checking whether you’re writing cogently (what a great word) is tough to do by yourself; you really need another set of eyes, even if they’re artificial.
Improve your flow by getting feedback on structure.
OpenAI is suggesting that students get feedback on their outline. I love this idea, since I’m a strong proponent of outlining, especially for anything complex. As was the case with the last point, checking your own outline for flow and gaps isn’t easy, which makes AI a valuable partner. Of course, this assumes that students (or you) create an outline before starting to write. (You really should.)
Test your logic with reverse outlining.
What is reverse outlining, you ask? (Well, that’s what I asked.) Reverse outlining involves getting AI to summarize each paragraph’s main points in outline form. If the outline from AI matches your desired flow, you’re in good shape. You could do this on your own, but you might inadvertently fill in gaps. After all, you know what you meant to write.
Develop your ideas through Socratic dialogue.
This isn’t a bad idea, but I’m skeptical about whether students can actually do this effectively. Socratic dialogue is complex in many respects; it can also be humbling on occasion. A Socrates custom bot would be awesome, though. (A number of these exist, but I haven’t tested them.)
Pressure test your thesis by asking for counterarguments.
This is solid advice. I do this sort of thing a lot. It’s pretty common for me to ask AI to identify weaknesses in my arguments, gaps in my logic, and questionable conclusions. If someone actually does this, their writing will be vastly improved, but it might be a bit painful.
Compare your ideas against history’s greatest thinkers.
Is this a little presumptuous on OpenAI’s part? AI CAN seem to simulate great thinkers from history, I suppose, and this can be a useful technique for improving your ideas. But in some cases, the success of the technique depends on the accuracy of AI’s representation of the great thinker. If the idea is to further stress-test your ideas, this is good advice.
Elevate your writing through iterative feedback.
I do this for most of my writing. In fact, I asked ChatGPT for feedback on the paragraph above. The trick here is to be specific about the type of feedback you want. Do you want feedback on flow, structure, argument, quality of support/evidence, or something more specific? For example, I asked if the paragraph above was a fair comment on OpenAI’s advice. (According to ChatGPT, it was.)
Use Advanced Voice Mode as a reading companion.
This one seems strange and only loosely connected to writing with AI. The idea is to talk with ChatGPT like you would a tutor to help increase your understanding of a text. This could work well IF used correctly and if ChatGPT responds like a tutor and doesn’t just spit out pat answers. Benefiting from challenging texts requires thinking. If AI helps you refine that thinking, great. If AI does the thinking for you, not so great.
Don’t just go through the motions — hone your skills.
This is the crux of it and something we should instill in ourselves and in students. Learning takes work. Offload the work and you offload the learning. That doesn’t mean that all work related to writing is helpful for learning. (I’m looking at you, APA format.) The trick is to use AI for the right things, which is really the tl;dr of the matter. Use AI to assist, not replace, your learning.
Well, that’s all for this time. If you have any questions or comments, you can leave them below, or email me at craig@AIGoesToCollege.com. I’d love to hear from you. Be sure to check out the AI Goes to College podcast, which I co-host with Dr. Robert E. Crossler. It’s available at https://www.aigoestocollege.com/follow. Thanks for reading!