As I've mentioned before, I'm not a fan of the term "prompt engineering" (you can read my thoughts here: LINK). The gist of my view is that "engineering" isn't the right description for what most of us do with generative AI. (From here on, I'm going to use AI to refer to generative AI.) I prefer "prompt design," but even "design" sounds more elaborate than how I use AI most of the time. There is a place for complex, engineered prompts and for carefully designed prompts as well. But the vast majority of the time, I use three short, simple prompts, and I often use them several times in a single conversation. What are these super prompts? Here they are:
What do you think?
Does this (that) make sense?
Please critique this (sentence, paragraph, passage, article, etc.).
That's it; they're just that simple. But they're also incredibly powerful and flexible. Let's break down each of them.
What do you think? This is my most-used prompt. I use AI a lot for developing ideas. One of AI's strong points in ideation (what a word) is that it can point out the strengths, weaknesses, and gaps in your thinking. Typically, I'll lay out some idea, then ask AI what it thinks. That's it. This is kind of like having a cup of coffee (or something stronger) with a friend who helps you think through stuff. You can almost picture it: you rattle on about some idea as your friend sips their beverage. You finish with "What do you think?" and your friend gives their input. It's really the same sort of exchange with AI.
The upside of this approach is that it almost always yields something useful. The downside is that the results aren't always focused on what I had in mind. Sometimes AI will go further than I'd like (just like the proverbial friend). Usually, I don't care much about these divergences. I either just ignore the extras or refine my instructions.
Take a look at the example below, which is from Claude. Notice that I gave Claude a bit of context, which may or may not be necessary depending on the situation. (It almost never hurts though.) Then I laid out my idea and asked, "What do you think?"
Prompt: I write a newsletter, AI Goes to College, which is focused on helping higher education professionals navigate the world of generative AI. I'm thinking about an article on my three favorite prompts, which are super simple and useful: What do you think?, Does that make sense?, and Please critique this text. What do you think?
Response: That's an interesting concept for your AI Goes to College newsletter article. Focusing on simple yet powerful prompts is a great way to introduce higher education professionals to effective interactions with AI systems. Let's break down each of your chosen prompts:
"What do you think?" This open-ended prompt encourages the AI to provide its analysis or opinion on a given topic. It's versatile and can lead to insightful responses across various subjects.
"Does that make sense?" This prompt is excellent for checking the clarity and coherence of ideas or explanations. It can help users refine their understanding or communication of complex topics.
"Please critique this text." This prompt leverages the AI's analytical capabilities to provide feedback on written work, which can be particularly useful in academic settings.
These prompts are indeed simple and broadly applicable, which makes them accessible to AI newcomers. They also demonstrate different ways to engage with AI: seeking opinions, checking understanding, and requesting feedback.
To enhance your article, you might consider:
Providing specific examples of how each prompt could be used in higher education contexts.
Discussing how these prompts can be combined or expanded for more nuanced interactions.
Explaining why these particular prompts are effective and how they relate to AI's capabilities and limitations.
Would you like me to elaborate on any of these points or suggest ways to structure your article?
Claude's response was pretty good. It validated the idea by providing its own rationale for why the three prompts are effective. That's great, but I already thought this was a good idea for an article, so there's not a ton of value added. That changes quickly, though, when Claude brings up the different ways to engage with AI (seeking opinions, checking understanding, and requesting feedback). This makes total sense, but it hadn't occurred to me.
Claude also gave me some ideas for enhancing the article. I'd already planned to do #1 and part of #3, but the second suggestion wasn't on my radar, and it's a solid one.
Does that (this) make sense? This is my go-to prompt for a couple of situations: when I'm trying to understand a concept or when I'm trying to develop a complex idea. The complex ideas often involve either seeing something common from a different angle or synthesizing disparate concepts. Occasionally, I'll use this prompt to check how I'm planning to explain something to a particular audience, such as my students.
Here's an example related to this article. You might remember that I framed the article around the claim that "prompt engineering" is kind of scary. That seems to be a reasonable opening, but I wanted another opinion ... Claude's, in this case.
Prompt: Thanks! One reason I want to write this article is that the term "prompt engineering" is kind of scary and off-putting to more casual users and new users. Does that make sense?
Response: Yes, that absolutely makes sense. The term "prompt engineering" can indeed be intimidating, especially for newcomers to AI technology. Here's why your approach is valuable:
Simplicity vs. complexity: "Prompt engineering" sounds technical and complex, implying a need for specialized knowledge. Your simple prompts demonstrate that effective AI interaction doesn't always require advanced skills.
Accessibility: By focusing on everyday phrases like "What do you think?", you're showing that interacting with AI can be as natural as having a conversation. This can help reduce the barrier to entry for many users.
Demystification: Your approach helps demystify AI interaction. It shows that you don't need to learn a new "language" or set of skills to start using AI effectively.
Practical focus: These prompts emphasize practical usage rather than technical optimization, which aligns well with the needs of many educators and administrators.
Confidence building: Simple prompts can help build confidence in users who might otherwise feel out of their depth with AI technology.
By framing your article this way, you're effectively lowering the perceived barriers to AI adoption in higher education. You're showing that meaningful AI interaction can start with familiar, everyday language rather than complex technical skills.
Would you like to discuss ways to further emphasize this "simplicity over complexity" message in your article?
Again, Claude's response is a mix of confirming my thoughts and expanding on the idea. I really like Claude's points, especially 3, 4, and 5. There's another bit of gold in Claude's response, although it's implied rather than explicitly stated.
Taken together, Claude's points lead me to believe that these three prompts will be great when I teach new users how to use AI. You can get a lot of mileage (kilometerage??) out of these three prompts if you provide a bit of context as well. So, these three simple, powerful prompts could be the foundation of an entire short course on AI (I think, but I'll have to ask Claude). Side note: I asked, and Claude liked the idea and offered a basic structure for the course.
Please critique this (sentence, paragraph, passage, etc.): The third prompt provides a useful second set of "eyes" on your writing. Using this one is pretty straightforward: just type in the prompt, then give AI the text, and you're good to go. If you haven't already provided context, you may want to do so. Here's an example:
Prompt: Here's what I have so far for the article. I still need to write the conclusion. Please critique the article. (Then I pasted the article into the chat window.)
Response:
Critique of "AI Goes to College" Article Draft
Strengths
Approachable tone: The writing style is conversational and engaging, making the content accessible to readers who might be intimidated by AI technology.
Clear structure: The article is well-organized, introducing the concept and then breaking down each prompt with explanations and examples.
Practical examples: The inclusion of real interactions with Claude helps illustrate how these prompts work in practice.
Addressing concerns: By acknowledging the intimidating nature of "prompt engineering," you're likely to resonate with your target audience of higher education professionals who may be new to AI.
Emphasis on simplicity: The focus on three simple prompts effectively demonstrates that AI interaction doesn't have to be complex.
Areas for Improvement
Introduction: Consider a more attention-grabbing opening. While you mention your thoughts on "prompt engineering," the link isn't included, which might be confusing for readers.
Consistency: The formatting of the three prompts varies throughout the article. Consider standardizing how you present them for clarity.
Balance: The section on "What do you think?" is significantly longer than the others. Consider expanding on the other two prompts or trimming this section for better balance.
Transitions: The shift from discussing the prompts to mentioning the potential short course feels abrupt. This idea might fit better in a conclusion or as a separate section.
Concrete applications: While you mention higher education contexts, more specific examples of how these prompts could be used in academic settings would strengthen the article's relevance to your audience.
Suggestions
Consider adding a brief conclusion that ties everything together and reiterates the value of these simple prompts for higher education professionals.
You might want to include a brief section on best practices or tips for using these prompts effectively.
If space allows, a discussion on the limitations of these prompts or when more complex interactions might be necessary could provide a more rounded view.
The idea of using these prompts as the foundation for a short course is intriguing. You might want to expand on this, perhaps as a call-to-action for readers interested in learning more.
Consider adding subheadings within each prompt section (e.g., "How to use it", "Example", "Benefits") to improve readability and scannability.
Overall, this article draft effectively communicates your main points and provides valuable insights for your target audience. With some refinement, it could be an excellent resource for higher education professionals looking to start using AI tools.
(I cut off part of the response since Claude got a bit repetitive.)
Claude's areas for improvement and suggestions are reasonable, although I decided not to implement any of them because the article is already getting long. This brings up an important point: all of these prompts can be improved by adding a little detail. For example, when I gave Claude the "please critique" prompt, I could have mentioned that I didn't want the article to get any longer. You can also get AI to focus on specific areas by adding detail to the other prompts. However, I almost always use the simple versions. You'll need to play around with them to see what works best for you.
You don't need to be an AI or prompting expert to benefit from AI. Even if all you ever did was use these three prompts, I'm sure you'd find ways to use AI productively.
Do you have any favorite prompts? If so, let me know at craig@AIGoesToCollege.com.