Introduction to Thought Generation Techniques
Welcome to the thought generation section of the advanced Prompt Engineering Guide.
Thought generation encompasses a range of techniques that prompt a Large Language Model (LLM) to clearly articulate its reasoning while solving problems.
We've already explored the Contrastive Chain-of-Thought prompting technique, and more advanced techniques are coming soon. Stay tuned!
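To make the idea concrete, here is a minimal sketch in plain Python (no LLM client required; the arithmetic question is just a hypothetical example) that contrasts a standard prompt with a zero-shot Chain-of-Thought prompt, whose added instruction nudges the model to write out its reasoning before answering:

```python
# Minimal sketch: a standard prompt vs. a zero-shot Chain-of-Thought prompt.
# Plain Python, no API calls; the question below is a hypothetical example.

question = (
    "A cafeteria had 23 apples. It used 20 for lunch "
    "and bought 6 more. How many apples are left?"
)

# Standard prompt: asks only for the final answer.
standard_prompt = f"Q: {question}\nA:"

# Thought-generation prompt: the trailing instruction elicits
# explicit step-by-step reasoning before the final answer.
cot_prompt = f"Q: {question}\nA: Let's think step by step."

print(standard_prompt)
print()
print(cot_prompt)
```

Sent to an LLM, the second prompt typically yields a visible reasoning trace (for example, "23 - 20 = 3, then 3 + 6 = 9") before the final answer. The techniques below refine this basic idea in different ways.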
In this section, you'll learn about:
- Contrastive Chain-of-Thought
- Automatic Chain-of-Thought (Auto-CoT)
- Tabular Chain-of-Thought (Tab-CoT)
- Memory-of-Thought (MoT)
- Active Prompting
- Analogical Prompting
- Complexity-Based Prompting
- Step-Back Prompting