
🟒 Introduction to Thought Generation Techniques

Reading Time: 1 minute
Last updated on September 27, 2024

Valeriia Kuka

Welcome to the thought generation section of the advanced Prompt Engineering Guide.

Thought generation covers a family of techniques that prompt a Large Language Model (LLM) to clearly articulate its reasoning while solving problems.
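As a minimal illustration of the idea (the wording of the question and the reasoning cue are examples, not prescribed by this guide), a zero-shot Chain-of-Thought prompt simply appends a cue that asks the model to spell out its reasoning before answering:

```python
# Illustrative reasoning problem; any multi-step question works.
question = (
    "A cafeteria had 23 apples. They used 20 for lunch and bought 6 more. "
    "How many apples do they have?"
)

# The trailing cue elicits step-by-step reasoning from the model
# instead of a bare final answer.
cot_prompt = f"{question}\nLet's think step by step."

print(cot_prompt)
```

Sending `cot_prompt` to an LLM typically yields intermediate reasoning steps ("They started with 23, used 20, leaving 3...") followed by the answer; the techniques below refine when and how such reasoning is elicited.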

We've already explored the Contrastive Chain-of-Thought prompting technique. Stay tuned for more advanced techniques coming soon!

In this section, you'll learn about:

🟦 Active Prompting

🟦 Analogical Prompting

🟦 Automatic Chain of Thought (Auto-CoT)

🟒 Chain of Draft (CoD)

🟦 Complexity-Based Prompting

🟦 Contrastive Chain-of-Thought

🟦 Memory-of-Thought (MoT)

🟦 Step-Back Prompting

🟦 Tabular Chain-of-Thought (Tab-CoT)

🟦 Thread of Thought (ThoT)