
🟢 Introduction to Thought Generation Techniques


Last updated on September 27, 2024

Welcome to the thought generation section of the advanced Prompt Engineering Guide.

Thought generation covers a family of techniques that prompt a Large Language Model (LLM) to clearly articulate its reasoning while solving problems.
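As a concrete illustration of the idea, here is a minimal sketch of the simplest thought-generation technique, Zero-Shot Chain-of-Thought: a reasoning trigger is appended to the question so the model spells out intermediate steps before answering. The helper function and example question are ours, not from a specific paper.

```python
def build_cot_prompt(question: str) -> str:
    # Zero-Shot Chain-of-Thought: append a reasoning trigger so the
    # model articulates its intermediate steps before the final answer.
    return f"Q: {question}\nA: Let's think step by step."

prompt = build_cot_prompt(
    "A bat and a ball cost $1.10 in total. The bat costs "
    "$1.00 more than the ball. How much does the ball cost?"
)
print(prompt)
```

The returned string would then be sent to an LLM of your choice; the technique lives entirely in how the prompt is phrased, not in any particular API.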

We've already explored the Contrastive Chain-of-Thought prompting technique. Stay tuned for more advanced techniques coming soon!
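To recap the idea behind Contrastive Chain-of-Thought: the prompt includes both a valid and an invalid reasoning chain for an exemplar question, so the model sees which mistakes to avoid. A minimal sketch (the exemplar text below is illustrative, not taken from the original paper):

```python
def build_contrastive_cot_prompt(question: str) -> str:
    # Contrastive CoT: pair a correct explanation with an incorrect one
    # in the exemplar, then ask the model to explain the new question.
    exemplar = (
        "Q: I have 3 apples and buy 2 more. How many apples do I have?\n"
        "Correct explanation: 3 apples plus 2 apples is 5 apples. "
        "The answer is 5.\n"
        "Wrong explanation: 3 apples times 2 is 6 apples. "
        "The answer is 6.\n"
    )
    return f"{exemplar}\nQ: {question}\nExplanation:"

print(build_contrastive_cot_prompt("What is 7 minus 4?"))
```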

Valeriia Kuka, Head of Content at Learn Prompting, is passionate about making AI and ML accessible. Valeriia previously grew a 60K+ follower AI-focused social media account, earning reposts from Stanford NLP, Amazon Research, Hugging Face, and AI researchers. She has also worked with AI/ML newsletters and global communities with 100K+ members and authored clear and concise explainers and historical articles.

In this section, you'll learn about:
🟦 Active Prompting

🟦 Analogical Prompting

🟦 Automatic Chain of Thought (Auto-CoT)

🟦 Complexity-Based Prompting

🟦 Contrastive Chain-of-Thought

🟦 Memory-of-Thought (MoT)

🟦 Step-Back Prompting

🟦 Tabular Chain-of-Thought (Tab-CoT)



Copyright © 2024 Learn Prompting.