Introduction

🟢 This article is rated easy
Reading Time: 1 minute
Last updated on March 2, 2025

Valeriia Kuka

Retrieval-Augmented Generation (RAG) represents a powerful paradigm in AI that bridges the gap between static language models and dynamic knowledge systems. Rather than a single technique, RAG encompasses a family of methods that share a common philosophy: enhancing language model outputs by incorporating external knowledge retrieved at inference time.

This hybrid architecture creates a flexible framework that can be adapted to various applications and domains. The RAG approach has spawned numerous implementations and variations, each optimizing different aspects of the retrieval-generation pipeline.
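The retrieve-then-generate flow common to this family of methods can be sketched minimally in Python. Everything here is an illustrative assumption, not any specific RAG implementation: the document store, the word-overlap scoring (a stand-in for a real dense or sparse retriever), and the prompt template are all hypothetical.

```python
# Minimal sketch of a retrieve-then-generate pipeline.
# The corpus, retriever, and prompt format below are illustrative only.

DOCUMENTS = [
    "RAG retrieves external documents at inference time.",
    "Prompt tuning adapts soft prompts instead of model weights.",
    "GraphRAG organizes retrieved knowledge as a graph.",
]

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the query
    (a stand-in for a real retriever such as BM25 or dense embeddings)."""
    query_words = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(query_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Prepend the retrieved context to the user query; the combined
    prompt would then be passed to a language model for generation."""
    context = "\n".join(f"- {d}" for d in docs)
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

query = "How does RAG use external documents?"
prompt = build_prompt(query, retrieve(query, DOCUMENTS))
print(prompt)
```

The variations listed below each refine some part of this loop, e.g. when to retrieve, how to score and filter documents, or how to verify the grounded answer.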

Valeriia Kuka, Head of Content at Learn Prompting, is passionate about making AI and ML accessible. Valeriia previously grew a 60K+ follower AI-focused social media account, earning reposts from Stanford NLP, Amazon Research, Hugging Face, and AI researchers. She has also worked with AI/ML newsletters and global communities with 100K+ members and authored clear and concise explainers and historical articles.

🟦 Auto-RAG

🟦 Corrective RAG

🟦 FLARE / Active RAG

🟦 GraphRAG

🟦 HybridRAG

🟦 InFO-RAG

🟦 Multi-Fusion Retrieval Augmented Generation (MoRAG)

🟦 R^2AG

🟦 Retrieval-Augmented Generation (RAG)

🟦 Reliability-Aware RAG (RA-RAG)

🟦 Self-RAG

🟦 Speculative RAG

Footnotes

  1. Lewis, P., Perez, E., Piktus, A., Petroni, F., Karpukhin, V., Goyal, N., Küttler, H., Lewis, M., Yih, W.-t., Rocktäschel, T., Riedel, S., & Kiela, D. (2021). Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks. https://arxiv.org/abs/2005.11401