From the course: Prompt Engineering with LangChain


RAG

- [Instructor] Retrieval Augmented Generation, or RAG for short, is a technique designed to enhance the capabilities of large language models by giving them access to external knowledge sources. The technique was introduced by Meta AI in September 2020 and emphasized the potential for language models to excel in knowledge-intensive tasks. In the original Meta paper, they combined an information retrieval component with a text generation model, allowing the LLM to access the latest information when generating reliable outputs. This approach is especially beneficial for knowledge-intensive tasks, where the model needs to provide accurate and up-to-date information. While LLMs have shown remarkable capabilities in understanding and generating human-like text, they have some severe limitations. One is that they can be expensive to train or fine-tune. Two is that their knowledge is static. It's based on the latest training data and it doesn't update with new…
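To make the retrieval-plus-generation idea concrete, here is a minimal RAG sketch in LangChain. It is not the course's own example: it assumes the langchain-openai, langchain-community, and faiss-cpu packages are installed and an OPENAI_API_KEY is set, and the document texts, prompt wording, and model name are placeholders chosen purely for illustration.

```python
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain_community.vectorstores import FAISS
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough

# Hypothetical external knowledge source: a few short texts
# indexed in an in-memory FAISS vector store.
docs = [
    "RAG was introduced by Meta AI in 2020.",
    "RAG combines a retriever with a text generation model.",
]
vectorstore = FAISS.from_texts(docs, embedding=OpenAIEmbeddings())
retriever = vectorstore.as_retriever()


def format_docs(retrieved):
    # Join the retrieved documents into a single context string.
    return "\n\n".join(doc.page_content for doc in retrieved)


# Prompt that grounds the LLM's answer in the retrieved context.
prompt = ChatPromptTemplate.from_template(
    "Answer the question using only the context below.\n\n"
    "Context:\n{context}\n\nQuestion: {question}"
)

llm = ChatOpenAI(model="gpt-4o-mini")

# Retrieval feeds context into the prompt; the LLM then generates the answer.
rag_chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | llm
    | StrOutputParser()
)

print(rag_chain.invoke("Who introduced RAG?"))
```

The key design point is that the retriever runs at query time, so the generation step is conditioned on whatever is currently in the index rather than only on what the model saw during training.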
