From the course: RAG Fine-Tuning: Advanced Techniques for Accuracy and Model Performance
Combining RAG and fine-tuning: RAFT
- [Instructor] Researchers have overcome the limitations of both RAG and fine-tuning with a new method called RAFT. RAFT, or Retrieval Augmented Fine-Tuning, combines the best aspects of RAG and fine-tuning into a single powerful approach. It's like teaching an AI both to access a library and to become an expert in using it. So RAFT is a hybrid superhero that combines the strengths of both RAG and fine-tuning while overcoming their individual weaknesses. Let's break down how these three approaches compare. First, let's talk about accuracy of responses. RAG has high potential accuracy, since it can access and retrieve relevant information directly from source documents. Fine-tuning, on the other hand, has variable accuracy that can be inconsistent: while it can be good for the specific domains it was trained on, it may struggle with new information or variations not seen during training. RAFT maintains high and consistent accuracy by combining the benefits of both approaches. It can…
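To make the idea concrete, here is a minimal sketch of how a RAFT-style training record might be assembled. It assumes the core recipe described in the RAFT paper: each training example pairs a question with a retrieved context that mixes the "oracle" document (the one containing the answer) with distractor documents, and the oracle is sometimes withheld so the model learns to cope with unhelpful context. The function name, parameters, and record layout are illustrative, not part of any specific library.

```python
import random

def build_raft_example(question, oracle_doc, all_docs, answer,
                       num_distractors=3, p_oracle=0.8, rng=None):
    """Assemble one RAFT-style training record (illustrative sketch).

    The context mixes distractor documents with the oracle document that
    actually contains the answer. With probability 1 - p_oracle the oracle
    is withheld, so the model also learns to handle contexts that cannot
    answer the question.
    """
    rng = rng or random.Random()
    # Sample distractors from every document except the oracle.
    distractors = rng.sample(
        [d for d in all_docs if d != oracle_doc], num_distractors)
    context = list(distractors)
    if rng.random() < p_oracle:
        context.append(oracle_doc)
    rng.shuffle(context)  # hide the oracle's position in the context
    return {"question": question, "context": context, "answer": answer}
```

Fine-tuning on records like these is what lets RAFT keep RAG's grounding in retrieved documents while also teaching the model, like fine-tuning does, how to use that domain's material well.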