From the course: Developing RAG Apps with LlamaIndex and Next.js


LlamaIndex flow: Overview

Now let's bring it all back to LlamaIndex. The beauty here is that LlamaIndex makes it much easier to create RAG systems, because it gives us a very simple process for ingesting data: as you know, it converts unstructured data into formats that large language models can work with. And of course we also have RAG here, the way in which we can answer queries across our own data, which is very, very important. And we have autonomous agents, so we can build software that intelligently selects tools to perform tasks, and that can happen repeatedly until it gets to the correct answer. All of these tools are part of LlamaIndex as a framework. So the process is still the same: we have documents, which are loaded, then parsed, and then an index is created, which is added into a vector store. Do you know what a vector store is now? And…
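The pipeline described above (load documents → parse → build an index → query across your own data) can be sketched with the LlamaIndex.TS library. This is a minimal sketch, not the course's exact code: it assumes the `llamaindex` npm package is installed, an `OPENAI_API_KEY` is set in the environment, and a `./data` directory with source files exists.

```typescript
// Minimal LlamaIndex.TS RAG sketch (assumes `npm install llamaindex`
// and OPENAI_API_KEY in the environment).
import { Document, VectorStoreIndex } from "llamaindex";

async function main() {
  // 1. Load: here we create a Document directly; in a real app you
  //    would typically use a reader to load files from disk.
  const document = new Document({
    text: "LlamaIndex converts unstructured data into formats LLMs can use.",
  });

  // 2. Parse + index: fromDocuments chunks the documents, embeds the
  //    chunks, and stores the vectors in an (in-memory) vector store.
  const index = await VectorStoreIndex.fromDocuments([document]);

  // 3. Query: the query engine retrieves relevant chunks from the
  //    vector store and asks the LLM to answer over them.
  const queryEngine = index.asQueryEngine();
  const response = await queryEngine.query({
    query: "What does LlamaIndex do with unstructured data?",
  });

  console.log(response.toString());
}

main().catch(console.error);
```

The key point mirrored from the transcript is that the framework handles each stage (loading, parsing, indexing, vector storage) behind a small API surface, so the RAG flow stays the same regardless of the data source.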
