From the course: Developing RAG Apps with LlamaIndex and Next.js

LlamaIndex: Data ingestion, indexing and query interface overview

So, LlamaIndex allows you to connect existing data sources in various formats: SQL databases, plain documents, even APIs. These different sources of data are connected to the large language models through connectors that already exist in the LlamaIndex framework. Now, when it comes to data indexing, the idea is that we create indexes. LlamaIndex, as the name implies, is really good at providing the tools necessary to store and index data for various use cases. It is also really good at integrating with downstream vector stores and database providers, as you will see. So this whole process is done easily with LlamaIndex as a framework. Once the data comes in, it goes through the process of indexing, and that indexing can serve various use cases. And the beauty is that internally it can be…
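To make the ingest → index → query pipeline described above concrete, here is a minimal, dependency-free TypeScript sketch. The names `chunkText` and `NaiveIndex` are illustrative, not LlamaIndex APIs, and the index here is a simple keyword lookup rather than the vector-embedding indexes LlamaIndex actually builds; the point is only the shape of the pipeline.

```typescript
// Step 1 (ingestion): split a source document into fixed-size chunks
// (LlamaIndex calls these "nodes").
function chunkText(text: string, chunkSize: number): string[] {
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += chunkSize) {
    chunks.push(text.slice(i, i + chunkSize));
  }
  return chunks;
}

// Step 2 (indexing): a naive keyword index mapping each word to the
// chunk ids that contain it. A real vector index would store embeddings
// and retrieve by similarity instead.
class NaiveIndex {
  private index = new Map<string, Set<number>>();
  constructor(private chunks: string[]) {
    chunks.forEach((chunk, id) => {
      for (const word of chunk.toLowerCase().split(/\W+/)) {
        if (!word) continue;
        if (!this.index.has(word)) this.index.set(word, new Set());
        this.index.get(word)!.add(id);
      }
    });
  }

  // Step 3 (query interface): retrieve every chunk that shares at least
  // one term with the query string.
  query(q: string): string[] {
    const ids = new Set<number>();
    for (const word of q.toLowerCase().split(/\W+/)) {
      for (const id of this.index.get(word) ?? []) ids.add(id);
    }
    return [...ids].map((id) => this.chunks[id]);
  }
}

const source = "LlamaIndex connects SQL databases, documents, and APIs to LLMs.";
const index = new NaiveIndex(chunkText(source, 32));
console.log(index.query("APIs"));
```

In LlamaIndex.TS itself, the analogous calls are (roughly) `VectorStoreIndex.fromDocuments(...)` to build the index and `index.asQueryEngine()` to get the query interface; check the version of the `llamaindex` package you are on, as signatures have shifted between releases.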