From the course: Developing RAG Apps with LlamaIndex and Next.js
LlamaIndex: Data ingestion, indexing and query interface overview
So, LlamaIndex allows you to connect existing data sources in various formats: SQL databases, plain documents of some sort, even APIs. These different sources of data are connected to the large language models through data connectors that already exist in the LlamaIndex framework. Now, when it comes to data indexing, the idea is that we create indexes. LlamaIndex, as the name implies, is really good at providing the tools necessary to store and index data for various use cases. It is also really good at integrating with downstream vector stores and database providers, as you will see. So this whole process is done easily with LlamaIndex as a framework. Once the data comes in, it goes through the process of indexing, and that indexing can serve various use cases. And the beauty is that internally it can be…
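The three stages described above (ingestion, indexing, query interface) can be sketched in code. The following is a conceptual toy, not the LlamaIndex API: it uses a simple in-memory keyword index so the flow is self-contained, whereas LlamaIndex itself would use data connectors, embeddings, and vector-similarity retrieval.

```typescript
// Conceptual sketch of the LlamaIndex flow: ingest -> index -> query.
// NOTE: this is NOT the LlamaIndex API; it is a toy keyword index that
// only illustrates how the three stages fit together.

type Doc = { id: string; text: string };

// Stage 1: ingestion -- load raw records from any source
// (files, SQL rows, API responses) into a uniform document shape.
function ingest(records: string[]): Doc[] {
  return records.map((text, i) => ({ id: `doc-${i}`, text }));
}

// Stage 2: indexing -- map each lowercase token to the set of
// document ids that contain it.
function buildIndex(docs: Doc[]): Map<string, Set<string>> {
  const index = new Map<string, Set<string>>();
  for (const doc of docs) {
    for (const token of doc.text.toLowerCase().split(/\W+/)) {
      if (!token) continue;
      if (!index.has(token)) index.set(token, new Set());
      index.get(token)!.add(doc.id);
    }
  }
  return index;
}

// Stage 3: query interface -- return the sorted ids of documents
// matching any token of the query.
function query(index: Map<string, Set<string>>, q: string): string[] {
  const hits = new Set<string>();
  for (const token of q.toLowerCase().split(/\W+/)) {
    for (const id of index.get(token) ?? []) hits.add(id);
  }
  return [...hits].sort();
}

const docs = ingest([
  "LlamaIndex connects data to LLMs",
  "Next.js renders the UI",
]);
const index = buildIndex(docs);
const result = query(index, "data");
```

In the real framework the same pipeline shape holds, but stage 2 produces embeddings stored in a vector store and stage 3 retrieves by similarity before handing context to the LLM.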
Contents
- Deep dive into LlamaIndex and key features: Overview (4m 30s)
- RAG crash course (6m 20s)
- LlamaIndex flow: Overview (2m 41s)
- LlamaIndex: Data ingestion, indexing and query interface overview (3m 37s)
- Hands-on: Set up LlamaIndex simple RAG system (12m 19s)
- Summary (1m 13s)