From the course: Hands-On AI: Build a RAG Model from Scratch with Open Source


Vector embeddings and their implementation

- [Instructor] In this chapter, we'll build the machinery to use a vector database that employs vector embeddings to find the content our model needs to respond to a query. But first, we'll cover what exactly vector embeddings are, and why they help us find information in a way that's native to the meaning of a sentence, rather than simply matching words in our query to words in our database. Now, one of the first challenges faced in natural language processing was converting words into a format that computers could understand and manipulate. Vector embeddings solved this by assigning a vector to each word. The vector lives in a high-dimensional vector space, meaning that it's represented by a long list of numbers, where each number in the list represents some attribute. It could be that the first number indicates the positivity of the word, and the 651st…
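The idea that each number in the vector captures some attribute can be sketched with a toy example. The words, dimensions, and values below are illustrative assumptions, not output from a real embedding model (which would use hundreds or thousands of dimensions), but the comparison logic, cosine similarity, is the same one vector databases commonly use:

```python
import math

# Toy, hand-made "embeddings": each word maps to a short list of numbers.
# Imagine the first dimension loosely encodes positivity.
# These words and values are illustrative assumptions, not model output.
embeddings = {
    "happy":  [0.9, 0.1, 0.3],
    "joyful": [0.85, 0.15, 0.35],
    "sad":    [-0.8, 0.2, 0.3],
}

def cosine_similarity(a, b):
    """Score how closely two vectors point in the same direction (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Words with similar meanings end up with similar vectors...
print(cosine_similarity(embeddings["happy"], embeddings["joyful"]))
# ...while words with opposite meanings score much lower.
print(cosine_similarity(embeddings["happy"], embeddings["sad"]))
```

Because similarity is computed on the vectors rather than on the spelling of the words, "happy" and "joyful" score as close neighbors even though they share no letters in common, which is exactly the behavior keyword matching cannot give us.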
