From the course: Generative AI vs. Traditional AI


Inferencing

- My grandmother used to say that words should be weighed and not counted. It must be why that part of the family was so quiet. But even in the age of AI, sometimes large language models should think before they speak. Once the foundation model is trained, there's still a lot that goes into putting that training into action. When you work with a large language model, the system is basically going through two steps. You've already seen that you need to train these systems using self-supervised learning. At the end, you should have a massive foundation model. This foundation model contains everything the system has learned. Once that model exists, you need to put that knowledge into practice. When you ask the system a question, it needs to infer the answer from the data. In generative AI and large language models, this is called inferencing. As humans, we make inferences all the time. If you know how to make scrambled eggs, then it's easier to make an omelet. If you know how to say cat in…
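To make the two-step idea concrete, here is a deliberately tiny sketch in Python. It uses a toy bigram frequency table as a stand-in for a foundation model; real large language models are vastly more complex, but the shape of the workflow is the same: an expensive training pass done once, then a cheap inference call for every question. All names here (`train`, `infer`, the sample corpus) are illustrative, not from any real library.

```python
from collections import defaultdict, Counter

def train(corpus):
    # "Training": build a bigram table (word -> counts of next words).
    # A toy stand-in for expensive self-supervised pretraining.
    model = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for current, nxt in zip(words, words[1:]):
            model[current][nxt] += 1
    return model

def infer(model, prompt, steps=3):
    # "Inferencing": use the frozen model to continue a prompt.
    # No learning happens here; we only read what training stored.
    words = prompt.lower().split()
    for _ in range(steps):
        candidates = model.get(words[-1])
        if not candidates:
            break
        words.append(candidates.most_common(1)[0][0])
    return " ".join(words)

corpus = [
    "scrambled eggs need a hot pan",
    "an omelet also needs a hot pan",
]
model = train(corpus)          # slow step, done once
print(infer(model, "a hot"))   # fast step, done for every question
# prints "a hot pan"
```

The point of the separation is economic as much as technical: training happens once on enormous data, while inferencing is repeated millions of times, so the model is kept frozen and only consulted.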
