
Commit 14e19f6

vmeselavelino authored and committed
Adds docs
1 parent 6651d1a commit 14e19f6

File tree

1 file changed: +21 -0 lines changed

docs/custom-llms-and-plugins.md

Lines changed: 21 additions & 0 deletions
@@ -28,6 +28,27 @@ If your class doesn't implement the AbstractLLM or doesn't return a Runnable obj

Dialog is pretty extensible; being a FastAPI-based project allows you to be very creative.

### Adding new models to the project through settings

In release v0.1.3, we enabled users to create multiple endpoints using different models with just a simple tweak to the prompt config file (the prompt file).

The default model is still configured the same way as in previous versions: define the `LLM_CLASS` environment variable to point to a model class of your choice that implements the `AbstractLLM` class (any of the models available in dialog-lib implement this class and are ready to use), and use it through the `/chats/{chat_id}` or `/ask` endpoints.

To add a new model, add a new `[[endpoint]]` entry to the prompt TOML file, as shown below:
```toml
[model]
model_name = "gpt-4o"
temperature = 0.1

# ... some other settings over here ...

[[endpoint]]
path = "/my-awesome-new-model"
model_name = "newmodel"
model_class_path = "the.importable.path.to.your.ModelClass"
```
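The `model_class_path` must point to an importable class that implements `AbstractLLM`. As a rough illustration only (the base-class import path and the overridden member below are assumptions, not dialog-lib's confirmed interface; check dialog-lib's source for the exact hooks), such a class might look like this:

```python
# Hypothetical sketch: the import path and the `llm` property are
# assumptions about dialog-lib's interface, not its confirmed API.
from dialog_lib.agents.abstract import AbstractLLM  # assumed module path

from langchain_openai import ChatOpenAI


class MyNewModel(AbstractLLM):
    """A custom model class that `model_class_path` could point to."""

    @property
    def llm(self):  # hypothetical hook; dialog-lib may use a different name
        # Dialog expects the class to yield a LangChain Runnable
        # (see the note at the top of this document).
        return ChatOpenAI(model="gpt-4o", temperature=0.1)
```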
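After restarting the server so the new config is picked up, the model should be reachable at the configured path. A minimal sketch of a call, assuming the server runs locally on port 8000 and the endpoint accepts a JSON body with a `message` field (verify the exact request schema against Dialog's API docs):

```python
import requests

# Assumed host, port, and request schema; adjust to your deployment.
response = requests.post(
    "http://localhost:8000/my-awesome-new-model",
    json={"message": "Hello from the new model!"},
)
print(response.json())
```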
### Writing a new plugin without a PyPI package

To add new endpoints or features, you need to create a package inside the `src/plugins` folder and, inside the new package folder, add the following file:
