Commit 92981aa: update docs order
1 parent 9ad79ec
File tree: 1 file changed, +28 -29 lines

docs/open_source/setting_up/index.md (28 additions, 29 deletions)
@@ -2,35 +2,6 @@
 
 This guide focuses primarily on configuring and using the various supported LLM clients that run Giskard's LLM-assisted functionalities. We use [LiteLLM](https://github.com/BerriAI/litellm) to handle the model calls; you can see the list of supported models in the [LiteLLM documentation](https://docs.litellm.ai/docs/providers).
 
-
-## Groq Client Setup
-
-More information on [Groq LiteLLM documentation](https://docs.litellm.ai/docs/providers/groq)
-
-**Note: Groq does not currently support embedding models.**
-For a complete list of supported embedding providers, see: [LiteLLM Embedding Documentation](https://docs.litellm.ai/docs/embedding/supported_embedding)
-
-### Setup using .env variables
-
-```python
-import os
-import giskard
-
-os.environ["GROQ_API_KEY"] = ""  # "my-groq-api-key"
-
-# Optional, setup a model (default LLM is llama-3.3-70b-versatile)
-giskard.llm.set_llm_model("groq/llama-3.3-70b-versatile")
-```
-
-### Setup using completion params
-
-```python
-import giskard
-
-api_key = ""  # "my-groq-api-key"
-giskard.llm.set_llm_model("groq/llama-3.3-70b-versatile", api_key=api_key)
-```
-
 ## OpenAI Client Setup
 
 More information on [OpenAI LiteLLM documentation](https://docs.litellm.ai/docs/providers/openai)
@@ -178,6 +149,34 @@ giskard.llm.set_llm_model("gemini/gemini-1.5-pro")
 giskard.llm.set_embedding_model("gemini/text-embedding-004")
 ```
 
+## Groq Client Setup
+
+More information on [Groq LiteLLM documentation](https://docs.litellm.ai/docs/providers/groq)
+
+**Note: Groq does not currently support embedding models.**
+For a complete list of supported embedding providers, see: [LiteLLM Embedding Documentation](https://docs.litellm.ai/docs/embedding/supported_embedding)
+
+### Setup using .env variables
+
+```python
+import os
+import giskard
+
+os.environ["GROQ_API_KEY"] = ""  # "my-groq-api-key"
+
+# Optional: set the LLM model (default is llama-3.3-70b-versatile)
+giskard.llm.set_llm_model("groq/llama-3.3-70b-versatile")
+```
+
+### Setup using completion params
+
+```python
+import giskard
+
+api_key = ""  # "my-groq-api-key"
+giskard.llm.set_llm_model("groq/llama-3.3-70b-versatile", api_key=api_key)
+```
+
 ## Custom Client Setup
 
 More information on [Custom Format LiteLLM documentation](https://docs.litellm.ai/docs/providers/custom_llm_server)