Docs: Added Groq Client
Kunjan Shah committed Jun 9, 2025
commit b9d3f01d3938230a0b6595438902581056f4edac
29 changes: 29 additions & 0 deletions docs/open_source/setting_up/index.md
@@ -2,6 +2,35 @@

This guide focuses primarily on configuring and using the various LLM clients supported for running Giskard's LLM-assisted functionalities. We use [LiteLLM](https://github.com/BerriAI/litellm) to handle the model calls; you can find the list of supported models in the [LiteLLM documentation](https://docs.litellm.ai/docs/providers).
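For instance, a model is selected with a LiteLLM-style `<provider>/<model-name>` string; the sketch below uses an illustrative OpenAI model, but any provider from the list above works the same way.

```python
import giskard

# LiteLLM routes each call based on the "<provider>/<model>" prefix,
# e.g. "groq/llama-3.3-70b-versatile" or "openai/gpt-4o".
giskard.llm.set_llm_model("openai/gpt-4o")
```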


## Groq Client Setup

More information is available in the [Groq LiteLLM documentation](https://docs.litellm.ai/docs/providers/groq).

### Setup using .env variables

```python
import os
import giskard

os.environ["GROQ_API_KEY"] = "" # "my-groq-api-key"

# Optional: set up a model (default LLM is llama-3.3-70b-versatile)
giskard.llm.set_llm_model("groq/llama-3.3-70b-versatile")

# Note: Groq does not currently support embedding models
# Use another provider for embeddings if needed
```

**Member:** Perfect! Thanks. Could you just mention accessing https://docs.litellm.ai/docs/embedding/supported_embedding to see the full list of supported providers?

**Contributor Author:** I'll add the documentation reference you suggested in docs/open_source/setting_up/index.md, linking to LiteLLM's supported embedding providers.

**Member:** Yes, thanks!
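Since Groq exposes no embedding endpoint, the sketch below pairs it with a separate embedding provider, as the reviewer suggests. Assumptions: OpenAI is used purely as an example embedding provider, and `giskard.llm.set_embedding_model` accepts LiteLLM-style model strings; see https://docs.litellm.ai/docs/embedding/supported_embedding for the full list of supported embedding providers.

```python
import os
import giskard

os.environ["GROQ_API_KEY"] = ""    # "my-groq-api-key"
os.environ["OPENAI_API_KEY"] = ""  # assumption: OpenAI serves only the embeddings

# Groq handles the LLM calls...
giskard.llm.set_llm_model("groq/llama-3.3-70b-versatile")

# ...while embeddings come from any LiteLLM-supported embedding provider.
giskard.llm.set_embedding_model("openai/text-embedding-3-small")
```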

### Setup using completion params

```python
import giskard

api_key = "" # "my-groq-api-key"
giskard.llm.set_llm_model("groq/llama-3.3-70b-versatile", api_key=api_key)
```
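Additional keyword arguments can be supplied the same way; the sketch below assumes they are forwarded to LiteLLM's completion call, and the `temperature` and `max_tokens` values shown are purely illustrative.

```python
import giskard

api_key = ""  # "my-groq-api-key"

giskard.llm.set_llm_model(
    "groq/llama-3.3-70b-versatile",
    api_key=api_key,
    temperature=0.0,  # illustrative: deterministic generations
    max_tokens=1024,  # illustrative: cap on generated tokens
)
```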

## OpenAI Client Setup

More information is available in the [OpenAI LiteLLM documentation](https://docs.litellm.ai/docs/providers/openai)