Deploying models to LM Studio
Saving your fine-tuned model to GGUF so you can run and deploy it in LM Studio


1) Export to GGUF (from Unsloth)
2) Import the GGUF into LM Studio
3) Load and chat in LM Studio
4) Serve your fine-tuned model as a local API (OpenAI-compatible)
Troubleshooting
Model runs in Unsloth, but LM Studio output is gibberish / repeats
LM Studio doesn’t show my model in “My Models”
OOM / slow performance
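On the gibberish/repetition symptom: a common cause is a chat-template mismatch — the prompt format LM Studio applies differs from the template the model was fine-tuned with, so the model sees malformed prompts. As an illustration only (assuming a Llama-3-instruct-style template; substitute whatever template your fine-tune used), this is the kind of wrapping the preset must reproduce:

```python
def wrap_llama3_chat(system: str, user: str) -> str:
    """Illustrative Llama-3-instruct prompt wrapping.

    If the prompt template LM Studio applies does not match the one used
    during fine-tuning, the model receives malformed prompts and may emit
    gibberish or repeat itself.
    """
    return (
        "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|><|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
    )
```

If outputs look broken, compare LM Studio's selected prompt template against the template in your training setup and pick (or define) a matching one.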
More resources