Daniel Han’s Post

You can now run Unsloth AI GGUFs locally via Docker! 🐋 Run LLMs on Mac or Windows with one line of code or no code at all! We collabed with Docker, Inc. to make Dynamic GGUFs available for everyone!

Just run: docker model run ai/gpt-oss:20B

⭐ Guide: https://lnkd.in/gCJvBNQc

You can also use Docker Desktop for a no-code UI to run your LLMs.
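If you prefer the terminal route, a rough sketch of the workflow looks like this (the pull and list steps are optional since docker model run pulls the model on first use; ai/gpt-oss:20B is just the example model from above):

docker model pull ai/gpt-oss:20B   # download the Dynamic GGUF once
docker model run ai/gpt-oss:20B    # start an interactive chat in your terminal
docker model list                  # see which models you have pulled locally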

Read our complete guide, which covers both the terminal and no-code UI methods. You can also use OpenAI's chat completions library to query the LLMs. https://docs.unsloth.ai/models/how-to-run-llms-with-docker
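For the chat completions route, here's a minimal Python sketch. It assumes Docker Model Runner is exposing its OpenAI-compatible API on the host at http://localhost:12434/engines/v1; the exact port and path depend on how you enabled host access, so check the guide above.

from openai import OpenAI

# Point the standard OpenAI client at the local Docker Model Runner endpoint.
client = OpenAI(
    base_url="http://localhost:12434/engines/v1",  # assumed local endpoint; adjust to your setup
    api_key="docker",  # placeholder; the local runner does not validate API keys
)

response = client.chat.completions.create(
    model="ai/gpt-oss:20B",  # the model you pulled with docker model run / docker model pull
    messages=[{"role": "user", "content": "Summarize what Dynamic GGUFs are in one sentence."}],
)
print(response.choices[0].message.content)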

Docker for the win!! Amazing work 🦥

Daniel Han, great share. Dockerizing Dynamic GGUFs is a big win for reproducibility and easy local pilots.

Great work Daniel! Would this be a good fit for production, multi-user scenarios?

this is huge for accessibility and speed. docker + no-code = more builders in the mix

Oh, that is cool! Windows is a bit cooler now. Just a bit… But not enough to make me switch back from Linux :)
