HELP : failed to inference ONNX model #746
Replies: 1 comment
That error usually means the browser (or runtime) cannot properly load or execute the ONNX model, even if the file is in the right folder. A few things you might want to check:

- **Path issue**: Make sure the path to the model in your index.html matches exactly (relative paths like `./model.onnx` vs `/model.onnx` can matter).
- **Serving the file**: If you're just opening index.html directly in the browser, the ONNX file may not be served correctly. Try running a simple local server (e.g., `npx http-server` or `python -m http.server`) so the file is delivered over HTTP.
- **Model compatibility**: Ensure the model was exported in a format supported by the ONNX runtime you're using (e.g., ONNX.js, onnxruntime-web). Some ops are not supported in the browser.
- **Check the console/network tab**: Open DevTools → Network and confirm that the ONNX file is actually being downloaded without a 404 or MIME-type error.

It's unlikely that Node.js itself is the problem unless your setup depends on it for serving the files. Most of the time it's either a file path issue or unsupported model ops.
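A quick way to see the real cause is to wrap the session creation and the run call in a try/catch and log the underlying error instead of the generic message. Below is a minimal sketch using onnxruntime-web; the file name `./model.onnx`, the input shape, and the helper name `runModel` are assumptions for illustration, not the course's exact code:

```js
// Minimal sketch, assuming onnxruntime-web is available as `ort`
// (imported via a bundler as below, or loaded globally from its CDN bundle).
import * as ort from "onnxruntime-web";

async function runModel(inputArray) {
  try {
    // Loads the model over HTTP; this is where a 404 or file:// access shows up.
    const session = await ort.InferenceSession.create("./model.onnx");

    // Hypothetical input shape [1, N]; check session.inputNames and your export script.
    const input = new ort.Tensor(
      "float32",
      Float32Array.from(inputArray),
      [1, inputArray.length]
    );
    const results = await session.run({ [session.inputNames[0]]: input });

    console.log(results);
    return results;
  } catch (err) {
    // Logs the real cause: missing file, unsupported op, shape mismatch, ...
    console.error("ONNX inference failed:", err);
    throw err;
  }
}
```

Running this from a page served over HTTP (e.g., `npx http-server`) and watching the console should tell you whether the failure is the file not being found or an op/shape problem inside the model.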
When I tried to test the code from the "applied" course (Classification):

First, when I click the button "What kind of cuisine can you make?", nothing happens.
So I checked the console and this error appears: "failed to inference ONNX model"
I checked several times, but the model is in the right place (same folder) and it is referenced like this in the index.html:


Do you have an idea of what is missing here? Maybe it's the Node.js application that I downloaded that is responsible for this.