Cannot install any backend #7662

@wilcomir

Description

LocalAI version:
3.8.0, via the macOS DMG launcher

Environment, CPU architecture, OS, and Version:
macOS 26.2

uname -a
Darwin fqdn.example.com 25.2.0 Darwin Kernel Version 25.2.0: Tue Nov 18 21:09:56 PST 2025; root:xnu-12377.61.12~1/RELEASE_ARM64_T6041 arm64

Describe the bug
Whenever I try to install a backend, I get the following error in the front end:

Error installing backend "llama-cpp": not a valid backend: run file not found "/Users/vladimir/.localai/backends/metal-llama-cpp/run.sh"

Installing models works fine.

To Reproduce
Simply try to install any backend from the gallery.

Expected behavior
The backend installs successfully.

Logs

[14:12:53] STDERR: 2:12PM DBG API job submitted to install backend: localai@llama-cpp
[14:12:53] STDERR: 
[14:12:53] STDERR: 2:12PM INF HTTP request method=POST path=/api/backends/install/localai@llama-cpp status=200
[14:12:53] STDERR: 2:12PM WRN installing backend localai@llama-cpp
[14:12:53] STDERR: 2:12PM DBG backend galleries: [{github:mudler/LocalAI/backend/index.yaml@master localai}]
[14:12:53] STDERR: 2:12PM DBG Installing backend from gallery galleries=[{"name":"localai","url":"github:mudler/LocalAI/backend/index.yaml@master"}] name=localai@llama-cpp
[14:12:53] STDERR: 2:12PM DBG No system backends found
[14:12:53] STDERR: 2:12PM INF Using metal capability (arm64 on mac), set LOCALAI_FORCE_META_BACKEND_CAPABILITY to override
[14:12:53] STDERR: 2:12PM DBG Backend is a meta backend name=localai@llama-cpp systemState={"Backend":{"BackendsPath":"/Users/vladimir/.localai/backends","BackendsSystemPath":"/fusr/share/localai/backends"},"GPUVendor":"","Model":{"ModelsPath":"/Users/vladimir/.localai/models"},"VRAM":0}
[14:12:53] STDERR: 2:12PM INF Using metal capability (arm64 on mac), set LOCALAI_FORCE_META_BACKEND_CAPABILITY to override
[14:12:53] STDERR: 2:12PM DBG Using reported capability capMap={"amd":"rocm-llama-cpp","default":"cpu-llama-cpp","intel":"intel-sycl-f16-llama-cpp","metal":"metal-llama-cpp","nvidia":"cuda12-llama-cpp","nvidia-cuda-12":"cuda12-llama-cpp","nvidia-cuda-13":"cuda13-llama-cpp","nvidia-l4t":"nvidia-l4t-arm64-llama-cpp","nvidia-l4t-cuda-12":"nvidia-l4t-arm64-llama-cpp","nvidia-l4t-cuda-13":"cuda13-nvidia-l4t-arm64-llama-cpp","vulkan":"vulkan-llama-cpp"} reportedCapability=metal
[14:12:53] STDERR: 2:12PM INF Using metal capability (arm64 on mac), set LOCALAI_FORCE_META_BACKEND_CAPABILITY to override
[14:12:53] STDERR: 2:12PM DBG Using reported capability capMap={"amd":"rocm-llama-cpp","default":"cpu-llama-cpp","intel":"intel-sycl-f16-llama-cpp","metal":"metal-llama-cpp","nvidia":"cuda12-llama-cpp","nvidia-cuda-12":"cuda12-llama-cpp","nvidia-cuda-13":"cuda13-llama-cpp","nvidia-l4t":"nvidia-l4t-arm64-llama-cpp","nvidia-l4t-cuda-12":"nvidia-l4t-arm64-llama-cpp","nvidia-l4t-cuda-13":"cuda13-nvidia-l4t-arm64-llama-cpp","vulkan":"vulkan-llama-cpp"} reportedCapability=metal
[14:12:53] STDERR: 2:12PM DBG Found backend for reported capability backend=llama-cpp reportedCapability=metal
[14:12:53] STDERR: 2:12PM DBG Installing backend from meta backend bestBackend=metal-llama-cpp name=localai@llama-cpp
[14:12:53] STDERR: 2:12PM DBG Downloading backend backendPath=/Users/vladimir/.localai/backends/metal-llama-cpp uri=quay.io/go-skynet/local-ai-backends:latest-metal-darwin-arm64-llama-cpp
[14:12:53] STDERR: 2:12PM DBG [downloader] File already exists filePath=/Users/vladimir/.localai/backends/metal-llama-cpp
[14:12:53] STDERR: 2:12PM DBG File "/Users/vladimir/.localai/backends/metal-llama-cpp" already exists. Skipping download
[14:12:53] STDERR: 2:12PM DBG Downloaded backend backendPath=/Users/vladimir/.localai/backends/metal-llama-cpp uri=quay.io/go-skynet/local-ai-backends:latest-metal-darwin-arm64-llama-cpp
[14:12:53] STDERR: 2:12PM ERR Run file not found runFile=/Users/vladimir/.localai/backends/metal-llama-cpp/run.sh
[14:12:53] STDERR: 2:12PM ERR error installing backend localai@llama-cpp error="not a valid backend: run file not found \"/Users/vladimir/.localai/backends/metal-llama-cpp/run.sh\""
[14:12:53] STDERR: 2:12PM DBG No system backends found
[14:12:53] STDERR: 2:12PM INF Using metal capability (arm64 on mac), set LOCALAI_FORCE_META_BACKEND_CAPABILITY to override
[14:12:53] STDERR: 2:12PM INF HTTP request method=GET path=/api/backends/job/97b984fa-dda5-11f0-b71c-8afba55952d7 status=200
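Reading the log, the mismatch seems to be: the downloader skips the OCI pull because the target path already exists (even though it is an empty directory), while the validator then requires `run.sh` inside that path. A minimal shell sketch of that inferred logic (not LocalAI's actual code, and `BACKEND_DIR` is just the path from the log):

```shell
BACKEND_DIR="${BACKEND_DIR:-$HOME/.localai/backends/metal-llama-cpp}"

# Downloader behavior, as suggested by the DBG lines:
# an existing path is treated as an already-completed download.
if [ -e "$BACKEND_DIR" ]; then
  echo "File already exists. Skipping download"
fi

# Validator behavior, as suggested by the ERR lines:
# a backend is only valid if run.sh is present inside it.
if [ ! -f "$BACKEND_DIR/run.sh" ]; then
  echo "not a valid backend: run file not found \"$BACKEND_DIR/run.sh\""
fi
```

With an empty directory both branches fire, which would reproduce exactly the two log lines above: the download is skipped, then validation fails.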

Additional context
For some reason it seems to think that the backend is already there, but it is not: the folder exists but is empty.

vladimir at avh-mac-mini-01 in ~/.localai 
$ du -h -d2 .          
 72M	./bin
  0B	./backends/metal-llama-cpp
  0B	./backends/metal-whisper
  0B	./backends/metal-diffusers
 12K	./backends
801M	./models/mmproj
3.1G	./models
8.0K	./checksums
740K	./logs
4.0K	./metadata
3.2G	.
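As a possible workaround (untested here, assuming the empty directories are what makes the downloader skip the pull), removing the empty backend directories before retrying the install should let the download run again. `rmdir` only succeeds on empty directories, so any backend that did install correctly is left alone:

```shell
# Path from the report; override BACKENDS_DIR if yours differs.
BACKENDS_DIR="${BACKENDS_DIR:-$HOME/.localai/backends}"

# Delete only the empty backend directories (rmdir refuses
# to remove non-empty ones), then retry the install in the UI.
for dir in "$BACKENDS_DIR"/*/; do
  rmdir "$dir" 2>/dev/null && echo "removed empty dir: $dir"
done
```

After this, retrying "install backend" from the front end should trigger a fresh download instead of the "already exists" skip.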
