Hello, I would like to ask whether there is a plan to also publish CPU-only backend images on the Backend Gallery. I am asking because I run LocalAI for CPU inference as a background service on a laptop equipped only with an iGPU. For example, bark-cpp now ships only as a CUDA-enabled image of about 4 GB, which is a significant increase compared to before, when I simply built it from source with LocalAI configured for CPU only. Since this is my everyday machine, I need to optimize my disk usage, and since I cannot benefit from any GPU-based acceleration, downloading a GPU-compatible image would be a significant waste of space. Thank you in advance for your answer.
Yes, the backend gallery will also have CPU backends. Bark-cpp in the gallery is indeed packaged for CPU only: LocalAI/.github/workflows/backend.yml Line 501 in c546774
OK, it seems I will have to look into manually building native backends without Docker if I want to keep optimizing for storage. Would that still work, or will only Docker images be checked for available backend software? Large image sizes make Docker somewhat painful, space-wise, for deployment on a personal everyday-use laptop or PC. EDIT: the CPU images have become smaller since I last checked. I consider this discussion answered.