-
I am curious about this as well (and the fact that nobody answered is probably an answer). I have a local Ollama setup. I don't want ANYTHING sent out. Does this plugin support running completely locally?
-
Yes, ProxyAI supports completely offline functionality. This basically means that you can connect ProxyAI to your locally running/self-hosted LLMs. If trust is a concern, you could simply disconnect from the internet to test this out, or monitor your network traffic to see what requests are being sent out.
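If you want to verify the offline claim yourself, below is a minimal smoke test that talks to a local Ollama server over its REST API. It assumes Ollama is running on its default port 11434; the model name "llama3" is a placeholder for whatever "ollama list" shows on your machine.

```python
# Minimal local-only smoke test against Ollama's REST API.
# Assumption: Ollama is running at its default address (http://localhost:11434)
# and "llama3" is one of your locally pulled models -- substitute your own.
import json
import urllib.request

OLLAMA = "http://localhost:11434"

# 1. List the models installed locally (GET /api/tags).
with urllib.request.urlopen(f"{OLLAMA}/api/tags") as resp:
    models = [m["name"] for m in json.load(resp)["models"]]
print("local models:", models)

# 2. Send a prompt (POST /api/generate); the request never leaves localhost.
payload = json.dumps({
    "model": "llama3",           # placeholder: use a model from the list above
    "prompt": "Say hi in five words.",
    "stream": False,             # single JSON object instead of a stream
}).encode()
req = urllib.request.Request(
    f"{OLLAMA}/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["response"])
```

If this runs with your network cable unplugged, the model is being served entirely from your machine, which is also the quickest way to test the plugin itself.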
-
I have followed the directions in the README for "running locally", but I'm not sure what I have actually set up.
I asked "what llm are you and are you running locally or on the cloud?" and got the answer "I am based on OpenAI's GPT-3 model, and I operate in the cloud. I do not run locally on your machine or environment."
Is that answer correct?
If yes, then what is running locally?
And how can it work if I didn't provide an API key?
Edit
One additional question: how do I get
gradlew runIde
to run PyCharm instead of IntelliJ IDEA?
-
The platform type must be changed to PY - https://github.com/carlrobertoh/ProxyAI/blob/master/gradle.properties#L14
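For reference, the change is a single line in gradle.properties. Below is a sketch of what it looks like; check line 14 of the linked file for the exact property name in your checkout. In the IntelliJ Platform Gradle plugin's convention, PY selects PyCharm Professional and PC selects PyCharm Community:

```properties
# gradle.properties (sketch -- verify against line 14 of the file linked above)
# Default is IC (IntelliJ IDEA Community); PY = PyCharm Professional, PC = PyCharm Community.
platformType = PY
```

After changing it, gradlew runIde should launch a sandboxed PyCharm instead of IntelliJ IDEA.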