Replies: 2 comments
- Ollama is primarily for local inference. However, if you want to use cloud models (like Kimi or GPT), check whether the tool supports an OpenAI-compatible endpoint setting. Many tools let you point to a different base URL (e.g., OpenRouter or a proxy) to use cloud models through the same interface.
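To make the "OpenAI-compatible endpoint" idea concrete, here is a minimal sketch of building a chat-completion request against such an endpoint. The base URL and model name below are illustrative assumptions (Ollama does expose an OpenAI-compatible API under `/v1` on its default port); substitute whatever endpoint and model your tool or proxy actually exposes.

```python
# Sketch: constructing an OpenAI-style chat request for any
# OpenAI-compatible endpoint (local Ollama, OpenRouter, a proxy, ...).
import json

def build_chat_request(base_url: str, model: str, prompt: str):
    """Return the URL and JSON body for an OpenAI-style chat completion."""
    url = base_url.rstrip("/") + "/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(body)

# Pointing at a local Ollama server's OpenAI-compatible API
# (the model name here is an assumption, not a confirmed tag):
url, body = build_chat_request("http://localhost:11434/v1", "kimi-k2.5:cloud", "Hello")
```

Swapping the cloud provider then only means changing `base_url` (and the API key your HTTP client sends), which is exactly why the "custom endpoint" setting makes the same interface work for local and cloud models.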
- Most tools support custom endpoints via an OpenAI-compatible base URL setting.
- Original question: Can I use Ollama cloud models like gpt-oss, Kimi K2.5:cloud, etc.?