Set VITE_MYIPSPACE_API_BASE to point this UI at a running gateway.
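For example, the variable can be set in the shell before starting the dev server (the URL below is a placeholder for a locally running gateway, not a default):

```shell
# Point the UI at a gateway; replace the URL with your gateway's address.
export VITE_MYIPSPACE_API_BASE="http://localhost:8080"
echo "$VITE_MYIPSPACE_API_BASE"
```

Vite only exposes environment variables prefixed with `VITE_` to client code, so the prefix is required for the UI to read it.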
The settings screen lists two provider groups:

Local LLMs (Ollama / LM Studio / llama.cpp): shows "No local models configured" until you connect Ollama, LM Studio, or any local inference server.

Cloud LLMs (OpenAI / Anthropic / Gemini, etc.): shows "No cloud models configured" until you add API keys for OpenAI, Anthropic, Google, and more.