Local LLM
LLMZap Gateway
The platform now supports a configurable local OpenAI-compatible LLM gateway. Change the base URL and endpoint paths through the server environment without changing application code.
Saved Config
Frontend-managed gateway
Saved config overrides env defaults. You can still override the connection per request for one-off testing.
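The precedence described above can be sketched as a simple three-layer merge (the layer names and dict shape here are assumptions, not the platform's actual data model): per-request overrides beat the saved config, which beats env defaults.

```python
# Minimal sketch of the override precedence: env defaults < saved config
# < per-request overrides. Keys with a value of None are treated as "not set".
def resolve_connection(request: dict, saved: dict, env_defaults: dict) -> dict:
    merged = dict(env_defaults)
    for layer in (saved, request):
        merged.update({k: v for k, v in layer.items() if v is not None})
    return merged
```

A per-request value only needs to carry the fields being overridden; everything else falls through to the saved config or the env defaults.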
Connection
Gateway configuration
```json
{
  "health": "/health",
  "models": "/v1/models",
  "props": "/props",
  "chatCompletions": "/v1/chat/completions",
  "completions": "/v1/completions",
  "embeddings": "/v1/embeddings"
}
```

Env knobs: `LLM_API_BASE_URL`, `LLM_CHAT_COMPLETIONS_PATH`, `LLM_MODELS_PATH`, `LLM_EMBEDDINGS_PATH`, `LLM_DEFAULT_MODEL`, `LLM_API_KEY`.
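Resolving a full endpoint URL from the path map and the env knobs can be sketched as follows. The env-var names come from the list above; the `localhost` default and the per-path override mapping are assumptions for illustration.

```python
import os

# Default path map, mirroring the JSON config above.
DEFAULT_PATHS = {
    "health": "/health",
    "models": "/v1/models",
    "props": "/props",
    "chatCompletions": "/v1/chat/completions",
    "completions": "/v1/completions",
    "embeddings": "/v1/embeddings",
}

# Env knobs that override individual paths (names from the list above).
PATH_ENV = {
    "chatCompletions": "LLM_CHAT_COMPLETIONS_PATH",
    "models": "LLM_MODELS_PATH",
    "embeddings": "LLM_EMBEDDINGS_PATH",
}

def endpoint_url(name: str) -> str:
    # Base URL from the environment; the localhost default is an assumption.
    base = os.environ.get("LLM_API_BASE_URL", "http://localhost:8080").rstrip("/")
    env_name = PATH_ENV.get(name)
    path = os.environ.get(env_name, DEFAULT_PATHS[name]) if env_name else DEFAULT_PATHS[name]
    return base + path
```

Stripping the trailing slash from the base URL before joining keeps `http://gw:9000/` and `http://gw:9000` equivalent.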
Models
Remote inventory
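Assuming the models endpoint returns the OpenAI-compatible inventory shape (`{"data": [{"id": ...}, ...]}`, which the source does not state explicitly), the remote inventory can be reduced to a list of model ids:

```python
def model_ids(payload: dict) -> list[str]:
    """Extract model ids from an OpenAI-style /v1/models response.

    Returns an empty list when the gateway reports no models.
    """
    return [m["id"] for m in payload.get("data", [])]
```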
Chat Test
Direct model call
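A direct chat test amounts to a POST against the chat-completions endpoint. The sketch below builds the request from the env knobs listed earlier (`LLM_API_BASE_URL`, `LLM_CHAT_COMPLETIONS_PATH`, `LLM_DEFAULT_MODEL`, `LLM_API_KEY`); the defaults and the `local-model` fallback are assumptions. The builder is pure, so it can be inspected without a running gateway, and sent with `urllib.request.urlopen` when one is available.

```python
import json
import os
import urllib.request

def build_chat_request(prompt: str) -> urllib.request.Request:
    # Resolve URL and model from the environment (defaults are assumptions).
    base = os.environ.get("LLM_API_BASE_URL", "http://localhost:8080").rstrip("/")
    path = os.environ.get("LLM_CHAT_COMPLETIONS_PATH", "/v1/chat/completions")
    body = {
        "model": os.environ.get("LLM_DEFAULT_MODEL", "local-model"),
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {"Content-Type": "application/json"}
    # Attach the bearer token only when an API key is configured.
    key = os.environ.get("LLM_API_KEY")
    if key:
        headers["Authorization"] = f"Bearer {key}"
    return urllib.request.Request(
        base + path,
        data=json.dumps(body).encode(),
        headers=headers,
        method="POST",
    )
```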
Planner Test
Mirror-aware plan