LLM Models
The ecosystem can connect to multiple LLM providers (cloud or local). Model configuration is managed from the Portal Admin UI.
Configuration model (high level)
Typical fields:
- ID: unique name inside the portal
- Provider: openai, google, anthropic, ollama, ...
- Model: the provider's model name
- Timeout: request timeout
- Max tools: tool-calling limit
- Max context history: limit on how much conversation history is kept in context
- API key / base URL: credentials and endpoint for the provider
- Default/global flags: mark a model as the default or as available portal-wide
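As an illustrative sketch only, a model entry with these fields might look like the following. The field names and values here are assumptions derived from the list above, not the portal's actual configuration schema:

```yaml
# Hypothetical model entry; keys mirror the fields listed above.
- id: my-openai-model        # unique name inside the portal
  provider: openai           # openai, google, anthropic, ollama, ...
  model: gpt-4o              # provider model name
  timeout: 60                # request timeout, seconds
  max_tools: 10              # tool-calling limit
  max_context_history: 20    # messages kept in context
  api_key: ${OPENAI_API_KEY}
  base_url: https://api.openai.com/v1
  default: true              # default/global flag
```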
Related documentation
- Portal: docs/internal/services/platform/portal.md