After deploying the application, you can configure LLM providers through the web interface.
Providers define which LLM backends (Ollama, Anthropic, OpenAI, Gemini, etc.) are available to the system. All provider configuration is stored in the PostgreSQL database.
Deploy the stack with Docker Compose:

```bash
docker-compose up -d
```

The web interface will be available at `http://localhost:8081`.
Fill in the provider form with your chosen provider's information. Choose one or more of these options:

### Ollama

- **Name**: a label for this provider (e.g. `ollama-local`, `local-model`, etc.)
- **Model**: the model to run (e.g. `qwen:8b`, `mistral:latest`, `llama2:latest`, etc.)
- **Host URL**: where Ollama is listening (usually `http://localhost:11434`)
### Anthropic Claude

- **Name**: a label for this provider (e.g. `claude-haiku`, `claude-sonnet`, etc.)
- **Model**: the model identifier (e.g. `claude-haiku-4-5-20251001`, `claude-sonnet-4-20250514`, etc.)
- **API key**: your Anthropic API key

### OpenAI

- **Name**: a label for this provider (e.g. `gpt-4o`, `gpt-4-turbo`, etc.)
- **Model**: the model identifier (e.g. `gpt-4o`, `gpt-4-turbo`, `gpt-3.5-turbo`, etc.)
- **API key**: your OpenAI API key

### Google Gemini

- **Name**: a label for this provider (e.g. `gemini-flash`, `gemini-pro`, etc.)
- **Model**: the model identifier (e.g. `gemini-2.0-flash`, `gemini-1.5-pro`, etc.)
- **API key**: your Google API key

After entering provider details, click **Save** or **Test Connection** to verify the configuration works.
| Platform | Local? | Requires API Key? | Notes |
|---|---|---|---|
| Ollama | Yes | No | Local inference, requires Ollama running on machine |
| Anthropic | No | Yes | Claude API from Anthropic |
| OpenAI | No | Yes | GPT models from OpenAI |
| Google Gemini | No | Yes | Gemini models from Google |
If an Ollama connection test fails:

- Make sure Ollama is running (`ollama serve`) and the host URL is correct (usually `http://localhost:11434`)
- If the model is missing, run `ollama pull <model-name>` to download the model first

For programmatic provider configuration, you can use the REST API directly.
Create a provider:

```bash
curl -X POST http://localhost:8080/providers \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "name": "claude-haiku",
    "platform": "anthropic",
    "model": "claude-haiku-4-5-20251001",
    "api_key": "sk-ant-...",
    "timeout": 120
  }'
```
List configured providers:

```bash
curl http://localhost:8080/providers \
  -H "Authorization: Bearer YOUR_API_KEY"
```
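For a quick overview, the list response can be piped through a small Python filter. This is a sketch that assumes the endpoint returns a JSON array of provider objects with `id` and `name` fields (consistent with the create example, but not confirmed by the docs):

```bash
# Print only the id and name of each configured provider.
# Assumes the response is a JSON array of objects with "id" and "name" keys.
curl -s http://localhost:8080/providers \
  -H "Authorization: Bearer YOUR_API_KEY" \
  | python3 -c 'import sys, json
for p in json.load(sys.stdin):
    print(p["id"], p["name"])'
```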
Update a provider (e.g. rotate an API key):

```bash
curl -X PUT http://localhost:8080/providers/{id} \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{"api_key": "sk-..."}'
```
Delete a provider:

```bash
curl -X DELETE http://localhost:8080/providers/{id} \
  -H "Authorization: Bearer YOUR_API_KEY"
```
Assign a provider to a job:

```bash
curl -X PUT http://localhost:8080/providers/jobs/frontal-cortex \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{"provider_id": 1}'
```
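The create and assign calls can be combined into one script so the new provider's id does not have to be copied by hand. This is a sketch that assumes `POST /providers` echoes the created provider back as JSON containing an `id` field — verify that against your deployment before relying on it:

```bash
#!/bin/sh
# Register a provider, then route the frontal-cortex job to it.
# ASSUMPTION: the create endpoint returns the new provider as JSON
# with an "id" field; check your API's actual response shape.
BASE="http://localhost:8080"
AUTH="Authorization: Bearer YOUR_API_KEY"

# Create the provider and capture its id from the JSON response.
id=$(curl -s -X POST "$BASE/providers" \
  -H "Content-Type: application/json" \
  -H "$AUTH" \
  -d '{
    "name": "claude-haiku",
    "platform": "anthropic",
    "model": "claude-haiku-4-5-20251001",
    "api_key": "sk-ant-...",
    "timeout": 120
  }' | python3 -c 'import sys, json; print(json.load(sys.stdin)["id"])')

# Point the job at the newly created provider.
curl -s -X PUT "$BASE/providers/jobs/frontal-cortex" \
  -H "Content-Type: application/json" \
  -H "$AUTH" \
  -d "{\"provider_id\": $id}"
```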
If you need to use embedding models (e.g. `embeddinggemma` from Ollama), configure them as providers the same way, setting the model field to the embedding model's tag (e.g. `embeddinggemma:latest`).
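An embedding provider can be registered through the same `/providers` endpoint. The `host` field below mirrors the Host URL setting from the Ollama form; whether it is accepted by the API, and the `ollama-embeddings` name, are assumptions to confirm against your deployment:

```bash
# Register an Ollama embedding model as a provider.
# ASSUMPTION: the "host" field corresponds to the form's Host URL setting.
curl -X POST http://localhost:8080/providers \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "name": "ollama-embeddings",
    "platform": "ollama",
    "model": "embeddinggemma:latest",
    "host": "http://localhost:11434"
  }'
```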