Get running in minutes
You can run Chalie locally with Docker Compose in a few minutes.
1. Quick Start
Clone the repository:
`git clone https://github.com/chalie-ai/chalie.git && cd chalie`
Build and start:
`docker-compose build && docker-compose up -d`
Open onboarding:
Visit `http://localhost:8081/on-boarding/`
Chalie is running. The onboarding page will walk you through the rest.
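If the page doesn't load right away, a quick probe from the shell can confirm whether the container is serving yet. This is a minimal sketch; the port and path are the defaults from the steps above:

```shell
# Probe the onboarding page (default port 8081) and report the result.
if curl -fsS -o /dev/null --max-time 5 http://localhost:8081/on-boarding/ 2>/dev/null; then
  status=up
else
  status=down
fi
echo "Chalie is $status"
```

If the probe reports `down`, `docker-compose logs` usually shows which service is still starting.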
2. LLM Provider Setup
Choose your LLM provider and follow the setup steps.
Ollama (Local)
Free and private; runs entirely on your machine. Best when your data must stay local.
1. Install Ollama
Download from ollama.ai
2. Pull a model
`ollama pull qwen:8b`
3. In onboarding, select Ollama
Endpoint: `http://localhost:11434`
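Before selecting Ollama in onboarding, it can help to confirm the server is actually listening on the default endpoint. This sketch queries Ollama's `/api/tags` endpoint, which lists locally installed models:

```shell
# Check that the Ollama server is reachable on its default endpoint
# and list the models it has installed.
if models=$(curl -fsS --max-time 5 http://localhost:11434/api/tags 2>/dev/null); then
  ollama_ok=yes
  echo "Ollama is running. Installed models: $models"
else
  ollama_ok=no
  echo "Ollama is not reachable; make sure the Ollama app or 'ollama serve' is running"
fi
```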
3. Environment Configuration
Create a `.env` file only if you want to override the defaults — Docker Compose ships with sensible values out of the box.
`cp .env.example .env`
Key variables:
| Variable | Purpose |
|---|---|
| `DATABASE_PASSWORD` | Postgres password |
| `REDIS_URL` | Redis connection string |
| `LLM_PROVIDER` | Your LLM choice (`ollama`, `openai`, `anthropic`, `gemini`) |
| `API_KEY_*` | Provider-specific API keys |
| `CORS_ORIGIN` | Allowed origins for CORS |
| `PORT` | Application port (default `8081`) |
Note: changes to `.env` require a container restart: `docker-compose down && docker-compose up -d`
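As an illustration, a minimal `.env` using the variables above might look like this — every value here is a placeholder, not a recommended setting:

```shell
# Example .env overrides (placeholders -- replace with your own values).
DATABASE_PASSWORD=change-me-to-a-strong-password
LLM_PROVIDER=ollama
PORT=8081
CORS_ORIGIN=http://localhost:8081
```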
4. Security Notes (Production)
⚠ Change default Postgres password
Set a strong, unique password in `.env`
⚠ Enable HTTPS
Use a reverse proxy (nginx, Caddy, Traefik) to terminate TLS connections
⚠ Restrict CORS to your domain
Update `CORS_ORIGIN` to your actual domain
⚠ Use a strong Chalie account password
Set during initial onboarding. This protects your data and conversations
⚠ Don't expose database/Redis publicly
Keep these ports behind your firewall or private network
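As one example of the HTTPS recommendation above, a minimal Caddyfile that terminates TLS and proxies to Chalie could look like this (`chalie.example.com` is a placeholder for your domain; Caddy provisions certificates automatically for public hostnames):

```
chalie.example.com {
    reverse_proxy localhost:8081
}
```

nginx or Traefik work equally well; the key point is that only the proxy is exposed publicly, while Chalie, Postgres, and Redis stay on the private network.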