Your AI Chat Companion

Try these examples:

Offline Models

Ollama not available

Make sure Ollama is running on localhost:11434
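A quick way to verify this is to probe the local Ollama HTTP API; a minimal sketch follows, assuming the default endpoint at http://localhost:11434, where `GET /api/tags` lists locally installed models. The function name `isOllamaAvailable` and the 2-second timeout are illustrative choices, not part of the app.

```typescript
// Minimal availability probe against the default local Ollama server.
// GET /api/tags responds with the locally installed models when the server is up.
async function isOllamaAvailable(baseUrl = "http://localhost:11434"): Promise<boolean> {
  try {
    const res = await fetch(`${baseUrl}/api/tags`, {
      signal: AbortSignal.timeout(2000), // give up quickly if nothing is listening
    });
    return res.ok;
  } catch {
    // Network error or timeout: Ollama is not reachable on this host/port.
    return false;
  }
}
```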

Select a model to start chatting offline
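Once a model is selected, a chat turn can be sent to Ollama's `/api/chat` endpoint. The sketch below is a non-streaming example under that assumption; the `chatOffline` helper and the example model name are illustrative, and in practice the model should be one returned by `/api/tags`.

```typescript
// Minimal sketch: send a conversation to a locally installed model via
// Ollama's /api/chat endpoint and return the assistant's reply (non-streaming).
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

async function chatOffline(
  model: string,
  messages: ChatMessage[],
  baseUrl = "http://localhost:11434"
): Promise<string> {
  const res = await fetch(`${baseUrl}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages, stream: false }),
  });
  if (!res.ok) {
    throw new Error(`Ollama request failed with status ${res.status}`);
  }
  // With stream: false, Ollama returns a single JSON object containing
  // the assistant message under `message.content`.
  const data = await res.json();
  return data.message.content;
}

// Example usage (model name is illustrative; pick one listed by /api/tags):
// const reply = await chatOffline("llama3", [{ role: "user", content: "Hello!" }]);
```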