pub const DEFAULT_OLLAMA_MODEL: &'static str;
The default Ollama model used for local inference. Llama 3.2 offers a good balance of response quality and resource usage for local use cases.
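A minimal sketch of how such a constant might be consumed, assuming its value is the Ollama tag `llama3.2` (the crate defines the real value; the tag here is inferred from the doc comment, not confirmed) and targeting Ollama's `/api/generate` REST endpoint:

```rust
// Hypothetical stand-in for the crate's constant; the real value lives in
// the defining module. "llama3.2" is an assumption based on the doc text.
pub const DEFAULT_OLLAMA_MODEL: &str = "llama3.2";

/// Build the JSON body for a request to Ollama's `/api/generate` endpoint,
/// falling back to `DEFAULT_OLLAMA_MODEL` when no model is specified.
fn generate_body(prompt: &str, model: Option<&str>) -> String {
    let model = model.unwrap_or(DEFAULT_OLLAMA_MODEL);
    // Naive JSON assembly to stay dependency-free; a real client would use
    // serde_json so the prompt is escaped correctly for all inputs.
    format!(
        r#"{{"model":"{}","prompt":"{}","stream":false}}"#,
        model,
        prompt.replace('"', "\\\"")
    )
}

fn main() {
    // No model given, so the default is used.
    let body = generate_body("Why is the sky blue?", None);
    println!("{body}");
}
```

The `Option<&str>` parameter lets callers override the default per request while keeping the common path terse.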