Function stream_ollama 

pub async fn stream_ollama(
    base_url: &str,
    model: &str,
    prompt: &str,
    tools: Option<&[Value]>,
    tx: Sender<StreamChunk>,
    cancel_token: CancellationToken,
) -> Result<(), AppError>
Stream a response from the local Ollama API.