LLM provider selection and integration.
Domain types (traits, provider structs) are defined in gestura-core-llm and
re-exported here. This module adds the core-owned select_provider
function which bridges crate::config::AppConfig to concrete provider
instances.
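The bridging described above can be sketched as a dispatch from configuration to a boxed provider. This is a minimal, self-contained illustration only: the real `LlmProvider` trait is async, the `AppConfig` field shown here is an invented stand-in, and the constructors take richer configuration.

```rust
// Illustrative sketch of select_provider-style dispatch. All names and
// signatures below are assumptions for demonstration, not the crate's API.
trait LlmProvider {
    fn name(&self) -> &'static str;
}

struct OpenAiProvider;
struct OllamaProvider;
struct UnconfiguredProvider;

impl LlmProvider for OpenAiProvider {
    fn name(&self) -> &'static str { "openai" }
}
impl LlmProvider for OllamaProvider {
    fn name(&self) -> &'static str { "ollama" }
}
impl LlmProvider for UnconfiguredProvider {
    fn name(&self) -> &'static str { "unconfigured" }
}

// Hypothetical stand-in for crate::config::AppConfig.
struct AppConfig {
    provider: String,
}

// Map config to a concrete provider, falling back to the
// error-returning UnconfiguredProvider for unknown values.
fn select_provider(config: &AppConfig) -> Box<dyn LlmProvider> {
    match config.provider.as_str() {
        "openai" => Box::new(OpenAiProvider),
        "ollama" => Box::new(OllamaProvider),
        _ => Box::new(UnconfiguredProvider),
    }
}

fn main() {
    let cfg = AppConfig { provider: "ollama".into() };
    assert_eq!(select_provider(&cfg).name(), "ollama");

    let bad = AppConfig { provider: "".into() };
    assert_eq!(select_provider(&bad).name(), "unconfigured");
}
```

The fallback arm mirrors the UnconfiguredProvider struct listed below: selection never panics on bad config, it hands back a provider that fails loudly when called.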
Modules§
- default_models - Centralized default AI model constants for all providers.
- model_capabilities - Dynamic model capabilities discovery and caching.
- model_discovery - Dynamic model metadata discovery via provider APIs.
- model_listing - Dynamic and static model listing for all LLM providers.
- openai - OpenAI model capability and endpoint routing helpers.
- token_tracker - Token usage tracking and cost estimation for Gestura.
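The token_tracker module's role, usage tracking and cost estimation, can be illustrated with a short self-contained sketch. The `TokenUsage` fields echo the struct listed below, but the tracker API and the per-million-token prices are invented for this example.

```rust
// Minimal sketch of token usage tracking and cost estimation.
// Field names, methods, and prices are assumptions for illustration.
#[derive(Default, Clone, Copy)]
struct TokenUsage {
    prompt_tokens: u64,
    completion_tokens: u64,
}

#[derive(Default)]
struct TokenTracker {
    total: TokenUsage,
}

impl TokenTracker {
    // Accumulate the usage reported by each LLM call.
    fn record(&mut self, usage: TokenUsage) {
        self.total.prompt_tokens += usage.prompt_tokens;
        self.total.completion_tokens += usage.completion_tokens;
    }

    // Estimate cost in USD from hypothetical per-million-token prices.
    fn estimated_cost(&self, prompt_per_m: f64, completion_per_m: f64) -> f64 {
        self.total.prompt_tokens as f64 / 1e6 * prompt_per_m
            + self.total.completion_tokens as f64 / 1e6 * completion_per_m
    }
}

fn main() {
    let mut tracker = TokenTracker::default();
    tracker.record(TokenUsage { prompt_tokens: 1_000, completion_tokens: 500 });
    tracker.record(TokenUsage { prompt_tokens: 2_000, completion_tokens: 1_000 });
    assert_eq!(tracker.total.prompt_tokens, 3_000);
    // 3000/1e6 * 3.0 + 1500/1e6 * 15.0 = 0.009 + 0.0225 = 0.0315
    let cost = tracker.estimated_cost(3.0, 15.0);
    assert!((cost - 0.0315).abs() < 1e-9);
}
```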
Structs§
- AgentContext - Context hints for provider selection (agent, tenant, etc.).
- AnthropicProvider - HTTP-based Anthropic Claude provider.
- GeminiProvider - HTTP-based Google Gemini provider (Generative Language API).
- GrokProvider - HTTP-based Grok (xAI) provider (OpenAI-compatible endpoint).
- LlmCallResponse - Response from an LLM call, including token usage.
- OllamaProvider - HTTP-based Ollama local provider.
- OpenAiProvider - HTTP-based OpenAI completion provider.
- TokenUsage - Token usage information from an LLM API call.
- ToolCallInfo - A structured tool call returned by the LLM when using native function calling.
- UnconfiguredProvider - A provider that returns an error when no real provider is configured; used when config is missing or invalid.
Traits§
- LlmProvider - Unified LLM interface (async).
Functions§
- select_provider - Select a provider based on config and context.
- unconfigured_provider - Create an unconfigured provider that returns an error when called; used when a provider is not properly configured.
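The unconfigured_provider fallback described above can be sketched as a provider whose every call returns an error rather than panicking. The method signature and error message here are illustrative assumptions; the real trait is async and returns richer types such as LlmCallResponse.

```rust
// Sketch of the unconfigured-provider fallback. The call signature and
// error text are assumptions for illustration, not the crate's API.
trait LlmProvider {
    fn call(&self, prompt: &str) -> Result<String, String>;
}

struct UnconfiguredProvider;

impl LlmProvider for UnconfiguredProvider {
    fn call(&self, _prompt: &str) -> Result<String, String> {
        // Fail with a descriptive error so callers can surface a
        // configuration problem instead of crashing.
        Err("no LLM provider configured; check AppConfig".to_string())
    }
}

// Hand back the error-returning stub as a trait object.
fn unconfigured_provider() -> Box<dyn LlmProvider> {
    Box::new(UnconfiguredProvider)
}

fn main() {
    let provider = unconfigured_provider();
    let result = provider.call("hello");
    assert!(result.is_err());
}
```

Returning a stub instead of an `Option` keeps the call sites uniform: every code path gets some `LlmProvider`, and misconfiguration shows up as a normal runtime error.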