Module llm_provider

LLM provider selection and integration.

Domain types (traits, provider structs) are defined in gestura-core-llm and re-exported here. This module adds the core-owned select_provider function, which bridges crate::config::AppConfig to concrete provider instances.
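The config-to-provider bridge can be sketched as follows. This is a minimal, self-contained illustration of the selection pattern, not the crate's actual API: the trait methods, the `AppConfig` fields, and the `name` helper shown here are assumptions.

```rust
use std::sync::Arc;

// Illustrative stand-ins for the crate's types; the real trait and
// providers live in gestura-core-llm.
trait LlmProvider {
    fn name(&self) -> &'static str;
    fn call(&self, prompt: &str) -> Result<String, String>;
}

struct OpenAiProvider {
    api_key: String, // a real provider would use this for HTTP auth
}

struct UnconfiguredProvider;

impl LlmProvider for OpenAiProvider {
    fn name(&self) -> &'static str {
        "openai"
    }
    fn call(&self, _prompt: &str) -> Result<String, String> {
        // A real implementation would issue an HTTP request here.
        Ok(String::new())
    }
}

impl LlmProvider for UnconfiguredProvider {
    fn name(&self) -> &'static str {
        "unconfigured"
    }
    fn call(&self, _prompt: &str) -> Result<String, String> {
        Err("no LLM provider configured".to_string())
    }
}

// Hypothetical config shape; the real AppConfig lives in crate::config.
struct AppConfig {
    provider: String,
    openai_api_key: Option<String>,
}

// Bridge config to a concrete provider, falling back to
// UnconfiguredProvider so callers always get a usable trait object.
fn select_provider(config: &AppConfig) -> Arc<dyn LlmProvider> {
    match (config.provider.as_str(), &config.openai_api_key) {
        ("openai", Some(key)) => Arc::new(OpenAiProvider { api_key: key.clone() }),
        _ => Arc::new(UnconfiguredProvider),
    }
}
```

The fallback arm is why callers never need to handle a missing provider at selection time: misconfiguration surfaces as an error from the first call instead.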

Modules

default_models
Centralized default AI model constants for all providers.
model_capabilities
Dynamic model capabilities discovery and caching.
model_discovery
Dynamic model metadata discovery via provider APIs.
model_listing
Dynamic and static model listing for all LLM providers.
openai
OpenAI model capability and endpoint routing helpers.
token_tracker
Token usage tracking and cost estimation for Gestura.
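The shape of token accounting and cost estimation can be sketched like this. The field names, the `accumulate` helper, and the per-million-token rates are illustrative assumptions, not the token_tracker module's actual API.

```rust
// Illustrative token accounting; not the crate's real token_tracker types.
#[derive(Debug, Clone, Copy, Default)]
struct TokenUsage {
    prompt_tokens: u64,
    completion_tokens: u64,
}

impl TokenUsage {
    fn total(&self) -> u64 {
        self.prompt_tokens + self.completion_tokens
    }

    // Estimate cost in USD given per-million-token rates, since providers
    // typically price input and output tokens differently.
    fn estimated_cost(&self, input_per_million: f64, output_per_million: f64) -> f64 {
        self.prompt_tokens as f64 / 1_000_000.0 * input_per_million
            + self.completion_tokens as f64 / 1_000_000.0 * output_per_million
    }
}

// Accumulate usage across calls, as a per-session tracker might.
fn accumulate(calls: &[TokenUsage]) -> TokenUsage {
    calls.iter().fold(TokenUsage::default(), |acc, u| TokenUsage {
        prompt_tokens: acc.prompt_tokens + u.prompt_tokens,
        completion_tokens: acc.completion_tokens + u.completion_tokens,
    })
}
```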

Structs

AgentContext
Context hints for provider selection (agent, tenant, etc.).
AnthropicProvider
HTTP-based Anthropic Claude provider.
GeminiProvider
HTTP-based Google Gemini provider (Generative Language API).
GrokProvider
HTTP-based Grok (xAI) provider (OpenAI-compatible endpoint).
LlmCallResponse
Response from an LLM call, including token usage.
OllamaProvider
HTTP-based Ollama local provider.
OpenAiProvider
HTTP-based OpenAI completion provider.
TokenUsage
Token usage information from an LLM API call.
ToolCallInfo
A structured tool call returned by the LLM when using native function calling.
UnconfiguredProvider
A provider that returns an error when no real provider is configured. Used when config is missing or invalid.

Traits

LlmProvider
Unified LLM interface (async).
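Because providers are selected at runtime and passed around as trait objects, an async trait in this position typically returns boxed futures to stay dyn-compatible (plain `async fn` in traits is not dyn-safe). A minimal sketch under that assumption; the `complete` method name, `EchoProvider`, and the tiny busy-poll executor are illustrative, not the crate's API:

```rust
use std::future::Future;
use std::pin::{pin, Pin};
use std::task::{Context, Poll, RawWaker, RawWakerVTable, Waker};

type BoxFuture<'a, T> = Pin<Box<dyn Future<Output = T> + Send + 'a>>;

// Dyn-compatible async trait: methods return boxed futures so the trait
// can be used as `Box<dyn LlmProvider>` or `Arc<dyn LlmProvider>`.
trait LlmProvider: Send + Sync {
    fn complete<'a>(&'a self, prompt: &'a str) -> BoxFuture<'a, Result<String, String>>;
}

struct EchoProvider; // stand-in for a real HTTP-backed provider

impl LlmProvider for EchoProvider {
    fn complete<'a>(&'a self, prompt: &'a str) -> BoxFuture<'a, Result<String, String>> {
        Box::pin(async move { Ok(format!("echo: {prompt}")) })
    }
}

// Minimal executor so the sketch runs without an async runtime dependency;
// it busy-polls the future with a no-op waker until it is ready.
fn block_on<F: Future>(fut: F) -> F::Output {
    fn clone(_: *const ()) -> RawWaker {
        RawWaker::new(std::ptr::null(), &VTABLE)
    }
    fn noop(_: *const ()) {}
    static VTABLE: RawWakerVTable = RawWakerVTable::new(clone, noop, noop, noop);
    let waker = unsafe { Waker::from_raw(RawWaker::new(std::ptr::null(), &VTABLE)) };
    let mut cx = Context::from_waker(&waker);
    let mut fut = pin!(fut);
    loop {
        if let Poll::Ready(out) = fut.as_mut().poll(&mut cx) {
            return out;
        }
    }
}
```

In real code the executor would be a runtime such as tokio; it is inlined here only to keep the sketch self-contained.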

Functions

select_provider
Select a provider based on config and context.
unconfigured_provider
Create an unconfigured provider that returns an error when called. Used when a provider is not properly configured.