Models
AgentOpera provides a flexible and powerful abstraction for working with large language models (LLMs) through its models module. This module implements a standard protocol for interacting with various LLM providers, making it easy to switch between different models or providers while maintaining a consistent interface.
ChatCompletionClient Protocol
At the core of the models module is the ChatCompletionClient abstract base class, which defines the protocol that all model clients must implement. This protocol standardizes how applications interact with language models:
```python
class ChatCompletionClient(ABC):
    @abstractmethod
    async def create(
        self,
        messages: Sequence[LLMMessage],
        *,
        tools: Sequence[Tool | ToolSchema] = [],
        json_output: Optional[bool] = None,
        extra_create_args: Mapping[str, Any] = {},
        cancellation_token: Optional[CancellationToken] = None,
    ) -> CreateResult: ...

    @abstractmethod
    def create_stream(
        self,
        messages: Sequence[LLMMessage],
        *,
        tools: Sequence[Tool | ToolSchema] = [],
        json_output: Optional[bool] = None,
        extra_create_args: Mapping[str, Any] = {},
        cancellation_token: Optional[CancellationToken] = None,
    ) -> AsyncGenerator[Union[str, CreateResult], None]: ...

    @property
    @abstractmethod
    def model_info(self) -> ModelInfo: ...
```
Key Methods
- create() - Makes a single, non-streaming call to the LLM and returns the complete response as a CreateResult.
- create_stream() - Makes a streaming call to the LLM, yielding partial results as they become available and a final CreateResult at the end.
- model_info - Returns information about the model's capabilities.
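To make the streaming contract concrete, the provider-agnostic sketch below consumes create_stream() from any client that implements the protocol. The import path and the UserMessage class (and its constructor) are assumptions; adjust them to match your installation.

```python
# Sketch only: the import path and the UserMessage class are assumptions.
from agentopera.models import ChatCompletionClient, CreateResult, UserMessage


async def stream_reply(client: ChatCompletionClient, prompt: str) -> CreateResult:
    """Print partial text as it arrives and return the final CreateResult."""
    final: CreateResult | None = None
    async for chunk in client.create_stream(messages=[UserMessage(content=prompt)]):
        if isinstance(chunk, str):
            print(chunk, end="", flush=True)  # incremental text delta
        else:
            final = chunk  # the last yielded item is the complete CreateResult
    assert final is not None
    return final
```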
Model Information and Capabilities
AgentOpera uses the ModelInfo type to represent model capabilities:
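The definition below is an illustrative sketch; the field names are assumptions modeled on typical capability flags and may differ from the type shipped with your version of AgentOpera.

```python
# Illustrative sketch only: check your installed version for the
# authoritative definition of ModelInfo.
from typing import TypedDict


class ModelInfo(TypedDict):
    vision: bool            # can the model accept image inputs?
    function_calling: bool  # does the model support tool/function calls?
    json_output: bool       # can the model be asked to emit JSON?
    family: str             # model family identifier, e.g. "gpt-4o"
```

A caller can consult these flags before issuing a request, for example only passing tools when function_calling is True.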
The ModelFamily class provides constants for common model families:
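The constants shown here are illustrative; the actual set of families defined by AgentOpera may differ.

```python
# Illustrative sketch only: the real ModelFamily may define different constants.
class ModelFamily:
    GPT_4O = "gpt-4o"
    GPT_4 = "gpt-4"
    O1 = "o1"
    CLAUDE_3_5_SONNET = "claude-3-5-sonnet"
    LLAMA_3_3_70B = "llama-3.3-70b"
    UNKNOWN = "unknown"
```

The family value in a client's model_info can be compared against these constants when behavior needs to vary by model family.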
OpenAIChatCompletionClient
The OpenAIChatCompletionClient is the most commonly used implementation of the ChatCompletionClient protocol, providing access to OpenAI's models.
Basic Usage
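The snippet below is a minimal sketch. The import paths, the UserMessage class, and the content field on CreateResult are assumptions; adjust them to match your installation.

```python
import asyncio

# Import paths and the UserMessage class are assumptions; adjust to your install.
from agentopera.models import UserMessage
from agentopera.models.openai import OpenAIChatCompletionClient


async def main() -> None:
    # The API key is typically read from the OPENAI_API_KEY environment
    # variable unless it is passed explicitly (e.g. api_key="...").
    client = OpenAIChatCompletionClient(model="gpt-4o")

    result = await client.create(
        messages=[UserMessage(content="What is the capital of France?")]
    )
    # Assumes CreateResult exposes the generated text via a content field.
    print(result.content)


asyncio.run(main())
```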
Function Calling
The client supports function calling through the tools parameter:
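The sketch below assumes a FunctionTool helper exists that wraps a plain Python function into the Tool protocol accepted by tools; its name, the import paths, and the UserMessage class are assumptions.

```python
import asyncio

# FunctionTool, UserMessage, and the import paths are assumptions; adjust to your install.
from agentopera.models import UserMessage
from agentopera.models.openai import OpenAIChatCompletionClient
from agentopera.tools import FunctionTool


async def get_weather(city: str) -> str:
    """Return a canned weather report for the given city."""
    return f"It is 22°C and sunny in {city}."


async def main() -> None:
    client = OpenAIChatCompletionClient(model="gpt-4o")
    weather_tool = FunctionTool(get_weather, description="Look up the current weather.")

    result = await client.create(
        messages=[UserMessage(content="What's the weather in Paris?")],
        tools=[weather_tool],
    )
    # If the model chooses to call the tool, the result typically contains the
    # requested function call(s) rather than plain text; the caller executes the
    # tool and sends the output back in a follow-up create() call.
    print(result.content)


asyncio.run(main())
```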
Other Model Clients
AgentOpera also provides clients for other LLM providers:
AzureOpenAIChatCompletionClient
For Azure OpenAI-hosted models:
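A minimal sketch, assuming the Azure client mirrors the OpenAI client's constructor and additionally takes the deployment name, endpoint, and API version (the parameter names and import path are assumptions):

```python
# Parameter names and the import path are assumptions; adjust to your install.
from agentopera.models.azure import AzureOpenAIChatCompletionClient

client = AzureOpenAIChatCompletionClient(
    model="gpt-4o",
    azure_deployment="my-gpt-4o-deployment",  # your Azure deployment name
    azure_endpoint="https://my-resource.openai.azure.com/",
    api_version="2024-06-01",
    api_key="...",  # or configure Azure AD / token-based auth if supported
)
```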
AnthropicChatCompletionClient
For Claude and other Anthropic models:
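A minimal sketch, assuming the import path below (adjust to your installation):

```python
# The import path is an assumption; adjust to your install.
from agentopera.models.anthropic import AnthropicChatCompletionClient

# The API key is typically read from the ANTHROPIC_API_KEY environment variable
# unless it is passed explicitly (e.g. api_key="...").
client = AnthropicChatCompletionClient(model="claude-3-5-sonnet-20241022")
```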
OllamaChatCompletionClient
For locally hosted models using Ollama:
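A minimal sketch, assuming the import path below and that the client talks to a local Ollama server by default (adjust to your installation):

```python
# The import path is an assumption; adjust to your install.
from agentopera.models.ollama import OllamaChatCompletionClient

# Assumes a local Ollama server (Ollama listens on http://localhost:11434 by
# default) with the requested model already pulled.
client = OllamaChatCompletionClient(model="llama3.2")
```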
Advanced Features
Cancellation
Long-running requests can be cancelled using a CancellationToken:
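The sketch below assumes CancellationToken exposes a cancel() method and reuses the client and message names from the earlier examples; the import paths, the exact cancellation behavior, and the exception raised are assumptions.

```python
import asyncio

# Import paths and the UserMessage class are assumptions; adjust to your install.
from agentopera.core import CancellationToken
from agentopera.models import UserMessage
from agentopera.models.openai import OpenAIChatCompletionClient


async def main() -> None:
    client = OpenAIChatCompletionClient(model="gpt-4o")
    token = CancellationToken()

    # Run the request as a task so it can be cancelled from outside.
    task = asyncio.create_task(
        client.create(
            messages=[UserMessage(content="Write a very long essay about agents.")],
            cancellation_token=token,
        )
    )

    await asyncio.sleep(1)  # e.g. the user navigated away after one second
    token.cancel()          # signal the client to abort the in-flight request

    try:
        await task
    except asyncio.CancelledError:
        # The exact exception may differ depending on how cancellation is surfaced.
        print("Request was cancelled before completion.")


asyncio.run(main())
```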