Abstraction that a concrete large-language-model provider (Anthropic, OpenAI, etc.) must implement. At present only streaming chat completions are required, but the interface can evolve as the code-base grows.
info(): ProviderInfo
Get provider information.
streamChat(params: StreamChatParams, fileAttachmentCacheMap?: FileAttachmentCacheMap): ModelStream
Request a streaming chat completion. The returned object must satisfy the
structural LLMStream interface so that downstream code can consume it in a
provider-agnostic manner, regardless of which concrete provider produced it.
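A minimal sketch of a conforming provider, assuming simplified shapes for ProviderInfo, StreamChatParams, FileAttachmentCacheMap, and ModelStream (the real definitions live elsewhere in the code-base; here ModelStream is modeled as an async iterable of text chunks, which is one common way to satisfy a structural stream interface):

```typescript
// Hypothetical minimal shapes; the actual types in the code-base may differ.
interface ProviderInfo {
  name: string;
}

interface StreamChatParams {
  model: string;
  messages: { role: string; content: string }[];
}

type FileAttachmentCacheMap = Map<string, string>;

// Assumed structural stream shape: an async iterable of text chunks.
type ModelStream = AsyncIterable<string>;

interface Provider {
  info(): ProviderInfo;
  streamChat(
    params: StreamChatParams,
    fileAttachmentCacheMap?: FileAttachmentCacheMap,
  ): ModelStream;
}

// Mock provider that streams the last user message back word by word.
class MockProvider implements Provider {
  info(): ProviderInfo {
    return { name: "mock" };
  }

  async *streamChat(params: StreamChatParams): ModelStream {
    const last = params.messages[params.messages.length - 1];
    for (const word of last.content.split(" ")) {
      yield word;
    }
  }
}
```

Downstream code can then iterate the stream without knowing which provider produced it:

```typescript
const provider: Provider = new MockProvider();
for await (const chunk of provider.streamChat({
  model: "example-model",
  messages: [{ role: "user", content: "hello world" }],
})) {
  process.stdout.write(chunk);
}
```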