Class: abstract BaseLLM<AdditionalChatOptions, AdditionalMessageOptions>
Unified language model interface
Extended by
Type Parameters
• AdditionalChatOptions extends object = object
• AdditionalMessageOptions extends object = object
Implements
LLM<AdditionalChatOptions>
Constructors
new BaseLLM()
new BaseLLM<AdditionalChatOptions, AdditionalMessageOptions>(): BaseLLM<AdditionalChatOptions, AdditionalMessageOptions>
Returns
BaseLLM<AdditionalChatOptions, AdditionalMessageOptions>
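The constructor takes no arguments; what makes a subclass usable is implementing the abstract members documented below (metadata and both chat overloads). complete is not marked abstract here, so a subclass typically only needs to supply metadata and chat. A minimal sketch of a toy subclass, assuming the types are re-exported from the llamaindex entry point; the import path, the exact LLMMetadata field set, and the chunk/response shapes are assumptions that may differ between versions:

```typescript
import {
  BaseLLM,
  type ChatResponse,
  type ChatResponseChunk,
  type LLMChatParamsNonStreaming,
  type LLMChatParamsStreaming,
  type LLMMetadata,
} from "llamaindex";

// Hypothetical toy provider that echoes the last message back.
class EchoLLM extends BaseLLM {
  // Field set of LLMMetadata is assumed; adjust to the installed version.
  metadata = {
    model: "echo",
    temperature: 0,
    topP: 1,
    contextWindow: 4096,
    tokenizer: undefined,
  } as LLMMetadata;

  // Both abstract overloads must be declared, plus one implementation.
  chat(params: LLMChatParamsStreaming): Promise<AsyncIterable<ChatResponseChunk>>;
  chat(params: LLMChatParamsNonStreaming): Promise<ChatResponse>;
  async chat(
    params: LLMChatParamsStreaming | LLMChatParamsNonStreaming,
  ): Promise<AsyncIterable<ChatResponseChunk> | ChatResponse> {
    const last = params.messages.at(-1)?.content ?? "";
    if (params.stream) {
      // Streaming overload: hand back an async iterable of text deltas.
      async function* chunks(): AsyncIterable<ChatResponseChunk> {
        yield { delta: String(last), raw: null };
      }
      return chunks();
    }
    // Non-streaming overload: resolve to a single assistant message.
    return { message: { role: "assistant", content: last }, raw: null };
  }
}
```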
Properties
metadata
abstract metadata: LLMMetadata
Implementation of
LLM.metadata
Defined in
packages/core/dist/llms/index.d.ts:167
Methods
chat()
chat(params)
abstract chat(params): Promise<AsyncIterable<ChatResponseChunk<object>>>
Get a chat response from the LLM
Parameters
• params: LLMChatParamsStreaming<AdditionalChatOptions, AdditionalMessageOptions>
Returns
Promise<AsyncIterable<ChatResponseChunk<object>>>
Implementation of
LLM.chat
Defined in
packages/core/dist/llms/index.d.ts:170
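The streaming overload is selected by passing stream: true; the returned promise resolves to an async iterable of chunks. A brief usage sketch, where llm is any concrete BaseLLM subclass and the chunk's delta field carrying incremental text is an assumption about ChatResponseChunk's shape:

```typescript
import { type BaseLLM } from "llamaindex";

// Streams a chat response to stdout.
async function streamChat(llm: BaseLLM, question: string): Promise<void> {
  const stream = await llm.chat({
    messages: [{ role: "user", content: question }],
    stream: true, // selects the streaming overload
  });
  for await (const chunk of stream) {
    process.stdout.write(chunk.delta); // each chunk carries a text delta
  }
}
```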
chat(params)
abstract chat(params): Promise<ChatResponse<AdditionalMessageOptions>>
Get a chat response from the LLM
Parameters
• params: LLMChatParamsNonStreaming<AdditionalChatOptions, AdditionalMessageOptions>
Returns
Promise<ChatResponse<AdditionalMessageOptions>>
Implementation of
LLM.chat
Defined in
packages/core/dist/llms/index.d.ts:171
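The non-streaming overload is chosen when stream is omitted (or false); the promise resolves to a single ChatResponse wrapping the assistant message. A minimal sketch under the same assumptions as above:

```typescript
import { type BaseLLM } from "llamaindex";

// Sends one user message and returns the assistant's reply as a string.
async function askOnce(llm: BaseLLM, question: string): Promise<string> {
  const response = await llm.chat({
    messages: [{ role: "user", content: question }],
  });
  // message.content may be a plain string or structured content parts.
  return String(response.message.content);
}
```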
complete()
complete(params)
complete(params): Promise<AsyncIterable<CompletionResponse>>
Get a prompt completion from the LLM
Parameters
• params: LLMCompletionParamsStreaming
Returns
Promise<AsyncIterable<CompletionResponse>>
Implementation of
LLM.complete
Defined in
packages/core/dist/llms/index.d.ts:168
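As with chat, passing stream: true selects the streaming overload, here yielding CompletionResponse chunks. A short sketch; treating each chunk's text field as the incremental output is an assumption:

```typescript
import { type BaseLLM } from "llamaindex";

// Streams a prompt completion to stdout.
async function streamComplete(llm: BaseLLM, prompt: string): Promise<void> {
  const stream = await llm.complete({ prompt, stream: true });
  for await (const chunk of stream) {
    process.stdout.write(chunk.text); // assumed to carry the partial text
  }
}
```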
complete(params)
complete(params): Promise<CompletionResponse>
Get a prompt completion from the LLM
Parameters
• params: LLMCompletionParamsNonStreaming
Returns
Promise<CompletionResponse>
Implementation of
LLM.complete
Defined in
packages/core/dist/llms/index.d.ts:169
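Without stream, the promise resolves to one CompletionResponse whose text holds the full output. A minimal sketch under the same assumptions:

```typescript
import { type BaseLLM } from "llamaindex";

// Completes a prompt and returns the generated text.
async function completeOnce(llm: BaseLLM, prompt: string): Promise<string> {
  const { text } = await llm.complete({ prompt });
  return text;
}
```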