Class: Portkey
Unified language model interface
Extends
Constructors
new Portkey()
new Portkey(init?): Portkey
Parameters
• init?: Partial<Portkey> & ApiClientInterface
Returns
Portkey
Overrides
Defined in
packages/llamaindex/src/llm/portkey.ts:67
Properties
apiKey?
optional apiKey: string = undefined
Defined in
packages/llamaindex/src/llm/portkey.ts:63
baseURL?
optional baseURL: string = undefined
Defined in
packages/llamaindex/src/llm/portkey.ts:64
session
session: PortkeySession
Defined in
packages/llamaindex/src/llm/portkey.ts:65
Accessors
metadata
get metadata(): LLMMetadata
Returns
LLMMetadata
Overrides
Defined in
packages/llamaindex/src/llm/portkey.ts:79
Methods
chat()
chat(params)
chat(params): Promise<AsyncIterable<ChatResponseChunk>>
Get a chat response from the LLM
Parameters
• params: LLMChatParamsStreaming<object, object>
Returns
Promise<AsyncIterable<ChatResponseChunk>>
Overrides
Defined in
packages/llamaindex/src/llm/portkey.ts:83
chat(params)
chat(params): Promise<ChatResponse<object>>
Get a chat response from the LLM
Parameters
• params: LLMChatParamsNonStreaming<object, object>
Returns
Promise<ChatResponse<object>>
Overrides
Defined in
packages/llamaindex/src/llm/portkey.ts:86
complete()
complete(params)
complete(params): Promise<AsyncIterable<CompletionResponse>>
Get a prompt completion from the LLM
Parameters
• params: LLMCompletionParamsStreaming
Returns
Promise<AsyncIterable<CompletionResponse>>
Inherited from
Defined in
packages/core/dist/llms/index.d.ts:168
complete(params)
complete(params): Promise<CompletionResponse>
Get a prompt completion from the LLM
Parameters
• params: LLMCompletionParamsNonStreaming
Returns
Promise<CompletionResponse>
Inherited from
Defined in
packages/core/dist/llms/index.d.ts:169
streamChat()
streamChat(messages, params?): AsyncIterable<ChatResponseChunk>
Parameters
• messages: ChatMessage[]
• params?: Record<string, any>
Returns
AsyncIterable<ChatResponseChunk>