Class: ContextChatEngine
ContextChatEngine uses the Index to get the appropriate context for each query. The context is stored in the system prompt, and the chat history is preserved, ideally allowing the appropriate context to be surfaced for each query.
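A minimal usage sketch, assuming the `llamaindex` package with a configured default LLM; the sample document text and system prompt are illustrative:

```typescript
import { ContextChatEngine, Document, VectorStoreIndex } from "llamaindex";

// Build a small index over one document, then hand its retriever
// to the chat engine so each query is answered with retrieved context.
const index = await VectorStoreIndex.fromDocuments([
  new Document({
    text: "Abraham Lincoln was the 16th president of the United States.",
  }),
]);

const chatEngine = new ContextChatEngine({
  retriever: index.asRetriever(),
  systemPrompt: "You are a concise assistant.", // optional
});

const response = await chatEngine.chat({ message: "Who was Lincoln?" });
console.log(response.toString());
```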
Extends
PromptMixin
Implements
ChatEngine
Constructors
new ContextChatEngine()
new ContextChatEngine(init): ContextChatEngine
Parameters
• init
• init.chatHistory?: ChatMessage[]
• init.chatModel?: LLM<object, object>
• init.contextRole?: MessageType
• init.contextSystemPrompt?: ContextSystemPrompt
• init.nodePostprocessors?: BaseNodePostprocessor[]
• init.retriever: BaseRetriever
• init.systemPrompt?: string
Returns
ContextChatEngine
Overrides
PromptMixin.constructor
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:43
Properties
chatHistory
chatHistory: ChatHistory<object>
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:39
chatModel
chatModel: LLM<object, object>
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:38
contextGenerator
contextGenerator: ContextGenerator & PromptMixin
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:40
systemPrompt?
optional systemPrompt: string
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:41
Methods
_getPromptModules()
protected _getPromptModules(): ModuleRecord
Return a dictionary of sub-modules within the current module that also implement PromptMixin (so that their prompts can also be get/set).
Can be blank if no sub-modules.
Returns
ModuleRecord
Overrides
PromptMixin._getPromptModules
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:77
_getPrompts()
protected _getPrompts(): PromptsRecord
Returns
PromptsRecord
Overrides
PromptMixin._getPrompts
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:65
_updatePrompts()
protected _updatePrompts(prompts): void
Parameters
• prompts
• prompts.contextSystemPrompt: ContextSystemPrompt
Returns
void
Overrides
PromptMixin._updatePrompts
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:71
chat()
chat(params)
chat(params): Promise<AsyncIterable<EngineResponse>>
Send message along with the class's current chat history to the LLM.
Parameters
• params: ChatEngineParamsStreaming
Returns
Promise<AsyncIterable<EngineResponse>>
Implementation of
ChatEngine.chat
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:83
chat(params)
chat(params): Promise<EngineResponse>
Send message along with the class's current chat history to the LLM.
Parameters
• params: ChatEngineParamsNonStreaming
Returns
Promise<EngineResponse>
Implementation of
ChatEngine.chat
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:86
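The two overloads are selected by the `stream` flag on the params object. A sketch of both modes, assuming the `llamaindex` package with a configured default LLM; document text and messages are illustrative:

```typescript
import { ContextChatEngine, Document, VectorStoreIndex } from "llamaindex";

const index = await VectorStoreIndex.fromDocuments([
  new Document({ text: "LlamaIndexTS is a data framework for LLM apps." }),
]);
const chatEngine = new ContextChatEngine({ retriever: index.asRetriever() });

// Non-streaming overload: resolves to a single EngineResponse.
const reply = await chatEngine.chat({ message: "What is LlamaIndexTS?" });
console.log(reply.toString());

// Streaming overload: stream: true yields an AsyncIterable of responses.
const stream = await chatEngine.chat({
  message: "Answer again, in one sentence.",
  stream: true,
});
for await (const chunk of stream) {
  process.stdout.write(chunk.toString());
}
```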
getPrompts()
getPrompts(): PromptsRecord
Returns
PromptsRecord
Inherited from
PromptMixin.getPrompts
Defined in
packages/core/dist/prompts/index.d.ts:58
reset()
reset(): void
Resets the chat history so that it's empty.
Returns
void
Implementation of
ChatEngine.reset
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:123
updatePrompts()
updatePrompts(prompts): void
Parameters
• prompts: PromptsRecord
Returns
void
Inherited from
PromptMixin.updatePrompts
Defined in
packages/core/dist/prompts/index.d.ts:59
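Since `_updatePrompts` accepts a `contextSystemPrompt` entry, the public `updatePrompts` can swap in a custom context prompt. A sketch assuming `PromptTemplate` from `llamaindex` with a `{context}` template variable; verify the record key and template shape against your installed version:

```typescript
import {
  ContextChatEngine,
  Document,
  PromptTemplate,
  VectorStoreIndex,
} from "llamaindex";

const index = await VectorStoreIndex.fromDocuments([
  new Document({ text: "Some indexed text." }),
]);
const chatEngine = new ContextChatEngine({ retriever: index.asRetriever() });

// List the prompt keys this engine exposes.
console.log(Object.keys(chatEngine.getPrompts()));

// Replace the prompt that frames retrieved context for the LLM.
chatEngine.updatePrompts({
  contextSystemPrompt: new PromptTemplate({
    templateVars: ["context"],
    template: "Answer strictly from this context:\n{context}",
  }),
});
```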
validatePrompts()
validatePrompts(promptsDict, moduleDict): void
Parameters
• promptsDict: PromptsRecord
• moduleDict: ModuleRecord
Returns
void
Inherited from
PromptMixin.validatePrompts
Defined in
packages/core/dist/prompts/index.d.ts:57