Interface: LLM

Unified language model interface

Implemented by

Properties

hasStreaming

hasStreaming: boolean

Defined in

packages/core/src/llm/LLM.ts:69


metadata

metadata: LLMMetadata

Defined in

packages/core/src/llm/LLM.ts:67
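A caller can inspect these two properties before deciding how to invoke the model. The sketch below uses a hand-rolled stand-in object rather than a real provider, and the `LLMMetadata` shape shown is an assumption for illustration:

```typescript
// Minimal stand-in shapes mirroring the LLM interface's properties.
// `LLMMetadata`'s fields here are assumptions, not the library's definition.
interface LLMMetadata {
  model: string;
  contextWindow: number;
}

interface LLMLike {
  hasStreaming: boolean;
  metadata: LLMMetadata;
}

// A fake LLM used only to demonstrate property access.
const fakeLLM: LLMLike = {
  hasStreaming: true,
  metadata: { model: "example-model", contextWindow: 4096 },
};

// Only request a streaming response when the model supports it.
const wantStream = fakeLLM.hasStreaming;
console.log(`${fakeLLM.metadata.model} streaming=${wantStream}`);
```

Checking `hasStreaming` up front avoids requesting a streamed response from a model that can only return complete responses.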

Methods

chat

chat<T, R>(messages, parentEvent?, streaming?): Promise<R>

Get a chat response from the LLM

Type parameters

Name    Type
T       extends undefined | boolean = undefined
R       T extends true ? AsyncGenerator<string, void, unknown> : ChatResponse

Parameters

Name          Type            Description
messages      ChatMessage[]   The chat messages to send. The return type of chat() and complete() is determined by whether the "streaming" parameter is set to true.
parentEvent?  Event           -
streaming?    T               -

Returns

Promise<R>

Defined in

packages/core/src/llm/LLM.ts:76
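The conditional return type means the same method yields either an async stream of string chunks or a single response object. The mock below is a sketch of the two consumption patterns, not a real provider implementation; `ChatMessage` and `ChatResponse` shapes are simplified assumptions:

```typescript
// Simplified message/response shapes for illustration only.
interface ChatMessage { role: "user" | "assistant" | "system"; content: string; }
interface ChatResponse { message: ChatMessage; }

// Stand-in for the streaming return type: AsyncGenerator<string, void, unknown>.
async function* mockStream(): AsyncGenerator<string, void, unknown> {
  yield "Hel";
  yield "lo";
}

// Demonstrates consuming both branches of the conditional return type.
async function chatDemo(streaming: boolean): Promise<string> {
  if (streaming) {
    // Streaming path: accumulate chunks as they arrive.
    let out = "";
    for await (const chunk of mockStream()) out += chunk;
    return out;
  }
  // Non-streaming path: a single ChatResponse.
  const res: ChatResponse = { message: { role: "assistant", content: "Hello" } };
  return res.message.content;
}

chatDemo(true).then((s) => console.log(s));  // chunks joined into one string
chatDemo(false).then((s) => console.log(s)); // full response content at once
```

In the real interface the branch is selected at the type level by the `streaming` type parameter `T`, so callers get the correct return type statically rather than via a runtime check.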


complete

complete<T, R>(prompt, parentEvent?, streaming?): Promise<R>

Get a prompt completion from the LLM

Type parameters

Name    Type
T       extends undefined | boolean = undefined
R       T extends true ? AsyncGenerator<string, void, unknown> : ChatResponse

Parameters

Name          Type            Description
prompt        MessageContent  the prompt to complete
parentEvent?  Event           -
streaming?    T               -

Returns

Promise<R>

Defined in

packages/core/src/llm/LLM.ts:89
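Unlike chat(), complete() takes a single prompt rather than a message list. A common pattern, sketched here as an assumption about how an implementation might bridge the two, is to wrap the prompt as one user message and reuse the chat path:

```typescript
// Simplified message shape for illustration only.
interface ChatMessage { role: string; content: string; }

// Hypothetical helper: lift a bare prompt into a one-message chat history.
function promptToMessages(prompt: string): ChatMessage[] {
  return [{ role: "user", content: prompt }];
}

const msgs = promptToMessages("Summarize the meeting notes.");
console.log(msgs.length, msgs[0].role); // → 1 user
```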


tokens

tokens(messages): number

Calculates the number of tokens needed for the given chat messages

Parameters

Name      Type
messages  ChatMessage[]

Returns

number

Defined in

packages/core/src/llm/LLM.ts:101
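Real implementations count tokens with a model-specific tokenizer. As a rough sketch of the idea, the estimator below uses a ~4-characters-per-token heuristic, which is an assumption for illustration only:

```typescript
// Simplified message shape for illustration only.
interface ChatMessage { role: string; content: string; }

// Naive token estimate: total content length divided by ~4 chars/token.
// A real tokens() implementation would use the model's actual tokenizer
// and typically also count per-message role/formatting overhead.
function estimateTokens(messages: ChatMessage[]): number {
  const chars = messages.reduce((n, m) => n + m.content.length, 0);
  return Math.ceil(chars / 4);
}

const messages: ChatMessage[] = [
  { role: "system", content: "You are helpful." }, // 16 chars
  { role: "user", content: "Hi!" },                // 3 chars
];
console.log(estimateTokens(messages)); // → 5  (19 chars / 4, rounded up)
```

An estimate like this is useful for checking a conversation against `metadata`'s context window before sending it.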