class MaxTokensInterceptor
implements LoopInterceptor
Loop interceptor that handles automatic continuation when the maximum token limit is reached. When an LLM response is truncated because it hit the max-tokens limit, this interceptor can automatically prompt the model to continue the conversation.
new MaxTokensInterceptor(options?: MaxTokensInterceptorOptions)
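Example (a minimal sketch, assuming MaxTokensInterceptorOptions accepts continueMessage and maxContinuations fields that mirror the accessors listed below; the import path is hypothetical):

// Hypothetical import path; adjust to where MaxTokensInterceptor lives in your project.
import { MaxTokensInterceptor } from "agent-loop";

const interceptor = new MaxTokensInterceptor({
  // Message sent back to the model when its previous reply was cut off.
  continueMessage: "Please continue from where you left off.",
  // Stop after this many automatic continuations.
  maxContinuations: 3,
});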
writeonly
continueMessage: string
Set a custom continue message
readonly
description: string
writeonly
maxContinuations: number | undefined
Set the maximum number of continuations allowed
readonly
continueMessage: string
Get the current continue message
readonly
maxContinuations: number | undefined
Get the maximum number of continuations allowed
intercept(context: InterceptorContext): Promise<InterceptorResult>
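Example (a minimal sketch using only the documented accessors; the import path is hypothetical, and how the interceptor is registered with the agent loop depends on the host framework and is not shown here):

// Hypothetical import path; adjust to where MaxTokensInterceptor lives in your project.
import { MaxTokensInterceptor } from "agent-loop";

const interceptor = new MaxTokensInterceptor();

// Adjust behavior after construction via the write accessors.
interceptor.continueMessage = "Continue the previous answer.";
interceptor.maxContinuations = 5;

// Read the current configuration back via the read accessors.
console.log(interceptor.description);
console.log(interceptor.continueMessage);  // "Continue the previous answer."
console.log(interceptor.maxContinuations); // 5

// intercept(context) is invoked by the loop with an InterceptorContext and
// resolves to an InterceptorResult that tells the loop whether to continue.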