handleAgentEnd
Called when an agent finishes execution, before it exits, with the final output and the run ID.
Optional parentRunId: string
Optional tags: string[]
handleChainEnd
Called at the end of a Chain run, with the outputs and the run ID.
Optional parentRunId: string
Optional tags: string[]
Optional kwargs: { inputs?: Record<string, unknown> }
handleChainError
Called if a Chain run encounters an error.
Optional parentRunId: string
Optional tags: string[]
Optional kwargs: { inputs?: Record<string, unknown> }
handleChainStart
Called at the start of a Chain run, with the chain name and inputs and the run ID.
Optional parentRunId: string
Optional tags: string[]
Optional metadata: Record<string, unknown>
Optional runType: string
Optional name: string
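The chain callbacks above can be supplied inline as a plain handler object in a call's callbacks array. The following is a minimal sketch, assuming the @langchain/core package layout; RunnableLambda stands in for any chain here and is not part of this page.

```typescript
import { RunnableLambda } from "@langchain/core/runnables";

// Any runnable invocation is traced as a chain run, so the chain
// callbacks fire around it.
const chain = RunnableLambda.from(async (input: { question: string }) => ({
  answer: `You asked: ${input.question}`,
}));

await chain.invoke(
  { question: "Which callbacks fire for a chain run?" },
  {
    callbacks: [
      {
        // Fired before the chain body runs, with the serialized chain and inputs.
        handleChainStart(_chain, inputs, runId) {
          console.log(`[chain start] run ${runId}`, inputs);
        },
        // Fired with the chain outputs once the run completes.
        handleChainEnd(outputs, runId) {
          console.log(`[chain end] run ${runId}`, outputs);
        },
        // Fired instead of handleChainEnd if the run throws.
        handleChainError(err, runId) {
          console.error(`[chain error] run ${runId}`, err);
        },
      },
    ],
  }
);
```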
handleChatModelStart
Called at the start of a Chat Model run, with the prompt(s) and the run ID.
Optional parentRunId: string
Optional extraParams: Record<string, unknown>
Optional tags: string[]
Optional metadata: Record<string, unknown>
Optional name: string
handleLLMEnd
Called at the end of an LLM/ChatModel run, with the output and the run ID.
Optional parentRunId: string
Optional tags: string[]
handleLLMNewToken
Called when an LLM/ChatModel in streaming mode produces a new token.
Optional parentRunId: string
Optional tags: string[]
Optional fields: HandleLLMNewTokenCallbackFields
handleLLMStart
Called at the start of an LLM or Chat Model run, with the prompt(s) and the run ID.
Optional parentRunId: string
Optional extraParams: Record<string, unknown>
Optional tags: string[]
Optional metadata: Record<string, unknown>
Optional name: string
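During streaming, handleLLMNewToken is invoked once per generated token, while handleChatModelStart (or handleLLMStart for string-prompt LLMs) and handleLLMEnd bracket the model run. The sketch below assumes the @langchain/openai ChatOpenAI class; any chat model with streaming support would work, and the model name and constructor options shown are illustrative.

```typescript
import { ChatOpenAI } from "@langchain/openai";

// streaming: true makes the model emit tokens incrementally, so
// handleLLMNewToken fires for each one.
const model = new ChatOpenAI({ model: "gpt-4o-mini", streaming: true });

await model.invoke("Summarize callbacks in one sentence.", {
  callbacks: [
    {
      // Chat models report their start via handleChatModelStart.
      handleChatModelStart(_llm, messages, runId) {
        console.log(`[model start] run ${runId}, ${messages[0].length} message(s)`);
      },
      // One call per streamed token.
      handleLLMNewToken(token) {
        process.stdout.write(token);
      },
      handleLLMEnd(_output, runId) {
        console.log(`\n[model end] run ${runId}`);
      },
    },
  ],
});
```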
handleToolStart
Called at the start of a Tool run, with the tool name and input and the run ID.
Optional parentRunId: string
Optional tags: string[]
Optional metadata: Record<string, unknown>
Optional name: string
handleAgentAction
Called when an agent is about to execute an action, with the action and the run ID.
Optional parentRunId: string
Optional tags: string[]
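A reusable handler for the agent- and tool-related callbacks (handleAgentAction, handleToolStart, handleAgentEnd) can subclass BaseCallbackHandler. This is a sketch assuming the @langchain/core import paths shown; AgentTraceHandler is a made-up name.

```typescript
import { BaseCallbackHandler } from "@langchain/core/callbacks/base";
import type { AgentAction, AgentFinish } from "@langchain/core/agents";
import type { Serialized } from "@langchain/core/load/serializable";

class AgentTraceHandler extends BaseCallbackHandler {
  // Every handler must expose a name.
  name = "AgentTraceHandler";

  // Logged just before the agent executes a chosen tool.
  handleAgentAction(action: AgentAction, runId: string) {
    console.log(`[agent action] run ${runId}: ${action.tool}`, action.toolInput);
  }

  // Logged when a tool run starts, with the raw tool input.
  handleToolStart(_tool: Serialized, input: string, runId: string) {
    console.log(`[tool start] run ${runId}: ${input}`);
  }

  // Logged with the agent's final output before it exits.
  handleAgentEnd(action: AgentFinish, runId: string) {
    console.log(`[agent end] run ${runId}`, action.returnValues);
  }
}

// Attach wherever a callbacks array is accepted, for example:
//   await executor.invoke({ input }, { callbacks: [new AgentTraceHandler()] });
```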