Schema to represent a basic prompt for an LLM.

Example

import { PromptTemplate } from "langchain/prompts";

const prompt = new PromptTemplate({
inputVariables: ["foo"],
template: "Say {foo}",
});

Type Parameters

  • RunInput extends InputValues = any
  • PartialVariableName extends string = any

Hierarchy

Constructors

Properties

PromptValueReturnType: StringPromptValueInterface
inputVariables: Extract<keyof RunInput, string>[]

A list of variable names the prompt template expects

partialVariables: PartialValues<any>

Partial variables

renderer: ((template, values) => string)

Type declaration

    • (template, values): string
    • Parameters

      • template: string
      • values: InputValues

      Returns string

template: MessageContent

The prompt template

templateFormat: "f-string"

The format of the prompt template. Currently the only supported option is 'f-string'

Default Value

'f-string'

validateTemplate: boolean

Whether to attempt to validate the template on initialization

Default Value

true

name?: string
outputParser?: BaseOutputParser<unknown>

How to parse the output of calling an LLM on this formatted prompt

templateValidator?: ((template, inputVariables) => boolean)

Type declaration

    • (template, inputVariables): boolean
    • Parameters

      • template: string
      • inputVariables: string[]

      Returns boolean

Methods

  • Default implementation of batch, which calls invoke N times. Subclasses should override this method if they can batch more efficiently.

    Parameters

    • inputs: RunInput[]

      Array of inputs to each batch call.

    • Optional options: Partial<RunnableConfig> | Partial<RunnableConfig>[]

      Either a single call options object applied to every batch call, or an array with one options object per call.

    • Optional batchOptions: RunnableBatchOptions & {
          returnExceptions?: false;
      }

    Returns Promise<StringPromptValueInterface[]>

    An array of RunOutputs, or mixed RunOutputs and errors if batchOptions.returnExceptions is set

  • Parameters

    Returns Promise<(StringPromptValueInterface | Error)[]>

  • Parameters

    Returns Promise<(StringPromptValueInterface | Error)[]>
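
    Example for batch (an illustrative sketch; PromptTemplate.fromTemplate is shorthand for the constructor call in the Example above, and the import path may vary between langchain versions):

    import { PromptTemplate } from "langchain/prompts";

    const prompt = PromptTemplate.fromTemplate("Say {foo}");

    // One formatted prompt value per input, in the same order as the inputs.
    const values = await prompt.batch([{ foo: "hello" }, { foo: "goodbye" }]);
    console.log(values.map((v) => v.toString())); // ["Say hello", "Say goodbye"]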

  • Formats the prompt given the input values and returns a formatted prompt value.

    Parameters

    • values: TypedPromptInputValues<RunInput>

      The input values to format the prompt.

    Returns Promise<StringPromptValueInterface>

    A Promise that resolves to a formatted prompt value.
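
    Example (an illustrative sketch using formatPromptValue, the method with the signature described above; import path may vary by langchain version):

    import { PromptTemplate } from "langchain/prompts";

    const prompt = PromptTemplate.fromTemplate("Say {foo}");

    const promptValue = await prompt.formatPromptValue({ foo: "hello" });
    console.log(promptValue.toString()); // "Say hello"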

  • Parameters

    • Optional suffix: string

    Returns string

  • Invokes the prompt template with the given input and options.

    Parameters

    • input: RunInput

      The input to invoke the prompt template with.

    • Optional options: BaseCallbackConfig

      Optional configuration for the callback.

    Returns Promise<StringPromptValueInterface>

    A Promise that resolves to the output of the prompt template.
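
    Example of invoking the template (an illustrative sketch; import path may vary by langchain version):

    import { PromptTemplate } from "langchain/prompts";

    const prompt = PromptTemplate.fromTemplate("Say {foo}");

    // invoke() is the standard Runnable entrypoint; the resulting prompt value
    // can be passed directly to a chat model or LLM.
    const value = await prompt.invoke({ foo: "hello" });
    console.log(value.toString()); // "Say hello"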

  • Merges partial variables and user variables.

    Parameters

    • userVariables: TypedPromptInputValues<RunInput>

      The user variables to merge with the partial variables.

    Returns Promise<InputValues<any>>

    A Promise that resolves to an object containing the merged variables.
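
    Example (an illustrative sketch using mergePartialAndUserVariables; the template and variable names are made up for illustration):

    import { PromptTemplate } from "langchain/prompts";

    const prompt = new PromptTemplate({
      inputVariables: ["name"],
      partialVariables: { punctuation: "!" },
      template: "Hello {name}{punctuation}",
    });

    // Partial variables are combined with the caller-supplied ones.
    const merged = await prompt.mergePartialAndUserVariables({ name: "Ada" });
    console.log(merged); // { punctuation: "!", name: "Ada" }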

  • Partially applies values to the prompt template.

    Type Parameters

    • NewPartialVariableName extends string

    Parameters

    • values: PartialValues<NewPartialVariableName>

      The values to be partially applied to the prompt template.

    Returns Promise<PromptTemplate<InputValues<Exclude<Extract<keyof RunInput, string>, NewPartialVariableName>>, any>>

    A new instance of PromptTemplate with the partially applied values.
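
    Example of partially applying a value (an illustrative sketch; the template and variable names are made up):

    import { PromptTemplate } from "langchain/prompts";

    const full = PromptTemplate.fromTemplate(
      "Tell me a {adjective} joke about {content}."
    );

    // Pin `adjective` ahead of time; the returned template only expects `content`.
    const partialPrompt = await full.partial({ adjective: "funny" });
    console.log(await partialPrompt.format({ content: "chickens" }));
    // "Tell me a funny joke about chickens."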

  • Create a new runnable sequence that runs each individual runnable in series, piping the output of one runnable into another runnable or runnable-like.

    Type Parameters

    • NewRunOutput

    Parameters

    • coerceable: RunnableLike<StringPromptValueInterface, NewRunOutput>

      A runnable, function, or object whose values are functions or runnables.

    Returns Runnable<RunInput, Exclude<NewRunOutput, Error>, RunnableConfig>

    A new runnable sequence.
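
    Example of piping the prompt into another runnable (an illustrative sketch; a plain function stands in for the model that would normally follow the prompt):

    import { PromptTemplate } from "langchain/prompts";

    const prompt = PromptTemplate.fromTemplate("Say {foo}");

    // Plain functions are coerced to runnables; in practice the next step is
    // usually a chat model or an LLM.
    const chain = prompt.pipe((value) => value.toString().toUpperCase());
    console.log(await chain.invoke({ foo: "hello" })); // "SAY HELLO"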

  • Stream output in chunks.

    Parameters

    Returns Promise<IterableReadableStream<StringPromptValueInterface>>

    A readable stream that is also an iterable.
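
    Example of streaming (an illustrative sketch; import path may vary by langchain version):

    import { PromptTemplate } from "langchain/prompts";

    const prompt = PromptTemplate.fromTemplate("Say {foo}");

    // A prompt template emits a single chunk: the formatted prompt value.
    const stream = await prompt.stream({ foo: "hello" });
    for await (const chunk of stream) {
      console.log(chunk.toString()); // "Say hello"
    }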

  • Generate a stream of events emitted by the internal steps of the runnable.

    Use to create an iterator over StreamEvents that provide real-time information about the progress of the runnable, including StreamEvents from intermediate results.

    A StreamEvent is a dictionary with the following schema:

    • event: string - Event names are of the format: on_[runnable_type]_(start|stream|end).
    • name: string - The name of the runnable that generated the event.
    • run_id: string - Randomly generated ID associated with the given execution of the runnable that emitted the event. A child runnable that gets invoked as part of the execution of a parent runnable is assigned its own unique ID.
    • tags: string[] - The tags of the runnable that generated the event.
    • metadata: Record<string, any> - The metadata of the runnable that generated the event.
    • data: Record<string, any>

    Below is a table that illustrates some events that might be emitted by various chains. Metadata fields have been omitted from the table for brevity.

    | event              | name             | chunk                              | input                 | output                                          |
    | ------------------ | ---------------- | ---------------------------------- | --------------------- | ----------------------------------------------- |
    | on_llm_start       | [model name]     |                                    | {'input': 'hello'}    |                                                 |
    | on_llm_stream      | [model name]     | 'Hello' OR AIMessageChunk("hello") |                       |                                                 |
    | on_llm_end         | [model name]     |                                    | 'Hello human!'        |                                                 |
    | on_chain_start     | format_docs      |                                    |                       |                                                 |
    | on_chain_stream    | format_docs      | "hello world!, goodbye world!"     |                       |                                                 |
    | on_chain_end       | format_docs      |                                    | [Document(...)]       | "hello world!, goodbye world!"                  |
    | on_tool_start      | some_tool        |                                    | {"x": 1, "y": "2"}    |                                                 |
    | on_tool_stream     | some_tool        | {"x": 1, "y": "2"}                 |                       |                                                 |
    | on_tool_end        | some_tool        |                                    |                       | {"x": 1, "y": "2"}                              |
    | on_retriever_start | [retriever name] |                                    | {"query": "hello"}    |                                                 |
    | on_retriever_chunk | [retriever name] | {documents: [...]}                 |                       |                                                 |
    | on_retriever_end   | [retriever name] |                                    | {"query": "hello"}    | {documents: [...]}                              |
    | on_prompt_start    | [template_name]  |                                    | {"question": "hello"} |                                                 |
    | on_prompt_end      | [template_name]  |                                    | {"question": "hello"} | ChatPromptValue(messages: [SystemMessage, ...]) |

    Parameters

    • input: RunInput
    • options: Partial<RunnableConfig> & {
          version: "v1";
      }
    • Optional streamOptions: Omit<LogStreamCallbackHandlerInput, "autoClose">

    Returns AsyncGenerator<StreamEvent, any, unknown>
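
    Example of consuming the event stream (an illustrative sketch; for a bare prompt template only prompt events are emitted):

    import { PromptTemplate } from "langchain/prompts";

    const prompt = PromptTemplate.fromTemplate("Say {foo}");

    for await (const event of prompt.streamEvents({ foo: "hello" }, { version: "v1" })) {
      // e.g. "on_prompt_start" / "on_prompt_end" with the template's name.
      console.log(event.event, event.name);
    }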

  • Stream all output from a runnable, as reported to the callback system. This includes all inner runs of LLMs, Retrievers, Tools, etc. Output is streamed as Log objects, which include a list of jsonpatch ops that describe how the state of the run has changed in each step, and the final state of the run. The jsonpatch ops can be applied in order to construct state.

    Parameters

    • input: RunInput
    • Optional options: Partial<RunnableConfig>
    • Optional streamOptions: Omit<LogStreamCallbackHandlerInput, "autoClose">

    Returns AsyncGenerator<RunLogPatch, any, unknown>
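
    Example of consuming the log stream (an illustrative sketch; import path may vary by langchain version):

    import { PromptTemplate } from "langchain/prompts";

    const prompt = PromptTemplate.fromTemplate("Say {foo}");

    // Each patch carries jsonpatch ops; applying them in order rebuilds the run state.
    for await (const patch of prompt.streamLog({ foo: "hello" })) {
      console.log(patch.ops);
    }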

  • Default implementation of transform, which buffers input and then calls stream. Subclasses should override this method if they can start producing output while input is still being generated.

    Parameters

    • generator: AsyncGenerator<RunInput, any, unknown>
    • options: Partial<RunnableConfig>

    Returns AsyncGenerator<StringPromptValueInterface, any, unknown>
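
    Example of transforming a stream of inputs (an illustrative sketch; the input generator is made up for illustration):

    import { PromptTemplate } from "langchain/prompts";

    const prompt = PromptTemplate.fromTemplate("Say {foo}");

    // The default implementation buffers the generated inputs, then streams
    // the formatted result.
    async function* inputs() {
      yield { foo: "hello" };
    }

    for await (const chunk of prompt.transform(inputs(), {})) {
      console.log(chunk.toString()); // "Say hello"
    }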

  • Create a new runnable from the current one that will invoke the provided fallback runnables if the initial invocation fails.

    Parameters

    • fields: {
          fallbacks: Runnable<RunInput, StringPromptValueInterface, RunnableConfig>[];
      }
      • fallbacks: Runnable<RunInput, StringPromptValueInterface, RunnableConfig>[]

        Other runnables to call if the runnable errors.

    Returns RunnableWithFallbacks<RunInput, StringPromptValueInterface>

    A new RunnableWithFallbacks.
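
    Example (an illustrative sketch; the second template is made up and is only tried if the first one throws):

    import { PromptTemplate } from "langchain/prompts";

    const primary = PromptTemplate.fromTemplate("Say {foo}");
    const backup = PromptTemplate.fromTemplate("Please say {foo}");

    const robust = primary.withFallbacks({ fallbacks: [backup] });
    console.log((await robust.invoke({ foo: "hello" })).toString()); // "Say hello"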

  • Bind lifecycle listeners to a Runnable, returning a new Runnable. The Run object contains information about the run, including its id, type, input, output, error, startTime, endTime, and any tags or metadata added to the run.

    Parameters

    • params: {
          onEnd?: ((run, config?) => void | Promise<void>);
          onError?: ((run, config?) => void | Promise<void>);
          onStart?: ((run, config?) => void | Promise<void>);
      }

      The object containing the callback functions.

      • Optional onEnd?: ((run, config?) => void | Promise<void>)
          • (run, config?): void | Promise<void>
          • Called after the runnable finishes running, with the Run object.

            Parameters

            Returns void | Promise<void>

      • Optional onError?: ((run, config?) => void | Promise<void>)
          • (run, config?): void | Promise<void>
          • Called if the runnable throws an error, with the Run object.

            Parameters

            Returns void | Promise<void>

      • Optional onStart?: ((run, config?) => void | Promise<void>)
          • (run, config?): void | Promise<void>
          • Called before the runnable starts running, with the Run object.

            Parameters

            Returns void | Promise<void>

    Returns Runnable<RunInput, StringPromptValueInterface, RunnableConfig>
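
    Example of binding listeners (an illustrative sketch; the logging callbacks are made up for illustration):

    import { PromptTemplate } from "langchain/prompts";

    const prompt = PromptTemplate.fromTemplate("Say {foo}");

    // Attach lifecycle hooks; the Run object carries the id, inputs, outputs,
    // error, and timing information described above.
    const observed = prompt.withListeners({
      onStart: (run) => console.log("started", run.id),
      onEnd: (run) => console.log("finished", run.id),
      onError: (run) => console.error("failed", run.error),
    });

    await observed.invoke({ foo: "hello" });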

  • Parameters

    Returns Promise<PromptTemplate<any, any>>

    ⚠️ Deprecated ⚠️

    Load a prompt template from a json-like object describing it.

    This feature is deprecated and will be removed in the future.

    It is not recommended for use.

    Remarks

    Deserializing needs to be async because templates (e.g. FewShotPromptTemplate) can reference remote resources that we read asynchronously with a web request.

  • Take examples in list format with prefix and suffix to create a prompt.

    Intended to be used as a way to dynamically create a prompt from examples.

    Parameters

    • examples: string[]

      List of examples to use in the prompt.

    • suffix: string

      String to go after the list of examples. Should generally set up the user's input.

    • inputVariables: string[]

      A list of variable names the final prompt template will expect

    • Optional exampleSeparator: string

      The separator to use in between examples

    • Optional prefix: string

      String that should go before the list of examples. Typically used for instructions or context.

    Returns PromptTemplate<any, any>

    The final prompt template generated.
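
    Example (an illustrative sketch of the static fromExamples helper described above; the example strings are made up):

    import { PromptTemplate } from "langchain/prompts";

    const examplePrompt = PromptTemplate.fromExamples(
      ["Q: 2 + 2\nA: 4", "Q: 3 + 3\nA: 6"], // examples
      "Q: {question}\nA:",                  // suffix: sets up the user's input
      ["question"],                         // input variables of the final template
      "\n\n",                               // separator between examples
      "Answer the question."                // prefix shown before the examples
    );

    console.log(await examplePrompt.format({ question: "4 + 4" }));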
