Interface ChatGroqInput

Represents the input parameters for the ChatGroq chat model.

Properties

apiKey?: string

The Groq API key to use for requests.

Default

process.env.GROQ_API_KEY
cache?: boolean | BaseCache<Generation[]>
callbackManager?: CallbackManager

⚠️ Deprecated ⚠️

Use callbacks instead. This property is deprecated and will be removed in a future release.

callbacks?: Callbacks
maxConcurrency?: number

The maximum number of concurrent calls that can be made. Defaults to Infinity, which means no limit.
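The effect of maxConcurrency can be sketched as a simple promise limiter that allows at most `limit` calls to run at once, queuing the rest. This is an illustrative sketch of the documented behavior, not the library's internals; `pLimit` is a hypothetical helper name.

```typescript
// Illustrative limiter: at most `limit` async tasks run concurrently,
// the rest wait in a FIFO queue (the idea behind maxConcurrency).
function pLimit(limit: number) {
  let active = 0;
  const queue: (() => void)[] = [];
  return async function run<T>(fn: () => Promise<T>): Promise<T> {
    if (active >= limit) {
      // Park this task until a running one finishes.
      await new Promise<void>((resolve) => queue.push(resolve));
    }
    active++;
    try {
      return await fn();
    } finally {
      active--;
      queue.shift()?.(); // wake the next queued task, if any
    }
  };
}
```

With `maxConcurrency: Infinity` (the default), every call starts immediately.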

maxRetries?: number

The maximum number of retries that can be made for a single call, with an exponential backoff between each attempt. Defaults to 6.

metadata?: Record<string, unknown>
modelName?: string

The name of the model to use.

Default

"llama2-70b-4096"
onFailedAttempt?: FailedAttemptHandler

Custom handler to handle failed attempts. Takes the originally thrown error object as input, and should itself throw an error if the input error is not retryable.
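Together, maxRetries and onFailedAttempt describe a retry loop with exponential backoff, where the handler can re-throw to abort retries for non-retryable errors. The sketch below shows that contract under illustrative names (`withRetries` is not a library export):

```typescript
type FailedAttemptHandler = (error: Error) => void;

// Sketch of the documented retry semantics: retry up to `maxRetries`
// times with exponential backoff; `onFailedAttempt` sees each error
// and may re-throw to mark it as non-retryable.
async function withRetries<T>(
  fn: () => Promise<T>,
  maxRetries = 6,
  onFailedAttempt: FailedAttemptHandler = () => {},
  baseDelayMs = 100,
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (error) {
      onFailedAttempt(error as Error); // re-throwing here stops retrying
      if (attempt >= maxRetries) throw error;
      // Exponential backoff: 100 ms, 200 ms, 400 ms, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
    }
  }
}
```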

stop?: null | string | string[]

Up to 4 sequences where the API will stop generating further tokens. The returned text will not contain the stop sequence.
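The documented behavior (halt at the first stop sequence, exclude the sequence itself) can be illustrated with a small helper; `truncateAtStop` is a hypothetical name for this sketch, not part of the API:

```typescript
// Sketch of stop-sequence semantics: cut the text at the earliest
// occurrence of any stop sequence, excluding the sequence itself.
function truncateAtStop(text: string, stop: string | string[] | null): string {
  if (stop == null) return text;
  const sequences = Array.isArray(stop) ? stop : [stop];
  let cut = text.length;
  for (const seq of sequences.slice(0, 4)) { // the API accepts up to 4
    const i = text.indexOf(seq);
    if (i !== -1 && i < cut) cut = i;
  }
  return text.slice(0, cut);
}
```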

streaming?: boolean

Whether or not to stream responses.
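With streaming enabled, the model yields partial chunks as they arrive instead of one final message, consumed as an async iterable. The mock generator below stands in for the real streamed response so the consumption pattern is self-contained:

```typescript
// Stand-in for a streamed model response: yields content chunks in order.
async function* fakeStream(): AsyncGenerator<{ content: string }> {
  for (const content of ["Hel", "lo", "!"]) {
    yield { content };
  }
}

// Typical consumption pattern: iterate chunks as they arrive and
// accumulate (or render) the partial text.
async function collect(stream: AsyncIterable<{ content: string }>): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    text += chunk.content; // e.g. append to a UI as tokens arrive
  }
  return text;
}
```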

tags?: string[]
temperature?: number

The temperature to use for sampling.

Default

0.7
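Temperature rescales token logits before sampling: values below 1 sharpen the distribution toward the most likely tokens, values above 1 flatten it. A minimal softmax sketch of that effect (illustrative, not the API's internals):

```typescript
// Softmax with temperature: divide logits by the temperature before
// normalizing, so lower temperatures concentrate probability mass.
function softmax(logits: number[], temperature = 0.7): number[] {
  const scaled = logits.map((l) => l / temperature);
  const max = Math.max(...scaled); // subtract max for numeric stability
  const exps = scaled.map((l) => Math.exp(l - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}
```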
verbose?: boolean

Generated using TypeDoc