Optional
allowed_function_names
Allowed functions to call when the mode is "any". If empty, any one of the provided functions may be called.
Optional
callbacks
Callbacks for this call and any sub-calls (e.g. a Chain calling an LLM). Tags are passed to all callbacks, metadata is passed to handle*Start callbacks.
Optional
configurable
Runtime values for attributes previously made configurable on this Runnable, or sub-Runnables.
Optional
maxConcurrency
Maximum number of parallel calls to make.
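A minimal sketch of maxConcurrency with batch. ChatVertexAI from @langchain/google-vertexai and the model name are illustrative assumptions; any chat model that accepts these call options works the same way.

```typescript
import { ChatVertexAI } from "@langchain/google-vertexai";

// Illustrative model class and model name.
const model = new ChatVertexAI({ model: "gemini-1.5-pro" });

const prompts = [
  "What is 2 + 2?",
  "Name a prime number greater than 10.",
  "What color is the sky on a clear day?",
];

// At most two requests are in flight at any one time.
const responses = await model.batch(prompts, { maxConcurrency: 2 });
console.log(responses.map((r) => r.content));
```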
Optional
metadata
Metadata for this call and any sub-calls (e.g. a Chain calling an LLM). Keys should be strings, values should be JSON-serializable.
Optional
recursionLimit
Maximum number of times a call can recurse. If not provided, defaults to 25.
Optional
runId
Unique identifier for the tracer run for this call. If not provided, a new UUID will be generated.
Optional
runName
Name for the tracer run for this call. Defaults to the name of the class.
Optional
signal
Abort signal for this call. If provided, the call will be aborted when the signal is aborted.
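As a sketch of the signal option, the snippet below cancels a call through an AbortController after five seconds; the model class and name are the same assumptions as in the earlier example.

```typescript
import { ChatVertexAI } from "@langchain/google-vertexai";

const model = new ChatVertexAI({ model: "gemini-1.5-pro" });

const controller = new AbortController();
// Abort the request if it has not completed within five seconds.
const timer = setTimeout(() => controller.abort(), 5000);

try {
  const response = await model.invoke("Write a very long story.", {
    signal: controller.signal,
  });
  console.log(response.content);
} finally {
  clearTimeout(timer);
}
```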
Optional
stop
Stop tokens to use for this call. If not provided, the default stop tokens for the model will be used.
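A short sketch of stop sequences, again assuming the same illustrative model; support for stop tokens varies by provider.

```typescript
import { ChatVertexAI } from "@langchain/google-vertexai";

const model = new ChatVertexAI({ model: "gemini-1.5-pro" });

// Generation is cut off as soon as the model emits the stop token "5".
const response = await model.invoke("Count from 1 to 10, one number per line.", {
  stop: ["5"],
});
console.log(response.content);
```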
Optional
streamUsage
Whether or not to include usage data, like token counts, in the streamed response chunks.
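The sketch below streams a response with streamUsage enabled and logs whatever usage metadata the provider attaches to the chunks; the model is the same assumed example.

```typescript
import { ChatVertexAI } from "@langchain/google-vertexai";

const model = new ChatVertexAI({ model: "gemini-1.5-pro" });

const stream = await model.stream("Tell me a short joke.", { streamUsage: true });

for await (const chunk of stream) {
  // Token counts, when the provider reports them, arrive on usage_metadata.
  console.log(chunk.content, chunk.usage_metadata);
}
```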
Optional
tags
Tags for this call and any sub-calls (e.g. a Chain calling an LLM). You can use these to filter calls.
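The following sketch combines several of the tracing-related options above (runName, tags, metadata, and callbacks) in a single call. The inline callback handler is a plain object with handler methods and, like the model and its name, is purely illustrative.

```typescript
import { ChatVertexAI } from "@langchain/google-vertexai";

const model = new ChatVertexAI({ model: "gemini-1.5-pro" });

const response = await model.invoke("Summarize the plot of Hamlet in one sentence.", {
  runName: "hamlet-summary",
  tags: ["docs-example", "summaries"],
  metadata: { userId: "user-123" },
  callbacks: [
    {
      // Called once the underlying LLM call finishes.
      handleLLMEnd(output) {
        console.log("Generations:", JSON.stringify(output.generations));
      },
    },
  ],
});
console.log(response.content);
```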
Optional
timeout
Timeout for this call in milliseconds.
Optional
tool_choice
Specifies how the chat model should use tools.
Default: undefined
Possible values:
- "auto": The model may choose to use any of the provided tools, or none.
- "any": The model must use one of the provided tools.
- "none": The model must not use any tools.
- A string (not "auto", "any", or "none"): The name of a specific tool the model must use.
- An object: A custom schema specifying tool choice parameters. Specific to the provider.
Note: Not all providers support tool_choice. An error will be thrown
if used with an unsupported model.
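A sketch of forcing one specific tool by name through tool_choice. The tool, its schema, and the model are assumptions for illustration; as noted above, tool_choice throws on providers that do not support it.

```typescript
import { z } from "zod";
import { tool } from "@langchain/core/tools";
import { ChatVertexAI } from "@langchain/google-vertexai";

// Illustrative tool definition.
const getWeather = tool(async ({ city }) => `It is sunny in ${city}.`, {
  name: "get_weather",
  description: "Look up the current weather for a city.",
  schema: z.object({ city: z.string() }),
});

const model = new ChatVertexAI({ model: "gemini-1.5-pro" });
const withTools = model.bindTools([getWeather]);

// "auto", "any", "none", or a specific tool name are all accepted here.
const response = await withTools.invoke("What's the weather in Paris?", {
  tool_choice: "get_weather",
});
console.log(response.tool_calls);
```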
Optional
tools
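Finally, a sketch of passing tools directly as a call option and, with tool_choice set to "any", restricting which of them may be called via allowed_function_names. The option names follow the property list above; the tools, their schemas, and the model are illustrative assumptions.

```typescript
import { z } from "zod";
import { tool } from "@langchain/core/tools";
import { ChatVertexAI } from "@langchain/google-vertexai";

// Two illustrative tools.
const add = tool(async ({ a, b }) => String(a + b), {
  name: "add",
  description: "Add two numbers.",
  schema: z.object({ a: z.number(), b: z.number() }),
});

const multiply = tool(async ({ a, b }) => String(a * b), {
  name: "multiply",
  description: "Multiply two numbers.",
  schema: z.object({ a: z.number(), b: z.number() }),
});

const model = new ChatVertexAI({ model: "gemini-1.5-pro" });

// The model must call some tool ("any"), but only "add" is permitted.
const response = await model.invoke("What is 2 + 3?", {
  tools: [add, multiply],
  tool_choice: "any",
  allowed_function_names: ["add"],
});
console.log(response.tool_calls);
```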