Optional additionalModelRequestFields
Additional inference parameters that the model supports, beyond the base set of inference parameters that the Converse API supports in the inferenceConfig field. For more information, see the model parameters link below.

Optional callbacks
Callbacks for this call and any sub-calls (e.g. a Chain calling an LLM). Tags are passed to all callbacks; metadata is passed to handle*Start callbacks.
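As a sketch of what a per-call callback handler looks like: the method names below (handleLLMStart, handleLLMEnd) follow @langchain/core's callback handler methods, but this plain object is an illustrative stand-in, not the real BaseCallbackHandler class.

```typescript
// Minimal callback-handler sketch (method names assumed from
// @langchain/core; this is a plain object, not the real handler class).
const events: string[] = [];

const loggingHandler = {
  handleLLMStart(_llm: unknown, prompts: string[]) {
    events.push(`llm:start with ${prompts.length} prompt(s)`);
  },
  handleLLMEnd() {
    events.push("llm:end");
  },
};

// With a real model you would pass it per call, e.g.:
//   await model.invoke(messages, { callbacks: [loggingHandler] });
// Here we just exercise the handler directly:
loggingHandler.handleLLMStart(undefined, ["What is 2 + 2?"]);
loggingHandler.handleLLMEnd();
console.log(events);
```

Passing the handler as a call option scopes it to this call and its sub-calls, rather than attaching it to the model instance.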
Optional configurable
Runtime values for attributes previously made configurable on this Runnable or its sub-Runnables.
Optional guardrailConfig
Configuration information for a guardrail that you want to use in the request.
Optional ls_structured_output_format
Describes the format of structured outputs. This should be provided if an output is considered to be structured.

Optional kwargs?: { method?: string }
An object containing the method used for structured output (e.g., "jsonMode").

Optional schema?: JsonSchema7Type
The JSON schema describing the expected output structure.
Optional maxConcurrency
Maximum number of parallel calls to make.
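To make the effect of a concurrency cap concrete, here is a sketch of the general technique: run a set of async tasks, but never more than a fixed number at once. This is illustrative only; the real implementation lives in the Runnable batching logic.

```typescript
// Run tasks with at most `limit` in flight at a time
// (a generic worker-pool sketch, not LangChain's own code).
async function runWithConcurrency<T>(
  tasks: (() => Promise<T>)[],
  limit: number
): Promise<T[]> {
  const results: T[] = new Array(tasks.length);
  let next = 0;
  async function worker() {
    while (next < tasks.length) {
      const i = next++; // claim the next task index
      results[i] = await tasks[i]();
    }
  }
  await Promise.all(
    Array.from({ length: Math.min(limit, tasks.length) }, worker)
  );
  return results;
}

// Comparable in spirit to: model.batch(inputs, { maxConcurrency: 2 })
const doubled = await runWithConcurrency(
  [1, 2, 3, 4].map((n) => async () => n * 2),
  2
);
console.log(doubled); // [2, 4, 6, 8]
```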
Optional metadata
Metadata for this call and any sub-calls (e.g. a Chain calling an LLM). Keys should be strings, values should be JSON-serializable.
Optional performanceConfig
Model performance configuration. See https://docs.aws.amazon.com/bedrock/latest/userguide/latency-optimized-inference.html
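A hypothetical call-options fragment enabling latency-optimized inference. The `latency: "optimized"` value mirrors the Converse API's PerformanceConfiguration; verify it against the linked AWS docs for your model before relying on it.

```typescript
// Assumed shape of a performanceConfig call option
// (value taken from Bedrock's PerformanceConfiguration; verify first).
const callOptions = {
  performanceConfig: { latency: "optimized" as const },
};
// e.g. await model.invoke(messages, callOptions);
console.log(callOptions.performanceConfig.latency); // "optimized"
```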
Optional recursionLimit
Maximum number of times a call can recurse. If not provided, defaults to 25.
Optional requestMetadata
Key-value pairs that you can use to filter invocation logs.
Optional runId
Unique identifier for the tracer run for this call. If not provided, a new UUID will be generated.
Optional runName
Name for the tracer run for this call. Defaults to the name of the class.
Optional signal
Abort signal for this call. If provided, the call will be aborted when the signal is aborted.
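A sketch of cancelling a call with the standard AbortController API; the model.invoke line is illustrative, the signal behavior itself is exercised directly.

```typescript
// Standard AbortController usage; only the model.invoke comment
// below is LangChain-specific.
const controller = new AbortController();

// Abort after 5 seconds unless the call finishes first:
const timer = setTimeout(() => controller.abort(), 5000);

// With a real model:
//   await model.invoke(messages, { signal: controller.signal });
// Here we just show the signal's behavior:
controller.abort();
clearTimeout(timer);
console.log(controller.signal.aborted); // true
```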
Optional stop
A list of stop sequences. A stop sequence is a sequence of characters that causes the model to stop generating the response.
Optional streamUsage
Whether or not to include usage data, like token counts, in the streamed response chunks. Passing this as a call option takes precedence over the class-level setting.
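To illustrate what consuming usage data from a stream looks like: the chunk shape below (usage_metadata with input_tokens/output_tokens) follows LangChain's message chunks, but the generator is a stand-in, not a real Bedrock stream.

```typescript
// Mock chunk shape approximating AIMessageChunk fields
// (an assumption for illustration; not a real stream).
type Chunk = {
  content: string;
  usage_metadata?: { input_tokens: number; output_tokens: number };
};

async function* fakeStream(): AsyncGenerator<Chunk> {
  yield { content: "Hel" };
  yield { content: "lo" };
  // With streamUsage enabled, a final chunk carries token counts:
  yield { content: "", usage_metadata: { input_tokens: 8, output_tokens: 2 } };
}

// With a real model:
//   for await (const chunk of await model.stream(input, { streamUsage: true })) ...
let text = "";
let totalTokens = 0;
for await (const chunk of fakeStream()) {
  text += chunk.content;
  if (chunk.usage_metadata) {
    totalTokens +=
      chunk.usage_metadata.input_tokens + chunk.usage_metadata.output_tokens;
  }
}
console.log(text, totalTokens); // "Hello" 10
```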
Optional tags
Tags for this call and any sub-calls (e.g. a Chain calling an LLM). You can use these to filter calls.
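The tracing-related options above (runName, tags, metadata) are typically passed together. A hypothetical fragment, with all values made up:

```typescript
// Hypothetical tracing options; field names are from the entries
// above, the values are invented for illustration.
const traceOptions = {
  runName: "support-ticket-triage",
  tags: ["prod", "triage"],
  metadata: { ticketId: "T-1234", locale: "en-US" },
};
// e.g. await chain.invoke(input, traceOptions);
console.log(traceOptions.tags.includes("prod")); // true
```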
Optional timeout
Timeout for this call in milliseconds.
Optional tool_choice
Tool choice for the model. If passing a string, it must be "any", "auto", or the name of the tool to use. Or, pass a BedrockToolChoice object.

If "any" is passed, the model must request at least one tool. If "auto" is passed, the model automatically decides whether to call a tool or to generate text instead. If a tool name is passed, the model is forced to call that specific tool.
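A sketch of the accepted tool_choice values. The tool name is hypothetical, and the object form mirrors Bedrock's ToolChoice shape; verify it against BedrockToolChoice in @langchain/aws before relying on it.

```typescript
// String forms of tool_choice (per the entry above):
const autoChoice = "auto";         // model decides: tool call or plain text
const anyChoice = "any";           // model must call at least one tool
const namedChoice = "get_weather"; // force this specific (hypothetical) tool

// Object form (assumed to mirror Bedrock's ToolChoice; verify first):
const objectChoice = { tool: { name: "get_weather" } };

// Typical usage (illustrative):
//   const modelWithTools = model.bindTools(tools, { tool_choice: "any" });
console.log(autoChoice, anyChoice, namedChoice, objectChoice.tool.name);
```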
Optional tools
A list of tools the model may use. Additional inference parameters that the model supports, beyond the base set of inference parameters that the Converse API supports in the inferenceConfig field, belong in additionalModelRequestFields; for more information, see the model parameters link below.
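A hypothetical use of additionalModelRequestFields: passing a model-specific parameter that inferenceConfig does not cover. The top_k key is an assumption (it is accepted by Anthropic models on Bedrock); check your model's parameters page before using any particular key.

```typescript
// Assumed model-specific extra parameter (top_k); confirm that the
// target model actually accepts it before using this in a request.
const callOptions = {
  additionalModelRequestFields: { top_k: 250 },
};
// e.g. await model.invoke(messages, callOptions);
console.log(callOptions.additionalModelRequestFields.top_k); // 250
```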