Optional
anthropicApiKey?: undefined | string
Anthropic API key

Optional
anthropicApiUrl?: undefined | string
Anthropic API URL

Optional
apiKey?: undefined | string
Anthropic API key

Optional
clientOptions?: ClientOptions
Overridable Anthropic ClientOptions

Optional
createClient
Optional method that returns an initialized underlying Anthropic client. Useful for accessing Anthropic models hosted on other cloud services such as Google Vertex.
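The createClient hook above can be sketched as a factory that receives the resolved client options and returns a ready-to-use client; the ClientOptions shape and AnthropicLikeClient type below are assumptions for illustration, not the library's actual declarations:

```typescript
// Hypothetical sketch of a createClient-style factory. The ClientOptions
// shape and AnthropicLikeClient type are assumptions for illustration.
type ClientOptions = { apiKey?: string; baseURL?: string };

interface AnthropicLikeClient {
  baseURL: string;
}

// Mirrors the documented hook: given the resolved ClientOptions, return
// an initialized client instance (e.g. one pointed at a Vertex-hosted
// endpoint instead of the default Anthropic URL).
const createClient = (options: ClientOptions): AnthropicLikeClient => ({
  baseURL: options.baseURL ?? "https://api.anthropic.com",
});

const client = createClient({ baseURL: "https://example-vertex-host" });
```

Overriding only the base URL while keeping the rest of the defaults is the typical reason to supply such a hook.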
Optional
invocationKwargs
Holds any additional parameters that are valid to pass to anthropic.messages that are not explicitly specified on this class.

Optional
llm?: undefined | BaseChatModel<BaseChatModelCallOptions, AIMessageChunk>

Optional
maxTokens?: undefined | number
A maximum number of tokens to generate before stopping.

Optional
maxTokensToSample?: undefined | number
A maximum number of tokens to generate before stopping.

Optional
model?: undefined | string
Model name to use

Optional
modelName?: undefined | string
Model name to use

Optional
stopSequences?: undefined | string[]
A list of strings upon which to stop generating. You probably want ["\n\nHuman:"], as that's the cue for the next turn in the dialog agent.

Optional
streamUsage?: undefined | boolean
Whether or not to include token usage data in streamed chunks.

Optional
streaming?: undefined | boolean
Whether to stream the results or not

Optional
system

Optional
temperature?: undefined | number
Amount of randomness injected into the response. Ranges from 0 to 1. Use temp closer to 0 for analytical / multiple choice, and temp closer to 1 for creative and generative tasks.

Optional
topK?: undefined | number
Only sample from the top K options for each subsequent token. Used to remove "long tail" low probability responses. Defaults to -1, which disables it.

Optional
topP?: undefined | number
Does nucleus sampling, in which we compute the cumulative distribution over all the options for each subsequent token in decreasing probability order and cut it off once it reaches a particular probability specified by top_p. Defaults to -1, which disables it. Note that you should either alter temperature or top_p, but not both.
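Taken together, a typical configuration using these fields can be sketched as follows; the AnthropicInput interface here is a hand-written reconstruction for illustration, not the library's actual declaration:

```typescript
// Hand-written sketch of the option fields documented above; the
// interface name and member list are assumptions for illustration.
interface AnthropicInput {
  anthropicApiKey?: string;
  anthropicApiUrl?: string;
  apiKey?: string;
  model?: string;
  maxTokens?: number;
  stopSequences?: string[];
  streamUsage?: boolean;
  streaming?: boolean;
  temperature?: number;
  topK?: number;
  topP?: number;
}

// Typical configuration for an analytical task: low temperature, an
// explicit token cap, and the recommended "\n\nHuman:" stop cue.
const options: AnthropicInput = {
  apiKey: "sk-ant-...",
  model: "claude-2",
  temperature: 0,
  maxTokens: 256,
  stopSequences: ["\n\nHuman:"],
  streaming: false,
};
```

Note that, per the topP description, this example varies temperature and leaves topP at its default rather than setting both.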