Class ChatAnthropicMessages<CallOptions>

Wrapper around Anthropic large language models.

To use this package, you should have an Anthropic API key set as an environment variable named ANTHROPIC_API_KEY or passed into the constructor.

Any parameters that are valid to be passed to anthropic.messages can be passed through invocationKwargs, even if not explicitly available on this class.

import { ChatAnthropic } from "@langchain/anthropic";

const model = new ChatAnthropic({
  temperature: 0.9,
  apiKey: "YOUR-API-KEY",
});
const res = await model.invoke("Hello!");
console.log(res);

Type Parameters

Hierarchy

Implements

Constructors

Properties

clientOptions: ClientOptions

Overridable Anthropic ClientOptions

maxTokens: number = 2048

The maximum number of tokens to generate before stopping.

model: string = "claude-2.1"

Model name to use

modelName: string = "claude-2.1"

Model name to use

streamUsage: boolean = true

Whether or not to include token usage data in streamed chunks. Defaults to true.

streaming: boolean = false

Whether to stream the results or not
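As an illustrative sketch (assuming a recent @langchain/anthropic release and a valid API key), streamed output can be consumed with the standard `.stream()` runnable method:

```typescript
import { ChatAnthropic } from "@langchain/anthropic";

const model = new ChatAnthropic({
  apiKey: "YOUR-API-KEY",
  streaming: true,
});

// Each yielded chunk is an AIMessageChunk; with streamUsage enabled
// (the default), token usage data is included in the streamed chunks.
const stream = await model.stream("Write a haiku about the sea.");
for await (const chunk of stream) {
  process.stdout.write(chunk.content as string);
}
```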

temperature: number = 1

Amount of randomness injected into the response. Ranges from 0 to 1. Use a temperature closer to 0 for analytical / multiple-choice tasks, and closer to 1 for creative and generative tasks.

topK: number = -1

Only sample from the top K options for each subsequent token. Used to remove "long tail" low probability responses. Defaults to -1, which disables it.

topP: number = -1

Does nucleus sampling, in which we compute the cumulative distribution over all the options for each subsequent token in decreasing probability order and cut it off once it reaches the probability specified by top_p. Defaults to -1, which disables it. Note that you should alter either temperature or top_p, but not both.

anthropicApiKey?: string

Anthropic API key

apiKey?: string

Anthropic API key

apiUrl?: string
invocationKwargs?: Kwargs

Holds any additional parameters that are valid to pass to anthropic.messages that are not explicitly specified on this class.

stopSequences?: string[]

A list of strings upon which to stop generating. You probably want ["\n\nHuman:"], as that's the cue for the next turn in the dialog agent.

batchClient: Anthropic
streamingClient: Anthropic

Methods

  • Formats LangChain StructuredTools to AnthropicTools.

    Parameters

    • tools: undefined | any[]

      The tools to format

    Returns undefined | Tool[]

    The formatted tools, or undefined if none are passed.

Throws an error if a mix of AnthropicTools and StructuredTools is passed.

  • Get the parameters used to invoke the model

    Parameters

    • Optional options: unknown

    Returns Omit<MessageCreateParamsNonStreaming | MessageCreateParamsStreaming, "messages"> & Kwargs

  • Creates a streaming request with retry.

    Parameters

    • request: MessageCreateParamsStreaming & Kwargs

      The parameters for creating a completion.

    • Optional options: AnthropicRequestOptions

    Returns Promise<Stream<RawMessageStreamEvent>>

    A streaming request.