A router chain in the LangChain framework that uses an LLM to decide which destination a given input should be routed to. It extends the RouterChain class and implements the LLMRouterChainInput interface, delegating the routing decision to its underlying LLMChain, whose prediction supplies the destination name and the inputs to pass along.
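
For orientation, a minimal sketch of a routing call, assuming a routerChain built with the static fromLLM method documented further down; the exact keys inside next_inputs depend on the output parser attached to the router prompt.

```typescript
// `routerChain` is an LLMRouterChain (see the fromLLM example further down).
// Its underlying LLMChain predicts which destination should handle the input.
const route = await routerChain.invoke({ input: "What is the speed of light?" });
// e.g. { destination: "physics", next_inputs: { input: "What is the speed of light?" } }
```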

Hierarchy

  • RouterChain
    • LLMRouterChain

Implements

  • LLMRouterChainInput

Constructors

Properties

llmChain: LLMChain<RouterOutputSchema, LLMType>
memory?: BaseMemory

Accessors

Methods

  • Parameters

    • inputs: ChainValues[]
    • Optional config: (RunnableConfig | CallbackManager | (BaseCallbackHandler | BaseCallbackHandlerMethodsClass)[])[]

    Returns Promise<ChainValues[]>

    Deprecated

    Use .batch() instead. Will be removed in 0.2.0.

    Call the chain on all inputs in the list
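
    Since this list-style call is deprecated, the same fan-out can be expressed with .batch(); a minimal sketch, assuming the routerChain from the fromLLM example below.

```typescript
// Route several inputs in one call; each element of the result is a ChainValues
// object holding the parsed routing decision for the corresponding input.
const routes = await routerChain.batch([
  { input: "Explain quantum entanglement." },
  { input: "Who won the Battle of Hastings?" },
]);
```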

  • Parameters

    • values: ChainValues & {
          signal?: AbortSignal;
          timeout?: number;
      }
    • Optional config: RunnableConfig | CallbackManager | (BaseCallbackHandler | BaseCallbackHandlerMethodsClass)[]
    • Optional tags: string[] (deprecated; pass tags through the config argument instead)

    Returns Promise<ChainValues>

    Deprecated

    Use .invoke() instead. Will be removed in 0.2.0.

    Run the core logic of this chain and add to output if desired.

    Wraps _call and handles memory.
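
    The non-deprecated equivalent is .invoke(); a minimal sketch, assuming the routerChain from the fromLLM example below, with per-call settings passed through the RunnableConfig argument.

```typescript
// Single-input call through the Runnable interface; callbacks, tags, metadata,
// and a run name all travel in the optional RunnableConfig argument.
const result = await routerChain.invoke(
  { input: "Summarize the causes of World War I." },
  { tags: ["router"], runName: "route-question" }
);
```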

  • Invokes the chain with the provided input and returns the output.

    Parameters

    • input: ChainValues

      Input values for the chain run.

    • Optional options: RunnableConfig

    Returns Promise<ChainValues>

    Promise that resolves with the output of the chain run.
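
    A typical pattern is to read the routing decision from the resolved ChainValues and hand the forwarded inputs to the chosen chain; destinationChains below is a hypothetical Record<string, LLMChain> keyed by the destination names used in the router prompt.

```typescript
// The routing decision comes back as ordinary chain output values.
const { destination, next_inputs } = await routerChain.invoke({
  input: "Why is the sky blue?",
});

// `destinationChains` is a hypothetical map from destination name to the chain
// that should handle it; the router only decides, it does not run the target.
const answer = await destinationChains[destination].invoke(next_inputs);
```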

  • Parameters

    • inputs: Record<string, unknown>
    • outputs: Record<string, unknown>
    • returnOnlyOutputs: boolean = false

    Returns Promise<Record<string, unknown>>

  • Parameters

    • input: any
    • Optional config: RunnableConfig | CallbackManager | (BaseCallbackHandler | BaseCallbackHandlerMethodsClass)[]

    Returns Promise<string>

    Deprecated

    Use .invoke() instead. Will be removed in 0.2.0.

  • A static method that creates an LLMRouterChain from a BaseLanguageModel and a BasePromptTemplate. It accepts an optional options object (everything in LLMRouterChainInput except "llmChain") and returns an LLMRouterChain whose llmChain is built from the given model and prompt.

    Parameters

    • llm: BaseLanguageModelInterface<any, BaseLanguageModelCallOptions>

      A BaseLanguageModel instance.

    • prompt: BasePromptTemplate<any, BasePromptValueInterface, any>

      A BasePromptTemplate instance.

    • Optional options: Omit<LLMRouterChainInput, "llmChain">

      Optional LLMRouterChainInput object, excluding "llmChain".

    Returns LLMRouterChain

    An instance of LLMRouterChain.
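
    A minimal construction sketch; the import paths assume the 0.1.x package layout, and the RouterOutputParser helper from langchain/output_parsers is assumed here to make the model's reply parse into the destination / next_inputs shape the router expects.

```typescript
import { z } from "zod";
import { OpenAI } from "@langchain/openai";
import { PromptTemplate } from "@langchain/core/prompts";
import { LLMRouterChain } from "langchain/chains";
import { RouterOutputParser } from "langchain/output_parsers";

// Parser that turns the model's JSON reply into { destination, next_inputs }.
const routerParser = RouterOutputParser.fromZodSchema(
  z.object({
    destination: z.string().describe("name of the prompt to use, or DEFAULT"),
    next_inputs: z.object({
      input: z.string().describe("the original input, possibly rephrased"),
    }),
  })
);

// Router prompt: the model sees the candidate destinations plus the parser's
// formatting instructions, and must answer with the JSON described above.
const routerPrompt = new PromptTemplate({
  template: `Given a raw text input, choose the prompt best suited for it.

<< CANDIDATE PROMPTS >>
physics: good for questions about physics
history: good for questions about history

<< FORMATTING >>
{format_instructions}

<< INPUT >>
{input}`,
  inputVariables: ["input"],
  partialVariables: {
    format_instructions: routerParser.getFormatInstructions(),
  },
  outputParser: routerParser,
});

// Build the router chain from the model and the router prompt.
const routerChain = LLMRouterChain.fromLLM(
  new OpenAI({ temperature: 0 }),
  routerPrompt
);
```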
