llm: LLM Wrapper to use.
outputKey: Key to use for output; defaults to "text".
prompt: Prompt object to use.
Optional llmKwargs: Kwargs to pass to the LLM.
Optional memory.
Optional outputParser: OutputParser to use.
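A minimal construction sketch, assuming the LangChain.js LLMChain API these fields belong to; the model settings and prompt text below are illustrative, not part of the original:

```typescript
import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";
import { LLMChain } from "langchain/chains";

// Illustrative LLM wrapper and prompt; substitute your own.
const llm = new OpenAI({ temperature: 0 });
const prompt = PromptTemplate.fromTemplate(
  "Suggest one follow-up task for the objective: {objective}"
);

// outputKey defaults to "text"; llmKwargs would be forwarded to the model.
const chain = new LLMChain({ llm, prompt, outputKey: "text" });
```

Running the chain requires valid model credentials (for example an OpenAI API key in the environment).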
apply: Call the chain on all inputs in the list.
Optional config: (RunnableConfig | CallbackManager | (BaseCallbackHandler | BaseCallbackHandlerMethodsClass)[])[]
Deprecated: Use .batch() instead. Will be removed in 0.2.0. This feature is not recommended for use.
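Since apply is deprecated in favor of .batch(), a migration sketch (assuming a chain built with the LangChain.js LLMChain API; the prompt and inputs are illustrative):

```typescript
import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";
import { LLMChain } from "langchain/chains";

const chain = new LLMChain({
  llm: new OpenAI({ temperature: 0 }),
  prompt: PromptTemplate.fromTemplate("Create a task for: {objective}"),
});

const inputs = [
  { objective: "write unit tests" },
  { objective: "update the README" },
];

// Before (deprecated, removed in 0.2.0):
//   const results = await chain.apply(inputs);
// After: .batch() accepts the same list of inputs.
const results = await chain.batch(inputs);
```

Both forms return one result per input; .batch() additionally accepts per-call config such as concurrency options.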
_call: Run the core logic of this chain and add to output if desired.
call: Wraps _call and handles memory.
Optional config: BaseCallbackConfig | CallbackManager | (BaseCallbackHandler | BaseCallbackHandlerMethodsClass)[]
Static deserialize: Load a chain from a json-like object describing it.
Static fromLLM: Creates a new TaskCreationChain instance. It takes an object of type LLMChainInput as input, omitting the 'prompt' field. It uses the PromptTemplate class to create a new prompt based on the task creation template and the input variables. The new TaskCreationChain instance is then created with this prompt and the remaining fields from the input object.
Parameters: An object of type LLMChainInput, omitting the 'prompt' field.
Returns: A new instance of TaskCreationChain.
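A usage sketch for fromLLM; the import path for TaskCreationChain is an assumption (in LangChain.js it is defined alongside BabyAGI in the experimental module), and the model choice is illustrative:

```typescript
import { OpenAI } from "langchain/llms/openai";
// Assumed path: TaskCreationChain ships with BabyAGI in the experimental module.
import { TaskCreationChain } from "langchain/experimental/babyagi";

// fromLLM builds the task-creation prompt internally, so only the
// remaining LLMChainInput fields (here, the llm) are passed in.
const taskCreationChain = TaskCreationChain.fromLLM({
  llm: new OpenAI({ temperature: 0 }),
});
```

The returned chain can then be invoked like any other LLMChain, e.g. with call or batch.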
TaskCreationChain: Chain to generate tasks.