• Create a chain that feeds documents to a model one at a time and updates the output.

    Parameters

    • llm: LanguageModelLike

      Language model to use for responding.

    • initialPrompt: BasePromptTemplate<any, BasePromptValueInterface, any>

      The prompt to use on the first document. Must accept "context" as one of the input variables. The first document will be passed in as "context".

    • refinePrompt: BasePromptTemplate<any, BasePromptValueInterface, any>

      The prompt to use on all subsequent documents. Must accept "context" and "output" as input variables. A document will be passed in as "context" and the refined output up to this iteration will be passed in as "output".

    • documentPrompt: BasePromptTemplate<any, BasePromptValueInterface, any> = DEFAULT_DOCUMENT_PROMPT

      Prompt used for formatting each document into a string. Input variables can be "page_content" or any metadata keys that are in all documents. "page_content" will automatically retrieve the Document.page_content, and all other input variables will be automatically retrieved from the Document.metadata dictionary. Defaults to a prompt that only contains Document.page_content.
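
      The document formatting described above can be sketched as a simple template fill. `Doc` and `formatDoc` below are illustrative stand-ins, not library APIs:

      ```typescript
      // Minimal sketch: fill "{page_content}" from the document body and any
      // other "{key}" placeholder from the metadata dictionary.
      interface Doc {
        page_content: string;
        metadata: Record<string, string>;
      }

      function formatDoc(template: string, doc: Doc): string {
        return template.replace(/\{(\w+)\}/g, (_, key) =>
          key === "page_content" ? doc.page_content : doc.metadata[key] ?? ""
        );
      }

      const doc: Doc = {
        page_content: "Page one text.",
        metadata: { source: "report.pdf" },
      };
      // The default document prompt is equivalent to just "{page_content}".
      console.log(formatDoc("{page_content}", doc)); // "Page one text."
      console.log(formatDoc("[{source}] {page_content}", doc)); // "[report.pdf] Page one text."
      ```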

    Returns Promise<RunnableSequence<any, any>>

    An LCEL Runnable chain. Expects a dictionary as input with a list of Documents being passed under the "context" key. Returns a dictionary as output. The output dictionary contains two keys, "output" and "intermediate_steps". "output" contains the final output. "intermediate_steps" contains the list of intermediate output strings generated by the chain, in the order that they were generated.
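
    The chain's control flow can be sketched as a fold over the document list: the first document goes through the initial prompt, each later document updates the running output, and every intermediate output is recorded in order. The step functions below are stand-ins for the real LLM calls:

    ```typescript
    // Illustrative refine loop, not the library's implementation.
    type RefineStep = (context: string, output: string) => string;

    function refineLoop(
      docs: string[],
      initialStep: (context: string) => string,
      refineStep: RefineStep
    ): { output: string; intermediate_steps: string[] } {
      const intermediate_steps: string[] = [];
      // First document: initial prompt only.
      let output = initialStep(docs[0]);
      intermediate_steps.push(output);
      // Remaining documents: refine the running output one document at a time.
      for (const doc of docs.slice(1)) {
        output = refineStep(doc, output);
        intermediate_steps.push(output);
      }
      return { output, intermediate_steps };
    }

    // Stub "model" steps that just concatenate, to show the data flow.
    const result = refineLoop(
      ["page 1", "page 2", "page 3"],
      (ctx) => `summary(${ctx})`,
      (ctx, out) => `${out}+${ctx}`
    );
    console.log(result.output); // "summary(page 1)+page 2+page 3"
    console.log(result.intermediate_steps.length); // 3
    ```

    The final dictionary mirrors the chain's documented output: "output" holds the last refinement, and "intermediate_steps" holds every output in generation order.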

    Example

    import { ChatOpenAI } from "@langchain/openai";
    import { ChatPromptTemplate } from "@langchain/core/prompts";

    const llm = new ChatOpenAI();
    const initialPrompt = ChatPromptTemplate.fromMessages([
      ["system", "Summarize this information:"],
      ["user", "{context}"],
    ]);
    const refinePrompt = ChatPromptTemplate.fromMessages([
      ["system", `You are summarizing a long document one page at a time.
    You have summarized part of the document. Given the next page, update your
    summary. Respond with only the updated summary and no other text.
    Here is your working summary:\n\n{output}`],
      ["user", "Here is the next page:\n\n{context}"],
    ]);

    // `documents` is assumed to be a list of Documents defined elsewhere.
    const chain = await createRefineDocumentsChain(llm, initialPrompt, refinePrompt);
    const output = await chain.invoke({ context: documents });

Generated using TypeDoc