DeepInfraLLM Cannot read properties of undefined (reading 'reduce') #7637

Open
5 tasks done
theodufort opened this issue Feb 2, 2025 · 2 comments
Labels
auto:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments


theodufort commented Feb 2, 2025

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

I was using the @langchain/ollama package for my LLM (which was working), but I decided to try @langchain/community/llms/deepinfra, and now I am getting the error: Cannot read properties of undefined (reading 'reduce')

My code is the following:

"use server";
import { createStuffDocumentsChain } from "langchain/chains/combine_documents";
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";
import { HumanMessage, AIMessage } from "@langchain/core/messages";
import { createClient } from "@/utils/supabase/server";
import { Ollama } from "@langchain/ollama";
import { retrieveContentChunks } from "../retrieve-content-embeddings";
import { retrieveDocumentContentChunks } from "../retrieve-document-content-embeddings";
import { ChatOpenAI } from "@langchain/openai";
import { DeepInfraLLM } from "@langchain/community/llms/deepinfra";
// const llm = new Ollama({
//   numGpu: 2,
//   numCtx: 4096,
//   baseUrl: process.env.OLLAMA_BASEURL,
//   model: "phi4:14b-q4_K_M",
// });
const llm = new DeepInfraLLM({
  apiKey: process.env.DEEPINFRA_API_KEY,
  temperature: 1,
  maxTokens: 500,
  model: "meta-llama/Llama-3.3-70B-Instruct",
});

interface Ask {
  user_id: string;
  workspace_id: string;
  message: string;
}
export const AskAIChat = async ({ user_id, workspace_id, message }: Ask) => {
  if (!message) {
    throw new Error('Missing "message" in request body.');
  }

  if (!process.env.OLLAMA_BASEURL) {
    throw new Error("Missing OLLAMA_BASEURL environment variable");
  }

  console.log("AI Request:", {
    user_id,
    workspace_id,
    message,
    timestamp: new Date().toISOString(),
  });

  try {
    const SYSTEM_TEMPLATE = `Answer the user's questions based on the below context. If documents or notes are referenced, include their titles in the response.
If the context doesn't contain any relevant information to the question, respond with "I don't have enough relevant information to answer that question."

For document references, include the document name and relevant section where possible.

<context>
{context}
</context>
`;

    const questionAnsweringPrompt = ChatPromptTemplate.fromMessages([
      ["system", SYSTEM_TEMPLATE],
      new MessagesPlaceholder("messages"),
    ]);

    const documentChain = await createStuffDocumentsChain({
      llm,
      prompt: questionAnsweringPrompt,
    });
    const notesContext = await retrieveContentChunks({ query: message });
    const documentsContext = await retrieveDocumentContentChunks({
      query: message,
    });
    console.log("Notes Context:", notesContext.length);
    console.log("Documents Context:", documentsContext.length);

    const context = [...notesContext, ...documentsContext];
    console.log("Combined Context:", context.length);

    if (!context || context.length === 0) {
      console.warn("No context found for query:", message);
      return "I couldn't find any relevant information to answer that question.";
    }
    // Build source variable with metadata (avoiding duplicates)
    let source = "## Information Sources\n\n";
    const uniqueSourceIds = new Set();
    context.forEach((doc) => {
      const sourceId =
        doc.metadata.note_id || doc.metadata.document_id || "Unknown";
      if (!uniqueSourceIds.has(sourceId)) {
        uniqueSourceIds.add(sourceId);
        const sourceType = doc.metadata.note_id ? "Note" : "Document";
        source += `- ${sourceType}: [View Source](#source-${sourceId})\n`;
      }
    });
    console.log("before");
    // Stream the response
    const responseStream = await documentChain.stream({
      messages: [new HumanMessage(message)],
      context: context,
    });
    console.log("after");
    let finalResponse = "";

    try {
      for await (const chunk of responseStream) {
        if (typeof chunk === "string") {
          finalResponse += chunk;
        } else {
          console.warn("Received non-string chunk:", chunk);
        }
      }
    } catch (streamError) {
      console.error("Stream Error:", streamError);
      throw new Error("Failed to process response stream");
    }

    // Add metadata markers to the response
    const sourceMetadata = context.map((doc) => ({
      id: doc.metadata.note_id || doc.metadata.document_id,
      type: doc.metadata.note_id ? "note" : "document",
    }));

    finalResponse += `\n\n${source}`;
    finalResponse += `\n<!-- METADATA:${JSON.stringify(sourceMetadata)} -->`;

    return finalResponse;
    // const response = await ollama.chat({
    //   model: "llama3.1:8b-instruct-q6_K",
    //   messages: messages,
    //   stream: true,
    // });
  } catch (error) {
    console.error("AI Response Error:", error);
    throw new Error(
      `Error generating response: ${error instanceof Error ? error.message : "Unknown error"}`
    );
  }
};

Error Message and Stack Trace (if applicable)

AI Request: {
  user_id: '68723abd-6fda-48d5-86b6-0d9badcae0e8',
  workspace_id: '',
  message: 'Find me a property with a big kitchen',
  timestamp: '2025-02-02T01:16:03.292Z'
}
Notes Context: 2
Documents Context: 2
Combined Context: 4
before
AI Response Error: [TypeError: Cannot read properties of undefined (reading 'reduce')]
 ⨯ Error: Error generating response: Cannot read properties of undefined (reading 'reduce')
    at AskAIChat (file://E%3A/Clones/website-frontend/utils/supabase/rag/chat/ask.ts:133:10)
  131 |   } catch (error) {
  132 |     console.error("AI Response Error:", error);
> 133 |     throw new Error(
      |          ^
  134 |       `Error generating response: ${error instanceof Error ? error.message : "Unknown error"}`
  135 |     );
  136 |   } {
  digest: '1383777046'
}
 POST /dashboard/chat 500 in 5871ms

Description

  1. I instantiate the LLM with DeepInfraLLM.
  2. The document chain should return an LLM response, but the call throws the error above instead.

System Info

langchain ^0.3.28
Windows 10
Bun 1.1.8
Node v18.19.1


dosubot bot commented Feb 2, 2025

Hello there, @theodufort! I'm here to help you out with any bugs or questions you have. Let's get to the bottom of this together!

The error you're encountering, "Cannot read properties of undefined (reading 'reduce')," typically occurs when the reduce method is called on an undefined variable or an object that doesn't have the reduce method. In the context of your code, this could be happening in several places, but it's most likely related to how the context or messages are being handled in the documentChain.stream method.

Here are a few things to check and try:

  1. Ensure Context is Properly Formatted: The context object passed to documentChain.stream should be an array of Document objects, each with a pageContent property. Make sure that notesContext and documentsContext are arrays of such Document objects before they are combined into context.

  2. Check the documentChain.stream Method: Verify that the documentChain.stream method is being called with the correct parameters. The context should be an array, and messages should be an array of message objects like HumanMessage.

  3. Debug the Context and Messages: Add console logs before the documentChain.stream call to ensure that context and messages are defined and have the expected structure:

    console.log("Context:", context);
    console.log("Messages:", messages);
  4. Review the createStuffDocumentsChain Implementation: Ensure that the createStuffDocumentsChain function is correctly handling the input parameters and that the llm and prompt are properly configured.

  5. Check for Undefined Variables: Look for any variables that might be undefined when passed to functions or methods that use reduce. This includes checking any transformations or operations on context or messages.

If these steps don't resolve the issue, consider isolating the problem by creating a minimal reproducible example and checking each part of the code that interacts with the reduce method. This can help identify where the undefined value is being introduced.
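
As a concrete starting point for that isolation step, here is a minimal sketch (not a fix): it assumes a valid DEEPINFRA_API_KEY, the listing text and the isolate() helper are made-up placeholders, and it uses invoke() instead of stream() so streaming is removed as a variable. It first calls the DeepInfraLLM directly, then runs createStuffDocumentsChain with hand-built Document objects, so a failure can be attributed to either the wrapper or the shape of the context input.

// Minimal isolation sketch; DEEPINFRA_API_KEY and the document text are placeholders.
import { Document } from "@langchain/core/documents";
import { HumanMessage } from "@langchain/core/messages";
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";
import { createStuffDocumentsChain } from "langchain/chains/combine_documents";
import { DeepInfraLLM } from "@langchain/community/llms/deepinfra";

const llm = new DeepInfraLLM({
  apiKey: process.env.DEEPINFRA_API_KEY,
  maxTokens: 500,
  model: "meta-llama/Llama-3.3-70B-Instruct",
});

async function isolate() {
  // Step 1: call the LLM on its own. If this already throws the `reduce`
  // error, the problem is inside the DeepInfraLLM wrapper, not the chain.
  const direct = await llm.invoke("Reply with the single word: hello");
  console.log("Direct DeepInfraLLM call:", direct);

  // Step 2: run the stuff-documents chain with hand-built Document objects,
  // so the context shape is known to be correct (see point 1 above).
  const prompt = ChatPromptTemplate.fromMessages([
    ["system", "Answer from this context:\n<context>\n{context}\n</context>"],
    new MessagesPlaceholder("messages"),
  ]);
  const chain = await createStuffDocumentsChain({ llm, prompt });
  const answer = await chain.invoke({
    context: [
      new Document({
        pageContent: "Listing A: 4-bedroom house with a very large kitchen.",
        metadata: { document_id: "demo-1" },
      }),
    ],
    messages: [new HumanMessage("Which listing has a big kitchen?")],
  });
  console.log("Chain answer:", answer);
}

isolate().catch(console.error);

If step 1 succeeds but step 2 fails, the next thing to compare is what retrieveContentChunks and retrieveDocumentContentChunks actually return versus proper Document instances with a pageContent property.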


@dosubot dosubot bot added the auto:bug Related to a bug, vulnerability, unexpected error with an existing feature label Feb 2, 2025
@jacoblee93 (Collaborator) commented

CC @ovuruska - would you be willing to take a look?
