Question: "Unable to set a custom OpenAI model with LangChain in Node.js"
Background:
I am trying to set the "gpt-3.5-turbo" model on my OpenAI instance using LangChain in Node.js, but the code below sends my requests with the default text-davinci model.
const { OpenAI } = require("langchain/llms");
const { ConversationChain } = require("langchain/chains");
const { BufferMemory } = require("langchain/memory");

const model = new OpenAI({ model: "gpt-3.5-turbo", openAIApiKey: "###", temperature: 0.9 });
const memory = new BufferMemory();
const chain = new ConversationChain({ llm: model, memory: memory });

async function x() {
  const res = await chain.call({ input: "Hello this is xyz!" });
  const res2 = await chain.call({ input: "Hello what was my name?" });
  console.log(res);
  console.log(res2);
}

x();
In the documentation I found how to set the model with Python: it is set via the model_name attribute on the instance. But that approach doesn't work in Node.js. Is there any way to set a custom model with LangChain in Node.js?
Solution:
I looked at the codebase, and it seems that `modelName` is the parameter you should use.
It's strange that the search function in the docs gives no results for `modelName`. I guess these are the same parameters used in the Python API, just in camel case.
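Applying that, the fix to the question's code is a one-key change in the constructor options: `modelName` (camelCase) instead of `model`. A minimal sketch of the corrected options object (the API key is a placeholder, as in the question):

```javascript
// Options for `new OpenAI(...)` from langchain/llms.
// `modelName` (camelCase) is the key that selects the model; the
// `model` key from the question (and Python's snake_case `model_name`)
// are not recognized here, so the default text-davinci model was used.
const options = {
  modelName: "gpt-3.5-turbo",
  openAIApiKey: "###", // placeholder API key
  temperature: 0.9,
};

// Then construct the wrapper exactly as in the question:
// const { OpenAI } = require("langchain/llms");
// const model = new OpenAI(options);
console.log(options.modelName); // "gpt-3.5-turbo"
```

The rest of the question's code (ConversationChain, BufferMemory) stays the same; only the constructor option changes.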
Edit: the docs do cover it, under "Relationship with Python LangChain".