Now that we understand how it works, we can start building applications with Semantic Kernel.
Semantic Kernel wraps the embedding functionality in Memory, which stores contextual information. It works much like a computer's RAM, with the LLM playing the role of the CPU: all we need to do is fetch the relevant information from memory and hand it to the CPU for processing.
Memory Configuration
Using Memory requires registering an embedding model; here that is text-embedding-ada-002. You also need to give the Kernel a MemoryStore to hold the saved information. Semantic Kernel provides a VolatileMemoryStore for this, which is simply a plain in-memory MemoryStore.
var kernel = Kernel.Builder.Configure(c =>
{
    c.AddOpenAITextCompletionService("openai", "text-davinci-003", Environment.GetEnvironmentVariable("MY_OPEN_AI_API_KEY"));
    c.AddOpenAIEmbeddingGenerationService("openai", "text-embedding-ada-002", Environment.GetEnvironmentVariable("MY_OPEN_AI_API_KEY"));
})
.WithMemoryStorage(new VolatileMemoryStore())
.Build();
Storing Information
With the basic services registered, you can start saving information into Memory.
const string MemoryCollectionName = "aboutMe";

await kernel.Memory.SaveInformationAsync(MemoryCollectionName, id: "info1", text: "My name is Andrea");
await kernel.Memory.SaveInformationAsync(MemoryCollectionName, id: "info2", text: "I currently work as a tourist operator");
await kernel.Memory.SaveInformationAsync(MemoryCollectionName, id: "info3", text: "I currently live in Seattle and have been living there since 2005");
await kernel.Memory.SaveInformationAsync(MemoryCollectionName, id: "info4", text: "I visited France and Italy five times since 2015");
await kernel.Memory.SaveInformationAsync(MemoryCollectionName, id: "info5", text: "My family is from New York");
SaveInformationAsync converts the text into an embedding vector using the embedding model and stores it in the MemoryStore. The CollectionName acts like a database table name, and the Id is simply the record's key.
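For a quick sanity check, a single record can be read back by its key. This is a minimal sketch that assumes the preview build in use exposes ISemanticTextMemory.GetAsync(collection, key); verify the exact signature against the interface in your package version.

// A minimal sketch: read one record back by its key (the id used when saving).
// Assumes ISemanticTextMemory.GetAsync(collection, key) is available in this build;
// check the ISemanticTextMemory interface in your package version if the call differs.
var stored = await kernel.Memory.GetAsync(MemoryCollectionName, "info1");
Console.WriteLine(stored?.Metadata.Text); // expected: "My name is Andrea"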
Semantic Search
Once the information is stored, it can be used for semantic search.
Simply call Memory.SearchAsync with the target collection and a query; the query is also converted into an embedding, and the MemoryStore then computes similarity to find the closest matching entries.
var questions = new[]
{
    "what is my name?",
    "where do I live?",
    "where is my family from?",
    "where have I travelled?",
    "what do I do for work?",
};

foreach (var q in questions)
{
    var response = await kernel.Memory.SearchAsync(MemoryCollectionName, q).FirstOrDefaultAsync();
    Console.WriteLine(q + " " + response?.Metadata.Text);
}

// output
/*
what is my name? My name is Andrea
where do I live? I currently live in Seattle and have been living there since 2005
where is my family from? My family is from New York
where have I travelled? I visited France and Italy five times since 2015
what do I do for work? I currently work as a tourist operator
*/
Even at this stage, without any summarization at all, this kind of semantic lookup is already valuable.
Storing References
Besides plain information, you can also store references, such as useful links.
const string memoryCollectionName = "SKGitHub";

var githubFiles = new Dictionary<string, string>()
{
    ["https://github.com/microsoft/semantic-kernel/blob/main/README.md"]
        = "README: Installation, getting started, and how to contribute",
    ["https://github.com/microsoft/semantic-kernel/blob/main/samples/notebooks/dotnet/2-running-prompts-from-file.ipynb"]
        = "Jupyter notebook describing how to pass prompts from a file to a semantic skill or function",
    ["https://github.com/microsoft/semantic-kernel/blob/main/samples/notebooks/dotnet/Getting-Started-Notebook.ipynb"]
        = "Jupyter notebook describing how to get started with the Semantic Kernel",
    ["https://github.com/microsoft/semantic-kernel/tree/main/samples/skills/ChatSkill/ChatGPT"]
        = "Sample demonstrating how to create a chat skill interfacing with ChatGPT",
    ["https://github.com/microsoft/semantic-kernel/blob/main/dotnet/src/SemanticKernel/Memory/Volatile/VolatileMemoryStore.cs"]
        = "C# class that defines a volatile embedding store",
    ["https://github.com/microsoft/semantic-kernel/tree/main/samples/dotnet/KernelHttpServer/README.md"]
        = "README: How to set up a Semantic Kernel Service API using Azure Function Runtime v4",
    ["https://github.com/microsoft/semantic-kernel/tree/main/samples/apps/chat-summary-webapp-react/README.md"]
        = "README: README associated with a sample starter react-based chat summary webapp",
};

foreach (var entry in githubFiles)
{
    await kernel.Memory.SaveReferenceAsync(
        collection: memoryCollectionName,
        description: entry.Value,
        text: entry.Value,
        externalId: entry.Key,
        externalSourceName: "GitHub"
    );
}
As before, just search with SearchAsync.
string ask = "I love Jupyter notebooks, how should I get started?"; | |
Console.WriteLine("===========================\n" + | |
"Query: " + ask + "\n"); | |
var memories = kernel.Memory.SearchAsync(memoryCollectionName, ask, limit: 5, minRelevanceScore: 0.77); | |
var i = 0; | |
await foreach (MemoryQueryResult memory in memories) | |
{ | |
Console.WriteLine($"Result {++i}:"); | |
Console.WriteLine(" URL: : " + memory.Metadata.Id); | |
Console.WriteLine(" Title : " + memory.Metadata.Description); | |
Console.WriteLine(" ExternalSource: " + memory.Metadata.ExternalSourceName); | |
Console.WriteLine(" Relevance: " + memory.Relevance); | |
Console.WriteLine(); | |
} | |
//output | |
/* | |
=========================== | |
Query: I love Jupyter notebooks, how should I get started? | |
Result 1: | |
URL: : https://github.com/microsoft/semantic-kernel/blob/main/samples/notebooks/dotnet/Getting-Started-Notebook.ipynb | |
Title : Jupyter notebook describing how to get started with the Semantic Kernel | |
ExternalSource: GitHub | |
Relevance: 0.8677381632778319 | |
Result 2: | |
URL: : https://github.com/microsoft/semantic-kernel/blob/main/samples/notebooks/dotnet/2-running-prompts-from-file.ipynb | |
Title : Jupyter notebook describing how to pass prompts from a file to a semantic skill or function | |
ExternalSource: GitHub | |
Relevance: 0.8162989178955157 | |
Result 3: | |
URL: : https://github.com/microsoft/semantic-kernel/blob/main/README.md | |
Title : README: Installation, getting started, and how to contribute | |
ExternalSource: GitHub | |
Relevance: 0.8083238591883483 | |
*/ |
Two additional parameters are used here: limit caps the number of results returned, keeping only the top matches, and minRelevanceScore sets a minimum relevance score. Its range is 0.0 to 1.0, where 1.0 means an exact match.
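The relevance score comes from the vector similarity between the query embedding and each stored embedding; for the in-memory store this is typically cosine similarity. The sketch below is illustrative only and is not the store's actual implementation.

// Illustrative only: a cosine-similarity style relevance score between two embeddings.
// The real MemoryStore implementations may differ in details (normalization, clamping).
static double CosineSimilarity(float[] a, float[] b)
{
    double dot = 0, normA = 0, normB = 0;
    for (var i = 0; i < a.Length; i++)
    {
        dot += a[i] * b[i];
        normA += a[i] * a[i];
        normB += b[i] * b[i];
    }
    // 1.0 means the vectors point in exactly the same direction, i.e. an exact match
    return dot / (Math.Sqrt(normA) * Math.Sqrt(normB));
}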
Semantic Question Answering
By combining Memory's storage and search with semantic skills, you can quickly build a practical semantic Q&A application.
Just fill the retrieved information into the prompt, hand both the context and the question to the LLM, and wait for a satisfying answer.
const string MemoryCollectionName = "aboutMe";

await kernel.Memory.SaveInformationAsync(MemoryCollectionName, id: "info1", text: "My name is Andrea");
await kernel.Memory.SaveInformationAsync(MemoryCollectionName, id: "info2", text: "I currently work as a tourist operator");
await kernel.Memory.SaveInformationAsync(MemoryCollectionName, id: "info3", text: "I currently live in Seattle and have been living there since 2005");
await kernel.Memory.SaveInformationAsync(MemoryCollectionName, id: "info4", text: "I visited France and Italy five times since 2015");
await kernel.Memory.SaveInformationAsync(MemoryCollectionName, id: "info5", text: "My family is from New York");

var prompt =
    """
    It can give explicit instructions or say 'I don't know' if it does not have an answer.
    Information about me, from previous conversations:
    {{ $fact }}
    User: {{ $ask }}
    ChatBot:
    """;

var skill = kernel.CreateSemanticFunction(prompt);

var ask = "Hello, I think we've met before, remember? my name is...";
var fact = await kernel.Memory.SearchAsync(MemoryCollectionName, ask).FirstOrDefaultAsync();

var context = kernel.CreateNewContext();
context["fact"] = fact?.Metadata?.Text;
context["ask"] = ask;

var resultContext = await skill.InvokeAsync(context);
resultContext.Result.Dump();

// output
/*
Hi there! Yes, I remember you. Your name is Andrea, right?
*/
Optimizing the Search Process
Because this scenario is so common, Semantic Kernel ships with a TextMemorySkill that wraps the search in a function call, simplifying the whole process.
// .. SaveInformations

// TextMemorySkill provides the "recall" function
kernel.ImportSkill(new TextMemorySkill());

var prompt =
    """
    It can give explicit instructions or say 'I don't know' if it does not have an answer.
    Information about me, from previous conversations:
    {{ recall $ask }}
    User: {{ $ask }}
    ChatBot:
    """;

var skill = kernel.CreateSemanticFunction(prompt);

var ask = "Hello, I think we've met before, remember? my name is...";

var context = kernel.CreateNewContext();
context["ask"] = ask;
context[TextMemorySkill.CollectionParam] = MemoryCollectionName;

var resultContext = await skill.InvokeAsync(context);
resultContext.Result.Dump();

// output
/*
Hi there! Yes, I remember you. Your name is Andrea, right?
*/
Here the recall function passes the question to TextMemorySkill, which performs the search and returns the matching result, removing the need to search and inject the context by hand.
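The recall behaviour can also be tuned through context variables. This is an assumption about the same preview build: TextMemorySkill is expected to expose a RelevanceParam constant alongside CollectionParam, so check the constants defined on the class in your version before relying on it.

// Assumed tuning knob for this preview build: TextMemorySkill.RelevanceParam sets the
// minimum relevance score that "recall" will accept. Verify the constant exists in the
// TextMemorySkill class of your package version.
context[TextMemorySkill.RelevanceParam] = "0.8"; // only recall reasonably confident matches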
Persisting Memory
VolatileMemoryStore is, as the name suggests, volatile, yet the scenarios that call for Memory usually involve information that should live for a long time, or at least not expire right away. Persisting these embeddings is therefore well worth it: every embedding conversion is another API call, and API calls cost money.
The Semantic Kernel library already includes SQLite, Qdrant, and CosmosDB implementations, and if you want to extend it yourself, all you need to do is implement the IMemoryStore interface.
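For example, swapping the volatile store for a persistent one is just a change to the builder call. The following is a sketch that assumes the SQLite connector package (Microsoft.SemanticKernel.Connectors.Memory.Sqlite) and its SqliteMemoryStore.ConnectAsync factory; the package and method names may differ between preview builds.

// Sketch: persist embeddings to SQLite instead of keeping them in process memory.
// Assumes the Microsoft.SemanticKernel.Connectors.Memory.Sqlite connector; verify the
// package name and the ConnectAsync factory against the version you install.
using Microsoft.SemanticKernel.Connectors.Memory.Sqlite;

var store = await SqliteMemoryStore.ConnectAsync("memories.sqlite");

var kernel = Kernel.Builder.Configure(c =>
{
    c.AddOpenAITextCompletionService("openai", "text-davinci-003", Environment.GetEnvironmentVariable("MY_OPEN_AI_API_KEY"));
    c.AddOpenAIEmbeddingGenerationService("openai", "text-embedding-ada-002", Environment.GetEnvironmentVariable("MY_OPEN_AI_API_KEY"));
})
.WithMemoryStorage(store) // embeddings now survive process restarts, so each one is only paid for once
.Build();

Everything else, from SaveInformationAsync to SearchAsync to recall, works unchanged, since it all goes through the same IMemoryStore abstraction.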
As for the future, that will likely belong to dedicated vector databases.