Preface
https://www.modelscope.cn/models/qwen/Qwen2-7B-Instruct
You can actually just follow that official page to get everything set up, but it's easy to run into a few problems along the way. I'm new to this, so I'm writing down my notes here.
Environment

Python 3.10

- Create a fresh conda environment on your own machine, or
- rent a cloud server; an RTX 3090 costs only one or two RMB per hour.

I personally ran everything on a rented server.
Detailed steps
Install the dependencies
```shell
pip install transformers==4.37.0 accelerate tiktoken einops scipy transformers_stream_generator==0.0.4 peft deepspeed
pip install modelscope
```
If pip can't find one of the dependencies, you can download its zip or tar archive from the web and install it by absolute path:

```shell
pip install <absolute path to the package archive>
```
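As a quick sanity check (my own addition, not part of the original guide), you can list which of the required packages are actually installed, and at what versions, using only the standard library:

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(pkg):
    """Return the installed version of pkg, or None if it is missing."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None

# Package names taken from the pip commands above
for pkg in ["transformers", "modelscope", "accelerate", "peft", "deepspeed"]:
    print(pkg, installed_version(pkg) or "not installed")
```

If `transformers` prints a version below 4.37.0, reinstall it at the pinned version before continuing.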
Download the model
I didn't pre-download the model; the code below downloads it automatically. How long that takes depends mostly on your network speed. On my rented server the download was fast, about ten minutes.
```python
from modelscope import AutoModelForCausalLM, AutoTokenizer

device = "cuda"  # the device to load the model onto

model = AutoModelForCausalLM.from_pretrained(
    "qwen/Qwen2-7B-Instruct",
    torch_dtype="auto",
    device_map="auto"
)
```
Create the tokenizer
```python
tokenizer = AutoTokenizer.from_pretrained("qwen/Qwen2-7B-Instruct")
```
Set the prompt
```python
prompt = "给我讲一个故事。"
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": prompt}
]
text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)
```
The conversation opens with a system role whose content, "You are a helpful assistant.", defines the model's behavior. With add_generation_prompt=True, the template then appends an open assistant turn, so when the user sends a message (here, the prompt asking for a story), the model replies in the assistant role.
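To build intuition for what apply_chat_template returns, note that Qwen2's chat template is a ChatML-style format with `<|im_start|>`/`<|im_end|>` markers. The helper below is a simplified re-implementation for illustration only; the authoritative template ships with the tokenizer, so treat this as a sketch of the rough shape of the output, not the exact string:

```python
def chatml_render(messages, add_generation_prompt=True):
    # Simplified sketch of the ChatML-style format used by Qwen2 chat
    # templates; the real template is defined by the tokenizer itself.
    text = ""
    for m in messages:
        text += f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
    if add_generation_prompt:
        # Leave the assistant turn open so the model generates the reply
        text += "<|im_start|>assistant\n"
    return text

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "给我讲一个故事。"},
]
print(chatml_render(messages))
```

Because the string ends with an open `<|im_start|>assistant` turn, whatever the model generates next is, by construction, the assistant's reply.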
Tokenize the input with the tokenizer
```python
model_inputs = tokenizer([text], return_tensors="pt").to(device)
```
Result of running the code
Full code
```python
from modelscope import AutoModelForCausalLM, AutoTokenizer

device = "cuda"  # the device to load the model onto

model = AutoModelForCausalLM.from_pretrained(
    "qwen/Qwen2-7B-Instruct",
    torch_dtype="auto",
    device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("qwen/Qwen2-7B-Instruct")

prompt = "给我讲一个故事。"
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": prompt}
]
text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)
model_inputs = tokenizer([text], return_tensors="pt").to(device)

generated_ids = model.generate(
    model_inputs.input_ids,
    max_new_tokens=512
)
# generate() returns the prompt tokens followed by the new tokens;
# strip the prompt prefix so only the response is decoded
generated_ids = [
    output_ids[len(input_ids):] for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids)
]
response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(response)
```
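One detail in the full script worth understanding: model.generate() returns each prompt's tokens followed by the newly generated tokens, which is why the list comprehension slices every output sequence with len(input_ids). A toy example with made-up token ids shows the effect:

```python
# Made-up token ids purely for illustration
input_ids = [[101, 102, 103]]           # the tokenized prompt (batch of 1)
generated = [[101, 102, 103, 7, 8, 9]]  # generate() echoes the prompt, then appends new tokens

# Same slicing as in the script: drop the prompt prefix from each sequence
new_tokens = [out[len(inp):] for inp, out in zip(input_ids, generated)]
print(new_tokens)  # [[7, 8, 9]]
```

Without this step, batch_decode would print your own prompt back at you before the model's answer.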
Troubleshooting
Error: `not found in your environment: transformers_stream_generator`

This is a transformers version problem: uninstall it and install the matching version. The official page states the requirement, transformers>=4.37.0.
Error: `NoValidRevisionError: The model: qwen/Qwen2-7B-Instruct has no valid revision!`

This is a modelscope version problem; update it with `pip install --upgrade modelscope`.