"modelscope加载模型,如何指定gpu卡?
tokenizer = AutoTokenizer.from_pretrained(
model_path_image_interpretation_1, trust_remote_code=True
)
# pipe = AutoModelForCausalLM.from_pretrained(
# model_path_image_interpretation_1,
# device_map=""auto"",
# trust_remote_code=True,
# fp16=True,
# ).eval()
os.environ[""CUDA_LAUNCH_BLOCKING""] = ""0""
pipe = AutoModel.from_pretrained(
model_path_image_interpretation_1,
torch_dtype=torch.bfloat16,
low_cpu_mem_usage=True,
trust_remote_code=True,
device_map=""auto"",
).eval()"