I deployed the model with vLLM and it loads successfully, but calls to the /v1/chat/completions endpoint sometimes return this error: {"object":"error","message":"The model DeepSeek-R1-0528-Qwen3-8B does not exist.","type":"NotFoundError","param":null,"code":404}. What could be causing this?
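A common cause of this error is a mismatch between the "model" field sent in the request and the model ID the server actually registered at startup (when no --served-model-name is passed, vLLM typically registers the model path given on the command line). The sketch below illustrates the mismatch with a hypothetical /v1/models response; against a live server you would fetch the real list from GET /v1/models and compare:

```python
import json

# Hypothetical response from GET /v1/models on the vLLM server; the real one
# comes from the running server, and the ID shown here is illustrative.
models_response = json.loads("""
{"object": "list",
 "data": [{"id": "/models/DeepSeek-R1-0528-Qwen3-8B", "object": "model"}]}
""")

served_ids = [m["id"] for m in models_response["data"]]
print(served_ids)

# A request whose "model" field is the bare name will get a 404 NotFoundError
# if the server registered the full path instead of the short name.
requested = "DeepSeek-R1-0528-Qwen3-8B"
print(requested in served_ids)  # prints False -> server answers 404
```

If the IDs differ, either send the exact ID returned by /v1/models in the request, or restart the server with --served-model-name DeepSeek-R1-0528-Qwen3-8B so the short name is registered. If the error is intermittent, it is also worth checking whether requests are load-balanced across several server instances that were started with different served model names.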

commit 42931deb16 (parent 5ee5fa4832)
Author: changchang123456
Date: 2025-06-05 06:13:25 +00:00
