forked from p04798526/LLaMA-Factory-Mirror
update readme
This commit is contained in:
parent c9b166615c
commit 8ede3128df
@@ -475,6 +475,9 @@ python src/export_model.py \
     --export_dir path_to_export
 ```
 
+> [!WARNING]
+> Merging LoRA weights into a GPTQ quantized model is not supported.
+
 ### API Demo
 
 ```bash
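The warning above has a simple linear-algebra reason: merging a LoRA adapter means folding the low-rank update `B @ A` into the base weight matrix, which requires the base weights in full precision. A GPTQ model stores weights as packed low-bit integers with scales and zero-points, so there is no full-precision matrix to add into. A minimal NumPy sketch of the merge (illustrative only, not LLaMA-Factory's implementation; the sizes and scale are hypothetical):

```python
import numpy as np

# Hypothetical sizes: an 8x8 layer with a rank-2 LoRA adapter.
rng = np.random.default_rng(0)
d, k, r = 8, 8, 2
W = rng.normal(size=(d, k)).astype(np.float32)  # base weight (full precision)
A = rng.normal(size=(r, k)).astype(np.float32)  # LoRA down-projection
B = rng.normal(size=(d, r)).astype(np.float32)  # LoRA up-projection
scale = 2.0                                     # lora_alpha / r (assumed)

# Merging folds the low-rank update into the base matrix: W' = W + scale * B @ A.
W_merged = W + scale * (B @ A)

# A forward pass with the merged matrix matches base-plus-adapter computation.
x = rng.normal(size=(k,)).astype(np.float32)
assert np.allclose(W_merged @ x, W @ x + scale * (B @ (A @ x)), atol=1e-5)

# With GPTQ, W exists only as packed low-bit integers plus scales/zeros,
# so the float update B @ A cannot be added in without dequantizing first.
```

This is why the export path works for full- or half-precision base models but is rejected for GPTQ-quantized ones.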
@@ -475,6 +475,9 @@ python src/export_model.py \
     --export_dir path_to_export
 ```
 
+> [!WARNING]
+> Merging and exporting LoRA weights for a GPTQ quantized model are not yet supported.
+
 ### API Service
 
 ```bash