update readme

hiyouga 2024-05-04 17:01:21 +08:00
parent e984ba3167
commit 57a39783d1
2 changed files with 2 additions and 2 deletions

@@ -339,7 +339,7 @@ To enable FlashAttention-2 on the Windows platform, you need to install the prec
 ### Train with LLaMA Board GUI (powered by [Gradio](https://github.com/gradio-app/gradio))
 > [!IMPORTANT]
-> LLaMA Board GUI only supports training on a single GPU, please use [CLI](#command-line-interface) for distributed training.
+> LLaMA Board GUI only supports training on a single GPU, please use [CLI](#train-with-command-line-interface) for distributed training.
 #### Use local environment
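The note touched by this hunk separates the single-GPU GUI launcher from the CLI path used for distributed runs. A minimal sketch of the two invocations, assuming the project's `llamafactory-cli` entry point; the GPU indices and `your_config.yaml` recipe file are placeholders, not paths from this commit:

```bash
# LLaMA Board GUI: single-GPU training only, so pin it to one device.
CUDA_VISIBLE_DEVICES=0 llamafactory-cli webui

# Multi-GPU distributed training goes through the CLI instead;
# two GPUs and a placeholder training recipe are assumed here.
CUDA_VISIBLE_DEVICES=0,1 llamafactory-cli train your_config.yaml
```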

@@ -339,7 +339,7 @@ pip install https://github.com/jllllll/bitsandbytes-windows-webui/releases/downl
 ### Train with the LLaMA Board GUI (powered by [Gradio](https://github.com/gradio-app/gradio))
 > [!IMPORTANT]
-> The LLaMA Board GUI currently supports single-GPU training only; please use the [command line interface](#命令行接口) for multi-GPU distributed training.
+> The LLaMA Board GUI currently supports single-GPU training only; please use the [command line interface](#利用命令行接口训练) for multi-GPU distributed training.
 #### Use local environment