Commit 7944cbc576 by BUAADreamer, 2024-05-11 13:11:10 +08:00
7 changed files with 21 additions and 7 deletions

@@ -310,13 +310,19 @@ huggingface-cli login
 ### Installation
 
+> [!IMPORTANT]
+> Installation is mandatory.
+
 ```bash
 git clone https://github.com/hiyouga/LLaMA-Factory.git
 cd LLaMA-Factory
-pip install -e .[metrics]
+pip install -e .[torch,metrics]
 ```
 
-Extra dependencies available: metrics, deepspeed, bitsandbytes, vllm, galore, badam, gptq, awq, aqlm, qwen, modelscope, quality
+Extra dependencies available: torch, metrics, deepspeed, bitsandbytes, vllm, galore, badam, gptq, awq, aqlm, qwen, modelscope, quality
 
+> [!TIP]
+> Use `pip install --no-deps -e .` to resolve package conflicts.
+
 <details><summary>For Windows users</summary>

@@ -310,13 +310,19 @@ huggingface-cli login
 ### Install LLaMA Factory
 
+> [!IMPORTANT]
+> This step is required.
+
 ```bash
 git clone https://github.com/hiyouga/LLaMA-Factory.git
 cd LLaMA-Factory
-pip install -e .[metrics]
+pip install -e .[torch,metrics]
 ```
 
-Optional extra dependencies: metrics, deepspeed, bitsandbytes, vllm, galore, badam, gptq, awq, aqlm, qwen, modelscope, quality
+Optional extra dependencies: torch, metrics, deepspeed, bitsandbytes, vllm, galore, badam, gptq, awq, aqlm, qwen, modelscope, quality
 
+> [!TIP]
+> If you encounter package conflicts, use `pip install --no-deps -e .` to resolve them.
+
 <details><summary>Guide for Windows users</summary>

Binary image file changed; not shown (141 KiB before, 186 KiB after).

@@ -41,7 +41,7 @@
   },
   "glaive_toolcall": {
     "file_name": "glaive_toolcall_10k.json",
-    "file_sha1": "a6917b85d209df98d31fdecb253c79ebc440f6f3",
+    "file_sha1": "36aea64548fbf6aa300bef411b9221092ed84902",
     "formatting": "sharegpt",
     "columns": {
       "messages": "conversations",
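The updated `file_sha1` field pins the dataset file to a checksum. A minimal sketch of how such a checksum can be verified with the standard library (the `verify_sha1` helper is illustrative, not part of this commit):

```python
import hashlib

def verify_sha1(path: str, expected: str, chunk_size: int = 1 << 20) -> bool:
    """Stream the file through SHA-1 and compare against the pinned digest."""
    digest = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected

# Sanity check on an in-memory value: sha1(b"hello") is well known.
print(hashlib.sha1(b"hello").hexdigest())
# aaf4c61ddcc5e8a2dabede0f3b482cd9aea9434d
```

Streaming in chunks keeps memory flat even for multi-gigabyte dataset files.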

@@ -1,4 +1,3 @@
-torch>=1.13.1
 transformers>=4.37.2
 datasets>=2.14.3
 accelerate>=0.27.2

@@ -20,6 +20,7 @@ def get_requires():
     extra_require = {
+        "torch": ["torch>=1.13.1"],
         "metrics": ["nltk", "jieba", "rouge-chinese"],
         "deepspeed": ["deepspeed>=0.10.0,<=0.14.0"],
         "bitsandbytes": ["bitsandbytes>=0.39.0"],

@@ -1,6 +1,8 @@
 import os
 from contextlib import asynccontextmanager
-from typing import Annotated, Optional
+from typing import Optional
+
+from typing_extensions import Annotated
 
 from ..chat import ChatModel
 from ..extras.misc import torch_gc
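`Annotated` only joined the `typing` module in Python 3.9, so importing it from `typing_extensions` keeps the API module importable on older interpreters. A minimal sketch of the compatible pattern (the `greet` function is illustrative, not from the repo):

```python
try:
    # typing_extensions backports Annotated for Python < 3.9
    from typing_extensions import Annotated
except ImportError:
    # stdlib location since Python 3.9
    from typing import Annotated

def greet(name: Annotated[str, "display name"]) -> str:
    """Annotated attaches metadata without changing the runtime type."""
    return f"Hello, {name}!"

print(greet("world"))  # Hello, world!
```

Frameworks such as FastAPI read the metadata half of `Annotated` to drive parameter validation, which is why the API server depends on it.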