Commit Graph

1593 Commits

Author SHA1 Message Date
BUAADreamer 7944cbc576 Merge branch 'main' of https://github.com/BUAADreamer/LLaMA-Factory 2024-05-11 13:11:10 +08:00
BUAADreamer 7be7972f28 add full parameter finetuning of mllm 2024-05-11 13:11:00 +08:00
BUAADreamer 15dab0677e Merge branch 'hiyouga:main' into main 2024-05-11 13:10:36 +08:00
codingma 101409d9cb Merge pull request #3661 from codemayq/main (fix sha1 of glaive_toolcall dataset) 2024-05-11 10:12:21 +08:00
kkkl b5c5c315a5 Update constants.py: fix the download issue of the Phi3 model 2024-05-11 00:22:40 +08:00
BUAADreamer 508d474754 Merge branch 'hiyouga:main' into main 2024-05-10 20:34:41 +08:00
codingma cd5bb2a0a1 update wechat 2024-05-10 17:25:25 +08:00
hiyouga 75aec4cf8e resolve python 3.8 package 2024-05-09 16:52:27 +08:00
codingma d5520b6017 fix sha1 of glaive_toolcall dataset 2024-05-09 16:33:45 +08:00
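The sha1 fix above updates the checksum recorded for the glaive_toolcall dataset file. As a rough illustration only (not the repository's own code, and the file path is a placeholder), a dataset file's SHA-1 can be verified like this:

```python
import hashlib

def file_sha1(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-1 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical path; compare the result with the checksum recorded for the dataset.
print(file_sha1("data/glaive_toolcall.json"))
```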
Tendo33 fd2e6dec58 1. Rename the is_fastapi_available function 2. Log incoming requests when deploying with vLLM 2024-05-09 14:28:01 +08:00
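The request-logging change above can be pictured with a minimal FastAPI sketch; this shows the general pattern only and is not the code added in the commit:

```python
import logging

from fastapi import FastAPI, Request

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("vllm_api")
app = FastAPI()

@app.middleware("http")
async def log_requests(request: Request, call_next):
    # Log the method and path of every incoming request before it is handled.
    logger.info("incoming request: %s %s", request.method, request.url.path)
    return await call_next(request)
```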
BUAADreamer 8b997e32fb add push processor to hub 2024-05-09 14:05:19 +08:00
BUAADreamer fdb3955448 add mllm processor save and Chinese-LLaVA-Med show 2024-05-09 13:53:39 +08:00
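The two processor commits above line up with the standard transformers workflow of saving a multimodal processor next to the exported model and pushing it to the Hugging Face Hub. A sketch under that assumption (model and repository names are placeholders):

```python
from transformers import AutoProcessor

# Load the multimodal (image + text) processor that pairs with the model.
processor = AutoProcessor.from_pretrained("llava-hf/llava-1.5-7b-hf")

# Save it next to the exported model so inference code can reload both together.
processor.save_pretrained("output/exported-mllm")

# Optionally publish it to the Hub (requires a prior `huggingface-cli login`).
processor.push_to_hub("your-username/exported-mllm")
```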
BUAADreamer 83f2f0de1d Merge branch 'hiyouga:main' into main 2024-05-09 13:45:43 +08:00
cocktailpeanut 3c11157a49 yet another removal of unnecessary environment variables 2024-05-09 01:33:20 -04:00
cocktailpeanut 425b9d6166 more removal of unnecessary environment variables 2024-05-09 01:32:00 -04:00
cocktailpeanut b783673e0a remove unnecessary environment variable usage 2024-05-09 01:26:15 -04:00
BUAADreamer ef33856380 add mllm export 2024-05-08 22:50:42 +08:00
hiyouga d9cdddd19c fix #3625 2024-05-08 17:12:56 +08:00
hiyouga 48ee46dac1 add llama3 chinese chat 2024-05-08 17:10:03 +08:00
hiyouga 10ab83f4c4 add deepseek moe 236B 2024-05-08 16:37:54 +08:00
hiyouga 2ba662faab Update wechat.jpg 2024-05-08 12:11:57 +08:00
BUAADreamer 0ca1d1967d modify export model 2024-05-08 10:36:36 +08:00
hiyouga b3a9ae4085 update readme 2024-05-07 22:17:04 +08:00
hiyouga 1ccbfe562d remove big file 2024-05-07 22:14:06 +08:00
hiyouga 92e9195b3c update readme 2024-05-07 21:17:31 +08:00
hiyouga 5177f3ba90 update readme 2024-05-07 19:03:47 +08:00
hiyouga 0f8f7d3b90 fix #3560 2024-05-07 19:03:35 +08:00
hoshi-hiyouga 3c560119ca Merge pull request #3601 from Katehuuh/main (add the Luminia contribution) 2024-05-07 18:01:48 +08:00
hiyouga b0888262e3 fix #3602 2024-05-07 17:50:27 +08:00
hoshi-hiyouga 6159acbaa0 Merge pull request #3604 from gaussian8/main (fix: split Dockerfile's CMD) 2024-05-07 16:53:23 +08:00
junwooo.lee 4598734a0d fix: split Dockerfile's CMD 2024-05-07 15:09:48 +09:00
Katehuuh 19a85bf52d Update README_zh.md: add Nekochu/Luminia-13B-v3 to the Projects section 2024-05-07 06:28:48 +02:00
Katehuuh 984f7fbbf7 Update README.md: add Nekochu/Luminia-13B-v3 to the Projects section 2024-05-07 06:23:36 +02:00
hiyouga 8e09e20ece update readme 2024-05-07 06:19:29 +08:00
hiyouga 09f3ef1de4 fix stop param 2024-05-07 00:41:04 +08:00
hoshi-hiyouga bcf7ec5ceb Merge pull request #3527 from zhaonx/dev (add support for vllm api stop parameter) 2024-05-07 00:37:49 +08:00
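For context on PR #3527 and the surrounding stop-parameter commits: in vLLM itself, stop strings are supplied through SamplingParams. A minimal sketch, with the model name and stop strings chosen only for illustration:

```python
from vllm import LLM, SamplingParams

llm = LLM(model="meta-llama/Meta-Llama-3-8B-Instruct")

# Generation halts as soon as any of the stop strings is produced.
params = SamplingParams(temperature=0.7, max_tokens=256, stop=["</s>", "Observation:"])

outputs = llm.generate(["Write a haiku about the sea."], params)
print(outputs[0].outputs[0].text)
```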
hoshi-hiyouga 17d0005b8c Update vllm_engine.py 2024-05-07 00:37:05 +08:00
hoshi-hiyouga f32eefae3d Update generating_args.py 2024-05-07 00:28:16 +08:00
hoshi-hiyouga 7ae7ae64f0 Update generating_args.py 2024-05-07 00:27:56 +08:00
hoshi-hiyouga d6ca7853fa Merge pull request #3588 from ZeyuTeng96/patch-1 (update hf_hub_url for nectar_rm in dataset_info) 2024-05-07 00:06:11 +08:00
hoshi-hiyouga c3910ab98a Update dataset_info.json 2024-05-07 00:05:45 +08:00
hiyouga f50c365871 update readme 2024-05-06 23:34:59 +08:00
hiyouga a153039380 fix gradio args 2024-05-06 23:33:06 +08:00
hoshi-hiyouga c8cd00bec6 Merge pull request #3596 from hiyouga/dev_doc (Add CLI document) 2024-05-06 23:10:38 +08:00
hiyouga 047313f48e update examples 2024-05-06 23:07:55 +08:00
hiyouga f02f87c6fb update example docs 2024-05-06 22:51:02 +08:00
hiyouga 34d33e2257 update docs 2024-05-06 21:47:00 +08:00
ZeyuTeng96 044af36442 update hf_hub_url for nectar_rm in dataset_info: the dataset "mlinmg/RLAIF-Nectar" is no longer available on the Hugging Face Hub and appears to have moved to "AstraMindAI/RLAIF-Nectar" (see https://huggingface.co/datasets/AstraMindAI/RLAIF-Nectar) 2024-05-06 16:44:50 +08:00
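The change behind this PR is a single key in dataset_info.json. A sketch of the equivalent edit done programmatically; only the nectar_rm entry and its hf_hub_url field come from the commit, the rest is assumed:

```python
import json

with open("data/dataset_info.json", encoding="utf-8") as f:
    dataset_info = json.load(f)

# Point the nectar_rm entry at the dataset's new location on the Hugging Face Hub.
dataset_info["nectar_rm"]["hf_hub_url"] = "AstraMindAI/RLAIF-Nectar"

with open("data/dataset_info.json", "w", encoding="utf-8") as f:
    json.dump(dataset_info, f, ensure_ascii=False, indent=2)
```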
zhouwei 28ae947161 Improve training efficiency on the Ascend 910A by fully utilizing the NPU through torch_npu, the PyTorch library optimized for NPUs, yielding roughly a tenfold speedup 2024-05-06 13:29:59 +08:00
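A minimal sketch of what running PyTorch on an Ascend NPU through torch_npu looks like, assuming torch_npu and the Ascend toolkit are installed; it is illustrative only and not the code changed in the commit above:

```python
import torch
import torch_npu  # registers the "npu" device type with PyTorch

device = torch.device("npu:0")

# A tiny forward/backward pass to confirm the NPU is doing the work.
model = torch.nn.Linear(1024, 1024).to(device)
x = torch.randn(32, 1024, device=device)
loss = model(x).sum()
loss.backward()
print(loss.item())
```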
zhaonx96 80645751bc add stop parameter in chat.py 2024-05-06 10:10:00 +08:00