Author | Commit | Message | Date
wql | 42d9773188 | feat: add npu status | 2024-09-06 14:03:54 +08:00
wql | 113966157c | fix: fix max steps | 2024-09-05 15:54:33 +08:00
wql | f15e37dfad | fix: fix bf16 | 2024-09-05 15:49:32 +08:00
wql | e754b62ccd | fix: test | 2024-09-05 15:28:27 +08:00
wql | 2248960fe7 | fix: fix format | 2024-09-05 15:18:37 +08:00
wql | 3d018c8248 | fix: fix typo | 2024-09-05 15:17:39 +08:00
wql | 5baa46a798 | fix: test fix | 2024-09-05 15:15:37 +08:00
wql | 62a486dfc0 | add: add test file | 2024-09-05 07:07:49 +00:00
wql | c6a4d43c06 | fix: remove no need test file | 2024-09-05 07:05:47 +00:00
wql | ab4bf8bd4d | add: add all test results | 2024-09-05 06:52:33 +00:00
wql | f71f62f2f6 | fix: fix typo | 2024-09-05 14:47:17 +08:00
wql | ba90bf1625 | chore: sort run_once | 2024-09-05 14:41:48 +08:00
wql | 36840f0310 | fix: fix baichuan2 path | 2024-09-05 14:35:47 +08:00
wql | 190fddf27d | fix: small change | 2024-09-05 13:37:17 +08:00
wql | 64044380bd | fix: add not supported model err msg | 2024-09-05 13:21:38 +08:00
wql | 8162a54aa5 | fix: small fix | 2024-09-05 13:10:34 +08:00
wql | 95b4b493e6 | chore: add echo | 2024-09-05 13:09:43 +08:00
wql | ceb01459fe | fix: fix bug | 2024-09-05 13:04:54 +08:00
wql | cc99691cf4 | fix: fix file type | 2024-09-05 13:03:22 +08:00
wql | 4058fd7d64 | fix: remove no use import | 2024-09-05 13:01:49 +08:00
wql | 846fb7bfef | fix: fix space | 2024-09-05 12:59:52 +08:00
wql | f23e9d417e | fix: fix space | 2024-09-05 12:59:02 +08:00
wql | 0cf37e5ec1 | fix: fix para | 2024-09-05 12:57:41 +08:00
wql | ae308991fb | fix: fix first line | 2024-09-05 12:54:15 +08:00
wql | 3e548489ed | feat: done easy run | 2024-09-05 11:28:19 +08:00
wql | fa9a9007f9 | chore: add lora sft and predict template yaml file | 2024-09-04 16:52:15 +08:00
wql | a61372ee0f | chore: change gpu status save folder | 2024-09-04 16:24:15 +08:00
wql | 91db09b6c7 | feat: add gpu_status.py | 2024-09-04 16:15:17 +08:00
wql | e7a7a22f19 | feat: add save_state in prediction | 2024-09-04 16:08:57 +08:00
wql | 7af7b40955 | chore: change docker to ubuntu22.04 | 2024-09-04 09:26:09 +08:00
wql | d2a4d909c1 | chore: change docker env to ubuntu 20.04 | 2024-08-29 11:19:48 +08:00
wql | e0a4a0e451 | chore: change pip to tuna source in docker | 2024-08-29 09:33:31 +08:00
hiyouga | 0f5a0f64f7 | update wechat | 2024-08-27 12:55:23 +08:00
hiyouga | d14edd350d | add extra requires | 2024-08-27 12:52:12 +08:00
hiyouga | f6ae4e75dd | tiny fix | 2024-08-27 12:49:32 +08:00
hoshi-hiyouga | dbe886ae5c | Merge pull request #5237 from marko1616/patch-1 (Fix mllm api) | 2024-08-27 12:24:43 +08:00
marko1616 | df8d5b6985 | ruff pass. | 2024-08-27 11:30:16 +08:00
marko1616 | 1545684c3f | Update chat.py | 2024-08-27 11:27:56 +08:00
hiyouga | 72bc8f0111 | support liger kernel | 2024-08-27 11:20:14 +08:00
marko1616 | 3a28521710 | Force re check. | 2024-08-23 14:43:18 +08:00
marko1616 | 8eb2092921 | Update chat.py | 2024-08-22 12:24:34 +08:00
marko1616 | a4f1de9d82 | Update chat.py | 2024-08-22 12:14:34 +08:00
hoshi-hiyouga | 36039b0fe0 | Merge pull request #5230 from MengqingCao/image ([NPU] Update npu base image) | 2024-08-21 22:13:07 +08:00
hiyouga | 8907150c1b | update wechat | 2024-08-21 22:07:34 +08:00
MengqingCao | b3f4acd1b4 | update npu base image | 2024-08-21 09:12:38 +00:00
hiyouga | c8b4c7fee5 | tiny fix | 2024-08-20 00:10:52 +08:00
hoshi-hiyouga | 15be296347 | Merge pull request #5156 from YeQiuO/main (fix Llama-template's system prompt bug) | 2024-08-20 00:09:03 +08:00
hoshi-hiyouga | ec72eeca52 | Update template.py | 2024-08-20 00:03:33 +08:00
hoshi-hiyouga | da335d42c3 | Merge pull request #5163 from liu-zichen/fix_ppo_optim (fix lr not change) | 2024-08-19 23:56:24 +08:00
hoshi-hiyouga | f59c9bef31 | Merge pull request #5185 from chenhuiyu/feature/add-sailorllm-template (Add SailorLLM template) | 2024-08-19 23:51:49 +08:00