Commit Graph

2099 Commits

Author SHA1 Message Date
hiyouga fb0c400116 fix ppo trainer 2024-07-10 11:05:45 +08:00
hiyouga 2f09520c0d fix #4742 2024-07-09 23:24:24 +08:00
hiyouga 86b1594823 Update wechat.jpg 2024-07-09 09:25:11 +08:00
hoshi-hiyouga 563a27dab7 Merge pull request #4706 from T-Atlas/main (chore: Update vllm_engine.py to support vllm version >= 0.5.1) 2024-07-07 15:50:38 +08:00
hoshi-hiyouga f84b007ebb Update packages.py 2024-07-07 15:48:29 +08:00
Lian Junhong 322663bf90 chore: Update vllm_engine.py to support vllm version >= 0.5.1 2024-07-07 15:08:12 +08:00
hiyouga a15782cb9f fix #4705 2024-07-07 13:10:06 +08:00
marko1616 e0562521bb Update utils.py
On Windows, a multiline command should look like:
command --arg1 xxx `
--arg2 xxx `
2024-07-06 20:40:13 +08:00
hiyouga 53b1002fb7 add codegeex4, internlm2.5 2024-07-06 16:16:47 +08:00
hiyouga c9bb0757ec update pissa example 2024-07-06 15:47:32 +08:00
codingma 76f3bbcfc0 1. add custom eval dataset support; 2. merge load dataset and split dataset function 2024-07-05 15:52:10 +08:00
hiyouga 9f33f1edf5 fix processors 2024-07-05 08:33:22 +08:00
hiyouga e43809bced fix #4683 2024-07-05 00:58:05 +08:00
hiyouga ed232311e8 fix #4674 2024-07-05 00:41:03 +08:00
hiyouga 226a9e563f Merge branch 'main' of https://github.com/hiyouga/LLaMA-Factory 2024-07-04 14:23:37 +08:00
hiyouga 1e27e8c776 fix #4677 2024-07-04 14:22:07 +08:00
hoshi-hiyouga 07d96d497c Merge pull request #4673 from hzhaoy/main (tiny fix) 2024-07-04 10:40:41 +08:00
hzhaoy 738df47748 tiny fix 2024-07-04 10:20:28 +08:00
hiyouga 636bb9c1e6 update tests 2024-07-04 04:00:12 +08:00
hiyouga 0c699de39d tiny fix 2024-07-04 03:47:05 +08:00
hiyouga 44747cebd2 tiny fix 2024-07-04 03:02:23 +08:00
hiyouga b5d101e1bf fix data map for packing 2024-07-04 03:01:31 +08:00
hiyouga b03e4a74ba update wechat 2024-07-04 01:55:05 +08:00
hiyouga 6fd6aa4530 fix packing for eager/sdpa attn 2024-07-04 01:52:43 +08:00
hoshi-hiyouga 87d9b2d005 Merge pull request #4224 from chuan298/main (Implement efficient packing without cross-contamination attention) 2024-07-04 01:18:54 +08:00
hiyouga cce7083024 update packing 2024-07-04 01:10:55 +08:00
hoshi-hiyouga a36e8f2dd5 Update packing.py 2024-07-03 23:36:01 +08:00
hiyouga c346f79f99 update func name 2024-07-03 23:29:33 +08:00
hiyouga 8a6a7b9c8a update arg name 2024-07-03 23:23:24 +08:00
hiyouga 575a02a23d update hparams 2024-07-03 23:18:58 +08:00
hiyouga 7f770f6895 update ui 2024-07-03 23:13:49 +08:00
hiyouga a4a1ddbcb9 test 2024-07-03 23:05:39 +08:00
hiyouga 1e0c860c8c update scripts 2024-07-03 20:07:44 +08:00
hiyouga 8845e94f91 fix #4609 (unwrap_model_for_generation(reward_model) is necessary for zero3 training) 2024-07-03 19:45:51 +08:00
hiyouga 87346c0946 update readme 2024-07-03 19:39:05 +08:00
hoshi-hiyouga 3449c3531f Merge pull request #4662 from wzh1994/wzh/readme (Add `LazyLLM` to `Projects using LLaMA Factory` in `README.md`) 2024-07-03 15:51:02 +08:00
wangzhihong 6f8f53f879 Update README_zh.md 2024-07-03 14:59:09 +08:00
wangzhihong 22da47ba27 add LazyLLM to `Projects using LLaMA Factory` in `README.md` 2024-07-03 11:12:20 +08:00
hiyouga 8b1172b910 tiny fix 2024-07-03 02:31:50 +08:00
hiyouga 71cdf8956e tiny fix 2024-07-02 23:06:13 +08:00
hiyouga 821bb6660e remove rlhf support for chatglm2&3 2024-07-02 23:03:17 +08:00
hiyouga c13ae2df19 upcast logits 2024-07-02 22:32:05 +08:00
hiyouga c47ab6c072 improve rlhf 2024-07-02 22:23:08 +08:00
ancv e8e13b0942 move efficient_packing from data_args to model_args 2024-07-02 18:37:55 +07:00
hiyouga 9dcff3a5b5 Update bug-report.yml 2024-07-02 19:18:56 +08:00
hiyouga c81687963a Update bug-report.yml 2024-07-02 19:16:12 +08:00
hoshi-hiyouga 4e4b3cc905 Merge pull request #4651 from hzhaoy/add-telechat-1b (Add TeleChat-1B) 2024-07-02 17:56:43 +08:00
hzhaoy 57b7c00430 add TeleChat-1B 2024-07-02 17:49:04 +08:00
hiyouga 4c296001c4 fix ppo callbacks 2024-07-02 17:34:56 +08:00
hoshi-hiyouga e8e6af2651 Merge branch 'main' into main 2024-07-01 21:01:09 +08:00