Merge branch 'main' of https://github.com/hiyouga/LLaMA-Factory
commit 226a9e563f

@@ -41,7 +41,7 @@ def prepare_4d_attention_mask(attention_mask_with_indices: "torch.Tensor", dtype
                 [x, x, o, x, x, x],
                 [x, x, o, o, x, x],
                 [x, x, o, o, o, x],
-                [x, x, o, x, x, x],
+                [x, x, x, x, x, x],
             ]
         ]
     ]
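For context, the corrected row is the padding position of the docstring's example input (apparently something like `[[1, 1, 2, 2, 2, 0]]`, i.e. two packed sequences followed by one padding token), which should not be able to attend to any token. Below is a minimal sketch of how such a packed-sequence 4D mask can be built, mirroring the signature of `prepare_4d_attention_mask` shown in the hunk header; the function name `build_4d_packed_mask` and the use of the dtype's minimum as the blocking value are assumptions, not LLaMA-Factory's actual implementation.

```python
import torch

def build_4d_packed_mask(attention_mask_with_indices: torch.Tensor, dtype: torch.dtype) -> torch.Tensor:
    """Sketch: expand a (batch, seq_len) packed-sequence index mask to (batch, 1, seq_len, seq_len)."""
    bsz, seq_len = attention_mask_with_indices.size()
    min_value = torch.finfo(dtype).min
    # mask_j[b, 0, i, j] holds the sequence index of the key token j;
    # its transpose mask_i holds the sequence index of the query token i.
    mask_j = attention_mask_with_indices[:, None, None, :].expand(bsz, 1, seq_len, seq_len)
    mask_i = mask_j.transpose(-1, -2)
    # A query may attend to a key only if both belong to the same packed
    # sequence, the key is not padding (index 0), and the key is not in
    # the future (causal, lower-triangular).
    causal = torch.tril(torch.ones(seq_len, seq_len, dtype=torch.bool)).view(1, 1, seq_len, seq_len)
    allowed = mask_i.eq(mask_j) & mask_j.ne(0) & causal
    # Allowed positions become 0, blocked positions the dtype's minimum,
    # so the padding row ends up fully blocked: all "x", as in the fix above.
    zeros = torch.zeros((), dtype=dtype)
    blocked = torch.full((), min_value, dtype=dtype)
    return torch.where(allowed, zeros, blocked)

# The padding token (last position) cannot attend to anything, matching the corrected last row.
print(build_4d_packed_mask(torch.tensor([[1, 1, 2, 2, 2, 0]]), torch.float32)[0, 0])
```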