(autodelta)=
# AutoDelta Mechanism
Inspired by HuggingFace Transformers' AutoClasses, we provide the AutoDelta feature, which lets users

- Easily experiment with different delta models.
- Quickly deploy from a configuration file, especially from repos in DeltaHub.
- Easily load from a dict, making it simple to switch the type of delta model (see the sketch after the config example below).
```python
from opendelta import AutoDeltaConfig, AutoDeltaModel
from transformers import T5ForConditionalGeneration

backbone_model = T5ForConditionalGeneration.from_pretrained("t5-base")
```
We can load a config from a dict:
```python
config_dict = {
    "delta_type": "lora",
    "modified_modules": [
        "SelfAttention.q",
        "SelfAttention.v",
        "SelfAttention.o",
    ],
    "lora_r": 4,
}
delta_config = AutoDeltaConfig.from_dict(config_dict)
```
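Because the config is just a dict, switching to a different delta model is a one-key change. Below is a minimal sketch, assuming `"adapter"` is among the registered delta types and that it accepts a `bottleneck_dim` hyperparameter; the `modified_modules` targets are illustrative, so check your OpenDelta version for the exact names:

```python
# Same workflow, different delta type: only the dict changes.
adapter_dict = {
    "delta_type": "adapter",                 # swap "lora" for another registered type
    "modified_modules": ["SelfAttention"],   # hypothetical target; adjust per backbone
    "bottleneck_dim": 24,                    # adapter-specific hyperparameter (assumed name)
}
adapter_config = AutoDeltaConfig.from_dict(adapter_dict)
```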
Then use the config to add a delta model to the backbone model:
```python
delta_model = AutoDeltaModel.from_config(delta_config, backbone_model=backbone_model)

# Now visualize the modified backbone_model
from opendelta import Visualization
Visualization(backbone_model).structure_graph()
```
```{figure} ../imgs/t5lora.png
---
width: 600px
name: t5lora
---
```
Quickly deploy from a finetuned delta checkpoint on DeltaHub:
```python
delta_model = AutoDeltaModel.from_finetuned("DeltaHub/sst2-t5-base", backbone_model=backbone_model)  # TODO: the link may change.
```
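The complementary step is saving your own finetuned delta checkpoint for later deployment. A minimal sketch, assuming OpenDelta provides a `save_finetuned` method mirroring HuggingFace's `save_pretrained`; the local path is hypothetical:

```python
# After training, persist only the delta parameters plus the config.
delta_model.save_finetuned("./my-sst2-t5-base-lora")  # hypothetical local path

# Later (or on another machine), reattach the saved delta to a fresh backbone.
backbone_model = T5ForConditionalGeneration.from_pretrained("t5-base")
delta_model = AutoDeltaModel.from_finetuned("./my-sst2-t5-base-lora", backbone_model=backbone_model)
```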
**Hash checking**

Since a delta model only works together with the backbone model it was trained on, we automatically check whether you load the delta model the same way it was trained. We compute the trained model's [md5](http://some_link) and save it to the config. After loading the delta model, we re-compute the md5 to see whether it has changed.

Pass `check_hash=False` to disable hash checking.
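Conceptually, the check amounts to hashing a canonical description of the backbone and comparing it against the value stored in the config. The sketch below is illustrative only, not OpenDelta's actual implementation; the helper and the config field name are hypothetical:

```python
import hashlib

def backbone_md5(model) -> str:
    """Hypothetical helper: hash a canonical description of the backbone.

    Hashing parameter names and shapes (rather than raw weights) captures
    whether the architecture the delta attaches to has changed.
    """
    description = ";".join(
        f"{name}:{tuple(param.shape)}" for name, param in model.named_parameters()
    )
    return hashlib.md5(description.encode("utf-8")).hexdigest()

# At save time: record the hash in the delta config (illustrative field name).
# delta_config.backbone_hash = backbone_md5(backbone_model)

# At load time: recompute and compare, mimicking what check_hash=True guards.
# assert backbone_md5(backbone_model) == delta_config.backbone_hash, "backbone mismatch"
```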