# OpenDelta
An Open-Source Framework for Parameter-Efficient Tuning (Delta Tuning).
Overview • Installation • Basic Usage • Docs • Performance
## Overview
OpenDelta is a toolkit for parameter-efficient methods (which we dub *delta tuning*), with which users can flexibly assign (or add) a small number of parameters to update while keeping most parameters frozen. With OpenDelta, users can easily implement prefix-tuning, adapters, LoRA, or any other type of delta tuning with their preferred PTMs (a minimal sketch of the idea follows the note below).
- Our repo is tested on Python 3.8 and PyTorch 1.9.0. Lower versions may also be supported.
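To make the idea concrete, here is a minimal plain-PyTorch sketch (not the OpenDelta API) of the delta-tuning recipe: freeze the backbone and train only a small added module.

```python
# Minimal sketch of the delta-tuning idea in plain PyTorch (not the
# OpenDelta API): freeze every backbone parameter and train only a
# small, newly added "delta" module.
import torch.nn as nn

backbone = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=512, nhead=8), num_layers=6
)
delta = nn.Linear(512, 512)  # the small module that will be tuned

for p in backbone.parameters():
    p.requires_grad = False  # keep most parameters frozen

trainable = sum(p.numel() for p in delta.parameters())
total = trainable + sum(p.numel() for p in backbone.parameters())
print(f"trainable: {trainable} / {total} ({100 * trainable / total:.2f}%)")
```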
## Updates
- 2022.03.24 We noticed several bugs in Soft Prompt Tuning and Prefix Tuning, mainly due to their need to customize attention ids and token_type_ids; we are fixing them! For now, please use the other methods, which are more stable and perform better.
- 2022.03.20 Added a Colab example to illustrate efficient training and space-saving multitask serving.
- 2022.03.20 A new pip version was released.
- 2022.02.16 Added support for regular expressions in name-based addressing.
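To illustrate name-based addressing, here is a hedged sketch with an adapter delta; the module key below is an example for a BERT backbone, and the commented regex line is an assumption about the syntax for the 2022.02.16 update, so double-check it against the docs.

```python
# Sketch of name-based addressing. The module key "output" is an
# example for a BERT backbone; check `model.named_modules()` for the
# real names in your model.
from transformers import AutoModel
from opendelta import AdapterModel

model = AutoModel.from_pretrained("bert-base-uncased")

# Plain key matching: insert adapters at modules whose names end with "output".
delta_model = AdapterModel(backbone_model=model, modified_modules=["output"])

# Per the 2022.02.16 update, regular expressions are also accepted;
# the exact syntax below is an assumption, so verify it in the docs:
# delta_model = AdapterModel(backbone_model=model,
#                            modified_modules=["[r]encoder\\.layer\\.[0-5]\\.output"])

delta_model.log()  # inspect where the adapters were inserted
```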
## Installation
### Create a virtual environment (optional)
```bash
conda create -n opendelta_env python=3.8
conda activate opendelta_env
```
### Using Pip
Install OpenDelta using pip as follows:
```bash
pip install opendelta
```
To play with the latest features, you can also install OpenDelta from source.
### Build from Source
```bash
git clone https://github.com/thunlp/OpenDelta.git
cd OpenDelta
```
Option 1: if you won't modify the code, run
```bash
python setup.py install
```
Option 2: if you want to modify the code or keep the repo updated via git, run
```bash
python setup.py develop
```
If you encounter a network error when running setup.py, first install the dependencies via
```bash
pip install -r requirements.txt && python setup.py develop
```
## Must Try
```python
# Load a pretrained backbone model from Hugging Face Transformers.
from transformers import AutoModelForSeq2SeqLM
t5 = AutoModelForSeq2SeqLM.from_pretrained("t5-base")

# Attach a finetuned LoRA delta shared on DeltaHub to the backbone.
from opendelta import AutoDeltaModel
delta = AutoDeltaModel.from_finetuned("DeltaHub/lora_t5-base_mrpc", backbone_model=t5)
delta.log()  # print the backbone structure with the injected delta modules
```
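From here, a typical next step is to freeze the backbone and continue tuning only the delta. A minimal sketch, assuming the snippet above (method names follow the OpenDelta docs; the directory name is a placeholder):

```python
# Freeze everything except the delta modules, fine-tune as usual,
# then save only the (tiny) delta weights.
delta.freeze_module(exclude=["deltas"])  # the backbone stays frozen
# ... run your training loop on `t5`; only delta parameters get gradients ...
delta.save_finetuned("./my_lora_t5-base")  # placeholder path; checkpoint holds the delta only
```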
## Verified Supported Models
- You can try OpenDelta on any backbone model based on PyTorch.
- However, there is a small chance that the interface of the backbone model's submodules is not supported. We have therefore verified some commonly used models that OpenDelta is sure to support.
- We will keep testing more and more emerging models.
- Pull requests are welcome if you successfully apply OpenDelta to your own backbone model (see the sketch after the table below).
| Model | LoRA | Bias Tuning | Adapter Houlsby | Adapter Pfeiffer | Adapter Drop | Adapter Low-Rank | Compacter | Prefix Tuning | Prompt Tuning |
|---|---|---|---|---|---|---|---|---|---|
| T5 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| GPT-2 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | |
| BART | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | |
| DistilBERT | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | |
| RoBERTa | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | |
| BERT | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| T5-3b (parallel) | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Deberta-v2 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | |
| CTRL | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | |
| ViT | ✅ | | | | | | | | |
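As mentioned above, OpenDelta can in principle be applied to your own backbone. A hedged sketch follows: `TinyBackbone` and its module names are invented for illustration; point `modified_modules` at the layers you want to reparameterize.

```python
# Hedged sketch: injecting LoRA into a custom PyTorch backbone.
import torch
import torch.nn as nn
from opendelta import LoraModel

class TinyBackbone(nn.Module):
    """An invented toy backbone; any PyTorch model can stand in here."""
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(128, 128)
        self.head = nn.Linear(128, 2)

    def forward(self, x):
        return self.head(torch.relu(self.proj(x)))

model = TinyBackbone()
# Reparameterize the `proj` linear layer with LoRA.
delta_model = LoraModel(backbone_model=model, modified_modules=["proj"])
delta_model.freeze_module(exclude=["deltas"])  # only LoRA parameters stay trainable
delta_model.log()  # verify where LoRA was injected
```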
## Performance-Checked Combinations
See the Google sheet here.
Subject to change at any moment.