
OpenDelta

An Open-Source Framework for Parameter-Efficient Tuning (Delta Tuning).


Overview • Installation • Basic Usage • Docs • Performance


Overview

OpenDelta is a toolkit for parameter-efficient tuning methods (which we dub delta tuning), with which users can flexibly assign (or add) a small number of parameters to update while keeping most parameters frozen. With OpenDelta, users can easily implement prefix-tuning, adapters, LoRA, or any other type of delta tuning with their preferred PTMs.

  • Our repo is tested on Python 3.8 and PyTorch 1.9.0. Lower versions may also be supported.

  • A demo of using OpenDelta to modify a PLM (e.g., BART), showing how the PLM changes under delta tuning; a minimal sketch is given below.
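
To make the mechanism concrete, here is a minimal sketch of the core workflow, assuming the AdapterModel, Visualization, and freeze_module APIs behave as in the OpenDelta docs (argument names may differ across versions):

from transformers import AutoModelForSeq2SeqLM
from opendelta import AdapterModel, Visualization

# load a backbone PLM (BART, as in the demo above)
model = AutoModelForSeq2SeqLM.from_pretrained("facebook/bart-base")
Visualization(model).structure_graph()  # inspect the backbone before modification

# attach adapter layers to the feed-forward output submodules, addressed by name
delta_model = AdapterModel(backbone_model=model, modified_modules=["fc2"])

# freeze everything except the newly added delta parameters
delta_model.freeze_module(exclude=["deltas"], set_state_dict=True)
delta_model.log()  # report the modified structure and trainable-parameter ratio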

Updates

  • 2022.03.24 We noticed several bugs in Soft Prompt Tuning and Prefix Tuning, mainly due to their need to customize attention ids and token_type_ids; we are fixing them! For now, please use the other methods, which are more stable and perform better.
  • 2022.03.20 Added a Colab example to illustrate efficient training and space-saving multitask serving.
  • 2022.03.20 A new pip version was released.
  • 2022.02.16 Added support for regular expressions in name-based addressing (see the sketch after this list).
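
Name-based addressing selects which submodules receive delta modules by matching their names; per the update above, the patterns may also be regular expressions. A minimal sketch, assuming the LoraModel class and its modified_modules argument as in the OpenDelta docs (the submodule names below are those of t5-small and are illustrative):

from transformers import AutoModelForSeq2SeqLM
from opendelta import LoraModel

model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

# attach LoRA only to the query and value projections, addressed by name;
# each entry matches submodules whose dotted name ends with the given string
delta_model = LoraModel(backbone_model=model,
                        modified_modules=["SelfAttention.q", "SelfAttention.v"])
delta_model.log()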

Installation

Create a conda environment (optional):

conda create -n opendelta_env python=3.8
conda activate opendelta_env

Using Pip

Install OpenDelta using pip as follows:

pip install opendelta

To play with the latest features, you can also install OpenDelta from source.

Build from Source

git clone https://github.com/thunlp/OpenDelta.git
cd OpenDelta

Option 1: If you don't plan to modify the code, run

python setup.py install

Option 2: If you want to modify the code or keep the repo up to date with git pull, run

python setup.py develop
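
Equivalently, you can create an editable install with pip, which keeps the installed package in sync with the cloned repo:

pip install -e .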

Must Try

from transformers import AutoModelForSeq2SeqLM
from opendelta import AutoDeltaModel

# load the frozen T5 backbone
t5 = AutoModelForSeq2SeqLM.from_pretrained("t5-large")

# attach a fact-QA adapter checkpoint shared on the model hub to the backbone
delta = AutoDeltaModel.from_finetuned("thunlp/FactQA_T5-large_Adapter", backbone_model=t5)
delta.log()  # show which modules were modified and the trainable-parameter ratio
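
As a hedged usage sketch, the delta-equipped backbone is then used exactly like an ordinary transformers model (the tokenizer call and question text below are illustrative, not from the original):

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-large")
inputs = tokenizer("question: Where was Barack Obama born?", return_tensors="pt")

# generate an answer with the adapter-augmented T5 loaded above
outputs = t5.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))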

Verified Supported Models

  • You can try OpenDelta on any backbone model based on PyTorch.

  • However, there is a small chance that the interface of a backbone model's submodules is not supported. We have therefore verified some commonly used models that OpenDelta is sure to support.

  • We will keep testing more emerging models.

  • Pull requests are welcome when you successfully apply OpenDelta to your own backbone model; a sketch of a quick check is given below.
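
A quick way to check a new backbone is to attach a delta module and inspect the result. A minimal sketch, assuming name-based matching on the backbone's submodule names (the DeBERTa names below are illustrative):

from transformers import AutoModel
from opendelta import LoraModel, Visualization

backbone = AutoModel.from_pretrained("microsoft/deberta-v2-xlarge")  # any PyTorch backbone
Visualization(backbone).structure_graph()  # look up the submodule names to modify

delta_model = LoraModel(backbone_model=backbone,
                        modified_modules=["query_proj", "value_proj"])
delta_model.freeze_module(exclude=["deltas"])
delta_model.log()  # if the expected modules were modified, the backbone works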

Delta methods verified: LoRA, Bias Tuning, Adapter (Houlsby), Adapter (Pfeiffer), AdapterDrop, Low-Rank Adapter, Compacter, Prefix Tuning, and Prompt Tuning.

Backbone models verified: T5, GPT-2, BART, DistilBERT, RoBERTa, BERT, T5-3b (parallel), Deberta-v2, and CTRL.