An Open-Source Framework for Parameter-Efficient Tuning (Delta Tuning).


Overview · Installation · Basic Usage · Docs · Performance


Overview

OpenDelta is a toolkit for parameter-efficient tuning methods (which we dub delta tuning), with which users can flexibly assign (or add) a small number of parameters to update while keeping most parameters frozen. Using OpenDelta, users can easily implement prefix-tuning, adapters, LoRA, or any other type of delta tuning with their preferred PTMs (a minimal sketch follows the notes below).

  • Our repo is tested on Python 3.8 and PyTorch 1.9.0. Lower versions may also be supported.

  • A demo of using OpenDelta to modify a PLM (e.g., BART), showing how the PLM changes under delta tuning.
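
For instance, attaching LoRA to a PTM takes only a few lines. The following is a minimal sketch rather than canonical usage: the module names passed to modified_modules (here "query" and "value" for a BERT-style backbone) are assumptions and vary across architectures.

from transformers import AutoModelForSequenceClassification
from opendelta import LoraModel

# Load a backbone PTM (BERT is one of the verified models below).
backbone = AutoModelForSequenceClassification.from_pretrained("bert-base-cased")

# Attach LoRA modules by (sub)module name. "query"/"value" match the attention
# projection layers in BERT-style models; inspect your backbone's structure
# before choosing names for a different architecture.
delta_model = LoraModel(backbone_model=backbone, modified_modules=["query", "value"])

# Freeze everything except the delta parameters and the task head.
delta_model.freeze_module(exclude=["deltas", "classifier"])
delta_model.log()  # print the modified structure and trainable-parameter statistics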

Updates

  • 2022.03.24 We noticed several bugs in Soft Prompt Tuning and Prefix Tuning, mainly due to their need to customize attention ids and token_type_ids; we are fixing them! For now, please use the other methods, which are more stable and perform better.
  • 2022.03.20 Added a Colab example illustrating efficient training and space-saving multi-task serving.
  • 2022.03.20 A new pip version released.
  • 2022.02.16 Added support for regular expressions in name-based addressing (illustrated below).
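
The sketch below shows the matching idea behind regex-based name addressing, using plain Python re against a model's named modules. It illustrates the concept only; the exact regex syntax accepted by OpenDelta's modified_modules argument is described in the docs.

import re
from transformers import AutoModel

model = AutoModel.from_pretrained("bert-base-cased")

# Select the query/value projections of layers 0 and 5 only.
pattern = re.compile(r"encoder\.layer\.[05]\.attention\.self\.(query|value)")
matched = [name for name, _ in model.named_modules() if pattern.fullmatch(name)]
print(matched)  # the submodules a delta method would be attached to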

Installation

Create a virtual environment (optional):

conda create -n opendelta_env python=3.8
conda activate opendelta_env

Using Pip

Install OpenDelta using pip as follows:

pip install opendelta

To play with the latest features, you can also install OpenDelta from the source.

Build from Source

git clone https://github.com/thunlp/OpenDelta.git
cd OpenDelta

Option 1: If you won't modify the code, run

python setup.py install

Option 2: If you want to modify the code or keep the repo up to date with git, run

python setup.py develop

If you encounter a network error when using setup.py, first install the dependencies via

pip install -r requirements.txt && python setup.py develop

Must Try

from transformers import AutoModelForSeq2SeqLM
from opendelta import AutoDeltaModel

# Load the backbone PTM.
t5 = AutoModelForSeq2SeqLM.from_pretrained("t5-base")
# Load a delta checkpoint from DeltaHub and attach it to the backbone.
delta = AutoDeltaModel.from_finetuned("DeltaHub/lora_t5-base_mrpc", backbone_model=t5)
# Print the modified structure and parameter statistics.
delta.log()
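
Continuing the snippet above, the backbone can then be used for inference as usual, now carrying the loaded delta weights. A sketch, assuming the checkpoint expects T5's GLUE-style "mrpc sentence1: ... sentence2: ..." prompt format (an assumption; check the checkpoint card):

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-base")
# The prompt format is assumed here; verify it against the checkpoint's card.
inputs = tokenizer(
    "mrpc sentence1: The cat sat on the mat. sentence2: A cat was sitting on the mat.",
    return_tensors="pt",
)
outputs = t5.generate(**inputs)  # t5 now carries the loaded LoRA weights
print(tokenizer.decode(outputs[0], skip_special_tokens=True))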

Verified Supported Models

  • You can try OpenDelta on any PyTorch-based backbone model (see the smoke-test sketch after this list).

  • However, there is a small chance that the interface of the backbone model's submodules is not supported. We have therefore verified some commonly used models that OpenDelta is sure to support.

  • We will keep testing more and more emerging models.

  • Pull requests are welcome if you successfully apply OpenDelta to your own backbone model.
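
A quick way to check a new backbone is to attach a delta, freeze the rest, and run a forward pass. A minimal smoke-test sketch (DistilBERT is used here, and a default set of insertion points is assumed to exist for it; pass modified_modules explicitly if it does not):

import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from opendelta import AdapterModel

backbone = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased")
delta_model = AdapterModel(backbone_model=backbone)  # default insertion points assumed
delta_model.freeze_module(exclude=["deltas", "classifier"])

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
batch = tokenizer("OpenDelta smoke test", return_tensors="pt")
with torch.no_grad():
    logits = backbone(**batch).logits
print(logits.shape)  # forward pass succeeded with the delta attached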

Verified delta methods: Lora, Bias Tuning, Adapter (Houlsby), Adapter (Pfeiffer), Adapter Drop, Low-Rank Adapter, Compacter, Prefix Tuning, Prompt Tuning

Verified backbone models: T5, GPT-2, BART, DistilBERT, RoBERTa, BERT, T5-3b (parallel), Deberta-v2, CTRL, ViT

Performance-Checked Combinations

See the Google sheet here.

Subject to change at any moment.