An Open-Source Framework for Parameter-Efficient Tuning.


Overview • Installation • Supported Models • Docs • Performance


## Overview

OpenDelta is a toolkit for parameter-efficient tuning methods (which we dub delta tuning), by which users can flexibly assign (or add) a small number of parameters to update while keeping most parameters frozen. With OpenDelta, users can easily implement prefix tuning, adapters, LoRA, or any other type of delta tuning on their preferred pre-trained models (PTMs).
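For instance, a minimal sketch of adding LoRA to a Hugging Face backbone could look like the following; the checkpoint name and the `modified_modules` entry `fc2` are illustrative choices that depend on the backbone's architecture:

```python
from transformers import AutoModelForSequenceClassification
from opendelta import LoraModel

# Load a backbone pre-trained model (PTM) from Hugging Face Transformers.
model = AutoModelForSequenceClassification.from_pretrained("facebook/bart-base")

# Attach LoRA modules to the named submodules of the backbone;
# "fc2" is an illustrative choice targeting BART's feed-forward layers.
delta_model = LoraModel(backbone_model=model, modified_modules=["fc2"])

# Freeze all parameters except the delta modules (and the classifier head),
# so only a small number of parameters are updated during training.
delta_model.freeze_module(exclude=["deltas", "classifier"], set_state_dict=True)

# Print a summary of which parameters are now trainable.
delta_model.log()
```

After this, the model can be trained with a standard training loop; only the unfrozen delta (and head) parameters receive gradient updates.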

## Installation

Create a virtual environment (optional):

```bash
conda create -n opendelta_env python=3.8
conda activate opendelta_env
```

### Using Pip

Our repo is tested on Python 3.6+ and PyTorch 1.8.1+. Install OpenDelta using pip as follows:

```bash
pip install opendelta
```

To play with the latest features, you can also install OpenDelta from source.

### Build from Source

```bash
git clone https://github.com/thunlp/OpenDelta.git
cd OpenDelta
```

Option 1: If you won't modify the code, run

```bash
python setup.py install
```

Option 2: If you want to modify the code (an editable install, so local changes take effect without reinstalling), run

```bash
python setup.py develop
```
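Either way, a quick sanity check that the package is importable (a minimal sketch; it assumes PyTorch is already installed) is:

```python
# Post-install sanity check: these imports should succeed if the
# installation worked.
import opendelta
from opendelta import LoraModel, AdapterModel

print("OpenDelta loaded from:", opendelta.__file__)
```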

## Verified Supported Models

**You can try to use OpenDelta on any PyTorch-based backbone model.** However, there is a small chance that the interface of a backbone model's submodules is not supported. We have therefore verified some commonly used models that OpenDelta is sure to support.

We will keep testing more emerging models.

Pull requests are welcome if you successfully apply OpenDelta to your own backbone model.

Delta tuning methods verified so far: LoRA, Bias Tuning (BitFit), Adapter (Houlsby), Adapter (Pfeiffer), AdapterDrop, Low-Rank Adapter, Compacter, Prefix Tuning, and Prompt Tuning.

Backbone models verified so far: T5, GPT-2, BART, DistilBERT, RoBERTa, BERT, T5-3b (parallel), DeBERTa-v2, CTRL, and ViT.
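As an illustration of one verified combination, here is a minimal sketch of attaching a Houlsby-style adapter to BERT; the checkpoint name and the `bottleneck_dim` value are illustrative assumptions, not fixed requirements:

```python
from transformers import AutoModel
from opendelta import AdapterModel

# BERT is one of the verified backbones listed above.
backbone = AutoModel.from_pretrained("bert-base-uncased")

# Insert adapter layers into the backbone; `bottleneck_dim` controls the
# adapter's hidden size (24 here is an illustrative choice).
delta_model = AdapterModel(backbone_model=backbone, bottleneck_dim=24)

# Train only the adapter parameters, keeping the backbone frozen.
delta_model.freeze_module(exclude=["deltas"])
delta_model.log()
```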

## Performance-Checked Combinations

See the Google sheet here.