<div align="center">
<img src="https://s4.ax1x.com/2022/02/14/Hy7lAf.png" width="350px">

**An Open-Source Framework for Parameter-Efficient Tuning.**

------

<p align="center">
  <a href="#overview">Overview</a> •
  <a href="#installation">Installation</a> •
  <a href="https://opendelta.readthedocs.io/en/latest/notes/usage.html">Basic Usage</a> •
  <a href="https://opendelta.readthedocs.io/">Docs</a> •
  <a href="https://docs.google.com/spreadsheets/d/1BIVa8ocAPga-u7rBOXLYaTfaJSjI1dWfwohmLjmFDrY/edit?usp=sharing">Performance</a>
</p>

</div>
![version](https://img.shields.io/badge/version-0.0.1-blue)
## Overview
OpenDelta is a toolkit for parameter-efficient methods (which we dub *delta tuning*), with which users can flexibly assign (or add) a small amount of parameters to update while keeping most parameters frozen. With OpenDelta, users can easily implement prefix-tuning, adapters, LoRA, or any other type of delta tuning with their preferred PTMs.
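
For instance, a minimal sketch of attaching a LoRA delta to a backbone might look like the following; it assumes the `LoraModel` / `freeze_module` interface described in the docs, and the default target modules (see the Must Try section below for loading a pre-trained delta checkpoint):

```python
from transformers import AutoModelForSequenceClassification
from opendelta import LoraModel

# Load a pretrained backbone as usual.
backbone = AutoModelForSequenceClassification.from_pretrained("roberta-base")

# Insert LoRA modules into the backbone (using the assumed default target modules).
delta_model = LoraModel(backbone_model=backbone)

# Freeze everything except the newly added delta parameters.
delta_model.freeze_module(exclude=["deltas"], set_state_dict=True)

# Print the modified structure and the ratio of trainable parameters.
delta_model.log()
```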
Our repo is tested on Python 3.8 and PyTorch 1.9.0. Lower versions may also be supported.
**A demo of using OpenDelta to modify the PLM (e.g., BART).**
![How the PLM changes using delta tuning](docs/source/imgs/demo.gif)
## Installation
Create a virtual environment (optional):
```shell
conda create -n opendelta_env python=3.8
conda activate opendelta_env
```
### Using Pip
Install OpenDelta using pip as follows:
```shell
pip install opendelta
```
To play with the latest features, you can also install OpenDelta from source.
### Build from Source
```shell
git clone https://github.com/thunlp/OpenDelta.git
cd OpenDelta
```
#### Option 1: If you won't modify the code, run
```shell
python setup.py install
```
#### Option 2: If you want to modify the code, run
```shell
python setup.py develop
```
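
Either way, you can check that the installation succeeded by importing the package (a quick sanity check, not an official step):

```shell
python -c "import opendelta"
```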
## Must Try
```python
from transformers import AutoModelForSeq2SeqLM
t5 = AutoModelForSeq2SeqLM.from_pretrained("t5-base")

from opendelta import AutoDeltaModel
# Download a delta checkpoint from DeltaHub and attach it to the T5 backbone.
delta = AutoDeltaModel.from_finetuned("DeltaHub/lora_t5-base_mrpc", backbone_model=t5)
delta.log()
```
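
After `from_finetuned` returns, the delta weights are already attached to `t5`, so it can be used like any other Hugging Face model. Below is a hedged sketch of running inference; the MRPC-style input template is only an assumption for illustration, so check the checkpoint card on DeltaHub for the exact format:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-base")

# Hypothetical MRPC-style input; the real template may differ.
text = ("mrpc sentence1: The cat sat on the mat. "
        "sentence2: A cat was sitting on the mat.")
inputs = tokenizer(text, return_tensors="pt")

# The backbone already contains the loaded delta weights.
outputs = t5.generate(**inputs)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```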
## Verified Supported Models
- **You can try to use OpenDelta on *any* backbone model based on PyTorch** (see the sketch after the table below).
- However, there is a small chance that the interface of the submodules of the backbone model is not supported. We have therefore verified some commonly used models that OpenDelta is sure to support.
- We will keep testing newly emerging models.
- Pull requests are welcome if you successfully apply OpenDelta to your own backbone model.
|            | Lora | Bias<br>Tuning | Adapter<br>Houlsby | Adapter<br>Pfeiffer | Adapter<br>Drop | Adapter<br>Low-Rank | Compacter | Prefix<br>Tuning | Prompt<br>Tuning |
| --------- | ---- | ---- | ---- | ---- | ---- | ---- | ---- | ----- | ----- |
| T5 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| GPT-2 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | |
| BART | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | |
| DistilBERT | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | |
| RoBERTa | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | |
| BERT | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| T5-3b(parallel)| ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Deberta-v2 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | |
| CTRL | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | |
| ViT | ✅ | | | | | | | | |
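
As a sketch of trying OpenDelta on a backbone outside this table, the snippet below attaches adapters to an ELECTRA model; the `modified_modules` name is an assumption based on the `transformers` implementation, so print the model to confirm the submodule names for your own backbone:

```python
from transformers import AutoModel
from opendelta import AdapterModel

# ELECTRA is not in the verified table above; treat this as an unverified attempt.
backbone = AutoModel.from_pretrained("google/electra-small-discriminator")

# Attach adapters after the dense output sublayers. "output.dense" is an
# assumed name; run print(backbone) and pick names that exist in your model.
delta_model = AdapterModel(backbone_model=backbone, modified_modules=["output.dense"])
delta_model.freeze_module(exclude=["deltas"])
delta_model.log()
```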
## Performance Checked Combination
See the Google sheet [here](https://docs.google.com/spreadsheets/d/1BIVa8ocAPga-u7rBOXLYaTfaJSjI1dWfwohmLjmFDrY/edit?usp=sharing).
Subject to change at any moment.