
MVSOD

Install

  • Python>=3.8

    We recommend using Anaconda to create a conda environment:

    conda create -n MVSOD python=3.8 pip
    

    Then, activate the environment:

    conda activate MVSOD
    
  • Other requirements

    pip install -r requirements.txt
    
  • Build MultiScaleDeformableAttention

    cd ./models/ops
    sh ./make.sh # or python setup.py build install
    
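    After building, you can optionally verify that the compiled op imports cleanly. A minimal sanity check in Python; the module name MultiScaleDeformableAttention follows Deformable DETR's convention, which this repo appears to inherit (an assumption, not stated in this README):

    import torch
    import MultiScaleDeformableAttention  # CUDA extension compiled by make.sh

    # The deformable attention op runs on GPU, so CUDA must be available.
    assert torch.cuda.is_available(), "a CUDA-capable GPU is required"
    print("MultiScaleDeformableAttention imported OK")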

Usage

Dataset preparation

  1. Please download sky_data3 from here. The train, validation, and test sets contain 8782, 1999, and 1858 RGB/IR images, respectively. The JSON annotation files can be generated with './tools/covert2coco.py', and the path structure should be as follows (a quick sanity check of the layout is sketched after the tree):
project_root/
└── data/
    └── vid/
        ├── Data/
        │   └── sky_data3/
        └── annotations/
            ├── sky_data_vid_test.json
            ├── sky_data_vid_train.json
            └── sky_data_vid_val.json
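
Because './tools/covert2coco.py' produces COCO-style annotations, you can quickly confirm each split matches the image counts above. A minimal sketch; the 'images' key is the standard COCO field, and the paths assume the tree shown:

import json

# Expected image counts per split, from the dataset description above.
expected = {"train": 8782, "val": 1999, "test": 1858}

for split, count in expected.items():
    path = f"data/vid/annotations/sky_data_vid_{split}.json"
    with open(path) as f:
        ann = json.load(f)
    print(f"{split}: {len(ann['images'])} images (expected {count})")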

Training on multiple GPUs

export CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7
python tools/launch.py --nnodes 1 --node_rank 0 --master_addr 127.0.0.1 --master_port 3000 --nproc_per_node 2 configs/r101_train_multi_mine_multi.sh
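
Note that the command above exposes eight GPUs via CUDA_VISIBLE_DEVICES but starts only two worker processes; adjust --nproc_per_node to the number of GPUs you actually want to use. A one-line check of how many GPUs PyTorch can see:

import torch
print(torch.cuda.device_count())  # should be >= --nproc_per_node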

Evaluating on multiple GPUs

export CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7
python tools/launch.py --nnodes 1 --node_rank 0 --master_addr 127.0.0.1 --master_port 3000 --nproc_per_node 2 configs/r101_eval_multi_mine_multi.sh
Pretrained models can be downloaded from here.
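
Before resuming from a downloaded checkpoint, it can help to peek at its contents. A minimal sketch; the path matches the --resume argument used below, and the key names are typical of Deformable DETR-style checkpoints rather than guaranteed:

import torch

# Load on CPU so no GPU is needed just to inspect the file.
ckpt = torch.load("exps/COCO_pretrained_model/checkpoint0020.pth", map_location="cpu")
print(sorted(ckpt.keys()))  # typically 'model', 'optimizer', 'lr_scheduler', 'epoch'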

Training on a single GPU

python main.py --backbone resnet101 \
               --epochs 10 \
               --num_feature_levels 1 \
               --num_queries 300 \
               --dilation \
               --batch_size 1 \
               --num_ref_frames 14 \
               --lr_drop_epochs 7 9 \
               --num_workers 8 \
               --with_box_refine \
               --coco_pretrain \
               --dataset_file vid_multi_mine_multi \
               --resume exps/COCO_pretrained_model/checkpoint0020.pth \
               --output_dir exps/singlebaseline/r101_e8_nf4_ld6,7_lr0.0002_nq300_bs4_wbox_joint_MEGA_detrNorm_class31_pretrain_coco_dc5

# Alternative checkpoints to resume from:
# --resume exps/our_models/COCO_pretrained_model/r101_deformable_detr_single_scale_bbox_refinement-dc5_checkpoint0049.pth
# --resume ./exps/COCO_pretrained_model/checkpoint0020.pth

Validating on a single GPU

python main.py --backbone resnet101 \
               --epochs 6 \
               --eval \
               --num_feature_levels 1 \
               --num_queries 300 \
               --dilation \
               --batch_size 1 \
               --num_ref_frames 14 \
               --resume /home/fty/Documents/fty/mvsod/exps/multibaseline/r101_grad/e7_nf1_ld4,6_lr0.0002_nq300_wbox_MEGA_detrNorm_preSingle_nr14_dc5_nql3_filter150_75_40/checkpoint0012.pth \
               --lr_drop_epochs 4 6 \
               --num_workers 16 \
               --with_box_refine \
               --dataset_file vid_multi_mine_multi \
               --output_dir exps/our_models/exps_multi/r101_81.7

Testing on a single GPU

python test.py --backbone resnet101 \
               --epochs 7 \
               --eval \
               --num_feature_levels 1 \
               --num_queries 300 \
               --dilation \
               --batch_size 1 \
               --num_ref_frames 14 \
               --resume /home/fty/Documents/fty/mvsod/exps/multibaseline/r101_grad/e7_nf1_ld4,6_lr0.0002_nq300_wbox_MEGA_detrNorm_preSingle_nr14_dc5_nql3_filter150_75_40/checkpoint0099.pth \
               --lr_drop_epochs 4 6 \
               --num_workers 16 \
               --with_box_refine \
               --dataset_file vid_multi_mine_multi \
               --output_dir exps/our_models/exps_multi/epoch100

Then you will get a 'test_save.json' file containing the detection results for evaluation.
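
If 'test_save.json' holds COCO-style detections (plausible, since the annotations are COCO-format JSON, but not confirmed by this README), it can be scored with pycocotools. A minimal sketch:

from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

gt = COCO("data/vid/annotations/sky_data_vid_test.json")  # ground-truth annotations
dt = gt.loadRes("test_save.json")                         # detections to score

ev = COCOeval(gt, dt, iouType="bbox")
ev.evaluate()
ev.accumulate()
ev.summarize()  # prints the standard COCO AP/AR table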

Acknowledgment

This code is based on TransVOD (https://github.com/SJTU-LuHe/TransVOD).