# FCA: Learning a 3D Full-coverage Vehicle Camouflage for Multi-view Physical Adversarial Attack

A case study of FCA. The code can be found in the FCA repository.

## Cases of digital attack

### Camera distance is 3

before
after

### Camera distance is 5

before
after

### Camera distance is 10

before
after
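For readers who want to reproduce this kind of distance sweep, the sketch below shows one way to render the textured car model at camera distances 3, 5, and 10 and feed the renderings to a detector. It is a minimal illustration that assumes PyTorch3D as the differentiable renderer and a hypothetical `car.obj` mesh; it is not the renderer used in the official FCA code.

```python
import torch
from pytorch3d.io import load_objs_as_meshes
from pytorch3d.renderer import (
    look_at_view_transform, FoVPerspectiveCameras, RasterizationSettings,
    MeshRenderer, MeshRasterizer, SoftPhongShader, PointLights,
)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
mesh = load_objs_as_meshes(["car.obj"], device=device)  # hypothetical mesh path
lights = PointLights(device=device, location=[[0.0, 3.0, 3.0]])

for dist in (3.0, 5.0, 10.0):  # the three camera distances shown above
    R, T = look_at_view_transform(dist=dist, elev=15.0, azim=30.0)
    cameras = FoVPerspectiveCameras(device=device, R=R, T=T)
    renderer = MeshRenderer(
        rasterizer=MeshRasterizer(
            cameras=cameras,
            raster_settings=RasterizationSettings(image_size=640),
        ),
        shader=SoftPhongShader(device=device, cameras=cameras, lights=lights),
    )
    image = renderer(mesh)   # (1, H, W, 4) RGBA rendering at this distance
    rgb = image[..., :3]     # pass this to the detector, before and after
                             # the camouflage texture is applied
```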

## Cases of multi-view robustness

before
after

The first row shows the original detection results; the second row shows the detection results on the camouflaged vehicle.

before
after

The first row shows the original detection results; the second row shows the detection results on the camouflaged vehicle.

## Ablation study

### Different combinations of loss terms

As the figure shows, each loss term plays a different role in the attack. For example, the camouflaged car generated with obj+smooth (we omit the smooth loss below and denote it simply as obj) can hardly hide from the detector, whereas the camouflaged car generated with the iou loss successfully suppresses the detected bounding boxes in the car region, and the camouflaged car generated with the cls loss successfully makes the detector misclassify the car as another category.
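As a rough illustration of how these terms could be combined, here is a minimal PyTorch sketch of an FCA-style attack loss. The detector output format, tensor names, and weights are assumptions made for illustration and do not reflect the official implementation.

```python
import torch

def attack_loss(det_out, texture, w_obj=1.0, w_iou=1.0, w_cls=1.0, w_smooth=1e-3):
    """Hypothetical combination of FCA-style loss terms (illustrative only).

    det_out is assumed to be a dict of per-candidate tensors:
      'obj' - objectness scores, 'cls' - car-class probabilities,
      'iou' - IoU of each predicted box with the car region.
    texture is the optimizable (3, H, W) camouflage texture.
    """
    # obj term: drive objectness down so the car is not detected at all
    loss_obj = det_out["obj"].max()

    # iou term: suppress boxes that overlap the car region
    loss_iou = (det_out["iou"] * det_out["obj"]).sum()

    # cls term: drive the car-class probability down so the detector
    # misclassifies the vehicle as another category
    loss_cls = det_out["cls"].max()

    # smooth term: total-variation penalty to keep the texture smooth/printable
    loss_smooth = (texture[:, 1:, :] - texture[:, :-1, :]).abs().mean() \
                + (texture[:, :, 1:] - texture[:, :, :-1]).abs().mean()

    # minimizing this total loss updates the texture via backpropagation
    return w_obj * loss_obj + w_iou * loss_iou + w_cls * loss_cls + w_smooth * loss_smooth
```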

### Different initialization methods

From left to right: original, basic initialization, random initialization, zero initialization.
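The three schemes differ only in how the trainable texture is initialized before optimization. A minimal sketch, assuming the camouflage is optimized as a (3, H, W) texture map in [0, 1] (names and shapes are illustrative, not the official code):

```python
import torch

H, W = 256, 256

# basic initialization: start from the vehicle's original texture
# (here a placeholder tensor standing in for the texture loaded from the 3D model)
original_texture = torch.rand(3, H, W)
basic_init = original_texture.clone()

# random initialization: start from uniform noise
random_init = torch.rand(3, H, W)

# zero initialization: start from an all-zero (black) texture
zero_init = torch.zeros(3, H, W)

# whichever scheme is chosen, the texture is made trainable and optimized
adv_texture = basic_init.requires_grad_(True)
optimizer = torch.optim.Adam([adv_texture], lr=0.01)
```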