# FCA: Learning a 3D Full-coverage Vehicle Camouflage for Multi-view Physical Adversarial Attack

Case study of FCA. The code can be found in FCA.

## Cases of digital attack

![image](https://github.com/winterwindwang/Full-coverage-camouflage-adversarial-attack/blob/gh-pages/assets/distance_10_elevation_30_adv_pred.gif)

## Cases of multi-view robustness

## Ablation study

### Different combinations of loss terms

(Figure: detection results for camouflages generated with different combinations of loss terms.)

As we can see from the figure, each loss term plays a different role in the attack. For example, the camouflaged car generated with obj+smooth (we omit the smooth loss and denote this combination as obj) is hardly hidden from the detector, whereas the camouflage generated with the iou loss successfully suppresses the detected bounding box over the car region, and the camouflage generated with the cls loss successfully causes the detector to misclassify the car as another category.
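To make the role of each term concrete, below is a minimal sketch of how such loss terms could be combined when optimizing the camouflage texture. This is not the FCA implementation; the tensor names (`pred_obj`, `pred_cls`, `pred_iou`, `texture`) and the weights are hypothetical placeholders, assuming the detector's outputs for boxes overlapping the target car have already been gathered.

```python
import torch

def smooth_loss(texture):
    """Total-variation-style smoothness on the camouflage texture
    (texture shape assumed to be [N, C, H, W])."""
    diff_h = torch.mean(torch.abs(texture[:, :, 1:, :] - texture[:, :, :-1, :]))
    diff_w = torch.mean(torch.abs(texture[:, :, :, 1:] - texture[:, :, :, :-1]))
    return diff_h + diff_w

def attack_loss(pred_obj, pred_cls, pred_iou, texture,
                w_obj=1.0, w_cls=1.0, w_iou=1.0, w_smooth=1e-3):
    """Combine the loss terms discussed above (hypothetical sketch).

    pred_obj: objectness scores of boxes covering the car (to be suppressed)
    pred_cls: probability of the 'car' class for those boxes (to be suppressed)
    pred_iou: IoU between predicted boxes and the ground-truth car box
    """
    loss_obj = pred_obj.mean()          # push objectness toward zero (hide the car)
    loss_cls = pred_cls.mean()          # push the 'car' probability toward zero (misclassify)
    loss_iou = pred_iou.mean()          # shrink overlap with the true car box
    loss_sm = smooth_loss(texture)      # keep the texture smooth / printable
    return (w_obj * loss_obj + w_cls * loss_cls
            + w_iou * loss_iou + w_smooth * loss_sm)
```

Minimizing individual terms corresponds to the ablation cases above: using only `loss_obj` (plus smoothness) hides the car poorly, only `loss_iou` suppresses the box over the car region, and only `loss_cls` drives the detector toward another category.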