<to do list>
understand how visualization of HERBS works
successfully visualize the attention map in ViT
understand what each feature map in Swin Transformer stands for (and also in ViT)
<reference>
>> Swin Transformer object detection
==> Swin Transformer explainable
Transformer Interpretability Beyond Attention Visualization (thecvf.com)
==> explainability vit (paper + code)
[Concept Notes] LRP (Layer-wise Relevance Propagation) (velog.io)
Layer-wise Relevance Propagation - Angelo's Math Notes (angeloyeo.github.io)
=> LRP
Vision Transformer (ViT) : Visualize Attention Map | Kaggle
==> how to visualize the attention map in ViT
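For the ViT attention-map item, a common recipe (used alongside notebooks like the Kaggle one above) is attention rollout: fold the residual connection into each layer's attention and multiply across layers to get the CLS token's effective attention over patches. This is a minimal numpy sketch; the random softmax tensors are stand-ins for attention weights you would collect from a real model, e.g. via forward hooks, and the 2x2 patch grid is assumed for illustration.

```python
import numpy as np

def attention_rollout(attns):
    """Attention rollout: average heads, add the identity for the residual
    path, re-normalize rows, and multiply the matrices across layers.

    attns: list of (heads, tokens, tokens) attention matrices, one per layer.
    Returns the (tokens,) row for the CLS token (assumed at index 0).
    """
    tokens = attns[0].shape[-1]
    rollout = np.eye(tokens)
    for a in attns:
        a = a.mean(axis=0)                      # average over heads
        a = a + np.eye(tokens)                  # account for the residual connection
        a = a / a.sum(axis=-1, keepdims=True)   # keep rows stochastic
        rollout = a @ rollout
    return rollout[0]

# Toy stand-in: 2 layers, 3 heads, 5 tokens (CLS + 4 patches)
rng = np.random.default_rng(0)
attns = []
for _ in range(2):
    logits = rng.normal(size=(3, 5, 5))
    a = np.exp(logits)
    attns.append(a / a.sum(axis=-1, keepdims=True))  # softmax over each row

cls_attn = attention_rollout(attns)
patch_map = cls_attn[1:].reshape(2, 2)  # drop CLS, reshape to the patch grid
```

`patch_map` is what gets upsampled and overlaid on the input image; since every rollout matrix stays row-stochastic, the CLS row sums to 1 and reads as a distribution over tokens.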