Knowledge Distillation (KD) has been extensively studied as a means to enhance the performance of smaller models in Convolutional Neural Networks (CNNs). Recently, the Vision Transformer (ViT) has ...
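Since KD is only named in passing above, here is a minimal sketch of the classic soft-label distillation loss (Hinton et al.), assuming PyTorch; the function name and the temperature/weighting hyperparameters are illustrative, not taken from the source.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Illustrative soft-label distillation loss (hyperparameters are placeholders)."""
    # Hard-label term: ordinary cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    # Soft-label term: KL divergence between the temperature-softened
    # teacher and student distributions, scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * hard + (1.0 - alpha) * soft

# Toy usage: batch of 8 examples, 10 classes.
student = torch.randn(8, 10, requires_grad=True)
teacher = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
print(kd_loss(student, teacher, labels))
```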