News
Knowledge distillation is widely used as an effective model compression technique to improve the performance of small models. Most current research on knowledge distillation focuses on the ...
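Since the snippet above mentions knowledge distillation only in passing, here is a minimal sketch of the standard soft-label distillation loss (temperature-scaled KL divergence mixed with cross-entropy). It assumes a PyTorch setting; the function name, temperature `T`, and mixing weight `alpha` are illustrative choices, not taken from the cited works.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft targets: KL divergence between temperature-scaled teacher and
    # student distributions, scaled by T^2 to keep gradients comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```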
This survey article delves into the emerging and critical area of symbolic knowledge distillation in large language models (LLMs). As LLMs such as generative pretrained transformer-3 (GPT-3) and ...