Knowledge Distillation
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter (2019)
Distilling the Knowledge in a Neural Network (2015)