It's an internationally accepted term — go look at the papers: knowledge distillation.
Knowledge distillation is a machine learning technique that aims to transfer the learnings of a large pre-trained model, the “teacher model,” to a smaller “student model.”
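In practice the student is usually trained against the teacher's temperature-softened output distribution (the soft-target loss from Hinton et al.'s distillation paper). A minimal numpy sketch of that loss — function names and example logits here are illustrative, not from any particular library:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T gives a softer distribution,
    # exposing the teacher's relative preferences among wrong classes.
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, T=2.0):
    # KL divergence between the softened teacher and student distributions;
    # the T*T factor keeps gradient magnitudes comparable across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q)))) * T * T

t = [3.0, 1.0, 0.2]
print(distillation_loss(t, [3.0, 1.0, 0.2]))  # near zero when student matches teacher
print(distillation_loss(t, [0.2, 1.0, 3.0]))  # larger when they disagree
```

A full training loss typically mixes this term with the ordinary cross-entropy on the hard labels.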
【 Quoting strongcore's post: 】
: Distillation was originally just one of the most ordinary physical separation methods in our chem/bio/env/materials fields — couldn't you folks have come up with a fancier word?
--
FROM 14.150.194.*