MIT introduces Self-Distillation Fine-Tuning to reduce catastrophic forgetting; the method uses student-teacher demonstrations and requires roughly 2.5x the compute.
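The snippet names a student-teacher setup but gives no detail, so here is a minimal sketch of one common self-distillation pattern: a frozen copy of the pre-fine-tuning model serves as teacher, and the student is trained on new data while a KL term keeps its outputs close to the teacher's, limiting forgetting. This is an illustrative assumption, not MIT's published method; the toy model, loss weighting `alpha`, and temperature `tau` are all hypothetical.

```python
# Illustrative self-distillation fine-tuning sketch (NOT the MIT SDFT
# implementation). A frozen snapshot of the original model acts as teacher;
# the student learns the new task under a KL penalty toward the teacher.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyClassifier(nn.Module):
    """Stand-in for a pre-trained model (hypothetical toy architecture)."""
    def __init__(self, dim=16, num_classes=4):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 32), nn.ReLU(), nn.Linear(32, num_classes))

    def forward(self, x):
        return self.net(x)

def self_distill_step(student, teacher, x, y, optimizer, alpha=0.5, tau=2.0):
    """One step: new-task cross-entropy + KL distillation toward the frozen teacher."""
    student_logits = student(x)
    with torch.no_grad():
        teacher_logits = teacher(x)  # frozen pre-fine-tuning behavior
    task_loss = F.cross_entropy(student_logits, y)
    distill_loss = F.kl_div(
        F.log_softmax(student_logits / tau, dim=-1),
        F.softmax(teacher_logits / tau, dim=-1),
        reduction="batchmean",
    ) * tau * tau  # standard temperature scaling of the KL term
    loss = (1 - alpha) * task_loss + alpha * distill_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    torch.manual_seed(0)
    student = TinyClassifier()
    teacher = copy.deepcopy(student)  # snapshot taken before fine-tuning
    teacher.eval()
    for p in teacher.parameters():
        p.requires_grad_(False)
    optimizer = torch.optim.AdamW(student.parameters(), lr=1e-3)
    x, y = torch.randn(8, 16), torch.randint(0, 4, (8,))  # fake new-task batch
    for step in range(3):
        print(f"step {step}: loss={self_distill_step(student, teacher, x, y, optimizer):.4f}")
```

Note that the extra teacher forward pass (and, in demonstration-generating variants, teacher sampling) is a plausible source of the added compute cost the snippet mentions, though the exact 2.5x figure is specific to the MIT work.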
A South Korean research team has developed a technique that allows AI models to retain existing knowledge without being retrained from scratch, even when the model changes. KAIST School of ...