Top papers
- From Images to Words: Efficient Cross-Modal Knowledge Distillation to Language Models from Black-box Teachers (7.0)
- DAIT: Distillation from Vision-Language Models to Lightweight Classifiers with Adaptive Intermediate Teacher Transfer (6.0)
- Integrating Knowledge Distillation Methods: A Sequential Multi-Stage Framework (3.0)