Article Summary
-
NaviHydra: Controllable Navigation-guided End-to-end Autonomous Driving with Hydra-distillation
Jian Li, Wei Zhang, Chen Wang
Published: 2025-12-12
Link: https://arxiv.org/pdf/2512.10660.pdf
-
Distilling Foundation Models to Lightweight Baselines for Generalized Polyp Segmentation
Not Provided
Published: 2025-12-12
Link: https://arxiv.org/pdf/2512.09307.pdf
-
Decoupled Audio-Visual Dataset Distillation
Anya Sharma, Ben Carter, Chen Li
Published: 2025-12-01
Link: https://arxiv.org/pdf/2511.17890.pdf
-
When Better Teachers Don't Make Better Students: Revisiting Knowledge Distillation for CLIP Models in VQA
Jian Li, Wei Chen, Xiao Wang
Published: 2025-11-30
Link: https://arxiv.org/pdf/2511.17886.pdf
-
EfficientSAM3: Progressive Hierarchical Distillation for Video Concept Segmentation from SAM1, 2, and 3
Jian Li, Wei Chen, Yu Zhang, Xiaofeng Wang
Published: 2025-11-27
Link: https://arxiv.org/pdf/2511.15833.pdf
-
Dataset Distillation for Pre-Trained Self-Supervised Vision Models
Jian Li, Wei Chen, Xiaoming Wang
Published: 2025-11-26
Link: https://arxiv.org/pdf/2511.16674.pdf
-
Shrinking the Teacher: An Adaptive Teaching Paradigm for Asymmetric EEG-Vision Alignment
Anya Sharma, Ben Carter, Chloe Davis
Published: 2025-11-21
Link: https://arxiv.org/pdf/2511.11422.pdf
-
Compressing Multi-Task Model for Autonomous Driving via Pruning and Knowledge Distillation
Yinglu Li, Wenbin Zhao, Zhuo Su, Yingjie Zhou, Hui Guan, Yunfeng Shang, Zhongnan Qu
Published: 2025-11-14
Link: https://arxiv.org/pdf/2511.05557.pdf
-
Do Students Debias Like Teachers? On the Distillability of Bias Mitigation Methods
A. B. Researcher, C. D. Scientist, E. F. Engineer
Published: 2025-11-07
Link: https://arxiv.org/pdf/2510.26038.pdf