Papers
Research Paper·Mar 2, 2026
From Fewer Samples to Fewer Bits: Reframing Dataset Distillation as Joint Optimization of Precision and Compactness
Dataset Distillation (DD) compresses large datasets into compact synthetic ones that maintain training performance. However, current methods mainly target sample reduction, with limited consideration ...
8.0 viability
Research Paper·Jan 15, 2026
Difficulty-guided Sampling: Bridging the Target Gap between Dataset Distillation and Downstream Tasks
In this paper, we propose difficulty-guided sampling (DGS) to bridge the target gap between the distillation objective and the downstream task, thereby improving the performance of dataset distillation...
6.0 viability
Research Paper·Mar 2, 2026
Towards Principled Dataset Distillation: A Spectral Distribution Perspective
Dataset distillation (DD) aims to compress large-scale datasets into compact synthetic counterparts for efficient model training. However, existing DD methods exhibit substantial performance degradation...
5.0 viability