Dataset Distillation Comparison Hub
6 papers - avg viability 6.7
Top Papers
- From Fewer Samples to Fewer Bits: Reframing Dataset Distillation as Joint Optimization of Precision and Compactness (8.0)
QuADD is a quantization-aware dataset distillation framework that jointly optimizes the compactness and numerical precision of synthetic data for efficient machine learning (a hedged sketch follows the list).
- Effective Dataset Distillation for Spatio-Temporal Forecasting with Bi-dimensional Compression (7.0)
STemDist is a dataset distillation method for spatio-temporal forecasting that compresses the data along both the spatial and temporal dimensions.
- EVLF: Early Vision-Language Fusion for Generative Dataset Distillation (7.0)
EVLF improves generative dataset distillation by fusing visual and textual embeddings early in the generative process, producing more visually coherent synthetic data for downstream classification.
- HIERAMP: Coarse-to-Fine Autoregressive Amplification for Generative Dataset Distillation (7.0)
HIERAMP leverages a vision autoregressive model to amplify hierarchical semantics in dataset distillation, improving validation performance by focusing on discriminative parts and structures.
- Difficulty-guided Sampling: Bridging the Target Gap between Dataset Distillation and Downstream Tasks (6.0)
A method that augments dataset distillation with difficulty-guided sampling, bridging the gap between the distillation objective and downstream task training (see the sketch after this list).
- Towards Principled Dataset Distillation: A Spectral Distribution Perspective (5.0)
A dataset distillation approach grounded in a spectral distribution perspective that produces compact synthetic sets for imbalanced data, improving model training stability and performance.
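
Illustrative sketch: quantization-aware distillation. The following is a minimal, hedged reading of the idea behind QuADD's joint precision-and-compactness objective, not the paper's actual algorithm: synthetic images are optimized under a standard distribution-matching loss while being uniformly quantized to a low bit-width via a straight-through estimator. The names `quantize_ste` and `distill_step`, the fixed bit-width, and the feature-matching loss are all assumptions made for illustration.

```python
# Hedged sketch: quantization-aware dataset distillation via distribution
# matching plus a straight-through estimator (STE). Illustrative assumption,
# not QuADD's published algorithm.
import torch
import torch.nn.functional as F

def quantize_ste(x: torch.Tensor, bits: int) -> torch.Tensor:
    """Uniformly quantize x (assumed in [0, 1]) to 2**bits levels.
    The forward pass uses the quantized values; gradients flow straight through."""
    levels = 2 ** bits - 1
    x_q = torch.round(x.clamp(0.0, 1.0) * levels) / levels
    return x + (x_q - x).detach()

def distill_step(synthetic: torch.Tensor, real_batch: torch.Tensor,
                 encoder: torch.nn.Module, opt: torch.optim.Optimizer,
                 bits: int = 4) -> float:
    """One step: match mean features of the quantized synthetic set to mean
    features of a real batch, then update the learnable synthetic pixels."""
    opt.zero_grad()
    feat_syn = encoder(quantize_ste(synthetic, bits)).mean(dim=0)
    with torch.no_grad():
        feat_real = encoder(real_batch).mean(dim=0)
    loss = F.mse_loss(feat_syn, feat_real)
    loss.backward()  # STE lets the quantization step pass gradients
    opt.step()
    return loss.item()
```

In this reading, storing the distilled set at `bits` bits per channel is what ties compactness (fewer samples) to precision (fewer bits); the actual method may instead learn the bit allocation jointly with the synthetic data.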
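
Illustrative sketch: difficulty-guided sampling. A minimal sketch of one plausible reading of difficulty-guided sampling, assuming per-sample loss under a proxy model serves as the difficulty score and the distilled budget is drawn with probability proportional to that score; the paper's actual scoring and sampling scheme may differ, and `difficulty_scores` and `sample_by_difficulty` are hypothetical names.

```python
# Hedged sketch: difficulty-guided sampling. Difficulty is proxied by
# per-sample cross-entropy under a given model; harder samples are drawn
# with higher probability. Illustrative only.
import torch
import torch.nn.functional as F

@torch.no_grad()
def difficulty_scores(model: torch.nn.Module, images: torch.Tensor,
                      labels: torch.Tensor) -> torch.Tensor:
    """Per-sample cross-entropy as a difficulty proxy (higher = harder)."""
    return F.cross_entropy(model(images), labels, reduction="none")

def sample_by_difficulty(images: torch.Tensor, labels: torch.Tensor,
                         scores: torch.Tensor, k: int,
                         temperature: float = 1.0):
    """Draw k samples without replacement, with probability proportional to
    softmax(score / temperature), biasing the budget toward harder examples."""
    probs = torch.softmax(scores / temperature, dim=0)
    idx = torch.multinomial(probs, k, replacement=False)
    return images[idx], labels[idx]
```

The temperature controls how aggressively the budget skews toward hard samples; at high temperature the draw approaches uniform sampling.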