Neural Network Training Comparison Hub
4 papers, average viability 4.3
Top Papers
- Multilevel Training for Kolmogorov Arnold Networks (6.0)
Multilevel training algorithms substantially accelerate the training of Kolmogorov Arnold Networks while improving model performance.
- SAL: Selective Adaptive Learning for Backpropagation-Free Training with Sparsification (5.0)
Selective Adaptive Learning trains networks without backpropagation, combining it with sparsification to potentially improve scalability.
- Learning with Boolean threshold functions (4.0)
Proposes a tool for training sparse Boolean networks on tasks where traditional methods fall short, with a focus on interpretability and efficient inference (a minimal Boolean threshold unit is sketched after this list).
- Joint Training Across Multiple Activation Sparsity Regimes (2.0)
Proposes a training strategy that lets a single network generalize across both dense and sparse activation regimes (see the second sketch below).
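The summaries above do not spell out the papers' methods, so the sketches that follow only illustrate the underlying concepts, not the algorithms from the papers themselves. First, a Boolean threshold function in the classical sense: a unit over Boolean inputs that fires exactly when a weighted sum of the inputs meets a threshold. The `boolean_threshold` helper and the majority-vote example are hypothetical illustrations.

```python
import numpy as np

def boolean_threshold(x: np.ndarray, w: np.ndarray, theta: float) -> int:
    """Boolean threshold function: outputs 1 iff the weighted sum of
    Boolean inputs reaches the threshold theta, else 0."""
    return int(np.dot(w, x) >= theta)

# Example: the 3-input majority function as a threshold function
# (unit weights, threshold 2: fires iff at least two inputs are 1).
w = np.array([1.0, 1.0, 1.0])
for x in [(0, 0, 0), (1, 0, 1), (1, 1, 1)]:
    print(x, "->", boolean_threshold(np.array(x), w, theta=2.0))
# (0, 0, 0) -> 0
# (1, 0, 1) -> 1
# (1, 1, 1) -> 1
```

Networks built from such units are attractive for interpretability (each unit is a readable voting rule) and for inference cost (integer comparisons instead of floating-point nonlinearities), which matches the stated focus of the paper.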
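Second, a minimal sketch of what joint training across activation sparsity regimes could look like, assuming top-k magnitude masking with a sparsity level sampled per batch so that one set of weights sees dense and sparse activations alike. The `topk_mask` helper, the regime list, and the loop structure are assumptions for illustration, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

def topk_mask(a: np.ndarray, keep_frac: float) -> np.ndarray:
    """Zero out all but the largest-magnitude `keep_frac` fraction of
    activations in each row (keep_frac = 1.0 leaves the row dense)."""
    k = max(1, int(round(keep_frac * a.shape[-1])))
    idx = np.argsort(np.abs(a), axis=-1)[..., -k:]   # indices of top-k entries
    mask = np.zeros_like(a)
    np.put_along_axis(mask, idx, 1.0, axis=-1)
    return a * mask

# Hypothetical joint-training loop: sample a sparsity regime per batch.
regimes = [1.0, 0.5, 0.1]            # fraction of activations kept
for step in range(3):
    h = rng.standard_normal((4, 8))  # stand-in for a hidden-layer activation
    keep = rng.choice(regimes)
    h_sparse = topk_mask(h, keep)
    # ... forward through remaining layers, compute loss, update weights ...
```

The design intent of such a scheme is that the trained network can be deployed at any of the sparsity levels it saw during training, trading accuracy for compute at inference time without retraining.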