Papers
Research Paper·Jan 29, 2026
SAL: Selective Adaptive Learning for Backpropagation-Free Training with Sparsification
Standard deep learning relies on Backpropagation (BP), which is constrained by biologically implausible weight symmetry and suffers from significant gradient interference within dense representations....
5.0 viability
Research Paper·Feb 19, 2026
Learning with Boolean threshold functions
We develop a method for training neural networks on Boolean data in which the values at all nodes are strictly $\pm 1$, and the resulting models are typically equivalent to networks whose nonzero weig...
4.0 viability
Research Paper·Mar 3, 2026
Joint Training Across Multiple Activation Sparsity Regimes
Generalization in deep neural networks remains only partially understood. Inspired by the stronger generalization tendency of biological systems, we explore the hypothesis that robust internal represe...
2.0 viability