Papers
Research Paper · Feb 3, 2026 · B2B
Rational ANOVA Networks
Deep neural networks typically treat nonlinearities as fixed primitives (e.g., ReLU), limiting both interpretability and the granularity of control over the induced function class. While recent additi...
5.0 viability
Research Paper · Feb 4, 2026 · B2B
From Dead Neurons to Deep Approximators: Deep Bernstein Networks as a Provable Alternative to Residual Layers
Residual connections are the de facto standard for mitigating vanishing gradients, yet they impose structural constraints and fail to address the inherent inefficiencies of piecewise linear activation...
4.0 viability
Research Paper · Feb 9, 2026
Gradient Residual Connections
Existing work has linked properties of a function's gradient to the difficulty of function approximation. Motivated by these insights, we study how gradient information can be leveraged to improve neu...
3.0 viability