Papers
1–2 of 2

Research Paper · Mar 10, 2026
Verifying Good Regulator Conditions for Hypergraph Observers: Natural Gradient Learning from Causal Invariance via Established Theorems
We verify that persistent observers in causally invariant hypergraph substrates satisfy the conditions of the Conant-Ashby Good Regulator Theorem. Building on Wolfram's hypergraph physics and Vanchuri...
3.0 viability
Research Paper · Mar 10, 2026
Memorization capacity of deep ReLU neural networks characterized by width and depth
This paper studies the memorization capacity of deep neural networks with ReLU activation. Specifically, we investigate the minimal size of such networks to memorize any $N$ data points in the unit ba...
2.0 viability