Rational Neural Networks have Expressivity Advantages


