BUILDER'S SANDBOX

Build This Paper

Use an AI coding agent to implement this research.

OpenAI Codex (AI Agent)

Lightweight coding agent in your terminal.

Claude Code (AI Agent)

Agentic coding tool for terminal workflows.

AntiGravity IDE (Scaffolding)

AI agent mindset installer and workflow scaffolder.

Cursor (IDE)

AI-first code editor built on VS Code.

VS Code (IDE)

Free, open-source editor by Microsoft.

Estimated $9K - $13K over 6-10 weeks.

See exactly what it costs to build this, alongside three comparable funded startups.



Founder's Pitch

"Develop smaller, more efficient time series models for zero-shot forecasting via interleaved convolutional and RNN layers."

Time Series Forecasting · Score: 2
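The pitch proposes interleaving convolutional and recurrent layers for lightweight zero-shot forecasting. As a rough illustration of what "interleaved" could mean architecturally, here is a minimal NumPy forward-pass sketch: a causal 1D convolution alternating with a simple tanh RNN, ending in a linear forecast head. All layer sizes, the two-block depth, and the random initialization are illustrative assumptions, not the paper's actual design.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w, b):
    """Causal 1D convolution. x: (T, C_in), w: (K, C_in, C_out), b: (C_out,)."""
    K, _, c_out = w.shape
    T = x.shape[0]
    # Left-pad so each output step only sees past and current inputs.
    x_pad = np.vstack([np.zeros((K - 1, x.shape[1])), x])
    out = np.zeros((T, c_out))
    for t in range(T):
        window = x_pad[t:t + K]                      # (K, C_in)
        out[t] = np.einsum("kc,kco->o", window, w) + b
    return np.tanh(out)

def rnn(x, w_x, w_h, b):
    """Simple tanh RNN scanned over time. x: (T, C), hidden size = w_h.shape[0]."""
    h = np.zeros(w_h.shape[0])
    hs = []
    for t in range(x.shape[0]):
        h = np.tanh(x[t] @ w_x + h @ w_h + b)
        hs.append(h)
    return np.stack(hs)

# Interleave two conv/RNN blocks, then a linear one-step-ahead forecast head.
T, d = 32, 8
series = rng.standard_normal((T, 1))                 # univariate input series
x = conv1d(series, rng.standard_normal((3, 1, d)) * 0.1, np.zeros(d))
x = rnn(x, rng.standard_normal((d, d)) * 0.1, rng.standard_normal((d, d)) * 0.1, np.zeros(d))
x = conv1d(x, rng.standard_normal((3, d, d)) * 0.1, np.zeros(d))
x = rnn(x, rng.standard_normal((d, d)) * 0.1, rng.standard_normal((d, d)) * 0.1, np.zeros(d))
forecast = x[-1] @ rng.standard_normal((d, 1))       # next-step prediction, shape (1,)
```

The convolution captures local patterns cheaply while the recurrent layer carries longer-range state, which is the usual rationale for mixing the two in small forecasting models.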

Commercial Viability Breakdown (0-10 scale)

High Potential: 0 (0/4 signals)
Quick Build: 2.5 (1/4 signals)
Series A Potential: 0 (0/4 signals)

Sources used for this analysis

arXiv Paper

Full-text PDF analysis of the research paper

GitHub Repository

Code availability, stars, and contributor activity

Citation Network

Semantic Scholar citations and co-citation patterns

Community Predictions

Crowd-sourced unicorn probability assessments

Analysis model: GPT-4o · Last scored: 2/19/2026
