Retrieval-Augmented Generation with Covariate Time Series



Founder's Pitch

"RAG4CTS provides a cutting-edge, training-free framework for anomaly detection in industrial time-series applications like predictive maintenance."

Industrial AI · Score: 8
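The pitch above describes a training-free, retrieval-augmented approach to covariate time series. As a rough illustration of that general idea (not the paper's actual method — the function names, the joint target-plus-covariate retrieval key, and the mean-of-continuations rule are all assumptions for the sketch), a minimal zero-shot retriever might look like:

```python
import numpy as np

def rag_forecast(target_hist, covariate_hist,
                 pool_targets, pool_covariates, pool_continuations, k=3):
    """Zero-shot forecast: mean continuation of the k nearest pool windows.

    target_hist:        (L,)      current target window
    covariate_hist:     (L, C)    covariates aligned with the target window
    pool_targets:       (N, L)    historical target windows
    pool_covariates:    (N, L, C) covariates for each pool window
    pool_continuations: (N, H)    values that followed each pool window
    """
    # Build one retrieval key per window by flattening target + covariates,
    # so similarity reflects both the series and its exogenous context.
    query = np.concatenate([target_hist, covariate_hist.ravel()])
    keys = np.concatenate(
        [pool_targets, pool_covariates.reshape(len(pool_targets), -1)], axis=1)
    dists = np.linalg.norm(keys - query, axis=1)  # Euclidean retrieval
    nearest = np.argsort(dists)[:k]
    # Training-free aggregation: average what followed the retrieved windows.
    return pool_continuations[nearest].mean(axis=0)

# Toy usage: two pool windows match the query context, one does not.
pool_t = np.array([[0., 0.], [0., 0.], [9., 9.]])
pool_c = np.array([[[0.], [0.]], [[0.], [0.1]], [[5.], [5.]]])
pool_y = np.array([[1.], [1.], [7.]])
forecast = rag_forecast(np.array([0., 0.]), np.array([[0.], [0.]]),
                        pool_t, pool_c, pool_y, k=2)
print(forecast)  # mean of the two matching continuations
```

Real systems in this space (e.g. TS-RAG, TimeRAG in the literature) typically replace the Euclidean lookup with learned embeddings and feed the retrieved windows into a foundation model rather than averaging them; this sketch only shows the retrieve-then-aggregate skeleton.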

Commercial Viability Breakdown (0-10 scale)

  Category             Signals   Score
  High Potential       2/4       5
  Quick Build          4/4       10
  Series A Potential   4/4       10

Sources used for this analysis:

- arXiv Paper: full-text PDF analysis of the research paper
- GitHub Repository: code availability, stars, and contributor activity
- Citation Network: Semantic Scholar citations and co-citation patterns
- Community Predictions: crowd-sourced unicorn probability assessments

Analysis model: GPT-4o · Last scored: 3/5/2026

