BUILDER'S SANDBOX

Build This Paper

Use an AI coding agent to implement this research.

OpenAI Codex (AI Agent): Lightweight coding agent in your terminal.

Claude Code (AI Agent): Agentic coding tool for terminal workflows.

AntiGravity IDE (Scaffolding): AI agent mindset installer and workflow scaffolder.

Cursor (IDE): AI-first code editor built on VS Code.

VS Code (IDE): Free, open-source editor by Microsoft.

Estimated $9K–$13K over 6–10 weeks.

See exactly what it costs to build this, with 3 comparable funded startups.

7-day free trial. Cancel anytime.



Founder's Pitch

"A framework for providing context-aware explanations of Transformer models' decisions."

Explainable AI · Score: 3

Commercial Viability Breakdown (0–10 scale)

High Potential: 0/4 signals · Score: 0
Quick Build: 1/4 signals · Score: 2.5
Series A Potential: 0/4 signals · Score: 0

Sources used for this analysis

arXiv Paper: Full-text PDF analysis of the research paper
GitHub Repository: Code availability, stars, and contributor activity
Citation Network: Semantic Scholar citations and co-citation patterns
Community Predictions: Crowd-sourced unicorn probability assessments

Analysis model: GPT-4o · Last scored: 2/18/2026
