Long-Context Encoder Models for Polish Language Understanding

BUILDER'S SANDBOX

Build This Paper

Use an AI coding agent to implement this research. Estimated $9K - $13K over 6-10 weeks.

OpenAI Codex (AI Agent): Lightweight coding agent in your terminal.
Claude Code (AI Agent): Agentic coding tool for terminal workflows.
AntiGravity IDE (Scaffolding): AI agent mindset installer and workflow scaffolder.
Cursor (IDE): AI-first code editor built on VS Code.
VS Code (IDE): Free, open-source editor by Microsoft.



Founder's Pitch

"A high-quality Polish language model designed for long-document understanding, outperforming existing solutions."

NLP · Score: 7

Commercial Viability Breakdown (0-10 scale)

High Potential: 5/10 (2/4 signals)
Quick Build: 10/10 (4/4 signals)
Series A Potential: 0/10 (0/4 signals)

Sources used for this analysis

arXiv Paper: Full-text PDF analysis of the research paper.
GitHub Repository: Code availability, stars, and contributor activity.
Citation Network: Semantic Scholar citations and co-citation patterns.
Community Predictions: Crowd-sourced unicorn probability assessments.

Analysis model: GPT-4o · Last scored: 3/12/2026

