ScienceToStartup
Sparse Mixture of Experts (SMoE)
Sparse Mixture of Experts (SMoE) is a topic in our research taxonomy. An SMoE layer replaces a single dense feed-forward block with many expert subnetworks plus a learned router that activates only the top-k experts for each token, so the model's parameter count can grow far beyond its per-token compute cost.
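The routing idea can be sketched in a few lines. This is a minimal illustration, not any particular paper's implementation; the function and parameter names (`smoe_forward`, `gate_w`, `experts`, `k`) are hypothetical, and NumPy stands in for a real deep-learning framework.

```python
import numpy as np

def smoe_forward(x, gate_w, experts, k=2):
    """Route each token to its top-k experts and combine their outputs.

    x:       (tokens, d_model) token activations
    gate_w:  (d_model, n_experts) router weights
    experts: list of callables, each mapping (d_model,) -> (d_model,)
    """
    logits = x @ gate_w                          # router scores: (tokens, n_experts)
    topk = np.argsort(logits, axis=-1)[:, -k:]   # indices of top-k experts per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = logits[t, topk[t]]
        weights = np.exp(sel - sel.max())
        weights /= weights.sum()                 # softmax over only the selected experts
        for w, e in zip(weights, topk[t]):
            out[t] += w * experts[e](x[t])       # only k of n_experts run per token
    return out

# Toy usage: 4 tokens, model dim 8, 4 experts, top-2 routing.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
gate_w = rng.standard_normal((8, 4))
experts = [lambda v, W=rng.standard_normal((8, 8)): v @ W for _ in range(4)]
y = smoe_forward(x, gate_w, experts, k=2)
print(y.shape)  # (4, 8)
```

The key property is in the inner loop: each token's output mixes only its k selected experts, which is what makes the layer "sparse" relative to running all experts densely.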
Related papers
ECO: Quantized Training without Full-Precision Master Weights