Mixture-of-Experts (MoE)

Mixture-of-Experts (MoE) is a neural network architecture in which a layer contains several "expert" subnetworks plus a learned router (gating network). For each input token, the router selects a small subset of experts (often the top-k by gate score) and combines their outputs, weighted by the gate values. Because only a few experts run per token, an MoE model can have far more parameters than a dense model at similar per-token compute cost; the idea underpins large sparse models such as the Switch Transformer and Mixtral.
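
The routing idea can be sketched in a few lines of NumPy. This is a toy illustration under simplifying assumptions, not a production implementation: each expert is just a linear map, the router is a single weight matrix, the names (`MoELayer`, `n_experts`, `k`) are made up for the example, and there is no load balancing or training logic.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class MoELayer:
    """Toy MoE layer: a learned router sends each token to its
    top-k experts; outputs are gate-weighted sums. Illustrative only."""
    def __init__(self, d_model, n_experts, k):
        self.k = k
        # Router: maps a token to one score (logit) per expert.
        self.router = rng.normal(0.0, 0.02, (d_model, n_experts))
        # Each expert is a simple linear map d_model -> d_model.
        self.experts = [rng.normal(0.0, 0.02, (d_model, d_model))
                        for _ in range(n_experts)]

    def __call__(self, x):
        # x: (tokens, d_model)
        probs = softmax(x @ self.router)            # (tokens, n_experts)
        topk = np.argsort(probs, axis=-1)[:, -self.k:]  # top-k expert ids
        out = np.zeros_like(x)
        for t in range(x.shape[0]):
            # Renormalize gate weights over the selected experts only.
            w = probs[t, topk[t]]
            w = w / w.sum()
            for gate, e in zip(w, topk[t]):
                out[t] += gate * (x[t] @ self.experts[e])
        return out

layer = MoELayer(d_model=8, n_experts=4, k=2)
y = layer(rng.normal(size=(5, 8)))
print(y.shape)  # (5, 8)
```

Note that the compute per token scales with `k`, not with `n_experts`: adding experts grows the parameter count while the work done for any single token stays roughly constant, which is the core appeal of sparse MoE layers.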