Geometry-Preserving Aggregation for Mixture-of-Experts Embedding Models
arXiv:2602.14039v1 Announce Type: new
Abstract: Mixture-of-Experts (MoE) embedding models combine expert outputs using weighted linear summation, implicitly assuming a linear subspace structure in the embedding …
Sajjad Kachuee, Mohammad Sharifkhani
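As context for the aggregation the abstract critiques, the sketch below shows the standard weighted linear summation of MoE expert outputs, e = Σ_i w_i E_i(x). This is a minimal illustration, not the paper's proposed method; the expert matrices, gating weights, and dimensions are all hypothetical placeholders.

```python
# Minimal sketch of standard MoE weighted-sum aggregation (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

num_experts, d_model, d_embed = 4, 16, 8

# Hypothetical linear experts standing in for learned expert networks.
experts = [rng.standard_normal((d_model, d_embed)) for _ in range(num_experts)]
gate = rng.standard_normal((d_model, num_experts))   # router/gating weights

x = rng.standard_normal(d_model)                     # one input representation
weights = softmax(x @ gate)                          # gating weights w_i, sum to 1
expert_outputs = np.stack([x @ E for E in experts])  # each expert's embedding

# Weighted linear summation: the aggregation the abstract says implicitly
# assumes a shared linear subspace structure across expert embeddings.
embedding = weights @ expert_outputs                 # e = sum_i w_i * E_i(x)
print(embedding.shape)                               # (d_embed,)
```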