Higher-Order Modular Attention: Fusing Pairwise and Triadic Interactions for Protein Sequences
arXiv:2603.11133v1 Announce Type: new Abstract: Transformer self-attention computes pairwise token interactions, yet protein sequence-to-phenotype relationships often involve cooperative dependencies among three or more …
Shirin Amiraslani, Xin Gao
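The abstract's contrast between pairwise and triadic interactions can be made concrete in code. The sketch below is illustrative only and is not the paper's method: it computes standard pairwise attention scores, then a hypothetical rank-3 score tensor over token triples (i, j, k) as one possible form a triadic term could take.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
L, d = 5, 8  # sequence length, head dimension
Q, K, V = (rng.standard_normal((L, d)) for _ in range(3))

# Standard pairwise self-attention: scores[i, j] = <q_i, k_j> / sqrt(d).
pair_scores = Q @ K.T / np.sqrt(d)
pair_out = softmax(pair_scores) @ V  # (L, d)

# Hypothetical triadic term (an assumption, not the paper's formulation):
# a rank-3 score tensor over token triples, normalized per query token.
tri_scores = np.einsum('id,jd,kd->ijk', Q, K, K) / d
tri_weights = softmax(tri_scores.reshape(L, L * L)).reshape(L, L, L)
tri_out = np.einsum('ijk,jd,kd->id', tri_weights, V, V)  # (L, d)

print(pair_out.shape, tri_out.shape)
```

Note that the triadic tensor costs O(L^3) memory and time, which is why higher-order attention methods typically rely on factorized or modular approximations rather than materializing it directly.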