AI Lab Tracker
MoBA: Mixture of Block Attention for Long-Context LLMs
Type: paper
Date: 2025-02-18
Lab: Moonshot AI
Mixture of Block Attention mechanism for efficient long-context processing.
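To make the one-line description concrete: the core idea of block attention is that each query attends only to a small number of key/value blocks, selected by a gate that scores each block (for example via the query's affinity to a mean-pooled block key), rather than attending to the full context. The sketch below is a minimal, hypothetical NumPy illustration of that pattern; the function name, block size, and gating rule are assumptions for illustration, not the paper's exact method.

```python
import numpy as np

def block_attention(q, K, V, block_size=4, top_k=2):
    """Hypothetical sketch: one query attends only to the top_k key
    blocks chosen by a gate score (query . mean-pooled block keys),
    instead of the full key/value sequence."""
    n, d = K.shape
    n_blocks = n // block_size
    Kb = K[:n_blocks * block_size].reshape(n_blocks, block_size, d)
    Vb = V[:n_blocks * block_size].reshape(n_blocks, block_size, d)
    # Gate: score each block by the query's affinity to its mean key.
    gate = Kb.mean(axis=1) @ q                 # shape: (n_blocks,)
    chosen = np.argsort(gate)[-top_k:]         # indices of top-k blocks
    Ks = Kb[chosen].reshape(-1, d)             # selected keys
    Vs = Vb[chosen].reshape(-1, d)             # selected values
    # Standard scaled-dot-product attention over the selected blocks only.
    scores = Ks @ q / np.sqrt(d)
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w @ Vs                              # shape: (d,)

rng = np.random.default_rng(0)
q = rng.standard_normal(8)
K = rng.standard_normal((16, 8))
V = rng.standard_normal((16, 8))
out = block_attention(q, K, V)
print(out.shape)
```

With `block_size=4` and `top_k=2`, the query computes attention over 8 of 16 positions, which is the source of the efficiency gain as context length grows.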
Paper (arXiv)
GitHub
arXiv: 2502.13189
Tags: scaling, attention, architecture
More Links
LinkedIn: Insights from Core Developer