An open-source Mixture-of-Experts (MoE) model with 52B activated parameters out of 389B total, a 256K-token context window, pre-trained on 7T tokens.


Hunyuan-Large

model
Architecture MoE
Parameters 389B
Active params 52B
Context window 256K tokens
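The MoE figures above imply that only a small fraction of the total parameters are activated for each token. A minimal arithmetic sketch (illustrative only, using the card's numbers, not the model's code):

```python
# Activated-parameter fraction for an MoE model, from the card's figures.
total_params = 389e9   # total parameters (389B)
active_params = 52e9   # parameters activated per token (52B)

fraction = active_params / total_params
print(f"Activated fraction per token: {fraction:.1%}")  # ~13.4%
```

This sparsity is the point of the MoE design: per-token compute scales with the 52B active parameters, not the full 389B.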

Hunyuan-Large: An Open-Source MoE Model with 52 Billion Activated Parameters

paper

arXiv: 2411.02265

moe · open-weight · frontier