Built entirely in-house rather than on an existing base model such as Qwen, A.X-3.1 has 34B parameters and 48 layers and was trained on 2.1T tokens on SKT's TITAN supercomputer. A smaller variant, A.X-3.1-Light (7B), is also available. Both are released under the Apache-2.0 license.

Model Details

Architecture: Dense
Parameters: 34B
Tags: open-weight, multilingual