Major upgrade introducing Kakao's first mixture-of-experts (MoE) model (15.7B total parameters, 3B active per token, roughly 37% of the FLOPs of the dense 8B model) alongside dense 2.1B and 8B models. 32K native context, extendable to 128K via YaRN. Apache 2.0 for the 2.1B and 8B; custom license for the MoE.
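As a sketch of the 128K extension, the snippet below enables YaRN RoPE scaling through the Hugging Face transformers config. The repo id `kakaocorp/kanana-1.5-8b-instruct-2505` and the 32,768-token native window are assumptions; check the published model card and `config.json` for the exact values.

```python
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "kakaocorp/kanana-1.5-8b-instruct-2505"  # assumed repo id

config = AutoConfig.from_pretrained(MODEL_ID)
# YaRN: stretch the native 32K RoPE window by 4x to reach 128K.
config.rope_scaling = {
    "rope_type": "yarn",
    "factor": 4.0,
    "original_max_position_embeddings": 32768,  # assumed native window
}
config.max_position_embeddings = 131072

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    config=config,
    torch_dtype="auto",
    device_map="auto",
)
```

Note that static YaRN scaling is applied to all inputs, including short ones, so it is usually enabled only when long prompts are expected.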

The 8B Instruct model scores MT-Bench 7.76, MATH 67.54, GSM8K 87.64, and IFEval 80.11. Relative to Kanana 1.0: HumanEval +53%, function calling +234%.
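To exercise the function-calling path, here is a minimal sketch using transformers' chat-template tool support. The `get_weather` tool is hypothetical, the repo id is assumed as above, and whether Kanana's chat template accepts a `tools` argument is also an assumption.

```python
from transformers import AutoTokenizer

MODEL_ID = "kakaocorp/kanana-1.5-8b-instruct-2505"  # assumed repo id

def get_weather(city: str) -> str:
    """
    Return the current weather for a city.

    Args:
        city: Name of the city to look up.
    """
    return "sunny, 21C"  # stub for illustration

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
# The template turns the function signature/docstring into a JSON tool schema.
prompt = tokenizer.apply_chat_template(
    [{"role": "user", "content": "What's the weather in Seoul?"}],
    tools=[get_weather],
    add_generation_prompt=True,
    tokenize=False,
)
print(prompt)  # inspect the rendered prompt, including the injected tool schema
```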

Model Details

Architecture: MoE
Total parameters: 15.7B
Active parameters: 3B
Context window: 32K tokens (128K with YaRN)

Variants

| Name | Parameters | Notes |
|---|---|---|
| Kanana-1.5-2.1B | 2.1B | |
| Kanana-1.5-8B | 8B | |
| Kanana-1.5-15.7B-A3B | 15.7B (3B active) | First Kakao MoE |
Tags: open-weight, moe, multilingual
