Scientific multimodal mixture-of-experts (MoE) foundation model with 241B total parameters (28B active). Trained on 5 trillion tokens, including more than 2.5T from scientific domains. Achieves top-tier general reasoning and significantly outperforms other open-source models on scientific tasks such as molecular synthesis, reaction prediction, and crystal stability analysis.

Model Details

Architecture: MoE
Total parameters: 241B
Active parameters: 28B
Tags: science, frontier, moe, reasoning, open-weight
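
Because only 28B of the 241B parameters are active for any given input, a brief sketch of top-k expert routing may help illustrate how an MoE layer keeps active compute small relative to total capacity. This is a minimal, illustrative sketch only; the hidden sizes, expert count, and top-k value below are assumptions and do not reflect this model's actual configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Minimal top-k mixture-of-experts layer (illustrative sizes only)."""
    def __init__(self, d_model=64, d_ff=256, n_experts=8, top_k=2):
        super().__init__()
        # Each expert is a small feed-forward block; only top_k of them run per token.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        ])
        self.router = nn.Linear(d_model, n_experts)
        self.top_k = top_k

    def forward(self, x):                          # x: (tokens, d_model)
        scores = self.router(x)                    # routing logits per expert
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)       # normalize over selected experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e              # tokens routed to expert e at slot k
                if mask.any():
                    out[mask] += weights[mask, k:k+1] * expert(x[mask])
        return out

x = torch.randn(4, 64)
layer = TopKMoE()
print(layer(x).shape)  # torch.Size([4, 64])
```

With 2 of 8 experts selected per token in this toy example, each token touches only a fraction of the layer's weights, which is the same mechanism that lets a 241B-parameter MoE activate roughly 28B parameters per token.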
