Moirai
Time series foundation model. 14M, 91M, and 311M parameter variants trained on the LOTSA dataset (27B observations from 9 domains). Uses any-variate attention and multiple input/output projections for universal forecasting.
Ranked #1 on the GIFT-Eval leaderboard for time series forecasting. Published at ICML 2024. Moirai 2.0 (2025) switched to a decoder-only architecture. By Woo, Liu, Kumar et al. Apache 2.0 license.
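The any-variate attention mentioned above flattens all variates of a multivariate series into a single token sequence and uses a binary attention bias to tell same-variate pairs from cross-variate pairs. A minimal single-head numpy sketch of that idea (not the actual Moirai implementation; it omits multi-head structure, rotary time embeddings, and the multi-patch-size projections, and the weight/bias names here are illustrative):

```python
import numpy as np

def any_variate_attention(x, w_q, w_k, w_v, bias_same, bias_diff):
    """Single-head attention over a flattened multivariate series.

    x: array of shape (V, T, d) -- V variates, T time steps, d-dim tokens.
    Tokens from all variates are flattened into one sequence of length V*T,
    and a binary attention bias (bias_same vs bias_diff) marks whether a
    query-key pair comes from the same variate -- a simplified stand-in for
    Moirai's any-variate attention.
    """
    V, T, d = x.shape
    tokens = x.reshape(V * T, d)
    variate_id = np.repeat(np.arange(V), T)      # variate index per token
    q, k, v = tokens @ w_q, tokens @ w_k, tokens @ w_v
    scores = q @ k.T / np.sqrt(d)
    same = variate_id[:, None] == variate_id[None, :]
    scores = scores + np.where(same, bias_same, bias_diff)
    # numerically stable softmax over the full V*T sequence
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return (weights @ v).reshape(V, T, d)

# Example: 3 variates, 4 time steps, 8-dim embeddings.
rng = np.random.default_rng(0)
V, T, d = 3, 4, 8
x = rng.normal(size=(V, T, d))
w_q, w_k, w_v = (rng.normal(size=(d, d)) for _ in range(3))
out = any_variate_attention(x, w_q, w_k, w_v, bias_same=1.0, bias_diff=-1.0)
print(out.shape)  # (3, 4, 8)
```

Because every token attends across all variates and time steps, the same weights handle any number of variates, which is what lets one model serve univariate and multivariate tasks alike.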
Model Details
Architecture DENSE
Parameters 311M
Paper
arXiv: 2402.02592
Venue: ICML 2024