This article discusses Time-MoE, an open-source time-series foundation model that uses a sparse Mixture-of-Experts (MoE) architecture to improve forecasting accuracy while keeping computational cost low, since only a subset of experts is activated per input. Key contributions include the Time-300B dataset, scaling laws for time series forecasting, and the Time-MoE architecture itself.
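The compute savings of a sparse MoE come from routing each input token to only a few experts rather than running every expert. A minimal NumPy sketch of top-k expert routing (illustrative shapes, random weights, and top-2 gating; this is not Time-MoE's actual implementation):

```python
import numpy as np

def moe_forward(x, gate_w, expert_ws, k=2):
    """Sparse MoE layer: each token is processed by only its top-k experts,
    weighted by a softmax over the selected gate logits."""
    logits = x @ gate_w                          # (tokens, n_experts) gate scores
    topk = np.argsort(logits, axis=-1)[:, -k:]   # top-k expert indices per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = logits[t, topk[t]]
        weights = np.exp(sel - sel.max())
        weights /= weights.sum()                 # softmax over the k chosen experts only
        for w, e in zip(weights, topk[t]):
            out[t] += w * (x[t] @ expert_ws[e])  # each expert is a simple linear map here
    return out

rng = np.random.default_rng(0)
d, n_experts, tokens = 8, 4, 5
x = rng.normal(size=(tokens, d))
gate_w = rng.normal(size=(d, n_experts))         # hypothetical gating weights
expert_ws = rng.normal(size=(n_experts, d, d))   # hypothetical expert weights
y = moe_forward(x, gate_w, expert_ws)
print(y.shape)  # (5, 8)
```

With `k=2` of 4 experts active, each token touches only half of the expert parameters per forward pass, which is the source of the cost reduction the article describes.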