In implementations of fuzzy time series forecasting, the choice of interval length has an important impact on the performance of the procedure. However, in many papers the interval length has been chosen arbitrarily. Huarng developed an approach called ratio-based lengths of intervals to identify the length of intervals. In this paper, we propose a new approach that uses single-variable constrained optimization to determine the ratio for the lengths of intervals. The proposed approach is applied to two well-known time series: enrollment data at The University of Alabama and inventory demand data. The obtained results are compared with those of other methods. The proposed method produces more accurate forecasts of the future values of the time series considered. (c) 2008 Elsevier B.V. All rights reserved.