Model Validation Methods
The commonly used RMSE \citep{Fox1981} quantifies the differences between predicted and observed values, and thus indicates how far the forecasts are from the actual data. A few major outliers in the series can skew the RMSE substantially, because the effect of each deviation on the RMSE is proportional to the squared error. The MASE \citep*{Hyndman2006}, an overall nondimensional measure of forecast accuracy, is less sensitive to outliers than the RMSE. The MASE is recommended for assessing the comparative accuracy of forecasts \citep{Franses2016} because it evaluates the performance of forecasts relative to a benchmark. It is calculated as the mean absolute difference between the forecast and the actual value, divided by a scale obtained from a random-walk (naïve) reference model fitted to the history prior to the period of data held back from training of the model \(Y\left(t\right)\). A value of \(\mathrm{MASE}<1\) indicates that the forecast model is superior to a random walk.
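For reference, the two measures described above can be written as follows (notation introduced here for illustration): with observations \(Y_t\), forecasts \(\hat{Y}_t\) over a hold-out period of length \(h\), and a training history of length \(n\),
\begin{equation*}
\mathrm{RMSE}=\sqrt{\frac{1}{h}\sum_{t=1}^{h}\left(\hat{Y}_t-Y_t\right)^2},\qquad
\mathrm{MASE}=\frac{\dfrac{1}{h}\sum_{t=1}^{h}\left|\hat{Y}_t-Y_t\right|}{\dfrac{1}{n-1}\sum_{t=2}^{n}\left|Y_t-Y_{t-1}\right|},
\end{equation*}
where the MASE denominator is the in-sample mean absolute error of the one-step naïve (random-walk) forecast on the training history.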
The correlation coefficient \(R\) between estimates and observations \citep{Addiscott_1987}, bounded by \(-1\le R\le1\), ranges from \(-1\) (perfect anti-correlation) to \(1\) (perfect correlation). It assesses linear relationships, in that forecasted values may show a continuous increase or decrease as actual values increase or decrease. Its magnitude, however, is not consistently related to the accuracy of the estimates.
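As an illustration (with notation assumed here, not taken from the cited source), the correlation between observations \(Y_t\) and forecasts \(\hat{Y}_t\) is
\begin{equation*}
R=\frac{\sum_{t}\left(Y_t-\bar{Y}\right)\left(\hat{Y}_t-\bar{\hat{Y}}\right)}{\sqrt{\sum_{t}\left(Y_t-\bar{Y}\right)^{2}}\,\sqrt{\sum_{t}\left(\hat{Y}_t-\bar{\hat{Y}}\right)^{2}}},
\end{equation*}
where \(\bar{Y}\) and \(\bar{\hat{Y}}\) denote the respective means. A systematically biased forecast such as \(\hat{Y}_t=Y_t+c\) with \(c\neq0\) attains \(R=1\) even though every prediction is in error by \(c\), which is why \(R\) alone cannot certify accuracy.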
Forecasts were produced online by means of the simulation model, with the support of an Excel spreadsheet. The statistics were assessed interactively using the online statistical software STATGRAPHICS Online and the WESSA R--JAVA web service \citep{Wessa2012}.