Leave-One-Out Cross-Validation (LOOCV).
This is a special case of k-fold cross-validation in which k = n, the
number of data points. At each iteration, a single data point from the
original dataset is held out for model validation while the remaining
n - 1 points are used to build the model. As a result, the procedure
runs once for every data point in the sample.
The advantage of this method is that it has negligible bias, since
almost the entire dataset is used to build the model each time. Its
major disadvantage is that only one data point is used to validate the
model at each iteration, resulting in a high variance in the estimates
of the model's performance, particularly when the dataset contains
multiple outliers. In addition, this method is computationally very
intensive, particularly when the dataset is large [4].
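As a minimal sketch of the procedure, the snippet below runs LOOCV for
a linear regression on a small synthetic dataset using scikit-learn;
the data, model, and scoring metric are illustrative assumptions rather
than examples from the text. With n = 20 points, the model is fit 20
times, each time validated on a single held-out point.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Illustrative synthetic data: n = 20 points, 3 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=20)

# LeaveOneOut() yields n splits, each holding out exactly one point.
scores = cross_val_score(
    LinearRegression(), X, y,
    cv=LeaveOneOut(),
    scoring="neg_mean_squared_error",
)

print(len(scores))      # 20: one fit/validation per data point
print(-scores.mean())   # mean squared error averaged over all splits
```

The n separate model fits are what make LOOCV computationally expensive
on large datasets, and each individual score comes from a single point,
which is the source of the high variance discussed above.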