This research uses a hard-cut iterative training algorithm to improve the training of the Gaussian process mixture (GPM) model. The resulting enhanced GPM (EGPM) estimates the distribution parameters as concisely and efficiently as possible. GPM models are powerful tools for data representation and forecasting because they linearly mix multiple Gaussian process (GP) models. Under the hard-cut algorithm, the posterior probabilities of the hidden indicator variables in the GPM model are rounded to 0 or 1, which simplifies the training process and reduces the computational cost: each GP is trained independently, via maximum likelihood estimation, on the samples assigned to it. The EGPM model is then applied to a short-term electric load forecasting problem and compared with various forecasting models. First, the EGPM results are compared with those of two previous GPM training algorithms, the variational and leave-one-out cross-validation (LOOCV) algorithms; the experimental results indicate that the EGPM model forecasts electric loads more accurately and reliably. The GP, support vector machine, and radial basis function network are also assessed on the same short-term electric load forecasting problem. The empirical results indicate that the proposed EGPM outperforms the other methods in terms of forecasting accuracy.
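To make the hard-cut idea concrete, the alternating procedure described above (harden the posterior indicators to 0/1, then fit each GP by maximum likelihood on its assigned samples) can be sketched as follows. This is a minimal illustrative implementation, not the paper's code: the use of scikit-learn's `GaussianProcessRegressor`, the RBF-plus-noise kernel, the random initial assignment, and the function name `hard_cut_gpm` are all assumptions made for the example.

```python
# Illustrative sketch of hard-cut training for a mixture of K Gaussian
# processes; component count, kernel, and initialization are assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def hard_cut_gpm(X, y, K=2, n_iter=5, seed=0):
    rng = np.random.default_rng(seed)
    # Random initial hard assignment of each sample to one component.
    labels = rng.integers(K, size=len(X))
    gps = [GaussianProcessRegressor(kernel=RBF() + WhiteKernel())
           for _ in range(K)]
    for _ in range(n_iter):
        # M-step: train each GP by maximum likelihood (kernel
        # hyperparameter optimization) on its own samples only.
        for k in range(K):
            idx = labels == k
            if idx.sum() >= 2:
                gps[k].fit(X[idx], y[idx])
        # Hard E-step: instead of soft posterior probabilities, each
        # sample's indicator is set to 0/1 by assigning it wholly to the
        # component with the highest Gaussian predictive log density.
        logp = np.column_stack([
            -0.5 * ((y - m) / s) ** 2 - np.log(s)   # up to a constant
            for m, s in (gp.predict(X, return_std=True) for gp in gps)
        ])
        new_labels = logp.argmax(axis=1)
        if np.array_equal(new_labels, labels):      # assignments converged
            break
        labels = new_labels
    return gps, labels
```

Because the indicators are hard, each M-step reduces to K independent single-GP maximum likelihood fits, which is the source of the training simplification and reduced computation claimed above.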