Each tree in the random forest is then trained on 2/3 of the available data. This allows an out-of-bag prediction to be made using the 1/3 of the data that was not used to train each tree. This "cross-validation" increases confidence in predictions made for other weather years. The model was trained using 500 estimators, and the independent variables included the weather data described above, as well as month, hour, day-of-week, and holiday indicator variables that remove variation unrelated to weather. The final regression fit the data well, both within and outside the training sample, as shown in Table 2.

TABLE 2: Random Forest Regression Fit

Regression Zone    Training data R²    Out-of-bag R²
CSWS               0.998               0.982
EDE                0.997               0.978
GRDA               0.993               0.951
GRDAN1             0.96                0.708
INDN               0.997               0.981
KACY               0.996               0.972
KCPL               0.997               0.981
LES                0.997               0.98
MPS                0.997               0.978
NPPD               0.997               0.977
OKGE               0.998               0.983
OPPD               0.997               0.98
SECI               0.995               0.962
SPAMKT             0.996               0.971
SPRM               0.998               0.982
SPS                0.997               0.981
WAUE               0.997               0.977
WFEC               0.997               0.976
WR                 0.997               0.982
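For illustration, the sketch below shows how a zonal load regression of this kind could be set up with scikit-learn's RandomForestRegressor. The DataFrame layout, column names, and holiday flag are assumptions rather than details from the report, and scikit-learn's bootstrap draws roughly 63% unique observations per tree rather than an exact 2/3 split.

```python
# Minimal sketch of the hourly load regression described above, assuming a
# pandas DataFrame with an hourly DatetimeIndex, a load column, and weather
# columns. All column names here are illustrative, not taken from the report.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

def fit_zone_load_model(df: pd.DataFrame, load_col: str = "load_mw",
                        weather_cols=("temperature", "dew_point", "wind_speed")):
    """Fit a random forest load regression for one zone and report fit statistics."""
    features = df[list(weather_cols)].copy()
    # Calendar indicator variables that absorb variation unrelated to weather.
    features["month"] = df.index.month
    features["hour"] = df.index.hour
    features["day_of_week"] = df.index.dayofweek
    features["is_holiday"] = df["is_holiday"].astype(int)  # assumed pre-built holiday flag

    model = RandomForestRegressor(
        n_estimators=500,   # 500 estimators, per the report
        bootstrap=True,     # each tree sees a bootstrap sample of the data
        oob_score=True,     # score each observation only with trees that did not see it
        n_jobs=-1,
        random_state=0,
    )
    model.fit(features, df[load_col])

    # Training-sample R² and out-of-bag R², analogous to the columns in Table 2.
    return model, model.score(features, df[load_col]), model.oob_score_
```

The out-of-bag R² returned here corresponds to the "Out-of-bag R²" column in Table 2: each observation is predicted only by the trees that excluded it from their training sample, giving a measure of fit outside the data each tree was trained on.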
