This example computes the cross-validation error for classification models trained by the GLML1L2 function (the GLML1L2 call specifies Family ('Binomial')). Because EvaluateTraining is 'f', the output table does not show the training error for each fold.
Input
The input table, admissions_train, is from GLM Example: Logistic Regression Analysis with Intercept.
SQL Call
SELECT * FROM CrossValidation2 (
  ON admissions_train AS InputTable
  OUT TABLE OutputTable (cv_out)
  USING
  FunctionName ('GLML1L2')
  Metric ('accuracy')
  EvaluateTraining ('f')
  IDColumn ('id')
  TargetColumns ('masters', 'gpa', 'stats', 'programming')
  CategoricalColumns ('masters', 'stats', 'programming')
  ResponseColumn ('admitted')
  Family ('Binomial')
  Alpha (0)
  RegularizationLambda (0.02)
) AS dt;
Output
    fold_num   | validation_accuracy
---------------+---------------------
 best_score    | 0.75
 average_score | 0.625
 1             | 0.75
 2             | 0.75
 3             | 0.625
 4             | 0.75
 5             | 0.25
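The best_score and average_score rows summarize the per-fold validation accuracies: best_score is the maximum and average_score is the mean. A minimal sketch of that summary computation, with the fold scores copied from the table above (plain Python, not part of the SQL interface):

```python
# Per-fold validation accuracies for folds 1-5, from the output table above
fold_scores = [0.75, 0.75, 0.625, 0.75, 0.25]

# best_score is the maximum fold accuracy
best_score = max(fold_scores)

# average_score is the mean fold accuracy
average_score = sum(fold_scores) / len(fold_scores)

print(best_score)     # 0.75
print(average_score)  # 0.625
```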
SELECT * FROM cv_out;
   attribute    | category |      estimate      | information
----------------+----------+--------------------+-------------
 (Intercept)    |          | 0.335012994511237  | p
 masters        | yes      | -1.28304964430463  | p
 stats          | beginner | 0.0904655970263125 | p
 stats          | novice   | 0.327897495296943  | p
 programming    | beginner | -1.05600969848353  | p
 programming    | novice   | 0.625366583472179  | p
 gpa            |          | 0.365983318212613  | p
 Family         |          |                    | Binomial
 Regularization |          |                    | Ridge
 Alpha          |          | 0                  |
 Lambda         |          | 0.02               |
 Iterations #   |          | 16                 |
 Converged      |          |                    | true
 Rows #         |          | 32                 |
 Features #     |          | 7                  |
 AIC            |          | 15.3721755866811   |
 BIC            |          | 25.6323269062792   |
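The AIC and BIC rows in the model table are consistent with the standard definitions AIC = 2k - 2 ln L and BIC = k ln(n) - 2 ln L, using the reported Features # (k = 7) and Rows # (n = 32). A quick consistency check in plain Python, assuming GLML1L2 uses these standard formulas (the log-likelihood is back-solved from the reported AIC rather than taken from the function output):

```python
import math

k = 7                     # Features # from the model table
n = 32                    # Rows # from the model table
aic = 15.3721755866811    # AIC from the model table

# Back-solve the log-likelihood from AIC = 2*k - 2*logL
log_l = (2 * k - aic) / 2

# BIC = k*ln(n) - 2*logL should then match the reported BIC
bic = k * math.log(n) - 2 * log_l
print(bic)  # approximately 25.6323269062792
```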