1 Introduction

Welcome to the ninth practical session of CS233 - Introduction to Machine Learning.
In this exercise class we will start using Machine Learning methods to solve regression problems.

In [2]:
# Useful starting lines
%matplotlib inline
import numpy as np
np.random.seed(42)
import matplotlib.pyplot as plt
%load_ext autoreload
%autoreload 2

2 Regression Problem

Let $f$ be a function $f: \mathbb{R}^d \rightarrow \mathbb{R}^v$ and let $\left(X \subseteq \mathbb{R}^d, y \subseteq \mathbb{R}^v \right)$ be a data set sampled from it. The regression problem is the task of estimating an approximation $\hat{f}$ of $f$. Within this exercise we consider the special case of $v=1$, i.e. the problem is univariate as opposed to multivariate. Specifically, we will analyze the Boston house prices data set and predict prices based on properties such as the per capita crime rate by town, the pupil-teacher ratio by town, etc.

We will model the given data by means of a linear regression model, i.e. a model that explains the dependent variable as a linear combination of the independent variables, $\hat{y} = \mathbf{X}\mathbf{w}$.

  • How does a regression problem differ from a classification problem?
    • A classification problem has discrete-valued target variables, whereas the values of target variables in regression problems may be continuous.
  • Why is the linear regression model a linear model? Is it linear in the independent variables? Is it linear in the parameters?
    • The linear regression model is linear in the parameters. With raw features it also happens to be linear in the independent variables, but it need not be (cf. feature expansion below); linearity in the parameters is what makes it a linear model. A quick numerical check follows below.
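
An illustrative sketch with toy data: linearity in the parameters means $X(\alpha \mathbf{w}_1 + \beta \mathbf{w}_2) = \alpha X\mathbf{w}_1 + \beta X\mathbf{w}_2$, and this holds even if the columns of $X$ are nonlinear transformations of the raw inputs.

X_toy = np.random.normal(size=(5, 3))   # the columns could equally be x, x**2, ...
w1 = np.random.normal(size=3)
w2 = np.random.normal(size=3)
a, b = 2.0, -0.5
# predictions are linear in the parameters w
print(np.allclose(X_toy @ (a*w1 + b*w2), a*(X_toy @ w1) + b*(X_toy @ w2)))   # True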

2.1 Load and inspect data

We load the data and split it such that 80% and 20% are train and test data, respectively. Afterwards, we normalize the data such that each feature has zero mean and unit standard deviation. Please fill in the required code and complete the function normalize.

  • Explore the relation between different features and the house prices. Describe what you see. Can you identify any trends?
    • The relations between feature values and house prices differ between feature dimensions. Some features (e.g. 4) are positively correlated with house prices, some (e.g. 11) are negatively correlated and for many others a clear trend is hard to spot by mere inspection.
In [3]:
# get the data set and print a description
from sklearn.datasets import load_boston
boston_dataset = load_boston()
print(boston_dataset.DESCR)

X = boston_dataset["data"]
y = boston_dataset["target"]

# remove categorical feature
X = np.delete(X, 3, axis=1)
# remove the second mode of the price distribution (keep prices below $40k)
ind = y<40
X = X[ind,:]
y = y[ind]

# split the data into 80% training and 20% test data
indices = np.arange(X.shape[0])
np.random.shuffle(indices)

splitRatio = 0.8
n          = X.shape[0]
X_train    = X[indices[0:int(n*splitRatio)],:] 
y_train    = y[indices[0:int(n*splitRatio)]] 
X_test     = X[indices[int(n*(splitRatio)):],:] 
y_test     = y[indices[int(n*(splitRatio)):]] 
.. _boston_dataset:

Boston house prices dataset
---------------------------

**Data Set Characteristics:**  

    :Number of Instances: 506 

    :Number of Attributes: 13 numeric/categorical predictive. Median Value (attribute 14) is usually the target.

    :Attribute Information (in order):
        - CRIM     per capita crime rate by town
        - ZN       proportion of residential land zoned for lots over 25,000 sq.ft.
        - INDUS    proportion of non-retail business acres per town
        - CHAS     Charles River dummy variable (= 1 if tract bounds river; 0 otherwise)
        - NOX      nitric oxides concentration (parts per 10 million)
        - RM       average number of rooms per dwelling
        - AGE      proportion of owner-occupied units built prior to 1940
        - DIS      weighted distances to five Boston employment centres
        - RAD      index of accessibility to radial highways
        - TAX      full-value property-tax rate per $10,000
        - PTRATIO  pupil-teacher ratio by town
        - B        1000(Bk - 0.63)^2 where Bk is the proportion of blacks by town
        - LSTAT    % lower status of the population
        - MEDV     Median value of owner-occupied homes in $1000's

    :Missing Attribute Values: None

    :Creator: Harrison, D. and Rubinfeld, D.L.

This is a copy of UCI ML housing dataset.
https://archive.ics.uci.edu/ml/machine-learning-databases/housing/


This dataset was taken from the StatLib library which is maintained at Carnegie Mellon University.

The Boston house-price data of Harrison, D. and Rubinfeld, D.L. 'Hedonic
prices and the demand for clean air', J. Environ. Economics & Management,
vol.5, 81-102, 1978.   Used in Belsley, Kuh & Welsch, 'Regression diagnostics
...', Wiley, 1980.   N.B. Various transformations are used in the table on
pages 244-261 of the latter.

The Boston house-price data has been used in many machine learning papers that address regression
problems.   
     
.. topic:: References

   - Belsley, Kuh & Welsch, 'Regression diagnostics: Identifying Influential Data and Sources of Collinearity', Wiley, 1980. 244-261.
   - Quinlan,R. (1993). Combining Instance-Based and Model-Based Learning. In Proceedings on the Tenth International Conference of Machine Learning, 236-243, University of Massachusetts, Amherst. Morgan Kaufmann.

In [4]:
def normalize(X):
    """
    Make the mean 0 and the std dev 1 for each feature of the data.
    """

    """
    Please fill in the required code here
    """

    # per-feature statistics, computed over the rows (axis 0)
    mu    = np.mean(X,0,keepdims=True)
    std   = np.std(X,0,keepdims=True)
    X     = (X-mu)/std
    return X, mu, std

#Use train stats for normalizing test set
X_train,mu_train,std_train = normalize(X_train)
X_test = (X_test-mu_train)/std_train
In [5]:
# Exploratory analysis of the data. Have a look at the distribution of prices vs features

feature = 4
plt.scatter(X_train[:,feature], y_train)
plt.xlabel(f"Attribute $X_{feature}$")
plt.ylabel("Price $y$")
plt.title(f"Attribute $X_{feature}$ vs Price $y$")
Out[5]:
Text(0.5, 1.0, 'Attribute $X_4$ vs Price $y$')
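
To inspect all features at once, one could extend this along the following lines (an illustrative sketch; after dropping the categorical feature, X_train has 12 feature columns):

fig, axes = plt.subplots(3, 4, figsize=(16, 9))
for f, ax in enumerate(axes.flat):
    ax.scatter(X_train[:, f], y_train, s=5)
    ax.set_xlabel(f"$X_{{{f}}}$")
    ax.set_ylabel("Price $y$")
fig.tight_layout()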

2.2 Closed-form solution for linear regression

The linear regression model has an analytical solution. Please use this solution to complete the function get_w_analytical and to obtain the weight parameters $w$. Tip: You may want to use the function np.linalg.solve.

  • What is the time complexity of this approach?
    • Let N be the number of data points and D the number of features. The cost is dominated by forming $X^TX$, which is $\mathcal{O}(ND^2)$, and by solving the resulting $D \times D$ linear system, which is $\mathcal{O}(D^3)$, for an overall $\mathcal{O}(ND^2 + D^3)$ (the normal equations are recalled below).
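
For reference, minimizing the MSE $L(\mathbf{w}) = \frac{1}{N}\|\mathbf{y}-\mathbf{X}\mathbf{w}\|^2$ and setting its gradient to zero yields the normal equations:

\begin{align} \nabla L(\mathbf{w}) = -\frac{2}{N}\mathbf{X}^T(\mathbf{y}-\mathbf{X}\mathbf{w}) = 0 \quad\Rightarrow\quad \mathbf{X}^T\mathbf{X}\,\mathbf{w} = \mathbf{X}^T\mathbf{y} \quad\Rightarrow\quad \mathbf{w} = (\mathbf{X}^T\mathbf{X})^{-1}\mathbf{X}^T\mathbf{y} \end{align}

np.linalg.solve solves the system $\mathbf{X}^T\mathbf{X}\,\mathbf{w} = \mathbf{X}^T\mathbf{y}$ directly, which is numerically more stable than forming the inverse.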
In [6]:
def get_w_analytical(X_train,y_train):
    """
    compute the weight parameters w
    """
    
    """
    Please fill in the required code here
    """
        
    # compute w via the normal equation
    # np.linalg.solve is more stable than np.linalg.inv
    w = np.linalg.solve(X_train.T@X_train,X_train.T@y_train)
    return w

def get_loss(w, X_train, y_train,X_test,y_test,val=False):
    # predict dependent variables and MSE loss for seen training data
    """
    Please fill in the required code here
    """
    loss_train = (np.mean((y_train-X_train@w)**2))
    loss_train_std = np.std((y_train-X_train@w)**2)
    
    # predict dependent variables and MSE loss for unseen test data
    """
    Please fill in the required code here
    """
    loss_test = (np.mean((y_test-X_test@w)**2))
    loss_test_std = np.std((y_test-X_test@w)**2)
    if not val:
        print("The training loss is {} with std:{}. The test loss is {} with std:{}.".format(loss_train, loss_train_std, loss_test,loss_test_std))
    else:
        print("The training loss is {} with std:{}. The val loss is {} with std:{}.".format(loss_train, loss_train_std, loss_test,loss_test_std))

    return loss_test
In [7]:
# compute w and calculate its goodness
w_ana = get_w_analytical(X_train,y_train)
get_loss(w_ana, X_train,y_train, X_test,y_test)
The training loss is 436.8437185477003 with std:142.73673671586107. The test loss is 437.3294638920944 with std:148.94707784497865.
Out[7]:
437.3294638920944
2.3 Feature expansion

Similar to feature expansion for classification problems, we can also perform feature expansion here. Please complete the function expand_X and perform a degree-2 polynomial feature expansion of X, including a bias term but omitting interaction terms. For example, for $\mathbf{x} = (x_1, x_2)$ and degree 2, the expansion is $(1, x_1, x_2, x_1^2, x_2^2)$.

  • Is our model still a linear regression model? Why (not)?
    • Yes, our model is still a linear regression model because it is still linear in the parameters.
  • How does linear regression on degree-2 polynomially expanded data compare against our previous model? Explain!
    • The model performs better on the degree-2 polynomially expanded data than the previous model, because the expanded features allow the linear model to capture non-linear relations between the features and the house prices.
  • Try polynomial feature expansion for different parameters values of $d$. What do you observe? Explain!
    • The model improves in performance up to degree-5 polynomial feature expansion, after which the training loss still decreases but the test loss increases again. This is due to overfitting.
  • Look up the concept of the condition number of a matrix. What does this tell us about the feature-expanded data?
    • The condition number of a matrix measures how sensitive the solution of a linear system involving that matrix is to perturbations of the input. Matrices with a large condition number are said to be ill-conditioned and may make our methods suffer from numerical inaccuracies. A polynomial feature expansion of higher degree results in a larger condition number of $X^TX$ and will lead to an inaccurate solve/inverse (illustrated with a short degree sweep after the expansion helpers below).
In [7]:
def expand_X(X,d):
    """
    perform degree-d polynomial feature expansion of X, with bias but omitting interaction terms
    """
    
    """
    Please fill in the required code here
    """
    
    expand = np.ones((X.shape[0],1))
    # stack x, x^2, ..., x^d for every feature next to the bias column
    for idx in range(1,d+1):
        expand = np.hstack((expand, X**idx))
    return expand
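
A quick sanity check on a toy input (illustrative): the expansion puts the bias first, then $x$ for every feature, then $x^2$ for every feature.

print(expand_X(np.array([[2., 3.]]), 2))   # [[1. 2. 3. 4. 9.]]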
In [8]:
def expand_and_normalize_X(X,d):
    """
    perform degree-d polynomial feature expansion of X, with bias but omitting interaction terms
    and normalize them.
    """
    
    """
    Please fill in the required code here
    """
    expand = expand_X(X,d)
    expand_withoutBias,mu,std = normalize(expand[:,1:])
    expand[:,1:] = expand_withoutBias
    return expand, mu, std
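
To illustrate the condition-number question from above, a short sweep over degrees (an illustrative sketch; the condition number of $X^TX$ grows with the degree):

for deg in [1, 2, 5, 10]:
    Xp, _, _ = expand_and_normalize_X(X_train, deg)
    print(deg, np.linalg.cond(Xp.T @ Xp))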
In [9]:
# perform polynomial feature expansion
d = 2

#normalize the data after expansion
X_train_poly,mu_train_poly,std_train_poly = expand_and_normalize_X(X_train,d)
X_test_poly  = expand_X(X_test,d)
X_test_poly[:,1:]  = (X_test_poly[:,1:]-mu_train_poly)/std_train_poly


print("The original data has {} features.".format(X_train.shape[1]))
print("After degree-{} polynomial feature expansion (with bias, without interaction terms) the data has {} features.".format(d,X_train_poly.shape[1]))

cond_num_before = np.linalg.cond(X_train.T@X_train)
cond_num_after = np.linalg.cond(X_train_poly.T@X_train_poly)
print("The original data X^TX has condition number {}. \nThe expanded data X^TX has condition number {}.".format(cond_num_before,cond_num_after))

# re-compute w and calculate its goodness
w_augm = get_w_analytical(X_train_poly,y_train)
get_loss(w_augm, X_train_poly,y_train, X_test_poly,y_test)
The original data has 12 features.
After degree-2 polynomial feature expansion (with bias, without interaction terms) the data has 25 features.
The original data X^TX has condition number 94.30540153557189. 
The expanded data X^TX has condition number 304.6821566696805.
The training loss is 8.146544587369405 with std:18.14451706916919. The test loss is 8.577004357720213 with std:13.4943321035368.
Out[9]:
8.577004357720213

In the above exercise, we directly evaluated the model on the test loss and chose the best degree of the polynomial. But the test set should not be touched until you have your final model. So, to choose the best degree, we'll use cross-validation (CV), specifically K-fold CV: we will use our training set, create K splits of it to choose the best degree, and finally evaluate on our test set.

In [10]:
# Function for using kth split as validation set to get loss
# and k-1 splits to train our model
# k = kth fold
# k_fold_ind = all the fold indices
# X,Y= train data and labels
# degree = degree of polynomial expansion

def do_cross_validation(k,k_fold_ind,X,Y,degree=1):
    
    # use one split as val
    val_ind = k_fold_ind[k]
    # use k-1 split to train
    train_splits = [i for i in range(k_fold_ind.shape[0]) if i != k]
    train_ind = k_fold_ind[train_splits,:].reshape(-1)
    
    #Get train and val 
    cv_X_train = X[train_ind,:]
    cv_Y_train = Y[train_ind]
    cv_X_val = X[val_ind,:]
    cv_Y_val = Y[val_ind]

    #expand and normalize for degree d
    cv_X_train_poly,mu,std = expand_and_normalize_X(cv_X_train,degree)
    #apply the normalization using statistics (mean, std) computed on train data
    cv_X_val_poly = expand_X(cv_X_val,degree)
    cv_X_val_poly[:,1:] =  (cv_X_val_poly[:,1:]-mu)/std
    
    #fit on train set
    w = get_w_analytical(cv_X_train_poly,cv_Y_train)
    
    #get loss for val
    loss_test = get_loss(w,cv_X_train_poly,cv_Y_train,cv_X_val_poly,cv_Y_val,val=True)
    return loss_test

Let's do 3-fold CV.

In [11]:
from helper import fold_indices
k_fold=3

# We create the k_fold splits of the train data
num_train_examples = X_train.shape[0]
fold_ind = fold_indices(num_train_examples,k_fold)
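
fold_indices comes from the course helper module, which is not shown here. A minimal sketch of what it plausibly does, given how do_cross_validation indexes its result (a (k_fold, n//k_fold) array of row indices), might be:

def fold_indices_sketch(num_examples, k_fold):
    ind = np.arange(num_examples)
    split_size = num_examples // k_fold
    # one row of indices per fold; any remainder examples are dropped
    return np.array([ind[k*split_size:(k+1)*split_size] for k in range(k_fold)])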
In [12]:
from helper import grid_search_cv

# put the list of degree values to be evaluated
search_degree = np.arange(1,15)
params={'degree':search_degree}

#call to the grid search function
grid_val,grid_val_std = grid_search_cv(params,k_fold,fold_ind,do_cross_validation,X_train,y_train)
Evaluating for {'degree': 1} ...
The training loss is 9.878733332550935 with std:18.37782618122855. The val loss is 12.097836086985499 with std:17.127104767852995.
The training loss is 11.399153182137644 with std:21.252737552838692. The val loss is 8.642850525165404 with std:13.217410675430829.
The training loss is 9.647515495078757 with std:14.163330615094118. The val loss is 12.361915261420615 with std:29.5120132938108.
Evaluating for {'degree': 2} ...
The training loss is 6.704434095012012 with std:13.04290453803446. The val loss is 15.33226377128941 with std:59.194356689515615.
The training loss is 8.59812312387266 with std:19.20966159118811. The val loss is 7.99215021511865 with std:12.953280974968997.
The training loss is 8.295107830537537 with std:13.781793734479015. The val loss is 8.952202411127612 with std:18.538578865584697.
Evaluating for {'degree': 3} ...
The training loss is 6.016961518016398 with std:12.98640668859862. The val loss is 219.9763051196828 with std:2147.199387698197.
The training loss is 6.61059467066222 with std:11.603390465787877. The val loss is 9.208393316317428 with std:15.817117629028836.
The training loss is 6.647726826379213 with std:9.462305742327297. The val loss is 8.79045927610683 with std:21.62653887360139.
Evaluating for {'degree': 4} ...
The training loss is 5.4892943798361244 with std:12.80478145667062. The val loss is 74.51757309120002 with std:726.1004842259258.
The training loss is 5.473830379191732 with std:9.859770792406268. The val loss is 8.932445276085183 with std:15.892676072038716.
The training loss is 4.760151294169597 with std:7.517319364401974. The val loss is 11.104276197433602 with std:30.667662527594764.
Evaluating for {'degree': 5} ...
The training loss is 4.803302814257727 with std:11.09802369989356. The val loss is 86692.64546040651 with std:952719.7229907172.
The training loss is 4.710971731988219 with std:8.919907269774212. The val loss is 9.595861768063715 with std:17.232314947977986.
The training loss is 4.364725903525293 with std:7.272421429102086. The val loss is 10.17497671793749 with std:29.348416108333794.
Evaluating for {'degree': 6} ...
The training loss is 4.174160725676965 with std:9.351346901044497. The val loss is 2110289.5464206506 with std:23354426.74003958.
The training loss is 3.8883200040271553 with std:7.936795431717853. The val loss is 9.056525080448516 with std:17.384060610674155.
The training loss is 3.379186432774632 with std:5.196306323036411. The val loss is 9.325098990059624 with std:24.10104216689044.
Evaluating for {'degree': 7} ...
The training loss is 3.454942949690028 with std:8.09768859407402. The val loss is 71169463.2709065 with std:792971783.1498725.
The training loss is 3.5035278392126625 with std:6.557384389295983. The val loss is 38.339813897351924 with std:222.6762030708114.
The training loss is 3.121985399006722 with std:5.09853143125587. The val loss is 12.932961969363047 with std:47.35118119855683.
Evaluating for {'degree': 8} ...
The training loss is 3.0808232427325684 with std:6.694596687028538. The val loss is 2177096439.794818 with std:24310024387.982788.
The training loss is 2.888023540483583 with std:5.179397949745349. The val loss is 68.07150823982161 with std:404.28621352492235.
The training loss is 2.9034889407272138 with std:5.091025381734397. The val loss is 25.561810641714263 with std:121.0406376894253.
Evaluating for {'degree': 9} ...
The training loss is 2.7576123809317057 with std:6.435594425393558. The val loss is 6835134455.083041 with std:76406820853.24263.
The training loss is 2.5375712060184914 with std:4.745735399383841. The val loss is 458.06032342458417 with std:2813.9430653569925.
The training loss is 2.3815234183897362 with std:4.70479167558011. The val loss is 106.17589221237208 with std:562.4954939770727.
Evaluating for {'degree': 10} ...
The training loss is 2.4271811717738125 with std:5.90589984052625. The val loss is 377997670633.3835 with std:4225806628382.2393.
The training loss is 2.053433411907226 with std:3.712431430948495. The val loss is 973.665490642278 with std:5238.391831677754.
The training loss is 2.08220448562441 with std:4.27134537244911. The val loss is 104.4430927204144 with std:579.4592649392137.
Evaluating for {'degree': 11} ...
The training loss is 2.267903183352421 with std:4.8943376545395605. The val loss is 514531197557593.1 with std:5751707957778364.0.
The training loss is 2.045958995994062 with std:3.640980949487254. The val loss is 1686.7277351210726 with std:9575.741935411643.
The training loss is 2.0236632652235325 with std:4.04684670233468. The val loss is 545.4338601434469 with std:2938.0308379107205.
Evaluating for {'degree': 12} ...
The training loss is 2.2052898940596473 with std:4.662254146806866. The val loss is 890688454655504.4 with std:9956634675520458.0.
The training loss is 1.8537073595190403 with std:3.3808259856009304. The val loss is 290795.241569859 with std:2228274.549813424.
The training loss is 1.8219687006778027 with std:3.779985421290979. The val loss is 563.5112517093773 with std:2601.605186857718.
Evaluating for {'degree': 13} ...
The training loss is 2.0176882830300844 with std:4.409069293961505. The val loss is 1.9120918683885404e+16 with std:2.1377317890793005e+17.
The training loss is 1.7965490740219765 with std:3.5616262678889585. The val loss is 2066937.6370844753 with std:16137645.170576556.
The training loss is 1.7750604948177584 with std:3.5995784792952956. The val loss is 12842.92966358202 with std:82693.5450229025.
Evaluating for {'degree': 14} ...
The training loss is 1.8616210480697366 with std:3.9132814753368113. The val loss is 1.8426791366920772e+16 with std:2.0599617187854138e+17.
The training loss is 1.6960458106349987 with std:3.60520372731727. The val loss is 1989237.8823171104 with std:14751945.290853957.
The training loss is 2.2670273736250106 with std:4.330048785668454. The val loss is 2229494.26314388 with std:21880012.987683468.

Observe how the validation score first decreases and then increases with the degree: beyond some degree, the model overfits.
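
grid_search_cv is also provided by the helper module and is not shown. A plausible sketch, consistent with the printed output above (assuming it loops over the grid of parameter values and averages the K per-fold losses), could look like:

from itertools import product

def grid_search_cv_sketch(params, k_fold, fold_ind, cv_fn, X, Y):
    names, values = list(params.keys()), list(params.values())
    grid_val = np.zeros([len(v) for v in values])
    grid_val_std = np.zeros_like(grid_val)
    for idx in product(*[range(len(v)) for v in values]):
        kwargs = {n: v[i] for n, v, i in zip(names, values, idx)}
        print("Evaluating for {} ...".format(kwargs))
        losses = [cv_fn(k, fold_ind, X, Y, **kwargs) for k in range(k_fold)]
        grid_val[idx], grid_val_std[idx] = np.mean(losses), np.std(losses)
    return grid_val, grid_val_std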

In [13]:
#get the best validation score
best_score = np.min(grid_val)
print('Best val score {}'.format(best_score))

#get degree which gives best score
best_degree = search_degree[np.argmin(grid_val)]
print('Best val score for degree {}'.format(best_degree))


X_train_poly,mu,std = expand_and_normalize_X(X_train,best_degree)
w = get_w_analytical(X_train_poly,y_train)
X_test_poly = expand_X(X_test,best_degree)
X_test_poly[:,1:] =  (X_test_poly[:,1:]-mu)/std

get_loss(w,X_train_poly,y_train,X_test_poly,y_test)
Best val score 10.758872132511891
Best val score for degree 2
The training loss is 8.146544587369405 with std:18.14451706916919. The test loss is 8.577004357720213 with std:13.4943321035368.
Out[13]:
8.577004357720213

2.4 Numerical solution for linear regression

The linear regression model has an analytical solution, but we can also get the weight parameters $w$ numerically, e.g. via stochastic gradient descent. Please use this approach to complete the function get_w_numerical below.

  • How do these results compare against those of the analytical solution? Explain the differences or similarities!
    • The analytical and the numerical solutions are almost identical. However, when the condition number of the matrix is very high, the inverse computation may not be stable and the solutions will differ.
  • In which cases may it be preferable to use the numerical approach over the analytical solution?
    • Let N be the number of data points and D the number of features. Computing the analytical solution costs $\mathcal{O}(ND^2 + D^3)$, which may be problematic if N or D is very large. SGD (whose per-sample update is written out below) processes one sample at a time and also works when the data does not fit into memory.
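
For a single sample $(\mathbf{x}_i, y_i)$ with per-sample loss $\ell_i(\mathbf{w}) = \frac{1}{2}(y_i - \mathbf{x}_i^T\mathbf{w})^2$, the gradient and the resulting SGD update used below are

\begin{align} \nabla \ell_i(\mathbf{w}) = -(y_i - \mathbf{x}_i^T\mathbf{w})\,\mathbf{x}_i \quad\Rightarrow\quad \mathbf{w} \leftarrow \mathbf{w} + \eta\,(y_i - \mathbf{x}_i^T\mathbf{w})\,\mathbf{x}_i, \end{align}

where $\eta$ is the learning rate lr.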
In [14]:
def get_w_numerical(X_train,y_train,X_test_poly,y_test,epochs,lr):
    """compute the weight parameters w"""
    
    """
    Please fill in the required code here
    """
    
    # initialize the weights
    w    = np.random.normal(0, 1e-1, X_train.shape[1])
    # update direction: the negative gradient of the per-sample squared error
    grad = lambda w,x,y: (y-x@w)*x
    
    # iterate a given number of epochs over the training data
    for epoch in range(epochs):
        
        # iterate over each data point
        for idx,x_train in enumerate(X_train):
            # update the weights
            w += lr*grad(w,x_train,y_train[idx])
            
        if epoch % 1000 == 0:
            print(f"Epoch {1000+epoch}/{epochs}")
            get_loss(w, X_train,y_train, X_test_poly,y_test)
            
    return w
In [15]:
# compute w and calculate its goodness
X_train_poly,mu,std = expand_and_normalize_X(X_train,best_degree)
X_test_poly = expand_X(X_test,best_degree)
X_test_poly[:,1:] =  (X_test_poly[:,1:]-mu)/std
w_num = get_w_numerical(X_train_poly,y_train,X_test_poly,y_test,15000,8*1e-5)
Epoch 1000/15000
The training loss is 438.82197783111553 with std:269.70024898557244. The test loss is 487.795755682178 with std:288.0750855084646.
Epoch 2000/15000
The training loss is 8.236056562702556 with std:18.501123209527982. The test loss is 9.137317350004377 with std:14.55992640535479.
Epoch 3000/15000
The training loss is 8.16167686880121 with std:18.180565642719948. The test loss is 8.822776407838896 with std:14.025521070509464.
Epoch 4000/15000
The training loss is 8.149433149908623 with std:18.106098455220362. The test loss is 8.687750036374553 with std:13.741249494747919.
Epoch 5000/15000
The training loss is 8.147175969113508 with std:18.08255403686697. The test loss is 8.629587660944399 with std:13.607602247208229.
Epoch 6000/15000
The training loss is 8.146733506632511 with std:18.074171654432142. The test loss is 8.604089986700975 with std:13.546077110524106.
Epoch 7000/15000
The training loss is 8.146642412498348 with std:18.07097230682923. The test loss is 8.592811783189845 with std:13.518056881839968.
Epoch 8000/15000
The training loss is 8.146622452296937 with std:18.069691338158993. The test loss is 8.587802614398301 with std:13.505385378018985.
Epoch 9000/15000
The training loss is 8.146617636083846 with std:18.06916070937. The test loss is 8.58557354698758 with std:13.499682139050417.
Epoch 10000/15000
The training loss is 8.146616303337446 with std:18.06893558308592. The test loss is 8.584580678157014 with std:13.49712330275415.
Epoch 11000/15000
The training loss is 8.146615872757668 with std:18.068838483167. The test loss is 8.584138207871659 with std:13.49597761619587.
Epoch 12000/15000
The training loss is 8.146615713886503 with std:18.068796131753146. The test loss is 8.583940960149839 with std:13.495465332402217.
Epoch 13000/15000
The training loss is 8.146615649809972 with std:18.068777520398932. The test loss is 8.58385301126355 with std:13.495236463491345.
Epoch 14000/15000
The training loss is 8.146615622636013 with std:18.068769300526963. The test loss is 8.583813790785237 with std:13.495134268372222.
Epoch 15000/15000
The training loss is 8.146615610812333 with std:18.068765658008225. The test loss is 8.58379629866519 with std:13.495088651298861.

We can also use the sklearn implementation of the linear regression model. Please look up the documentation to

  1. instantiate the LinearRegression model
  2. fit the model to our training data
  3. evaluate the model on the test data
  4. and compare the results with our previous outcomes
In [16]:
from sklearn.linear_model import LinearRegression
from sklearn import metrics

"""
Please fill in the required code here
"""
    
model = LinearRegression()
model.fit(X_train_poly,y_train)
y_hat = model.predict(X_test_poly)

print('MSE of sklearn linear regression model on test data: ' , metrics.mean_squared_error(y_test,y_hat))
MSE of sklearn linear regression model on test data:  8.577004357720154

2.5 Ridge Regression

As seen in the previous section, we would like to perform feature expansion to fit the non-linearity of the data, but this soon leads to overfitting. There are different ways to tackle this problem, such as getting more data, changing the prediction method, regularization, etc. For the task of regression, we'll add a regularization term to our training objective to mitigate this problem. Intuitively, regularization restricts the domain from which the values of the model parameters are taken, which means that we are biasing our model.

In Ridge Regression, we restrict the $\ell_2$ norm of the coefficients $\mathbf{w}$. Our loss function and its gradient look as follows: \begin{align} L(\mathbf{w}) &=\frac{1}{N} \| \mathbf{y} - \mathbf{X}\mathbf{w} \|^2 + \frac{\lambda}{N}\|\mathbf{w}\|^2 \\ \nabla L(\mathbf{w}) &= -\frac{2}{N}\mathbf{X}^T(\mathbf{y} - \mathbf{X}\mathbf{w}) + 2\frac{\lambda}{N}\mathbf{w} \end{align}

Setting $\nabla L(\mathbf{w}) = 0$ for the minimum, we get

\begin{align} \mathbf{w} &= (\mathbf{X}^T\mathbf{X}+\lambda\mathbf{I})^{-1}\mathbf{X}^T\mathbf{y} \end{align}

The dimensions are as follows: $\mathbf{w}$ is $D\times1$; $\mathbf{y}$ is $N\times1$; $\mathbf{X}$ is $N\times D$; $\mathbf{I}$ is the identity matrix of dimension $D \times D$.

$\lambda$ is our penalty term, also known as weight decay. By varying its value, we control how strongly we bias our model.

Question: When $\lambda$ is high, is our model more complex or less complex?

Answer: A high $\lambda$ penalises the norm term more and hence restricts the values $\mathbf{w}$ can take, rendering the model simpler.

Question: How will $\lambda$ affect the condition number of $\mathbf{X}^T\mathbf{X}+\lambda\mathbf{I}$?

Answer: For $\lambda > 0$ the condition number decreases, hence computations become more stable. If we have the eigendecomposition of $\mathbf{X}^T\mathbf{X}$ as $\mathbf{U}\mathbf{S}\mathbf{U}^T$, then \begin{align} \mathbf{X}^T\mathbf{X}+\lambda\mathbf{I} &= \mathbf{U}\mathbf{S}\mathbf{U}^T + \lambda \mathbf{U}\mathbf{I}\mathbf{U}^T \\ &= \mathbf{U}[\mathbf{S}+\lambda\mathbf{I}]\mathbf{U}^T, \end{align} which shows that the eigenvalues of $\mathbf{X}^T\mathbf{X}+\lambda\mathbf{I}$ are at least $\lambda$: we are lifting all the eigenvalues, and the ratio of the largest to the smallest one shrinks.
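
A quick numerical sanity check of this eigenvalue lifting (an illustrative sketch using the normalized training matrix from above):

A = X_train.T @ X_train
S = np.linalg.eigvalsh(A)
S_reg = np.linalg.eigvalsh(A + 2.0*np.eye(A.shape[0]))
print(S.min(), S_reg.min())          # the smallest eigenvalue is lifted by lambda
print(np.allclose(S_reg, S + 2.0))   # every eigenvalue shifts by exactly lambda -> True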

In [17]:
def get_w_analytical_with_regularization(X_train,y_train,lmda):
    """compute the weight parameters w with ridge regression"""
    
    """
    Please fill in the required code here
    """
    #create lambda matrix 
    lmda_mat = lmda*np.eye(X_train.shape[1])
    # compute w via the normal equation
    # np.linalg.solve is more stable than np.linalg.inv
    w = np.linalg.solve(X_train.T@X_train+lmda_mat,X_train.T@y_train)
    return w
In [18]:
# perform polynomial feature expansion
d  = 14

#normalize the data after expansion
X_train_poly,mu_train_poly,std_train_poly = expand_and_normalize_X(X_train,d)
X_test_poly  = expand_X(X_test,d)
X_test_poly[:,1:]  = (X_test_poly[:,1:]-mu_train_poly)/std_train_poly


print("The original data has {} features.".format(X_train.shape[1]))
print("After degree-{} polynomial feature expansion (with bias, without interaction terms) the data has {} features.".format(d,X_train_poly.shape[1]))

cond_num_before = np.linalg.cond(X_train.T@X_train)
cond_num_after = np.linalg.cond(X_train_poly.T@X_train_poly)
print("The original data X^TX has condition number {}. \nThe expanded data X^TX has condition number {}.".format(cond_num_before,cond_num_after))

#choose lambda value
lmda = 2

#write the X^TX+\lambda*I matrix
A = X_train.T@X_train+lmda*np.eye(X_train.shape[1])
cond_num_ridge = np.linalg.cond(A)
print("The X^TX+lambda*I with lambda:{} has condition number {}".format(lmda,cond_num_ridge))
The original data has 12 features.
After degree-14 polynomial feature expansion (with bias, without interaction terms) the data has 169 features.
The original data X^TX has condition number 94.30540153557189. 
The expanded data X^TX has condition number 6.119085746825847e+17.
The X^TX+lambda*I with lambda:2 has condition number 87.29222508026365

See how the condition number has changed with regularization.
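
How the condition number varies with $\lambda$ can be traced with a short sweep (an illustrative sketch on the degree-14 expansion from above):

for lm in [0.0, 0.1, 1.0, 10.0]:
    A = X_train_poly.T @ X_train_poly + lm*np.eye(X_train_poly.shape[1])
    print(lm, np.linalg.cond(A))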

Cross-validation (CV) is used to choose the value of $\lambda$. As in the previous exercise, we will use K-fold CV: we will use our training set and create K splits of it to choose the best degree and the corresponding $\lambda$, and finally evaluate on our test set.

In [19]:
# Function for using kth split as validation set to get loss
# and k-1 splits to train our model
def do_cross_validation_reg(k,k_fold_ind,X,Y,lmda=0,degree=1):
    
    # use one split as val
    val_ind = k_fold_ind[k]
    # use k-1 split to train
    train_splits = [i for i in range(k_fold_ind.shape[0]) if i != k]
    train_ind = k_fold_ind[train_splits,:].reshape(-1)
   
    #Get train and val 
    cv_X_train = X[train_ind,:]
    cv_Y_train = Y[train_ind]
    cv_X_val = X[val_ind,:]
    cv_Y_val = Y[val_ind]
    
   
    #expand and normalize for degree d
    cv_X_train_poly,mu,std = expand_and_normalize_X(cv_X_train,degree)

    #apply the normalization using statistics (mean, std) computed on train data
    cv_X_val_poly = expand_X(cv_X_val,degree)
    cv_X_val_poly[:,1:] =  (cv_X_val_poly[:,1:]-mu)/std
    
    #fit on train set using regularised version
    w = get_w_analytical_with_regularization(cv_X_train_poly,cv_Y_train,lmda)
    
    #get loss for val
    loss_test = get_loss(w,cv_X_train_poly,cv_Y_train,cv_X_val_poly,cv_Y_val,val=True)
    print(loss_test,lmda,degree)
    return loss_test

Let's do 3-fold CV. We will use the same training data splits as in the non-regularised case for a fairer comparison.

In [20]:
#list of lambda values to try.. use np.logspace
search_lambda = np.logspace(-2,1,num=27)
#list of degrees
search_degree = np.arange(1,15,1)

params = {'degree':search_degree,'lmda':search_lambda,}
k_fold =3
#call to the grid search function
grid_val,grid_val_std = grid_search_cv(params,k_fold,fold_ind,do_cross_validation_reg,X_train,y_train)
Evaluating for {'degree': 1, 'lmda': 0.01} ...
The training loss is 9.878734525447705 with std:18.38119705319961. The val loss is 12.097042010817713 with std:17.12823906312638.
12.097042010817713 0.01 1
The training loss is 11.399154129861566 with std:21.256533730574343. The val loss is 8.643007248451504 with std:13.217445759846907.
8.643007248451504 0.01 1
The training loss is 9.647516425595219 with std:14.164703865164896. The val loss is 12.362239217481456 with std:29.51765349897477.
12.362239217481456 0.01 1
Evaluating for {'degree': 1, 'lmda': 0.013043213867190054} ...
The training loss is 9.878735361495497 with std:18.382222995873544. The val loss is 12.09680115503232 with std:17.128585342953304.
12.09680115503232 0.013043213867190054 1
The training loss is 11.399154794290626 with std:21.257689165045775. The val loss is 8.643055262867495 with std:13.217456971916642.
8.643055262867495 0.013043213867190054 1
The training loss is 9.64751707799271 with std:14.16512228786205. The val loss is 12.362338138238242 with std:29.519369769040463.
12.362338138238242 0.013043213867190054 1
Evaluating for {'degree': 1, 'lmda': 0.017012542798525893} ...
The training loss is 9.87873678324588 with std:18.383561234855193. The val loss is 12.096487560709871 with std:17.129037765448025.
12.096487560709871 0.017012542798525893 1
The training loss is 11.399155924450753 with std:21.25919634648909. The val loss is 8.643118113493331 with std:13.217471971164224.
8.643118113493331 0.017012542798525893 1
The training loss is 9.647518187726961 with std:14.165668405289736. The val loss is 12.36246739682643 with std:29.52160822111885.
12.36246739682643 0.017012542798525893 1
Evaluating for {'degree': 1, 'lmda': 0.02218982341458972} ...
The training loss is 9.878739200720693 with std:18.38530686457416. The val loss is 12.096079482795053 with std:17.12962916458133.
12.096079482795053 0.02218982341458972 1
The training loss is 11.399157846691217 with std:21.261162405490037. The val loss is 8.64320047251388 with std:13.217492172960497.
8.64320047251388 0.02218982341458972 1
The training loss is 9.64752007531279 with std:14.166381329696465. The val loss is 12.362636389773163 with std:29.524527685280525.
12.362636389773163 0.02218982341458972 1
Evaluating for {'degree': 1, 'lmda': 0.028942661247167517} ...
The training loss is 9.878743310615796 with std:18.387583957838476. The val loss is 12.095548831385633 with std:17.130402737281784.
12.095548831385633 0.028942661247167517 1
The training loss is 11.399161115915117 with std:21.263727135172374. The val loss is 8.643308544021044 with std:13.2175196074955.
8.643308544021044 0.028942661247167517 1
The training loss is 9.64752328579067 with std:14.167312252421192. The val loss is 12.362857488097356 with std:29.528335270163602.
12.362857488097356 0.028942661247167517 1
Evaluating for {'degree': 1, 'lmda': 0.037750532053243954} ...
The training loss is 9.878750296295943 with std:18.390554411820105. The val loss is 12.094859430308427 with std:17.13141545412499.
12.094859430308427 0.037750532053243954 1
The training loss is 11.399166675489381 with std:21.267072973201554. The val loss is 8.643450607165644 with std:13.217557235691286.
8.643450607165644 0.037750532053243954 1
The training loss is 9.647528745888462 with std:14.168528243119912. The val loss is 12.363147022944636 with std:29.533301015378125.
12.363147022944636 0.037750532053243954 1
Evaluating for {'degree': 1, 'lmda': 0.04923882631706739} ...
The training loss is 9.878762166785355 with std:18.394429504952722. The val loss is 12.093964876495292 with std:17.132742684519812.
12.093964876495292 0.04923882631706739 1
The training loss is 11.399176128850657 with std:21.271438050720448. The val loss is 8.64363777830268 with std:13.21760945123658.
8.64363777830268 0.04923882631706739 1
The training loss is 9.647538031044142 with std:14.170117291160455. The val loss is 12.363526627382258 with std:29.539776975103536.
12.363526627382258 0.04923882631706739 1
Evaluating for {'degree': 1, 'lmda': 0.0642232542222936} ...
The training loss is 9.878782330722151 with std:18.39948500271863. The val loss is 12.092805967849811 with std:17.13448452539098.
12.092805967849811 0.0642232542222936 1
The training loss is 11.399192200600488 with std:21.277133261632436. The val loss is 8.643885096109283 with std:13.217682888066298.
8.643885096109283 0.0642232542222936 1
The training loss is 9.647553818906733 with std:14.17219502691321. The val loss is 12.364025081923351 with std:29.548222061930858.
12.364025081923351 0.0642232542222936 1
Evaluating for {'degree': 1, 'lmda': 0.0837677640068292} ...
The training loss is 9.87881656675227 with std:18.406080914122562. The val loss is 12.091307717009828 with std:17.136774549689047.
12.091307717009828 0.0837677640068292 1
The training loss is 11.399219518794181 with std:21.2845646094035. The val loss is 8.644213091091487 with std:13.21778773237121.
8.644213091091487 0.0837677640068292 1
The training loss is 9.647580659150153 with std:14.174913735039102. The val loss is 12.364680883451282 with std:29.55923437090689.
12.364680883451282 0.0837677640068292 1
Evaluating for {'degree': 1, 'lmda': 0.10926008611173785} ...
The training loss is 9.878874661178413 with std:18.414687350585865. The val loss is 12.089376071520324 with std:17.13979203657799.
12.089376071520324 0.10926008611173785 1
The training loss is 11.399265941077603 with std:21.29426249374124. The val loss is 8.644650093440072 with std:13.217939872122177.
8.644650093440072 0.10926008611173785 1
The training loss is 9.647626279358088 with std:14.178474527329751. The val loss is 12.365545869986098 with std:29.57359320420813.
12.365545869986098 0.10926008611173785 1
Evaluating for {'degree': 1, 'lmda': 0.14251026703029984} ...
The training loss is 9.878973165164519 with std:18.425918404953272. The val loss is 12.086894656525619 with std:17.143779272334218.
12.086894656525619 0.14251026703029984 1
The training loss is 11.399344800384188 with std:21.30692014263285. The val loss is 8.645235683352919 with std:13.218164441819273.
8.645235683352919 0.14251026703029984 1
The training loss is 9.647703798378037 with std:14.183143922490274. The val loss is 12.366690412280345 with std:29.592313653795504.
12.366690412280345 0.14251026703029984 1
Evaluating for {'degree': 1, 'lmda': 0.18587918911465645} ...
The training loss is 9.879140022895761 with std:18.440576591495876. The val loss is 12.083722206800044 with std:17.149066314538793.
12.083722206800044 0.18587918911465645 1
The training loss is 11.399478702672981 with std:21.32344412712286. The val loss is 8.646025932739324 with std:13.218501696915546.
8.646025932739324 0.18587918911465645 1
The training loss is 9.647835473592751 with std:14.189276652816218. The val loss is 12.368210968741554 with std:29.616717403480788.
12.368210968741554 0.18587918911465645 1
Evaluating for {'degree': 1, 'lmda': 0.24244620170823283} ...
The training loss is 9.879422310621583 with std:18.4597112253224. The val loss is 12.079691957763153 with std:17.1561068284557.
12.079691957763153 0.24244620170823283 1
The training loss is 11.399705938218313 with std:21.34502088780339. The val loss is 8.647101487330062 with std:13.219016782609947.
8.647101487330062 0.24244620170823283 1
The training loss is 9.648059036109244 with std:14.197347390931036. The val loss is 12.370241260203816 with std:29.648524415895725.
12.370241260203816 0.24244620170823283 1
Evaluating for {'degree': 1, 'lmda': 0.31622776601683794} ...
The training loss is 9.879899115563203 with std:18.484695251013555. The val loss is 12.074615264118032 with std:17.165530419813546.
12.074615264118032 0.31622776601683794 1
The training loss is 11.400091281213555 with std:21.373204551931796. The val loss is 8.64858019242956 with std:13.219816012646232.
8.64858019242956 0.31622776601683794 1
The training loss is 9.648438381546084 with std:14.207995433391924. The val loss is 12.37296906607291 with std:29.689971404846435.
12.37296906607291 0.31622776601683794 1
Evaluating for {'degree': 1, 'lmda': 0.41246263829013524} ...
The training loss is 9.880702840881087 with std:18.517326566871407. The val loss is 12.068293312384974 with std:17.178219536503253.
12.068293312384974 0.41246263829013524 1
The training loss is 11.400744130801044 with std:21.41003316412371. The val loss is 8.650637033433663 with std:13.221074018320992.
8.650637033433663 0.41246263829013524 1
The training loss is 9.649081568998946 with std:14.222088456591237. The val loss is 12.37666185304966 with std:29.743964480635466.
12.37666185304966 0.41246263829013524 1
Evaluating for {'degree': 1, 'lmda': 0.5379838403443686} ...
The training loss is 9.882054194243398 with std:18.55996198954772. The val loss is 12.060543254242372 with std:17.19542274843204.
12.060543254242372 0.5379838403443686 1
The training loss is 11.401848878507078 with std:21.458182974202963. The val loss is 8.653535898969642 with std:13.223079006354462.
8.653535898969642 0.5379838403443686 1
The training loss is 9.650171034788377 with std:14.240814679494214. The val loss is 12.381706412061709 with std:29.81427509224056.
12.381706412061709 0.5379838403443686 1
Evaluating for {'degree': 1, 'lmda': 0.701703828670383} ...
The training loss is 9.884319154636282 with std:18.615694886453703. The val loss is 12.051248723733528 with std:17.218921250700305.
12.051248723733528 0.701703828670383 1
The training loss is 11.403715525515512 with std:21.521173858373185. The val loss is 8.657680483600997 with std:13.226308075831236.
8.657680483600997 0.701703828670383 1
The training loss is 9.652014123279706 with std:14.265817733050936. The val loss is 12.388670861508253 with std:29.905790324724123.
12.388670861508253 0.701703828670383 1
Evaluating for {'degree': 1, 'lmda': 0.9152473108773893} ...
The training loss is 9.888100811691087 with std:18.688591491114295. The val loss is 12.040449840357613 with std:17.25127183230645.
12.040449840357613 0.9152473108773893 1
The training loss is 11.40686361875169 with std:21.60364357627838. The val loss is 8.66369615959223 with std:13.231552176770135.
8.66369615959223 0.9152473108773893 1
The training loss is 9.655127211082135 with std:14.299396116640795. The val loss is 12.398402505554882 with std:30.024830606098448.
12.398402505554882 0.9152473108773893 1
Evaluating for {'degree': 1, 'lmda': 1.1937766417144369} ...
The training loss is 9.894385854621593 with std:18.78400648087296. The val loss is 12.028494767204858 with std:17.29615715647743.
12.028494767204858 1.1937766417144369 1
The training loss is 11.412160594798088 with std:21.71171467998793. The val loss is 8.672561842355313 with std:13.240122490287375.
8.672561842355313 1.1937766417144369 1
The training loss is 9.660374979645194 with std:14.344800486922558. The val loss is 12.412183247086817 with std:30.17954965849949.
12.412183247086817 1.1937766417144369 1
Evaluating for {'degree': 1, 'lmda': 1.5570684047537318} ...
The training loss is 9.90477579423533 with std:18.909006233502605. The val loss is 12.016284157924519 with std:17.35888285425784.
12.016284157924519 1.5570684047537318 1
The training loss is 11.421048227041178 with std:21.853485705637496. The val loss is 8.685822268539027 with std:13.254189155038043.
8.685822268539027 1.5570684047537318 1
The training loss is 9.669199463061226 with std:14.406678578220973. The val loss is 12.431977269491853 with std:30.380432610149537.
12.431977269491853 1.5570684047537318 1
Evaluating for {'degree': 1, 'lmda': 2.030917620904737} ...
The training loss is 9.92184877324301 with std:19.072939271928462. The val loss is 12.00565266829268 with std:17.447071464165457.
12.00565266829268 2.030917620904737 1
The training loss is 11.435910501367784 with std:22.039687746285647. The val loss is 8.705929068075237 with std:13.27733256978534.
8.705929068075237 2.030917620904737 1
The training loss is 9.683993851355623 with std:14.49174060258699. The val loss is 12.460826093516244 with std:30.640907812668825.
12.460826093516244 2.030917620904737 1
Evaluating for {'degree': 1, 'lmda': 2.6489692876105297} ...
The training loss is 9.949723016987859 with std:19.28820889944564. The val loss is 11.999951987269263 with std:17.57161964817342.
11.999951987269263 2.6489692876105297 1
The training loss is 11.46066663558595 with std:22.284557989635942. The val loss is 8.736787302019051 with std:13.315430699120613.
8.736787302019051 2.6489692876105297 1
The training loss is 9.708707299741537 with std:14.609747801942431. The val loss is 12.503477658517626 with std:30.97808504347909.
12.503477658517626 2.6489692876105297 1
Evaluating for {'degree': 1, 'lmda': 3.4551072945922217} ...
The training loss is 9.994935127939764 with std:19.57132343465739. The val loss is 12.004937238079165 with std:17.748017085984692.
12.004937238079165 3.4551072945922217 1
The training loss is 11.501719931445228 with std:22.606991535592794. The val loss is 8.784628790953647 with std:13.378065542179598.
8.784628790953647 3.4551072945922217 1
The training loss is 9.74981449703963 with std:14.774960554051754. The val loss is 12.567384207191077 with std:31.413626211886143.
12.567384207191077 3.4551072945922217 1
Evaluating for {'degree': 1, 'lmda': 4.506570337745478} ...
The training loss is 10.067818202021629 with std:19.94432227174549. The val loss is 12.030132092218217 with std:17.99818668928449.
12.030132092218217 4.506570337745478 1
The training loss is 11.569465712566947 with std:23.03203794271501. The val loss is 8.859404060743689 with std:13.480704584364718.
8.859404060743689 4.506570337745478 1
The training loss is 9.817855269988144 with std:15.008209783130456. The val loss is 12.664276113704664 with std:31.974743424793292.
12.664276113704664 4.506570337745478 1
Evaluating for {'degree': 1, 'lmda': 5.878016072274912} ...
The training loss is 10.18468296597491 with std:20.436690835272024. The val loss is 12.090984720871425 with std:18.35309797428469.
12.090984720871425 5.878016072274912 1
The training loss is 11.680672672063695 with std:23.59280193932544. The val loss is 8.976994719714199 with std:13.647975180643078.
8.976994719714199 5.878016072274912 1
The training loss is 9.929856100124226 with std:15.339751335202166. The val loss is 12.812625545927457 with std:32.69530312962139.
12.812625545927457 5.878016072274912 1
Evaluating for {'degree': 1, 'lmda': 7.666822074546214} ...
The training loss is 10.371283133727795 with std:21.087859805099722. The val loss is 12.212348549833797 with std:18.856479235897343.
12.212348549833797 7.666822074546214 1
The training loss is 11.862216309487806 with std:24.332774866487213. The val loss is 9.16271303772791 with std:13.918329048220784.
9.16271303772791 7.666822074546214 1
The training loss is 10.11310071378945 with std:15.81297835622198. The val loss is 13.041466399667327 with std:33.616995325677365.
13.041466399667327 7.666822074546214 1
Evaluating for {'degree': 1, 'lmda': 10.0} ...
The training loss is 10.668277049273366 with std:21.950293282490183. The val loss is 12.434115407835106 with std:19.56986105148837.
12.434115407835106 10.0 1
The training loss is 12.15686754798503 with std:25.308543133708348. The val loss is 9.456782447931543 with std:14.350127498388764.
9.456782447931543 10.0 1
The training loss is 10.410930334578754 with std:16.488827508272816. The val loss is 13.396241380150453 with std:34.7905060952534.
13.396241380150453 10.0 1
Evaluating for {'degree': 2, 'lmda': 0.01} ...
The training loss is 6.7044363554655755 with std:13.045062141644117. The val loss is 15.316215955226586 with std:59.119332756798656.
15.316215955226586 0.01 2
The training loss is 8.598125258683003 with std:19.212729552451012. The val loss is 7.99156995209688 with std:12.951818877310963.
7.99156995209688 0.01 2
The training loss is 8.29510948401904 with std:13.781802498065005. The val loss is 8.952883613153553 with std:18.54498325947114.
8.952883613153553 0.01 2
Evaluating for {'degree': 2, 'lmda': 0.013043213867190054} ...
The training loss is 6.704437937514224 with std:13.045719786750915. The val loss is 15.311354741433984 with std:59.096653497127924.
15.311354741433984 0.013043213867190054 2
The training loss is 8.598126753611997 with std:19.213662312745132. The val loss is 7.991394250651952 with std:12.951375422771381.
7.991394250651952 0.013043213867190054 2
The training loss is 8.295110642281404 with std:13.781805938124082. The val loss is 8.953091378110521 with std:18.546931640226436.
8.953091378110521 0.013043213867190054 2
Evaluating for {'degree': 2, 'lmda': 0.017012542798525893} ...
The training loss is 6.704440625212566 with std:13.046578293174084. The val loss is 15.305029844778678 with std:59.06717842478045.
15.305029844778678 0.017012542798525893 2
The training loss is 8.59812929429459 with std:19.214878311724622. The val loss is 7.99116569799717 with std:12.95079805929382.
7.99116569799717 0.017012542798525893 2
The training loss is 8.295112611262008 with std:13.781810966000101. The val loss is 8.953362692556956 with std:18.549472525095734.
8.953362692556956 0.017012542798525893 2
Evaluating for {'degree': 2, 'lmda': 0.02218982341458972} ...
The training loss is 6.704445189336714 with std:13.047699291959942. The val loss is 15.296806745217541 with std:59.02891288278312.
15.296806745217541 0.02218982341458972 2
The training loss is 8.598133610949155 with std:19.21646331131684. The val loss is 7.990868641539196 with std:12.950046763531132.
7.990868641539196 0.02218982341458972 2
The training loss is 8.29511595763712 with std:13.78181844275659. The val loss is 8.95371712087013 with std:18.55278592463591.
8.95371712087013 0.02218982341458972 2
Evaluating for {'degree': 2, 'lmda': 0.028942661247167517} ...
The training loss is 6.7044529356611475 with std:13.049163518431568. The val loss is 15.28612622893057 with std:58.97930567182766.
15.28612622893057 0.028942661247167517 2
The training loss is 8.598140942085191 with std:19.218528870077876. The val loss is 7.990482964518397 with std:12.949069837927382.
7.990482964518397 0.028942661247167517 2
The training loss is 8.295121643233793 with std:13.781829754684418. The val loss is 8.954180337776958 with std:18.55710641967093.
8.954180337776958 0.028942661247167517 2
Evaluating for {'degree': 2, 'lmda': 0.037750532053243954} ...
The training loss is 6.704466073510254 with std:13.051076864182203. The val loss is 15.272271595764337 with std:58.91511383599584.
15.272271595764337 0.037750532053243954 2
The training loss is 8.598153386389699 with std:19.22121998337113. The val loss is 7.989982933587795 with std:12.947800706917976.
7.989982933587795 0.037750532053243954 2
The training loss is 8.29513129943443 with std:13.781847155544831. The val loss is 8.95478609612696 with std:18.56273962082039.
8.95478609612696 0.037750532053243954 2
Evaluating for {'degree': 2, 'lmda': 0.04923882631706739} ...
The training loss is 6.704488334877049 with std:13.053578430271914. The val loss is 15.25432934321127 with std:58.832248774341764.
15.25432934321127 0.04923882631706739 2
The training loss is 8.598174495916401 with std:19.224724903443395. The val loss is 7.989335835767365 with std:12.946153975927285.
7.989335835767365 0.04923882631706739 2
The training loss is 8.295147690769761 with std:13.781874338289175. The val loss is 8.955578867653475 with std:18.570083534231337.
8.955578867653475 0.04923882631706739 2
Evaluating for {'degree': 2, 'lmda': 0.0642232542222936} ...
The training loss is 6.704526010558187 with std:13.056851293535383. The val loss is 15.231143578581479 with std:58.72561202726287.
15.231143578581479 0.0642232542222936 2
The training loss is 8.59821027331184 with std:19.229287703770808. The val loss is 7.9885004336346706 with std:12.944020687370404.
7.9885004336346706 0.0642232542222936 2
The training loss is 8.295175496511295 with std:13.78191739089535. The val loss is 8.956617411437492 with std:18.579656255203638.
8.956617411437492 0.0642232542222936 2
Evaluating for {'degree': 2, 'lmda': 0.0837677640068292} ...
The training loss is 6.704589675815154 with std:13.061136976381954. The val loss is 15.201265840830253 with std:58.588939489731295.
15.201265840830253 0.0837677640068292 2
The training loss is 8.598270842351043 with std:19.235224257739233. The val loss is 7.987425335456509 with std:12.941262789439621.
7.987425335456509 0.0837677640068292 2
The training loss is 8.295222624866984 with std:13.781986393878816. The val loss is 8.957979634870437 with std:18.592131794900194.
8.957979634870437 0.0837677640068292 2
Evaluating for {'degree': 2, 'lmda': 0.10926008611173785} ...
The training loss is 6.704697046638333 with std:13.066755005413713. The val loss is 15.162904586573335 with std:58.414689708524364.
15.162904586573335 0.10926008611173785 2
The training loss is 8.59837323364984 with std:19.242942397334534. The val loss is 7.986047499949583 with std:12.937707000116.
7.986047499949583 0.10926008611173785 2
The training loss is 8.295302415113728 with std:13.782098086293137. The val loss is 8.959769275580925 with std:18.608386312238725.
8.959769275580925 0.10926008611173785 2
Evaluating for {'degree': 2, 'lmda': 0.14251026703029984} ...
The training loss is 6.704877670092471 with std:13.074129500849036. The val loss is 15.11388290046188 with std:58.1940374023319.
15.11388290046188 0.14251026703029984 2
The training loss is 8.598546002886808 with std:19.252967075240175. The val loss is 7.9842913069898405 with std:12.933138556665767.
7.9842913069898405 0.14251026703029984 2
The training loss is 8.295437311371542 with std:13.78228030278224. The val loss is 8.962125177461202 with std:18.629557576938858.
8.962125177461202 0.14251026703029984 2
Evaluating for {'degree': 2, 'lmda': 0.18587918911465645} ...
The training loss is 6.705180549817254 with std:13.083825510792447. The val loss is 15.05161971980486 with std:57.91706813719979.
15.05161971980486 0.18587918911465645 2
The training loss is 8.598836830065697 with std:19.265971325722095. The val loss is 7.982068973885477 with std:12.927295871649923.
7.982068973885477 0.18587918911465645 2
The training loss is 8.295664957242787 with std:13.7825793280463. The val loss is 8.965234300183123 with std:18.657121121954.
8.965234300183123 0.18587918911465645 2
Evaluating for {'degree': 2, 'lmda': 0.24244620170823283} ...
[Output abridged. Each block reports, for one {'degree', 'lmda'} configuration, the training and validation loss of each of the three cross-validation folds (each with the standard deviation of the per-sample loss), followed by a bare debug line repeating that fold's validation loss, lmda and degree. For degree 2, as lmda grows from 0.24 to 10 the three folds' validation losses stay within roughly 13.0–15.0, 7.94–8.38 and 8.97–10.30, with per-fold minima of ≈13.01 (lmda ≈ 5.88), ≈7.94 (lmda ≈ 2.03) and ≈8.97 (lmda ≈ 0.24): moderate regularization helps the first two folds, while the third does best at the smallest lmda in this range.]
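The pattern above, training loss rising while validation loss typically first falls and then rises as lmda grows, is the bias-variance trade-off that ridge regression controls. As a reminder, and assuming the model fits ridge regression on polynomially expanded features $\Phi$ (which the printed hyperparameters suggest), the standard closed-form solution is

$$\hat{w} = \left(\Phi^\top \Phi + \lambda I\right)^{-1} \Phi^\top y .$$

Larger $\lambda$ (lmda in the logs) shrinks the weights, so the fit to the training data worsens slightly while the model generalizes better, up to the point where it starts to underfit.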
[Output abridged. For degree 3 the sweep covers 27 logarithmically spaced lmda values from 0.01 to 10. With almost no regularization the first fold overfits badly (validation loss ≈211.7 at lmda = 0.01); its loss falls steadily to a minimum of ≈12.0 near lmda ≈ 3.46 and rises again towards ≈12.9 at lmda = 10. The other folds are far more stable: fold 2 decreases from ≈9.20 to a minimum of ≈8.97 near lmda ≈ 4.51, and fold 3 from ≈8.79 to a minimum of ≈8.66 near lmda ≈ 1.56.]
[Output abridged. For degree 4 the first fold becomes severely unstable at small lmda: its validation loss climbs from ≈22.1 at lmda = 0.01 to a peak of ≈757 near lmda ≈ 0.19, and only heavy regularization brings it back down to ≈14.4 at lmda = 10. Folds 2 and 3 behave smoothly, reaching minima of ≈8.41 near lmda ≈ 5.88 and ≈8.76 near lmda ≈ 2.65, respectively.]
[Output abridged. For degree 5 the first fold's validation loss swings from tens into the thousands, peaking at ≈2092 near lmda ≈ 0.70 and dropping only to ≈87.9 at lmda = 10. Folds 2 and 3 again improve steadily with regularization, reaching minima of ≈7.99 near lmda ≈ 4.51 and ≈8.17 near lmda ≈ 3.46.]
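For reference, the log format above can be reproduced with a loop of the following shape. This is a minimal sketch, not the notebook's actual implementation: the helpers expand_poly, ridge_fit and kfold_splits are hypothetical names, and the exact grid (degrees 2–6, 27 log-spaced lmda values) and fold assignment are assumptions read off the printed output.

import numpy as np

def expand_poly(X, degree):
    # Hypothetical helper: bias column plus element-wise powers of each feature.
    return np.hstack([np.ones((X.shape[0], 1))] + [X**d for d in range(1, degree + 1)])

def ridge_fit(Phi, y, lmda):
    # Closed-form ridge solution; a sketch, the notebook may solve this differently.
    return np.linalg.solve(Phi.T @ Phi + lmda * np.eye(Phi.shape[1]), Phi.T @ y)

def kfold_splits(n, k, rng):
    # Shuffle the indices once and cut them into k folds.
    return np.array_split(rng.permutation(n), k)

k = 3  # three folds, matching the three loss lines per configuration
rng = np.random.default_rng(0)
folds = kfold_splits(X_train.shape[0], k, rng)

for degree in range(2, 7):                  # assumed degree grid
    for lmda in np.logspace(-2, 1, 27):     # assumed lmda grid
        print(f"Evaluating for {{'degree': {degree}, 'lmda': {lmda}}} ...")
        for i in range(k):
            val_idx = folds[i]
            tr_idx = np.concatenate([folds[j] for j in range(k) if j != i])
            Phi_tr = expand_poly(X_train[tr_idx], degree)
            Phi_va = expand_poly(X_train[val_idx], degree)
            w = ridge_fit(Phi_tr, y_train[tr_idx], lmda)
            tr_err = (y_train[tr_idx] - Phi_tr @ w) ** 2   # per-sample squared errors
            va_err = (y_train[val_idx] - Phi_va @ w) ** 2
            print(f"The training loss is {tr_err.mean()} with std:{tr_err.std()}. "
                  f"The val loss is {va_err.mean()} with std:{va_err.std()}.")
            print(va_err.mean(), lmda, degree)

After the sweep, the usual next step is to average each configuration's validation losses over the folds and keep the (degree, lmda) pair with the smallest mean.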
[Output abridged. For degree 6 the first fold explodes at small lmda (validation loss ≈45014 at lmda = 0.01) and stays in the thousands over the part of the sweep shown here, while folds 2 and 3 remain well behaved, dipping to ≈8.02 near lmda ≈ 0.32 and ≈8.76 near lmda ≈ 0.70, respectively; the sweep then continues over the remaining lmda values.]
6982.520648576271 0.9152473108773893 6
The training loss is 5.411615894325634 with std:10.362094098376042. The val loss is 8.070813671470928 with std:14.376879149597999.
8.070813671470928 0.9152473108773893 6
The training loss is 4.944362074444598 with std:8.272357361547922. The val loss is 8.723096298428425 with std:22.24049883630865.
8.723096298428425 0.9152473108773893 6
Evaluating for {'degree': 6, 'lmda': 1.1937766417144369} ...
The training loss is 5.353543032496987 with std:12.655952345171919. The val loss is 6966.408935368644 with std:75408.92358512647.
6966.408935368644 1.1937766417144369 6
The training loss is 5.482513074345142 with std:10.511796681742991. The val loss is 8.084440575507154 with std:14.374139829138718.
8.084440575507154 1.1937766417144369 6
The training loss is 5.051279892202104 with std:8.466087898297536. The val loss is 8.680938586018694 with std:22.265796640860344.
8.680938586018694 1.1937766417144369 6
Evaluating for {'degree': 6, 'lmda': 1.5570684047537318} ...
The training loss is 5.419738884529931 with std:12.727679209940751. The val loss is 6519.401199589302 with std:70622.11124911689.
6519.401199589302 1.5570684047537318 6
The training loss is 5.555224912092338 with std:10.682096777949678. The val loss is 8.092864914406693 with std:14.358289714967986.
8.092864914406693 1.5570684047537318 6
The training loss is 5.163239076495865 with std:8.673622965222936. The val loss is 8.635979295289696 with std:22.27176521072241.
8.635979295289696 1.5570684047537318 6
Evaluating for {'degree': 6, 'lmda': 2.030917620904737} ...
The training loss is 5.49664810441595 with std:12.821826360983692. The val loss is 5741.888059823222 with std:62224.40433032626.
5741.888059823222 2.030917620904737 6
The training loss is 5.6319440693303715 with std:10.879149646357975. The val loss is 8.094813579976364 with std:14.323561680236573.
8.094813579976364 2.030917620904737 6
The training loss is 5.281866608729415 with std:8.89678194962575. The val loss is 8.593160915353995 with std:22.25946875828543.
8.593160915353995 2.030917620904737 6
Evaluating for {'degree': 6, 'lmda': 2.6489692876105297} ...
The training loss is 5.58861638079572 with std:12.950165348119658. The val loss is 4770.578674512774 with std:51702.16334919041.
4770.578674512774 2.6489692876105297 6
The training loss is 5.716727766076838 with std:11.112347323050308. The val loss is 8.090222774266408 with std:14.265172927907928.
8.090222774266408 2.6489692876105297 6
The training loss is 5.410447442144024 with std:9.13952027118238. The val loss is 8.559791245581703 with std:22.244106639615755.
8.559791245581703 2.6489692876105297 6
Evaluating for {'degree': 6, 'lmda': 3.4551072945922217} ...
The training loss is 5.701614636861673 with std:13.127034955295805. The val loss is 3745.679510022752 with std:40582.219779009065.
3745.679510022752 3.4551072945922217 6
The training loss is 5.81641486962853 with std:11.395418391577243. The val loss is 8.08095887533713 with std:14.179993962714034.
8.08095887533713 3.4551072945922217 6
The training loss is 5.554927711314154 with std:9.409192378720922. The val loss is 8.545842645566351 with std:22.25259597357308.
8.545842645566351 3.4551072945922217 6
Evaluating for {'degree': 6, 'lmda': 4.506570337745478} ...
The training loss is 5.844574238594115 with std:13.370312401904979. The val loss is 2783.703624287508 with std:30134.881438800046.
2783.703624287508 4.506570337745478 6
The training loss is 5.942067097592467 with std:11.74781541053098. The val loss is 8.072221200116498 with std:14.067940346913758.
8.072221200116498 4.506570337745478 6
The training loss is 5.725446278580533 with std:9.718266748682778. The val loss is 8.565237532503032 with std:22.321682028101808.
8.565237532503032 4.506570337745478 6
Evaluating for {'degree': 6, 'lmda': 5.878016072274912} ...
The training loss is 6.031763006678398 with std:13.703018296442988. The val loss is 1961.639715079298 with std:21200.6805338126.
1961.639715079298 5.878016072274912 6
The training loss is 6.1113045748754375 with std:12.1965208650948. The val loss is 8.075015268029226 with std:13.934632410710346.
8.075015268029226 5.878016072274912 6
The training loss is 5.938688834064059 with std:10.086666893702361. The val loss is 8.638564041418242 with std:22.49817524949182.
8.638564041418242 5.878016072274912 6
Evaluating for {'degree': 6, 'lmda': 7.666822074546214} ...
The training loss is 6.2866377518430525 with std:14.155707455429908. The val loss is 1313.8958922486615 with std:14156.397992311608.
1313.8958922486615 7.666822074546214 6
The training loss is 6.352074131999776 with std:12.778355113505171. The val loss is 8.110165826433114 with std:13.795954342600806.
8.110165826433114 7.666822074546214 6
The training loss is 6.221579375054011 with std:10.545004554470548. The val loss is 8.797664406169682 with std:22.84135394247527.
8.797664406169682 7.666822074546214 6
Evaluating for {'degree': 6, 'lmda': 10.0} ...
The training loss is 6.647744067090405 with std:14.769706172473827. The val loss is 839.4789018249055 with std:8992.774095723937.
839.4789018249055 10.0 6
The training loss is 6.708580424096812 with std:13.542759503338871. The val loss is 8.214455951863217 with std:13.685305414259707.
8.214455951863217 10.0 6
The training loss is 6.617074054229775 with std:11.138953491121606. The val loss is 9.092643413978143 with std:23.42630970555846.
9.092643413978143 10.0 6
Evaluating for {'degree': 7, 'lmda': 0.01} ... through {'degree': 7, 'lmda': 10.0}
[Output truncated: 27 lambda values, log-spaced in [0.01, 10], three folds each. Fold 1 again degenerates: its validation loss starts at ~8.8e5 for lmda = 0.01 and never drops below ~5.1e3. Folds 2 and 3 reach their minima at strong regularization, 7.9380 and 8.497 respectively, both at lmda ~ 5.88.]
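Given how differently the folds behave, the raw printout alone does not say which configuration to keep. A minimal post-processing sketch, assuming the per-fold validation losses were collected into a dictionary during the sweep (the names results and best_config are illustrative, not from the original notebook; the two entries are toy copies of values from the log above):

In [ ]:
import numpy as np

# Hypothetical container, filled inside the CV loop:
# results[(degree, lmda)] = [val_loss_fold1, val_loss_fold2, val_loss_fold3]
results = {
    (7, 5.878016072274912): [10076.3, 7.938, 8.497],  # toy copy of the log values
    (7, 10.0):              [5202.9, 8.048, 8.802],
}

def best_config(results, agg=np.median):
    # aggregate the fold losses; the median is more robust than the mean
    # to the degenerate first fold visible in the log above
    return min(results, key=lambda cfg: agg(results[cfg]))

print(best_config(results))  # -> (7, 5.878016072274912)

With the mean instead of the median, the blown-up first fold would dominate and (for these two entries) push the selection toward the largest lmda.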
Evaluating for {'degree': 8, 'lmda': 0.01} ... through {'degree': 8, 'lmda': 10.0}
[Output truncated: same grid and same pattern. Fold 1 starts at ~7.6e6 for lmda = 0.01 and never drops below ~2.7e4; the best fold-2 validation loss is 8.0099 (lmda ~ 5.88) and the best fold-3 loss is 8.937 (lmda ~ 4.51).]
Evaluating for {'degree': 9, 'lmda': 0.01} ... through {'degree': 9, 'lmda': 10.0}
[Output truncated: fold 1 starts at ~4.2e7 for lmda = 0.01 and never drops below ~1.2e5; the best fold-2 validation loss is 8.0072 (lmda ~ 7.67) and the best fold-3 loss is 9.288 (lmda ~ 5.88).]
Evaluating for {'degree': 10, 'lmda': 0.01} ... through {'degree': 10, 'lmda': 0.0837677640068292}
[Output truncated: fold 1 starts at ~1.7e8 for lmda = 0.01, while folds 2 and 3 start around 14-16 and drift downward as lmda grows. The remainder of the degree-10 sweep resumes after the sketch below.]
Evaluating for {'degree': 10, 'lmda': 0.10926008611173785} ...
The training loss is 4.627770726452181 with std:11.564113600391465. The val loss is 15188250.538587015 with std:169404790.34941337.
15188250.538587015 0.10926008611173785 10
The training loss is 4.559128888981103 with std:9.377371087176188. The val loss is 9.772018004913834 with std:20.665299461080437.
9.772018004913834 0.10926008611173785 10
The training loss is 3.9297248382334917 with std:6.833717522073651. The val loss is 13.93948993033415 with std:62.423580624833804.
13.93948993033415 0.10926008611173785 10
Evaluating for {'degree': 10, 'lmda': 0.14251026703029984} ...
The training loss is 4.686361428400096 with std:11.742940850130184. The val loss is 9666771.970407536 with std:107807074.71799895.
9666771.970407536 0.14251026703029984 10
The training loss is 4.608832126570786 with std:9.484513633591057. The val loss is 9.554602263908794 with std:19.537901544414265.
9.554602263908794 0.14251026703029984 10
The training loss is 3.973849407636224 with std:6.898214005805149. The val loss is 13.449259703809457 with std:57.486612194949124.
13.449259703809457 0.14251026703029984 10
Evaluating for {'degree': 10, 'lmda': 0.18587918911465645} ...
The training loss is 4.744578388988059 with std:11.912631510442054. The val loss is 6125301.937999427 with std:68299533.20115776.
6125301.937999427 0.18587918911465645 10
The training loss is 4.659064263965804 with std:9.579429581373386. The val loss is 9.358709391498916 with std:18.614642321665457.
9.358709391498916 0.18587918911465645 10
The training loss is 4.021240820244318 with std:6.964132956744074. The val loss is 12.95928279536222 with std:52.618489972890984.
12.95928279536222 0.18587918911465645 10
Evaluating for {'degree': 10, 'lmda': 0.24244620170823283} ...
The training loss is 4.802190540119507 with std:12.070045342741482. The val loss is 3949676.6302920682 with std:44029750.147323556.
3949676.6302920682 0.24244620170823283 10
The training loss is 4.710831440209494 with std:9.663235084588917. The val loss is 9.181254085554084 with std:17.854902826312703.
9.181254085554084 0.24244620170823283 10
The training loss is 4.07327430483692 with std:7.033964581005057. The val loss is 12.488326285441355 with std:48.03439564935827.
12.488326285441355 0.24244620170823283 10
Evaluating for {'degree': 10, 'lmda': 0.31622776601683794} ...
The training loss is 4.859202164803335 with std:12.212665486980468. The val loss is 2660666.4517872822 with std:29650834.977677114.
2660666.4517872822 0.31622776601683794 10
The training loss is 4.765296676199358 with std:9.73854085262956. The val loss is 9.021150539760937 with std:17.231180304092593.
9.021150539760937 0.31622776601683794 10
The training loss is 4.131664619964329 with std:7.111006529729138. The val loss is 12.053671768650005 with std:43.91852113404101.
12.053671768650005 0.31622776601683794 10
Evaluating for {'degree': 10, 'lmda': 0.41246263829013524} ...
The training loss is 4.91583002054754 with std:12.33890055874562. The val loss is 1920588.5119473953 with std:21395229.799289696.
1920588.5119473953 0.41246263829013524 10
The training loss is 4.823660010288367 with std:9.809459519722228. The val loss is 8.878461218577161 with std:16.722976046263543.
8.878461218577161 0.41246263829013524 10
The training loss is 4.19830496701451 with std:7.199241890986631. The val loss is 11.668746901302242 with std:40.39929166445971.
11.668746901302242 0.41246263829013524 10
Evaluating for {'degree': 10, 'lmda': 0.5379838403443686} ...
The training loss is 4.972477535713042 with std:12.448436521734182. The val loss is 1509665.1737025392 with std:16810978.67374311.
1509665.1737025392 0.5379838403443686 10
The training loss is 4.887016076298936 with std:9.88140231478893. The val loss is 8.753518075418386 with std:16.31233906482924.
8.753518075418386 0.5379838403443686 10
The training loss is 4.275046619365536 with std:7.303054968308877. The val loss is 11.341083798868196 with std:37.53323382500305.
11.341083798868196 0.5379838403443686 10
Evaluating for {'degree': 10, 'lmda': 0.701703828670383} ...
The training loss is 5.029740848310499 with std:12.542599699958817. The val loss is 1292186.901893865 with std:14384222.820281807.
1292186.901893865 0.701703828670383 10
The training loss is 4.956229327452528 with std:9.960672997865071. The val loss is 8.646124811923343 with std:15.981250936398768.
8.646124811923343 0.701703828670383 10
The training loss is 4.36347300136305 with std:7.426799511978755. The val loss is 11.071313620988645 with std:35.30338551832618.
11.071313620988645 0.701703828670383 10
Evaluating for {'degree': 10, 'lmda': 0.9152473108773893} ...
The training loss is 5.088471349112055 with std:12.62467921379151. The val loss is 1185391.8872280656 with std:13191967.809368346.
1185391.8872280656 0.9152473108773893 10
The training loss is 5.031882878843345 with std:10.053977095402884. The val loss is 8.554948055430103 with std:15.710638385402769.
8.554948055430103 0.9152473108773893 10
The training loss is 4.464725252312108 with std:7.574300635806963. The val loss is 10.85352726363239 with std:33.63120838500434.
10.85352726363239 0.9152473108773893 10
Evaluating for {'degree': 10, 'lmda': 1.1937766417144369} ...
The training loss is 5.149911589174518 with std:12.700180042063321. The val loss is 1136917.0646861023 with std:12650442.057415226.
1136917.0646861023 1.1937766417144369 10
The training loss is 5.1143622230148695 with std:10.168065665961054. The val loss is 8.477227346124959 with std:15.480643180081133.
8.477227346124959 1.1937766417144369 10
The training loss is 4.579440920373339 with std:7.748449214807549. The val loss is 10.6769320082361 with std:32.39655596599491.
10.6769320082361 1.1937766417144369 10
Evaluating for {'degree': 10, 'lmda': 1.5570684047537318} ...
The training loss is 5.215925632749192 with std:12.777022490377208. The val loss is 1111486.3224102468 with std:12366491.61182951.
1111486.3224102468 1.5570684047537318 10
The training loss is 5.204134631705298 with std:10.30976232488237. The val loss is 8.40893649378045 with std:15.271739377132471.
8.40893649378045 1.5570684047537318 10
The training loss is 4.7078875512446485 with std:7.951109284743767. The val loss is 10.528518379950508 with std:31.461158467869193.
10.528518379950508 1.5570684047537318 10
Evaluating for {'degree': 10, 'lmda': 2.030917620904737} ...
The training loss is 5.289359799598236 with std:12.8657395298787. The val loss is 1084686.3507565544 with std:12068053.904689385.
1084686.3507565544 2.030917620904737 10
The training loss is 5.3022813714747725 with std:10.486547356330565. The val loss is 8.345478261959673 with std:15.066342058617598.
8.345478261959673 2.030917620904737 10
The training loss is 4.850398118287427 with std:8.18356607597957. The val loss is 10.396341077231838 with std:30.693318252128456.
10.396341077231838 2.030917620904737 10
Evaluating for {'degree': 10, 'lmda': 2.6489692876105297} ...
The training loss is 5.374591491893005 with std:12.979732478930574. The val loss is 1040851.675030416 with std:11580503.311593963.
1040851.675030416 2.6489692876105297 10
The training loss is 5.411334279489123 with std:10.707724829844146. The val loss is 8.282898338151675 with std:14.850655843433291.
8.282898338151675 2.6489692876105297 10
The training loss is 5.008216824327447 with std:8.447667544940002. The val loss is 10.272880386730227 with std:29.990822709779998.
10.272880386730227 2.6489692876105297 10
Evaluating for {'degree': 10, 'lmda': 3.4551072945922217} ...
The training loss is 5.478362235044876 with std:13.135644848353142. The val loss is 972457.6651489947 with std:10819861.529036716.
972457.6651489947 3.4551072945922217 10
The training loss is 5.536477655692482 with std:10.98606166730083. The val loss is 8.219529076824365 with std:14.61660061822274.
8.219529076824365 3.4551072945922217 10
The training loss is 5.18483749396677 with std:8.747677628944231. The val loss is 10.157883729833474 with std:29.296992820801908.
10.157883729833474 3.4551072945922217 10
Evaluating for {'degree': 10, 'lmda': 4.506570337745478} ...
The training loss is 5.611070351824645 with std:13.353946999602986. The val loss is 879327.9403249181 with std:9783979.698955562.
879327.9403249181 4.506570337745478 10
The training loss is 5.687238495750373 with std:11.339740966903197. The val loss is 8.158013571003737 with std:14.363756009837715.
8.158013571003737 4.506570337745478 10
The training loss is 5.387914237880035 with std:9.092747002962025. The val loss is 10.06033715106873 with std:28.60546927111398.
10.06033715106873 4.506570337745478 10
Evaluating for {'degree': 10, 'lmda': 5.878016072274912} ...
The training loss is 5.78883892226819 with std:13.659918640595752. The val loss is 766976.1331274739 with std:8534094.473926842.
766976.1331274739 5.878016072274912 10
The training loss is 5.879937189042927 with std:11.794531605661547. The val loss is 8.10787214875766 with std:14.101518774312312.
8.10787214875766 5.878016072274912 10
The training loss is 5.6319234333082955 with std:9.499902890844902. The val loss is 9.999848140689297 with std:27.953619512894953.
9.999848140689297 5.878016072274912 10
Evaluating for {'degree': 10, 'lmda': 7.666822074546214} ...
The training loss is 6.036874654942354 with std:14.085339826424615. The val loss is 644229.606454154 with std:7168385.169659648.
644229.606454154 7.666822074546214 10
The training loss is 6.141406878276825 with std:12.386179037978533. The val loss is 8.089121241754015 with std:13.852128361974874.
8.089121241754015 7.666822074546214 10
The training loss is 5.941993509146925 with std:9.997560993073067. The val loss is 10.008489173338901 with std:27.40933545919273.
10.008489173338901 7.666822074546214 10
Evaluating for {'degree': 10, 'lmda': 10.0} ...
The training loss is 6.394868749122544 with std:14.671251286969378. The val loss is 520720.5076313126 with std:5794042.489020432.
520720.5076313126 10.0 10
The training loss is 6.5147717679231185 with std:13.163062638476832. The val loss is 8.137826726263006 with std:13.655687221813128.
8.137826726263006 10.0 10
The training loss is 6.3596528651237625 with std:10.629636330977808. The val loss is 10.13472045392533 with std:27.05840784064152.
10.13472045392533 10.0 10
Evaluating for {'degree': 11, 'lmda': 0.01} ...
The training loss is 4.107580032100943 with std:9.981541306187786. The val loss is 514093507.6689721 with std:5739879242.710038.
514093507.6689721 0.01 11
The training loss is 4.049657099774255 with std:8.113785703759984. The val loss is 13.439587471891507 with std:51.72779811746422.
13.439587471891507 0.01 11
The training loss is 3.561740945295461 with std:6.163947172338833. The val loss is 15.305063092909133 with std:77.99941708293802.
15.305063092909133 0.01 11
Evaluating for {'degree': 11, 'lmda': 0.013043213867190054} ...
The training loss is 4.153660783424772 with std:10.126210559090529. The val loss is 529644113.1337412 with std:5913554052.111032.
529644113.1337412 0.013043213867190054 11
The training loss is 4.1047832499701435 with std:8.246789559667404. The val loss is 12.744011626052586 with std:45.86257664878375.
12.744011626052586 0.013043213867190054 11
The training loss is 3.5964448144152548 with std:6.236100290175237. The val loss is 16.008348617731876 with std:85.22097219946333.
16.008348617731876 0.013043213867190054 11
Evaluating for {'degree': 11, 'lmda': 0.017012542798525893} ...
The training loss is 4.200592501748394 with std:10.274654626214554. The val loss is 511031041.36581564 with std:5705763968.613176.
511031041.36581564 0.017012542798525893 11
The training loss is 4.159512755122866 with std:8.385618438195829. The val loss is 12.11629926447872 with std:40.456404435422705.
12.11629926447872 0.017012542798525893 11
The training loss is 3.6308473538996497 with std:6.30966075940972. The val loss is 16.57985771944045 with std:91.22475368563764.
16.57985771944045 0.017012542798525893 11
Evaluating for {'degree': 11, 'lmda': 0.02218982341458972} ...
The training loss is 4.248877922237609 with std:10.429272632642721. The val loss is 464684100.9727516 with std:5188300684.082562.
464684100.9727516 0.02218982341458972 11
The training loss is 4.213901869046252 with std:8.529362747353403. The val loss is 11.565023416470325 with std:35.6916995435636.
11.565023416470325 0.02218982341458972 11
The training loss is 3.665295664357909 with std:6.384147339757998. The val loss is 17.00678224303904 with std:95.80270349602715.
17.00678224303904 0.02218982341458972 11
Evaluating for {'degree': 11, 'lmda': 0.028942661247167517} ...
The training loss is 4.299043939023561 with std:10.591769691264265. The val loss is 399722072.3914343 with std:4462981933.804748.
399722072.3914343 0.028942661247167517 11
The training loss is 4.267968950054812 with std:8.676475123432962. The val loss is 11.093587090856774 with std:31.65869219786764.
11.093587090856774 0.028942661247167517 11
The training loss is 3.700112907887739 with std:6.458858609974568. The val loss is 17.282132261928208 with std:98.81157894496164.
17.282132261928208 0.028942661247167517 11
Evaluating for {'degree': 11, 'lmda': 0.037750532053243954} ...
The training loss is 4.351519295916858 with std:10.76278885596458. The val loss is 326085787.4762252 with std:3640804807.501348.
326085787.4762252 0.037750532053243954 11
The training loss is 4.3216530805775095 with std:8.824780051518255. The val loss is 10.699403328780782 with std:28.361210649595893.
10.699403328780782 0.037750532053243954 11
The training loss is 3.7355676836583047 with std:6.532975904096592. The val loss is 17.40365511143975 with std:100.17082220394452.
17.40365511143975 0.037750532053243954 11
Evaluating for {'degree': 11, 'lmda': 0.04923882631706739} ...
The training loss is 4.4065142139367905 with std:10.941710847929967. The val loss is 252821246.49292254 with std:2822774907.577625.
252821246.49292254 0.04923882631706739 11
The training loss is 4.3748210671503704 with std:8.97159536041712. The val loss is 10.374597416874794 with std:25.73640932944222.
10.374597416874794 0.04923882631706739 11
The training loss is 3.771881452605116 with std:6.605714797207256. The val loss is 17.373489219123304 with std:99.86461652442007.
17.373489219123304 0.04923882631706739 11
Evaluating for {'degree': 11, 'lmda': 0.0642232542222936} ...
The training loss is 4.4639373436238134 with std:11.126659070250252. The val loss is 186780125.12224904 with std:2085396464.804671.
186780125.12224904 0.0642232542222936 11
The training loss is 4.427325508665763 with std:9.113971207096334. The val loss is 10.1076802430215 with std:23.6807493214491.
10.1076802430215 0.0642232542222936 11
The training loss is 3.8092680455882952 with std:6.676492243679965. The val loss is 17.19848969664166 with std:97.94709847834736.
17.19848969664166 0.0642232542222936 11
Evaluating for {'degree': 11, 'lmda': 0.0837677640068292} ...
The training loss is 4.523384779492228 with std:11.31470615195751. The val loss is 131981072.75002956 with std:1473540896.1441858.
131981072.75002956 0.0837677640068292 11
The training loss is 4.479095172135532 with std:9.249013864289147. The val loss is 9.885637927622867 with std:22.07576064763753.
9.885637927622867 0.0837677640068292 11
The training loss is 3.8479953467863317 with std:6.745078711284577. The val loss is 16.89094503837499 with std:94.54870805068505.
16.89094503837499 0.0837677640068292 11
Evaluating for {'degree': 11, 'lmda': 0.10926008611173785} ...
The training loss is 4.584214704577865 with std:11.50222930600137. The val loss is 89702560.45051824 with std:1001483621.4210771.
89702560.45051824 0.10926008611173785 11
The training loss is 4.530227378873129 with std:9.374237382349648. The val loss is 9.695971618585125 with std:20.808978236111066.
9.695971618585125 0.10926008611173785 11
The training loss is 3.8884629819729937 with std:6.811722876133241. The val loss is 16.469275729146975 with std:89.88060284073472.
16.469275729146975 0.10926008611173785 11
Evaluating for {'degree': 11, 'lmda': 0.14251026703029984} ...
The training loss is 4.6456891373975 with std:11.685326972696405. The val loss is 59136366.33450791 with std:660200670.9187665.
59136366.33450791 0.14251026703029984 11
The training loss is 4.5810552865114325 with std:9.487891694864603. The val loss is 9.528317224600313 with std:19.787471137428444.
9.528317224600313 0.14251026703029984 11
The training loss is 3.9312906036263975 with std:6.877255634780207. The val loss is 15.958240062253736 with std:84.2330307935468.
15.958240062253736 0.14251026703029984 11
Evaluating for {'degree': 11, 'lmda': 0.18587918911465645} ...
The training loss is 4.707138114390037 with std:11.860210838721875. The val loss is 38277187.09168924 with std:427301721.777208.
38277187.09168924 0.18587918911465645 11
The training loss is 4.632174865492837 with std:9.589242938377783. The val loss is 9.375387342291308 with std:18.943415298499453.
9.375387342291308 0.18587918911465645 11
The training loss is 3.9774013514656943 with std:6.943184628377328. The val loss is 15.388176325273308 with std:77.96306725016304.
15.388176325273308 0.18587918911465645 11
Evaluating for {'degree': 11, 'lmda': 0.24244620170823283} ...
The training loss is 4.768094816356387 with std:12.023527590942402. The val loss is 24737208.990919515 with std:276125267.63011515.
24737208.990919515 0.24244620170823283 11
The training loss is 4.684427523395453 with std:9.67880372985279. The val loss is 9.233139505545964 with std:18.232899199701254.
9.233139505545964 0.24244620170823283 11
The training loss is 4.02807048084528 with std:7.011776189496404. The val loss is 14.792925831458236 with std:71.46795353185883.
14.792925831458236 0.24244620170823283 11
Evaluating for {'degree': 11, 'lmda': 0.31622776601683794} ...
The training loss is 4.828367031058384 with std:12.17262031795009. The val loss is 16306071.905679656 with std:181991015.67552042.
16306071.905679656 0.31622776601683794 11
The training loss is 4.738840780852703 with std:9.758505823898473. The val loss is 9.100265880739071 with std:17.63030140040142.
9.100265880739071 0.31622776601683794 11
The training loss is 4.084905883858603 with std:7.086097578061795. The val loss is 14.206391392854837 with std:65.14353768491452.
14.206391392854837 0.31622776601683794 11
Evaluating for {'degree': 11, 'lmda': 0.41246263829013524} ...
The training loss is 4.8880435513074785 with std:12.305776082059525. The val loss is 11224606.993440244 with std:125256472.99154927.
11224606.993440244 0.41246263829013524 11
The training loss is 4.796534507165407 with std:9.831779253916949. The val loss is 8.97724540011576 with std:17.12094112066816.
8.97724540011576 0.41246263829013524 11
The training loss is 4.1497445691987345 with std:7.169972671291948. The val loss is 13.658222617822613 with std:59.333445666149565.
13.658222617822613 0.41246263829013524 11
Evaluating for {'degree': 11, 'lmda': 0.5379838403443686} ...
The training loss is 4.947466219883414 with std:12.422502675554947. The val loss is 8236380.31360728 with std:91892638.25475706.
8236380.31360728 0.5379838403443686 11
The training loss is 4.858608701371182 with std:9.90348162570509. The val loss is 8.86524192803345 with std:16.69428426450869.
8.86524192803345 0.5379838403443686 11
The training loss is 4.224479362922796 with std:7.267801766575978. The val loss is 13.169684974751776 with std:54.28127077551786.
13.169684974751776 0.5379838403443686 11
Evaluating for {'degree': 11, 'lmda': 0.701703828670383} ...
The training loss is 5.007212699746684 with std:12.523841889969002. The val loss is 6512423.966355003 with std:72643719.98852646.
6512423.966355003 0.701703828670383 11
The training loss is 4.926040617210118 with std:9.979644601888406. The val loss is 8.76508410499471 with std:16.339096507801717.
8.76508410499471 0.701703828670383 11
The training loss is 4.3108555157875585 with std:7.384221246072949. The val loss is 12.750964816851237 with std:50.100412325559546.
12.750964816851237 0.701703828670383 11
Evaluating for {'degree': 11, 'lmda': 0.9152473108773893} ...
The training loss is 5.0681292768534485 with std:12.612690825713367. The val loss is 5532172.824596001 with std:61697765.79499446.
5532172.824596001 0.9152473108773893 11
The training loss is 4.999630979173704 with std:10.067084920864279. The val loss is 8.67648886870044 with std:16.041012005733037.
8.67648886870044 0.9152473108773893 11
The training loss is 4.410288604067757 with std:7.523637908214518. The val loss is 12.400761578955352 with std:46.7712137006132.
12.400761578955352 0.9152473108773893 11
Evaluating for {'degree': 11, 'lmda': 1.1937766417144369} ...
The training loss is 5.131440972699738 with std:12.694097605519694. The val loss is 4973158.152113988 with std:55454948.43850517.
4973158.152113988 1.1937766417144369 11
The training loss is 5.080051199281327 with std:10.173038181714233. The val loss is 8.597646869381652 with std:15.78231969665774.
8.597646869381652 1.1937766417144369 11
The training loss is 4.52376380591867 with std:7.689761437361881. The val loss is 12.108270456727046 with std:44.16543329073526.
12.108270456727046 1.1937766417144369 11
Evaluating for {'degree': 11, 'lmda': 1.5570684047537318} ...
The training loss is 5.198964468863792 with std:12.775526834856404. The val loss is 4631696.9065977195 with std:51641846.940930635.
4631696.9065977195 1.5570684047537318 11
The training loss is 5.168058257343004 with std:10.305055451653184. The val loss is 8.525278008846438 with std:15.543422462420432.
8.525278008846438 1.5570684047537318 11
The training loss is 4.651900136726487 with std:7.885352304295174. The val loss is 11.857046183484567 with std:42.089895537720906.
11.857046183484567 1.5570684047537318 11
Evaluating for {'degree': 11, 'lmda': 2.030917620904737} ...
The training loss is 5.273462388716179 with std:12.867128049577502. The val loss is 4375798.238313187 with std:48785422.032777324.
4375798.238313187 2.030917620904737 11
The training loss is 5.264957002865939 with std:10.471395630189214. The val loss is 8.455248267812845 with std:15.305335621534818.
8.455248267812845 2.030917620904737 11
The training loss is 4.795298473898439 with std:8.112454703881662. The val loss is 11.629956978440491 with std:40.33769837398188.
11.629956978440491 2.030917620904737 11
Evaluating for {'degree': 11, 'lmda': 2.6489692876105297} ...
The training loss is 5.359200884969405 with std:12.98206006982665. The val loss is 4121464.3939652867 with std:45948238.273309514.
4121464.3939652867 2.6489692876105297 11
The training loss is 5.373392326121524 with std:10.682036499565845. The val loss is 8.383778478856351 with std:15.052658215949549.
8.383778478856351 2.6489692876105297 11
The training loss is 4.955308775918332 with std:8.373346785901946. The val loss is 11.414317782531684 with std:38.73521795274391.
11.414317782531684 2.6489692876105297 11
Evaluating for {'degree': 11, 'lmda': 3.4551072945922217} ...
The training loss is 5.4628102405565615 with std:13.136927780223443. The val loss is 3821675.455783415 with std:42605388.43924237.
3821675.455783415 3.4551072945922217 11
The training loss is 5.498553968726877 with std:10.950269239064454. The val loss is 8.309194629165496 with std:14.776579230615573.
8.309194629165496 3.4551072945922217 11
The training loss is 5.135331135911529 with std:8.6723092813758. The val loss is 11.206237028168104 with std:37.17452455665931.
11.206237028168104 3.4551072945922217 11
Evaluating for {'degree': 11, 'lmda': 4.506570337745478} ...
The training loss is 5.594620840601194 with std:13.352411407405036. The val loss is 3459715.690397353 with std:38569990.38039723.
3459715.690397353 4.506570337745478 11
The training loss is 5.649913650951078 with std:11.29472670611231. The val loss is 8.234155810825177 with std:14.477650380741602.
8.234155810825177 4.506570337745478 11
The training loss is 5.342747129333188 with std:9.0181591193624. The val loss is 11.013450548718938 with std:35.62447091374179.
11.013450548718938 4.506570337745478 11
Evaluating for {'degree': 11, 'lmda': 5.878016072274912} ...
The training loss is 5.770777292028807 with std:13.654248153231059. The val loss is 3042238.5198191414 with std:33915846.50356988.
3042238.5198191414 5.878016072274912 11
The training loss is 5.843736672770799 with std:11.741684597534737. The val loss is 8.168441249210682 with std:14.168345619243016.
8.168441249210682 5.878016072274912 11
The training loss is 5.59164821606926 with std:9.427421640912335. The val loss is 10.856671303024807 with std:34.12044265048973.
10.856671303024807 5.878016072274912 11
Evaluating for {'degree': 11, 'lmda': 7.666822074546214} ...
The training loss is 6.016626598951371 with std:14.074847875950974. The val loss is 2590908.2767312583 with std:28884255.053044613.
2590908.2767312583 7.666822074546214 11
The training loss is 6.106836453368365 with std:12.327544515536673. The val loss is 8.132696449340068 with std:13.875905564423386.
8.132696449340068 7.666822074546214 11
The training loss is 5.906745009677811 with std:9.928060690470643. The val loss is 10.770555204753265 with std:32.74028759806859.
10.770555204753265 7.666822074546214 11
Evaluating for {'degree': 11, 'lmda': 10.0} ...
The training loss is 6.372112670234473 with std:14.655897840691889. The val loss is 2133665.5197004513 with std:23786615.08366159.
2133665.5197004513 10.0 11
The training loss is 6.4823370975275845 with std:13.10146783738277. The val loss is 8.163944069354686 with std:13.646495564279922.
8.163944069354686 10.0 11
The training loss is 6.329177152272928 with std:10.563745424401478. The val loss is 10.806262565057725 with std:31.57842305534936.
10.806262565057725 10.0 11
Evaluating for {'degree': 12, 'lmda': 0.01} ...
The training loss is 4.0752619402722345 with std:9.899877269072382. The val loss is 1167786596.1754885 with std:13044708222.242357.
1167786596.1754885 0.01 12
The training loss is 4.002528159860153 with std:8.018645307566059. The val loss is 16.145224313280007 with std:78.01617058979667.
16.145224313280007 0.01 12
The training loss is 3.5329749281952654 with std:6.11649550904213. The val loss is 13.053905347397754 with std:51.79938355738785.
13.053905347397754 0.01 12
Evaluating for {'degree': 12, 'lmda': 0.013043213867190054} ...
The training loss is 4.1192613587642155 with std:10.04047928717459. The val loss is 1410888340.3357544 with std:15760516024.96133.
1410888340.3357544 0.013043213867190054 12
The training loss is 4.0549910219130565 with std:8.146625072205174. The val loss is 15.07511061446737 with std:68.60963297591364.
15.07511061446737 0.013043213867190054 12
The training loss is 3.567581609033122 with std:6.188970613265576. The val loss is 14.219211451373472 with std:63.79288459204413.
14.219211451373472 0.013043213867190054 12
Evaluating for {'degree': 12, 'lmda': 0.017012542798525893} ...
The training loss is 4.163727299605519 with std:10.183283652182968. The val loss is 1539015439.881083 with std:17191945360.06897.
1539015439.881083 0.017012542798525893 12
The training loss is 4.107782026892308 with std:8.282562086533638. The val loss is 14.072822182970585 with std:59.61880304183672.
14.072822182970585 0.017012542798525893 12
The training loss is 3.6016755245325074 with std:6.2630407155277. The val loss is 15.34778207110582 with std:75.9402638572283.
15.34778207110582 0.017012542798525893 12
Evaluating for {'degree': 12, 'lmda': 0.02218982341458972} ...
The training loss is 4.20911908416201 with std:10.330986653466043. The val loss is 1547366730.4989707 with std:17285340802.587196.
1547366730.4989707 0.02218982341458972 12
The training loss is 4.161015151935798 with std:8.425858197415142. The val loss is 13.165853630348943 with std:51.394969419264235.
13.165853630348943 0.02218982341458972 12
The training loss is 3.6355753226206207 with std:6.338390103491586. The val loss is 16.389654423877122 with std:87.48522144168764.
16.389654423877122 0.02218982341458972 12
Evaluating for {'degree': 12, 'lmda': 0.028942661247167517} ...
The training loss is 4.256010984814267 with std:10.485940094699647. The val loss is 1451084694.592727 with std:16209850300.990196.
1451084694.592727 0.028942661247167517 12
The training loss is 4.214704325820163 with std:8.575086169368333. The val loss is 12.374218472446 with std:44.186311996398544.
12.374218472446 0.028942661247167517 12
The training loss is 3.6696007584486474 with std:6.414439764153957. The val loss is 17.302441511985297 with std:97.8010267620706.
17.302441511985297 0.028942661247167517 12
Evaluating for {'degree': 12, 'lmda': 0.037750532053243954} ...
The training loss is 4.304994997931261 with std:10.649691219195844. The val loss is 1278391438.879208 with std:14280743018.80999.
1278391438.879208 0.037750532053243954 12
The training loss is 4.268724255047103 with std:8.72799964009124. The val loss is 11.706738524553902 with std:38.11565442808491.
11.706738524553902 0.037750532053243954 12
The training loss is 3.70403612438824 with std:6.490411689308674. The val loss is 18.051143040338435 with std:106.38289294781116.
18.051143040338435 0.037750532053243954 12
Evaluating for {'degree': 12, 'lmda': 0.04923882631706739} ...
The training loss is 4.3565558531247035 with std:10.822619860089757. The val loss is 1063138326.6216185 with std:11876177139.51597.
1063138326.6216185 0.04923882631706739 12
The training loss is 4.322825150637203 with std:8.881674283400555. The val loss is 11.160570692311433 with std:33.18597763321743.
11.160570692311433 0.04923882631706739 12
The training loss is 3.739130895739078 with std:6.565459207450366. The val loss is 18.607776696686088 with std:112.84135310389836.
18.607776696686088 0.04923882631706739 12
Evaluating for {'degree': 12, 'lmda': 0.0642232542222936} ...
The training loss is 4.410943001581105 with std:11.003751278859601. The val loss is 837824662.1422096 with std:9359208586.81613.
837824662.1422096 0.0642232542222936 12
The training loss is 4.376702893601735 with std:9.032780959426153. The val loss is 10.723322333739702 with std:29.305309130494233.
10.723322333739702 0.0642232542222936 12
The training loss is 3.775135890589088 with std:6.63883904497432. The val loss is 18.951802808756653 with std:116.9028233642998.
18.951802808756653 0.0642232542222936 12
Evaluating for {'degree': 12, 'lmda': 0.0837677640068292} ...
The training loss is 4.468081669810135 with std:11.190796626922094. The val loss is 628208452.8706115 with std:7017586132.092859.
628208452.8706115 0.0837677640068292 12
The training loss is 4.430105717172833 with std:9.177952907198916. The val loss is 10.376550138807385 with std:26.320506190068514.
10.376550138807385 0.0837677640068292 12
The training loss is 3.8123624560669467 with std:6.710088777014898. The val loss is 19.071828644598128 with std:118.42210462599799.
19.071828644598128 0.0837677640068292 12
Evaluating for {'degree': 12, 'lmda': 0.10926008611173785} ...
The training loss is 4.527564138916357 with std:11.380419212926656. The val loss is 450447878.1480407 with std:5031817937.013607.
450447878.1480407 0.10926008611173785 12
The training loss is 4.4829464917403286 with std:9.314181625890065. The val loss is 10.099523264104453 with std:24.051691907094384.
10.099523264104453 0.10926008611173785 12
The training loss is 3.8512518382133103 with std:6.779181382123733. The val loss is 18.968300903566654 with std:117.40493731556059.
18.968300903566654 0.10926008611173785 12
Evaluating for {'degree': 12, 'lmda': 0.14251026703029984} ...
The training loss is 4.588736307701859 with std:11.568659627312778. The val loss is 311025167.69798625 with std:3474320847.5009465.
311025167.69798625 0.14251026703029984 12
The training loss is 4.535390084264196 with std:9.439176154112205. The val loss is 9.872497052055824 with std:22.32195237869026.
9.872497052055824 0.14251026703029984 12
The training loss is 3.892446950605451 with std:6.846648832328997. The val loss is 18.65616224469086 with std:114.03069776004722.
18.65616224469086 0.14251026703029984 12
Evaluating for {'degree': 12, 'lmda': 0.18587918911465645} ...
The training loss is 4.650855782995311 with std:11.751411816114343. The val loss is 208818834.3164953 with std:2332569146.9085526.
208818834.3164953 0.18587918911465645 12
The training loss is 4.587895749487008 with std:9.55164951365877. The val loss is 9.679076805808307 with std:20.978793624873262.
9.679076805808307 0.18587918911465645 12
The training loss is 3.9368588771836257 with std:6.913682672847001. The val loss is 18.166085011940446 with std:108.66055313423895.
18.166085011940446 0.18587918911465645 12
Evaluating for {'degree': 12, 'lmda': 0.24244620170823283} ...
The training loss is 4.713268680630554 with std:11.924853490591145. The val loss is 138141481.51809773 with std:1543029907.7694597.
138141481.51809773 0.24244620170823283 12
The training loss is 4.641208302361817 with std:9.651533834578753. The val loss is 9.507473643772617 with std:19.905503564073463.
9.507473643772617 0.24244620170823283 12
The training loss is 3.9857122282154016 with std:6.982219776603604. The val loss is 17.54305796953699 with std:101.81744428294493.
17.54305796953699 0.24244620170823283 12
Evaluating for {'degree': 12, 'lmda': 0.31622776601683794} ...
The training loss is 4.775549869833131 with std:12.085791942399979. The val loss is 91604258.59819546 with std:1023162042.7747638.
91604258.59819546 0.31622776601683794 12
The training loss is 4.696303068110316 with std:9.740142599251087. The val loss is 9.350642213742733 with std:19.02261728574395.
9.350642213742733 0.31622776601683794 12
The training loss is 4.040545507439676 with std:7.055005445439432. The val loss is 16.841755268022528 with std:94.13116615397615.
16.841755268022528 0.31622776601683794 12
Evaluating for {'degree': 12, 'lmda': 0.41246263829013524} ...
The training loss is 4.837576393725256 with std:12.231951217198876. The val loss is 62132628.380303934 with std:693934902.6780554.
62132628.380303934 0.41246263829013524 12
The training loss is 4.754296872271551 with std:9.820280224497205. The val loss is 9.205493305559548 with std:18.281748853080234.
9.205493305559548 0.41246263829013524 12
The training loss is 4.103147058887082 with std:7.135602320699309. The val loss is 16.119138103386167 with std:86.25326432988741.
16.119138103386167 0.41246263829013524 12
Evaluating for {'degree': 12, 'lmda': 0.5379838403443686} ...
The training loss is 4.899541097436828 with std:12.362255666920545. The val loss is 43989475.27493186 with std:491258009.60311604.
43989475.27493186 0.5379838403443686 12
The training loss is 4.816342479928476 with std:9.896261981902631. The val loss is 9.071540253858727 with std:17.655372240452724.
9.071540253858727 0.5379838403443686 12
The training loss is 4.175426307751944 with std:7.228295608024739. The val loss is 15.425876626140226 with std:78.75949981287599.
15.425876626140226 0.5379838403443686 12
Evaluating for {'degree': 12, 'lmda': 0.701703828670383} ...
The training loss is 4.961942624114836 with std:12.477145218451664. The val loss is 33015669.17374861 with std:368668950.4224069.
33015669.17374861 0.701703828670383 12
The training loss is 4.883529404597261 with std:9.973790598799614. The val loss is 8.949387122320545 with std:17.126266896613846.
8.949387122320545 0.701703828670383 12
The training loss is 4.259243358149691 with std:7.337848903003458. The val loss is 14.798917694869786 with std:72.06622684956449.
14.798917694869786 0.701703828670383 12
Evaluating for {'degree': 12, 'lmda': 0.9152473108773893} ...
The training loss is 5.025597951264354 with std:12.578915335924998. The val loss is 26421353.921813548 with std:295002198.5934954.
26421353.921813548 0.9152473108773893 12
The training loss is 4.956822377290558 with std:10.059676931616014. The val loss is 8.839397751535849 with std:16.679392668165736.
8.839397751535849 0.9152473108773893 12
The training loss is 4.356236615585924 with std:7.469105542794939. The val loss is 14.257316850633647 with std:66.38472761182548.
14.257316850633647 0.9152473108773893 12
Evaluating for {'degree': 12, 'lmda': 1.1937766417144369} ...
The training loss is 5.091720121978891 with std:12.672050383686829. The val loss is 22419943.40854286 with std:250300511.82502195.
22419943.40854286 1.1937766417144369 12
The training loss is 5.037083307385792 with std:10.161494968088535. The val loss is 8.740776173226068 with std:16.2974716092276.
8.740776173226068 1.1937766417144369 12
The training loss is 4.467702121515042 with std:7.626513968100651. The val loss is 13.802310569162854 with std:61.72409070547495.
13.802310569162854 1.1937766417144369 12
Evaluating for {'degree': 12, 'lmda': 1.5570684047537318} ...
The training loss is 5.162101260816585 with std:12.763532713654998. The val loss is 19885742.066138502 with std:221989954.6598481.
19885742.066138502 1.5570684047537318 12
The training loss is 5.1252445613338615 with std:10.287380454217933. The val loss is 8.651212311216856 with std:15.960177334076327.
8.651212311216856 1.5570684047537318 12
The training loss is 4.59460572608976 with std:7.813766438874916. The val loss is 13.421242525259055 with std:57.93655184520985.
13.421242525259055 1.5570684047537318 12
Evaluating for {'degree': 12, 'lmda': 2.030917620904737} ...
The training loss is 5.2394486066335935 with std:12.863146233936293. The val loss is 18107397.500920527 with std:202125306.43441477.
18107397.500920527 2.030917620904737 12
The training loss is 5.222723654409318 with std:10.446239447989516. The val loss is 8.567202377922385 with std:15.64598139114916.
8.567202377922385 2.030917620904737 12
The training loss is 4.737852151145861 with std:8.033832402143734. The val loss is 13.094102550999496 with std:54.788274864764425.
13.094102550999496 2.030917620904737 12
Evaluating for {'degree': 12, 'lmda': 2.6489692876105297} ...
The training loss is 5.327939780769496 with std:12.983823004885396. The val loss is 16638114.638216576 with std:185716398.73990253.
16638114.638216576 2.6489692876105297 12
The training loss is 5.332179371659655 with std:10.648572087056312. The val loss is 8.485093782854827 with std:15.335454815165376.
8.485093782854827 2.6489692876105297 12
The training loss is 4.898965801231279 with std:8.289677906403712. The val loss is 12.801150837927116 with std:52.035403502928894.
12.801150837927116 2.6489692876105297 12
Evaluating for {'degree': 12, 'lmda': 3.4551072945922217} ...
The training loss is 5.434097237353448 with std:13.142095212020081. The val loss is 15213114.892471505 with std:169805784.02603602.
15213114.892471505 3.4551072945922217 12
The training loss is 5.458709515345941 with std:10.907959091887982. The val loss is 8.402817321869986 with std:15.015001262163542.
8.402817321869986 3.4551072945922217 12
The training loss is 5.081329569552401 with std:8.5858589241645. The val loss is 12.530051952943985 with std:49.48662914598462.
12.530051952943985 3.4551072945922217 12
Evaluating for {'degree': 12, 'lmda': 4.506570337745478} ...
The training loss is 5.568149600783602 with std:13.358737949381151. The val loss is 13702053.207829205 with std:152937117.82610217.
13702053.207829205 4.506570337745478 12
The training loss is 5.611616524404924 with std:11.243098607231762. The val loss is 8.322224944172675 with std:14.68038802625036.
8.322224944172675 4.506570337745478 12
The training loss is 5.292105829751046 with std:8.931007109064407. The val loss is 12.28114092243692 with std:47.03803360199734.
12.28114092243692 4.506570337745478 12
Evaluating for {'degree': 12, 'lmda': 5.878016072274912} ...
The training loss is 5.746173384788921 with std:13.659756495545315. The val loss is 12073926.840414258 with std:134763274.0076449.
12073926.840414258 5.878016072274912 12
The training loss is 5.806972551237027 with std:11.680212429710052. The val loss is 8.252056596320708 with std:14.339938361143876.
8.252056596320708 5.878016072274912 12
The training loss is 5.545011754937507 with std:9.341100002923735. The val loss is 12.070241156683363 with std:44.67486632459523.
12.070241156683363 5.878016072274912 12
Evaluating for {'degree': 12, 'lmda': 7.666822074546214} ...
The training loss is 5.993500625225766 with std:14.077982578597233. The val loss is 10364221.541338336 with std:115679572.9661028.
10364221.541338336 7.666822074546214 12
The training loss is 6.071429251314523 with std:12.255675578885215. The val loss is 8.211860465993322 with std:14.017847226584518.
8.211860465993322 7.666822074546214 12
The training loss is 5.864311955518902 with std:9.843391739092427. The val loss is 11.929852315441439 with std:42.44679135193588.
11.929852315441439 7.666822074546214 12
Evaluating for {'degree': 12, 'lmda': 10.0} ...
The training loss is 6.350117526188108 with std:14.655620895295092. The val loss is 8643332.120056156 with std:96471270.6644354.
8643332.120056156 10.0 12
The training loss is 6.448014276747612 with std:13.01878077016159. The val loss is 8.237616815417041 with std:13.758668701526586.
8.237616815417041 10.0 12
The training loss is 6.29071511031736 with std:10.480901800703458. The val loss is 11.910896325898754 with std:40.43232580031783.
11.910896325898754 10.0 12
Evaluating for {'degree': 13, 'lmda': 0.01} ...
The training loss is 4.053513103649818 with std:9.8698042921712. The val loss is 1643593408.4632585 with std:18365061723.867325.
1643593408.4632585 0.01 13
The training loss is 3.989144776985466 with std:8.021020197178922. The val loss is 15.482447751184942 with std:74.09353825232594.
15.482447751184942 0.01 13
The training loss is 3.5116896959643142 with std:6.082660304965941. The val loss is 9.708437156847467 with std:25.69897511260262.
9.708437156847467 0.01 13
Evaluating for {'degree': 13, 'lmda': 0.013043213867190054} ...
The training loss is 4.096351913536543 with std:10.012759148870293. The val loss is 2751601513.6345606 with std:30746577924.323517.
2751601513.6345606 0.013043213867190054 13
The training loss is 4.040768883506429 with std:8.1459762206999. The val loss is 14.52066658663522 with std:65.67563271298236.
14.52066658663522 0.013043213867190054 13
The training loss is 3.5470082772373557 with std:6.157066567732176. The val loss is 10.912983487070468 with std:34.3695464039166.
10.912983487070468 0.013043213867190054 13
Evaluating for {'degree': 13, 'lmda': 0.017012542798525893} ...
The training loss is 4.139526817135396 with std:10.156730035054759. The val loss is 3684413087.228266 with std:41170520599.29542.
3684413087.228266 0.017012542798525893 13
The training loss is 4.092727846522006 with std:8.278984109331045. The val loss is 13.597000741028667 with std:57.44349542425274.
13.597000741028667 0.017012542798525893 13
The training loss is 3.581624398272245 with std:6.232772273549707. The val loss is 12.336520808917351 with std:47.41273961287018.
12.336520808917351 0.017012542798525893 13
Evaluating for {'degree': 13, 'lmda': 0.02218982341458972} ...
The training loss is 4.18345930059385 with std:10.30424697813666. The val loss is 4282225685.7715516 with std:47851066625.02592.
4282225685.7715516 0.02218982341458972 13
The training loss is 4.145120449169954 with std:8.41959861606463. The val loss is 12.746938350381978 with std:49.785512743517046.
12.746938350381978 0.02218982341458972 13
The training loss is 3.6157984397876337 with std:6.309609582505365. The val loss is 13.88430606050385 with std:63.099201963646486.
13.88430606050385 0.02218982341458972 13
Evaluating for {'degree': 13, 'lmda': 0.028942661247167517} ...
The training loss is 4.228717215759042 with std:10.457708879257462. The val loss is 4486732122.155926 with std:50136588519.43599.
4486732122.155926 0.028942661247167517 13
The training loss is 4.1979709486234675 with std:8.566584755929998. The val loss is 11.99735573689847 with std:42.98542922435239.
11.99735573689847 0.028942661247167517 13
The training loss is 3.6498251674595914 with std:6.387175875524002. The val loss is 15.461413988521407 with std:79.839348066053.
15.461413988521407 0.028942661247167517 13
Evaluating for {'degree': 13, 'lmda': 0.037750532053243954} ...
The training loss is 4.275945622264954 with std:10.618929895541317. The val loss is 4324169679.198132 with std:48320236395.52607.
4324169679.198132 0.037750532053243954 13
The training loss is 4.251207803766648 with std:8.71793187784738. The val loss is 11.36299353848911 with std:37.20013402053455.
11.36299353848911 0.037750532053243954 13
The training loss is 3.6839928921486393 with std:6.464858100572754. The val loss is 16.97825339677058 with std:96.31428113017881.
16.97825339677058 0.037750532053243954 13
Evaluating for {'degree': 13, 'lmda': 0.04923882631706739} ...
The training loss is 4.3257552120266825 with std:10.788727555514196. The val loss is 3879094328.723468 with std:43346869626.34181.
3879094328.723468 0.04923882631706739 13
The training loss is 4.304666689757116 with std:8.87095678431024. The val loss is 10.846011462575536 with std:32.46543553187627.
10.846011462575536 0.04923882631706739 13
The training loss is 3.7185729895433575 with std:6.5419201144953325. The val loss is 18.353236141760963 with std:111.42067123841339.
18.353236141760963 0.04923882631706739 13
Evaluating for {'degree': 13, 'lmda': 0.0642232542222936} ...
The training loss is 4.378582998446307 with std:10.96664310493641. The val loss is 3264401601.4767284 with std:36478045193.487366.
3264401601.4767284 0.0642232542222936 13
The training loss is 4.3581290762792255 with std:9.022515481066085. The val loss is 10.437984934106936 with std:28.72076218478241.
10.437984934106936 0.0642232542222936 13
The training loss is 3.7538458871533136 with std:6.617643914177856. The val loss is 19.514272902854067 with std:124.23614335401044.
19.514272902854067 0.0642232542222936 13
Evaluating for {'degree': 13, 'lmda': 0.0837677640068292} ...
The training loss is 4.43456445622525 with std:11.150872097862221. The val loss is 2592877492.351821 with std:28974105019.438126.
2592877492.351821 0.0837677640068292 13
The training loss is 4.411395828567647 with std:9.169316978658252. The val loss is 10.1231985385408 with std:25.841538309811618.
10.1231985385408 0.0837677640068292 13
The training loss is 3.790155326207296 with std:6.691496736968232. The val loss is 20.40124449436237 with std:134.01986187777268.
20.40124449436237 0.0837677640068292 13
Evaluating for {'degree': 13, 'lmda': 0.10926008611173785} ...
The training loss is 4.493469958515068 with std:11.338439448868577. The val loss is 1955474289.6573994 with std:21851413408.415623.
1955474289.6573994 0.10926008611173785 13
The training loss is 4.464380863255201 with std:9.308297645000366. The val loss is 9.88221085064429 with std:23.67071411616525.
9.88221085064429 0.10926008611173785 13
The training loss is 3.827976391692993 with std:6.76329394957808. The val loss is 20.970689121628013 with std:140.24848486523163.
20.970689121628013 0.10926008611173785 13
Evaluating for {'degree': 13, 'lmda': 0.14251026703029984} ...
The training loss is 4.554743184989176 with std:11.525583495154399. The val loss is 1410002993.3156803 with std:15755996644.673504.
1410002993.3156803 0.14251026703029984 13
The training loss is 4.517199112964285 with std:9.436992135366955. The val loss is 9.69506067689901 with std:22.045039612261213.
9.69506067689901 0.14251026703029984 13
The training loss is 3.8679846907468742 with std:6.833341251687869. The val loss is 21.20211894168931 with std:142.67604030133.
21.20211894168931 0.14251026703029984 13
Evaluating for {'degree': 13, 'lmda': 0.18587918911465645} ...
The training loss is 4.617637922280376 with std:11.708250252362726. The val loss is 980697662.6380763 with std:10958672296.996569.
980697662.6380763 0.18587918911465645 13
The training loss is 4.57022222263043 with std:9.553845634903928. The val loss is 9.5438159313577 with std:20.814679736932604.
9.5438159313577 0.18587918911465645 13
The training loss is 3.9111166484494833 with std:6.902555896710664. The val loss is 21.103651343786716 with std:141.39046829880056.
21.103651343786716 0.18587918911465645 13
Evaluating for {'degree': 13, 'lmda': 0.24244620170823283} ...
The training loss is 4.681406652222974 with std:11.882581467759028. The val loss is 665684291.2149054 with std:7438508718.568493.
665684291.2149054 0.24244620170823283 13
The training loss is 4.624085477547021 with std:9.658447297418011. The val loss is 9.414309237030276 with std:19.85607709625044.
9.414309237030276 0.24244620170823283 13
The training loss is 3.9586084708755354 with std:6.972572476288631. The val loss is 20.713995853434454 with std:136.83270309105652.
20.713995853434454 0.24244620170823283 13
Evaluating for {'degree': 13, 'lmda': 0.31622776601683794} ...
The training loss is 4.745476655172915 with std:12.045320768066933. The val loss is 447674765.918267 with std:5002324355.08303.
447674765.918267 0.31622776601683794 13
The training loss is 4.67964548591295 with std:9.751697662296046. The val loss is 9.296943788836717 with std:19.078115687288587.
9.296943788836717 0.31622776601683794 13
The training loss is 4.011996842507391 with std:7.045829903884914. The val loss is 20.098625270121982 with std:129.75237062788614.
20.098625270121982 0.31622776601683794 13
Evaluating for {'degree': 13, 'lmda': 0.41246263829013524} ...
The training loss is 4.809566306709223 with std:12.194132067247846. The val loss is 303759060.12060195 with std:3394111387.181628.
303759060.12060195 0.41246263829013524 13
The training loss is 4.737903190066546 with std:9.835925633035895. The val loss is 9.186537362858836 with std:18.42167926665935.
9.186537362858836 0.41246263829013524 13
The training loss is 4.0730662106451 with std:7.125616423543758. The val loss is 19.33982794527271 with std:121.09481113869833.
19.33982794527271 0.41246263829013524 13
Evaluating for {'degree': 13, 'lmda': 0.5379838403443686} ...
The training loss is 4.873732456707193 with std:12.327875036270772. The val loss is 212064848.16040194 with std:2369455206.8378916.
212064848.16040194 0.5379838403443686 13
The training loss is 4.799915618408778 with std:9.914942540873902. The val loss is 9.081351589341855 with std:17.853503800408934.
9.081351589341855 0.5379838403443686 13
The training loss is 4.143739928800934 with std:7.216028581702031. The val loss is 18.522588494182024 with std:111.83990241323501.
18.522588494182024 0.5379838403443686 13
Evaluating for {'degree': 13, 'lmda': 0.701703828670383} ...
The training loss is 4.938376060078555 with std:12.446885778122194. The val loss is 154978493.2817003 with std:1731529240.0268269.
154978493.2817003 0.701703828670383 13
The training loss is 4.86672115139148 with std:9.993992291027565. The val loss is 8.981640390679212 with std:17.356614728154867.
8.981640390679212 0.701703828670383 13
The training loss is 4.225930838010278 with std:7.321793667315358. The val loss is 17.72002163497881 with std:102.8342970924537.
17.72002163497881 0.701703828670383 13
Evaluating for {'degree': 13, 'lmda': 0.9152473108773893} ...
The training loss is 5.00425211692657 with std:12.553279822793007. The val loss is 119793412.80959861 with std:1338341652.565964.
119793412.80959861 0.9152473108773893 13
The training loss is 4.939301987933598 with std:10.07956919192168. The val loss is 8.888131590530584 with std:16.920541036511825.
8.888131590530584 0.9152473108773893 13
The training loss is 4.321383231628602 with std:7.447931974796118. The val loss is 16.982495541142203 with std:94.66458435329336.
16.982495541142203 0.9152473108773893 13
Evaluating for {'degree': 13, 'lmda': 1.1937766417144369} ...
The training loss is 5.072532560368317 with std:12.65126690597555. The val loss is 97978274.73229417 with std:1094558583.9522789.
97978274.73229417 1.1937766417144369 13
The training loss is 5.0186148604884995 with std:10.17915080796447. The val loss is 8.800809605451834 with std:16.534126365520816.
8.800809605451834 1.1937766417144369 13
The training loss is 4.43155415188781 with std:7.599308108212659. The val loss is 16.33325743606624 with std:87.60457639983949.
16.33325743606624 1.1937766417144369 13
Evaluating for {'degree': 13, 'lmda': 1.5570684047537318} ...
The training loss is 5.14496967797645 with std:12.747469236391126. The val loss is 84030528.25156204 with std:938692829.0796038.
84030528.25156204 1.5570684047537318 13
The training loss is 5.1057455170675485 with std:10.301009251337227. The val loss is 8.718265954363511 with std:16.182354389020244.
8.718265954363511 1.5570684047537318 13
The training loss is 4.557610347075197 with std:7.780231682214632. The val loss is 15.771032806150435 with std:81.64257969839778.
15.771032806150435 1.5570684047537318 13
Evaluating for {'degree': 13, 'lmda': 2.030917620904737} ...
The training loss is 5.224214355120935 with std:12.851261276095174. The val loss is 74459056.48097476 with std:831734771.7580107.
74459056.48097476 2.030917620904737 13
The training loss is 5.202278256082688 with std:10.454352979366. The val loss is 8.637790623543959 with std:15.847040071228681.
8.637790623543959 2.030917620904737 13
The training loss is 4.700662268727746 with std:7.994377789274544. The val loss is 15.278039593876613 with std:76.56909016180511.
15.278039593876613 2.030917620904737 13
Evaluating for {'degree': 13, 'lmda': 2.6489692876105297} ...
The training loss is 5.31436139208417 with std:12.975174201475623. The val loss is 67074405.7625008 with std:749220122.9489144.
67074405.7625008 2.6489692876105297 13
The training loss is 5.31099900376416 with std:10.650039787637777. The val loss is 8.556288601137902 with std:15.510218354622811.
8.556288601137902 2.6489692876105297 13
The training loss is 4.862395136033763 with std:8.245342631842616. The val loss is 14.830862089187603 with std:72.09196839702427.
14.830862089187603 2.6489692876105297 13
Evaluating for {'degree': 13, 'lmda': 3.4551072945922217} ...
The training loss is 5.421827379829335 with std:13.135420020946562. The val loss is 60554858.62428017 with std:676380250.4040331.
60554858.62428017 3.4551072945922217 13
The training loss is 5.437059153638027 with std:10.901975154778784. The val loss is 8.472004147877334 with std:15.158776000259868.
8.472004147877334 3.4551072945922217 13
The training loss is 5.0462594004171075 with std:8.538081390988078. The val loss is 14.411424470147251 with std:67.94457606692576.
14.411424470147251 3.4551072945922217 13
Evaluating for {'degree': 13, 'lmda': 4.506570337745478} ...
The training loss is 5.5567312990711235 with std:13.352608774787807. The val loss is 54186706.4555907 with std:605239508.5961411.
54186706.4555907 4.506570337745478 13
The training loss is 5.589744755120139 with std:11.22913697748688. The val loss is 8.386964725607022 with std:14.78914090188194.
8.386964725607022 4.506570337745478 13
The training loss is 5.259364497426303 with std:8.881314836879973. The val loss is 14.01558731604631 with std:63.958697161697.
14.01558731604631 4.506570337745478 13
Evaluating for {'degree': 13, 'lmda': 5.878016072274912} ...
The training loss is 5.735067657649251 with std:13.652792204124525. The val loss is 47688247.08206787 with std:532648593.0228471.
47688247.08206787 5.878016072274912 13
The training loss is 5.785079081790055 with std:11.65804975452566. The val loss is 8.310119217352403 with std:14.411427573035201.
8.310119217352403 5.878016072274912 13
The training loss is 5.51525814385487 with std:9.290833574184433. The val loss is 13.65790233543183 with std:60.087237071317574.
13.65790233543183 5.878016072274912 13
Evaluating for {'degree': 13, 'lmda': 7.666822074546214} ...
The training loss is 5.982147586618888 with std:14.069072228389087. The val loss is 41066210.837652706 with std:458680584.5227817.
41066210.837652706 7.666822074546214 13
The training loss is 6.049679832339586 with std:12.225514681663057. The val loss is 8.261411333961156 with std:14.053187441411303.
8.261411333961156 7.666822074546214 13
The training loss is 5.837948731504751 with std:9.793558505367397. The val loss is 13.372941827806066 with std:56.38008525497887.
13.372941827806066 7.666822074546214 13
Evaluating for {'degree': 13, 'lmda': 10.0} ...
The training loss is 6.33801933785102 with std:14.64409483307058. The val loss is 34489376.00961797 with std:385219191.44232905.
34489376.00961797 10.0 13
The training loss is 6.426583434634587 with std:12.98143741312226. The val loss is 8.277456827324407 with std:13.76364581492332.
8.277456827324407 10.0 13
The training loss is 6.267843138687839 with std:10.432193118101003. The val loss is 13.215682777615301 with std:52.933162126274475.
13.215682777615301 10.0 13
Evaluating for {'degree': 14, 'lmda': 0.01} ...
The training loss is 4.039739988317937 with std:9.832694362452015. The val loss is 450457940.0751199 with std:5033493795.715482.
450457940.0751199 0.01 14
The training loss is 3.966221435490778 with std:7.971497541953082. The val loss is 18.130787827537937 with std:98.87028389368362.
18.130787827537937 0.01 14
The training loss is 3.4936183079775214 with std:6.047417836497711. The val loss is 8.749597000904147 with std:22.491476782106695.
8.749597000904147 0.01 14
Evaluating for {'degree': 14, 'lmda': 0.013043213867190054} ...
The training loss is 4.081396636829182 with std:9.974996432554855. The val loss is 2858847725.31186 with std:31950603834.063618.
2858847725.31186 0.013043213867190054 14
The training loss is 4.0161410865914755 with std:8.092827592145678. The val loss is 16.830640232786674 with std:87.28415648300621.
16.830640232786674 0.013043213867190054 14
The training loss is 3.529157320904516 with std:6.123207211372504. The val loss is 8.801006135463544 with std:22.513843970121496.
8.801006135463544 0.013043213867190054 14
Evaluating for {'degree': 14, 'lmda': 0.017012542798525893} ...
The training loss is 4.123004587387384 with std:10.117216804006272. The val loss is 6155619666.438954 with std:68797691162.84921.
6155619666.438954 0.017012542798525893 14
The training loss is 4.066447559869879 with std:8.222869944980078. The val loss is 15.556466344165937 with std:75.86033103774696.
15.556466344165937 0.017012542798525893 14
The training loss is 3.563852148834713 with std:6.199920710478627. The val loss is 9.470870349796686 with std:23.75747095326903.
9.470870349796686 0.017012542798525893 14
Evaluating for {'degree': 14, 'lmda': 0.02218982341458972} ...
The training loss is 4.164960470547131 with std:10.261767026393365. The val loss is 9255955762.635576 with std:103449712261.89589.
9255955762.635576 0.02218982341458972 14
The training loss is 4.11741719977945 with std:8.361566712741787. The val loss is 14.3610144102903 with std:65.09963671783296.
14.3610144102903 0.02218982341458972 14
The training loss is 3.597943569551172 with std:6.277497745181566. The val loss is 10.679505911458794 with std:30.975218633226195.
10.679505911458794 0.02218982341458972 14
Evaluating for {'degree': 14, 'lmda': 0.028942661247167517} ...
The training loss is 4.207814444402306 with std:10.411089464297332. The val loss is 11463160063.380177 with std:128119612043.58345.
11463160063.380177 0.028942661247167517 14
The training loss is 4.169183637845819 with std:8.507945609630958. The val loss is 13.287763080571084 with std:55.40213839895551.
13.287763080571084 0.028942661247167517 14
The training loss is 3.6317168696509805 with std:6.355696733119216. The val loss is 12.310877441594968 with std:45.78448683200814.
12.310877441594968 0.028942661247167517 14
Evaluating for {'degree': 14, 'lmda': 0.037750532053243954} ...
The training loss is 4.252232887286941 with std:10.56725047158469. The val loss is 12469168543.08506 with std:139364052818.36914.
12469168543.08506 0.037750532053243954 14
The training loss is 4.221722384159867 with std:8.660132642855423. The val loss is 12.364628890502168 with std:47.020583125740515.
12.364628890502168 0.037750532053243954 14
The training loss is 3.6654627629309084 with std:6.434091025941782. The val loss is 14.225351139264143 with std:65.68377370917952.
14.225351139264143 0.037750532053243954 14
Evaluating for {'degree': 14, 'lmda': 0.04923882631706739} ...
The training loss is 4.298916039101596 with std:10.731522604309642. The val loss is 12292835536.955729 with std:137393658486.24792.
12292835536.955729 0.04923882631706739 14
The training loss is 4.274866293927016 with std:8.815463830849447. The val loss is 11.601915562844244 with std:40.0510373478574.
11.601915562844244 0.04923882631706739 14
The training loss is 3.699463351884667 with std:6.512115744154892. The val loss is 16.27174381457026 with std:87.94673340862376.
16.27174381457026 0.04923882631706739 14
Evaluating for {'degree': 14, 'lmda': 0.0642232542222936} ...
The training loss is 4.348473145408461 with std:10.904042629322538. The val loss is 11179824985.180784 with std:124954109052.97571.
11179824985.180784 0.0642232542222936 14
The training loss is 4.328357144787736 with std:8.970709154413269. The val loss is 10.994160912073728 with std:34.45477186992413.
10.994160912073728 0.0642232542222936 14
The training loss is 3.7340118394461577 with std:6.589168543253673. The val loss is 18.296696937051326 with std:110.34895884706941.
18.296696937051326 0.0642232542222936 14
Evaluating for {'degree': 14, 'lmda': 0.0837677640068292} ...
The training loss is 4.401282058239438 with std:11.083636818990975. The val loss is 9487736031.534908 with std:106042198830.88634.
9487736031.534908 0.0837677640068292 14
The training loss is 4.381930012299854 with std:9.12240100942329. The val loss is 10.524367346365102 with std:30.097133600317417.
10.524367346365102 0.0837677640068292 14
The training loss is 3.769461719545064 with std:6.664747617425428. The val loss is 20.152592327724967 with std:131.00500177952787.
20.152592327724967 0.0837677640068292 14
Evaluating for {'degree': 14, 'lmda': 0.10926008611173785} ...
The training loss is 4.457383296787399 with std:11.267878855740777. The val loss is 7579821728.412578 with std:84717939912.90561.
7579821728.412578 0.10926008611173785 14
The training loss is 4.435416920559318 with std:9.26722844899173. The val loss is 10.168910080406638 with std:26.789139536001816.
10.168910080406638 0.10926008611173785 14
The training loss is 3.806292875479366 with std:6.738601713634801. The val loss is 21.70686765491541 with std:148.32325874168478.
21.70686765491541 0.10926008611173785 14
Evaluating for {'degree': 14, 'lmda': 0.14251026703029984} ...
The training loss is 4.516459092285785 with std:11.453383315446406. The val loss is 5747710381.810414 with std:64240824844.902954.
5747710381.810414 0.14251026703029984 14
The training loss is 4.488846146598094 with std:9.402433135242639. The val loss is 9.901936941251195 with std:24.32272518726804.
9.901936941251195 0.14251026703029984 14
The training loss is 3.8451809395139147 with std:6.810873687609877. The val loss is 22.854428317069644 with std:161.0792538755033.
22.854428317069644 0.14251026703029984 14
Evaluating for {'degree': 14, 'lmda': 0.18587918911465645} ...
The training loss is 4.577917689991582 with std:11.636263075809442. The val loss is 4174434430.9497166 with std:46656599720.52324.
4174434430.9497166 0.18587918911465645 14
The training loss is 4.5425096086104055 with std:9.526144099107297. The val loss is 9.698757866665662 with std:22.49641666485656.
9.698757866665662 0.18587918911465645 14
The training loss is 3.8870579868411963 with std:6.882232790693196. The val loss is 23.5316506419368 with std:168.5426413930815.
23.5316506419368 0.18587918911465645 14
Evaluating for {'degree': 14, 'lmda': 0.24244620170823283} ...
The training loss is 4.641057534766333 with std:11.812636146560262. The val loss is 2935727801.9037037 with std:32811756912.214985.
2935727801.9037037 0.24244620170823283 14
The training loss is 4.59697824862009 with std:9.637620936148583. The val loss is 9.538175580376194 with std:21.132094572747597.
9.538175580376194 0.24244620170823283 14
The training loss is 3.933152229201814 with std:6.9539991157459236. The val loss is 23.727451694890483 with std:170.587713889543.
23.727451694890483 0.24244620170823283 14
Evaluating for {'degree': 14, 'lmda': 0.31622776601683794} ...
The training loss is 4.705253805609559 with std:11.97908440926207. The val loss is 2026485937.2154434 with std:22649267429.96471.
2026485937.2154434 0.31622776601683794 14
The training loss is 4.653062585231261 with std:9.737416450337328. The val loss is 9.403842521349633 with std:20.084860875013412.
9.403842521349633 0.31622776601683794 14
The training loss is 3.9849925186685446 with std:7.028259138318009. The val loss is 23.48614366506473 with std:167.71976915752145.
23.48614366506473 0.31622776601683794 14
Evaluating for {'degree': 14, 'lmda': 0.41246263829013524} ...
The training loss is 4.770110499988809 with std:12.133028332282079. The val loss is 1395657285.2514687 with std:15598552855.262388.
1395657285.2514687 0.41246263829013524 14
The training loss is 4.71173459644305 with std:9.827487205441216. The val loss is 9.284689968480322 with std:19.24728078891705.
9.284689968480322 0.41246263829013524 14
The training loss is 4.04436525813451 with std:7.1079540822318386. The val loss is 22.89889042539742 with std:160.97682336440093.
22.89889042539742 0.41246263829013524 14
Evaluating for {'degree': 14, 'lmda': 0.5379838403443686} ...
The training loss is 4.835550779506932 with std:12.273041293466097. The val loss is 976515022.0350336 with std:10913826500.930464.
976515022.0350336 0.5379838403443686 14
The training loss is 4.774038890896769 with std:9.911260868827007. The val loss is 9.174468524792452 with std:18.54804412513359.
9.174468524792452 0.5379838403443686 14
The training loss is 4.113220241519369 with std:7.196899781140034. The val loss is 22.084588584470335 with std:151.71290810719918.
22.084588584470335 0.5379838403443686 14
Evaluating for {'degree': 14, 'lmda': 0.701703828670383} ...
The training loss is 4.901855925210489 with std:12.399147114383458. The val loss is 706210870.7616135 with std:7892645414.545912.
706210870.7616135 0.701703828670383 14
The training loss is 4.841021709996989 with std:9.993633343733729. The val loss is 9.070554997966909 with std:17.945206109403696.
9.070554997966909 0.701703828670383 14
The training loss is 4.193536072616777 with std:7.29968245698977. The val loss is 21.16512955198098 with std:141.31731721250551.
21.16512955198098 0.701703828670383 14
Evaluating for {'degree': 14, 'lmda': 0.9152473108773893} ...
The training loss is 4.969690287929895 with std:12.513125887748545. The val loss is 534604084.31993073 with std:5974594019.774961.
534604084.31993073 0.9152473108773893 14
The training loss is 4.913700563115487 with std:10.080858671339499. The val loss is 8.972350083441288 with std:17.415801701796056.
8.972350083441288 0.9152473108773893 14
The training loss is 4.287170814620577 with std:7.421390408702961. The val loss is 20.24232767972694 with std:130.95189519712147.
20.24232767972694 0.9152473108773893 14
Evaluating for {'degree': 14, 'lmda': 1.1937766417144369} ...
The training loss is 5.040161510155193 with std:12.618828653524673. The val loss is 425614327.02196115 with std:4756408120.224165.
425614327.02196115 1.1937766417144369 14
The training loss is 4.993098852930349 with std:10.180345341356128. The val loss is 8.87969372360996 with std:16.945160703939706.
8.87969372360996 1.1937766417144369 14
The training loss is 4.39574111471044 with std:7.567204827921775. The val loss is 19.38304913058849 with std:121.3808769079205.
19.38304913058849 1.1937766417144369 14
Evaluating for {'degree': 14, 'lmda': 1.5570684047537318} ...
The training loss is 5.1149701215528225 with std:12.72250044422901. The val loss is 354894192.1720049 with std:3965962435.845833.
354894192.1720049 1.5570684047537318 14
The training loss is 5.08039191857026 with std:10.30048064936708. The val loss is 8.79171383230761 with std:16.519203122768925.
8.79171383230761 1.5570684047537318 14
The training loss is 4.5206034055174085 with std:7.741982531267097. The val loss is 18.615659258382507 with std:112.92884721891637.
18.615659258382507 1.5570684047537318 14
Evaluating for {'degree': 14, 'lmda': 2.030917620904737} ...
The training loss is 5.1967107895437685 with std:12.833133658503453. The val loss is 306601725.4762719 with std:3426197302.228817.
306601725.4762719 2.030917620904737 14
The training loss is 5.177251349058166 with std:10.450714519271077. The val loss is 8.706431944951792 with std:16.12141016232694.
8.706431944951792 2.030917620904737 14
The training loss is 4.663057769604958 with std:7.950085678465907. The val loss is 17.93686960455982 with std:105.55493723908619.
17.93686960455982 2.030917620904737 14
Evaluating for {'degree': 14, 'lmda': 2.6489692876105297} ...
The training loss is 5.289404606272104 with std:12.962897112228235. The val loss is 270664358.6319308 with std:3024536718.773721.
270664358.6319308 2.6489692876105297 14
The training loss is 5.286512352589493 with std:10.642163909865143. The val loss is 8.621314297303666 with std:15.734188173810487.
8.621314297303666 2.6489692876105297 14
The training loss is 4.824938239385381 with std:8.19578591123844. The val loss is 17.32537799064276 with std:98.99890449466528.
17.32537799064276 2.6489692876105297 14
Evaluating for {'degree': 14, 'lmda': 3.4551072945922217} ...
The training loss is 5.399373332525022 with std:13.127696788612894. The val loss is 240882469.793736 with std:2691690646.6915545.
240882469.793736 3.4551072945922217 14
The training loss is 5.413307569691631 with std:10.888905107666897. The val loss is 8.53480035157701 with std:15.34304991647861.
8.53480035157701 3.4551072945922217 14
The training loss is 5.009762720816209 with std:8.484534922889189. The val loss is 16.757766949553538 with std:92.94424034255158.
16.757766949553538 3.4551072945922217 14
Evaluating for {'degree': 14, 'lmda': 4.506570337745478} ...
The training loss is 5.53662812795384 with std:13.34793917562259. The val loss is 213710092.60640225 with std:2388026278.784001.
213710092.60640225 4.506570337745478 14
The training loss is 5.566828788383812 with std:11.209948190784399. The val loss is 8.448718369574724 with std:14.94180196044448.
8.448718369574724 4.506570337745478 14
The training loss is 5.224601107611653 with std:8.825250934883483. The val loss is 16.22238476883648 with std:87.14900126834074.
16.22238476883648 4.506570337745478 14
Evaluating for {'degree': 14, 'lmda': 5.878016072274912} ...
The training loss is 5.71706040718439 with std:13.649620961094291. The val loss is 187437434.9695524 with std:2094431137.5329869.
187437434.9695524 5.878016072274912 14
The training loss is 5.762951093977923 with std:11.631739149077319. The val loss is 8.371515641625882 with std:14.537519425554578.
8.371515641625882 5.878016072274912 14
The training loss is 5.482853781755937 with std:9.233599422986199. The val loss is 15.728206641491356 with std:81.5117750842862.
15.728206641491356 5.878016072274912 14
Evaluating for {'degree': 14, 'lmda': 7.666822074546214} ...
The training loss is 5.965904099677949 with std:14.06596646913946. The val loss is 161575421.45026034 with std:1805434884.732739.
161575421.45026034 7.666822074546214 14
The training loss is 6.028132785506737 with std:12.190980982270991. The val loss is 8.322472350374092 with std:14.155083480809271.
8.322472350374092 7.666822074546214 14
The training loss is 5.808297333469309 with std:9.736135689306249. The val loss is 15.308105288984807 with std:76.06592116289688.
15.308105288984807 7.666822074546214 14
Evaluating for {'degree': 14, 'lmda': 10.0} ...
The training loss is 6.3231732281951025 with std:14.639915254862952. The val loss is 136354793.06765765 with std:1523611862.7791982.
136354793.06765765 10.0 14
The training loss is 6.40528661149204 with std:12.93757892305938. The val loss is 8.33750147595226 with std:13.842027761980017.
8.33750147595226 10.0 14
The training loss is 6.24106049186098 with std:10.375109580636433. The val loss is 15.018907873504453 with std:70.92220536411101.
15.018907873504453 10.0 14
In [21]:
# visualize the cross-validation results: log of the mean validation loss
# over the (lambda, degree) grid
from helper import plot_cv_result
plot_cv_result(np.log(grid_val.T), search_lambda, search_degree)
In [22]:
# same visualization for the standard deviation of the validation loss
from helper import plot_cv_result
plot_cv_result(np.log(grid_val_std.T), search_lambda, search_degree)
In [23]:
# best validation score
best_score = np.min(grid_val)
print(best_score)

# parameters that give the best validation score
d, l = np.unravel_index(np.argmin(grid_val), grid_val.shape)
best_degree = search_degree[d]
best_lambda = search_lambda[l]
print('Best score achieved using degree:{} and lambda:{}'.format(best_degree, best_lambda))
9.910368715513828
Best score achieved using degree:3 and lambda:3.4551072945922217
In [24]:
# Evaluate on the test set
X_train_poly, mu, std = expand_and_normalize_X(X_train, best_degree)
w = get_w_analytical_with_regularization(X_train_poly, y_train, best_lambda)

# expand the test data and normalize it with the *training* statistics,
# so that no information from the test set leaks into the model
X_test_poly = expand_X(X_test, best_degree)
X_test_poly[:, 1:] = (X_test_poly[:, 1:] - mu) / std

get_loss(w, X_train_poly, y_train, X_test_poly, y_test)
The training loss is 7.06392522895881 with std:13.444314802640736. The test loss is 8.228462752602276 with std:11.38108061204404.
Out[24]:
8.228462752602276

Question: How can you interpret the linear regression coefficients?

Answer: A coefficient close to zero means that the corresponding feature contributes little to the model's prediction. A positive coefficient indicates that the prediction increases with the feature value; a negative coefficient indicates that it decreases. Note that comparing coefficient magnitudes is only meaningful here because the features were normalized to the same scale.
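
To make this concrete, here is a minimal sketch (not part of the original solution) that inspects coefficient magnitudes. It fits a degree-1 (purely linear) model so that each weight maps to exactly one input feature, reuses the helpers and data from the cells above, and reuses best_lambda merely for illustration; X_lin, w_lin and feature_names are names introduced here.

In [ ]:
# Sketch: fit a degree-1 model so each coefficient corresponds to one feature
X_lin, mu_lin, std_lin = expand_and_normalize_X(X_train, 1)
w_lin = get_w_analytical_with_regularization(X_lin, y_train, best_lambda)

# column 3 (CHAS) was removed from X when loading the data, so drop its name too
feature_names = np.delete(boston_dataset.feature_names, 3)

# w_lin[0] is the bias term; the remaining weights align with the features
for name, coef in sorted(zip(feature_names, w_lin[1:]), key=lambda t: -abs(t[1])):
    print('{:8s} {: .3f}'.format(name, coef))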

Question: Is it good to have coefficient values close to zero?

Answer: Coefficients close to zero indicate that the corresponding features are largely redundant for the model's predictions, so the data can be represented with a smaller feature set. This becomes important when there are thousands of features: pruning the redundant ones (as sketched below) shrinks the model and improves its interpretability.
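
As an illustration of this pruning idea, continuing the sketch above (the 0.5 threshold is an arbitrary choice for illustration, not from the exercise):

In [ ]:
# Sketch: keep only features whose linear coefficient is clearly non-zero,
# then refit and evaluate on the reduced feature set
keep = np.abs(w_lin[1:]) > 0.5   # arbitrary threshold, for illustration only
print('Keeping {} of {} features'.format(keep.sum(), keep.size))

X_tr_small, mu_s, std_s = expand_and_normalize_X(X_train[:, keep], 1)
w_small = get_w_analytical_with_regularization(X_tr_small, y_train, best_lambda)

X_te_small = expand_X(X_test[:, keep], 1)
X_te_small[:, 1:] = (X_te_small[:, 1:] - mu_s) / std_s
get_loss(w_small, X_tr_small, y_train, X_te_small, y_test)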

Question: How would you proceed to improve the prediction?

Answer: Several directions are possible:
- Better feature expansion, e.g. adding cross-terms between features whose coefficients are large.
- Better hyperparameter search: so far we have used grid search; random search over the parameter space often finds good values with fewer evaluations (see the sketch below).
- Trying different model families, such as Random Forests or Neural Networks.
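
A hedged sketch of the random-search idea, reusing the helpers from the cells above with a simple hold-out split in place of the full cross-validation; the number of trials, the degree range and the lambda range are arbitrary choices, and get_loss is assumed to return the loss on its second data set, as in cell In [24]:

In [ ]:
# Sketch: random search over (degree, lambda). Lambda is sampled
# log-uniformly, the degree uniformly; each candidate is scored on a
# held-out validation split taken from the training data.
rng = np.random.RandomState(0)
n_tr = int(0.8 * X_train.shape[0])   # hold out the last 20% for validation

best_val, best_params = np.inf, None
for _ in range(30):
    degree = rng.randint(1, 8)        # degrees 1..7
    lmda = 10 ** rng.uniform(-2, 1)   # lambda in [0.01, 10)

    X_p, mu_r, std_r = expand_and_normalize_X(X_train[:n_tr], degree)
    w_r = get_w_analytical_with_regularization(X_p, y_train[:n_tr], lmda)

    # normalize the validation split with the training statistics
    X_v = expand_X(X_train[n_tr:], degree)
    X_v[:, 1:] = (X_v[:, 1:] - mu_r) / std_r

    val_loss = get_loss(w_r, X_p, y_train[:n_tr], X_v, y_train[n_tr:])
    if val_loss < best_val:
        best_val, best_params = val_loss, (degree, lmda)

print('Best val loss {:.3f} with degree={} and lambda={:.4f}'.format(
    best_val, best_params[0], best_params[1]))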