How to use Gaussian processes in machine learning to do regression or classification with Python 3?

Published: August 08, 2019


Examples of how to use Gaussian processes for regression and classification with Python 3:
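The regression model assumed throughout is $y = f(x) + \epsilon$ with Gaussian noise $\epsilon \sim \mathcal{N}(0, \sigma_n^2)$ and a Gaussian-process prior $f \sim \mathcal{GP}(0, k)$, where $k$ is a covariance (kernel) function; all the formulas used below follow this standard setup.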

A 1D example:

from numpy.linalg import inv

import matplotlib.pyplot as plt
import numpy as np

# Training inputs and targets y = x sin(x)
X = np.array([1., 3., 5., 6., 7., 8.])
Y = X * np.sin(X)

X = X[:,np.newaxis]  # shape (n, 1) column vector

sigma_n = 1.5  # noise standard deviation

plt.grid(True, linestyle='--')

# X[:,0] flattens the column vector back to 1D for plotting
plt.errorbar(X[:,0], Y, yerr=sigma_n, fmt='o')

plt.title('Gaussian Processes for regression (1D Case) Training Data', fontsize=7)

plt.xlabel('x')
plt.ylabel('y')

plt.savefig('gaussian_processes_1d_fig_01.png', bbox_inches='tight')

[Figure: training points with $\pm\sigma_n$ error bars]

Calculate the covariance matrix K
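The covariance matrix is built with the squared-exponential kernel, $K_{ij} = \sigma_f^2 \exp\left(-\frac{(x_i - x_j)^2}{2 l^2}\right)$, with the noise variance $\sigma_n^2$ added on the diagonal.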

sigma_f = 10.0  # signal standard deviation
l = 1.0         # length scale

X_dim1 = X.shape[0]

# Pairwise differences between training inputs
# (broadcasting a column vector against a row vector)
D = X - X.T

# Squared-exponential kernel, plus noise variance on the diagonal
K = sigma_f**2 * np.exp(-(D*D) / (2.0 * l**2))
np.fill_diagonal(K, K.diagonal() + sigma_n**2)

Make a prediction on 1 new point
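For a new input $x_*$, the predictive mean is $\bar{y}_* = \mathbf{k}_*^\top K^{-1} \mathbf{y}$, where $\mathbf{k}_*$ is the vector of covariances between $x_*$ and the training inputs; that is what the code below computes.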

x_new = 2.0  # try also 2.5

# Covariances between the new point and the training inputs
D_new = X - x_new
K_new = sigma_f**2 * np.exp(-(D_new*D_new) / (2.0 * l**2))

K_inv = inv(K)

# Predictive mean: k_*^T K^{-1} y
m1 = np.dot(K_new[:,0], K_inv)
y_predict = np.dot(m1, Y)

print(y_predict)

Make a prediction on a grid
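The predictive variance is $\mathbb{V}[y_*] = k(x_*, x_*) + \sigma_n^2 - \mathbf{k}_*^\top K^{-1} \mathbf{k}_*$. Note that the code uses K[0,0], which equals $\sigma_f^2 + \sigma_n^2$, so the confidence band plotted below is for noisy observations rather than for the latent function.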

X_new = np.linspace(0, 10, 100)

Y_predict = []
Y_VAR_predict = []

for x_new in X_new:

    # Covariances between the grid point and the training inputs
    D_new = X - x_new
    K_new = sigma_f**2 * np.exp(-(D_new*D_new) / (2.0 * l**2))

    # Predictive mean
    m1 = np.dot(K_new[:,0], K_inv)
    y_predict = np.dot(m1, Y)
    Y_predict.append(y_predict)

    # Predictive variance
    y_var_predict = K[0,0] - K_new[:,0].dot(K_inv.dot(K_new[:,0]))
    Y_VAR_predict.append(y_var_predict)

plt.plot(X_new, Y_predict, '--', label='y predict')

plt.legend()

plt.savefig('gaussian_processes_1d_fig_02.png', bbox_inches='tight')

[Figure: predictive mean over the grid with the training data]

Plot the variance

# 95% confidence band: mean +/- 1.96 standard deviations
plt.fill_between(X_new,
    [y - 1.96*np.sqrt(Y_VAR_predict[idx]) for idx, y in enumerate(Y_predict)],
    [y + 1.96*np.sqrt(Y_VAR_predict[idx]) for idx, y in enumerate(Y_predict)],
    color='#D3D3D3')

plt.savefig('gaussian_processes_1d_fig_03.png', bbox_inches='tight')

[Figure: predictive mean with 95% confidence band]

Find the hyperparameters

With Gaussian processes it is necessary to find "good" hyperparameters ($\sigma_f$ and $l$). For example, with $\sigma_f=1$ and $l=1$ the fit is much poorer:

[Figures: predictions with $\sigma_f=1$, $l=1$]

To find "good" $\sigma_f$ and $l$, one solution is to use a grid search:
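The criterion used here is the log marginal likelihood, $\log p(\mathbf{y} \mid X) = -\tfrac{1}{2}\mathbf{y}^\top K^{-1}\mathbf{y} - \tfrac{1}{2}\log|K| - \tfrac{n}{2}\log 2\pi$, which the code below evaluates for each candidate pair and maximizes.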

from pylab import figure, cm

# Grid of candidate hyperparameters
sigma_f, l = np.meshgrid(np.arange(0.1, 10.0, 0.05), np.arange(0.1, 10.0, 0.05))

sigma_f_dim1 = sigma_f.shape[0]
sigma_f_dim2 = sigma_f.shape[1]

Z = np.zeros((sigma_f_dim1, sigma_f_dim2))

D = X - X.T

for i in range(sigma_f_dim1):
    for j in range(sigma_f_dim2):

        K = sigma_f[i,j]**2 * np.exp(-(D*D) / (2.0 * l[i,j]**2))
        np.fill_diagonal(K, K.diagonal() + sigma_n**2)

        K_inv = inv(K)
        m1 = np.dot(K_inv, Y)

        # Log marginal likelihood: data fit + complexity penalty + constant
        part1 = -0.5 * np.dot(Y.T, m1)
        part2 = -0.5 * np.log(np.linalg.det(K))
        part3 = -X_dim1 / 2.0 * np.log(2*np.pi)

        Z[i,j] = part1 + part2 + part3

# log(-Z) compresses the dynamic range for plotting; since the log
# likelihood is negative over this grid, minimizing log(-Z) is
# equivalent to maximizing the log likelihood
Z = np.log(-Z)

print(np.min(Z))
print(np.where(Z == Z.min()))

min_indexes = np.where(Z == Z.min())

sigma_f_opt = sigma_f[min_indexes[0], min_indexes[1]][0]
l_opt = l[min_indexes[0], min_indexes[1]][0]

print(sigma_f_opt)
print(l_opt)

fig = plt.figure()

ax = fig.add_subplot(111)

plt.imshow(Z.T, interpolation='bilinear', origin='lower', cmap=cm.jet, extent=[0.1,10.0,0.1,10.0])

plt.colorbar()

plt.scatter(l_opt, sigma_f_opt, color='r')

ax.text(l_opt+0.3, sigma_f_opt+0.3,
        r'$l$=' + str(round(l_opt,2)) + "\n" +
        r'$\sigma_f$=' + str(round(sigma_f_opt,2)),
        color='red', fontsize=8)

plt.xlabel(r'$l$')
plt.ylabel(r'$\sigma_f$')

plt.savefig('gaussian_processes_1d_fig_07.png', bbox_inches='tight')

plt.close()

Example of results with $\sigma_f=4.3$, $l=1.4$ and $\sigma_n=1.5$

[Figures: predictions with $\sigma_f=4.3$, $l=1.4$, $\sigma_n=1.5$]

Example of results with $\sigma_f=4.8$, $l=1.7$ and $\sigma_n=0.0$

[Figures: predictions with $\sigma_f=4.8$, $l=1.7$, $\sigma_n=0.0$]

Using gradient ascent
# Numerical partial derivative via a central finite difference
# (scipy.misc.derivative, used in the original, was removed in
# recent SciPy releases, so it is computed by hand here)
def partial_derivative(func, var=0, point=[], dx=1e-6):
    args = list(point)
    args[var] = point[var] + dx
    f_plus = func(*args)
    args[var] = point[var] - dx
    f_minus = func(*args)
    return (f_plus - f_minus) / (2.0 * dx)

D = X - X.T

def log_likelihood_function(l, sigma_f):

    K = sigma_f**2 * np.exp(-(D*D) / (2.0 * l**2))
    np.fill_diagonal(K, K.diagonal() + sigma_n**2)

    K_inv = inv(K)
    m1 = np.dot(K_inv, Y)

    part1 = -0.5 * np.dot(Y.T, m1)
    part2 = -0.5 * np.log(np.linalg.det(K))
    part3 = -X_dim1 / 2.0 * np.log(2*np.pi)

    return part1 + part2 + part3


# gradient ascent on the log likelihood

alpha = 0.1   #-----> learning rate
n_max = 100   #-----> max number of iterations
eps = 0.0001  #-----> stopping condition

l = 5.0
sigma_f = 5.0

cond = 99999.9
n = 0

previous_log_likelihood_value = log_likelihood_function(l, sigma_f)

while cond > eps and n < n_max:

    # move uphill: the parameters follow the gradient of the log likelihood
    tmp_l = l + alpha * partial_derivative(log_likelihood_function, 0, [l, sigma_f])
    tmp_sigma_f = sigma_f + alpha * partial_derivative(log_likelihood_function, 1, [l, sigma_f])

    l = tmp_l
    sigma_f = tmp_sigma_f

    log_likelihood_value = log_likelihood_function(l, sigma_f)

    n = n + 1
    cond = abs(previous_log_likelihood_value - log_likelihood_value)

    previous_log_likelihood_value = log_likelihood_value

    print(l, sigma_f, cond)

This prints, for example:

4.78183917442 5.11095008539 0.608280597188
4.55429401535 5.2188698863 0.640142098648
4.31982856882 5.32180751166 0.658119817078
4.0808659291 5.41800151482 0.663405319095
3.83910094134 5.50614491691 0.661234713591
3.59496945618 5.58550864637 0.659109099824
3.3476630236 5.65584518456 0.663142115996
3.09604436098 5.71708200959 0.673012717522
2.8407266811 5.76890427962 0.675810009161
2.58721345575 5.81044078843 0.642363172031
2.34876787887 5.84039185193 0.539075865365
2.1451499821 5.85781314922 0.368693616173
1.9919544359 5.86327712605 0.197922669419
1.88822260797 5.85927240487 0.0890215568352
1.82117734903 5.8489039681 0.0382474008663
1.77797202473 5.8345927089 0.0176653049369
1.74966419363 5.81789727666 0.00960729400176
1.73062339447 5.79977191891 0.00643603168767
1.71737101412 5.78080033212 0.00516061254988
1.70775230758 5.76134337323 0.00462601242516
1.70042410589 5.74162660351 0.00438304378158
1.69454454434 5.7217923196 0.00425536743384
1.68958392913 5.70193099706 0.00417325135836
1.68520817906 5.68210059911 0.00410899546345
1.68120631675 5.662338579 0.00405157566838
1.67744505309 5.64266940605 0.00399660899519
1.67384021151 5.62310930939 0.00394236251995
1.67033869244 5.6036692799 0.00388816118681
1.66690707301 5.58435697294 0.00383375194647
1.66352438936 5.56517791638 0.00377904990646
1.66017756859 5.54613627753 0.00372403697226
1.65685853376 5.52723535102 0.00366872150598
1.65356236989 5.50847786568 0.00361312292446
1.65028616427 5.48986618175 0.00355726470619
1.64702827058 5.47140241417 0.00350117238154
1.64378784221 5.45308851071 0.00344487247814
1.64056453827 5.43492630102 0.00338839210266
1.6373583371 5.4169175261 0.00333175900396
1.63416941669 5.39906385858 0.00327500109805
1.63099808167 5.38136691245 0.00321814686585
1.62784471433 5.36382825037 0.00316122502044
1.62470974489 5.34644938687 0.0031042645982
1.62159363165 5.3292317901 0.00304729487112
1.61849684909 5.31217688221 0.00299034530951
1.61541987983 5.29528603847 0.00293344563304
1.61236320954 5.27856058775 0.00287662544284
1.60932732262 5.26200181029 0.00281991463091
1.60631270082 5.24561093674 0.0027633430066
1.60331982039 5.22938914714 0.00270694032657
1.6003491517 5.21333756965 0.0026507362048
1.59740115785 5.19745727844 0.00259476023949
1.59447629377 5.18174929313 0.00253904169456
1.59157500567 5.16621457721 0.00248360961558
1.58869773 5.15085403682 0.00242849274304
1.58584489328 5.13566851962 0.00237371941547
1.58301691188 5.12065881343 0.00231931758073
1.5802141908 5.10582564531 0.00226531470252
1.57743712316 5.09116968052 0.00221173772644
1.57468608962 5.0766915214 0.00215861303944
1.5719614595 5.06239170727 0.00210596626173
1.56926358753 5.04827071258 0.00205382255701
1.56659281637 5.03432894662 0.00200220623698
1.56394947429 5.02056675381 0.00195114076658
1.56133387518 5.0069844118 0.00190064899079
1.55874631852 4.99358213145 0.00185075283395
1.55618708978 4.98036005753 0.00180147317804
1.55365645772 4.96731826695 0.00175283024823
1.55115467672 4.95445676948 0.00170484313387
1.54868198551 4.94177550793 0.0016575299388
1.5462386065 4.92927435724 0.00161090788632
1.54382474628 4.91695312538 0.00156499305164
1.54144059509 4.90481155319 0.00151980051557
1.53908632684 4.89284931594 0.0014753441323
1.53676209856 4.88106602142 0.00143163696541
1.53446805166 4.8694612124 0.00138869071449
1.53220431011 4.85803436662 0.00134651608931
1.52997098121 4.84678489788 0.00130512263383
1.52776815626 4.83571215569 0.00126451889551
1.52559590935 4.82481542773 0.00122471216369
1.52345429837 4.81409393914 0.00118570878754
1.52134336472 4.80354685523 0.00114751385722
1.51926313327 4.79317328158 0.00111013148642
1.51721361307 4.78297226555 0.00107356468557
1.51519479666 4.77294279763 0.0010378154182
1.51320666132 4.76308381235 0.00100288464105
1.51124916813 4.75339419027 0.00096877226985
1.50932226363 4.74387275952 0.000935477199718
1.50742587853 4.73451829671 0.000902997459658
1.50555992901 4.7253295291 0.000871330079608
1.50372431628 4.71630513578 0.000840471240682
1.50191892826 4.70744374984 0.000810416194831
1.50014363848 4.69874395944 0.00078115944687
1.49839830695 4.69020431099 0.000752694572114
1.49668278074 4.68182330843 0.000725014597748
1.49499689394 4.67359941859 0.00069811154107
1.49334046872 4.66553106923 0.000671977040962
1.49171331502 4.65761665379 0.000646601863259
1.49011523122 4.649854532 0.000621976274843
1.48854600469 4.64224303177 0.000598089969127
1.4870054125 4.63478045171 0.000574932065636
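Note that the gradient of the log marginal likelihood also has a closed form, $\frac{\partial}{\partial\theta} \log p(\mathbf{y} \mid X) = \tfrac{1}{2} \operatorname{tr}\!\left((\boldsymbol{\alpha}\boldsymbol{\alpha}^\top - K^{-1}) \frac{\partial K}{\partial \theta}\right)$ with $\boldsymbol{\alpha} = K^{-1}\mathbf{y}$, which could replace the finite differences used above if the kernel derivatives are coded by hand.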

A 2D example:

from numpy.linalg import inv

import matplotlib.pyplot as plt
import numpy as np

# Two classes of 2D points, labelled -1 and +1
X = np.array([[-8.0,-8.0], [-6.0,-3.0], [-7.0,2.0], [-4.0,4.0],
             [2.0,3.0], [5.0,7.0], [1.0,-1.0], [3.0,-4.0], [7.0,-7.0]])

Y = np.array([-1.0, -1.0, -1.0, -1.0, -1.0, -1.0, 1.0, 1.0, 1.0])

print(X.shape)

X_dim1 = X.shape[0]

sigma_n = 0.1

# One marker and color per class
markers = []
colors = []

for i in Y:
    if i == -1.0:
        markers.append('o')
        colors.append('#1f77b4')
    if i == 1.0:
        markers.append('x')
        colors.append('#ff7f0e')

plt.xlabel(r'$x_1$')
plt.ylabel(r'$x_2$')

plt.title('Gaussian Processes (2D Case)', fontsize=7)

plt.grid(True, linestyle="--")

for i in range(X_dim1):
    plt.scatter(X[i,0], X[i,1], marker=markers[i], color=colors[i])

plt.savefig('gaussian_processes_2d_fig_01.png', bbox_inches='tight')

plt.close()

[Figure: the two classes of training points]

Calculate the covariance matrix K
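Here the kernel has one length scale per dimension: $k(\mathbf{x}, \mathbf{x}') = \sigma_f^2 \exp\left(-\frac{(x_1 - x_1')^2}{2 l_1^2} - \frac{(x_2 - x_2')^2}{2 l_2^2}\right)$, again with $\sigma_n^2$ added on the diagonal.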

sigma_f = 1.0

l1 = 4.0
l2 = 4.0

X = X[:,:,np.newaxis]  # shape (n, 2, 1) so each coordinate broadcasts as a column

X1 = X[:,0,:]  # first coordinates, shape (n, 1)
D1 = (X1 - X1.T)**2 / (2.0 * l1**2)

X2 = X[:,1,:]  # second coordinates
D2 = (X2 - X2.T)**2 / (2.0 * l2**2)

K = sigma_f**2 * np.exp(-(D1 + D2))
np.fill_diagonal(K, K.diagonal() + sigma_n**2)

Make a prediction on 1 new point

x1_new = -7.0
x2_new = -5.0

# Covariances between the new point and the training points
D1 = (X1 - x1_new)**2 / l1**2
D2 = (X2 - x2_new)**2 / l2**2

K_new = sigma_f**2 * np.exp(-0.5 * (D1 + D2))

K_inv = inv(K)

# Predictive mean and variance at the new point
m1 = np.dot(K_new[:,0], K_inv)
y_new = np.dot(m1, Y)

var_y = K[0,0] - K_new[:,0].dot(K_inv.dot(K_new[:,0]))

print('y_new ', y_new)

print("var_y ", var_y)

Make a prediction on a grid

from pylab import figure, cm

X1_new, X2_new = np.meshgrid(np.arange(-10, 10, 0.1), np.arange(-10, 10, 0.1))

X1_new_dim = X1_new.shape

# Covariances between every training point and every grid point
K_new = np.zeros((X_dim1, X1_new_dim[0], X1_new_dim[1]))

Y_predict = np.zeros(X1_new_dim)
Y_predict_var = np.zeros(X1_new_dim)

for i in range(X_dim1):

    D1 = (X1_new - X1[i])**2 / l1**2
    D2 = (X2_new - X2[i])**2 / l2**2

    K_new[i,:,:] = sigma_f**2 * np.exp(-0.5 * (D1 + D2))


K_inv = inv(K)

# Predictive mean and variance at every grid point
for i in range(X1_new_dim[0]):
    for j in range(X1_new_dim[1]):
        m1 = np.dot(K_new[:,i,j], K_inv)
        Y_predict[i,j] = np.dot(m1, Y)
        Y_predict_var[i,j] = K[0,0] - K_new[:,i,j].dot(K_inv.dot(K_new[:,i,j]))

plt.imshow(Y_predict, interpolation='bilinear',
           origin='lower', cmap=cm.jet, extent=[-10.0,10.0,-10.0,10.0])

plt.colorbar()

plt.xlabel(r'$x_1$')
plt.ylabel(r'$x_2$')

plt.title('Gaussian Processes (2D Case)', fontsize=7)

plt.grid(True, linestyle="--")

plt.savefig('gaussian_processes_2d_fig_02.png')

plt.close()

[Figure: predictive mean over the 2D grid]
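The double loop above is easy to read but slow on a 200x200 grid. A vectorized sketch of the same computation (reusing the K_new, K_inv, Y and X1_new_dim already defined above, with np.einsum for the per-point quadratic forms):

# Flatten the grid: one column of covariances per grid point
Kn = K_new.reshape(X_dim1, -1)  # (n_train, n_grid_points)

# Means: k_*^T K^{-1} y for every grid point at once
Y_predict = (Kn.T @ (K_inv @ Y)).reshape(X1_new_dim)

# Variances: K[0,0] - k_*^T K^{-1} k_* for every grid point
quad = np.einsum('pi,ij,pj->p', Kn.T, K_inv, Kn.T)
Y_predict_var = (K[0,0] - quad).reshape(X1_new_dim)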

Plot the variance

# extent keeps the axes in data coordinates, matching the mean plot above
plt.imshow(Y_predict_var, interpolation='bilinear', origin='lower', cmap=cm.jet,
           extent=[-10.0,10.0,-10.0,10.0])

plt.colorbar()

plt.xlabel(r'$x_1$')
plt.ylabel(r'$x_2$')

plt.title('Gaussian Processes (2D Case)', fontsize=7)

plt.grid(True, linestyle="--")

plt.savefig('gaussian_processes_2d_fig_03.png')

plt.close()

[Figure: predictive variance over the 2D grid]

Classification
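To turn the regression output into class probabilities, the heuristic used here is to squash the predictive mean through a logistic sigmoid, $p(y = 1 \mid \mathbf{x}) = \frac{1}{1 + e^{-\bar{y}(\mathbf{x})}}$. (A full GP classifier would instead place the GP over a latent function and approximate the resulting non-Gaussian posterior, e.g. with the Laplace approximation.)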

# Logistic sigmoid applied elementwise (a vectorized equivalent of the
# original np.nditer loop)
Y_predict = 1.0 / (1.0 + np.exp(-Y_predict))

plt.imshow(Y_predict, interpolation='bilinear', cmap=cm.jet,
           origin='lower', vmin=0, vmax=1, extent=[-10.0,10.0,-10.0,10.0])

plt.colorbar()

plt.xlabel(r'$x_1$')
plt.ylabel(r'$x_2$')

plt.title('Gaussian Processes (2D Case)', fontsize=7)

plt.grid(True, linestyle="--")

plt.savefig('gaussian_processes_2d_fig_04.png')

plt.close()

[Figure: class probabilities after the sigmoid]

CS = plt.contour(X1_new, X2_new, Y_predict, origin='lower', cmap=cm.jet)

plt.clabel(CS, inline=1, fontsize=10)

plt.xlabel(r'$x_1$')
plt.ylabel(r'$x_2$')

plt.title('Gaussian Processes (2D Case)', fontsize=7)

plt.grid(True, linestyle="--")

plt.savefig('gaussian_processes_2d_fig_05.png')

plt.close()

[Figure: probability contours]

Find the hyperparameters using gradient ascent

# Same central finite difference as in the 1D case
# (scipy.misc.derivative was removed in recent SciPy releases)
def partial_derivative(func, var=0, point=[], dx=1e-6):
    args = list(point)
    args[var] = point[var] + dx
    f_plus = func(*args)
    args[var] = point[var] - dx
    f_minus = func(*args)
    return (f_plus - f_minus) / (2.0 * dx)


def log_likelihood_function(l1, l2, sigma_f):

    D1 = (X1 - X1.T)**2 / (2.0 * l1**2)
    D2 = (X2 - X2.T)**2 / (2.0 * l2**2)

    K = sigma_f**2 * np.exp(-(D1 + D2))
    np.fill_diagonal(K, K.diagonal() + sigma_n**2)

    K_inv = inv(K)
    m1 = np.dot(K_inv, Y)

    part1 = -0.5 * np.dot(Y.T, m1)
    part2 = -0.5 * np.log(np.linalg.det(K))
    part3 = -X_dim1 / 2.0 * np.log(2*np.pi)

    return part1 + part2 + part3

# gradient ascent on the log likelihood

alpha = 0.1   #-----> learning rate
n_max = 100   #-----> max number of iterations
eps = 0.0001  #-----> stopping condition

l1 = 2.5
l2 = 2.5
sigma_f = 3.0

cond = 99999.9
n = 0

previous_log_likelihood_value = log_likelihood_function(l1, l2, sigma_f)

print("---- l1,l2,sigma_f,cond -----")

while cond > eps and n < n_max:

    tmp_l1 = l1 + alpha * partial_derivative(log_likelihood_function, 0, [l1, l2, sigma_f])
    tmp_l2 = l2 + alpha * partial_derivative(log_likelihood_function, 1, [l1, l2, sigma_f])
    tmp_sigma_f = sigma_f + alpha * partial_derivative(log_likelihood_function, 2, [l1, l2, sigma_f])

    l1 = tmp_l1
    l2 = tmp_l2
    sigma_f = tmp_sigma_f

    log_likelihood_value = log_likelihood_function(l1, l2, sigma_f)

    n = n + 1
    cond = abs(previous_log_likelihood_value - log_likelihood_value)

    previous_log_likelihood_value = log_likelihood_value

    print('iteration', n, '-->', l1, l2, sigma_f, cond)

This prints, for example:

---- l1,l2,sigma_f,cond -----
iteration 1 --> 2.52076843213 2.52978597013 2.72876110874 0.776501103425
iteration 2 --> 2.54283238138 2.56073447381 2.43703048668 0.900807215056
iteration 3 --> 2.56652587885 2.59297019581 2.12100706201 1.05906032495
iteration 4 --> 2.59242675354 2.62668358647 1.7771759316 1.24921437311
iteration 5 --> 2.62166130237 2.66220301751 1.4069391501 1.40666790935
iteration 6 --> 2.65674503837 2.70017478333 1.03991169811 1.16605196431
iteration 7 --> 2.70348991479 2.74186212428 0.8422428212 0.130794333563
iteration 8 --> 2.76215068063 2.78617992287 1.01448089283 0.0015303855478
iteration 9 --> 2.80888735632 2.82710666984 0.83988808808 0.0912720387629
iteration 10 --> 2.86486637214 2.86801909991 1.00972249888 0.00271694420842
iteration 11 --> 2.9103364823 2.90739127565 0.836361455228 0.0833823453526
iteration 12 --> 2.96350581915 2.94440006519 1.01017098333 0.0107171341724
iteration 13 --> 3.00736528658 2.98189979642 0.833245571144 0.0814174440835
iteration 14 --> 3.05755491211 3.01465564077 1.01239069663 0.0192916587256
iteration 15 --> 3.09969318702 3.05013963793 0.831190504982 0.0813671102884
iteration 16 --> 3.14678020406 3.07848232196 1.01466249707 0.0266915517351
iteration 17 --> 3.18718525865 3.11188830962 0.830359908946 0.0811549969431
iteration 18 --> 3.23112386367 3.13583513157 1.01620669464 0.0320813849351
iteration 19 --> 3.26981807222 3.16713955862 0.830640563556 0.0797055093778
iteration 20 --> 3.31064105277 3.18684799839 1.01685416898 0.0353571170071
iteration 21 --> 3.34765467863 3.21605521867 0.831813505389 0.0768136602825
iteration 22 --> 3.38545954124 3.23177989286 1.01671769859 0.0368147629435
iteration 23 --> 3.42082541947 3.25892220988 0.833655231716 0.0727688407535
iteration 24 --> 3.45575620325 3.27097778899 1.01597454177 0.0368689064087
iteration 25 --> 3.48951167748 3.29611458874 0.835973919003 0.0679905949419
iteration 26 --> 3.52174231582 3.3048473628 1.01478262409 0.0359148821317
iteration 27 --> 3.55393139309 3.3280615206 0.838614074563 0.0628528233192
iteration 28 --> 3.58365223728 3.33382850304 1.01326612792 0.0342850372417
iteration 29 --> 3.61432624319 3.35522149993 0.841452193903 0.0576394770073
iteration 30 --> 3.64173368049 3.35837547248 1.01152100281 0.0322425883769
iteration 31 --> 3.67095064649 3.37806236149 0.844391485824 0.0525494247917
iteration 32 --> 3.69623939527 3.37894154515 1.00962215959 0.0299873541033
iteration 33 --> 3.72406271137 3.39704625097 0.847357342362 0.0477125696996
iteration 34 --> 3.74742035563 3.3959675581 1.00762869633 0.027664928895
iteration 35 --> 3.77391711295 3.41261868516 0.850293569002 0.0432060603369
iteration 36 --> 3.79552047499 3.409873705 1.00558736389 0.0253765576871
iteration 37 --> 3.82075977755 3.42520091775 0.853159179062 0.0390682013947
iteration 38 --> 3.84077275415 3.42105394618 1.00353493695 0.0231885674087
iteration 39 --> 3.86482417119 3.43518494353 0.855925620538 0.0353097937001
iteration 40 --> 3.883396698 3.42987250664 1.00149995242 0.0211407677146
iteration 41 --> 3.90632894787 3.44293058597 0.858574362919 0.0319230664366
iteration 42 --> 3.92359679689 3.43666201888 0.999504078994 0.0192535531162
iteration 43 --> 3.94547669647 3.44876420631 0.861094809171 0.0288884943486
iteration 44 --> 3.96156185696 3.44172294869 0.997563252028 0.0175336532535
iteration 45 --> 3.98245353877 3.45297865164 0.863482507654 0.0261798231464
iteration 46 --> 3.99746498632 3.44532400112 0.99568864649 0.0159786185058
iteration 47 --> 4.0174293571 3.45583412749 0.865737640179 0.023767638537
iteration 48 --> 4.03146404829 3.44770325108 0.993887524128 0.0145802017383
iteration 49 --> 4.05055845896 3.45755973319 0.867863767098 0.0216218023378
iteration 50 --> 4.06370244131 3.4490697976 0.99216397216 0.0133268247759
iteration 51 --> 4.08198052693 3.45835545038 0.869866801636 0.0197130248017
iteration 52 --> 4.09431007507 3.44960576167 0.990519545343 0.0122053165949
iteration 53 --> 4.11182172862 3.45839441068 0.871754187351 0.018013809215
iteration 54 --> 4.12340445344 3.44946849378 0.988953823036 0.0112020943948
iteration 55 --> 4.14019589937 3.45782531089 0.87353425349 0.0164989586167
iteration 56 --> 4.15109179191 3.44879287715 0.987464881888 0.0103039249556
iteration 57 --> 4.16720572965 3.45677486565 0.875215721166 0.0151457784129
iteration 58 --> 4.17746811828 3.44769364001 0.986049698628 0.00949838936148
iteration 59 --> 4.19294391243 3.45535022036 0.876807334052 0.0139340875779
iteration 60 --> 4.20262032535 3.44626760969 0.984704482867 0.00877413016149
iteration 61 --> 4.21749422266 3.45364126631 0.878317593791 0.0128461086372
iteration 62 --> 4.22662715263 3.44459586251 0.983424952253 0.00812095833699
iteration 63 --> 4.24093251228 3.45172281522 0.879754577212 0.0118662922762
iteration 64 --> 4.24956008632 3.44274573428 0.982206553537 0.00752986310896
iteration 65 --> 4.26332761219 3.44965661068 0.88112581971 0.0109811096473
iteration 66 --> 4.27148417331 3.44077267159 0.981044637374 0.00699296327088
iteration 67 --> 4.28474214048 3.44749315913 0.88243824872 0.0101788353791
iteration 68 --> 4.29245874978 3.43872191478 0.979934592873 0.0065034236983
iteration 69 --> 4.30523322184 3.44527337886 0.883698155654 0.00944933540784
iteration 70 --> 4.31253808815 3.43663000618 0.978871950366 0.00605535737168
iteration 71 --> 4.32485312061 3.44303006541 0.88491119439 0.00878386857579
iteration 72 --> 4.33177196806 3.4345261315 0.977852453079 0.00564371751936
iteration 73 --> 4.3436497963 3.44078918398 0.886082401788 0.00817490434581
iteration 74 --> 4.35020617707 3.43243329786 0.976872106431 0.00526419459669
iteration 75 --> 4.36166738994 3.43857099701 0.887216229131 0.00761595807429
iteration 76 --> 4.36788294911 3.43036936066 0.975927210003 0.00491311825392
iteration 77 --> 4.37894664565 3.43639103875 0.888316582983 0.00710144717263
iteration 78 --> 4.38484134851 3.42834791019 0.97501437202 0.00458736619008
iteration 79 --> 4.39552528083 3.43426095402 0.889386871108 0.00662656304404
iteration 80 --> 4.40111760361 3.42637903384 0.974130513434 0.00428428396985
iteration 81 --> 4.41143830524 3.43218921142 0.890430049097 0.00618715959078
iteration 82 --> 4.41674539849 3.42446996245 0.973272863128 0.00400161332958
iteration 83 --> 4.42671829895 3.43018170793 0.891448667087 0.00577965678721
iteration 84 --> 4.43175612777 3.42262561875 0.972438946192 0.00373742935286
iteration 85 --> 4.44139565632 3.42824227611 0.892444915593 0.00540095768406
iteration 86 --> 4.44617911866 3.42084907584 0.971626567939 0.00349008738687
iteration 87 --> 4.45549879587 3.42637310809 0.893420667339 0.00504837610129
iteration 88 --> 4.46004182484 3.41914194149 0.970833795335 0.00325817651692
iteration 89 --> 4.46905434544 3.42457510696 0.89437751696 0.0047195756864
iteration 90 --> 4.47336999825 3.41750467446 0.970058936136 0.00304048037756
iteration 91 --> 4.48208730451 3.42284817628 0.895316817154 0.00441251721768
iteration 92 --> 4.48618783811 3.41593684708 0.969300518097 0.00283594460484
iteration 93 --> 4.49462118529 3.42119145707 0.896239711093 0.00412541407269
iteration 94 --> 4.49851812388 3.41443735775 0.968557267392 0.00264364868569
iteration 95 --> 4.50667813877 3.41960352027 0.897147161119 0.00385669364931
iteration 96 --> 4.51038233127 3.4130046049 0.967828088452 0.00246278297722
iteration 97 --> 4.51827906503 3.41808252138 0.898039975119 0.00360496593687
iteration 98 --> 4.52180073687 3.41163662654 0.967112044162 0.00229262977858
iteration 99 --> 4.52944371095 3.41662632604 0.898918828554 0.00336899545594
iteration 100 --> 4.53279250988 3.41033121219 0.96640833809 0.00213254738384
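For comparison, scikit-learn ships a ready-made implementation of the same regression, including hyperparameter optimization by maximizing the log marginal likelihood. A minimal sketch for the 1D example (assuming scikit-learn is installed; the kernel mirrors the squared-exponential plus noise model used above):

from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel, ConstantKernel
import numpy as np

X = np.array([1., 3., 5., 6., 7., 8.])[:, np.newaxis]
Y = X[:, 0] * np.sin(X[:, 0])

# sigma_f^2 * exp(-d^2 / (2 l^2)) plus sigma_n^2 on the diagonal
kernel = ConstantKernel(1.0) * RBF(length_scale=1.0) + WhiteKernel(noise_level=1.0)

gpr = GaussianProcessRegressor(kernel=kernel).fit(X, Y)

X_new = np.linspace(0, 10, 100)[:, np.newaxis]
y_mean, y_std = gpr.predict(X_new, return_std=True)  # predictive mean and std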
