Examples of how to use Gaussian processes for regression and classification (in machine learning) with Python 3.
Example with 1D data
```python
from numpy.linalg import inv
import matplotlib.pyplot as plt
import numpy as np

# Training data: 6 noisy observations of y = x * sin(x)
X = np.array([1., 3., 5., 6., 7., 8.])
Y = X * np.sin(X)
X = X[:, np.newaxis]

sigma_n = 1.5  # noise standard deviation

plt.grid(True, linestyle='--')
plt.errorbar(X[:, 0], Y, yerr=sigma_n, fmt='o')
plt.title('Gaussian Processes for regression (1D Case) Training Data', fontsize=7)
plt.xlabel('x')
plt.ylabel('y')
plt.savefig('gaussian_processes_1d_fig_01.png', bbox_inches='tight')
```

Compute the covariance matrix K
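With the squared-exponential kernel, the covariance between two training points is

$$k(x_i, x_j) = \sigma_f^2 \exp\left(-\frac{(x_i - x_j)^2}{2 l^2}\right)$$

and the observation noise $\sigma_n^2$ is added on the diagonal of $K$: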
```python
sigma_f = 10.0  # signal standard deviation
l = 1.0         # length scale

X_dim1 = X.shape[0]

D = X - X.T  # matrix of pairwise differences x_i - x_j
K = sigma_f**2 * np.exp(-D * D / (2.0 * l**2))
np.fill_diagonal(K, K.diagonal() + sigma_n**2)  # add the noise variance
```
Make a prediction at a given point
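The predictive mean at a new point $x_*$ is

$$\bar{y}_* = k_*^\top K^{-1} \mathbf{y}$$

where $k_*$ is the vector of covariances between $x_*$ and the training points: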
```python
x_new = 2.0

D_new = X - x_new
K_new = sigma_f**2 * np.exp(-D_new * D_new / (2.0 * l**2))

K_inv = inv(K)
m1 = np.dot(K_new[:, 0], K_inv)
y_predict = np.dot(m1, Y)
print(y_predict)
```
Make predictions on a grid
```python
X_new = np.linspace(0, 10, 100)
Y_predict = []
Y_VAR_predict = []
for x_new in X_new:
    D_new = X - x_new
    K_new = sigma_f**2 * np.exp(-D_new * D_new / (2.0 * l**2))
    m1 = np.dot(K_new[:, 0], K_inv)
    y_predict = np.dot(m1, Y)
    Y_predict.append(y_predict)
    # predictive variance; note K[0,0] = sigma_f**2 + sigma_n**2
    y_var_predict = K[0, 0] - K_new[:, 0].dot(K_inv.dot(K_new[:, 0]))
    Y_VAR_predict.append(y_var_predict)

plt.plot(X_new, Y_predict, '--', label='y predict')
plt.legend()
plt.savefig('gaussian_processes_1d_fig_02.png', bbox_inches='tight')
```

Plot the variance
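The shaded band is the 95% confidence interval $\bar{y}_* \pm 1.96\sqrt{\mathbb{V}[y_*]}$, with the predictive variance

$$\mathbb{V}[y_*] = k(x_*, x_*) + \sigma_n^2 - k_*^\top K^{-1} k_*$$

(in the code above, $K[0,0] = \sigma_f^2 + \sigma_n^2$ plays the role of the first two terms):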
```python
Y_predict = np.asarray(Y_predict)
Y_VAR_predict = np.asarray(Y_VAR_predict)

plt.fill_between(X_new,
                 Y_predict - 1.96 * np.sqrt(Y_VAR_predict),
                 Y_predict + 1.96 * np.sqrt(Y_VAR_predict),
                 color='#D3D3D3')
plt.savefig('gaussian_processes_1d_fig_03.png', bbox_inches='tight')
```
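As a sanity check, the same model can be reproduced with scikit-learn. This is a minimal sketch, assuming scikit-learn is installed: `ConstantKernel(sigma_f**2) * RBF(length_scale=l)` matches the kernel above, `alpha=sigma_n**2` adds the noise variance to the diagonal, and `optimizer=None` keeps the hyperparameters fixed.

```python
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# same kernel and noise level as the manual computation above
kernel = ConstantKernel(constant_value=sigma_f**2) * RBF(length_scale=l)
gpr = GaussianProcessRegressor(kernel=kernel, alpha=sigma_n**2, optimizer=None)
gpr.fit(X, Y)

y_mean, y_std = gpr.predict(X_new[:, np.newaxis], return_std=True)
```

Note that, depending on the scikit-learn version, the returned standard deviation may not include the observation noise $\sigma_n$, so it can differ slightly from the variance computed above.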

Find the hyperparameters
The difficulty with Gaussian processes is choosing the hyperparameters ($\sigma_f$ and $l$) correctly. For example, with $\sigma_f=1$ and $l=1$ we obtain the following, less satisfactory result:

Use a grid search
To find the hyperparameters ($\sigma_f$ and $l$), one option is to search over a grid for the values that maximize the log marginal likelihood:
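$$\log p(\mathbf{y} \mid X, \sigma_f, l) = -\frac{1}{2}\mathbf{y}^\top K^{-1}\mathbf{y} - \frac{1}{2}\log|K| - \frac{n}{2}\log 2\pi$$

which corresponds to `part1`, `part2` and `part3` in the code below: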
```python
from pylab import figure, cm

sigma_f, l = np.meshgrid(np.arange(0.1, 10.0, 0.05), np.arange(0.1, 10.0, 0.05))
sigma_f_dim1 = sigma_f.shape[0]
sigma_f_dim2 = sigma_f.shape[1]

Z = np.zeros((sigma_f_dim1, sigma_f_dim2))

D = X - X.T
for i in np.arange(sigma_f_dim1):
    for j in np.arange(sigma_f_dim2):
        K = sigma_f[i, j]**2 * np.exp(-D * D / (2.0 * l[i, j]**2))
        np.fill_diagonal(K, K.diagonal() + sigma_n**2)
        K_inv = inv(K)
        m1 = np.dot(K_inv, Y)
        part1 = -0.5 * np.dot(Y.T, m1)
        part2 = -0.5 * np.log(np.linalg.det(K))
        part3 = -X_dim1 / 2.0 * np.log(2 * np.pi)
        Z[i, j] = part1 + part2 + part3

Z = np.log(-Z)  # log of the negative log likelihood: minimizing Z maximizes the likelihood

print(np.min(Z))
min_indexes = np.where(Z == Z.min())
print(min_indexes)

sigma_f_opt = sigma_f[min_indexes[0], min_indexes[1]][0]
l_opt = l[min_indexes[0], min_indexes[1]][0]
print(sigma_f_opt)
print(l_opt)

fig = plt.figure()
ax = fig.add_subplot(111)
plt.imshow(Z.T, interpolation='bilinear', origin='lower',
           cmap=cm.jet, extent=[0.1, 10.0, 0.1, 10.0])
plt.colorbar()
plt.scatter(l_opt, sigma_f_opt, color='r')
ax.text(l_opt + 0.3, sigma_f_opt + 0.3,
        r'$l$=' + str(round(l_opt, 2)) + "\n" + r'$\sigma_f$=' + str(round(sigma_f_opt, 2)),
        color='red', fontsize=8)
plt.xlabel(r'$l$')
plt.ylabel(r'$\sigma_f$')
plt.savefig('gaussian_processes_1d_fig_07.png', bbox_inches='tight')
plt.close()
```
Example result with $\sigma_f=4.3$, $l=1.4$ and $\sigma_n=1.5$
Example result with $\sigma_f=4.8$, $l=1.7$ and $\sigma_n=0.0$
Use a gradient descent algorithm
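The code below actually performs gradient ascent on the log marginal likelihood (equivalently, gradient descent on its negative), updating both hyperparameters with a fixed learning rate $\alpha$,

$$(l, \sigma_f) \leftarrow (l, \sigma_f) + \alpha \, \nabla_{(l, \sigma_f)} \log p(\mathbf{y} \mid X, \sigma_f, l),$$

with the partial derivatives estimated by finite differences: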
```python
from scipy import misc  # note: scipy.misc.derivative is deprecated in recent SciPy releases

def partial_derivative(func, var=0, point=[]):
    # finite-difference partial derivative of func with respect to argument `var`
    args = point[:]
    def wraps(x):
        args[var] = x
        return func(*args)
    return misc.derivative(wraps, point[var], dx=1e-6)

D = X - X.T

def log_likelihood_function(l, sigma_f):
    K = sigma_f**2 * np.exp(-D * D / (2.0 * l**2))
    np.fill_diagonal(K, K.diagonal() + sigma_n**2)
    K_inv = inv(K)
    m1 = np.dot(K_inv, Y)
    part1 = -0.5 * np.dot(Y.T, m1)
    part2 = -0.5 * np.log(np.linalg.det(K))
    part3 = -X_dim1 / 2.0 * np.log(2 * np.pi)
    return part1 + part2 + part3

# gradient ascent on the log marginal likelihood
alpha = 0.1    # learning rate
n_max = 100    # max number of iterations
eps = 0.0001   # stopping condition

l = 5.0
sigma_f = 5.0

cond = 99999.9
n = 0
previous_log_likelihood_value = log_likelihood_function(l, sigma_f)
while cond > eps and n < n_max:
    tmp_l = l + alpha * partial_derivative(log_likelihood_function, 0, [l, sigma_f])
    tmp_sigma_f = sigma_f + alpha * partial_derivative(log_likelihood_function, 1, [l, sigma_f])
    l = tmp_l
    sigma_f = tmp_sigma_f
    log_likelihood_value = log_likelihood_function(l, sigma_f)
    n = n + 1
    cond = abs(previous_log_likelihood_value - log_likelihood_value)
    previous_log_likelihood_value = log_likelihood_value
    print(l, sigma_f, cond)
```
This gives, for example (columns: $l$, $\sigma_f$, cond):
```
4.78183917442 5.11095008539 0.608280597188
4.55429401535 5.2188698863 0.640142098648
4.31982856882 5.32180751166 0.658119817078
4.0808659291 5.41800151482 0.663405319095
3.83910094134 5.50614491691 0.661234713591
3.59496945618 5.58550864637 0.659109099824
...
1.49011523122 4.649854532 0.000621976274843
1.48854600469 4.64224303177 0.000598089969127
1.4870054125 4.63478045171 0.000574932065636
```
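Alternatively, the hand-written loop can be replaced by an off-the-shelf optimizer. Here is a minimal sketch using `scipy.optimize.minimize` on the negative log marginal likelihood, reusing `log_likelihood_function` defined above (the starting point and the `Nelder-Mead` method are arbitrary choices):

```python
from scipy.optimize import minimize

# minimize the negative log marginal likelihood over (l, sigma_f)
res = minimize(lambda p: -log_likelihood_function(p[0], p[1]),
               x0=[5.0, 5.0], method='Nelder-Mead')

l_opt, sigma_f_opt = res.x
print(l_opt, sigma_f_opt, -res.fun)
```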
Example with 2D data
```python
from numpy.linalg import inv
import matplotlib.pyplot as plt
import numpy as np

# Training data: 2D points with class labels -1 / +1
X = np.array([[-8.0, -8.0], [-6.0, -3.0], [-7.0, 2.0], [-4.0, 4.0],
              [2.0, 3.0], [5.0, 7.0], [1.0, -1.0], [3.0, -4.0], [7.0, -7.0]])
Y = np.array([-1.0, -1.0, -1.0, -1.0, -1.0, -1.0, 1.0, 1.0, 1.0])

print(X.shape)
X_dim1 = X.shape[0]

sigma_n = 0.1

markers = []
colors = []
for i in Y:
    if i == -1.0:
        markers.append('o')
        colors.append('#1f77b4')
    if i == 1.0:
        markers.append('x')
        colors.append('#ff7f0e')

plt.xlabel(r'$x_1$')
plt.ylabel(r'$x_2$')
plt.title('Gaussian Processes (2D Case)', fontsize=7)
plt.grid(True, linestyle="--")
for i in range(X_dim1):
    plt.scatter(X[i, 0], X[i, 1], marker=markers[i], color=colors[i])
plt.savefig('gaussian_processes_2d_fig_01.png', bbox_inches='tight')
plt.close()
```

Compute the covariance matrix K
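Here the kernel is an anisotropic squared exponential with one length scale per dimension:

$$k(\mathbf{x}_i, \mathbf{x}_j) = \sigma_f^2 \exp\left(-\frac{(x_{i,1} - x_{j,1})^2}{2 l_1^2} - \frac{(x_{i,2} - x_{j,2})^2}{2 l_2^2}\right)$$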
```python
sigma_f = 1.0
l1 = 4.0  # length scale for x1
l2 = 4.0  # length scale for x2

X = X[:, :, np.newaxis]

X1 = X[:, 0, :]
D1 = (X1 - X1.T)**2 / (2.0 * l1**2)
X2 = X[:, 1, :]
D2 = (X2 - X2.T)**2 / (2.0 * l2**2)

K = sigma_f**2 * np.exp(-(D1 + D2))
np.fill_diagonal(K, K.diagonal() + sigma_n**2)
```
Make a prediction at a point
```python
x1_new = -7.0
x2_new = -5.0

D1 = (X1 - x1_new)**2 / l1**2
D2 = (X2 - x2_new)**2 / l2**2
K_new = sigma_f**2 * np.exp(-0.5 * (D1 + D2))

K_inv = inv(K)
m1 = np.dot(K_new[:, 0], K_inv)
y_new = np.dot(m1, Y)
var_y = K[0, 0] - K_new[:, 0].dot(K_inv.dot(K_new[:, 0]))
print('y_new ', y_new)
print('var_y ', var_y)
```
Make predictions on a grid
```python
from pylab import figure, cm

X1_new, X2_new = np.meshgrid(np.arange(-10, 10, 0.1), np.arange(-10, 10, 0.1))
X1_new_dim = X1_new.shape

K_new = np.zeros((X_dim1, X1_new_dim[0], X1_new_dim[1]))
Y_predict = np.zeros(X1_new_dim)
Y_predict_var = np.zeros(X1_new_dim)

# covariance between each grid point and each training point
for i in range(X_dim1):
    D1 = (X1_new - X1[i])**2 / l1**2
    D2 = (X2_new - X2[i])**2 / l2**2
    K_new[i, :, :] = sigma_f**2 * np.exp(-0.5 * (D1 + D2))

K_inv = inv(K)
for i in range(X1_new_dim[0]):
    for j in range(X1_new_dim[1]):
        m1 = np.dot(K_new[:, i, j], K_inv)
        Y_predict[i, j] = np.dot(m1, Y)
        Y_predict_var[i, j] = K[0, 0] - K_new[:, i, j].dot(K_inv.dot(K_new[:, i, j]))

plt.imshow(Y_predict, interpolation='bilinear', origin='lower',
           cmap=cm.jet, extent=[-10.0, 10.0, -10.0, 10.0])
plt.colorbar()
plt.xlabel(r'$x_1$')
plt.ylabel(r'$x_2$')
plt.title('Gaussian Processes (2D Case)', fontsize=7)
plt.grid(True, linestyle="--")
plt.savefig('gaussian_processes_2d_fig_02.png')
plt.close()
```

Plot the variance
```python
plt.imshow(Y_predict_var, interpolation='bilinear', origin='lower', cmap=cm.jet)
plt.colorbar()
plt.xlabel(r'$x_1$')
plt.ylabel(r'$x_2$')
plt.title('Gaussian Processes (2D Case)', fontsize=7)
plt.grid(True, linestyle="--")
plt.savefig('gaussian_processes_2d_fig_03.png')
plt.close()
```

Classification
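To interpret the regression output as a class probability, the predicted mean is squashed through the logistic sigmoid $\sigma(y) = 1 / (1 + e^{-y})$: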
```python
# map the predicted mean to (0, 1) with the logistic sigmoid
Y_predict = 1.0 / (1.0 + np.exp(-Y_predict))

plt.imshow(Y_predict, interpolation='bilinear', cmap=cm.jet,
           origin='lower', vmin=0, vmax=1)
plt.colorbar()
plt.xlabel(r'$x_1$')
plt.ylabel(r'$x_2$')
plt.title('Gaussian Processes (2D Case)', fontsize=7)
plt.grid(True, linestyle="--")
plt.savefig('gaussian_processes_2d_fig_04.png')
plt.close()
```

```python
CS = plt.contour(X1_new, X2_new, Y_predict, origin='lower', cmap=cm.jet)
plt.clabel(CS, inline=1, fontsize=10)
plt.xlabel(r'$x_1$')
plt.ylabel(r'$x_2$')
plt.title('Gaussian Processes (2D Case)', fontsize=7)
plt.grid(True, linestyle="--")
plt.savefig('gaussian_processes_2d_fig_05.png')
plt.close()
```

Find the hyperparameters with gradient descent
```python
from scipy import misc

def partial_derivative(func, var=0, point=[]):
    # finite-difference partial derivative of func with respect to argument `var`
    args = point[:]
    def wraps(x):
        args[var] = x
        return func(*args)
    return misc.derivative(wraps, point[var], dx=1e-6)

def log_likelihood_function(l1, l2, sigma_f):
    D1 = (X1 - X1.T)**2 / (2.0 * l1**2)
    D2 = (X2 - X2.T)**2 / (2.0 * l2**2)
    K = sigma_f**2 * np.exp(-(D1 + D2))
    np.fill_diagonal(K, K.diagonal() + sigma_n**2)
    K_inv = inv(K)
    m1 = np.dot(K_inv, Y)
    part1 = -0.5 * np.dot(Y.T, m1)
    part2 = -0.5 * np.log(np.linalg.det(K))
    part3 = -X_dim1 / 2.0 * np.log(2 * np.pi)
    return part1 + part2 + part3

# gradient ascent on the log marginal likelihood
alpha = 0.1    # learning rate
n_max = 100    # max number of iterations
eps = 0.0001   # stopping condition

l1 = 2.5
l2 = 2.5
sigma_f = 3.0

cond = 99999.9
n = 0
previous_log_likelihood_value = log_likelihood_function(l1, l2, sigma_f)
print("---- l1,l2,sigma_f,cond -----")
while cond > eps and n < n_max:
    tmp_l1 = l1 + alpha * partial_derivative(log_likelihood_function, 0, [l1, l2, sigma_f])
    tmp_l2 = l2 + alpha * partial_derivative(log_likelihood_function, 1, [l1, l2, sigma_f])
    tmp_sigma_f = sigma_f + alpha * partial_derivative(log_likelihood_function, 2, [l1, l2, sigma_f])
    l1 = tmp_l1
    l2 = tmp_l2
    sigma_f = tmp_sigma_f
    log_likelihood_value = log_likelihood_function(l1, l2, sigma_f)
    n = n + 1
    cond = abs(previous_log_likelihood_value - log_likelihood_value)
    previous_log_likelihood_value = log_likelihood_value
    print('iteration', n, '-->', l1, l2, sigma_f, cond)
```
This gives, for example:
```
---- l1,l2,sigma_f,cond -----
iteration 1 --> 2.52076843213 2.52978597013 2.72876110874 0.776501103425
iteration 2 --> 2.54283238138 2.56073447381 2.43703048668 0.900807215056
iteration 3 --> 2.56652587885 2.59297019581 2.12100706201 1.05906032495
iteration 4 --> 2.59242675354 2.62668358647 1.7771759316 1.24921437311
iteration 5 --> 2.62166130237 2.66220301751 1.4069391501 1.40666790935
...
iteration 98 --> 4.52180073687 3.41163662654 0.967112044162 0.00229262977858
iteration 99 --> 4.52944371095 3.41662632604 0.898918828554 0.00336899545594
iteration 100 --> 4.53279250988 3.41033121219 0.96640833809 0.00213254738384
```
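For comparison, scikit-learn bundles the sigmoid link and the hyperparameter optimization in `GaussianProcessClassifier`. This is a minimal sketch, assuming scikit-learn is available and reusing the arrays defined above (note that `X` must be flattened back to shape (9, 2) after the earlier `np.newaxis` reshape, and the anisotropic `RBF` length scales play the role of $l_1$ and $l_2$):

```python
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

X_train = X[:, :, 0]  # back to shape (9, 2)
gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=[l1, l2]))
gpc.fit(X_train, Y)

# class probabilities on the same grid as above
grid = np.column_stack([X1_new.ravel(), X2_new.ravel()])
proba = gpc.predict_proba(grid)[:, 1].reshape(X1_new.shape)
```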
References
| Links | Site |
|---|---|
| Gaussian Processes for Regression: A Quick Introduction | robots.ox.ac.uk |
| Gaussian Processes for Machine Learning | gaussianprocess.org |
| Bayesian Classification With Gaussian Processes | publications.aston.ac.uk |
| A Tutorial on Gaussian Processes (or why I don’t use SVMs) | comp.nus.edu |
| Gaussian processes | stanford.edu |
| Gaussian Processes | mit.edu |
| Gaussian Process Regression Models | mathworks.com |
| 1.7. Gaussian Processes | scikit-learn.org |
| Gaussian processes in Bayesian modeling: Manual for Matlab toolbox GPstuff, Version 2.0 | citeseerx.ist.psu.edu |
| Markov chain Monte Carlo algorithms for Gaussian processes | aueb.gr |
| Gaussian Processes for Timeseries Modelling | robots.ox.ac.uk |
