# Studying Python (14) ~ Handwritten number recognition-learning process (5)

## Introduction

Hello, this is swim-lover. I’ve just started Python, and I’m studying with the concept of “learning while using”. In the 13th session, we learned about partial derivatives and gradients, and I also tried 3D plots. This time as well, we will continue studying along with the reference book.

## Reference Book

As a reference book for machine learning, I used “Deep Learning from Scratch” by Yasuki Saito (O’Reilly Japan, September 2016).

Last time, we calculated the gradient values at three points.
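As a reminder, that calculation can be sketched as follows. This is a minimal sketch assuming the function f(x0, x1) = x0² + x1² and the three sample points (3, 4), (0, 2), (3, 0) used in the reference book; `numerical_gradient` follows the book's central-difference definition.

```python
import numpy as np

def func_x0_x1(x):
    # f(x0, x1) = x0^2 + x1^2
    return x[0]**2 + x[1]**2

def numerical_gradient(f, x):
    # central difference: (f(x+h) - f(x-h)) / 2h for each component
    h = 1e-4
    grad = np.zeros_like(x)
    for idx in range(x.size):
        tmp = x[idx]
        x[idx] = tmp + h      # add h only to x[idx]
        fxh1 = f(x)
        x[idx] = tmp - h      # subtract h only from x[idx]
        fxh2 = f(x)
        grad[idx] = (fxh1 - fxh2) / (2 * h)
        x[idx] = tmp          # restore the original value
    return grad

print(numerical_gradient(func_x0_x1, np.array([3.0, 4.0])))  # ≈ [6. 8.]
print(numerical_gradient(func_x0_x1, np.array([0.0, 2.0])))  # ≈ [0. 4.]
print(numerical_gradient(func_x0_x1, np.array([3.0, 0.0])))  # ≈ [6. 0.]
```

The analytic gradient of x0² + x1² is (2·x0, 2·x1), so these numerical values match it to within the step size h.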

Next, let’s check the gradient with a vector diagram.

The Python code is getting complicated.

・Create input coordinate data (x0, x1) with np.arange()

・Create a set of lattice points like (0,0), (0,1), (1,0), (1,1) with np.meshgrid()

・Stack the X, Y arrays with np.array and transpose with .T (making a vertically long matrix of coordinate pairs)

・Extract each pair of idx and x = (x0, x1) with enumerate(indata)
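The meshgrid/flatten/transpose steps above can be seen on a tiny example (a sketch with hypothetical two-element inputs, not the actual plot data):

```python
import numpy as np

x0 = np.array([0, 1])
x1 = np.array([0, 1])
X, Y = np.meshgrid(x0, x1)            # lattice of all (x0, x1) combinations
pairs = np.array([X.flatten(), Y.flatten()]).T  # vertically long matrix

print(pairs)
# [[0 0]
#  [1 0]
#  [0 1]
#  [1 1]]
```

Each row of `pairs` is one lattice point, which is exactly the shape `enumerate` iterates over row by row.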

```python
import numpy as np
import matplotlib.pylab as plt

def func_x0_x1(x):
    # f(x0, x1) = x0^2 + x1^2
    return x[0]**2 + x[1]**2

def numerical_gradient(f, x):
    #h = 10e-50  # bad example, too small value
    h = 1e-4     # good example
    grad = np.zeros_like(x)
    for idx in range(x.size):
        tmp = x[idx]
        x[idx] = tmp + h   # add h only to x[idx]
        fxh1 = f(x)        # calc f(x+h)
        x[idx] = tmp - h   # subtract h only from x[idx]
        fxh2 = f(x)        # calc f(x-h)
        grad[idx] = (fxh1 - fxh2) / (2 * h)
        x[idx] = tmp       # restore tmp
    return grad

x0 = np.arange(-2, 2.0, 0.25)  # make x0 data
x1 = np.arange(-2, 2.0, 0.25)  # make x1 data
X, Y = np.meshgrid(x0, x1)     # make lattice points
#print("array X len={}\n".format(X.size))

X = X.flatten()  # convert to one dim
Y = Y.flatten()  # convert to one dim
indata = np.array([X, Y]).T    # vertically long matrix of (x0, x1) pairs

grad = np.zeros_like(indata)
for idx, x in enumerate(indata):  # extract index and x data
    grad[idx] = numerical_gradient(func_x0_x1, x)

plt.figure()
# arrows point along the negative gradient; magnitude shown by color
plt.quiver(X, Y, -grad[:, 0], -grad[:, 1],
           np.hypot(grad[:, 0], grad[:, 1]), angles="xy")
plt.xlim([-2, 2])
plt.ylim([-2, 2])
plt.xlabel('x0')
plt.ylabel('x1')
plt.grid()
plt.draw()
plt.show()
```

quiver() is used for the vector plot. The magnitude of each vector is shown by color instead of by the length of the arrow.

Each arrow points in the direction that decreases the value of the function the most. The reference book also notes that this is an important point.
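That claim can be checked numerically. This is a small sketch, again assuming f(x0, x1) = x0² + x1²: a step along the negative gradient decreases f more than a step of the same length in another direction.

```python
import numpy as np

def f(x):
    return x[0]**2 + x[1]**2

x = np.array([3.0, 4.0])
grad = np.array([2 * x[0], 2 * x[1]])   # analytic gradient (6, 8)
step = 0.1

# unit vector along the negative gradient
downhill = -grad / np.linalg.norm(grad)
f_down = f(x + step * downhill)

# a step of the same length in some other direction (here, along -x0)
f_other = f(x - step * np.array([1.0, 0.0]))

print(f_down < f_other < f(x))  # negative gradient decreases f fastest
```

Here f(3, 4) = 25; the downhill step reaches about 24.01 while the other step only reaches about 24.41, so the negative-gradient direction wins.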

## Conclusion

This time, I plotted the gradient. The Python code is getting a bit more complicated.

I would like to continue learning about neural network learning.
