I am trying to train a model on the MNIST dataset. The first column of the dataset holds the digit labels (0~9); each row is 1 label + 28 x 28 = 784 pixel values, so the full array is 60000 x 785.
What is wrong with my code?
#print(x_data.shape, y_data.shape)
#(60000, 784) (60000, 1)
# xy_data = np.loadtxt('/content/drive/MyDrive/Machine-Learning Study/GAN/MNIST_data/mnist_train.csv', delimiter=',', dtype=np.float32)
# xy_test = np.loadtxt('/content/drive/MyDrive/Machine-Learning Study/GAN/MNIST_data/mnist_test.csv', delimiter=',', dtype=np.float32)
# # 60000 x 785 array
# # first column is number label (0 ~ 9)
# x_data = xy_data[:, 1:]
# y_data = xy_data[:, [0]]
nb_classes = 10
X = tf.placeholder(tf.float32, shape = [None, 784])
Y = tf.placeholder(tf.int32, shape = [None, nb_classes])
# tf.one_hot turns y_data of shape [None, 1] into shape [None, 1, 10], so reshape it to [None, 10]
Y_one_hot = tf.one_hot(y_data, nb_classes)
Y_one_hot = tf.reshape(Y_one_hot, [-1, nb_classes])
# feed_dict cannot take a tensor, so evaluate the one-hot tensor to a NumPy array that can be fed into Y
# (calling .eval with a Session like this only works in TF 1.x)
y_data_array = Y_one_hot.eval(session=tf.Session())
W = tf.Variable(tf.random_normal([784, nb_classes]))
b = tf.Variable(tf.random_normal([nb_classes]))
logits = tf.matmul(X, W) + b
hypothesis = tf.nn.softmax(logits)
# per-example softmax cross-entropy loss
loss_i = tf.nn.softmax_cross_entropy_with_logits(logits = logits, labels = Y_one_hot)
loss = tf.reduce_mean(loss_i)
optimizer = tf.train.GradientDescentOptimizer(learning_rate = 0.1).minimize(loss)
is_correct = tf.equal(tf.arg_max(hypothesis, 1), tf.arg_max(Y_one_hot, 1))
accuracy = tf.reduce_mean(tf.cast(is_correct, tf.float32))
training_epochs = 150
sess = tf.Session()
sess.run(tf.global_variables_initializer())
for epoch in range(training_epochs):
    loss_val, acc, _ = sess.run([loss, accuracy, optimizer], feed_dict={X: x_data, Y: y_data_array})
    if epoch % 5 == 0:
        print("Epochs: {:}\tLoss: {:.4f}\tAcc: {:.2%}".format(epoch, loss_val, acc))
Results:
Epochs: 0 Loss: 4227.7871 Acc: 9.71%
Epochs: 5 Loss: 17390.2520 Acc: 41.26%
Epochs: 10 Loss: 8494.0889 Acc: 52.81%
Epochs: 15 Loss: 1412.1642 Acc: 82.48%
Epochs: 20 Loss: 1620.4032 Acc: 82.48%
Epochs: 25 Loss: 1891.1475 Acc: 81.31%
Epochs: 30 Loss: 2770.4656 Acc: 77.99%
Epochs: 35 Loss: 1659.1884 Acc: 79.90%
Epochs: 40 Loss: 1134.2424 Acc: 84.61%
Epochs: 45 Loss: 2560.7073 Acc: 80.17%
Epochs: 50 Loss: 1440.0392 Acc: 82.33%
Epochs: 55 Loss: 1219.5104 Acc: 83.87%
Epochs: 60 Loss: 1002.9220 Acc: 86.11%
Epochs: 65 Loss: 635.6382 Acc: 89.84%
Epochs: 70 Loss: 574.5991 Acc: 90.13%
Epochs: 75 Loss: 544.4010 Acc: 90.15%
Epochs: 80 Loss: 2215.5605 Acc: 80.56%
Epochs: 85 Loss: 4700.1890 Acc: 77.99%
Epochs: 90 Loss: 3243.2017 Acc: 78.18%
Epochs: 95 Loss: 1040.0907 Acc: 85.05%
Epochs: 100 Loss: 1999.5754 Acc: 82.24%
Answer:
Your code is fine; the problem is your learning rate, which is too high. The raw pixel values range from 0 to 255, so with lr = 0.1 the full-batch gradient steps are large enough to make the loss blow up and oscillate, which is exactly what your log shows. I reran your code with lr = 0.005 and monitored it for 150 epochs, and it converges as you expect (a minimal sketch of the change follows the log below):
Epochs: 0 Loss: 3659.2244 Acc: 4.97%
Epochs: 5 Loss: 1218.3916 Acc: 30.38%
Epochs: 10 Loss: 767.9141 Acc: 46.95%
Epochs: 15 Loss: 582.4928 Acc: 55.63%
Epochs: 20 Loss: 480.8191 Acc: 61.28%
Epochs: 25 Loss: 416.9088 Acc: 65.28%
Epochs: 30 Loss: 372.9733 Acc: 68.19%
Epochs: 35 Loss: 340.5632 Acc: 70.34%
Epochs: 40 Loss: 315.6934 Acc: 72.09%
Epochs: 45 Loss: 296.0419 Acc: 73.57%
Epochs: 50 Loss: 280.1195 Acc: 74.72%
Epochs: 55 Loss: 266.9192 Acc: 75.74%
Epochs: 60 Loss: 255.7594 Acc: 76.58%
Epochs: 65 Loss: 246.1218 Acc: 77.29%
Epochs: 70 Loss: 237.6666 Acc: 77.91%
Epochs: 75 Loss: 230.2098 Acc: 78.47%
Epochs: 80 Loss: 223.5687 Acc: 79.02%
Epochs: 85 Loss: 217.6027 Acc: 79.42%
Epochs: 90 Loss: 212.1969 Acc: 79.81%
Epochs: 95 Loss: 207.2774 Acc: 80.16%
Epochs: 100 Loss: 202.7701 Acc: 80.53%
Epochs: 105 Loss: 198.6335 Acc: 80.86%
Epochs: 110 Loss: 194.8041 Acc: 81.12%
Epochs: 115 Loss: 191.2343 Acc: 81.38%
Epochs: 120 Loss: 187.8969 Acc: 81.59%
Epochs: 125 Loss: 184.7562 Acc: 81.78%
Epochs: 130 Loss: 181.7817 Acc: 81.98%
Epochs: 135 Loss: 178.9837 Acc: 82.20%
Epochs: 140 Loss: 176.3420 Acc: 82.36%
Epochs: 145 Loss: 173.8274 Acc: 82.53%
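For reference, here is a minimal sketch of the fix, assuming TF 1.x and that x_data and y_data are loaded exactly as in your commented-out code. The only substantive change from your version is the learning rate passed to GradientDescentOptimizer (plus an int cast, since tf.one_hot needs integer indices and the CSV was loaded as float32):

import numpy as np
import tensorflow as tf  # assumes TensorFlow 1.x

nb_classes = 10
X = tf.placeholder(tf.float32, shape=[None, 784])
Y = tf.placeholder(tf.float32, shape=[None, nb_classes])  # one-hot labels

W = tf.Variable(tf.random_normal([784, nb_classes]))
b = tf.Variable(tf.random_normal([nb_classes]))
logits = tf.matmul(X, W) + b

loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=Y))
# lr = 0.005 instead of 0.1 -- the one change that stops the loss from oscillating
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.005).minimize(loss)

is_correct = tf.equal(tf.argmax(logits, 1), tf.argmax(Y, 1))
accuracy = tf.reduce_mean(tf.cast(is_correct, tf.float32))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # tf.one_hot needs integer indices; (60000, 1) becomes (60000, 1, 10), so reshape
    y_one_hot = sess.run(tf.reshape(tf.one_hot(y_data.astype(np.int32), nb_classes),
                                    [-1, nb_classes]))
    for epoch in range(150):
        loss_val, acc, _ = sess.run([loss, accuracy, optimizer],
                                    feed_dict={X: x_data, Y: y_one_hot})
        if epoch % 5 == 0:
            print("Epochs: {:}\tLoss: {:.4f}\tAcc: {:.2%}".format(epoch, loss_val, acc))

As a side note, you could skip the one-hot/.eval round-trip entirely by feeding the integer labels and using tf.nn.sparse_softmax_cross_entropy_with_logits, which takes class indices directly.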