C1W3: Improve MNIST with Convolutions#
https-deeplearning-ai/tensorflow-1-public/C1/W3/assignment/C1W3_Assignment.ipynb
Commit 3a2be00 on May 3, 2022
import tensorflow as tf
import numpy as np
Load the data#
MNIST dataset
60,000 28x28 grayscale training images of the 10 digits (the 10,000-image test set is discarded here)
tf.keras.datasets.mnist.load_data
(training_images, training_labels), _ = tf.keras.datasets.mnist.load_data()
Pre-processing the data#
def reshape_and_normalize(images):
    # Add a trailing channel dimension: (60000, 28, 28) -> (60000, 28, 28, 1)
    images = np.expand_dims(images, -1)
    # Scale pixel values from [0, 255] to [0, 1]
    images = images / 255.0
    return images
training_images = reshape_and_normalize(training_images)
print(f"Maximum pixel value after normalization: {np.max(training_images)}\n")
print(f"Shape of training set after reshaping: {training_images.shape}\n")
print(f"Shape of one image after reshaping: {training_images[0].shape}")
Maximum pixel value after normalization: 1.0
Shape of training set after reshaping: (60000, 28, 28, 1)
Shape of one image after reshaping: (28, 28, 1)
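The preprocessing can be sanity-checked on a small synthetic batch without downloading MNIST. A minimal sketch using pure NumPy (the 4-image batch is made up for illustration):

```python
import numpy as np

def reshape_and_normalize(images):
    # Same logic as above: add a trailing channel axis, scale to [0, 1]
    images = np.expand_dims(images, -1)
    return images / 255.0

# Fake batch of four 28x28 "images" with pixel values in 0-255
batch = np.random.randint(0, 256, size=(4, 28, 28)).astype("float64")
out = reshape_and_normalize(batch)
print(out.shape)  # (4, 28, 28, 1)
```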
Defining your callback#
class myCallback(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        # Stop training once accuracy reaches the 99.5% target
        logs = logs or {}
        if logs.get('accuracy') is not None and logs.get('accuracy') >= 0.995:
            print("\nReached 99.5 accuracy so canceling training")
            self.model.stop_training = True
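The stopping logic can be exercised without running a full training loop. This pure-Python sketch mirrors the callback's condition; DummyModel and StopAtAccuracy are stand-ins made up for illustration (in Keras, self.model is attached to the callback automatically):

```python
class DummyModel:
    # Stand-in for the Keras model the callback would be attached to
    stop_training = False

class StopAtAccuracy:
    def __init__(self, threshold=0.995):
        self.threshold = threshold
        self.model = DummyModel()

    def on_epoch_end(self, epoch, logs=None):
        # Same condition as myCallback above
        logs = logs or {}
        acc = logs.get('accuracy')
        if acc is not None and acc >= self.threshold:
            self.model.stop_training = True

cb = StopAtAccuracy()
cb.on_epoch_end(0, {'accuracy': 0.9934})
print(cb.model.stop_training)  # False: still below the 99.5% threshold
cb.on_epoch_end(1, {'accuracy': 0.9954})
print(cb.model.stop_training)  # True: training would stop here
```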
Convolutional Model#
def convolutional_model():
    model = tf.keras.models.Sequential([
        # 32 3x3 filters over the single-channel input: (28, 28, 1) -> (26, 26, 32)
        tf.keras.layers.Conv2D(32, 3, activation='relu', input_shape=(28, 28, 1)),
        # 2x2 max pooling halves the spatial dimensions: (26, 26, 32) -> (13, 13, 32)
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation='relu'),
        # 10 output units, one probability per digit class
        tf.keras.layers.Dense(10, activation='softmax')
    ])
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    return model
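The layer shapes and parameter counts can be traced by hand. A sketch of the arithmetic, assuming the Keras defaults this model uses ('valid' padding for Conv2D, a 2x2 window with stride 2 for MaxPooling2D); the totals should match what model.summary() would report:

```python
# Conv2D(32, 3) with 'valid' padding: 28 - 3 + 1 = 26 per side
conv_side = 28 - 3 + 1
conv_params = (3 * 3 * 1 + 1) * 32  # 9 weights + 1 bias per filter, 32 filters

# MaxPooling2D() halves each spatial dimension
pool_side = conv_side // 2

# Flatten: 13 * 13 * 32 features feed the first Dense layer
flat = pool_side * pool_side * 32
dense1_params = (flat + 1) * 128    # weights + biases
dense2_params = (128 + 1) * 10

print(conv_side, pool_side, flat)                   # 26 13 5408
print(conv_params + dense1_params + dense2_params)  # 693962
```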
model = convolutional_model()
callbacks = myCallback()
history = model.fit(training_images, training_labels, epochs=10, callbacks=[callbacks])
Epoch 1/10
1875/1875 [==============================] - 15s 6ms/step - loss: 0.1539 - accuracy: 0.9538
Epoch 2/10
1875/1875 [==============================] - 12s 6ms/step - loss: 0.0527 - accuracy: 0.9840
Epoch 3/10
1875/1875 [==============================] - 12s 6ms/step - loss: 0.0322 - accuracy: 0.9901
Epoch 4/10
1875/1875 [==============================] - 11s 6ms/step - loss: 0.0210 - accuracy: 0.9934
Epoch 5/10
1874/1875 [============================>.] - ETA: 0s - loss: 0.0146 - accuracy: 0.9954
Reached 99.5 accuracy so canceling training
1875/1875 [==============================] - 12s 6ms/step - loss: 0.0146 - accuracy: 0.9954
print(f"Your model was trained for {len(history.epoch)} epochs")
Your model was trained for 5 epochs
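Because the callback stopped the run early, history.epoch holds only the epochs that actually ran, and the per-epoch metrics live in history.history. A sketch of how to read them, using stand-in lists shaped like Keras's History object (values copied from the training log above):

```python
# Stand-ins mirroring history.epoch and history.history from the run above
history_epoch = [0, 1, 2, 3, 4]
history_history = {
    'loss':     [0.1539, 0.0527, 0.0322, 0.0210, 0.0146],
    'accuracy': [0.9538, 0.9840, 0.9901, 0.9934, 0.9954],
}

print(f"Trained for {len(history_epoch)} epochs")            # Trained for 5 epochs
print(f"Final accuracy: {history_history['accuracy'][-1]}")  # Final accuracy: 0.9954
```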