No val_loss and val_accuracy keys when I've already passed validation_data to model.fit()


Here's the image augmentation code:

batch_size = 16

train_datagen = ImageDataGenerator(rescale=1./255, validation_split=0.2)

# test_datagen = ImageDataGenerator(rescale=1./255)

# Use flow from dataframe
train_generator = train_datagen.flow_from_dataframe(
        dataframe=train,
        directory="train_images",
        x_col="id",
        y_col=["not_ready", "ready"],
        target_size=(300, 300),
        batch_size=batch_size,
        class_mode="raw",
        color_mode="grayscale",
        subset="training")

validation_generator = train_datagen.flow_from_dataframe(
        dataframe=train,
        directory="train_images",
        x_col="id",
        y_col=["not_ready", "ready"],
        target_size=(300, 300),
        batch_size=batch_size,
        class_mode="raw",
        color_mode="grayscale",
        subset="validation")

Setup the model:

early_stopping = EarlyStopping(monitor='loss',mode='min',verbose=1,patience=7, restore_best_weights=True)

opt = Adam(learning_rate=0.0002)

model.compile(optimizer=opt, loss='binary_crossentropy', metrics=['accuracy'])

history = model.fit(train_generator,
        steps_per_epoch=train_generator.n // batch_size,
        epochs=100,
        validation_data=validation_generator,
        validation_steps=validation_generator.n // batch_size,
        callbacks=[early_stopping])

And print the history keys:

print(history.history.keys())

But the results:

dict_keys(['loss', 'accuracy'])

There are no val_loss and val_accuracy keys, even though I've already passed validation_data. Why is that, and how do I make them appear?

CodePudding user response:

First: make sure your model trains without the validation_generator. Second: make sure your validation_generator actually yields data by iterating over a few batches. Here is a working example:

import tensorflow as tf

flowers = tf.keras.utils.get_file(
    'flower_photos',
    'https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz',
    untar=True)

img_gen = tf.keras.preprocessing.image.ImageDataGenerator(rescale=1./255, validation_split=0.2)
BATCH_SIZE = 32

train_generator = img_gen.flow_from_directory(flowers, class_mode='sparse', batch_size=BATCH_SIZE, target_size=(300, 300), shuffle=True, subset="training", color_mode="grayscale")
validation_generator = img_gen.flow_from_directory(flowers, class_mode='sparse', batch_size=BATCH_SIZE, target_size=(300, 300), shuffle=True, subset="validation", color_mode="grayscale")

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(filters=16, kernel_size=(3, 3), activation='relu', input_shape=(300, 300, 1)),
    tf.keras.layers.MaxPooling2D(pool_size=(2, 2)),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Conv2D(filters=32, kernel_size=(3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D(pool_size=(2, 2)),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(32, activation='relu'),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(5)
])

model.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])

early_stopping = tf.keras.callbacks.EarlyStopping(monitor='loss', mode='min', verbose=1, patience=7, restore_best_weights=True)

history = model.fit(train_generator,
        steps_per_epoch=train_generator.n // BATCH_SIZE,
        epochs=1,
        validation_data=validation_generator,
        validation_steps=validation_generator.n // BATCH_SIZE,
        callbacks=[early_stopping])
print(history.history.keys())

Output:

Found 2939 images belonging to 5 classes.
Found 731 images belonging to 5 classes.
91/91 [==============================] - 44s 462ms/step - loss: 1.8690 - accuracy: 0.2298 - val_loss: 1.6060 - val_accuracy: 0.2443
dict_keys(['loss', 'accuracy', 'val_loss', 'val_accuracy'])

Also check the validation_steps parameter in model.fit: if it evaluates to 0 (for example, because the validation set is smaller than the batch size, so integer division gives 0), validation is skipped and you will not see validation loss and accuracy in history.history.keys(). If that is the case, try not setting the parameter at all:

history = model.fit(train_generator,
        steps_per_epoch=train_generator.n // BATCH_SIZE,
        epochs=1,
        validation_data=validation_generator,
        callbacks=[early_stopping])

See the docs for more information.
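Alternatively, you can guard against that zero case yourself by computing the step count with ceiling division instead of floor division; this is a small sketch (the helper name is mine, not from Keras), which guarantees at least one validation step whenever the validation set is non-empty:

```python
import math

def safe_validation_steps(n_samples, batch_size):
    """Number of validation batches, using ceiling division so a
    validation set smaller than one batch still yields one step."""
    return math.ceil(n_samples / batch_size)

# With 10 validation samples and a batch size of 16, floor division
# gives 0 steps (validation silently skipped), while ceiling gives 1.
print(10 // 16)                      # → 0
print(safe_validation_steps(10, 16)) # → 1
```

You would then pass `validation_steps=safe_validation_steps(validation_generator.n, batch_size)` to model.fit instead of the floor-division expression.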
