cannot fit the model using data loaded from tfds ImageFolder


I am trying to use VGG16 in a model, but I get an error when calling fit.

ValueError: Input 0 of layer "sequential_1" is incompatible with the layer: expected shape=(None, 363, 360, 3), found shape=(363, 360, 3)

I am using tfds to load images from folders.

import tensorflow_datasets as tfds

builder = tfds.ImageFolder(PATH, shape=(363, 360, 3))
print(builder.info)
train_ds, test_ds = builder.as_dataset(split=['train', 'test'], shuffle_files=True, as_supervised=True)

The output is as follows.

tfds.core.DatasetInfo(
    name='image_folder',
    full_name='image_folder/1.0.0',
    description="""
    Generic image classification dataset.
    """,
    homepage='https://www.tensorflow.org/datasets/catalog/image_folder',
    data_path='/root/tensorflow_datasets/image_folder/1.0.0',
    file_format=tfrecord,
    download_size=Unknown size,
    dataset_size=Unknown size,
    features=FeaturesDict({
        'image': Image(shape=(363, 360, 3), dtype=tf.uint8),
        'image/filename': Text(shape=(), dtype=tf.string),
        'label': ClassLabel(shape=(), dtype=tf.int64, num_classes=8),
    }),
    supervised_keys=('image', 'label'),
    disable_shuffling=False,
    splits={
        'test': <SplitInfo num_examples=1712, num_shards=1>,
        'train': <SplitInfo num_examples=15380, num_shards=1>,
    },
    citation="""""",
)

The model is created using the following code.

import tensorflow as tf

IMG_SHAPE = (363, 360, 3)
VGG16_MODEL = tf.keras.applications.VGG16(input_shape=IMG_SHAPE,
                                          include_top=False,
                                          weights='imagenet')
VGG16_MODEL.trainable = False
global_average_layer = tf.keras.layers.GlobalAveragePooling2D()
prediction_layer = tf.keras.layers.Dense(len(CLASS_NAMES), activation='softmax')

model = tf.keras.Sequential([
  VGG16_MODEL,
  global_average_layer,
  prediction_layer
])

model.compile(optimizer=tf.keras.optimizers.Adam(),
              loss=tf.keras.losses.sparse_categorical_crossentropy,
              metrics=["accuracy"])

The problem occurs when I try to fit the model.

history = model.fit(train_ds, epochs=100,)

CodePudding user response:

You seem to have forgotten to batch the dataset. The model expects inputs of shape (None, 363, 360, 3), where None is the batch dimension, but an unbatched dataset yields single images of shape (363, 360, 3):

history = model.fit(train_ds.batch(32), epochs=100,)

builder.as_dataset constructs a tf.data.Dataset, so you need to call train_ds.batch(your_batch_size) before passing the dataset to fit. See the tf.data.Dataset.batch documentation for more information.
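
If it helps, here is a minimal sketch of the full input pipeline, assuming a batch size of 32 (any value that fits in memory works) and adding an optional prefetch; the validation_data argument is only there to show how the batched test split can be plugged in as well:

BATCH_SIZE = 32  # assumed value; tune to your memory budget

# Batch both splits; prefetch overlaps data loading with training.
train_ds = train_ds.batch(BATCH_SIZE).prefetch(tf.data.AUTOTUNE)
test_ds = test_ds.batch(BATCH_SIZE).prefetch(tf.data.AUTOTUNE)

history = model.fit(train_ds, epochs=100, validation_data=test_ds)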
