Input 0 of layer "conv1d_3" is incompatible with the layer: expected min_ndim=3, found ndim=2


I am trying to develop a VAE using this dataset. I have created an encoder and a decoder myself, following the Keras tutorial, using only Dense layers. Now I want to add Conv1D layers too; however, after adding one Conv1D layer to the encoder I get: Input 0 of layer "conv1d_3" is incompatible with the layer: expected min_ndim=3, found ndim=2. Full shape received: (None, 3)

I have found many similar questions but not an exact answer. I want to add more Conv1D layers to both the encoder and the decoder; what do I need to change in each of them to add Conv1D layers?

The code:

import pandas as pd
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.callbacks import EarlyStopping
from sklearn.model_selection import train_test_split

df = pd.read_csv('local path')
data, data_test = train_test_split(df, test_size=0.20)
batch_size = 128
latent_dim = 100
leaky_relu = tf.nn.leaky_relu
data.shape  # (4240, 3)
class Sampling(layers.Layer):
    """Draws z from (z_mean, z_log_var) via the reparameterization trick."""

    def call(self, inputs):
        z_mean, z_log_var = inputs
        batch = tf.shape(z_mean)[0]
        dim = tf.shape(z_mean)[1]
        epsilon = tf.keras.backend.random_normal(shape=(batch, dim))
        return z_mean + tf.exp(0.5 * z_log_var) * epsilon
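
For reference, this layer computes z = z_mean + exp(0.5 * z_log_var) * epsilon, the standard reparameterization trick. A quick standalone check (shapes here are illustrative, not taken from the dataset):

z_mean = tf.zeros((4, 100))      # batch of 4, latent_dim = 100
z_log_var = tf.zeros((4, 100))   # log-variance 0 means standard deviation 1
z = Sampling()([z_mean, z_log_var])
print(z.shape)                   # (4, 100): one latent sample per batch row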

Encoder (where I get the error):

encoder_inputs = keras.Input(shape=data.shape[1], name='encoder_inputs')
x = layers.Conv1D(64, 3, activation=leaky_relu)(encoder_inputs)
x = layers.Dense(64, activation=leaky_relu)(x)
x = layers.BatchNormalization(momentum=0.8)(x)
x = layers.Activation('relu')(x)
x = layers.Dropout(0.20)(x)
x = layers.Dense(128, activation=leaky_relu)(x)
x = layers.BatchNormalization(momentum=0.8)(x)
x = layers.Activation('relu')(x)
x = layers.Dropout(0.25)(x)
x = layers.Dense(128, activation=leaky_relu)(x)
x = layers.BatchNormalization(momentum=0.8)(x)
x = layers.Activation('relu')(x)
x = layers.Dropout(0.25)(x)
z_mean = layers.Dense(latent_dim, name="z_mean")(x)
z_log_var = layers.Dense(latent_dim, name="z_log_var")(x)
z = Sampling()([z_mean, z_log_var])
encoder = keras.Model(encoder_inputs, [z_mean, z_log_var, z], name='encoder')
encoder.summary()

Decoder:

latent_inputs = keras.Input(shape=(latent_dim,))
x = layers.Dense(64, activation=leaky_relu)(latent_inputs)
x = layers.Dense(64, activation=leaky_relu)(x)
x = layers.BatchNormalization(momentum=0.8)(x)
x = layers.Activation('relu')(x)
x = layers.Dropout(0.20)(x)
x = layers.Dense(128, activation=leaky_relu)(x)
x = layers.BatchNormalization(momentum=0.8)(x)
x = layers.Activation('relu')(x)
x = layers.Dropout(0.20)(x)
x = layers.Dense(128, activation=leaky_relu)(x)
x = layers.BatchNormalization(momentum=0.8)(x)
x = layers.Activation('relu')(x)
x = layers.Dropout(0.20)(x)
x = layers.Dense(data.shape[1], activation='sigmoid')(x)
layers.Reshape((data.shape[1],))(x)
decoder_outputs = layers.Dense(data.shape[1], activation='sigmoid')(x)
decoder = keras.Model(latent_inputs, x, name='decoder')
decoder.summary()

earlystopper = EarlyStopping(monitor='kl_loss', mode='min', min_delta=0.005, patience=20, verbose=0, restore_best_weights=True)

vae = VAE(encoder, decoder)
vae.compile(optimizer=keras.optimizers.Adam(learning_rate=0.000002))
hist = vae.fit(data, epochs=50, batch_size=128, callbacks=[earlystopper])


---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Input In [17], in <cell line: 4>()
      2 x = layers.Conv1D(64, 3, activation=leaky_relu)(encoder_inputs)
      3 x = layers.Flatten()(x)
----> 4 x = layers.Conv1D(64, 3, activation=leaky_relu)(x)
      5 x = layers.Flatten()(x)
      6 x = layers.BatchNormalization(momentum=0.8)(x)

ValueError: Input 0 of layer "conv1d_3" is incompatible with the layer: expected min_ndim=3, found ndim=2. Full shape received: (None, 3)

CodePudding user response:

The problem is that data is missing the feature dimension that a Conv1D layer requires: Conv1D expects inputs of shape (timesteps, features). You can add the missing dimension with tf.expand_dims:

data = tf.expand_dims(data, axis=-1)
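
As a quick sanity check, here is what the extra axis does to the shapes (using a zero array as an illustrative stand-in for the real DataFrame):

import numpy as np
fake = np.zeros((4240, 3), dtype='float32')   # stand-in for the (4240, 3) data
expanded = tf.expand_dims(fake, axis=-1)
print(expanded.shape)                         # (4240, 3, 1): (samples, timesteps, features)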

And change your encoder input layer to:

encoder_inputs = keras.Input(shape=(data.shape[1], 1), name='encoder_inputs')
x = layers.Conv1D(64, 3, activation=leaky_relu)(encoder_inputs)
x = layers.Flatten()(x)

Also, add a Flatten layer after the Conv1D layer so that the downstream Dense layers receive a 2-D tensor again.
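
Putting it together, a minimal sketch of the corrected encoder front end, assuming the tf.expand_dims fix above has been applied. It keeps the original layer sizes; padding='same' is my addition so that two kernel-3 convolutions fit on only 3 timesteps:

encoder_inputs = keras.Input(shape=(data.shape[1], 1), name='encoder_inputs')
x = layers.Conv1D(64, 3, padding='same', activation=leaky_relu)(encoder_inputs)  # stays 3-D: (None, 3, 64)
x = layers.Conv1D(64, 3, padding='same', activation=leaky_relu)(x)  # stack convs while still 3-D
x = layers.Flatten()(x)   # flatten once, just before the Dense block: (None, 192)
x = layers.Dense(64, activation=leaky_relu)(x)
# ... the rest of the Dense/BatchNormalization/Dropout stack as before ...

Note that in the traceback the second Conv1D ("conv1d_3") fails because a Flatten was inserted between the two convolutions, dropping the tensor back to 2-D; keep the Conv1D layers together and flatten only once. The same idea applies to the decoder the question asks about (my sketch, not part of the original answer): use a Dense layer to reach a reshapeable size, Reshape to 3-D, then apply Conv1D layers:

latent_inputs = keras.Input(shape=(latent_dim,))
x = layers.Dense(data.shape[1] * 64, activation=leaky_relu)(latent_inputs)  # 3 * 64 units
x = layers.Reshape((data.shape[1], 64))(x)  # back to 3-D: (None, 3, 64)
x = layers.Conv1D(64, 3, padding='same', activation=leaky_relu)(x)
x = layers.Flatten()(x)
decoder_outputs = layers.Dense(data.shape[1], activation='sigmoid')(x)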
