How to add a dimension using a Reshape layer in Keras

I want to expand a dimension in my model. Can I replace tf.keras.layers.Lambda(lambda x: tf.expand_dims(x, axis=1)) with a tf.keras.layers.Reshape() layer?

My model is

    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(10, activation='relu', input_shape=(i1,i2))),
    tf.keras.layers.Lambda(lambda x: tf.expand_dims(x, axis=1)),
    tf.keras.layers.Dense(1)
    

I want to replace the Lambda layer.

Modified Code:

    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(10, activation='relu'), input_shape=(i1,i2)),
    tf.keras.layers.Reshape((1,)),
    tf.keras.layers.Dense(1)

Error:

ValueError: Exception encountered when calling layer "reshape" (type Reshape).

total size of new array must be unchanged, input_shape = [20], output_shape = [1]

Call arguments received:
  • inputs=tf.Tensor(shape=(None, 20), dtype=float32)
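The numbers in the error come straight from the layer shapes: Bidirectional(LSTM(10)) emits 2 × 10 = 20 features per sample, while Reshape((1,)) asks for a single element per sample, so the sizes cannot match. A minimal standalone reproduction (a hypothetical sketch, not from the original post) would be:

import tensorflow as tf

inputs = tf.keras.layers.Input((20,))   # 20 features, like the Bidirectional LSTM output
tf.keras.layers.Reshape((1,))(inputs)   # should raise: total size of new array must be unchanged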

CodePudding user response:

Maybe something like this (you do not have to take care of the batch dimension):

import tensorflow as tf

inputs = tf.keras.layers.Input((2,))
x = tf.keras.layers.Dense(10, activation='relu')(inputs)  # shape (None, 10)
outputs = tf.keras.layers.Reshape((1,) + x.shape[1:])(x)  # target (1, 10) -> (None, 1, 10)

model = tf.keras.Model(inputs, outputs)
model.summary()
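
As a quick sanity check (an addition, not part of the original answer): the Dense layer emits shape (None, 10), so the Reshape target becomes (1,) + (10,) = (1, 10) and the model output is (None, 1, 10). You can confirm this on the model built above:

print(model.output_shape)  # (None, 1, 10)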

With your code:

import tensorflow as tf

inputs = tf.keras.layers.Input((5, 10))
x = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(10, activation='relu'))(inputs)  # (None, 20)
x = tf.keras.layers.Reshape((1,) + x.shape[1:])(x)  # target (1, 20) -> (None, 1, 20)
outputs = tf.keras.layers.Dense(5)(x)  # (None, 1, 5)
model = tf.keras.Model(inputs, outputs)
model.summary()

Generally, as the docs note, you can also pass -1 as a dimension and let it be inferred from the output shape of the previous layer:

# also supports shape inference using `-1` as dimension
model.add(tf.keras.layers.Reshape((-1, 2, 2)))
# where 2 and 2 are the new dimensions and -1 is referring to the output shape of the last layer.

This essentially works because the Reshape layer internally calls tf.TensorShape:

input_shape = tf.TensorShape(input_shape).as_list()
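
As a side note (an addition, assuming TensorFlow 2.x): this is also why concatenating a plain tuple with a TensorShape, as in (1,) + x.shape[1:] above, produces a valid target shape:

import tensorflow as tf

shape = tf.TensorShape([None, 20])       # e.g. the Bidirectional LSTM output shape
target = (1,) + shape[1:]                # TensorShape concatenation -> (1, 20)
print(tf.TensorShape(target).as_list())  # [1, 20]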

I personally prefer specifying the shape explicitly, though.

CodePudding user response:

We can use tf.keras.layers.Reshape((1, -1)), as shown below, instead of tf.keras.layers.Lambda(lambda x: tf.expand_dims(x, axis=1)).

import tensorflow as tf

model = tf.keras.Sequential([
        tf.keras.layers.Bidirectional(
            tf.keras.layers.LSTM(10, activation='relu', input_shape=[100, 256])),
        tf.keras.layers.Reshape((1, -1)),
        tf.keras.layers.Dense(10)
    ])
model(tf.random.uniform((1, 100, 256))) # (batch_dim, input.shape[0], input.shape[1])
model.summary()    

Model: "sequential_1"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 bidirectional_1 (Bidirectio  (1, 20)                  21360     
 nal)                                                            
                                                                 
 reshape_1 (Reshape)         (1, 1, 20)                0         
                                                                 
 dense_1 (Dense)             (1, 1, 10)                210       
                                                                 
=================================================================
Total params: 21,570
Trainable params: 21,570
Non-trainable params: 0
_________________________________________________________________

Checking your code, we get the same result:

import tensorflow as tf

model = tf.keras.Sequential([
        tf.keras.layers.Bidirectional(
            tf.keras.layers.LSTM(10, activation='relu', input_shape=[100, 256])),
        tf.keras.layers.Lambda(lambda x: tf.expand_dims(x, axis=1)),
        tf.keras.layers.Dense(10)
    ])
model(tf.random.uniform((1, 100, 256))) # (batch_dim, input.shape[0], input.shape[1])
model.summary()    

Model: "sequential_2"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 bidirectional_2 (Bidirectio  (1, 20)                  21360     
 nal)                                                            
                                                                 
 lambda (Lambda)             (1, 1, 20)                0         
                                                                 
 dense_2 (Dense)             (1, 1, 10)                210       
                                                                 
=================================================================
Total params: 21,570
Trainable params: 21,570
Non-trainable params: 0
_________________________________________________________________
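
To double-check that the two layers behave identically (a small addition, not from the original answer), you can compare Reshape((1, -1)) with tf.expand_dims on a dummy tensor:

import tensorflow as tf

x = tf.random.uniform((4, 20))                          # (batch, features)
via_reshape = tf.keras.layers.Reshape((1, -1))(x)       # (4, 1, 20)
via_expand = tf.expand_dims(x, axis=1)                  # (4, 1, 20)
print(bool(tf.reduce_all(via_reshape == via_expand)))   # True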