I wanted to replace an already loaded model in TensorFlow based on this
As you can see, Mish is indeed defined as x * tanh(softplus(x)), but this completely breaks the Keras functionality! Does anyone know a way to work around this issue?
CodePudding user response:
You could use a custom activation function instead of the one from TensorFlow Addons.
I've tried this code, and the summary is shown correctly:
import tensorflow as tf

def mish(inputs):
    # Mish(x) = x * tanh(softplus(x))
    x = tf.nn.softplus(inputs)
    x = tf.nn.tanh(x)
    x = tf.multiply(x, inputs)
    return x
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(filters=16, kernel_size=(3, 3), strides=(1, 1),
                           input_shape=(28, 28, 1), activation='relu'),
    tf.keras.layers.MaxPool2D(pool_size=(2, 2)),
    tf.keras.layers.Conv2D(filters=32, kernel_size=(3, 3), strides=(1, 1),
                           activation='relu'),
    tf.keras.layers.MaxPool2D(pool_size=(2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation=mish, name="dense_mish"),
    tf.keras.layers.Dropout(5e-1),
    tf.keras.layers.Dense(10, activation='softmax')
])
model.compile(loss='categorical_crossentropy', optimizer='Adam')
Summary:
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 26, 26, 16) 160
max_pooling2d (MaxPooling2D (None, 13, 13, 16) 0
)
conv2d_1 (Conv2D) (None, 11, 11, 32) 4640
max_pooling2d_1 (MaxPooling (None, 5, 5, 32) 0
2D)
flatten (Flatten) (None, 800) 0
dense_mish (Dense) (None, 64) 51264
dropout (Dropout) (None, 64) 0
dense_1 (Dense) (None, 10) 650
=================================================================
Total params: 56,714
Trainable params: 56,714
Non-trainable params: 0
_________________________________________________________________
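Since the question mentions an already loaded model, one extra point may help (this is a sketch I'm adding, not part of the original answer): a model saved with a custom activation has to be given that function again when it is reloaded, otherwise Keras cannot deserialize it. The file name below is purely an assumption for illustration:

# Save the model above (the path is hypothetical, for illustration only).
model.save("mish_model.h5")

# When loading it back, tell Keras how to resolve the custom "mish" activation;
# without custom_objects, deserialization fails with an unknown-object error.
reloaded = tf.keras.models.load_model("mish_model.h5",
                                      custom_objects={"mish": mish})
reloaded.summary()  # dense_mish appears as a regular Dense layer, as in the summary above

# Alternatively, register the activation globally so load_model can find it by name:
tf.keras.utils.get_custom_objects()["mish"] = mish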