Mish in Keras: how to replace the original activation function with Mish

Time:09-19

The existing Mish activation function:

 
import tensorflow as tf
from tensorflow.keras.layers import Activation
from tensorflow.keras.utils import get_custom_objects


class Mish(Activation):
    '''
    Mish Activation Function.

    .. math::
        mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + e^{x}))

    Shape:
        - Input: Arbitrary. Use the keyword argument `input_shape`
          (a tuple of integers, does not include the samples axis)
          when using this layer as the first layer in a model.
        - Output: Same shape as the input.

    Examples:
        >>> X = Activation('Mish', name="conv1_act")(X_input)
    '''

    def __init__(self, activation, **kwargs):
        super(Mish, self).__init__(activation, **kwargs)
        self.__name__ = 'Mish'


def mish(inputs):
    return inputs * tf.math.tanh(tf.math.softplus(inputs))


get_custom_objects().update({'Mish': Mish(mish)})
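As a quick sanity check (my own example, not part of the original post), the formula mish(x) = x * tanh(ln(1 + e^x)) can be verified with a plain-NumPy reference implementation, independent of TensorFlow:

```python
import numpy as np

def mish_np(x):
    # Reference implementation: mish(x) = x * tanh(softplus(x))
    #                                   = x * tanh(ln(1 + e^x))
    return x * np.tanh(np.log1p(np.exp(x)))

y = mish_np(np.array([-1.0, 0.0, 1.0]))
# mish(0) = 0; mish(1) = tanh(ln(1 + e)) ≈ 0.8651
```

Note that mish is not zero-centered for negative inputs: it dips below zero (minimum around x ≈ -1.19) and then saturates toward 0, which is what distinguishes it from relu's hard cutoff.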


How do I replace the relu activation function with mish in the network below?

input_layer = Input((S, S, L, 1))
conv_layer1 = Conv3D(filters=10, kernel_size=(filling), activation='relu')(input_layer)
conv3d_shape = conv_layer1._keras_shape
conv_layer1 = Reshape((conv3d_shape[1], conv3d_shape[2],
                       conv3d_shape[3] * conv3d_shape[4]))(conv_layer1)
conv_layer3 = Conv2D(filters=80, kernel_size=(3, 3), activation='relu')(conv_layer1)
output_layer = conv_layer3

CodePudding user response:

Since the snippet above registers mish under the name 'Mish' via get_custom_objects(), replace each activation='relu' with activation='Mish' in the Conv layers.
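A sketch of the full replacement. The values S = 9, L = 20 and kernel_size=(3, 3, 3) are assumptions for illustration (the question leaves them unspecified), and conv_layer1.shape stands in for the older _keras_shape attribute, which no longer exists in TF 2.x:

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Conv3D, Conv2D, Reshape, Activation
from tensorflow.keras.utils import get_custom_objects
from tensorflow.keras.models import Model

# Same registration as in the snippet above.
class Mish(Activation):
    def __init__(self, activation, **kwargs):
        super(Mish, self).__init__(activation, **kwargs)
        self.__name__ = 'Mish'

def mish(inputs):
    return inputs * tf.math.tanh(tf.math.softplus(inputs))

get_custom_objects().update({'Mish': Mish(mish)})

# Assumed example dimensions; the question does not specify S, L or the kernel size.
S, L = 9, 20

input_layer = Input((S, S, L, 1))
# 'relu' -> 'Mish'
conv_layer1 = Conv3D(filters=10, kernel_size=(3, 3, 3), activation='Mish')(input_layer)
# .shape replaces the TF 1.x-era _keras_shape attribute.
conv3d_shape = conv_layer1.shape
conv_layer1 = Reshape((conv3d_shape[1], conv3d_shape[2],
                       conv3d_shape[3] * conv3d_shape[4]))(conv_layer1)
conv_layer3 = Conv2D(filters=80, kernel_size=(3, 3), activation='Mish')(conv_layer1)
output_layer = conv_layer3

model = Model(inputs=input_layer, outputs=output_layer)
```

Because the string 'Mish' resolves through the custom-object registry, the same name also works when reloading a saved model, as long as the registration code runs before load_model is called.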